WO2018200417A1 - Systems and methods for 3d displays with flexible optical layers - Google Patents

Systems and methods for 3d displays with flexible optical layers

Info

Publication number
WO2018200417A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
light
flexible
wave
foil
Application number
PCT/US2018/028949
Other languages
French (fr)
Inventor
Jukka-Tapani Makinen
Kai Ojala
Jeffrey METZGER
Original Assignee
Pcms Holdings, Inc.
Application filed by Pcms Holdings, Inc.
Publication of WO2018200417A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10Beam splitting or combining systems
    • G02B27/1066Beam splitting or combining systems for enhancing image performance, like resolution, pixel numbers, dual magnifications or dynamic range, by tiling, slicing or overlapping fields of view
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/27Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/30Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving parallax barriers
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/50Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
    • G02B30/54Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels the 3D volume being generated by moving a 2D surface, e.g. by vibrating or rotating the 2D surface
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/305Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/32Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using arrays of controllable light sources; using moving apertures or moving light sources
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/349Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking
    • H04N13/351Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking for displaying simultaneously
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2213/00Details of stereoscopic systems
    • H04N2213/001Constructional or mechanical details

Definitions

  • VAC vergence-accommodation conflict
  • As illustrated in FIGS. 2A-2B, a viewer looking at a real-world object (FIG. 2A) and at an autostereoscopic 3D display (FIG. 2B) may have different focal distances and eye convergence angles or distances. This may result in portions of a view in the real world being blurred or out of focus, while all parts of a display may be in focus.
  • Three types of 3D displays are able to provide the correct focus cues for natural 3D image perception.
  • the first category is volumetric display techniques that can produce 3D images in true 3D space.
  • Each "voxel" of a 3D image is located physically at the spatial position where it is supposed to be and reflects or emits light from that position toward the observers to form a real image in the eyes of viewers.
  • the main problems of 3D volumetric displays are low resolution, large physical size, and system complexity, which make them expensive to manufacture and too cumbersome for use outside special cases such as product displays and museums.
  • a second existing 3D display device category capable of providing correct retinal focus cues is holographic displays. These aim to reconstruct the whole light wavefronts scattered from objects in natural settings.
  • the main problem in this field of technology is the lack of suitable Spatial Light Modulator (SLM) components that could be used in the creation of the extremely detailed wavefronts.
  • SLM Spatial Light Modulator
  • a holographic display technique has also been developed to a commercial prototype level by utilizing additional eye tracking technology, which has made it possible to use commercially available SLM components for creation of the wavefronts.
  • Such systems are still quite complex and physically large, making them too expensive for average consumers.
  • a third 3D display technology category capable of providing natural retinal focus cues is called Light Field (LF) displays.
  • Vergence-accommodation conflict is one driver for moving from the current stereoscopic 3D displays to more advanced light field systems.
  • a flat form-factor LF 3D display may produce both the eye convergence and focus angles simultaneously.
  • FIGS. 3A-3D show these angles in four different 3D image content cases.
  • an image point 320 lies on the surface of the display 305, and only one illuminated pixel visible to both eyes 310 is needed. Both eyes focus (angle 322) and converge (angle 324) to the same point.
  • a virtual image point (e.g., voxel) 330 is behind the display 305, and two clusters of pixels 332 on the display are illuminated.
  • a virtual image 340 is at an infinite distance behind the display screen 305, and only parallel light rays are emitted from the display surface from two illuminated pixel clusters 342.
  • the minimum size for the pixel clusters 342 is the size of the eye pupil, and this pupil size also represents the maximum size of pixel clusters needed on the display surface.
  • a virtual image point (e.g., voxel) 350 is in front of the display 305, and two pixel clusters 352 are illuminated with the emitted beams crossing at the same point, where they focus.
  • both spatial and angular control of emitted light is used from the LF display device in order to create both the convergence and focus angles for natural eye responses to the 3D image content.
  • SMV Super Multi View
  • the eye pupil diameter can be as small as 1.5 mm and, in dark conditions, as large as 8 mm.
  • the maximum angular density that can be achieved with SMV displays is limited by diffraction and there is an inverse relationship between spatial resolution (pixel size) and angular resolution. Diffraction increases the angular spread of a light beam passing through an aperture and this effect may be taken into account in the design of very high density SMV displays.
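  • For illustration only (not part of the disclosure), the diffraction limit mentioned above can be sketched with the small-angle Rayleigh relation (half-angle ≈ 1.22 λ/d); the wavelength and aperture values below are assumed sample numbers:

```python
import math

def diffraction_spread_deg(wavelength_nm: float, aperture_um: float) -> float:
    """Approximate full angular spread (degrees) of light passing through
    a circular aperture, using the small-angle Rayleigh relation
    (half-angle ~ 1.22 * lambda / d)."""
    theta_rad = 2.0 * 1.22 * (wavelength_nm * 1e-9) / (aperture_um * 1e-6)
    return math.degrees(theta_rad)

# Halving the pixel aperture doubles the diffraction spread, illustrating
# the inverse relationship between spatial and angular resolution.
for d_um in (100.0, 50.0, 25.0):
    print(f"{d_um:5.0f} um aperture -> ~{diffraction_spread_deg(550.0, d_um):.2f} deg spread")
```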
  • Light field displays call for a high amount of multiplexing from the optical hardware as all the different viewing directions and focal surfaces need to be presented through a single display surface.
  • Multiplexing can be done either spatially or temporally.
  • a limiting factor in temporally multiplexed systems is component switching speed or refresh-rate.
  • Different tuneable optical components, such as "liquid lenses", are available and can be used in temporally multiplexed systems, but due to their complex structure, lens-based systems easily become very large and expensive. They may be suitable for multi-user LF projection systems, or for single users as parts of head-mounted or table-top displays.
  • the spatial multiplexing approach uses more hardware than the temporal approach, as the multiple views are generated at the same time with parallel hardware components. This is especially problematic for systems intended for multiple users: the more viewers there are, the more views must be generated, and the more hardware is needed to realize this.
  • Flexible optical layers may comprise flexible layers of lenses or light emitting elements, in various embodiments.
  • a flexible optical layer may comprise a flexible light bending layer.
  • flexible optical layers may be controlled by actuators to generate propagating or traveling waves that, in synchronization with light emission, generate virtual depth of projected 3D images.
  • a flexible optical layer may comprise a flexible diffractive foil. In some embodiments, a flexible optical layer may comprise an array of tilting refractive plates. In some embodiments, a flexible optical layer may comprise a pair of arrays of tilting refractive plates. In some embodiments, a flexible optical layer may comprise a pair of flexible refractive foils.
  • a display apparatus may comprise: a light-emitting layer disposed within the display apparatus; a collimating microlens array disposed between the light-emitting layer and an outer surface of the display apparatus, the microlens array comprising a plurality of collimating microlenses; a first flexible light bending layer disposed between the light-emitting layer and the outer surface of the display apparatus; and at least one actuator operative to generate a traveling wave in the first flexible light bending layer to generate an oscillation in the orientation of the first flexible light bending layer relative to each of the collimating microlenses.
  • the display apparatus may include wherein a portion of the light-emitting layer including a plurality of sub-pixels is associated with a single microlens of the collimating microlens array to define one of a plurality of projector cells, and wherein a portion of the first flexible light bending layer spans a cell aperture of each projector cell.
  • the display apparatus may further comprise a controller configured to control at least a first projector cell and the at least one actuator to: based on a location of at least one voxel of 3D content to be displayed by the display apparatus, illuminate a subset of the plurality of sub-pixels of the first projector cell in synchrony with the orientation of the first flexible light bending layer relative to the microlens of the first projector cell, to generate a 3D image.
  • the display apparatus may include wherein the generated 3D image comprises a plurality of independent views of the 3D content projected at a plurality of viewing angles.
  • the display apparatus may include wherein the first flexible light bending layer comprises a flexible diffractive foil, and the collimating microlens array is disposed between the light-emitting layer and the flexible diffractive foil.
  • the display apparatus may include wherein the flexible diffractive foil is disposed between the collimating microlens array and a microprism array.
  • the display apparatus may further comprise a spatial light modulator (SLM), wherein the flexible diffractive foil is disposed between the SLM and the light-emitting layer, and the SLM is configured to be controlled in synchronization with the at least one actuator and the light-emitting layer to modulate light diffracted by the flexible diffractive foil.
  • SLM spatial light modulator
  • the display apparatus may include wherein the first flexible light bending layer comprises a first array of tilting refractive plates, and the first array of tilting refractive plates is disposed between the light-emitting layer and the collimating microlens array.
  • the display apparatus may include wherein each refractive plate in the plate array is connected to one or more adjacent plates via a flexible joint.
  • the display apparatus may include wherein the first flexible light bending layer comprises a first flexible refractive foil, the first flexible refractive foil disposed between the light-emitting layer and the collimating microlens array.
  • the display apparatus may further comprise a second flexible light bending layer, wherein the first flexible light bending layer is disposed between the light-emitting layer and the collimating microlens array, and the second flexible light bending layer is disposed between the first flexible light bending layer and the collimating microlens array.
  • the display apparatus may include wherein the first and second flexible light bending layers comprise a first and a second array of tilting refractive plates.
  • the display apparatus may include wherein the first and second flexible light bending layers comprise a first and a second flexible refractive foil.
  • a method comprises: controlling at least a first actuator to generate a traveling wave in a first flexible light bending layer, said traveling wave generating an oscillation in the orientation of the first flexible light bending layer relative to a plurality of projector cells, each projector cell having (i) a subset of light sources of a light-emitting layer, and (ii) a focusing microlens; and controlling illumination of the light sources of the projector cells, based on 3D content to be projected as voxels, wherein the control of illumination is in synchronization with the traveling wave in the first flexible light bending layer.
  • the method may include wherein the first flexible light bending layer comprises a flexible diffractive foil, wherein the microlens of each projector cell is disposed between the light-emitting layer and the first flexible light bending layer, and further comprising: modulating the diffracted light from the flexible diffractive foil with a spatial light modulator, disposed between the first flexible diffractive foil and a second microlens, that is synchronized with the light sources of the projector cells and the traveling wave; and projecting the modulated light through a third microlens array to project the 3D content.
  • the method may include wherein the first flexible light bending layer comprises a first array of refractive tilting plates disposed between the light-emitting layer and microlens of each projector cell, and further comprising: modulating the light sources of the projector cells and synchronizing the emitted modulated light with the orientations of the refractive tilting plates; and passing the emitted modulated light through the microlens of each projector cell to project the 3D content.
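  • The synchronized control described in the preceding method may be sketched as follows. This is a minimal illustrative model, not the disclosed implementation: it assumes a sinusoidal traveling wave, a beam deflection equal to the local layer tilt, and hypothetical parameter values and function names.

```python
import math

# Illustrative wave and display parameters (assumptions, not from the disclosure).
AMPLITUDE = 0.5e-3   # traveling-wave amplitude [m]
WAVELENGTH = 10e-3   # wavelength along the display surface [m]
FREQUENCY = 30.0     # full-waveform oscillation frequency [Hz]
K = 2.0 * math.pi / WAVELENGTH
OMEGA = 2.0 * math.pi * FREQUENCY

def layer_tilt(x: float, t: float) -> float:
    """Local tilt (radians) of the flexible light bending layer at
    position x and time t for the wave z = A * sin(k*x - w*t)."""
    return math.atan(AMPLITUDE * K * math.cos(K * x - OMEGA * t))

def control_step(t: float, cell_positions, target_angles,
                 tolerance=math.radians(0.5)):
    """One controller tick: a projector cell is illuminated only while
    the traveling wave tilts its bending layer toward the beam angle
    needed to hit that cell's target voxel."""
    commands = []
    for x, target in zip(cell_positions, target_angles):
        beam_angle = layer_tilt(x, t)  # assume deflection ~ layer tilt
        commands.append((x, abs(beam_angle - target) < tolerance))
    return commands

# Example tick for two cells with different target beam angles.
print(control_step(0.0, [0.0, 0.005], [0.0, math.radians(5.0)]))
```

  • As the wave phase advances, each cell's beam sweeps through a range of angles, in the spirit of the angular sweep described below for FIGS. 17A and 17B.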
  • FIG. 1A is a system diagram illustrating an example communications system in which one or more disclosed embodiments may be implemented;
  • FIG. 1B is a system diagram illustrating an example wireless transmit/receive unit (WTRU) that may be used within the communications system illustrated in FIG. 1A according to an embodiment;
  • WTRU wireless transmit/receive unit
  • FIGS. 2A-2B illustrate different focal distances and eye convergence angles when looking at a real-world object (FIG. 2A) and at an autostereoscopic 3D display (FIG. 2B).
  • FIGS. 3A-3D illustrate eye focus angles and convergence angles together with pixel clusters on a flat LF display in four generalized cases.
  • FIGS. 4A-4C illustrate different occlusion effects of three different light fields directed inside an eye pupil.
  • FIG. 5 illustrates an exemplary generation of dynamic movement in a wave display.
  • FIG. 6A depicts an example of a combination of viewing angle and wave slope angle creating a self-occlusion of a display foil at a first wave phase; and FIG. 6B depicts an alternative wave phase where there is no occlusion at the same position.
  • FIG. 7 depicts an example of a virtual image of an object formed for a viewer by switching light emitting pixels on and off.
  • FIG. 8 illustrates one embodiment of a wave display comprising a plurality of layers.
  • FIG. 9A depicts an example of non-Lambertian emitters resulting in uneven display illumination to a viewing direction; and FIG. 9B depicts how the uneven display illumination may be corrected by adding a diffusing optical layer on top of the emitters.
  • FIG. 10 illustrates one embodiment of an optical layer used as diffuser for evening out different sized gaps between pixels positioned at different parts of a propagating wave display.
  • FIG. 11 illustrates one embodiment of an optical layer used as a partial reflector to make a wave display viewable on a front side and a back side.
  • FIG. 12 illustrates one embodiment of an optical layer utilizing deformable lens structures on the optical layer.
  • FIG. 13 illustrates an exemplary embodiment of the structural elements of a 3D wave display.
  • FIG. 14 illustrates light emission angles of a light field display.
  • FIGS. 15A and 15B are schematic illustrations of two example structures: a flexible optical layer on top of a rigid light emitting layer (FIG. 15A) and a rigid optical layer on top of a flexible light emitting layer (FIG. 15B).
  • FIG. 16 illustrates generation of multiple viewing directions and virtual image depths for an array of projectors.
  • FIGS. 17A and 17B illustrate generation of an angular sweep through the eye box with one projector of the array of projectors of FIG. 16.
  • FIGS. 18A-18F are cross sectional views of a light field display illustrating display of a voxel.
  • FIG. 19A is a schematic presentation of the basic structure of a single LF projector cell; and FIG. 19B is a schematic presentation of how the angular sweep is generated in a display structure comprising a plurality of projector cells as in FIG. 19A.
  • FIG. 20 is a schematic presentation of an exemplary structure in use as a display.
  • FIG. 21 illustrates the optical structure of a single projector cell with holographic/standard grating film.
  • FIG. 22 illustrates the structure of a single projector cell using a prism-grating-prism optical element.
  • FIG. 23A depicts a schematic presentation of an exemplary structure of a single LF projector cell, with a refractive tilting plate; and FIG. 23B depicts a schematic presentation of sweeping through beam scanning angles in an exemplary display structure comprising a plurality of projector cells as in FIG. 23A.
  • FIG. 24A depicts an overview of various exemplary standing wave states of a tilting plate array, in accordance with an embodiment.
  • FIG. 24B illustrates an exemplary standing wave with nodes and anti-nodes.
  • FIG. 25 depicts a schematic presentation of an exemplary structure for generating 3D Light Fields using tilting refractive plates, in accordance with an embodiment.
  • FIGS. 26A and 26B are schematic cross-sectional views of a portion of a display device in an exemplary embodiment.
  • FIG. 27A depicts a schematic presentation of an exemplary structure of a single LF projector cell, with a diffractive foil and a spatial light modulator; and FIG. 27B depicts a schematic presentation of a projector cell similar to FIG. 27A, with multiple light emitting elements.
  • FIG. 28 depicts an overview of beam angle change using a diffractive foil, in accordance with an embodiment.
  • FIG. 29 depicts a schematic presentation of an exemplary internal structure of a 3D Light Field display with directed backlight using a diffractive foil, in accordance with an embodiment.
  • FIG. 30 depicts an overview of an exemplary 3D Light Field display with directed backlight using a diffractive foil, in accordance with an embodiment.
  • FIG. 31 is a schematic presentation of the basic structure of a single LF projector cell with double tilting refractive elements.
  • FIGS. 32A-32C illustrate optical functions of LF pixels with tilting elements that are in opposite phase, according to an embodiment.
  • FIGS. 33A-33C illustrate optical functions of LF pixels with tilting elements that are in different phases, according to an embodiment.
  • FIG. 34A is a schematic presentation of a display structure with multiple LF pixels and two refractive foils or films with synchronized propagating waveforms, according to an embodiment.
  • FIG. 34B is a schematic presentation of the same display structure as FIG. 34A with propagating waveforms that are not in the same phase, according to an embodiment.
  • FIG. 35 is a schematic presentation of a light field display with dual flexible foils, according to an embodiment.
  • FIG. 36 is a schematic presentation of an alternative display structure utilizing an exemplary method in a head mounted device.
  • FIG. 37 is a schematic presentation of an alternative display structure with wave modules, according to an embodiment.
  • FIG. 38A illustrates the viewing geometry of an embodiment of an LF display with a 3D image zone in front of a curved display.
  • FIG. 38B illustrates a viewing geometry for an embodiment with a flat screen display.
  • FIGS. 39A and 39B illustrate viewing geometry for two different scenarios using a curved display (as in FIG. 38A).
  • FIG. 40 illustrates simulated illumination patterns at different image distances with two different tilt angles, inside 3 mm × 3 mm sensor areas.
  • FIG. 41 illustrates an exemplary LF display structure with a diffractive aperture expander.
  • FIG. 1A is a diagram illustrating an example communications system 100 in which one or more disclosed embodiments may be implemented.
  • the communications system 100 may be a multiple access system that provides content, such as voice, data, video, messaging, broadcast, etc., to multiple wireless users.
  • the communications system 100 may enable multiple wireless users to access such content through the sharing of system resources, including wireless bandwidth.
  • the communications systems 100 may employ one or more channel access methods, such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), single-carrier FDMA (SC-FDMA), zero-tail unique-word DFT-Spread OFDM (ZT UW DTS-s OFDM), unique word OFDM (UW-OFDM), resource block-filtered OFDM, filter bank multicarrier (FBMC), and the like.
  • CDMA code division multiple access
  • TDMA time division multiple access
  • FDMA frequency division multiple access
  • OFDMA orthogonal FDMA
  • SC-FDMA single-carrier FDMA
  • ZT UW DTS-s OFDM zero-tail unique-word DFT-Spread OFDM
  • UW-OFDM unique word OFDM
  • FBMC filter bank multicarrier
  • the communications system 100 may include wireless transmit/receive units (WTRUs) 102a, 102b, 102c, 102d, a RAN 104/113, a CN 106/115, a public switched telephone network (PSTN) 108, the Internet 110, and other networks 112, though it will be appreciated that the disclosed embodiments contemplate any number of WTRUs, base stations, networks, and/or network elements.
  • WTRUs 102a, 102b, 102c, 102d may be any type of device configured to operate and/or communicate in a wireless environment.
  • the WTRUs 102a, 102b, 102c, 102d may be configured to transmit and/or receive wireless signals and may include a user equipment (UE), a mobile station, a fixed or mobile subscriber unit, a subscription-based unit, a pager, a cellular telephone, a personal digital assistant (PDA), a smartphone, a laptop, a netbook, a personal computer, a wireless sensor, a hotspot or Mi-Fi device, an Internet of Things (IoT) device, a watch or other wearable, a head-mounted display (HMD), a vehicle, a drone, a medical device and applications (e.g., remote surgery), an industrial device and applications (e.g., a robot and/or other wireless devices operating in industrial and/or automated processing chain contexts), a consumer electronics device, a device operating on commercial and/or industrial wireless networks, and the like.
  • UE user equipment
  • PDA personal digital assistant
  • HMD head-mounted display
  • a vehicle a drone
  • the communications systems 100 may also include a base station 114a and/or a base station 114b.
  • Each of the base stations 114a, 114b may be any type of device configured to wirelessly interface with at least one of the WTRUs 102a, 102b, 102c, 102d to facilitate access to one or more communication networks, such as the CN 106/115, the Internet 110, and/or the other networks 112.
  • the base stations 114a, 114b may be a base transceiver station (BTS), a Node-B, an eNode B, a Home Node B, a Home eNode B, a gNB, a NR NodeB, a site controller, an access point (AP), a wireless router, and the like. While the base stations 114a, 114b are each depicted as a single element, it will be appreciated that the base stations 114a, 114b may include any number of interconnected base stations and/or network elements.
  • the base station 114a may be part of the RAN 104/113, which may also include other base stations and/or network elements (not shown), such as a base station controller (BSC), a radio network controller (RNC), relay nodes, etc.
  • BSC base station controller
  • RNC radio network controller
  • the base station 114a and/or the base station 114b may be configured to transmit and/or receive wireless signals on one or more carrier frequencies, which may be referred to as a cell (not shown). These frequencies may be in licensed spectrum, unlicensed spectrum, or a combination of licensed and unlicensed spectrum.
  • a cell may provide coverage for a wireless service to a specific geographical area that may be relatively fixed or that may change over time. The cell may further be divided into cell sectors.
  • the cell associated with the base station 114a may be divided into three sectors.
  • the base station 114a may include three transceivers, i.e., one for each sector of the cell.
  • the base station 114a may employ multiple-input multiple output (MIMO) technology and may utilize multiple transceivers for each sector of the cell.
  • MIMO multiple-input multiple output
  • beamforming may be used to transmit and/or receive signals in desired spatial directions.
  • the base stations 114a, 114b may communicate with one or more of the WTRUs 102a, 102b, 102c, 102d over an air interface 116, which may be any suitable wireless communication link (e.g., radio frequency (RF), microwave, centimeter wave, micrometer wave, infrared (IR), ultraviolet (UV), visible light, etc.).
  • the air interface 116 may be established using any suitable radio access technology (RAT).
  • RAT radio access technology
  • the communications system 100 may be a multiple access system and may employ one or more channel access schemes, such as CDMA, TDMA, FDMA, OFDMA, SC-FDMA, and the like.
  • the base station 114a in the RAN 104/113 and the WTRUs 102a, 102b, 102c may implement a radio technology such as Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access (UTRA), which may establish the air interface 115/116/117 using wideband CDMA (WCDMA).
  • WCDMA may include communication protocols such as High-Speed Packet Access (HSPA) and/or Evolved HSPA (HSPA+).
  • HSPA may include High-Speed Downlink (DL) Packet Access (HSDPA) and/or High-Speed UL Packet Access (HSUPA).
  • the base station 114a and the WTRUs 102a, 102b, 102c may implement a radio technology such as Evolved UMTS Terrestrial Radio Access (E-UTRA), which may establish the air interface 116 using Long Term Evolution (LTE) and/or LTE-Advanced (LTE-A) and/or LTE-Advanced Pro (LTE-A Pro).
  • E-UTRA Evolved UMTS Terrestrial Radio Access
  • LTE Long Term Evolution
  • LTE-A LTE-Advanced
  • LTE-A Pro LTE-Advanced Pro
  • the base station 114a and the WTRUs 102a, 102b, 102c may implement a radio technology such as NR Radio Access, which may establish the air interface 116 using New Radio (NR).
  • the base station 114a and the WTRUs 102a, 102b, 102c may implement multiple radio access technologies.
  • the base station 114a and the WTRUs 102a, 102b, 102c may implement LTE radio access and NR radio access together, for instance using dual connectivity (DC) principles.
  • DC dual connectivity
  • the air interface utilized by WTRUs 102a, 102b, 102c may be characterized by multiple types of radio access technologies and/or transmissions sent to/from multiple types of base stations (e.g., an eNB and a gNB).
  • the base station 114a and the WTRUs 102a, 102b, 102c may implement radio technologies such as IEEE 802.11 (i.e., Wireless Fidelity (WiFi)), IEEE 802.16 (i.e., Worldwide Interoperability for Microwave Access (WiMAX)), CDMA2000, CDMA2000 1X, CDMA2000 EV-DO, Interim Standard 2000 (IS-2000), Interim Standard 95 (IS-95), Interim Standard 856 (IS-856), Global System for Mobile communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), GSM EDGE (GERAN), and the like.
  • IEEE 802.11 i.e., Wireless Fidelity (WiFi)
  • IEEE 802.16 i.e., Worldwide Interoperability for Microwave Access (WiMAX)
  • CDMA2000 Code Division Multiple Access 2000
  • IS-95 Interim Standard 95
  • IS-856 Interim Standard 856
  • GSM Global System for Mobile communications
  • the base station 114b in FIG. 1 A may be a wireless router, Home Node B, Home eNode B, or access point, for example, and may utilize any suitable RAT for facilitating wireless connectivity in a localized area, such as a place of business, a home, a vehicle, a campus, an industrial facility, an air corridor (e.g., for use by drones), a roadway, and the like.
  • the base station 114b and the WTRUs 102c, 102d may implement a radio technology such as IEEE 802.11 to establish a wireless local area network (WLAN).
  • WLAN wireless local area network
  • the base station 114b and the WTRUs 102c, 102d may implement a radio technology such as IEEE 802.15 to establish a wireless personal area network (WPAN).
  • the base station 114b and the WTRUs 102c, 102d may utilize a cellular-based RAT (e.g., WCDMA, CDMA2000, GSM, LTE, LTE-A, LTE-A Pro, NR etc.) to establish a picocell or femtocell.
  • the base station 114b may have a direct connection to the Internet 110.
  • the base station 114b may not be required to access the Internet 110 via the CN 106/115.
  • the RAN 104/113 may be in communication with the CN 106/115, which may be any type of network configured to provide voice, data, applications, and/or voice over internet protocol (VoIP) services to one or more of the WTRUs 102a, 102b, 102c, 102d.
  • the data may have varying quality of service (QoS) requirements, such as differing throughput requirements, latency requirements, error tolerance requirements, reliability requirements, data throughput requirements, mobility requirements, and the like.
  • QoS quality of service
  • the CN 106/115 may provide call control, billing services, mobile location-based services, pre-paid calling, Internet connectivity, video distribution, etc., and/or perform high-level security functions, such as user authentication.
  • the RAN 104/113 and/or the CN 106/115 may be in direct or indirect communication with other RANs that employ the same RAT as the RAN 104/113 or a different RAT.
  • the CN 106/115 may also be in communication with another RAN (not shown) employing a GSM, UMTS, CDMA 2000, WiMAX, E-UTRA, or WiFi radio technology.
  • the CN 106/115 may also serve as a gateway for the WTRUs 102a, 102b, 102c, 102d to access the PSTN 108, the Internet 110, and/or the other networks 112.
  • the PSTN 108 may include circuit-switched telephone networks that provide plain old telephone service (POTS).
  • POTS plain old telephone service
  • the Internet 110 may include a global system of interconnected computer networks and devices that use common communication protocols, such as the transmission control protocol (TCP), user datagram protocol (UDP) and/or the internet protocol (IP) in the TCP/IP internet protocol suite.
  • the networks 112 may include wired and/or wireless communications networks owned and/or operated by other service providers.
  • the networks 112 may include another CN connected to one or more RANs, which may employ the same RAT as the RAN 104/113 or a different RAT.
  • Some or all of the WTRUs 102a, 102b, 102c, 102d in the communications system 100 may include multi-mode capabilities (e.g., the WTRUs 102a, 102b, 102c, 102d may include multiple transceivers for communicating with different wireless networks over different wireless links).
  • the WTRU 102c shown in FIG. 1A may be configured to communicate with the base station 114a, which may employ a cellular-based radio technology, and with the base station 114b, which may employ an IEEE 802 radio technology.
  • FIG. 1B is a system diagram illustrating an example WTRU 102.
  • the WTRU 102 may include a processor 118, a transceiver 120, a transmit/receive element 122, a speaker/microphone 124, a keypad 126, a display/touchpad 128, non-removable memory 130, removable memory 132, a power source 134, a global positioning system (GPS) chipset 136, and/or other peripherals 138, among others.
  • the WTRU 102 may include any sub-combination of the foregoing elements while remaining consistent with an embodiment.
  • the processor 118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), a state machine, and the like.
  • the processor 118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 102 to operate in a wireless environment.
  • the processor 118 may be coupled to the transceiver 120, which may be coupled to the transmit/receive element 122. While FIG. 1B depicts the processor 118 and the transceiver 120 as separate components, it will be appreciated that the processor 118 and the transceiver 120 may be integrated together in an electronic package or chip.
  • the transmit/receive element 122 may be configured to transmit signals to, or receive signals from, a base station (e.g., the base station 114a) over the air interface 116.
  • the transmit/receive element 122 may be an antenna configured to transmit and/or receive RF signals.
  • the transmit/receive element 122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, for example.
  • the transmit/receive element 122 may be configured to transmit and/or receive both RF and light signals. It will be appreciated that the transmit/receive element 122 may be configured to transmit and/or receive any combination of wireless signals.
  • the WTRU 102 may include any number of transmit/receive elements 122. More specifically, the WTRU 102 may employ MIMO technology. Thus, in one embodiment, the WTRU 102 may include two or more transmit/receive elements 122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 116.
  • the transceiver 120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 122 and to demodulate the signals that are received by the transmit/receive element 122.
  • the WTRU 102 may have multi-mode capabilities.
  • the transceiver 120 may include multiple transceivers for enabling the WTRU 102 to communicate via multiple RATs, such as NR and IEEE 802.11, for example.
  • the processor 118 of the WTRU 102 may be coupled to, and may receive user input data from, the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit).
  • the processor 118 may also output user data to the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128.
  • the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132.
  • the non-removable memory 130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device.
  • the removable memory 132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like.
  • SIM subscriber identity module
  • SD secure digital
  • the processor 118 may access information from, and store data in, memory that is not physically located on the WTRU 102, such as on a server or a home computer (not shown).
  • the processor 118 may receive power from the power source 134, and may be configured to distribute and/or control the power to the other components in the WTRU 102.
  • the power source 134 may be any suitable device for powering the WTRU 102.
  • the power source 134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and the like.
  • the processor 118 may also be coupled to the GPS chipset 136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 102.
  • location information e.g., longitude and latitude
  • the WTRU 102 may receive location information over the air interface 116 from a base station (e.g., base stations 114a, 114b) and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 102 may acquire location information by way of any suitable location- determination method while remaining consistent with an embodiment.
  • the processor 118 may further be coupled to other peripherals 138, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity.
  • the peripherals 138 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs and/or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, a Virtual Reality and/or Augmented Reality (VR/AR) device, an activity tracker, and the like.
  • FM frequency modulated
  • the peripherals 138 may include one or more sensors; the sensors may be one or more of a gyroscope, an accelerometer, a hall effect sensor, a magnetometer, an orientation sensor, a proximity sensor, a temperature sensor, a time sensor, a geolocation sensor, an altimeter, a light sensor, a touch sensor, a barometer, a gesture sensor, a biometric sensor, and/or a humidity sensor.
  • the WTRU 102 may include a full duplex radio for which transmission and reception of some or all of the signals (e.g., associated with particular subframes for both the UL (e.g., for transmission) and the downlink (e.g., for reception)) may be concurrent and/or simultaneous.
  • the full duplex radio may include an interference management unit to reduce and/or substantially eliminate self-interference via either hardware (e.g., a choke) or signal processing via a processor (e.g., a separate processor (not shown) or via processor 118).
  • the WTRU 102 may include a half-duplex radio for which transmission and reception of some or all of the signals (e.g., associated with particular subframes for either the UL (e.g., for transmission) or the downlink (e.g., for reception)) are not concurrent.
  • Various functions described herein may be carried out by modules that carry out (i.e., perform, execute, and the like) those functions in connection with the respective modules.
  • a module includes hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more memory devices) deemed suitable by those of skill in the relevant art for a given implementation.
  • ASICs application-specific integrated circuits
  • FPGAs field programmable gate arrays
  • Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module, and it is noted that those instructions could take the form of or include hardware (i.e., hardwired) instructions, firmware instructions, software instructions, and/or the like, and may be stored in any suitable non-transitory computer-readable medium or media, such as those commonly referred to as RAM, ROM, etc.
  • a 3D volumetric display can be made with a single light emitting flexible layer by generating a propagating wave through the pixelated sheet and by synchronizing the light emittance of each pixel to the wave phase and the 3D image content. Pixels create virtual images of 3D objects in the air volume while moving on the fast propagating wave, as the human visual system integrates the image due to the persistence of vision (POV) phenomenon.
  • POV persistence of vision
  • the propagating display sheet wave sweeping through the 3D volume fills the whole depth dimension on the display area.
  • the wave display sheet oscillates and the user is looking at the device from the wave amplitude direction.
  • the wave propagates through the device display area in a direction perpendicular to the user's line of sight, which makes it possible for the display to create occlusion behind the pixels, as the whole area is filled with material at all times, unlike in the case where, for example, a rotating blade with light emitting diodes (LEDs) is used.
  • LEDs light emitting diodes
  • the propagating wave moving in one direction is generated by back-and-forth movement of the sheet edges or along the display area. Propagation of the display sheet wave ensures that the emitting surface sweeps through every voxel in the 3D volume during one refresh cycle. Each voxel on the 3D volumetric display is created by display elements whose intermittent light emission is synchronized to the wave propagation speed and the 3D image content.
  • the wave display sheet can be a stack of flexible layers with one or more light emitting layers and optical layers.
  • the light emitting elements can be positioned only on one side of the display sheet in order to cover a hemisphere or on both sides in order to cover a full sphere.
  • Optical layers can also be used, for example, as a partially reflecting element, which may be used in a two-sided display with a single sparse emitter matrix. If the optical layer shapes are made from a gel-like deformable material, the waviness of the whole foil can be used to advantage, as the optical layer shapes are deformed by the foil bending radius. In this case, the deformed optical shapes can be used to somewhat extend the visual depth of the structure without increasing the actual wave amplitude.
  • the wave display may be a fairly simple, compact, inexpensive, and robust structure. As the 3D display volume is filled with a wavy sheet, it can also be safer than the current 3D volumetric displays that are based on fast rotating elements, which need to be shielded with, for example, a glass or acrylic sphere.
  • the wave display can be made with an improved form factor compared to rotating displays, as the wave display may be made flatter, whereas the rotating displays generally need a cylindrical or spherical volume.
  • the propagating wave also makes a displayed 3D volume that may have a generally cubic shape, instead of the cylindrical volume created with some rotating displays. Because the propagating wave sweeps every part of the 3D volume equally, there is as much depth at the edges as there is at the center. In exemplary embodiments, there are also no "dead zones" on the display area, as there are no areas that move at different speeds from each other throughout the display area, in contrast to the case of a rotating display axis.
  • the wavy display may also enable creation of occlusion behind the display pixels, as the whole area is filled with material (e.g., black background) all the time, unlike in the case of many other POV displays, for example when a rotating structure is used.
  • material e.g., black background
  • the herein-disclosed display structures and methods may be used with both 3D and 2D image content without loss of resolution by making the propagating wave amplitude zero.
  • the systems and methods disclosed herein may, in some embodiments, use a flexible light emitting sheet (or display foil) to create a 3D volumetric display for presenting 3D image content.
  • a propagating wave may be generated in a pixelated sheet, and by synchronizing the light emittance of each pixel to the wave phase, amplitude, and 3D image content, an image may be formed in an air volume covered by the propagating wave. Pixels may create virtual images of 3D objects while moving on the fast propagating wave, as the human visual system integrates the image due to the persistence-of-vision (POV) phenomenon.
  • POV Persistence-of-Vision
  • Dynamic wave propagation (propagating wave 510) in one direction (e.g., horizontal) on a flexible display sheet 505 can be generated by moving one or both sheet ends linearly (linear motion 515) in the amplitude direction of the wave, as shown in FIG. 5.
  • the display sheet 505 may be longer than the dimension of a display apparatus in order to cover the whole display area when the waveform is used. The desired length may be determined by the number of waves and the wave amplitude used. If the display is used in a 2D mode and the foil is flat, the extra foil may be collected inside a display frame. Either or both of linear motion and bending movement (e.g., angular movement 520 of a sheet end) may be used at the sheet ends in order to generate the propagating wave across the device width, as sketched below.
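  • A minimal kinematic sketch of this wave generation (for illustration; it assumes an idealized sinusoidal traveling wave and end actuators that simply track the waveform, ignoring foil stiffness and damping):

```python
import math

def sheet_profile(x: float, t: float, amplitude: float,
                  wavelength: float, frequency: float) -> float:
    """Transverse displacement z(x, t) of the display sheet for a
    sinusoidal wave propagating in the +x direction."""
    k = 2.0 * math.pi / wavelength
    omega = 2.0 * math.pi * frequency
    return amplitude * math.sin(k * x - omega * t)

def end_drive(t: float, length: float, amplitude: float,
              wavelength: float, frequency: float):
    """Linear-motor positions at the two sheet ends (x = 0 and
    x = length) that track the traveling waveform; the far end is
    driven with a phase lag equal to the wave's travel time across
    the sheet."""
    return (sheet_profile(0.0, t, amplitude, wavelength, frequency),
            sheet_profile(length, t, amplitude, wavelength, frequency))
```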
  • the light emitting display may be self-emitting and flexible.
  • Exemplary display technologies that can fulfill these criteria include, but are not limited to, OLED displays and LED matrices bonded to a flexible substrate. Of these two options, OLED may be the better candidate, as the structures can be printed and made more flexible.
  • One example OLED display structure has been described in J. Wallace, "Highly flexible OLED light source has 10 micron bend radius," Laser Focus World, July 31, 2013. This structure is only 2 μm thick and can have a minimum bending radius of 10 μm.
  • the sheet's mass is 3 g/m² and its achieved brightness 100 cd/m². While some commercially available OLED displays are built by using glass as a barrier material, exemplary embodiments make use of flexible barrier layers. In some embodiments, a two-sided oscillating wave display is provided to enable 3D volumetric viewing from the front and back sides of the display.
  • the display sheet may create self-occlusion at certain wave phases to a viewer positioned at a viewing angle that is larger than the maximum slope angle of the wave.
  • In FIG. 6A, there is a first viewer 602 and a second viewer 604, both looking at the display sheet 605, which may comprise a light emitting foil.
  • At a different wave phase, the same position of the display sheet 605 becomes visible, as shown in FIG. 6B, where the light emitting element 615 may be visible to both viewers.
  • the oscillations of the light emitting substrate may be fast enough to enable the POV effect of the human visual system. For the central field of view, this may mean that a ~60 Hz refresh frequency for the display is adequate and no flicker is visible.
  • each pixel sweeps through the whole depth of the volume during one refresh cycle. This means that movement of half a wave is used for the ~60 Hz refresh rate, and the full waveform should oscillate at a ~30 Hz frequency.
  • the shape of the propagating wave may be formed as to be optimal for a given use case.
  • sinusoidal, saw, triangle, or pulse waves may be generated by controlling the linear motors (or other sheet movement generators, such as electrical conductors, electromagnets, angular motors, etc.) on both ends of the foil appropriately.
  • Waveforms with fast slope angle changes (e.g., triangular waves) may lead to a more linear distance change and a constant tilt angle for the light emitting pixels, which may benefit the overall system design.
  • a non-periodic waveform may also be used as a content-adaptive way to, among other things, save energy, but this may come at the cost of making the system more complex.
  • an even sweep depth for the foil movement through the display area may be most desirable. The physical properties of the light emitting foil (e.g., stiffness) may be such that the waveform amplitude or shape degrades as the distance to the linear motor increases. In some embodiments, this amplitude degradation is compensated for by gradually changing the foil thickness in the direction of the wave propagation.
  • the minimum bending radius of the flexible light emitting foil may also set a limit on the minimum wave amplitude and propagation speed. If the example OLED display structure described in J. Wallace is used, the 10 μm bend radius results in a minimum wave amplitude of 20 μm and a wavelength of 40 μm for a waveform that comprises two semicircles. This kind of volumetric display may be used for showing shallow relief patterns, as described in further detail below. An adequate propagation speed for this waveform can be calculated from the refresh frequency (60 Hz) and the wavelength (40 μm) to be ~2.4 mm/s (i.e., 0.04 mm per 1/60 s). With large-scale volumetric displays the bending radius can be much larger, and material properties do not set a strict limit on the amplitude.
  • In a large-scale example where the bending radius is around 125 mm, a speed of ~15 m/s (i.e., 0.25 m per 1/60 s) can be calculated for the propagating wave, as in the worked example below.
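  • The small-scale figures above can be reproduced with a short worked calculation, assuming the waveform comprises two semicircles (amplitude = 2 × bend radius, wavelength = 4 × bend radius) and that the wave advances one wavelength per 1/60 s refresh cycle, as in the 40 μm example:

```python
def wave_parameters(bend_radius_m: float, refresh_hz: float = 60.0):
    """Minimum wave geometry and propagation speed for a foil with a
    given minimum bend radius, for a waveform of two semicircles:
    amplitude = 2r, wavelength = 4r, one wavelength of travel per
    refresh cycle, and a full-waveform oscillation at half the
    refresh rate."""
    amplitude = 2.0 * bend_radius_m
    wavelength = 4.0 * bend_radius_m
    speed_m_s = wavelength * refresh_hz
    oscillation_hz = refresh_hz / 2.0
    return amplitude, wavelength, speed_m_s, oscillation_hz

# 10 um bend radius -> 20 um amplitude, 40 um wavelength, ~2.4 mm/s:
print(wave_parameters(10e-6))   # (2e-05, 4e-05, 0.0024, 30.0)
```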
  • the brightness of the image producing substrate may be selected so as to be adequate considering the use case and, for example, the desired refresh frequency.
  • Currently available POV displays use either projection systems or LED rows. An LED matrix on a flexible substrate would already be able to produce adequate brightness for the volumetric wave display in normal indoor lighting conditions.
  • Because the light emitting foil material does not stretch, only transverse waves are generated, and single pixels of the display sheet move only in one direction, as determined by the wave amplitude.
  • the wave is dynamic: as it travels through the display area, its phase changes with time.
  • Position and surface normal angle of each pixel at all times can be calculated from the wave amplitude, shape, and phase, which are all controlled by the wave generation mechanism (or a wave generator, or the like).
  • a 3D picture can be generated by switching on and off the single pixels at the right moments when each pixel is positioned in the correct coordinate position, as determined by the 3D geometry to be presented.
  • the depth coordinates of each "voxel" may be continuous as the pixels sweep through the whole volume.
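  • A simplified sketch of this synchronized on-off switching (for illustration only; it assumes a sinusoidal wave, instantaneous pixel switching, and hypothetical function names):

```python
import math

def pixel_depth(x: float, t: float, amplitude: float,
                wavelength: float, frequency: float) -> float:
    """Depth coordinate (in the wave amplitude direction) of the pixel
    at sheet position x and time t for a sinusoidal traveling wave."""
    k = 2.0 * math.pi / wavelength
    omega = 2.0 * math.pi * frequency
    return amplitude * math.sin(k * x - omega * t)

def pixels_to_switch_on(voxels, t, amplitude, wavelength, frequency,
                        depth_tol):
    """Return the set of (x, y) pixels to illuminate at time t: a pixel
    emits only during the instants when the propagating wave carries it
    through the depth coordinate of a voxel in its column."""
    on = set()
    for x, y, z in voxels:
        if abs(pixel_depth(x, t, amplitude, wavelength, frequency) - z) < depth_tol:
            on.add((x, y))
    return on
```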
  • Each voxel of a 3D image is located physically at the spatial position where it is supposed to be, and the corresponding pixel emits light from that position toward the viewer.
  • the eyes 702 of the viewer 700 both focus and converge naturally to the virtual 3D image 710 as shown in FIG. 7.
  • Natural 3D perception occurs as the two eyes see two different views with correct retinal blur and eye convergence depth cues. The same viewing condition is also present at different viewing angles for a single user or for multiple users.
  • Pixel density visible to a single viewer changes with the slope angle of the wave. The lowest density is seen when the emitting element surface normal points in the viewing direction, and the highest density is visible when the surface normal is tilted furthest away from the viewing direction.
  • This resolution difference at different wave phase positions can be mitigated by switching several neighboring pixels on at the same time where the pixel density is higher, making the visible size of the clustered pixel closer to the size of a pixel in the low-density area (see the sketch below). With several viewers and viewing directions, the effect is balanced to accommodate the fact that the visible size of the pixels differs between viewing angles.
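  • One possible realization of this clustering compensation is sketched below; the cosine foreshortening model and the clamp near grazing angles are illustrative assumptions, not taken from the disclosure:

```python
import math

def cluster_size(surface_normal_deg: float, viewing_deg: float) -> int:
    """Number of neighboring pixels to switch on together so that the
    projected (visible) cluster size stays roughly constant: the
    projected pixel pitch shrinks with the cosine of the angle between
    the local surface normal and the viewing direction, so more pixels
    are grouped where the wave slope foreshortens them."""
    angle = math.radians(abs(surface_normal_deg - viewing_deg))
    angle = min(angle, math.radians(80.0))  # clamp near-grazing angles
    return max(1, round(1.0 / math.cos(angle)))

print(cluster_size(0.0, 0.0))    # 1: surface faces the viewer
print(cluster_size(60.0, 0.0))   # 2: steep wave slope -> cluster pixels
```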
  • the display sheet itself is a stack of flexible layers, as shown in FIG. 8.
  • a base layer 810 may comprise an array of light emitting elements which are activated according to the 3D content.
  • the whole stack of layers may move as a single wave.
  • Optical layers used in different embodiments include arrays of refractive, reflective, diffractive, dichroic, absorbing, or scattering elements.
  • the optical elements change the angular distribution of light emitted by the display active elements.
  • Normal OLED or LED emitters 910 (on a substrate foil 905) radiate light according to Lambert's law, which dictates that the emitted power is smaller at larger angles from the emitter surface normal (see FIG. 9A). In this case, the intensity distribution follows a cosine relation between the angular directions of the observer's line of sight and the emitter 910 surface normal.
  • the optical layer 915 may be configured so as to diffuse the light towards the more ideal distribution. Such a use case for the optical layer 915 is illustrated in FIG. 9B.
  • the optical layer operates to provide diffusion of the boundaries between pixels as illustrated in FIG. 10.
  • the visual pixel density is different along the display surface (which is backed by the substrate foil 1005) depending on the position of the pixels 1010 (or light emitting elements) on the propagating wave as well as on the viewer position (e.g., at one point along the surface there may be a large gap 1012 between pixels, whereas at another point along the surface there may be a small gap 1014 between pixels).
  • the diffusing optical layer 1015 can be used for evening out the visual differences.
  • optical layer 1115 can also be used for reflection of emitted light from pixels 1110 (e.g., light emitting elements 1110 on substrate foil 1105) towards the back side of the display, and in this way a single emitter layer can be used for creation of a double-sided display.
  • the optical layer 1115 may include reflective surfaces 1120, and also transmissive surfaces 1125.
  • the optical layer 1215 comprises small lenses that are deformable such that they may have different focus distance when they are bent at the trough of the wave (such as squeezed lens 1225) or flattened at the top of it (such as stretched lens 1220).
  • This feature may be used as an advantage by extending the visible depth of the display (by adding virtual depth) as the emitters 1210 of the light emitting layer 1205 appear to be further away (e.g., virtual image 1230) when the lens has smaller focal length.
  • a 3D wave display system may comprise a playback device 1305 that provides 3D content to a display device.
  • the display device may have support mechanics 1320 for a flexible wave display sheet 1335 inside a display frame 1330.
  • the support mechanism 1320 may include linear and/or angular momentum motors or moving supports at the vertical ends of the sheets (e.g., sheet movement generators 1325). Instead of motors, there may be electrical conductors or electromagnets along the display width which generate dynamic propagating wave movement (such as propagating wave 1340) for the flexible sheet 1335 with electromagnetic force.
  • the playback device 1305 may calculate the correct control signals 1315 to the linear and angular motors (or sheet movement generators 1325) and send them to the display apparatus, which has control electronics which activate the motors according to the control signals.
  • the playback device 1305 may also calculate the right timing for synchronized on-off switching of each pixel, and send these as a display signal 1310 to the display apparatus, which may have display control electronics which activate the pixels according to the display signal 1310.
  • Both the control signal 1315 and display signal 1310 may be calculated in the playback device 1305 on the basis of the 3D content to be displayed, and with the known physical parameters of the actuator motors (or sheet generator motors 1325) and display sheet 1335.
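A hypothetical sketch of the playback-side calculation described above (all names and parameters are illustrative assumptions, not the patent's own interfaces): given the known wave kinematics, the playback device can precompute a time-indexed on/off matrix that serves as the display signal.

```python
import numpy as np

def make_display_signal(voxels, pixel_x, times, depth_fn, tol=0.2e-3):
    """voxels: iterable of (x, z) coordinates; pixel_x: 1D array of pixel
    positions; depth_fn(x, t): instantaneous sheet depth at x (e.g., the
    pixel_depth sketch above). Returns a [time, pixel] boolean matrix."""
    signal = np.zeros((len(times), len(pixel_x)), dtype=bool)
    for vx, vz in voxels:
        col = int(np.argmin(np.abs(pixel_x - vx)))  # pixel column under voxel
        for i, t in enumerate(times):
            # Light the pixel only while the wave carries it through vz.
            if abs(depth_fn(pixel_x[col], t) - vz) < tol:
                signal[i, col] = True
    return signal
```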
  • an array of moving display rods may be used to create a wave path instead of a single continuous sheet.
  • Light emitting elements may reflect ambient or projected light. They may also reflect white light or colors when turned on and absorb light when switched off.
  • the moving wave on the sheet may act like an air pump which causes air flow in front of the device. This may be avoided in some embodiments, for example, by placing the waving element in a vacuum or a thin and/or light gas within a closed device.
  • the wave display geometric form may have zero amplitude or frequency which generates a conventional flat 2D display surface.
  • the extra foil (or wave display sheet) may be gathered at the frame around the display device (or otherwise retained).
  • the wave display optical layer lenticular array is made from deformable, gel-like material.
  • the lens focal length is dependent on display wave phase and lens location.
  • a single lens may be "squeezed" to have a smaller radius at the bottom of the wave and "flattened" at the top of the wave (see FIG. 12). This means that the lens power at different positions on the wave may be changed, and the emitting pixel below the lens appears to be either closer to or further away from the viewer or observer.
  • the apparent distance of emitters can be used in extending the apparent range of depth without increasing the physical wave amplitude of the display.
  • 2D plus relief wave display In the case of, for example, e-book reader devices, display content may be enhanced by displaying shallow 3D reliefs. Examples of this may include a select button with clearly defined edges or certain words that are emphasized. The 3D depth does not need to be high, and a wave display with small amplitude may be implemented in this environment.
  • the wave display optical layers can be utilized here to increase 3D image depth from wave physical amplitude.
  • the display includes an array of small wave displays with shallow wave amplitudes suitable for relief presentation.
  • the wave propagates in both X and Y directions that are orthogonal to the observer's line of sight. Wave propagation may be generated on two sides, or every side, of the array sub-element.
  • the sub-elements of the matrix may be the size of a few surface-wavelengths (e.g., between one and five surface-wavelengths, between one and ten surface-wavelengths, and/or the like).
  • the waves in a sub-element generate an illusion of movement, texture, or relief form by discrete cosine transforms (DCT).
  • a relief wave display user is able to touch a true 3D volumetric relief image on the display surface of the reader device, making it a haptic feedback device. This may also enable blind people to read documents on reader devices, such as where the letters of a document are in the form of 3D relief or Braille pattern.
  • Such embodiments may be used as mobile device touch screens, or other user interfaces.
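To illustrate the DCT-based relief generation described above, the following sketch (using an assumed 16 x 16 sub-element grid; the coefficient choices are arbitrary examples) synthesizes a relief height map from a few low-frequency DCT coefficients via the inverse transform:

```python
import numpy as np
from scipy.fft import idctn

N = 16                     # sub-element grid size (assumed)
coeffs = np.zeros((N, N))
coeffs[0, 1] = 1.0         # one horizontal cosine component
coeffs[2, 0] = 0.5         # plus a weaker vertical component

# Inverse DCT turns the coefficient set into the relief height map that
# the shallow waves of the sub-element should approximate.
relief = idctn(coeffs, norm='ortho')
print(relief.shape)        # (16, 16), heights in arbitrary units
```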
  • Multiview autostereoscopic 3D wave display In an embodiment of a multiview autostereoscopic 3D wave display, a lenticular sheet or absorbing parallax barrier grating is used as an optical layer in front of the light emitting layer. With this optical layer, the observer sees different images with the two eyes and perception of 3D content is created.
  • An autostereoscopic wave display can be designed for multiple viewers or viewing angles with different images for the separate surface tilt angles. As the propagating wave scans through the display surface, different images are scanned to different viewing angles by synchronizing the image content with the wave surface normal directions. The optical layer blocks all other visible angles making the view unique to the specific direction.
  • This approach has the benefit of using only two pixels as a stereo pair for all multiview directions instead of using several pixels, and a lower pixel count light emitting layer can be used.
  • These embodiments may combine spatial and temporal multiplexing, making it possible to optimize the autostereoscopic 3D display structure better for an available display and signal processing hardware.
  • FIG. 13 illustrates one embodiment of the structure of a table-top wave display system and device.
  • the playback device sends data that is used for switching on and off the light emitting elements on the display device light emitting flexible layer (e.g., OLED-display foil).
  • the playback device also provides control signals for the motors that generate movement and angular moment to the wave propagating on the flexible display sheet.
  • the linear motors move the flexible display sheet back and forth with amplitude appropriate for generating the 3D image content.
  • angular momentum may also be applied to initiate wave propagation through the device area.
  • in some embodiments, there are movement actuators at both ends of the waving sheet.
  • Display elements on the propagating wave are switched on and off with control electronics according to their location and in synchronization to the 3D content.
  • An optical layer, positioned on top of the emitting layer, provides even illumination through the device width by diffusing the illumination patterns of the light emitting elements to ideal Lambertian distributions. An observer looks at the device from a direction that is perpendicular to wave propagation, from a typical display viewing distance.
  • a method of displaying 3D images on a wave display comprising: generating a propagating wave in a flexible display sheet comprising a plurality of light emitting elements, with a wave generator; performing temporal tracking of the propagating wave, by at least one of: a priori knowledge based on a state of the wave generator; and monitoring of the propagating wave; and processing and driving video content to the flexible display sheet in a time synchronized manner based on a dynamic location of each light emitting element and content in a render buffer.
  • the method may include wherein the flexible display sheet further comprises at least one optical element added to provide a diffuser effect.
  • the method may include wherein the flexible display sheet further comprises at least one deformable optical element to extend a visible depth of the flexible display sheet.
  • the method may include wherein the wave display is configured as a two-dimensional array of small wave displays each having amplitudes suitable for relief presentation.
  • the method may include wherein the wave generator propagates the wave in directions orthogonal to an observer's line of sight.
  • the method may include wherein wave propagation is generated on two sides of each small wave display of the array, or wherein wave propagation is generated on all sides of each small wave display of the array.
  • each small wave display of the array is the size of a few surface-wavelengths, or the size of ten or fewer surface-wavelengths, or the size of five or fewer surface-wavelengths.
  • the method may include wherein propagated waves in each small wave display use discrete cosine transforms (DCT) to generate a relief image.
  • the method may include wherein each small wave display is configured to have at least one wave peak, such that DCT methods generate variations of 3D patterns recognizable by a user.
  • each small wave display is configured to display letters of a document in either 3D relief or a Braille pattern.
  • the wave display is further configured for haptic feedback.
  • the method may include wherein the wave display is configured with an absorbing parallax barrier grating as an optical layer to be a Multiview autostereoscopic 3D display.
  • the method may further comprise synchronizing image content with the wave surface normal directions to configure the wave display such that different images are displayed at separate surface tilt angles.
  • the method may include wherein the wave display is configured with a lenticular sheet as an optical layer to be a Multiview autostereoscopic 3D display.
  • the method may further comprise synchronizing image content with the wave surface normal directions to configure the wave display such that different images are displayed at separate surface tilt angles.
  • the method may include wherein groups of the plurality of light emitting elements are organized and controlled as voxel elements.
  • a 3D wave display comprising: a playback device; and a display device, comprising a flexible wave display sheet disposed within a display frame, the flexible wave display sheet supported and driven by a sheet movement generator within the display frame.
  • the display may include wherein the playback device is configured to calculate a control signal for a sheet movement generator.
  • the display may include wherein the playback device is configured to calculate a timing for synchronized on-off switching of each of a plurality of pixels of the flexible wave display sheet, and communicate said calculated timing as a display signal to the display device.
  • the display may include wherein the display device further comprises display control electronics configured to activate the plurality of pixels of the flexible wave display sheet according to the display signal from the playback device.
  • the display may include wherein the playback device is configured to calculate a control signal and a display signal based on 3D content to be displayed at the display device and physical parameters of the sheet movement generator.
  • the display may include wherein the flexible wave display sheet is disposed within the display frame in a vacuum region of a closed display device, or disposed within the display frame in a thin gas region of a closed display device, or disposed within the display frame in a light gas region of a closed display device.
  • the display may include wherein the sheet movement generator comprises a linear motor, or an angular momentum motor; or at least one moving support at a vertical end of the flexible wave display sheet; or an electrical conductor disposed along a width of the display frame, the electrical conductor configured to generate dynamic propagating wave movement for the flexible wave display sheet by an electromagnetic force; or an electromagnet disposed along a width of the display frame, the electromagnet configured to generate dynamic propagating wave movement for the flexible wave display sheet by an electromagnetic force.
  • the flexible wave display sheet comprises a plurality of LED or OLED emitters.
  • a 3D wave display comprising: a playback device; and a display device, comprising an array of moving display rods disposed within a display frame, the array of moving display rods supported and driven by a wave movement generator within the display frame.
  • a 3D wave display system comprising a processor and a non- transitory storage medium storing instructions operative, when executed on the processor, to perform functions including: generating a propagating wave in a flexible display sheet comprising a plurality of light emitting elements, with a wave generator; performing temporal tracking of the propagating wave, by at least one of: a priori knowledge based on a state of the wave generator; and monitoring of the propagating wave; and processing and driving video content to the flexible display sheet in a time synchronized manner based on a dynamic location of each light emitting element and content in a render buffer.
  • a flat form-factor display device may take advantage of the properties of a flexible sheet, as in the previously discussed volumetric display.
  • the particular embodiment of the display device may be either a multiview display with a very dense grid of angular views or a true light field display with multiple views and focal surfaces.
  • the structure may function as a regular 2D display by activating all the sub-pixels inside a LF pixel simultaneously.
  • Exemplary methods are able to provide both the large light emission angles that are useful for eye convergence and the small emission angles that are desirable for natural eye retinal focus cues.
  • some such methods make it possible to create multiple focal surfaces outside the display surface to address the VAC problem.
  • Such embodiments present a way to simultaneously scan the small light emission angles and focus the voxel-forming beams.
  • FIG. 14 is a schematic view of the geometry involved in creation of the light emission angles associated with a LF display 1405 capable of producing retinal focus cues and multiple views of 3D content with a single flat form-factor panel.
  • a single 3D display surface 1405 is preferably able to generate at least two different views to the two eyes of a single user in order to create the coarse 3D perception effect already utilized in current 3D stereoscopic displays.
  • the brain uses these two different eye images for calculation of 3D distance based on a triangulation method and the interpupillary distance. This means that at least two views are preferably projected into the Single-user Viewing Angle (SVA) shown in FIGS. 3A-3D.
  • a true LF display is preferably able to project at least two different views inside a single eye pupil in order to provide the correct retinal focus cues.
  • an "eye-box" is usually defined around the viewer eye pupil when determining the volume of space within which a viewable image is formed (e.g., "eye-box" width 1425).
  • at least two partially overlapping views are preferably projected inside the Eye-Box Angle (EBA) covered by the eye-box at a certain viewing distance 1420.
  • when the display 1405 is intended to be used by multiple viewers (e.g., 1401, 1402, 1403) looking at the display 1405 from different viewing angles, several views of the same 3D content (e.g., virtual object point 1410) are preferably projected to all viewers, covering the whole intended Multi-user Viewing Angle (MVA).
  • if a LF display is positioned at 1 m distance from a single viewer and the eye-box width is set to 10 mm, then the value for EBA would be ~0.6 degrees, and one view of the 3D image content should be produced for each ~0.3 degree angle.
  • the SVA would be -4.3 degrees and around 14 different views would be called for just for a single viewer positioned at the direction of the display normal (if the whole facial area of the viewer is covered). If the display is intended to be used with multiple users, all positioned inside a moderate MVA of 90 degrees, then a total of 300 different views are called for.
  • FIG. 14 illustrates that three different angular ranges should be covered simultaneously by the LF display: one for covering the pupil of a single eye (e.g., EBA), one for covering the two eyes of a single user (e.g., SVA), and one for covering the multiuser case (e.g., MVA).
  • the last two are usually covered in existing systems by using either several light emitting pixels under a lenticular or parallax barrier structure, or by using several projectors with a common screen.
  • These techniques are suitable for the creation of fairly large light emission angles that can be utilized in the creation of multiple views.
  • these systems lack the angular range dedicated to covering the eye pupil, and as a result they are not capable of producing the correct retinal focus cues and are susceptible to the VAC.
  • Functioning of currently available flat-panel-type multiview displays is generally based on spatial multiplexing only.
  • a row or matrix of light emitting pixels (LF sub-pixels) is placed behind a lenticular lens sheet or microlens array and each pixel is projected to a unique view direction in front of the display structure.
  • if a smaller LF pixel size is desired from the 3D display, the size of individual sub-pixels may be reduced, or a smaller number of viewing directions can be generated.
  • a high quality LF display should have both high spatial and angular resolutions in order to provide the user a natural view, and the current flat form-factor displays are limited in this respect.
  • each beam is preferably very well collimated and should have a narrow diameter. Furthermore, the beam waist should ideally be positioned at the same spot where the beams cross, in order to avoid contradicting focus cues for the eye. If the beam diameter is large, the voxel formed at the beam crossing is also imaged onto the eye retina as a large spot. A large divergence value means that the beam becomes wider as the distance between voxel and eye decreases, and the spatial resolution of the virtual focal surface becomes worse just as the eye's resolution is improving due to the closer distance.
  • the achievable light beam collimation is dependent on two geometrical factors: size of the light source and focal length of the lens. Perfect collimation without any beam divergence can only be achieved in the theoretical case in which a single-color point source (PS) is located exactly at focal length distance from an ideal positive lens.
  • the total beam ends up formed from a group of collimated sub-beams that propagate in somewhat different directions after the lens. As the source grows larger, the total beam divergence increases. This geometrical factor cannot be avoided by any optical means, and it is the dominating feature causing beam divergence with relatively large light sources.
  • Another, non-geometrical, feature causing beam divergence is diffraction.
  • the term refers to various phenomena that occur when a wave (of light) encounters an obstacle or a slit. It can be conceptualized as the bending of light around the corners of an aperture into the region of geometrical shadow. Diffraction effects can be found from all imaging systems and they cannot be removed even with a perfect lens design that is able to balance out all optical aberrations. In fact, a lens that is able to reach the highest optical quality is often called "diffraction limited" as most of the blurring remaining in the image comes from diffraction.
  • the size of an extended source has a big effect on the achievable beam divergence.
  • the source geometry or spatial distribution is actually mapped to the angular distribution of the beam and this can be seen in the resulting "far field pattern" of the source-lens system.
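A back-of-envelope comparison of the two divergence contributions discussed above (all values are assumptions chosen for illustration): the geometric term maps source size through the lens focal length, while the diffraction term scales with wavelength over aperture.

```python
import math

source_size = 50e-6    # m, extended source width (assumed)
focal_len = 1e-3       # m, collimator focal length (assumed)
aperture = 0.5e-3      # m, lens aperture diameter (assumed)
wavelength = 550e-9    # m, green light

geometric = 2 * math.atan(source_size / 2 / focal_len)   # rad, full angle
diffraction = 2.44 * wavelength / aperture               # rad, Airy full angle

print(f"geometric  : {math.degrees(geometric):.2f} deg")    # ~2.9 deg
print(f"diffraction: {math.degrees(diffraction):.3f} deg")  # ~0.15 deg
# For sources this large, the geometric term dominates, as stated above.
```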
  • the LF pixel projection lenses may have very small focal lengths in order to achieve the flat structure, and the beams from a single LF pixel are projected to a relatively large viewing distance. This means that the sources are effectively imaged with high magnification when the beams of light propagate to the viewer. For example, if the source size is 50 μm x 50 μm, the projection lens focal length is 1 mm, and the viewing distance is 1 m, the resulting magnification ratio is 1000:1 and the geometric image of the source will be 50 mm x 50 mm in size. This means that the single light emitter can be seen only with one eye inside this 50 mm diameter eye-box. If the source has a diameter of 100 μm, the resulting image would be 100 mm wide and the same pixel could be visible to both eyes simultaneously, as the average distance between eye pupils is only 64 mm. In the latter case the stereoscopic 3D image would not be formed, as both eyes would see the same images.
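The magnification example above, made explicit (same numbers as quoted in the text):

```python
source_size = 0.05          # mm (50 um emitter, as in the text)
focal_length = 1.0          # mm, projection lens focal length
viewing_distance = 1000.0   # mm (1 m)

magnification = viewing_distance / focal_length   # 1000:1
print(source_size * magnification)  # 50.0 mm image: one eye only (< 64 mm IPD)
print(0.1 * magnification)          # a 100 um source -> 100 mm: both eyes see it
```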
  • the example calculation shows how the geometrical parameters like light source size, lens focal length and viewing distance are tied to each other.
  • the spatial resolution achievable with the beams will get worse as the divergence increases.
  • if the beam size at the viewing distance is larger than the size of the eye pupil, the pupil will become the limiting aperture of the whole optical system.
  • μLEDs are LED chips that are manufactured with the same basic techniques and from the same materials as the standard LED chips in use today. However, the μLEDs are miniaturized versions of the commonly available components, and can be made as small as 1 μm - 10 μm in size. A matrix has been manufactured with a density of 2 μm x 2 μm chips assembled with 3 μm pitch. The μLEDs have been used so far as backlight components in TVs, but they are also expected to challenge OLEDs in the μ-display markets.
  • When compared to OLEDs, μLEDs can be more stable components and can reach very high light intensities, making them useful for many applications from head mounted display systems to adaptive car headlamps (LED matrix) and TV backlights. μLEDs can also be seen as a high-potential technology for 3D displays, which use very dense matrices of individually addressable light emitters that can be switched on and off very fast.
  • One bare μLED chip emits a specific color with a spectral width of ~20-30 nm.
  • a white source can be created by coating the chip with a layer of phosphor, which converts the light emitted by blue or UV LEDs into a wider white light emission spectra.
  • a full-color source can also be created by placing separate red, green, and blue LED chips side-by-side, as the combination of these three primary colors creates the sensation of a full color pixel when the separate color emissions are combined by the human visual system.
  • the previously mentioned very dense matrix would allow the manufacturing of self-emitting full-color pixels that have a total width below 10 μm (3 x 3 μm pitch).
  • Light extraction efficiency from the semiconductor chip is one of the parameters that determine electricity-to-light efficiency of LED structures.
  • One approach, as discussed in US7994527, is based on the use of a shaped plastic optical element that is integrated directly on top of an LED chip. Due to lower refractive index difference, integration of the plastic shape extracts more light from the chip material in comparison to a case where the chip is surrounded by air. The plastic shape also directs the light in a way that enhances light extraction from the plastic piece and makes the emission pattern more directional.
  • Systems and methods set forth herein use a flexible or rigid optical layer and a flexible or rigid light emitting layer to create a dense 3D light field display for presenting 3D image content.
  • a propagating wave is generated in one or more of the flexible layers.
  • the distances between layers change locally.
  • a virtual 3D image is formed on one or both sides of the display surface. The varying distance between layers is used for altering locally the image virtual distance and light emission angles.
  • One exemplary structure forms an array of tiny projectors with a microlens sheet on the top and multiple display pixels below. Each microlens and the array of sub-pixels below it form a tiny projector system that acts as one pixel in creation of the LF 3D image.
  • the propagating wave crest and trough bring the display elements near or far from the projector lens. This range of distances may be selected to cover the whole virtual depth range desired for the LF display.
  • Each sub-pixel in the tiny projectors corresponds to a certain viewing angle.
  • exemplary embodiments add the possibility to scan virtual image distances and projected light angles as the distances between the optical layer and light emitting layer in each tiny projector change with the propagating wave. Such methods also add the possibility to provide the correct focus cues to the eyes.
  • Exemplary embodiments provide both the large light emission angles useful for eye convergence and the small emission angles that provide natural eye retinal focus cues. This may be accomplished by scanning the small light emission angles with the help of a propagating wave form in a stack of display layers. The structure may be built into a device that has a flat form-factor that is preferred for consumer use.
  • An exemplary embodiment utilizes a combination of spatial and temporal multiplexing in creation of a dense light field that can be used for displaying 3D content.
  • a micro-optical active component can be used for high-resolution temporal scanning of light rays, enabling the creation of a true dense light field with depth information instead of having just a set of multiple views.
  • Exemplary embodiments use a flexible or rigid optical layer and a flexible or rigid light emitting layer to create a dense 3D light field display for presenting 3D image content.
  • a propagating wave 1550 may be generated in one or more of the flexible layers (e.g., flexible optical layer 1515 or flexible substrate 1530).
  • when one of the layers is rigid (e.g., rigid substrate 1505 or rigid optical layer 1535) and the other has a propagating wave 1550, the distances between the layers change locally (e.g., short distance 1520 and long distance 1522 in FIG. 15A, or long distance 1537 and short distance 1539 in FIG. 15B), as shown in FIGS. 15A-15B.
  • a virtual 3D image is formed on one or both sides of the display surface.
  • the varying distance between layers is used for altering locally the image virtual distance and light emission angles.
  • FIG. 16 is a schematic view of an example structure that has a rigid light emitting layer (rigid substrate 1605 and arrays of light emitting elements 1610) and on top of that a flexible optical layer 1615 with microlenses.
  • the whole structure forms an array of tiny projectors with a microlens sheet on the top and multiple virtual display pixels 1625 below that.
  • Each microlens and the array of sub-pixels below it form a tiny projector system that acts as one pixel in creation of the LF 3D image.
  • the propagating wave crest (with amplitude 1622) and trough bring the display elements near or far from the projector lens.
  • when the light emitting element distance from the optical element is near the focal distance of the optical element, a nearly collimated beam is generated. This corresponds to the case presented in FIG. 3C.
  • An image of the light emitting element positioned at the trough section can be focused between the viewer and display.
  • This range of virtual distances can be designed to cover the whole virtual depth range 1630 used for the LF display.
  • the structure can also be configured to cover multiple viewing directions (e.g., directions 1601, 1602, and 1603).
  • Each sub-pixel in the tiny projectors corresponds to a certain viewing angle.
  • Exemplary embodiments add the possibility to scan virtual image distances and projected light angles as the distances between the optical layer and light emitting layer in each tiny projector change with the propagating wave.
  • the virtual distance of each pixel changes as the microlens structure is closer or further away from the sub-pixels matrix.
  • the pixels can be switched on the basis of image depth content adding the possibility to provide also the correct focus cues to the eyes.
  • the continuous 3D object virtual distances can be presented as a narrow range of ray angles in certain spatial positions on the display. These angle distributions and spatial position pairs generate the dense light field in front of the flat display. Exemplary embodiments are capable of generating both the larger angles useful for eye convergence as shown in FIG. 16 and also the smaller angles useful to provide the correct retinal focus cues.
  • the smaller angles are generated as the wavy foil changes the distance and angle (relative to the rigid substrate 1705) between the optical element 1715 and the light emitting sub-pixels (e.g., array of light emitting elements 1710). This change is illustrated schematically in FIGS. 17A-17B. In FIG. 17B, the optical element 1715 of the tiny projector (one of the array of projectors, each projector comprising, e.g., a microlens 1715 and an associated array of light emitting elements 1710) is tilted by the propagating wave, which alters the Source Angle (SA) and Direction Angle (DA) of the projected beam.
  • Simulation raytraces were prepared for four cases where the distance between a row of pixels and lens is a) 2 x the Back Focal Length (BFL) of the lens, b) between 1 x BFL and 2 x BFL, c) closer than BFL and d) at the BFL.
  • five light emitting pixels were simulated.
  • when the distance was 2 x BFL, the simulated rays focused near the display surface, and when the distance was between 1 x BFL and 2 x BFL, the simulated rays focused between the observer and the display.
  • when the lens distance was less than BFL, the simulated rays diverged and the virtual focus point was behind the display surface.
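A thin-lens check of the four simulated cases above (treating the BFL as the focal length f of an idealized lens; the 1 mm value is an assumption):

```python
f = 1.0  # mm, assumed focal length

def image_distance(d):
    """1/v = 1/f - 1/d; v > 0: real focus in front of the lens,
    v < 0: virtual focus behind the display surface."""
    if d == f:
        return float('inf')          # collimated beam
    return 1.0 / (1.0 / f - 1.0 / d)

for d in (2 * f, 1.5 * f, 0.5 * f, f):
    print(f"d = {d:.1f} mm -> v = {image_distance(d):.1f} mm")
# d = 2f   -> v = 2 mm  (focus near the display surface)
# d = 1.5f -> v = 3 mm  (focus between display and observer)
# d = 0.5f -> v = -1 mm (virtual focus behind the display)
# d = f    -> inf       (collimated)
```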
  • the lens tilting may be used for scanning through a small angular image projection range.
  • Such small tilt angles are adequate for covering the eye pupils of a nearby observer with more than one image, which may fulfill the super-multi-view condition and provide even more realistic focus cues, especially for fast moving 3D image content.
  • the small scan angles can also be used for generation of a very dense field of multiple viewing directions that create simultaneously the stereoscopic effect for multiple viewers positioned at a larger distance.
  • the spatial multiplexing made with a row of sub-pixels may be enhanced with the temporal multiplexing made with the propagating wave foil that tilts the projection angles by tilting the individual lens shapes on top of the sub-pixels.
  • dynamic wave propagation in one direction (e.g., horizontal) on the flexible sheet is generated by moving one or both sheet ends linearly at the amplitude direction of the wave as previously shown in, and discussed in relation to, FIG. 5.
  • the optical element may be, for example, a flexible foil of microlenses or, in the case of a flexible display element, a rigid lenslet panel.
  • piezo-electric actuators may be employed for wave generation.
  • An exemplary light field wave display system may generally be as previously discussed in relation to FIG. 13.
  • electrical conductors or electromagnets along the display width may be used to generate a dynamic propagating wave movement of the flexible sheet with electromagnetic force.
  • a 3D LF smartphone display may be in front of a user at ~500 mm distance from the user's eyes.
  • the interpupillary distance between a person's eyes may be around 64 mm and eye pupil size may be around 7 mm.
  • a person's eye lens focal length may be around 17 mm.
  • the user sees different virtual distances through the device surface as it creates a dense light field with the help of a propagating wave micro-optical foil. When the user sees a virtual distance that is closer to or further away from the user than the actual device distance, their eyes may converge to two different spatial areas on the display surface.
  • the display surface emits light from these two areas to the user's eyes by activating the correct sub-pixels under the microlenses positioned on the flexible foil. This happens when the display surface sends light at the correct angle from that spatial position.
  • for an infinite virtual image distance, the display sends parallel collimated light to the user's eyes from two spatial areas at (or about) the interpupillary distance from each other. At all other virtual distances, the distance between the two display spatial areas is smaller.
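The convergence geometry described above can be sketched as follows (the distances are assumed example values; the similar-triangles relation reproduces the stated limits of zero separation at the display distance and full interpupillary separation at infinity):

```python
import math

ipd = 64.0             # mm, interpupillary distance
display_dist = 500.0   # mm, eyes to display surface (assumed)
voxel_dist = 800.0     # mm, eyes to virtual voxel behind the display (assumed)

# The two emitting areas are where the eye-to-voxel lines cross the
# display plane; each sits (1 - D/V) of the way from eye axis to voxel axis.
area_separation = ipd * (1.0 - display_dist / voxel_dist)
convergence = 2 * math.degrees(math.atan(ipd / 2 / voxel_dist))

print(f"display areas {area_separation:.1f} mm apart")   # 24.0 mm
print(f"convergence angle {convergence:.2f} deg")        # ~4.6 deg
```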
  • Each separate microlens in the flexible lenticular sheet positioned on top of a high-resolution rigid OLED display may act as the objective of a small projector.
  • the initial distance between lens and sub-pixels may be close to the objective focal length distance of each microlens. Undulating movement of the flexible microlens sheet modulates this distance.
  • the small projector on the display surface sends light to the observer's eye.
  • the small projector spatial position is dependent on voxel distance. If a larger area on the display is to be used for the 3D image generation, the neighboring projectors are also illuminated. Different angles are used from different spatial positions to reach the observer's eyes. The angles are calculated and pixels activated according to the 3D image content in order to create the whole high-density light field in front of the display.
  • Table 1 shows simulated values for angular and spatial areas where the display surface sends light in order to present the correct voxels at four different virtual distances.
  • the eye accommodation angles are inside the convergence angles.
  • One of the tiny projector sub-pixels is lit in order to project the light to the right convergence angle from the projector objective.
  • the rough value of convergence angle is set by selecting the right sub-pixel.
  • the flexible layer may move as a dynamic wave and sweep the more dense angular values around the rough convergence angle values (e.g., super resolution).
  • the sub-pixels are lit only when the angular sweep is at the correct viewing direction.
  • the small projector can sweep ±1° angles for a voxel that is at 300 mm distance from the two eyes. Larger sweeps may be used for a larger eye-box, which may assist in device usability.
  • the dynamic wave amplitude in the stack of optical layers may be used for generating distance variation between light emitting elements and the optical layer. This distance variation is converted to angles by the small projector lenses, if the pixel or lens is riding the wave and other layers are flat.
  • the pixel X-Y-position is also converted to a rough set of angles in user space. If the user looks at the display through the small projector objectives, the pixel spatial position may seem to move together with the lens, when the wave phase is changing.
  • the propagating wave may create continuous angular sweeps or narrow changes in angles more accurately than a simple sub-pixel structure.
  • the wave propagating movement may propagate such that every slope angle, crest, and trough of the wave is sweeping through the whole small projector width.
  • a projector pixel is lit when the distance change sweeps angles within the defined eye-box according to virtual distance information.
  • a sub-pixel is lit during the wave phase change when the ray angle hits one of the eye-boxes.
  • the rays sweep a set of angles during the period the pixel is turned on. These angular sweeps are generated also from neighboring small projectors on larger source areas.
  • the angular and spatial sweeps on the display surface should happen during a time interval that is adequate for the observer's persistence-of-vision effect.
  • angles from the display surface to be used for eye convergence are larger at closer distances when compared to the zero angles for an infinity virtual distance.
  • the user's interpupillary distance and display distance determine the appropriate convergence angles. Angles used for accommodation depend on the eye pupil diameter at the same display distance. These accommodation angles are much smaller than the convergence angles, and their range falls within the convergence angles.
  • An arc with a center on the voxel at the virtual distance can be drawn from one eye to the other.
  • the arc normal represents the angle of light emission that is called for from the 3D LF display.
  • the display device may generate the correct convergence and accommodation angles in order to provide a realistic 3D experience for the user.
  • the display surface comprises an array of small projectors, each with an objective lens and an array of sub-pixels behind it.
  • the display device may generate virtual 3D distances by switching on and off the light emitting elements of the small projectors in synchronization to the 3D content.
  • the 3D angles are generated on a 2D display surface in order to provide the user's eyes with virtual distance cues.
  • Two projectors' spatial positions on the display surface create the right convergence angle for the two user eyes. The user looks at the 3D object "through" the display surface, and continuous movement or structure in 3D content can be perceived.
  • 3D content may be synthesized or recorded by two cameras at interpupillary distance from each other to obtain a realistic depth experience. Near and far object points on two camera images fall spatially on different positions on the camera sensors. Infinity distance images are similar in both cameras. Near objects are decentered more from each other in camera images. This separation represents the 3D virtual distance. Also, camera lens focus information can be used for obtaining the 3D virtual distance if multiple focus distances are recorded. In recent years, light field cameras (e.g., Lytro) have also emerged, which are based on a microlens array between camera lens and sensor and capable of recording multiple focus distances simultaneously. This focus data may be used for user accommodation distances in the exemplary display device.
  • a handheld device there may preferably be some tolerance for the display viewing distance, eye X-Y-movement and tilt. If the small projectors on the LF 3D display aim their light rays directly to the user's eyes, a narrow range of light field angles is adequate, but apparatus usability may be limited. A somewhat wider angle distribution for close-range virtual distances and spatial areas for far distances on the display surface may result in better user comfort.
  • the box-shaped 3D volume called the "eye-box" around the user's eye pupil can be used for the 3D light field angle rendering. If the voxel is at the display's physical distance, only large angles are swept from one small projector. At further distances, the light should be emitted from two spatial areas on the display surface. At an infinite virtual distance, the area on the display surface may be the same as the eye-box width, and collimated light is sent towards the viewer.
  • the wave movement may be generated and flexible sheet length controlled only at the vertical edges of the display device, if the observer's eyes are at horizontal plane.
  • the small mechanical movement may be generated, for example, with piezoelectric actuators positioned at the frame of the device, such as on the vertical edges of the display frame.
  • the wave amplitude may also be controlled by rolling or pulling the sheet from the vertical edges.
  • the display 1800 may include a plurality of projection cells 1802.
  • Each projection cell 1802 includes a set of controllable light-emitting sub-pixels 1810 and a microlens 1815.
  • Each set of light- emitting sub-pixels 1810 may be arranged in a two-dimensional pattern of sub-pixels (e.g. a 128x128 pixel array, among other possibilities), and each sub-pixel may be capable of displaying a full gamut of different colors.
  • the projection cells 1802 may also be arranged in a two-dimensional array, for example in a square or hexagonal array. In the example of FIG. 18A, the microlenses 1815 are supported by (or are integral with) a membrane 1817 that is capable of serving as the medium of travel of a propagating wave.
  • In its rest state (without a traveling wave), the membrane 1817 and microlenses 1815 (collectively the microlens array) may be supported at a predetermined distance from the sub-pixels (e.g., a distance of one focal length) by a plurality of resilient supports 1821.
  • a piezoelectric actuator 1825 (or other linear or rotational actuator) and an appropriate power source 1827 are provided to generate a propagating wave in the microlens array.
  • One or more actuators 1825 may be positioned along one or more edges of the display 1800.
  • a plurality of actuators 1825 may be distributed throughout the display 1800.
  • each projection cell 1802 may be provided with a corresponding actuator 1825.
  • the resilient supports 1821 may include a position sensor used to determine the position of each microlens 1815 relative to the corresponding sub-pixel array 1810.
  • other types of position sensors may be used, and/or microlens position sensing may not be used in cases where lens position can be calculated based on the input to the driving actuator (using, for example, the appropriate traveling wave equation).
  • FIG. 18A illustrates a display 1800 in a rest state (with no traveling wave)
  • FIG. 18B illustrates a portion of a similar display 1800 at a frozen moment in time during passage of a traveling wave across the projection cells 1802.
  • the passage of the traveling wave causes the distance between each microlens 1815 and its respective set of sub-pixels 1810 to change as a function of time.
  • the illumination of sub-pixels in each projection cell 1802 is performed according to the distance of the microlens 1815 from the sub-pixels 1810.
  • An example of one technique of determining when to illuminate particular sub-pixels is provided with reference to FIG. 18C.
  • the time-varying distance of a microlens 1815 from its respective sub-pixel array 1810 may be represented by d(t).
  • the focal length of the microlens may be represented by f.
  • the sub-pixel to be illuminated for a voxel with position x and z (relative to the projector cell) is the sub-pixel that is at (or that best approximates) the position x₀ such that the following condition is met.
  • This pixel is illuminated when the following condition is met.
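The two conditions referenced above are not reproduced in this text. A plausible reconstruction from the thin-lens relation is sketched below; it is an interpretation under a simple paraxial model, not the patent's own equations:

```python
# Hypothetical reconstruction (assumed model, not the patent's equations).
# For a voxel at (x, z) behind the display and a microlens at time-varying
# distance d(t) from the sub-pixel plane, with lens focal length f:
#
#   depth condition : 1/d(t) - 1/f = 1/z   (virtual image lands at depth z)
#   lateral offset  : x0 = x * d(t) / z    (virtual-image magnification z/d)

def subpixel_offset(x, z, d):
    """Lateral sub-pixel position whose virtual image appears at offset x."""
    return x * d / z

def depth_matches(d, f, z, tol=1e-3):
    """True when d(t) places the virtual image of the sub-pixel plane
    at voxel depth z; illuminate the chosen sub-pixel at these moments."""
    return abs((1.0 / d - 1.0 / f) - 1.0 / z) < tol
```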
  • FIG. 18D is an example ray-tracing diagram illustrating the generation of a virtual image when the conditions discussed in relation to FIG. 18C are met. Specifically, under these conditions, for a focal point 1813 the virtual image 1831 of the illuminated sub-pixel 1811 substantially corresponds to the position of the virtual voxel 1830. It may be noted that other projection cells 1802 in addition to the one illustrated in FIG. 18D may operate (and generally do operate) to display the same virtual voxel. However, the values of the horizontal offset x₀ and/or of a vertical offset y₀ will be different for different projector cells.
  • FIGS. 18E and 18F illustrate a situation in which the same voxel is displayed by different projector cells at different times. (It should be noted, however, that in exemplary embodiments, the turning on and off of different subpixels occurs sufficiently rapidly that, due to persistence of vision effects, the voxel may appear to be displayed using multiple projector cells simultaneously.)
  • the microlens array is in a position such that one of the sub-pixels 1811 of the projector cell on the right is illuminated to generate a virtual image 1831 at the desired voxel position. No subpixel 1812 is illuminated in the projector cell on the left because none of the resulting virtual images 1832 would correspond to the position of the desired voxel.
  • the microlens in the right-side projector cell is no longer in a position to accurately reproduce the desired voxel (would result in resulting virtual images 1832 out of position), so no subpixel 1812 of that projector cell is illuminated.
  • the microlens in the left-side projector cell is now in a position to reproduce the desired voxel (the same voxel as in FIG. 18E), and the appropriate pixel 1811 in the left-side projector cell is illuminated.
  • Exemplary embodiments described above use a traveling wave in the microlens array to periodically alter the configuration of the microlens with respect to the corresponding sub-pixels.
  • other techniques are used to alter the configuration of the microlens with respect to the sub- pixels.
  • the distance between each microlens and its corresponding set of sub-pixels may be adjusted with a piezoelectric actuator, microelectromechanical (MEMS) actuator, linear actuator, magnetic actuator, or other actuator.
  • Such an actuator may operate on a cell-by-cell basis or on a group of projection cells.
  • An undulating diffractive foil may be used in some embodiments of an optical method for and basic construction of an enhanced multi-view 3D light field display.
  • the performance of a multi-view display (such as previously discussed based on lenticular sheets and dense pixel matrices) may be enhanced by introducing a flexible diffractive foil manipulated by a propagating wave into the structure (similar to the flexible layers discussed above).
  • As the wave propagates in the grating foil, the angle of incidence between the grating and the light emitted from a pixel changes constantly. As the angle changes, the diffraction orders also change their propagation direction slightly. This small change in propagation angle is used for additional temporal multiplexing of view directions.
  • the propagating wave allows sweeping of spatially multiplexed view directions through small angles, and by synchronizing the wave movement to the activation of the pixel matrix a much denser LF display is created with a higher-quality 3D picture.
  • a grating film bends light rays that are going through.
  • the diffractive grating orders 1 and -1 bend the light rays symmetrically to two directions.
  • the zeroth order goes directly through the grating and may be obscured if not needed.
  • the bending angle depends on the grating period and the light wavelength. If there is a tilted grating in the ray path, the incoming light sees an effectively tighter grating period than when the grating is not tilted.
  • a tilted grating bends rays more than a non-tilted one.
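The grating-equation behavior described above can be sketched as follows (wavelength and period are assumed values; the foil tilt is modeled as the incidence angle at the grating, so small tilts shift the order-1 exit direction slightly):

```python
import math

wavelength = 550e-9   # m, green light (assumed)
period = 2e-6         # m, grating period (assumed)

def order1_deviation(tilt_deg):
    """Grating equation d*(sin(theta_out) - sin(theta_in)) = m*lambda,
    with theta_in equal to the foil tilt and m = 1; returns the exit
    deviation from the incoming ray, in degrees."""
    tilt = math.radians(tilt_deg)
    s = math.sin(tilt) + wavelength / period
    return math.degrees(math.asin(s)) - tilt_deg

for tilt in (0.0, 2.0, 5.0):
    print(f"tilt {tilt:>4.1f} deg -> deviation {order1_deviation(tilt):.2f} deg")
# A few degrees of foil tilt change the deviation by only tenths of a
# degree: the small angular sweep used for temporal multiplexing.
```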
  • An exemplary LF display using a wavy diffractive foil includes an array of small projector cells.
  • the light in a single projector cell is emitted from a pixelated layer and a microlens collimates the emitted light into a set of beams that exit the lens aperture at different propagation directions.
  • the beam directions create the stereoscopic 3D effect when unique views of the same 3D image are projected to the different directions by modulating the sub-pixels according to the image content.
  • This projector cell functionality is similar to previously discussed approaches for flat form-factor autostereoscopic displays based on lenticular sheets.
  • the next layer in the present projector cell structure is a grating foil that alters the propagation direction of the emitted beams by diffraction.
  • An additional prism structure is positioned after the diffraction grating (or grating foil) in order to make another alteration to the beam propagation direction, compensating for angular tilt made by the grating.
  • Some embodiments may operate without the prism layer, but use of the prism layer may improve a central view parallel to the display surface normal and other view directions can be positioned symmetrically around it.
  • the display device may be either a multi-view display with a very dense grid of angular views or a true light field display with multiple views and focal surfaces.
  • Amplitudes for the propagating waveform can be kept below 1 mm even in fairly large-scale displays.
  • the diffractive optical foil may have flat surfaces and even thickness, which may be beneficial features for a dynamic undulating component in the display optical stack.
  • the flat sheet may be more robust and less susceptible to wearing.
  • the principle can be scaled by use case or designed in a product for different LF display view angles, voxel distance range, and resolution.
  • Embodiments using a diffraction grating may be applied to hardware constructions that are found in previously discussed 3D multi-view displays, such as utilizing lenticular sheets or other integral imaging approaches. Such construction may be beneficial for the reliability, setup, and calibration of the whole system, as very few components may be fitted together in some embodiments.
  • Activation of the propagating wave may employ additional actuators and control electronics as well as alteration of the rendering scheme, but these may be added to the structures, electronics, and rendering functions of existing hardware.
  • An exemplary embodiment provides an optical method and construction of an enhanced multi- view 3D light field display.
  • the performance of a multi-view display based on lenticular sheets and dense pixel matrices may be enhanced by introducing a flexible diffractive foil with a propagating wave into the structure.
  • the angle of incidence between the grating and the light emitted from a pixel changes constantly.
  • the diffraction orders change their propagation direction slightly.
  • This small change in propagation angle is used in exemplary embodiments for additional temporal multiplexing of view directions.
  • the propagating wave allows sweeping of spatially multiplexed view directions through small angles and, by synchronizing the wave movement to the activation of the pixel matrix, a much denser multi-view display is created with a higher quality 3D picture.
  • FIG. 19A shows the structure of a single projector cell 1900 that forms one basic unit of a whole multi-view LF display.
  • the light is emitted from a pixelated Light Emitting Layer (LEL) 1911 (which may comprise a substrate and a matrix of light emitting elements), and a collimating lens creates a set of beams that exit the cell at different propagation directions.
  • the beam directions create a stereoscopic 3D effect when unique views of the same 3D image are projected to the different directions by modulating the pixels according to the image content. If only two pixels are used, the result is a stereoscopic image for a single user standing in the middle of the Field-Of-View (FOV): the image from the right half of the LF pixels enters the left eye, and the left half pixels are visible only to the right eye.
  • the result is a set of unique views spread across the FOV and multiple users can see the stereoscopic images at different positions inside the predefined viewing zone.
  • This effectively generates a multi-view light field for a 3D scene: each viewer has his/her own stereoscopic view of the same 3D content, and perception of a three-dimensional image is generated. As the viewer moves around the display, the image changes for each new viewing angle.
  • This first part of the projector cell functionality is identical to the method used in current flat form-factor autostereoscopic displays based on e.g. lenticular sheets.
  • a prism structure 1930 positioned after the diffraction grating 1920 may make another alteration to the beam propagation direction, compensating for the angular tilt made by the grating 1920.
  • the system may also operate without the prism 1930, but may assist in having the central view parallel to the display surface normal and for other view directions to be positioned symmetrically around it.
  • FIG. 19B illustrates the operation of an exemplary method, which utilizes a propagating wave motion that is introduced to the grating foil.
  • Four projector cells 1952, 1954, 1956, 1958 (each as in FIG. 19A) are emphasized, representing four different wave phases. Projector cells are separated by the baffles 1907 of a baffle array 1908.
  • the different phases of the wave tilt the projection directions differently, making the views change direction slightly.
  • the result is an angular sweep of each view direction through a small angle.
  • the length of this sweep may be designed to be the same as the angular spacing between two adjacent views generated with different pixels. If the pixels are modulated in synchronization to the sweep angle, a dense set of different views can be generated in between the main directions, determined by the pixel positions and projector cell lens.
  • the projector cells 1954 and 1958 are generating views centered on the surface normal direction of the display. This is because the grating foil wave is at the trough and crest of the wave amplitude, making the incident angles close to the grating surface normal. As the prism structure 1932 compensates for the grating tilt, the projected views become symmetrical around the central direction.
  • in projector cell 1952, the grating 1920 is tilted in a counter-clockwise direction, altering the view propagation more to the clockwise direction. The same tilt direction, but with a somewhat larger angle, is introduced to the propagation directions in cell 1956, where the grating 1920 is tilted in a clockwise direction with respect to the display normal.
  • the angular sweeps go back and forth between one position, determined by the wave trough and crest, and two positions, determined by the grating rotating to the left and right, during the propagation of one full waveform of the grating foil 1920 across the projector cell aperture.
  • the wavelength and amplitude used for a specific sweep angle range can be determined from the previously mentioned grating equation.
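Connecting the foil wave parameters to a sweep range, under the assumption of a sinusoidal foil profile (the amplitude and surface wavelength below are illustrative):

```python
import math

# For y = A * sin(2*pi*x / L), the maximum local slope is 2*pi*A / L,
# which bounds the grating incidence angles and hence the view sweep.
A = 0.2e-3   # m, foil wave amplitude (below 1 mm, as stated above)
L = 10e-3    # m, foil surface wavelength (assumed)

max_tilt = math.degrees(math.atan(2 * math.pi * A / L))
print(f"max grating tilt ~ {max_tilt:.1f} deg")   # ~7.2 deg
# Feeding this tilt into the grating-equation sketch above gives the
# angular sweep of each view over one full waveform.
```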
  • FIG. 20 shows a schematic presentation of a whole display structure 2000.
  • the light is emitted from a pixelated layer that can be, for example, a LED matrix, OLED display, or LCD display with backlight.
  • a matrix of baffles 2008 in the form of, for example, a punctured sheet may be placed on top of the light emitting layer 2011, optically isolating the projector cells from each other.
  • Light collimation optics 2016 is placed on top of the baffles, and may be, for example, a microlens/lenticular lens sheet or a foil with diffractive structures.
  • Actuators 2025 for controlling the linear (and/or in some embodiments angular) motion for generating the propagating wave motion to the grating foil 2020 may be placed at the frame 2030 of the whole display 2000. Wave amplitude below 1 mm may be sufficient for generating the desired angular sweep ranges even in fairly large displays if the grating period and projector cell size are small enough.
  • Microprisms 2032 that make the final adjustment of the view directions can be, for example, integrated in a display protective front window as an extruded groove structure or made as diffractive elements with microstructures.
  • the wavy grating foil 2020 is positioned in between the collimating lenses 2016 and angle adjusting prisms 2032.
• Material of the grating film 2020 may be, for example, polyester, and may have a thickness of ~0.1 mm. Such foils are manufactured by embossing or holographic methods and are readily available in the form of large rolls.
  • a blazed grating structure may be used in some embodiments, as it allows the use of, for example, diffraction order 1 for the beam tilting, and other diffraction orders naturally present in gratings can be attenuated.
  • the other diffraction orders (especially the 0th order) may cause cross-talk or lowered image contrast if not attenuated properly either by the grating design or by an additional baffle structure positioned after the grating foil 2020.
• Optical structures can be one-dimensional (e.g., cylindrical lenses) if only horizontal views are needed, or two-dimensional (e.g., rotationally symmetric microlenses) if views are desired in both directions. In the latter case, two orthogonal diffractive wavy foils can be used for the two-dimensional angular scan.
• the movements may be synchronized, taking the sheet length into account, in order to avoid a standing wave.
  • Exemplary structures may generally be the same as previously discussed in relation to FIG. 13, where continuous oscillation and correct synchronization may cause a propagating wave to travel through the display width.
  • piezo-electric actuators may be used to generate small wave amplitudes sufficient for wave generation, when projector array optical components are small.
• instead of motors, there may be electrical conductors or electromagnets along the display width that generate the dynamic propagating wave movement in the wavy diffractive foil, with force based on electric and/or magnetic fields.
  • the conductors may be integrated between projector cells in the wavy grating film.
  • the grating foil and microprism elements may be built on top of existing display structures.
  • the light emitting layer preferably has faster refresh rates than those used by some existing multi-view displays.
• a display using an LED matrix based on currently available components or μLEDs may be an example of a light emitting structure capable of very fast switching speeds sufficient for these embodiments.
  • FIG. 21 shows an exemplary structure of a single projector cell 2100. The optical paths before and after the wavy grating 2120 may be offset.
• Transmission of light from the LEL 2110 and collimating MLA1 (2115) through the wavy grating 2120 into diffraction order 1 is about 30% for a single color band for a standard grating design.
• This light may pass through a prism 2130 after the wavy grating foil 2120, as previously discussed. All other orders are blocked (such as with the order block 2137) at the physical aperture 2135 placed after the grating 2120.
  • relay optics are added, which focus the different diffraction orders to the aperture 2135 and then re-collimate the beams from diffraction order 1 that are allowed through the structure.
  • Focal length difference in relay optics lens elements MLA2 (2132) and MLA3 (2145) enables scaling of the LF display viewing angle and voxel depth range.
  • Crosstalk baffles 2140 may separate projector cells from one another.
  • FIG. 22 shows the structure of a single projector cell 2200 that uses a Prism-Grating-Prism (PGP) structure.
• Vision correction and HUD display with LF display. The use of an active mask (e.g., an LCD display) as the blocking aperture may allow lower tolerances for the blocking aperture alignment.
  • the LF display can show the display screen at a user-controlled distance, which may be independent of device distance.
  • the LF display may be configured to render the furthest voxel layer distance to provide presbyopia correction, such as for an older viewer.
  • a mobile phone with a LF display on a car dashboard may work as a HUD display via windscreen reflection, and a driver may not have to focus their eyes to a near distance when looking at the display.
• LF display as camera: active tracking of the observer eye direction. Different viewer locations can be detected, for example, by an active near-infrared (NIR) camera system detecting the viewer direction from an LF display device. NIR wavelength light (e.g., 840 nm) is reflected from the human eye retina, and viewer locations can be detected on the basis of this reflection. For example, if every second cell in the LF display is used as a low-resolution camera instead of a light emitting pixel, the display structure itself may track viewer locations and where on the display the viewer is converging or even focusing their eyes. This may substantially reduce the amount of data processing needed for a true LF display, as only the actual viewer eye locations and focal surfaces then need to be considered in 3D image rendering.
  • the undulating or wavy diffraction grating may be replaced with an alternative optical element, such as a stretchable soft grating, an ultrasonic grating, a moving pixel layer, or shaking of one lenslet layer.
  • there is a method comprising: providing a light-emitting layer having a plurality of sub-pixels; providing a microlens array over the light-emitting layer, the microlens array comprising a plurality of lenses, each lens corresponding to a subset of the sub-pixels; generating a traveling wave in at least one of the light-emitting layer and the microlens array to generate an oscillation in the distance between each microlens and the light-emitting layer; and illuminating selected sub-pixels in synchrony with the distance between each microlens and the light-emitting layer to generate a 3D image.
  • the method may include wherein the traveling wave is generated in the microlens array, or generated in the light-emitting layer.
  • the method may further comprise selecting a voxel location, wherein illuminating selected sub-pixels in synchrony with the distance comprises: selecting a plurality of sub-pixels such that, for at least one corresponding microlens distance, illumination of a selected sub-pixel generates an image of the selected sub-pixel substantially at the selected voxel location; and illuminating each of the selected sub-pixels at a time when the corresponding microlens is substantially at the corresponding microlens distance.
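• As a rough sketch of the synchronization logic in this method (a minimal thin-lens model; the function names, sampling scheme, and all numbers are illustrative assumptions, not the claimed implementation), the oscillating microlens-to-layer gap can be sampled over one period and a sub-pixel fired at the instant when its image lands nearest the selected voxel depth:

```python
import math

def image_distance_mm(f_mm, object_mm):
    """Thin lens: 1/f = 1/a + 1/b  ->  b = 1/(1/f - 1/a)."""
    if abs(object_mm - f_mm) < 1e-9:
        return math.inf          # source at the focal plane -> collimated beam
    return 1.0 / (1.0 / f_mm - 1.0 / object_mm)

def fire_time_for_depth(f_mm, rest_gap_mm, amp_mm, freq_hz, voxel_mm, n=2000):
    """Scan one oscillation period of the lens-to-layer gap and return the
    instant at which the sub-pixel image lands closest to the voxel depth."""
    best_t, best_err = None, math.inf
    for k in range(n):
        t = k / (n * freq_hz)                        # samples one full period
        gap = rest_gap_mm + amp_mm * math.sin(2 * math.pi * freq_hz * t)
        b = image_distance_mm(f_mm, gap)
        if b > 0 and abs(b - voxel_mm) < best_err:   # real image in front of lens
            best_t, best_err = t, abs(b - voxel_mm)
    return best_t, best_err

# Example: f = 1 mm lens, rest gap 1.05 mm, +/-0.03 mm oscillation at 200 Hz,
# voxel requested 40 mm in front of the display.
print(fire_time_for_depth(1.0, 1.05, 0.03, 200, 40.0))
```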
  • a method comprising: providing a light-emitting layer having a plurality of sub-pixels; providing a collimating microlens array over the light-emitting layer, the microlens array comprising a plurality of collimating microlenses; providing a diffractive layer over the collimating microlens array; generating a traveling wave in the diffractive layer to generate an oscillation in the orientation of the diffractive layer over each of the collimating microlenses; and illuminating selected sub- pixels in synchrony with the orientation of the diffractive layer over each of the collimating microlenses to generate a 3D image.
  • the method may further comprise blocking a zeroth-order transmission emitted from the diffractive layer.
  • the method may include wherein the 3D image is generated using first-order emissions from the diffractive layer.
  • a display apparatus comprising: a light-emitting layer having a plurality of sub-pixels; a microlens array mounted over the light-emitting layer, the microlens array comprising a plurality of lenses, each lens corresponding to a subset of the sub-pixels; and at least one actuator operative to generate a traveling wave in at least one of the light-emitting layer and the microlens array to generate an oscillation in the distance between each microlens and the light-emitting layer.
  • the display apparatus may include wherein the actuator is operative to generate the traveling wave in the microlens array, or to generate the traveling wave in the light-emitting layer.
  • the display apparatus may further comprise control circuitry operative to illuminate selected sub-pixels in synchrony with the distance between each microlens and the light-emitting layer to generate a 3D image.
  • the display apparatus may include wherein the control circuitry comprises a processor and a non-transitory computer storage medium storing instructions operative to perform functions comprising: selecting a plurality of sub-pixels such that, for at least one corresponding microlens distance, illumination of a selected sub-pixel generates an image of the selected sub-pixel substantially at the selected voxel location; and illuminating each of the selected sub-pixels at a time when the corresponding microlens is substantially at the corresponding microlens distance.
  • a display apparatus comprising: a light-emitting layer having a plurality of sub-pixels; a collimating microlens array over the light-emitting layer, the microlens array comprising a plurality of collimating microlenses; a diffractive layer over the collimating microlens array; and at least one actuator operative to generate a traveling wave in the diffractive layer to generate an oscillation in the orientation of the diffractive layer over each of the collimating microlenses.
  • the display apparatus may further comprise control circuitry operative to illuminate selected sub-pixels in synchrony with the orientation of the diffractive layer over each of the collimating microlenses to generate a 3D image.
  • the display apparatus may include wherein the 3D image is generated using first-order emissions from the diffractive layer.
  • the display apparatus may further comprise an element for blocking a zeroth-order transmission emitted from the diffractive layer.
  • the display apparatus may include wherein the element for blocking a zeroth-order transmission is configured to transmit a first-order transmission emitted from the diffractive layer.
  • an exemplary structure may scan the small light emission angles through use of tilting refractive plates.
  • Such systems and methods utilize a combination of spatial and temporal multiplexing in the creation of a dense light field that can be used for displaying 3D content.
  • the properties of a more traditional autostereoscopic multiview display are extended by introducing an active optical component to the structure that can be used for high-resolution temporal scanning of light rays, enabling the creation of a true dense light field with depth information instead of having just a set of multiple views.
• Tilting refractive plates may be easier to manufacture than some other active optical components.
  • a refractive plate sheet has flat surfaces and small thickness, which may be beneficial features for any component in the display optical stack.
• a flat sheet is robust and less prone to wear.
• because the associated functionality is based on refraction and not on diffraction, there may be reduced need for specialized optical component manufacturing.
  • tilting plates as described below may be placed in between the light emitting layer and the collimating lens, which may not be an available option with some other approaches, and may consequently result in more compact structures. Additionally, there may be minimal cross-talk between successive projected views as there are no optical structures in the tilting plates which may cause light leakage from one view to another.
  • a display with tilting plates can use similar hardware constructions that are found for previously discussed 3D multiview displays utilizing lenticular sheets or other integral imaging approaches. Activation and modulation of the tilting plates makes use of additional actuators and control electronics as well as alteration of standard rendering schemes. In exemplary embodiments, these components are added to the structures, electronics and rendering functions of existing hardware. Different embodiments may be adapted for different LF display view angles, voxel distance range and resolution.
  • FIG. 23A depicts a structural overview of a single projector cell (or LF pixel) 2302 that is one basic unit of a whole LF display 2300 utilizing tilting refractive plates 2320.
  • Light is emitted from a pixelated LEL 2310 and a microlens 2315 collimates the emitted light into a set of beams that exit a lens (cell boundary) aperture 2340 at different propagation directions.
  • Unique views of the same 3D image are projected to the different directions by modulating the pixels according to the image content. These unique views sent via various beam directions create a stereoscopic 3D effect.
  • the result is a stereoscopic image for a single user standing in the middle of a Field-Of-View (FOV).
• the image from the right half of the LF pixels enters the left eye, and the left half pixels are visible only to the right eye.
  • the result is a set of unique views spread across the FOV and multiple users can see the stereoscopic images at different positions within the predefined viewing zone.
  • This is a multiview light field for a 3D scene, and each viewer has their own stereoscopic view of the same 3D content and natural perception of a three-dimensional image is provided. As a viewer moves around the display, the observed image is changed at each new viewing angle.
  • This projector cell functionality is analogous to operation of previously discussed flat form- factor autostereoscopic displays based on, for example, lenticular sheets.
  • a tilting refractive plate 2320 is placed between the LEL 2310 and microlens 2315.
• when the plate 2320 is parallel to the LEL 2310, the optical path of emitted light beams is not altered, but when the plate 2320 is tilted, the optical path is bent inside the plate. Bending of the light path occurs as the light rays are refracted at the first interface between air and plate material. This angular shift is compensated when the light exits the plate from the other side, where the rays are refracted again with the same magnitude of angular shift but in the opposite direction.
  • the plate is flat, so it will not have any optical power and it will cause only a minor effect on beam focus.
  • a small lateral shift (also called parallel shift in optics) between the beam paths before and after the tilting plate may be introduced, and this shift may cause the beams exiting the projector cell to have slightly shifted propagation directions. From the point-of-view of the projector microlens 2315, it may appear as if the light emitting pixel 2310 position is shifting together with the tilting of the plate 2320.
• the amount of the pixel's apparent positional shift (and, together with it, the amount of propagation-angle change introduced) is related to three parameters of the tilting plate 2320: 1) tilt angle, 2) material refractive index, and 3) thickness.
  • the systems set forth herein may be highly tunable, based on the selection of materials with different refractive indexes and thicknesses during manufacturing.
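• As a rough numerical illustration of this three-parameter dependence (a minimal sketch based on Snell's law for a plane-parallel plate; the function names and example values are illustrative assumptions):

```python
import math

def lateral_shift_mm(t_mm, n, tilt_deg):
    """Parallel (lateral) shift of a ray passing through a flat plate of
    thickness t and refractive index n, tilted tilt_deg from normal."""
    ti = math.radians(tilt_deg)
    tr = math.asin(math.sin(ti) / n)      # refraction angle inside the plate
    return t_mm * math.sin(ti - tr) / math.cos(tr)

def beam_angle_change_deg(shift_mm, focal_mm):
    """An apparent pixel shift d under a collimating lens of focal length f
    changes the projected beam direction by roughly atan(d/f)."""
    return math.degrees(math.atan(shift_mm / focal_mm))

# Example: a 0.5 mm thick PMMA-like plate (n ~ 1.49) tilted by 10 degrees,
# under a collimating microlens with 1 mm focal length.
d = lateral_shift_mm(0.5, 1.49, 10.0)     # ~0.03 mm apparent pixel shift
print(beam_angle_change_deg(d, 1.0))      # ~1.7 degree direction change
```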
  • FIG. 23B depicts a schematic presentation of an exemplary structure 2300 (comprising a plurality of projector cells 2302) for sweeping through beam scanning angles, in accordance with an embodiment.
  • the structure may comprise an array of tilting flat plates 2320 and an array of microlenses 2315 that together with the light emitting layer 2310 form a full light field display.
• the plates 2320 are optically clear and may be reasonably light weight. They can be made from, for example, standard plastic optics materials like PMMA or polycarbonate, or glass materials like float, crown, flint, or fused silica.
  • a bending plate may degrade the beam collimation level, which would lower the quality of the generated light field.
• the plate thickness is at least as great as the plate diameter divided by 6.
  • the tolerances for a refractive plate are less demanding when compared to a mirror, and somewhat thinner plates could be utilized without sacrificing too much optical quality.
  • a rotating plate array structure may be made by connecting the plates together at their edges (such as at connections 2322).
  • the connection may comprise, for example, a soft material like silicone, rubber, nylon, etc.
  • the connections 2322 between the plates 2320 allow modulation of the plate array as a single sheet.
  • the rotating/tilting motion of each plate 2320 can be introduced to all plates at the same time by introducing linear motion with the appropriate synchronization to the edges of the array or to just a few contact points along the sheet.
  • the connecting structures between plates act like springs and they may be hidden in the space between two projector cells.
• in FIG. 23B, two projector cells 2304 and 2306 are emphasized, representing two extreme plate tilt angles.
  • the beams exiting projector cell 2304 sweep in the counter-clockwise direction.
• as the plate in cell 2306 rotates in the counter-clockwise direction, the projected beams sweep in the clockwise direction.
• a small angular range, symmetric with respect to the whole display normal, is scanned with the exiting beams.
• the angular sweep range can be designed to complement the angular spacing between two adjacent views generated with the different pixels. As the pixels are modulated in synchronization with the sweep angle, a dense set of views can be generated in between the main directions.
  • FIG. 24A depicts an overview of various standing wave states of an exemplary tilting plate array 2425, in accordance with an embodiment.
  • the array 2425 comprises tilting plates 2420 which are connected to each other with a flexible material 2422, and the whole array 2425 can be treated as one sheet.
• as the refractive plates 2420 rotate back-and-forth between two maximum tilt angles, the total plate array 2425 forms a dynamic shape that may resemble a standing wave, as shown in FIG. 24B.
  • the connection structures 2422 between plates 2420 function as the antinodes of the standing wave.
  • the nodes of the wave are positioned at the center of the transparent plates 2420, and they can be tilted around a virtual axis without introducing a shift in the distance between the plate 2420 and LEL 2410.
  • the optical raytrace pictures show that when the standing wave is at phase 0, the plates 2420 are parallel to the LEL 2410 surface and no lateral shift is introduced to the optical paths.
• when the standing wave is at phase 1 (with the flexible material 2422 stretched), the plates 2420 are tilted and the pixels appear to be slightly shifted from their original positions.
  • This virtual shift causes the beams exiting the projector cells (at apertures 2440 in the cell boundary 2442) to have a slight angular shift as well.
  • This structure can sweep separate beams through small angles with the continuous movement of the standing wave.
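• A minimal sketch of the resulting time-varying tilt (assuming a sinusoidal standing wave with plate centers at the nodes, per the arrangement above; all names and numbers are illustrative assumptions):

```python
import math

def plate_tilt_deg(amp_mm, wavelength_mm, freq_hz, t_s):
    """Tilt of a plate centered on a node of a standing wave A*sin(k*x)*cos(w*t);
    at a node the displacement is zero but the local slope, A*k*cos(w*t), is
    maximal, so the plate rocks without changing its distance to the LEL."""
    k = 2 * math.pi / wavelength_mm
    return math.degrees(math.atan(amp_mm * k * math.cos(2 * math.pi * freq_hz * t_s)))

# Illustrative: 0.1 mm amplitude, 4 mm wavelength, 200 Hz drive. Sampling a
# quarter period shows the plate sweeping from maximum tilt down to flat.
period = 1.0 / 200
for i in range(5):
    t = i * (period / 4) / 4
    print(f"t = {t*1e3:5.2f} ms  tilt = {plate_tilt_deg(0.1, 4.0, 200, t):+5.2f} deg")
```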
  • FIG. 25 depicts a schematic presentation of an exemplary display structure 2500 for generating 3D light fields using tilting refractive plates, in accordance with an embodiment.
• light is emitted from a pixelated layer 2510 that can be, for example, an LED matrix, OLED display, or LCD display with backlight.
  • a tilting plate sheet (e.g., array of tilting plates) 2525 is on top of the pixelated LEL 2510.
  • the display 2500 includes actuators 2550 providing the linear and/or angular motion for generating standing waves in the plate array 2525.
  • the actuators 2550 may be secured to the frame 2570 of the display 2500.
• Light collimation optics are placed on top of the plate array 2525. In the embodiment illustrated in FIG. 25, the collimation optics comprise a microlens/lenticular lens sheet 2515.
• in other embodiments, the collimation optics 2515 can be a foil with diffractive structures.
  • An array of apertures 2540 comprising, for example, a punctured sheet is placed on top of the microlens array 2515, optically isolating the projector cells from each other.
  • Optical structures may be one-dimensional (e.g., cylindrical lenses) if only horizontal views are used, and may be two-dimensional (e.g., rotationally symmetric microlenses) if views are desired in both horizontal and vertical directions. In the latter case, two orthogonal scanning plate arrays may be positioned in series to facilitate two-dimensional angular scanning.
  • Standing waves are set up along both the horizontal and vertical directions of the plate arrays, and further temporal multiplexing is used to scan the second dimension.
• rendering schemes align the timing of sub-pixel activation with the horizontal and vertical standing wave frequencies to scan the desired projection angles.
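• A minimal sketch of such a two-axis timing scheme (assuming sinusoidal tilt in each axis; the frequencies, tilt ranges, and tolerance are illustrative assumptions):

```python
import math

def tilts_deg(t_s, fx_hz, fy_hz, max_x_deg, max_y_deg):
    """Instantaneous horizontal and vertical plate tilts under two orthogonal
    standing waves, each assumed sinusoidal in time."""
    return (max_x_deg * math.cos(2 * math.pi * fx_hz * t_s),
            max_y_deg * math.cos(2 * math.pi * fy_hz * t_s))

def should_fire(tx_deg, ty_deg, target_x_deg, target_y_deg, tol_deg=0.2):
    """Gate a sub-pixel: fire only when both tilt components pass through the
    angles that steer its beam to the desired 2D projection direction."""
    return (abs(tx_deg - target_x_deg) < tol_deg and
            abs(ty_deg - target_y_deg) < tol_deg)

# Example: sample the tilt pair over time; a renderer would fire each
# sub-pixel at the instants when should_fire() is satisfied for its view.
for i in range(5):
    t = i * 2.5e-4                                   # 0.25 ms steps
    tx, ty = tilts_deg(t, 200, 230, 5.0, 5.0)
    print(f"t = {t*1e3:.2f} ms  tilt = ({tx:+.2f}, {ty:+.2f}) deg")
```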
• several materials may be used for a tilting plate array in accordance with various embodiments. It is possible to use thick rigid plates (e.g., glass) that are joined together with elastomer materials such as silicone rubber or thermoplastic urethane.
• the foil itself can also be used as a functional optical component by making a series of small grooves in the foil, such as by embossing; the grooves may act as hinges between the more rigid parts that have the full foil thickness.
  • the foil material may be optically transparent and ductile, as well as have sufficiently high fatigue strength so as to endure repeated bending movements.
  • Suitable polymer materials for this purpose include, but are not limited to, polycarbonate and polyamide.
  • a display device frame may have support features for the rigid and flexible display components.
  • the support mechanism may include, for example, linear and/or angular momentum motors or moving supports at the sheet's vertical ends.
  • the conductors may be integrated between projector cells in the plate sheet, such as by screen printing (silver paste ink) or by using etched copper wiring on foil.
  • Graphene has mechanical and optical properties that are suitable for this kind of display. It is conductive and it can be stretched about 20% without damage, so it may be used both as hinge material between the plates and as a conductor for electrostatic actuation.
• micro electro-mechanical systems (MEMS) may be used in some embodiments to generate the plate movement.
  • the movement of the tilting array may be generated by coupling the array with sound waves generated with speakers below or above the array.
  • the discussed methods utilize time multiplexing, and therefore the light emitting layer should have fast refresh rates.
• An LED matrix based on currently available components or on μLEDs is one example of a suitable light emitting structure capable of very fast switching speeds sufficient for the present systems and methods.
  • standing waves generated in the flexible sheet of plates are induced via actuators on the edges of the display device.
  • the small mechanical movement may be generated with piezoelectric actuators positioned at the frame of the device on the edges of the display.
• An exemplary display device using tilting plates is illustrated in cross section in FIGS. 26A-26B.
  • the display includes a plurality of projection cells 2602.
  • Each projection cell 2602 includes a set of controllable light-emitting sub-pixels 2610 and a tilting refractive plate 2620.
• the microlenses are omitted from these figures to focus attention on the disclosed plate structure.
  • Each set of light-emitting sub-pixels 2610 may be arranged in a two-dimensional pattern of sub-pixels (e.g., a 128x128 pixel array, among other possibilities), and each sub-pixel may be capable of displaying a full gamut of different colors.
• the projection cells 2602 may also be arranged in a two-dimensional array, for example in a square or hexagonal array.
• in the example of FIG. 26A, the tilting refractive plates 2620 are supported by (or are integral with) a membrane 2630 that is capable of serving as the medium of a standing wave.
• in its rest state (without a standing wave), the membrane 2630 and tilting plates 2620 (collectively, the tilting refractive plate array 2625) may be supported at a predetermined distance from the sub-pixels 2610 (e.g., a distance of one focal length) by a plurality of resilient supports 2635.
• a piezoelectric actuator 2650 (or other linear or rotational actuator) and an appropriate power source 2655 are provided to generate a standing wave in the tilting refractive plate array 2625.
  • One or more actuators 2650 may be positioned along one or more edges of the display 2600.
  • a plurality of actuators 2650 is distributed throughout the display 2600 including along the edges and within the perimeter.
  • each projection cell 2602 may be provided with a corresponding actuator 2650.
  • the resilient supports 2635 may include a position sensor used to determine the angle of each tilting refractive plate 2620 relative to the corresponding sub-pixel array 2610.
  • other types of position sensors may be used, and/or tilting refractive plate position sensing may not be used in cases where plate position can be calculated based on the input to the driving actuator (using, for example, the appropriate standing wave equation).
  • FIG. 26A illustrates a display in a rest state (with no standing wave)
  • FIG. 26B illustrates a portion of a similar display 2600 at a frozen moment in time while a standing wave is active in the projection cells 2602.
  • the standing wave causes the angle between each tilting refractive plate 2620 and its respective set of subpixels 2610 to change as a function of time.
  • ray-tracing or other simulation techniques may be used to determine when and whether (and with what intensity and hue) to illuminate different sub-pixels.
  • Extension to display of a plurality of voxels may be implemented by illuminating each sub-pixel if and when the corresponding tilting refractive plate is in a position such that any one (or more) of the voxels to be displayed would be accurately reproduced by illuminating that sub-pixel.
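• The per-voxel gating described above can be sketched as follows (a purely illustrative model: the sub-pixel offsets, focal length, plate shift, and tolerance are assumptions, and a real renderer would also weigh intensity and hue):

```python
import math

def beam_dir_deg(subpix_offset_mm, apparent_shift_mm, focal_mm):
    """Beam direction from a sub-pixel's lateral offset from the lens axis,
    plus the apparent shift added by the tilted refractive plate."""
    return math.degrees(math.atan((subpix_offset_mm + apparent_shift_mm) / focal_mm))

def subpixels_to_fire(target_dir_deg, offsets_mm, apparent_shift_mm,
                      focal_mm=1.0, tol_deg=0.5):
    """Return the sub-pixel offsets whose beams currently point, within a
    tolerance, toward the direction of the voxel being rendered."""
    return [o for o in offsets_mm
            if abs(beam_dir_deg(o, apparent_shift_mm, focal_mm)
                   - target_dir_deg) < tol_deg]

# Example: 16 sub-pixels across a 0.5 mm aperture; the plate currently adds
# a +0.01 mm apparent shift; we want a beam at +5 degrees off the cell axis.
offsets = [-0.25 + 0.5 * i / 15 for i in range(16)]
print(subpixels_to_fire(5.0, offsets, 0.01))
```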
  • Exemplary embodiments described above use a traveling wave or a standing wave in the tilting refractive plate array to periodically alter the configuration of the plates with respect to the corresponding sub-pixels.
  • other techniques are used to alter the angle of incidence at the microlens layer with respect to the sub-pixels.
  • the angle of each tilting refractive plate and its corresponding set of sub-pixels may be adjusted with a piezoelectric actuator, microelectromechanical (MEMS) actuator, linear actuator, magnetic actuator, or other actuator.
  • Such an actuator may operate on a cell-by-cell basis or on a group of projection cells.
  • the realized display device can be either a multi-view display with a very dense grid of angular views or a more complex light field display with multiple views and focal surfaces.
  • Various embodiments employ a refractive film in place of the tilting plates.
  • a refractive film is used instead of the more complex sheet with connected refractive plates.
  • the film may offer a simpler approach to the plate components and generation of dynamic movement, but the desired optical effect of shifting the apparent pixel location can be more difficult to achieve.
• the film may be comparatively thick, and the continuous curvature of a wavy homogeneous material will cause negative effects on beam collimation, limiting, e.g., the possible rendered voxel distance.
  • the film may be employed in use cases that allow very small pixel sizes and short viewing distance.
  • Various embodiments employ refractive plates/film on top of a multiview display.
  • the tilting plate sheet or refractive film is placed on top of a regular multiview display structure with lenticular lenses. As the plates tilt, they cause small spatial shifts between the beams exiting projector cell structures. These shifts can be used for enhancing the spatial resolution of such displays with temporal multiplexing.
• the structure and method are different from the embodiments described previously in this document, as the scanning is done in the spatial domain instead of the angular domain.
• the current slanted lenticular structures could be utilized much better, and balancing between vertical and horizontal resolutions could be done in a more flexible manner. This structure may also resolve the problem of low horizontal spatial resolution associated with current multiview displays.
  • Various embodiments employ a double tilted plate structure.
  • two separate tilting plates are used per cell instead of just one as previously described herein.
  • the double plate structure is discussed in more detail below.
  • Various embodiments employ an array of tilting mirror elements in place of the tilting plate array.
• the tilting elements are reflective, as in Digital Micromirror Devices (DMDs).
• the images of the light emitting elements are scanned through a spatial range. This spatial shift is transferred by the collimating lens into an angular shift in the projected view direction.
• the optical path may be folded, as the light is reflected from the planar mirror surfaces instead of being refracted, making the system geometry different from the ones presented in this disclosure thus far.
  • an apparatus comprising: a light emitting layer having an array of pixels, wherein each pixel comprises a set of sub-pixels; an array of tilting refractive plates, wherein (i) each refractive plate in the plate array is connected to one or more adjacent plates via a flexible joint and (ii) each set of sub-pixels is projected through a tilting refractive plate in the plate array; a microlens array, wherein each set of sub-pixels is collimated by a microlens in the microlens array, and a control circuit for rendering a 3D light field that is projected via the microlens array, wherein the control circuit synchronizes activation of sub-pixels with tilt angles of the refractive plates.
  • the apparatus may include wherein the light emitting layer is an LED panel, or an OLED panel, or an LCD panel.
  • the apparatus may include wherein the microlens array is a lenticular sheet.
• the apparatus may include wherein the flexible joint is a silicone connection, or a clear adhesive film affixed to the array of plates.
  • the apparatus may further comprise actuators connected to the array of tilting plates, wherein the actuators are controlled by the control circuit and drive the angular motion of each tilting plate.
  • the apparatus may include wherein the actuators are linear actuators, or are angular actuators.
  • the apparatus may include wherein the actuators set up a standing wave in the array of tilting plates.
  • the apparatus may include wherein multiple independent and binocular views of content are projected at multiple different viewing angles.
  • the apparatus may include wherein light from a given pixel is small in size so as to not create a false focal surface at the light source.
  • a method for producing a light field using a plurality of projector cells each cell having (i) multi-colored light sources on a light emitting layer, (ii) a blocking partition between projector cells; and (iii) a rocking refractive plate optical element, the method comprising: modulating the multi-colored light sources and synchronizing the emitted modulated light with the rocking refractive plate optical element; and passing the emitted modulated light through a collimating microlens.
  • the method may include wherein passing the emitted modulated light through the collimating microlens comprises projecting multiple independent and binocular views of content at different viewing angles.
  • the method may include wherein light from a given multi-colored light source is small enough in size to not create a false focal surface at the light source.
  • a light field display comprising: an array of small projector cells, each small projector cell having (i) a pixelated layer that emits sub-pixel light beams, (ii) a tilting plate that alters the propagation direction of the emitted beams, and (iii) a light collimating microlens; an array of actuators for driving an angular motion of each tilting plate; and a control means in communication with both the array of small projector cells and the array of actuators for synchronizing timing between the emitted sub-pixels and the angle of the tilting plates.
  • the light field display may include wherein the tilting plate of each cell is connected to the tilting plate of at least one neighboring cell with a bendable connector to form a tilting plate array.
  • the light field display may include wherein the array of actuators drive continuous oscillations at both ends of the tilting plate array to set up a standing waveform within the tilting plate array.
  • the light field display may include wherein the control means synchronizes timing based on a rendering schema.
  • an optical method and basic construction of an enhanced multi-view 3D light field display may extend the capabilities of a multi-view display using lenticular sheets with a Spatial Light Modulator (SLM) and a backlight structure based on a flexible diffractive grating foil with a propagating wave.
  • the number of light emitting elements on the backplane is optically multiplied in the foil layer as the light is diffracted to different grating orders, making it possible to use clusters of physical light emitting elements instead of filling the whole backlight panel with smaller light emitting components.
• as the waveform propagates in the grating foil, the angle of incidence between the grating and the light emitted from a pixel changes constantly.
• as the angle changes, the diffraction orders change their propagation direction slightly. This small change in propagation angle is used for additional temporal multiplexing of view directions.
  • the propagating wave allows sweeping of spatially multiplexed view directions through small angles. By synchronizing the wave movement to the activation of the illumination pixels and SLM, a much denser multi-view display is created.
  • FIG. 27A depicts a schematic presentation of an exemplary structure of a single LF projector cell 2702 that forms one basic unit of a whole LF display backlight system, in accordance with an embodiment.
  • a projector cell 2702 comprises a single light emitting 2705 element on a light emitting layer 2710.
  • the light is emitted from a small component placed on a pixelated Light Emitting Layer (LEL) 2710 and a microlens 2715 collimates the emitted light into a beam.
  • the beam hits a grating foil 2720, which diffracts light into different diffraction orders, and the original beam is divided into several new beams that propagate in different directions. Propagation directions of the new beams are related to the grating parameters and follow the relation:
• θm = arcsin(m·λ/d − sin θi) (Eq. 9)
where θm is the propagation direction of the new beam (in relation to the grating surface normal vector) after the grating at diffraction order m, λ is the light wavelength, d is the distance from the center of one grating slit to the center of the adjacent slit (the grating period), and θi is the angle of incidence of the original beam in relation to the grating surface normal vector.
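• A direct transcription of Eq. 9 (a minimal sketch; the example wavelength and grating period are illustrative only):

```python
import math

def diffraction_angle_deg(m, wavelength_nm, period_nm, incidence_deg):
    """Eq. 9: theta_m = arcsin(m*lambda/d - sin(theta_i)), with angles
    measured from the grating surface normal."""
    s = m * wavelength_nm / period_nm - math.sin(math.radians(incidence_deg))
    if abs(s) > 1.0:
        return None                 # this order is evanescent (not propagating)
    return math.degrees(math.asin(s))

# Example: 550 nm (green) light, 1100 nm grating period, normal incidence.
for m in (-1, 0, 1):
    print(m, diffraction_angle_deg(m, 550, 1100, 0.0))
# order 0 continues straight on; orders -1/+1 leave at about -30/+30 degrees
```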
  • a focusing lens 2725 positioned after the grating foil 2720 re-focuses the beams into images 2727 of the original light emitting component.
  • a transmissive diffuser foil 2730 placed at the pixel image location mixes the angular distribution of the beams but maintains the spatial distribution.
  • the above-described elements of the projector cell 2702 form a backlight structure that generates a very dense array of small spots of light that can be individually activated.
  • the mixed beams are modulated via a SLM 2735 (e.g., a liquid crystal array) and then passed through a microlens array 2740.
  • a projector cell comprises a plurality of light emitting elements on a light emitting layer.
  • FIG. 27B depicts a schematic presentation of an exemplary structure of a single LF projector cell 2702 with multiple light emitting elements 2705, in accordance with an embodiment.
  • the structure may selectively transmit or block the light coming from individual LED images.
  • a microlens or lenticular lens sheet 2740 positioned in front of the SLM 2735 projects collimated beams to different directions out of the display.
  • a particular projection direction is based on the horizontal/vertical position of a pixel image 2729 on the diffuser foil 2730 behind the lens sheet 2740.
  • These microlenses 2740 together with the SLM 2735 form LF display pixels.
  • different beam directions are used to create a stereoscopic 3D effect.
  • Unique views of the same 3D image are projected to different directions by modulating the SLM pixels and light emitting elements 2705 according to the image content.
  • two image projections are used, and the result is a stereoscopic image for a single user standing in the middle of a display Field-Of-View (FOV).
• the image from the right half of the LF pixels enters the left eye, and the image from the left half pixels is visible only to the right eye.
  • the result is a set of unique views spread across the FOV.
• Multiple users may be able to see the stereoscopic images at different positions inside a predefined viewing zone. This represents a multi-view light field for a 3D scene, and each viewer may have their own stereoscopic view of the same 3D content. Accordingly, natural perception of a 3D image is generated. As any viewer moves around the display, the image is changed for each new viewing angle to show the correct scene, thereby emulating natural parallax and natural depth blur.
• if the SLM 2735 has a very high resolution, diffuser foil 2730 spots corresponding to each separate light emitting pixel can be blocked individually.
• in some embodiments, the system modulates a whole cluster of cell-exiting beams simultaneously with a single SLM pixel. This makes it possible to utilize lower resolution SLMs in the design of the display.
  • the LEL pixels are modulated faster than the SLM, and the temporal synchronization between light emitting components and the SLM can be modified to take advantage of this.
  • the SLM 2735 can be, for example, a LCD screen with polarizers, and the light emitting elements 2705 can be clusters of LEDs bonded to a backplane.
• the light emitting elements have multiple images, which means that a single element can be used for illuminating multiple LF pixels on the SLM. This may lower manufacturing costs of the light emitting layer, as fewer components are needed and they can be bonded as clusters to the backplane instead of employing a dense matrix.
• the number of images generated is dependent on the particular diffraction grating design, mainly on how many grating orders are created with even illumination intensity.
  • FIG. 28 depicts an overview of beam angle change using a diffractive foil 2820, in accordance with an embodiment.
  • a propagating waveform in a grating/diffractive foil 2820 can be used for additional temporal multiplexing of projection angles.
  • FIG. 28 depicts a schematic diagram of a situation where the grating foil 2820 propagating wave has propagated to a position where the foil 2820 is clearly tilted with respect to the direction of a light beam emitted from the LEL 2810. The tilt causes an additional angular spreading of grating order beams, which is followed by spatial spreading of the pixel images 2827 on the SLM 2835. The amount of additional angular spreading can be calculated from the previously noted grating equation.
  • the 0th order beam travels through the grating 2820 unaltered, except for a small lateral shift caused by refraction at the foil and air interfaces.
  • the pixel images also move further away from a centerline of the diffuser foil 2830 and a microlens array 2840 projects the resulting beams to an angle directed towards the 0th order.
• as the propagating wave moves through the projector cell aperture, the alternating trough, crest, and slope parts of the wave are used for scanning of small angles.
  • the SLM pixels are filled with virtual sub-pixels that travel across the LF pixel aperture defined by the single microlenses of the microlens/lenticular sheet. This is used for the creation of an angularly denser light field as the SLM pixels and LEL elements are modulated in synchronization with the grating foil propagating wave.
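• A minimal sketch of this synchronization (assuming a sinusoidal propagating wave sampled at one cell position and a simple f·tan(θ) mapping by the focusing lens; the names, numbers, and sign conventions are illustrative assumptions):

```python
import math

def foil_tilt_deg(t_s, amp_mm, foil_wavelength_mm, freq_hz):
    """Local foil tilt seen by one cell as the wave A*sin(k*x - w*t) passes;
    sampled at the fixed cell position x = 0."""
    k = 2 * math.pi / foil_wavelength_mm
    return math.degrees(math.atan(amp_mm * k * math.cos(2 * math.pi * freq_hz * t_s)))

def image_pos_mm(m, wavelength_nm, period_nm, tilt_deg, f_mm):
    """Lateral position of the order-m pixel image on the diffuser: grating
    exit angle (Eq. 9) rotated back into the display frame, then mapped by
    the focusing lens as x ~ f * tan(theta)."""
    s = m * wavelength_nm / period_nm - math.sin(math.radians(tilt_deg))
    if abs(s) > 1.0:
        return None                       # order is evanescent at this tilt
    theta = math.asin(s) + math.radians(tilt_deg)
    return f_mm * math.tan(theta)

# Track the order-1 virtual sub-pixel sweeping on the diffuser over a quarter
# wave period; LEL and SLM updates would be keyed to these positions.
for i in range(5):
    t = i * (1 / 200) / 16
    tilt = foil_tilt_deg(t, 0.2, 10.0, 200)
    print(f"t = {t*1e3:4.2f} ms  image x = {image_pos_mm(1, 550, 1000, tilt, 2.0):.4f} mm")
```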
• a monochromatic system, not needing color combination, can be implemented without the propagating wave movement. In those embodiments, the system may rely more on spatial multiplexing.
• FIG. 29 depicts a schematic presentation of an exemplary internal structure of a 3D Light Field display 2900 with directed backlight using a diffractive foil 2920, in accordance with an embodiment.
  • the depicted internal structure of a LF display highlights the functionality of a method used with the structure, which utilizes a propagating wave motion in the grating foil 2920.
• Four successive projector cells 2901, 2902, 2903, 2904 are pictured, representing four different wave phases. As the waveform propagates in the foil 2920, the different phases of the wave tilt the projection directions differently, making the views change direction slightly. The result is a set of angular sweeps of view direction through small angles.
  • the display 2900 includes a LEL 2910, collimating microlenses 2915, a diffractive grating foil 2920, focusing microlenses 2925, a light diffuser 2930, a SLM 2935, and collimating microlenses 2940.
• projector cells 2902 and 2904 generate views that are symmetric with respect to a normal vector of the display. This is because the grating foil wave is at the trough and crest of the wave amplitude, making the incident angles align with the grating surface normal vector. As the grating 2920 diffracts the different orders symmetrically on both sides of the 0th order, the beams emitted from the LF pixels are symmetric. In projector cell 2901, the grating 2920 is tilted in a counterclockwise direction, altering the beam directions such that they propagate at an angle towards the 0th order beam.
• the 0th order beams are not affected by the grating tilt, as can also be seen from the previous grating equation. This means that the 0th order beams will always be symmetric with respect to the normal vector, and the angular tilts occur only on, for example, the -1st and +1st order beams.
  • the same tilt directions with the same angles are introduced to the propagation directions in cell 2903, where the grating 2920 is tilted in a clockwise direction with respect to the display normal.
  • Angular sweeps go back and forth between (i) the positions determined by wave trough and crest and (ii) the positions determined by the grating tilted to the left and right during propagation of one full waveform of the grating foil 2920 across a projector cell aperture.
  • the beam bundle emitted from the LF pixels alternates between two states wherein the total beam bundle divergence angle is made larger and smaller.
  • the smaller angles occur when the wave trough or crest is used and the large bundle divergence occurs when the grating foil 2920 is tilted.
  • the particular foil wavelength and amplitude used for a specific sweep angle range can be determined using the previously mentioned grating equation.
• the line of small arrows 2960 above the lens 2940 signifies the whole-beam angular sweeping actions that follow when the waveform is propagating in the grating foil 2920.
  • FIG. 30 illustrates a schematic overview of an exemplary 3D Light Field display structure 3000 with directed backlight using a diffractive foil 3020, in accordance with an embodiment.
  • Light is emitted from a pixelated layer 3010 that has clusters of light emitting elements that can be, for example, LED matrices or printed OLEDs.
• An array of light collimation optics 3015 is placed on top of the light emitting pixels 3010 and may comprise, for example, a microlens/lenticular lens sheet (e.g., of PMMA or polycarbonate material).
  • the device includes actuators 3022 for providing the linear (and/or angular) motion to generate the propagating wave motion (arrows 3017) in the grating foil 3020.
  • the actuators 3022 are mounted on or in the frame 3070 of the whole display 3000.
  • actuators 3022 are positioned throughout the display structure. A wave amplitude below 1 mm is adequate for generating the desired angular sweep ranges even in fairly large displays if the grating period and projector cell size are kept small.
• the grating film 3020 may comprise polyester and have a thickness of ~0.1 mm.
  • Such foils are manufactured via embossing or holographic methods, and are presently available in the form of large rolls.
  • a grating structure that distributes incident light intensity equally to the different grating orders may be used for this component, as it may allow lower complexity rendering schemes to be applied.
• a commonly used sinusoidal grating pattern diffracts light evenly to the -1, 0, and +1 orders, but fine-tuning of the pattern may provide a more consistent luminance output and performance.
• re-focusing of the beams to a diffusing foil 3030 (e.g., a light diffuser located behind the SLM 3035 apertures) may be performed by using a focusing microlens sheet 3025 comprising, for example, PMMA or polycarbonate material.
  • the diffusing foil 3030 mixes up the angular distribution of light rays hitting the foil but maintains the spatial distribution of the beams.
• This foil 3030 may be thin (e.g., a 50 μm polycarbonate sheet), and therefore the diffusing property resulting from surface structures in the foil does not blur the spot sizes excessively. Mixing of the angular distribution opens up the possibility to use different SLMs and lenticular lenses in the latter parts of the display structure 3000, as the backlight part of the system is "optically decoupled" from the front part that finally generates the LF beams.
  • the SLM 3035 may comprise, for example, a LCD screen with polarizers on both sides.
• the topmost layer is a collimating microlens/lenticular lens sheet 3040 that generates the multiple views from the sweeping virtual pixels in the diffuser 3030 behind the SLM 3035 layer.
  • Optical structures in the display device can be one-dimensional (e.g., cylindrical lenses) if only horizontal views are used or two- dimensional (e.g., rotationally symmetric microlenses) if views are desired in both directions.
  • two orthogonal diffractive foils may be used for the two-dimensional angular scan.
  • Each foil may carry its own propagating wave, and the two waves may propagate in orthogonal directions.
  • Each oscillating grating foil is driven by actuators, and the actuator mounting positions can be selected to achieve a desired propagation direction for each foil.
  • Spatial resolution in the direction of the propagating wave is multiplexed both (i) by using several successive components (spatial multiplexing) and (ii) by sweeping the apparent positions of these pixels (temporal multiplexing), whereas the spatial separation of components alone in the orthogonal direction determines the achievable resolution.
  • Spatial resolution achievable with the whole structure may be limited by the topmost microlens / lenticular sheet 3040 apertures.
  • the lenticular sheet can be slanted or the vertical array of lenses can have a small offset in the horizontal direction. In this manner, it is possible to manage trade-offs between horizontal and vertical spatial resolutions in order to balance an overall display spatial resolution.
  • the generation and propagation of a wave in a flexible sheet may be as discussed in relation to FIG. 5.
  • the propagating wave causes a linear motion on the sheet in the direction of a display normal vector. Because of the wave, the sheet also carries an angular momentum.
  • the propagating wave may be generated/driven using rotational and/or linear actuators coupled to a horizontal end of the sheet.
  • Linear (or angular) motion is generated at both ends of the wavy diffractive grating foil.
• the movements of the actuators which drive the foil are selected, at least in part, by considering the grating sheet length, in order to avoid standing waves. With continuous oscillation and correct synchronization, the propagating wave travels across the display width (as depicted in FIG. 5). Because the projector array optical components are small, small wave amplitudes are adequate for the wave generation. Piezo-electrics are one example class of actuators that are sufficient for this task.
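• A minimal sketch of this drive-timing consideration (a textbook traveling-wave versus standing-wave model, not the claimed control scheme; the sheet length, wave speed, and margin are illustrative assumptions):

```python
import math

def drive_phase_offset_deg(length_m, wave_speed_mps, freq_hz):
    """Phase lag for the far-end actuator so that its motion matches the wave
    launched from the near end after travelling the sheet length."""
    travel_s = length_m / wave_speed_mps
    return (360.0 * freq_hz * travel_s) % 360.0

def near_resonance(length_m, wave_speed_mps, freq_hz, margin=0.05):
    """Standing waves build up near the sheet resonances f_n = n*v/(2L);
    drive frequencies close to these should be avoided."""
    f1 = wave_speed_mps / (2 * length_m)
    n = round(freq_hz / f1)
    return n >= 1 and abs(freq_hz - n * f1) < margin * f1

# Example: 300 mm sheet, 10 m/s wave speed, 200 Hz drive.
print(drive_phase_offset_deg(0.3, 10.0, 200))  # far-end phase lag, degrees
print(near_resonance(0.3, 10.0, 200))          # True: 200 Hz sits on a
                                               # resonance here, so re-tune
```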
  • the actuators comprise piezo-electric devices.
  • the actuators may comprise motors, electromagnets, or any other system/component capable of setting up propagating waves within the grating foil.
  • a display device frame may have supports for the rigid and flexible display components.
• the support mechanism may include, for example, linear and/or angular momentum motors or moving supports at the sheet ends.
  • electrical conductors or electromagnets along the display width may be employed to generate the dynamic propagating wave movement.
  • the conductors are integrated between projector cells in the wavy grating film.
  • elements of embodiments of the present systems may be built as a backlight which can be integrated into existing display structures, potentially easing commercial exploitation of the disclosed systems and methods.
  • some embodiments set forth herein may comprise an enhanced backlight structure rather than a complete display structure.
  • These embodiments may omit, for example, the SLM and final lens array, and may also omit the diffusive foil layer, frame, and various control circuitry (e.g., circuits to control and synchronize the omitted SLM).
• because the present methods are based on additional time multiplexing, the light emitting layer utilizes faster refresh rates than those used by current multi-view displays.
• suitable light emitting structures capable of the very fast switching speeds adequate for embodiments disclosed herein may comprise an LED matrix based on traditional components or μLEDs.
  • the display comprises a white backlight with LEDs and the diffractive foil is used as a wide color gamut light engine.
• an LED matrix of blue chips is coated with a uniform phosphor material layer.
• the phosphor layer converts the blue light into a wider white light spectrum.
  • the diffractive foil separates the white illumination beam into a continuous spectrum imaged to the diffuser layer.
  • An LCD SLM blocks parts of the single LED spots, which are much wider than the original image due to color diffraction spreading, and now exhibit a full spectrum of colors. With this blocking, it is possible to generate different colors through the (LCD) display pixels and also expand the overall display color gamut as the available color space is continuous. This color generation method may be used for LF displays or even for current LCD-type 2D displays for improving the color range and accuracy.
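• A minimal sketch of the spectral mapping behind this color-generation idea (simple grating-plus-lens geometry; the grating period, focal length, and aperture position are illustrative assumptions):

```python
import math

def spectrum_pos_mm(wavelength_nm, period_nm, f_mm, m=1):
    """Where a given wavelength of the white spot lands on the diffuser: the
    grating fans the spectrum out by asin(m*lambda/d) and the focusing lens
    maps that angle to x ~ f * tan(theta)."""
    s = m * wavelength_nm / period_nm
    if abs(s) > 1.0:
        return None                  # wavelength not diffracted into order m
    return f_mm * math.tan(math.asin(s))

# Example: 2000 nm period, 5 mm focusing lens. An LCD aperture transmitting
# only x near 1.43 mm would pass a narrow band around 550 nm (green).
for wl in (450, 550, 650):
    print(wl, "nm ->", round(spectrum_pos_mm(wl, 2000, 5.0), 3), "mm")
```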
  • Various embodiments comprise a directable backlight structure within a Head Mounted Display (HMD) device.
  • the presented directed backlight structure may be used in a HMD.
  • a device equipped with, for example, a common LCD display becomes a LF system.
  • the structure generates several focal surfaces instead of the single focal surface usually present in current HMD devices.
  • the optical method may be applied to all HMDs using SLM displays including augmented reality (AR), virtual reality (VR), and mixed reality (MR) goggles.
  • Fast switching speeds of LEDs may be combined with currently available LCD technologies as the illuminating elements could handle most of the temporal multiplexing for the LF system.
• the thickness of the backlight structure may be designed to fit head mounted use. Also, the system could function in various manners without the propagating wave movement, making it simpler to implement.
  • a grating film bends light rays.
  • the diffractive grating orders 1 and -1 bend the light rays symmetrically to two directions.
• the zeroth order goes directly through the grating, and it can be obscured if not needed.
  • the bending angle depends on grating period and light wavelength.
• when the grating is tilted, the incoming light sees a tighter grating period than in the case when the grating is not tilted.
  • a tilted grating bends rays more than a non-tilted one.
  • a display apparatus includes a diffraction grating layer, and a propagating wave is generated in the diffraction grating layer.
  • the diffraction grating tilts the beam more if the grating is tilted on top of the light emitting element.
• when the grating is not tilted, ray bending power is at the minimum.
• the changes in ray bending angles generated by a wavy grating are small compared to those generated by projector lens - pixel position pairs. These small angles may be used for changing the observer accommodation-focus distance in the case when the observer is far away from the display and the desired angles to the eye-box are very small.
• the wavy grating may also reduce the need for high pixel density in the display lenticular sub-projectors by introducing a super-resolution phenomenon on large light emitting elements.
  • This kind of a flexible grating foil may also be used in between flat display layers.
  • the performance of a normal multi-view display based on lenticular lens sheets and dense pixel matrices is enhanced by introducing a flexible diffractive foil with a propagating wave into the structure.
• as the wave propagates in the grating foil, the angle of incidence between the grating and the light emitted from a pixel changes constantly. As the angle changes, the diffraction orders also change their propagation direction slightly. This small change in propagation angle is used for additional temporal multiplexing of view directions.
  • the propagating wave allows sweeping of spatially multiplexed view directions through small angles. By synchronizing the wave movement to the activation of the pixel matrix, a much denser LF display is created.
  • the system comprises an array of small projector cells.
  • the light in a single projector cell is emitted from a pixelated layer and a microlens collimates the emitted light into a set of beams that exit the lens aperture at different propagation directions.
  • the beam directions create a stereoscopic 3D effect when unique views of the same 3D image are projected to the different directions via modulating the sub-pixels according to the image content.
• This first part of the projector cell functionality is not dissimilar from the methods used in flat form-factor autostereoscopic displays based on, e.g., lenticular sheets.
  • the next layer in the exemplary projector cell structure is a grating foil that alters the propagation direction of the emitted beams by diffraction.
  • an apparatus comprising: a light emitting layer having an array of pixels, wherein each pixel comprises a set of sub-pixels; a diffractive grating foil layer, wherein each pixel is projected through the diffractive foil layer and the diffractive foil layer carries a propagating wave across its surface; a focusing microlens array, wherein each pixel is focused by a microlens in the focusing microlens array; a diffusive layer; a spatial light modulator; a collimating micro lens array, wherein each pixel is collimated by a microlens in the collimating microlens array; and a control circuit for rendering a 3D light field that is projected via the collimating microlens array, wherein the control circuit synchronizes activation of sub-pixels with the propagating wave in the diffractive grating foil layer.
  • the apparatus may include wherein the light emitting layer is a LED panel, or a microLED panel.
• the apparatus may include wherein the set of sub-pixels includes at least one red, green, and blue sub-pixel, or includes a white-light sub-pixel.
  • the apparatus may include wherein the spatial light modulator is an LCD panel.
  • the apparatus may include wherein the collimating microlens array is a lenticular sheet.
  • the apparatus may include wherein the diffractive grating foil layer is made from polystyrene.
  • the apparatus may further comprise actuators connected to the diffractive grating foil layer, wherein the actuators are controlled by the control circuit and drive the propagating wave.
  • the apparatus may include wherein the actuators are linear actuators, or are angular actuators.
  • the apparatus may include wherein the actuators are mounted to a display frame.
  • a method comprising: activating a plurality of projector cells according to a rendering schema, each projector cell having (i) multi-colored light sources on a light emitting layer, and (ii) a focusing microlens; exciting a grating foil with a traveling wave, wherein light from each projector cell is diffracted by the grating foil; modulating the diffracted light with a spatial light modulator that is synchronized with the multi-colored light sources and the traveling wave; and projecting the modulated light through a collimating microlens array.
  • the method may further comprise a control means in communication with both the plurality of projector cells and the grating foil for synchronizing timing between the multi-colored light sources on a light emitting layer and an angle of incidence at the grating foil excited with the traveling wave, in accordance with the rendering schema.
  • the method may include wherein projecting the emitted modulated light through the collimating microlens comprises projecting multiple independent and binocular views of content at different viewing angles.
  • the method may include wherein a given multi-colored light source is small enough in size not to create a false focal surface at the light source.
  • exciting the grating foil with the traveling wave comprises using an array of actuators to drive oscillations in the grating foil.
  • the method may include wherein the grating foil is made of polystyrene.
  • the method may include wherein the grating foil diffracts the light from each projector cell into −1, 0, and +1 orders.
  • the method may include wherein a number of focal surfaces created and displayed is determined by the rendering schema.
  • Optical Structure of a Light Field Display with Double Refractive Optical Elements: In some embodiments of optical methods and structures for multiview 3D light field displays, two arrays of elements or foils for carrying a propagating wave may be utilized, rather than a single array or foil.
  • two arrays of tilting refractive optical components may be incorporated into the structure (such as that previously discussed in relation to tilting plate embodiments).
  • two different optical functions can be accomplished either separately or simultaneously depending on the tilting phases: 1) the apparent point of emission on the light emitting layer is slightly shifted due to bending of the optical path inside the optical elements, and the projected beam directions change slightly; and 2) the optical path is made longer or shorter as light goes through the tilting components, and the projected beams focus to different distances from the display surface.
  • the small change in pixel projection angle is used for additional temporal multiplexing of view directions.
  • the tilting motion of the components allows sweeping of spatially multiplexed view directions through small angles, and by synchronizing the movement to the activation of the pixel matrix, a much denser multiview display is created with a higher quality 3D picture.
  • the focusing function is used for creation of multiple focal surfaces which can be used for addressing the VAC problem. As the separate beams that form voxels of the 3D image both cross and focus to the same focal surface, the eyes are able to obtain better focal cues.
  • the particular embodiment of the display device may be either a multiview display with a very dense grid of angular views or a true light field display with multiple views and focal surfaces.
  • the structure may function as a regular 2D display by activating all the sub-pixels inside a LF pixel simultaneously.
  • Exemplary methods are able to provide both the large light emission angles that are useful for eye convergence and the small emission angles that are desirable for natural eye retinal focus cues.
  • some such methods make it possible to create multiple focal surfaces outside the display surface to address the VAC problem.
  • Such embodiments present a way to simultaneously scan the small light emission angles and focus the voxel-forming beams with the help of tilting refractive components.
  • a method utilizes a combination of spatial and temporal multiplexing in creation of a dense light field that can be used for displaying 3D content.
  • the properties of a more traditional autostereoscopic multiview display are extended by introducing simple active optical components to the structure that can be used for high-resolution temporal scanning of light rays, enabling the creation of a dense light field with depth information instead of having just a set of multiple views.
  • Construction techniques for exemplary embodiments can be adapted from hardware constructions found in current 3D multiview displays utilizing lenticular sheets or other integral imaging approaches. Activation of the tilting (or foil) components calls for additional actuators and control electronics as well as alteration of the rendering scheme, but these can be added to the structures, electronics and rendering functions of existing hardware.
  • One advantage of some embodiments is that the principle can be scaled by use case, or designed into products with different LF display view angles, voxel distance ranges, and resolutions.
  • FIG. 31 shows the structure of a single projector cell or LF pixel 3102 that forms one basic unit of a LF display.
  • the light is emitted from a pixelated LEL 3110, and a microlens 3130 collimates the emitted light into a set of beams that exit the lens aperture at different propagation directions.
  • the beam directions create the stereoscopic 3D effect when unique views of the same 3D image are projected to the different directions by modulating the LEL sub-pixels according to the image content.
  • the result is a stereoscopic image for a single user standing in the middle of the FOV: the images from the right halves of the LF pixels enter the left eye, and the left-half sub-pixels are visible only to the right eye. If more than two sub-pixels are used, the result is a set of unique views spread across the FOV, and multiple users can see the stereoscopic images at different positions inside the predefined image zone. This effectively generates a multiview light field for a 3D scene; each viewer has their own stereoscopic view of the same 3D content, and the perception of a three-dimensional image is generated. As the viewer moves around the display, the image changes for each new viewing angle.
  • two additional tilting refractive components 3120, 3125 which can be, for example, polycarbonate plates, are placed between the LEL 3110 and microlens 3130.
  • when the plates 3120, 3125 are parallel to the light emitting surface 3110, the emitted light beam directions are not altered, but when the plates 3120, 3125 are tilted, the beam optical path is bent inside the plates 3120, 3125. Bending of the light path occurs when the light rays are refracted at the first interface between air and plate material. This angular shift is compensated when the light exits the plate from the other side and the rays are refracted again with the same angular shift, but in the opposite direction.
  • because the plates 3120, 3125 are flat, with parallel opposing faces, they do not have any optical power, but the changed optical path length causes a minor shift to the beam focus.
  • a small lateral shift (also called a parallel shift in optics) is introduced to the beam.
  • this shift causes the beams exiting the projector cell to have a slightly shifted propagation direction. From the point of view of the projector microlens 3130, it appears that the light emitting pixel position shifts together with the tilting of the plate. If the two plates 3120, 3125 are tilted by the same amount, but in opposite directions, the two lateral shifts compensate each other and the beam directions are not altered.
  • the small longitudinal shift caused by the changed optical path through the tilted components 3120, 3125 changes the apparent distance of the light source 3110 from the collimating lens 3130.
  • the collimating lens 3130 starts to focus the beams to a different distance from the display than in the case when the plates 3120, 3125 are parallel to the LEL 3110.
  • the amount of apparent positional shift of the pixel, in both lateral and longitudinal directions, is related to three main parameters of the tilting components 3120, 3125: 1) the tilt angle difference between the two components, 2) the material refractive indices, and 3) the thicknesses. Larger tilt angle differences between the two components result in larger lateral shifts. Higher refractive indices result in elements that can introduce larger shifts with smaller tilt values, as the light is bent more at the air-material interfaces. Larger thickness values allow greater lateral and longitudinal shifts to be introduced, as the light propagates a longer distance inside the components.
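  • The classical tilted parallel-plate relations behind these dependencies can be sketched as follows (a minimal illustration; the thickness, refractive index, and tilt values in the example are assumptions, not values from this application):

```python
import math

# Minimal sketch of the tilted parallel-plate relations referenced above.

def lateral_shift_mm(t_mm, n, tilt_deg):
    """Lateral ('parallel') beam shift from a flat plate of thickness t_mm and
    refractive index n, tilted by tilt_deg relative to the incoming beam."""
    i = math.radians(tilt_deg)              # angle of incidence
    r = math.asin(math.sin(i) / n)          # Snell's law inside the plate
    return t_mm * math.sin(i) * (1.0 - math.cos(i) / (n * math.cos(r)))

def longitudinal_shift_mm(t_mm, n):
    """Apparent source displacement along the axis for near-normal incidence."""
    return t_mm * (1.0 - 1.0 / n)

# Example: 0.5 mm polycarbonate plate (n ~ 1.585) tilted by 10 degrees.
print(lateral_shift_mm(0.5, 1.585, 10.0))   # ~0.033 mm lateral shift
print(longitudinal_shift_mm(0.5, 1.585))    # ~0.18 mm path-length shift
```

  The example shows how all three stated parameters scale the shifts: a thicker plate, a higher index, or a larger tilt each increases the lateral shift, while the longitudinal shift grows with thickness and index.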
  • FIGS. 32A-32C illustrate three LF pixel optical function cases.
  • the tilt angle 3242 is zero and the tilting elements 3220, 3225 are parallel to each other and parallel to the LEL 3210.
  • the collimating microlens 3230 is positioned at a distance from the LEL 3210, where the focal length of the lens 3230 is almost equal to the optical distance between the lens 3230 and LEL 3210.
  • the lens curvature is selected for the particular embodiment such that the resulting projected beam is focused (3250) to the desired minimum focal surface distance from the lens 3230.
  • the two refractive active elements 3220, 3225 are tilted (tilt angle 3244) the same amount in opposite directions.
  • the optical path is changed as the light propagates a longer distance through the plates 3220, 3225.
  • the opposite tilts balance each other, and there is no lateral shift remaining in the beam after the two elements 3220, 3225.
  • there is some longitudinal shift which causes the beam to focus (3252) to a longer distance from the collimating lens 3230.
  • the tilting elements 3220, 3225 may cause some coma and astigmatism to the projected beams, but some of these effects are compensated by the double element structure. Tilting angles may be limited to relatively low values in order to keep the off-axis optical aberrations adequately low.
  • FIG. 32C depicts a case where the two plates 3220, 3225 are tilted (tilt angle 3246) more than in FIG. 32B, causing the optical path length to change more; the focal point (3254) is now much further away from the LF pixel and the beam is almost collimated.
  • because the LEL 3210 has a finite surface area, the beam has some divergence 3260 due to geometric optical factors.
  • the distance of the focal point can be changed by changing the tilt angles of the refractive elements 3220, 3225. If the angular change is made continuously, the change in focal length becomes continuous and the number of focal surfaces can be very high.
  • FIGS. 32A-32C also illustrate that different geometric magnification ratios are obtained with the different focal distances.
  • the projected source images have different sizes depending on the distance from the LF pixels.
  • Voxels that are created further away from the structure are bigger than those that are located closer, meaning that the achievable spatial resolution is also a function of the focal distance.
  • the focal surface that is closest to the viewer and furthest away from the display determines the largest voxel size inside the whole image zone, and this size can be used for balancing the spatial resolution over the whole image volume.
  • the two tilting refractive elements inside the LF pixel can also be in different phases, if the tilting is actuated with different frequencies or with a relative phase shift. This introduces a small lateral shift in the apparent position of the light emitter, causing the emitted beams to tilt to an angle from the optical axis.
  • FIGS. 33A-33C show three example cases.
  • the tilting plates 3320, 3325 are again parallel, but both of them are tilted (tilt angle 3342) in the same direction. This makes the focused beam (3350) tilt from the optical axis.
  • the first active element 3320 is tilted by a smaller angle than the second active element 3325, which is tilted in the same direction.
  • FIG. 33C shows a case where two parallel active elements 3320, 3325 are tilted in the opposite direction from FIG. 33A. Here, the beam is again tilted off-axis, but in the opposite direction, to focus at location 3354. Overall, the example cases shown in FIGS. 33A-33C illustrate that two tilting elements can be used effectively for both tilting and focusing of the beams simultaneously.
  • a full LF display is created by building a large panel containing an array or matrix of the presented LF pixels.
  • FIG. 34A shows the functionality and display structure of such an embodiment of a full LF display, where light is emitted from a matrix of components 3410 and an array of focusing microlenses 3430 collimates the beams.
  • Two layers of tilting elements 3420, 3425 are placed in between these two layers and controlled for tilting motion.
  • the refractive elements 3420, 3425 comprise continuous foils or films of even thickness that oscillate with a propagating waveform, rather than the previously discussed tilting plates. They are optically clear and reasonably lightweight.
  • it is desirable for the wavelength of the waveform to be long enough (~10 × the LF pixel aperture width) in order not to introduce aberrations to the projected beams. Tilting occurs as the waves propagate through the structures, and the foils or films are bent locally at each LF pixel position to form the tilted refractive elements 3421, 3426.
  • the tilting elements 3420, 3425 have opposite phases. At the troughs and crests of the waves, the two foils or films 3420, 3425 are practically parallel to each other and parallel to the LEL 3410. When the foils or films 3420, 3425 are at an angle, the emitted beams become collimated with slight divergence, as shown for LF pixels 3402 and 3404. At the position of the central LF pixel 3403, the foils or films 3420, 3425 are parallel, causing the emitted beams to focus on an image surface 3450 outside the display structure.
  • the source images are magnified to this surface 3450 with a geometric magnification ratio determined by the distance of the surface 3450 from the collimating lens array 3430.
  • Two beams emitted from neighboring LF pixels can be overlapped on this surface 3450 as the focused emitter array images are larger than the LF pixel apertures, and single emitter beams start to cross each other.
  • the beams may be crossed at the same distance where they focus, allowing for unambiguous focal cues to a viewer's eyes.
  • FIG. 34B shows the same display structure of FIG. 34A, but with a phase shift in the two propagating waves of the refractive elements 3420, 3425.
  • the phase shift creates an angular difference between the tilting elements 3420, 3425, causing the emitted beams to tilt from the optical axis as lateral shifts are introduced.
  • the same focusing function is present as the optical path lengths change dynamically with the waveform in the tilting elements 3420, 3425.
  • the collimated and focused beams sweep back and forth through an angular range, which may in some embodiments be the same as the angular distance between two beams emitted from neighboring sub-pixels inside each LF pixel. In such instances, the angular density can be increased with temporal multiplexing.
  • both the focusing and angular sweeping functions can be created simultaneously, such systems and methods may represent versatile hardware for rendering true 3D LF images.
  • the tilting elements may comprise continuous sheet structures, which may provide a simple overall construction.
  • a plastic (e.g., polycarbonate or PMMA) foil or film may be used, in which case tilting can be introduced with a propagating waveform that has a relatively long wavelength in comparison to the LF pixel aperture size.
  • the tilting elements may comprise rigid plates (e.g., glass), which are joined together into a sheet with elastomer materials, such as silicone rubber or thermoplastic urethane.
  • the tilting elements may comprise a continuous elastic foil or film, on top of which an array of more rigid plates is laminated with optically transparent glue, such as polyethylene.
  • the foil or film itself may also be used as a functional optical component by providing a series of small grooves in the foil or film, such as by embossing, with the grooves acting as hinges between more rigid parts having the full foil or film thickness.
  • the foil or films may comprise materials that are optically transparent and ductile, as well as have good fatigue strength in order to endure repeated bending movement. Suitable polymer materials include, but are not limited to, polycarbonate and polyamide.
  • FIG. 35 shows a schematic presentation of an embodiment of a display structure 3500 with dual tilting elements 3520, 3525.
  • the light is emitted from a pixelated layer 3510, which may comprise for example a LED matrix, OLED display, or LCD display with backlight.
  • the tilting elements 3520, 3525 comprise refractive flexible sheets which are disposed above the pixelated layer 3510.
  • Actuators 3527 provide the linear (and/or angular) motion for generating propagating wave motion in the tilting elements 3520, 3525, and may be disposed at, on, or in the frame 3570 of the whole display 3500.
  • Light collimation optics 3530 are disposed on top of the display structure, and may comprise, for example, a microlens/lenticular lens polycarbonate sheet or a foil or film with embossed diffractive structures.
  • An array of apertures 3540, such as a punctured plastic sheet, is placed on top of the microlens array 3530, optically isolating the LF pixels from each other.
  • Optical structures may be one-dimensional (e.g., cylindrical lenses) if only horizontal views are desired, or two-dimensional (e.g., rotationally symmetric microlenses) if views are desired in both directions. In the latter case, two orthogonal refractive sheets may be used for the two- dimensional angular scan and focusing.
  • multiple view directions for multiple focal surfaces may be generated for a plurality of users.
  • the support mechanism may include, for example, linear and/or angular momentum motors or moving supports at the vertical ends of the sheets. Instead of motors there may be electrical conductors or electromagnets along the display width that generate dynamic wave movement to the sheets or foils or films with the force based on electric and/or magnetic fields.
  • the conductors may be integrated between projector cells in the sheet or film, such as by screen printing (e.g. using silver paste ink) or by using etched copper wiring.
  • Graphene is also a promising material that has mechanical and optical properties suitable for display uses such as those set forth herein. It is conductive and it can be stretched about 20% without damage, so it may be used both as hinge material between refractive components and as a conductor for electrostatic actuation.
  • MEMS (Micro Electro Mechanical Systems) bimorph actuators may produce a well-controlled tilting action of ±30° in an array of mirrors that have an aperture size of ~1.5 mm.
  • Some further options for movement generation may include sound waves generated with speakers below or above the range that can be heard by the human ear, or memory metals that can be activated by heat or electricity.
  • Tilting element actuation may utilize accurate actuators that are small and efficient.
  • environmental factors such as temperature changes are taken into account, and dedicated calibration modules and/or routines may be employed.
  • the amplitude and frequency of the wavy motion may be selected based on the operating temperature.
  • With polycarbonate, a temperature change of 10°C around standard room temperature induces a ~1% change in the elastic modulus of the material. This change may be detected, and a feedback signal may be sent to the actuators in order to compensate for the slightly changed foil or film stiffness.
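  • A minimal feedback sketch of this compensation (the constants, names, and linear model are illustrative assumptions):

```python
# Illustrative compensation sketch: scale the actuator drive to track the
# ~1 % elastic-modulus change per 10 C in polycarbonate noted above.

MODULUS_SLOPE_PER_C = -0.001   # assumed: ~ -1 % modulus change per +10 C
T_REF_C = 23.0                 # assumed reference (room) temperature

def drive_scale(temperature_c):
    """Factor applied to the nominal actuator drive amplitude: a softer foil
    (lower modulus) needs proportionally less force for the same deflection."""
    return 1.0 + MODULUS_SLOPE_PER_C * (temperature_c - T_REF_C)

print(drive_scale(33.0))   # ~0.99 at +10 C above the reference temperature
```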
  • Some exemplary embodiments are used in a display structure in which a LF image generation module, as described herein, is combined with a separate projection lens or lens array.
  • the LF image generation module 3605 in this case may produce intermediate images at different focal surfaces in between some layers of the display system, and the final image may be formed by the front projection lens as shown in the example system in FIG. 36.
  • the modular construction approach of such embodiments may make it possible to use one image generation module for different kinds of viewing scenarios by changing a front projector lens.
  • the light emitters may comprise, for example, a very dense matrix of LEDs positioned behind microlens arrays with short focal lengths.
  • the LED sub-pixel images 3610 can then be formed at distances of a few millimeters from the focusing lens array, and a pair of injection-molded magnifying lenses 3615 may be used for projecting the images to the two eyes 3620 separately (FIG. 36).
  • Such lenses 3615 are commonly used in current virtual reality (VR) display devices. As the images are projected to the two eyes separately, two different sections of the display may be used for producing the stereoscopic image pairs separately.
  • This spatial division may simplify design and manufacture of the hardware and software, as there is reduced need for spatial multiplexing at the LF pixel level. Also, as the eye pupils 3625 are closer to the display, it can be easier to project more than two views inside them, which may be achieved using only a few light sources inside each projector cell. Some of the sub-pixel images may be focused (3640) out of the retina 3630 of the eye 3620, while other sub-pixel images may be focused (3650) on the retina 3630.
  • the micro-optical components may be manufactured with the high-quality and high-volume wafer-scale manufacturing methods used for making mobile phone camera optics today. The systems and methods set forth herein also permit the addition of multiple focal surfaces, making the device more user friendly than most available VR and AR devices available today.
  • FIG. 37 depicts an exemplary display structure 3700 in which flexible foils or films are used as the tilting refractive elements 3720, 3725, which are actuated with sound waves in wave modules 3758.
  • the modules contain a sound generator 3760 (loudspeaker) on one side and a sound sensor 3762 (microphone) on the other. Air pressure differences generated with the sound generator 3760 actuate a propagating wavy motion in the foils or films 3720, 3725, and the amplitude and wavelength of the motion can be adjusted by controlling the emitted sound frequency and volume.
  • a frequency of the sound source is kept above or below the human hearing range, which is commonly given as 20 to 20,000 Hz.
  • a sound sensor 3762 on the other side can be used for monitoring possible changes in, for example, the foil or film flexibility due to temperature changes, and/or the like.
  • the feedback signal from the sensor 3762 can be used for adjusting the sound generator 3760 output, and thus actively calibrate the wave module(s) 3758 according to environmental changes.
  • the two modules 3758 may be separated from each other with rigid transparent structures, such as glass sheets, making it possible to, for example, generate the wavy motion in orthogonal directions without excessive crosstalk between the modules that would induce errors to the wavy motion and to the 3D image voxel registration.
  • a particular balance between temporal and spatial multiplexing may be chosen on the basis of a particular use case in the embodiments set forth herein by selecting the light emitting components accordingly. If, for example, very small LEDs are used, spatial multiplexing may be emphasized by creating more views with a multitude of light emitting components per LF pixel, which may lead to reduction in angular sweep ranges and extension of the time a single component can be in the on-state per single view direction. Alternatively, if larger LED chips are used, there may be a lower number of physical light emitting elements for the same size LF pixel, and requirements for the switching speed for single components become more demanding and angular sweep range is extended to cover larger gaps between the spatially initiated views.
  • the various embodiments set forth herein use a propagating wave in a diffractive or refractive element(s) to provide additional freedom for LF display device optimization, as the performance is not limited by the spatial separation between the pixels.
  • the convergence beams sweep over a viewer or observer's eyes in order to create the correct focus cues.
  • the sweeping beam entering the eye should have the same direction as it would have if it were emitted from the voxel, in order to generate the correct in-eye focus for correct voxel distance.
  • Both beam angular sweep and spatial change in beam starting point on the display surface are used simultaneously.
  • the beam angular sweep may be realized by temporally switching LEL pixels on and off in sync with the wave propagation in the flexible diffractive grating, smoothing out the discrete directional output of the pixel grid.
  • the beam starting point on the display surface may have the appearance of jumping temporally from one cell to another in the cell array while sweeping the beam.
  • the orientation of the sweeping beams may follow the voxel centric arc normal. Such an arc may ensure eye convergence, accommodation, and even multi-view angles, if the display viewing angle is wide enough for multiple observers.
  • a factor to be considered in the design of the display structure using a wavy diffraction grating is that gratings diffract light with different wavelengths to different angles. This means that if three colored pixels (e.g., red, green, and blue) are used, the different colored beams are tilted to somewhat different directions from the grating foil. However, a prism structure positioned after the grating film may compensate for this effect, as it also tilts the different colored beam directions differently, but may do so in the opposite direction. As the colored sub-pixels are usually spatially separated on the LEL, the collimating lens may also cause some small angular differences to the colored beam projection angles.
  • this effect can be used to further compensate for color separation caused by the grating.
  • special rendering schemes can also be used for combining the different colored beams into mixed color pixels in the viewer's eye.
  • a propagating wave in the diffractive foil may create a situation where the projected beams are tilted when the wave passes the cell aperture.
  • the tilting angle can be different throughout the projector cell aperture as some parts of the beam hit the foil at slightly different incidence angles.
  • the foil surface wavelength is preferably large enough that the resulting additional beam divergence does not limit the achievable voxel depth range.
  • the wavelength should not be too long as the resulting tilt is used for the beam angular scan, and with higher slope values and smaller wavelength values the angular range is larger. This means that there is a trade-off situation between beam collimation level and angular sweep length, and the furthest achievable voxel distance is balanced with the desired angular scan range connected to pixel density.
  • the optical materials refract light with different wavelengths to different angles (color dispersion). This means that if three colored pixels (e.g. red, green and blue) are activated, the different colored beams are tilted to somewhat different directions from the tilting plates. As the colored sub-pixels are usually spatially separated on the LEL, the collimating lens will also cause some small angular differences to the colored beam projection angles.
  • a rendering scheme combines the different colored beams into mixed color pixels in the eye by activating the differently colored pixels with a slight delay from each other so that the sweeping motion of the tilting plate negates the angular differences.
  • the collimated beams sweep over the observer's eyes in order to create the correct focus cues.
  • the sweeping beam entering the eye has the same direction it would have if it were emitted from a voxel.
  • Both the beam angular sweep and spatial change in beam starting point on the display surface are controlled simultaneously.
  • the beam angular sweep is controlled by temporally switching LEL pixels on and off in sync with the tilting plates. It smooths out the pixel grid discrete directional output.
  • the beam starting point on the display surface jumps temporally from one cell to another in the cell array while sweeping the beam.
  • the sweeping beams orientation follows a voxel-centric arc normal. The arc allows for eye convergence, accommodation and also multiview angles if the display viewing angle is wide enough for multiple observers.
  • the depth range achievable with the LF display may be connected to the quality of beam collimation coming from each sub-pixel. Collimation is achieved when the lens focal length is equal to the distance between the lens and the light emitter.
  • the size of the light emitting pixel, diameter of the lens aperture, and lens focal length are three parameters that determine collimation quality.
  • the lens focal length is preferably much larger than the lens aperture size (large F#) in order to achieve good collimation. However, if the focal length is very long, diffraction will start to limit the collimation quality. In some embodiments, in order to reach an adequate Airy disc diameter caused by diffraction, the F# of the lens should equal the pixel size in microns. Eye resolution is about 1 arc minute, corresponding to about ±0.0083° collimation quality. This can be used as the upper boundary when designing the light emitting layer pixel size and lens focal length.
  • Displays of different sizes can be realized with the optical methods set forth herein. Pixel size is one limiting factor on the achievable beam collimation level and should be considered carefully when designs for different use cases are created.
  • Table 1 presents some example calculated values for the achievable beam collimation level (shown as beam divergence half angles) for three different exemplary display cases. All the displays are considered to have FullHD spatial resolution mapped with the projector cell structure.
  • the "TV" display has a projector cell pitch of 0.5 mm, which means that the full horizontal width is ~960 mm (~46" screen).
  • the desktop display has a projector cell and screen size that is half that of the TV case, and the mobile display refers to a ~5.3" display with a 65 µm projector cell size.
  • the calculations assume that all three display types are positioned at a 1 m distance from the observer.
  • the first column on the left shows possible pixel sizes and the other three columns show the achievable collimation angles with the different displays.
  • the table shows clearly that as the pixel size decreases, the collimation quality improves as beam divergence goes down. The table also shows that if the pixel size is kept the same, collimation quality decreases as the projector cell width is reduced.
  • Optical analysis also shows that with very small pixel sizes, diffraction becomes a limiting factor.
  • the numbers in the table are calculated with a projector cell that has an F# of 2, and with these structures the diffraction limit falls at a pixel size between 2 µm and 3 µm.
  • the last column on the right shows that a mobile device LF display would use a pixel size below 2 µm in order to be able to generate over 500 mm voxel distances; here diffraction becomes a clear limiting factor, making it impossible to reach this voxel distance without disturbing effects from the excessive beam divergence.
  • a desktop display could reach the 500 mm voxel distances with ~4 µm pixels, and the TV set may render voxels at a 1500 mm distance with ~3 µm pixels, as the projector cell optics would be just above the diffraction limit (see the sketch below).
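  • The trend described for Table 1 can be reproduced with the following sketch (the F# of 2 and the projector cell widths are taken from the text above; the 550 nm design wavelength and the small-angle Airy formula are assumptions). The geometric and diffraction half-angles become comparable at pixel sizes between 2 µm and 3 µm, consistent with the stated diffraction limit:

```python
import math

# Sketch of beam divergence vs. pixel size for the three display cases.

F_NUMBER = 2.0                     # from the text
WAVELENGTH_UM = 0.55               # assumed green design wavelength
CELL_WIDTHS_UM = {"TV": 500.0, "desktop": 250.0, "mobile": 65.0}

def geometric_half_angle_deg(pixel_um, cell_um):
    """Geometric divergence half-angle: pixel half-width over focal length."""
    focal_um = F_NUMBER * cell_um
    return math.degrees(math.atan(pixel_um / (2.0 * focal_um)))

def diffraction_half_angle_deg(cell_um):
    """Airy-disc half-angle ~ 1.22 * wavelength / aperture (small angles)."""
    return math.degrees(1.22 * WAVELENGTH_UM / cell_um)

for pixel in (1.0, 2.0, 3.0, 5.0, 10.0):
    row = {name: round(geometric_half_angle_deg(pixel, w), 4)
           for name, w in CELL_WIDTHS_UM.items()}
    print(f"pixel {pixel} um -> geometric half-angles (deg): {row}")
print({name: round(diffraction_half_angle_deg(w), 4)
       for name, w in CELL_WIDTHS_UM.items()})
```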
  • all of the LF pixels in the display may project emitter images towards both eyes of the viewer.
  • one emitter inside the LF pixel should not be visible to both eyes simultaneously if the created voxel is located outside the display surface.
  • the FOV of one LF pixel may cover both eyes, but the sub-pixels inside the LF pixels may have FOVs that make the beams narrower than the distance between two eye pupils (~64 mm on average) at the viewing distance.
  • the FOV of one LF pixel and also the FOVs of the single emitters are determined by the widths of the emitter row/emitter and magnification of the whole imaging optics.
  • one voxel created with a focusing beam is visible to the eye only if the beam continues its propagation after the focal point and enters the eye pupil at the designated viewing distance.
  • the FOV of a voxel is preferably adequate for covering both eyes simultaneously. If the voxel were visible to a single eye only, the stereoscopic effect would not be formed and the 3D image could not be seen.
  • the voxel FOV may be increased by directing multiple crossing beams from more than one LF pixel to the same voxel inside the human persistence-of-vision (POV) time frame. In this case, the total voxel FOV is the sum of individual emitter beam FOVs.
  • to make the LF pixel FOVs overlap at the intended viewing distance, the display is, for example, curved with a certain radius, or the projected beam directions are turned towards a specific point, such as with a flat Fresnel lens sheet. If the FOVs do not overlap, the LF pixels cannot be seen and some parts of the 3D image cannot be formed. Due to the limited size of the display and practical limits for possible focal distances, an image zone is formed in front of and/or behind the display device where the 3D image is visible.
  • FIG. 38A shows a schematic presentation of an example viewing geometry that can be achieved with an exemplary 3D LF display structure.
  • FIG. 38B shows the viewing geometry for an exemplary flat display, as discussed below in relation to a display with a directional backlight; in FIG. 38B, the portion of the viewing zone in front of the viewer's eyes is shaded.
  • FIGS. 39A-39B show schematic presentations of two different example viewing geometry cases with a curved display.
  • a single viewer is sitting in front of the display and both eye pupils are covered with a small viewing zone achieved with narrow LF pixel FOVs.
  • the minimum functional width of the zone is determined by the eye pupil distance (~64 mm on average).
  • a small width also means a small tolerance for viewing distance changes as the narrow FOVs start to separate from each other very fast both in front of and behind the optimal viewing location.
  • FIG. 39B shows a viewing geometry where the LF pixel FOVs are quite wide, making it possible to have multiple viewers inside the viewing zone and at different viewing distances. In this case also the positional tolerances are large.
  • the viewing zone can be increased by increasing the FOV of each LF pixel in the display. This can be done by either increasing the width of the light emitter row or by making the focal length of the collimating optics shorter. Maximum width for the emitter row is determined by the width of the projector cell (LF pixel aperture). There cannot be more components in the single projector cell than what can be bonded to the surface area directly below the collimating lens. If the focal length of the collimating lens is decreased, the geometric magnification increases, making it more difficult to achieve a specific voxel spatial resolution.
  • if, for example, the focal length of the collimating optics is halved, the LF pixel FOV is doubled, but the source image magnification to all focal surfaces also increases by a factor of two, and it follows that the voxel size on a given focal surface is also doubled.
  • the resolution reduction can be compensated by decreasing the highest magnification ratio by bringing the edge of the image zone closer to the display surface.
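  • A minimal sketch of this FOV/magnification trade-off (the emitter row width, focal lengths, and focal surface distance are illustrative assumptions):

```python
import math

# Sketch: halving the collimator focal length roughly doubles both the LF
# pixel FOV and the geometric magnification to a given focal surface.

def lf_pixel_fov_deg(emitter_row_mm, focal_mm):
    """Full FOV of one LF pixel from emitter row width and focal length."""
    return math.degrees(2.0 * math.atan(emitter_row_mm / (2.0 * focal_mm)))

def geometric_magnification(surface_mm, focal_mm):
    """Approximate magnification of the source image to a focal surface."""
    return surface_mm / focal_mm

for f in (1.0, 0.5):   # halving the collimator focal length
    print(f"f = {f} mm: FOV = {lf_pixel_fov_deg(0.25, f):.1f} deg, "
          f"magnification to 100 mm = {geometric_magnification(100.0, f):.0f}x")
```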
  • LF rendering schemes: Several different kinds of rendering schemes can be used together with the disclosed display structure(s) and optical method(s). Depending on the selected rendering scheme, a particular display device may be a multi-view display with a very dense grid of angular views, or a true LF display with multiple views and focal surfaces. In the simpler multi-view rendering scheme, each LF pixel projects one pixel of each 2D view from the same 3D scene. This leads to a situation where all pixels in one 2D view image are created with the sub-pixels that are at the same positions inside the projector cells (e.g., upper-right-corner LF sub-pixels projected towards a view to the left of and below the display center line).
  • One 2D image representing one 3D view at one view direction, may be created and shown on the matrix of LF pixels simply by activating the same sub-pixel inside each projector cell.
  • the multi-view image field may then be made much denser by modulating these images in synchronization with the diffractive foil or tilting plate wave that initiates scanning of additional view directions in between the main view directions.
  • This rendering scheme would not be able to provide the correct focus cues for the eyes as there would be only one focal surface at the surface of the display. However, this scheme may be much simpler to implement, as the rendering may call for only a series of 2D views at small angular intervals.
  • the 3D data may be reduced to certain discrete depth layers that are just close enough to each other for the viewer's visual system to have a continuous 3D depth experience. Covering the visual range from 50 cm to infinity would take approximately 27 different depth layers, based on the estimated average depth resolution of the human visual system.
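  • The layer-count estimate can be reproduced by spacing the layers uniformly in diopters (the ~0.075 D spacing used below is an assumed value for average human depth resolution; it is not stated in the text):

```python
# Sketch of the "~27 depth layers" estimate from 50 cm to infinity.

NEAR_M = 0.5                # closest covered distance (50 cm)
DEPTH_RESOLUTION_D = 0.075  # assumed visual depth resolution in diopters

near_diopters = 1.0 / NEAR_M        # 2.0 D; optical infinity is 0.0 D
layers = near_diopters / DEPTH_RESOLUTION_D
print(round(layers))                # ~27 layers
```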
  • the depth layers can be displayed temporally in sequence according to distance, or they can be mixed and adapted on the basis of the image content.
  • viewer positions are actively detected by the system and voxels are rendered only to those directions where the viewers are located. Active viewer eye tracking, such as using near infrared light with cameras around or in the display structure, may be used for this viewer position detection.
  • Selection of the appropriate rendering scheme is dependent on the particular hardware limitations and use case. For example, in a wall-sized advertisement display that is used in a well-lit area, the desired high light intensity easily leads to relatively large light emission layer pixel size as high intensity light emitting components are not readily available in very small sizes.
  • the display may be configured to be viewable from a large distance by multiple simultaneous viewers (and not necessarily to be viewable by a nearby viewer).
  • a multi-view rendering scheme may be more appropriate, as the long distance between the viewers and the display means that the viewer perception of depth is less accurate and a dense multi-view display can create the 3D effect well enough.
  • the relatively large pixels also do not allow the fine-tuning used for a true LF display with multiple focal surfaces.
  • Another example case is a smaller display for a single user created with a light emitting layer that has a large number of very small pixels with lower light intensity.
  • a more complex true LF rendering scheme may be utilized, as the spatial resolution may be adequate and the large number of focal surfaces can be calculated for a single user direction and eyebox without excessive calculation power and/or data transfer speeds.
  • Exemplary methods may be applied to many different sized displays with different numbers of pixels.
  • the single view direction sweeping angle is dependent on the LEL pixel pitch and size, which means that it must also be considered during rendering when the timing between the propagating wave and pixel activation is synchronized.
  • the color rendering scheme adapts to the fact that different colors are diffracted to different angular directions at the grating foil (or other flexible light bending layer). Some of this effect can be compensated in hardware as previously discussed (e.g., by integrating diffractive structures into the focusing lens sheet to make it color corrected, so as to compensate for the different focus distances of the collimating lens), but the remaining color separation may be addressed using special color rendering.
  • One rendering scheme is to use the movement of the propagating foil or tilting plates (or other flexible light bending layer) as an advantage, and activate the differently colored sub-pixels at slightly different times from each other.
  • the wave may have sufficient time to propagate (or the plates to tilt) to positions where all three colored pixels are projected to the same direction. This results in the colors being combined in the single projector cell by introducing a short time shift between the different colored image projections.
  • a 1 meter wide LF display with the discussed wavy diffraction foil projector cell structure (flexible diffractive foil disposed over the projector cells) is positioned at 1 meter distance from multiple viewers.
  • the display generates collimated beams to the observer directions.
  • a rendering scheme with multiple light emitting points for each voxel (a 3D pixel) is used in order to create a true LF display with correct eye convergence and focus cues.
  • For each voxel at least two beams are emitted from at least two points on the display surface. Those convergence beams are created by selectively activating different projector cell sub-pixels corresponding to the correct large angular directions.
  • the pixel projections are swept through the small angular ranges with the wavy diffractive foil approaches discussed herein, and as the pixels are modulated in synchronization to this angular scan a very dense light field is generated.
  • the beam sweeps also create virtual focal surfaces for the eyes at different depths calculated from the 3D data with the true LF rendering scheme.
  • Eye spatial resolution at 1 m distance can be as high as 0.29 mm, which results in a scenario for a 1 m wide display having a maximum of 3448 horizontal LF pixels.
  • Each of these LF pixels may also have several sub-pixels in them.
  • if one projector cell sub-pixel is ~3 µm in size, one LF pixel may have around 100 sub-pixels, generating around 100 spatially multiplexed beam propagation directions. This may be done with a display that has the appearance of a surface without pixels, as the projector cell size would be at the eye resolution limit.
  • the number of unique directions may be increased by the diffractive foil and propagating wave discussed herein, as additional projection directions can be packed in between these main directions.
  • Table 2 lists the beam angles that a single projector cell on the display surface should be able to provide for a single user at the central direction.
  • the eye pupil diameter can be estimated to be around 5 mm. If a voxel is rendered at a 500 mm distance from the eyes, a maximum angular sweep of ±0.28° may be used from the display surface for one eye focus accommodation.
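  • A short check of this figure, assuming the required angular sweep equals the angle the eye pupil subtends at the voxel (the beam pivots about the voxel and must cover the pupil):

```python
import math

# Sketch: required accommodation sweep half-angle for a voxel-pivoting beam.

def sweep_half_angle_deg(pupil_mm, voxel_to_eye_mm):
    """Half-angle subtended by the eye pupil at the voxel position."""
    return math.degrees(math.atan((pupil_mm / 2.0) / voxel_to_eye_mm))

print(f"+/-{sweep_half_angle_deg(5.0, 500.0):.3f} deg")  # ~0.286 deg, i.e. the
                                                          # +/-0.28 deg quoted above
```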
  • the beam starting point on the display surface changes from cell to cell when a LF 3D image is rendered for one observer direction.
  • the LF can be rendered for other eye pairs in the multi-view configuration by adding the new view direction angles to the convergence and accommodation angles mentioned in the table.
  • collimated beam quality, and with it the furthest visible voxel distance for a true LF display, is determined by the pixel size on the light emitting layer, the collimating lens F#, and diffraction.
  • pixels on a currently available OLED display are as small as ~9.3 µm, and pixels on an LCoS display are as small as ~3.74 µm.
  • It can be seen from Table 2 that if the observer looks at a voxel positioned at a 1500 mm distance, their eyes converge 1.24° towards the center line, and ray bundles hitting the eye should have ±0.0985° collimation to achieve realistic focus-accommodation information.
  • This collimation level can be achieved with a projector cell that has ~3 µm sub-pixels and a collimating lens with a 0.9 mm focal length.
  • a 1.24° convergence angle corresponds to a ~10 µm distance between pixels on the LEL surface.
  • the projected beam diameter should be smaller than eye pupil size, as the collimated beam should not create a disturbing extra focus in the eye that interferes with the artificially generated swept-beam focus.
  • This means an exemplary display is capable of displaying 3D views inside this depth range when the beam diameter condition is met.
  • the beam generated in a single projector cell may have a diameter less than 0.27 mm, which corresponds to about one tenth of the eye pupil diameter. This cell size may provide 4K spatial resolution for a 46" display, or Full HD resolution for smaller 500 mm wide desktop display.
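  • A quick arithmetic check of these cell-count claims (a 16:9 aspect ratio is assumed for the 46" display):

```python
import math

# Sketch: horizontal cell counts for a 0.27 mm projector cell pitch.

CELL_MM = 0.27

def horizontal_cells(width_mm):
    return int(width_mm / CELL_MM)

width_46in_mm = 46 * 25.4 * 16 / math.hypot(16, 9)   # ~1018 mm wide
print(horizontal_cells(width_46in_mm))               # ~3770, close to 4K (3840)
print(horizontal_cells(500.0))                       # ~1850, close to Full HD (1920)
```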
  • a grating tilt change of approximately 10° may be used to sweep the beam angle between two adjacent pixels on the LEL if the grating has 1000 lines/mm. With this angle, a continuous array of directions can be created, as the projection angles of neighboring pixels start to overlap with each other. Such a maximum propagating-wave slope angle can be considered quite moderate.
  • if the grating foil surface wavelength is at least 10 times the projector cell aperture size (so that the aperture does not see the curvature of the grating wave), the minimum propagating wavelength is 2.9 mm at a 0.29 mm projector cell pitch.
  • the desired wave amplitude can be calculated from the waveform, wavelength, and maximum grating tilt.
  • a propagating wave fulfilling these conditions may, for example, have a surface wavelength of 2.9 mm, such that the 10° maximum angle shift is achieved with a wave amplitude of 0.13 mm. If the foil thickness is 0.1 mm, the total minimum space for the foil with the propagating waveform is only 0.23 mm.
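  • The stated amplitude is consistent with a triangular waveform, whose slope magnitude is constant at 4 × amplitude / wavelength (the triangular shape is an assumption inferred from the numbers; a sinusoid of the same amplitude would have a higher peak slope):

```python
import math

# Sketch of the amplitude calculation above, assuming a triangular waveform.

def triangular_wave_amplitude_mm(wavelength_mm, max_tilt_deg):
    """Amplitude giving the desired peak slope: |slope| = 4*A/wavelength."""
    return wavelength_mm * math.tan(math.radians(max_tilt_deg)) / 4.0

amp = triangular_wave_amplitude_mm(2.9, 10.0)
print(f"amplitude ~{amp:.2f} mm")           # ~0.13 mm, as stated
print(f"total space ~{amp + 0.1:.2f} mm")   # + 0.1 mm foil thickness -> ~0.23 mm
```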
  • a 5" mobile phone LF display is viewed from 0.5 meter distance by a single observer.
  • the display projects collimated beams towards the observer into a viewing box that is 200 mm wide. This box size can easily accommodate the width of a human face.
  • a rendering scheme with multiple light emitting points for each voxel (a 3D pixel) is used in order to create a multi focal surface LF with correct eye convergence and focus cues.
  • For each voxel at least two beams are emitted from at least two points on the display surface.
  • the beams are created by selectively activating different projector cell sub-pixels corresponding to the correct angular directions.
  • the pixel beams are also swept through small angular ranges with the tilting plate method and as the pixels are modulated in synchronization to this angular scan, a very dense light field is generated in the horizontal direction where the angular scans are made.
  • the beam sweeps create virtual focal surfaces for the eyes at different depths.
  • the display projector cells project the LF pixel images into a viewing angle of 22 degrees, which covers a ~200 mm wide area at the 500 mm viewing distance.
  • a reasonable full-color pixel size of 12 µm is selected, and the projector cell pitch is set to 250 µm.
  • Pixels on a currently available OLED display can be as small as ~9.3 µm, and on an LCoS display as small as ~3.74 µm.
  • the pixels are divided into 3 individually addressable sub-pixels with red, green and blue filters, making each separate colored emitter surface only 4 µm wide.
  • the desired ±11 degree FOV can be covered with a cell structure that has 600 µm focal length microlenses made from the optical plastic material Zeonex E48R and an aperture mask with 200 µm diameter holes in front of the lenses.
  • a projector cell or LF pixel is able to produce ~21 unique full-color pixel projections in the horizontal direction even without the plate tilting action.
  • Table 3 shows calculated figures for projected beam divergence values for voxels positioned at four different distances from the observer.
  • the described projector cell structure is able to generate individual beams with ±0.2 degree divergence from the 4 µm sized colored sub-pixels. It can be seen from the table that this collimation level is adequate for presenting voxels that are somewhat further away from the viewer than the display surface (>500 mm), but voxels at a 1 m distance are already too far away to be accurately rendered for the human eye.
  • the individual sub-pixel beams create a spot that is ~3.5 mm in diameter at the 500 mm viewing distance from the display. This means that the light from two sub-pixels can enter the ~5 mm diameter eye pupil simultaneously. It also means that with a static display, the pixels would appear colored, because only two of the three neighboring color pixels can enter the eye at the same point in time.
  • the plate array of the exemplary embodiment may vary from all refractive plates being normal to a display surface to all refractive plates being at maximum tilt magnitudes (for example, with alternating orientation) with respect to the display surface.
  • about 12 degrees of plate tilt is used for creating a lateral shift corresponding to one 4 µm colored sub-pixel width, with a plate thickness of 50 µm and a plate made from optically clear polystyrene.
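  • Applying the tilted-plate shift relation from the earlier sketch to these values (the polystyrene refractive index is assumed to be n ≈ 1.59) reproduces a lateral shift of about one sub-pixel width:

```python
import math

# Cross-check of the 12-degree / 50 um polystyrene figures above.

def lateral_shift_um(t_um, n, tilt_deg):
    """Lateral beam shift of a flat tilted plate (Snell's law inside)."""
    i = math.radians(tilt_deg)
    r = math.asin(math.sin(i) / n)
    return t_um * math.sin(i) * (1.0 - math.cos(i) / (n * math.cos(r)))

print(f"{lateral_shift_um(50.0, 1.59, 12.0):.1f} um")  # ~3.9 um, about one
                                                        # 4 um sub-pixel width
```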
  • a standing wave can have a surface wavelength of 0.5 mm, corresponding to two projector cell widths, and the maximum tilting angle gives a maximum total thickness of only ~0.1 mm to the structure.
  • the plate sheet can be manufactured by embossing V-grooves on both sides of a 50 µm thick polystyrene foil with 250 µm spacing. The thinner areas then function as hinges between the thicker foil parts that act as the tilting plates when the standing wave movement is activated.
  • Refresh frequencies for an LCD LEL can be as high as 600 Hz.
  • the display pixels can be modulated 10 times inside the 60 Hz limit commonly considered suitable for a flicker-free image for the human eye. It follows that if the plates tilt back and forth at a rate of 30 Hz, full angular scans can be performed at the 60 Hz rate, and 210 (21 spatial × 10 temporal) unique full-color horizontal pixel beams could be generated with each projector cell inside the human visual system POV timeframe.
  • achievable spatial resolutions are different in all of the three orthogonal dimensions.
  • Pixel size in the vertical direction can be 12 µm if the pixels have a square aperture, as the multiview scanning occurs only in the horizontal direction.
  • the selected rendering scheme determines the achievable spatial resolution in both the horizontal and depth directions.
  • Projector cells form an array on top of the display surface with 480 side-by-side LF pixels that are all capable of projecting 210 multiplexed beams into the space in front of the display without flicker.
  • Each voxel is created with at least two crossing beams, and as the display is used at close range, several different focal surfaces (e.g., 10) may be created in order to give the appearance of continuous depth.
  • the number of selected beams per voxel and the number of selected focal surfaces are parameters of the rendering scheme with a clear trade-off relationship between them. Together they determine the final number of pixels that may be projected in the horizontal direction, and with it the horizontal pixel resolution.
  • the plate tilting may be used for the creation of full-color beams projected from the projector cell and there is no need for a separate color rendering scheme.
  • the full-color rendering can be achieved by introducing a small time delay between the activation of the sub-pixels. This time delay causes the different color pixels to combine into one full-color pixel that is only 4 µm wide.
  • the position where the full-color pixel is activated is controlled separately by a LF rendering scheme that determines the suitable pixel projection directions. If the plate back-and-forth scanning is done at a 30 Hz rate, two angular sweeps are made during each cycle. As only one 12-degree tilt is needed from the tilting plate to bring two sub-pixels on top of each other, a suitable time delay value of ~8 ms is obtained between successive red, green and blue sub-pixels.
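  • The ~8 ms figure can be checked as follows (assuming each one-way sweep spans the full ±12 degree tilt range, i.e., 24 degrees in total, which is an assumption consistent with the stated numbers):

```python
# Sketch of the ~8 ms color time-delay estimate.

SWEEP_RATE_HZ = 30.0          # back-and-forth cycles per second (from the text)
SWEEP_RANGE_DEG = 24.0        # assumed: -12 to +12 degrees per one-way sweep
TILT_PER_SUBPIXEL_DEG = 12.0  # tilt bringing two sub-pixels on top of each other

one_way_sweep_s = 1.0 / (2.0 * SWEEP_RATE_HZ)    # ~16.7 ms per direction
delay_s = one_way_sweep_s * TILT_PER_SUBPIXEL_DEG / SWEEP_RANGE_DEG
print(f"{delay_s * 1000:.1f} ms")                # ~8.3 ms between sub-pixels
```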
  • An exemplary 3D LF display comprises the disclosed directional backlight method and projector cell structure.
  • the display has a width of ~576 mm in the horizontal direction, which corresponds to a 26" monitor having a standard aspect ratio. With a single viewer positioned at a 1 m distance from the display, the display fills an area corresponding to a 32 degree angular field-of-view. Multiple images from different view directions of the same 3D content are projected to a viewing zone covering the single user's facial area, and the user is able to see a 3D stereoscopic image.
  • a true LF rendering scheme may be used for the creation of the full-color images.
  • the exemplary 3D LF display may use a single cell projector structure as in FIG. 27A.
  • the light is emitted from a cluster of LEDs that are arranged into a matrix with horizontally alternating red, green and blue components.
  • Each single LED component has a width of 2.5 µm, and the components are bonded to a backplane with a horizontal pitch of 3.5 µm.
  • This means that one three-color light emitter pixel is fitted into a horizontal width of ~10 µm.
  • the different colored sub-pixels may be arranged in slanted columns which opens up a possibility to increase the horizontal spatial resolution at the cost of lowered vertical resolution.
  • as the vertical light emitting pixel size can also be smaller than 10 µm, there is ample space available between pixel rows, and the spatial resolutions can be balanced in order to achieve the appearance of uniform spatial resolution in both directions.
  • the total width of the light emitting pixel cluster is ~0.8 mm, which means that there are a total of 229 sub-pixels, or 76 three-color pixels, placed side by side.
  • a collimating lens made of the material Zeonex E48R is placed at a 5 mm distance in front of the µLED cluster.
  • the collimating lens has a focal length of 5 mm and an aperture diameter of 0.8 mm. This lens effectively collimates the light emitted by the µLEDs into narrow directional beams.
  • the beams hit a transmissive grating foil that is placed close (e.g., ~0.1 mm to 0.2 mm) to the collimating lens.
  • the grating foil is made from polystyrene in this example. It has 500 grating lines/mm that are designed to diffract the incident light intensity evenly into orders −1, 0 and +1.
  • the grating may be designed to diffract the light into more than just three orders, in which case the single illumination beam would be split into more than three child-beams.
  • the three child-beams hit a focusing lens (made of the material Zeonex E48R) that has a focal length of 2.8 mm and an aperture diameter of 2.4 mm. This lens is placed 2.65 mm in front of the beam collimation lens, and it re-focuses the separated beams into pixel images at a diffuser sheet positioned at a 2.5 mm distance from the focusing lens. The magnification ratio of this two-lens system is ~2:1.
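  • A short check of this relay geometry (the focal lengths are from the text; only the geometric demagnification is computed, since the virtual-pixel spot sizes quoted later are dominated by diffraction rather than geometry):

```python
# Sketch of the two-lens backlight relay geometry described above.

F_COLLIMATOR_MM = 5.0   # collimating lens focal length (from the text)
F_FOCUSER_MM = 2.8      # focusing lens focal length (from the text)

demag = F_FOCUSER_MM / F_COLLIMATOR_MM                 # ~0.56
print(f"geometric image scale ~1:{1.0 / demag:.1f}")   # ~1:1.8, roughly the
                                                        # stated ~2:1 ratio
print(f"2.5 um emitter -> ~{2.5 * demag:.1f} um geometric image")
```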
  • This first part of the projector cell forms a backlight structure that is capable of producing an array of very small, color controlled virtual light emitting pixels that can be individually turned on and off with control electronics.
  • a more complete LF display projector cell is formed when an LCD with polarizing films on both sides is placed in contact with the diffuser foil and a lenticular sheet is placed in front of the display.
  • the pixel size of the LCD may be the size of individual LED sub-pixel images. However, this is not mandatory as the LEDs themselves are components that can be individually addressed and the pixel images can be turned on and off (with the limitation that all diffraction orders connected to a single LED are working in unison).
  • the smallest pixel sizes in currently available LCDs can produce DPI (Dots-Per-Inch) values between 2000–3000, which relates to ~10 µm pixel sizes. This presently available pixel size is suitable for modulating the images generated with the presented backlight system.
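The DPI-to-pitch conversion is direct (sketch added for illustration):

```python
# One inch is 25.4 mm, so pixel pitch in um = 25.4 / DPI * 1000.
for dpi in (2000, 2540, 3000):
    print(f"{dpi} DPI -> {25.4 / dpi * 1000:.1f} um pixel pitch")
# 2000 DPI ~12.7 um, 2540 DPI ~10.0 um, 3000 DPI ~8.5 um
```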
  • the ~10 µm pixel can block ~2 side-by-side full-color virtual pixels, where the differently colored spots are overlaid on top of each other.
  • the LCD pixels do not need any color filters, and may be used as simple light valves that either block or pass the illumination beams.
  • three side-by-side lenticular lenses are used for one projector cell structure corresponding to the three diffraction orders.
  • the focal length of the lenticular sheet microlenses (made of a material such as PMMA) is ~0.7 mm and each microlens is 0.8 mm wide.
  • When positioned at approximately the focal length distance from the LCD layer, the lens projects well-collimated beams towards the viewer.
  • other lens focal lengths and aperture sizes may be used for achieving different full display spatial resolution and beam divergence values.
  • Collimated beam quality, and consequently the furthest visible voxel distance for a true LF display, are determined by at least the pixel size on the light emitting layer, the collimating lens F#, and diffraction.
  • Table 4 lists the beam angles that a single projector cell on the display surface provides for a single user at the central direction. The table provides the convergence and accommodation angles that the LF display provides for four different voxel distance layers in the exemplary scenario, with the voxels and eyes located on the optical axis going through the display center.
  • an eye pupil diameter can be estimated to be around 5 mm and, for example, if a voxel is rendered at 1000 mm distance from the eyes, a maximum angle of ~0.17 degrees from the display surface is sufficient for one eye focus accommodation.
  • the previously described projector cell structure produces ~4.8 µm – 6.3 µm light emitting virtual pixels at the diffusing foil.
  • These beams can be further collimated into illumination beams, using the described lens, with divergences of ~±0.12 degrees – ±0.14 degrees.
  • These values are now limited by diffraction relating to the cell lens design, and they limit the longest voxel distance that can be rendered, without conflicting eye focus, to a value around 1 m. This means that, considering the beam quality alone, the 3D image space is constricted to a depth range with a maximum distance approximately at the display surface and a minimum distance approximately at a ~25 cm distance from the viewer.
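The ~1 m limit can be sanity-checked from the quoted aperture and divergence alone (illustrative sketch; the 5 mm pupil estimate is the one given above):

```python
import math

aperture_mm = 0.8     # projector cell aperture
half_div_deg = 0.13   # quoted beam half-divergence
pupil_mm = 5.0        # estimated eye pupil diameter

def beam_width_mm(distance_mm):
    return aperture_mm + 2.0 * distance_mm * math.tan(math.radians(half_div_deg))

limit_mm = (pupil_mm - aperture_mm) / (2.0 * math.tan(math.radians(half_div_deg)))
print(f"beam width at 1000 mm: {beam_width_mm(1000):.1f} mm")  # ~5.3 mm
print(f"beam fills the pupil at ~{limit_mm:.0f} mm")           # ~930 mm, i.e. ~1 m
```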
  • the illumination beams have separated colors.
  • the colors can be combined into three-color beams by utilizing two different methods.
  • the first method is based on the propagating wave in the grating foil and a time delay between pixel activations. If the separately colored LEDs are activated at different time intervals, the propagating wave has shifted the positions of the different color component images to the same exact position on the LCD, and the colors can be combined.
  • An angular local tilting range of ±3 degrees between the grating foil (over the collimating lens aperture) and the illumination beam is sufficient for sweeping the images of single colored sub-pixels over the adequate range on the diffuser foil.
  • with blue pixels (wavelength ~450 nm), this grating tilting range results in a total spatial sweep length of ~8 µm; with green pixels (wavelength ~550 nm) the sweep is ~12 µm, and with red pixels (wavelength ~650 nm) it is ~17 µm.
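These color-dependent sweep lengths follow from the grating equation with tilted incidence (illustrative sketch; the 450 nm blue wavelength is inferred from the values used elsewhere in this document):

```python
import math

d_um, f_um, tilt_deg = 2.0, 2800.0, 3.0  # grating period, focusing focal length, max tilt

def rel_deflection(wl_um, tilt_rad):
    # Grating equation for order +1 with tilted incidence, minus the
    # zeroth-order direction (which simply follows the tilt).
    return math.asin(math.sin(tilt_rad) + wl_um / d_um) - tilt_rad

for name, wl_um in (("blue", 0.450), ("green", 0.550), ("red", 0.650)):
    hi = rel_deflection(wl_um, math.radians(+tilt_deg))
    lo = rel_deflection(wl_um, math.radians(-tilt_deg))
    print(f"{name}: sweep ~{f_um * (hi - lo):.0f} um")  # ~8, ~12, ~17 um
```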
  • the second color combination method is based on the horizontal location of the colored LEDs and on the diffraction occurring in the diffractive foil.
  • because the diffraction angle depends on wavelength, the colored pixel images can be combined by compensating for this angular difference, activating red, green and blue pixels that are at somewhat different locations on the backplane.
  • a distance of 0.25 mm between red and green as well as between blue and green pixels is adequate for the color combination.
  • Both of these two methods can also be used together, for example by first selecting the right LEDs on the backplane for the coarse positional adjustment and then utilizing the time-based method for fine adjustment of the single beam color.
  • This color combination may employ a calibration routine for the displays as the colored spots are so small that, for example, hardware manufacturing tolerances can have an effect on the combination capability.
  • a ±3° grating tilt range is used to sweep the beam angle between two adjacent pixels on the light-emitting layer (LEL) if the grating has 500 lines/mm. With this angular range, a continuous array of directions may be addressed, as the projection angles of neighboring pixels start to overlap with each other. This propagating-wave-slope maximum angle can be considered quite moderate.
  • An example grating foil surface wavelength is 10 times the projector cell aperture size (so that the aperture does not see the curvature of the grating wave). The minimum propagating wavelength is thus 8.0 mm for a 0.8 mm collimator lens aperture.
  • the wave amplitude can be calculated from the wave form, wavelength and maximum grating tilt.
  • with a surface wavelength of 8.0 mm, the ±3 degree maximum angle shift is achieved with a wave amplitude of 0.1 mm. With a foil thickness of 0.1 mm, the total thickness of the space taken up by the foil with the propagating wave form is ~0.2 mm.
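As a rough check of the amplitude figure (illustrative sketch; the triangular-waveform assumption, in which the foil surface rises linearly over a quarter wavelength, is ours and is one of several amplitude conventions that reproduce the quoted value):

```python
import math

wavelength_mm = 8.0      # surface wavelength (10x the 0.8 mm aperture)
max_tilt_deg = 3.0       # maximum local tilt of the grating foil
foil_thickness_mm = 0.1

# Triangular wave: the surface climbs one amplitude over a quarter wavelength.
amplitude_mm = math.tan(math.radians(max_tilt_deg)) * wavelength_mm / 4.0
envelope_mm = amplitude_mm + foil_thickness_mm
print(f"wave amplitude: {amplitude_mm:.2f} mm")           # ~0.10 mm
print(f"foil + waveform envelope: {envelope_mm:.2f} mm")  # ~0.2 mm
```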
  • a suitable region width is ~0.6 mm, which means that a single projector cell can produce ~171 separately addressable colored pixels per lenticular sheet lens, and the single LF pixel can produce 171 separate colored beams in the horizontal direction at any given moment.
  • the number of unique view directions can be multiplied by utilizing the propagating wave in the diffractive foil. As the wave propagates, the different diffraction order LED images are shifted inside the LCD screen pixel aperture making it possible to scan the angular space between two beam directions set by the LED matrix.
  • the temporal multiplexing ability is connected to the refresh frequencies of the LEDs and LCD and to the Persistence-Of-Vision (POV) property of the human eye. If the LCD has a refresh frequency of, for example, 240 Hz, the image could be updated 4 times inside the 60 Hz refresh rate that is commonly used as the threshold for flicker-free video.
  • the LED matrix, however, could be modulated much faster, and a series of several virtual pixels with slightly different locations could be generated inside one LCD refresh cycle. In some embodiments, this faster rate is also used for the creation of different light intensity levels.
  • the overall number of views that could be generated with this method depends on the above-mentioned parameters and the final rendering scheme that is selected for creating the colors and 3D image voxels.
  • single LF pixels are able to project the image creation beams to a maximum angle of 18 degrees.
  • This width can easily cover both eyes of a single viewer, as the average distance between eye pupils is ~64 mm.
  • the view region is so wide that it also provides a lot of tolerance for head positioning, making the display more comfortable to use and more robust against head movements.
  • the single projection beam divergence values are ~±0.13 degrees
  • the beam widths at the 1 m distance are ~5 mm, which is suitable given the average eye pupil diameter.
  • the spatial separation between two neighboring beams at this distance is only ~1.9 mm (320 mm / 171 beams), which means that two side-by-side beams could be swept over the eye pupil simultaneously, fulfilling the Super-Multi-View (SMV) condition.
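A quick arithmetic check of the SMV claim (illustrative sketch, using the ~4 mm pupil diameter cited in the background discussion):

```python
view_zone_mm = 320.0   # width covered by the 18-degree beam fan at 1 m
beams = 171            # separately addressable beams per LF pixel
pupil_mm = 4.0         # typical pupil diameter in normal lighting

beam_spacing_mm = view_zone_mm / beams
print(f"beam spacing: {beam_spacing_mm:.1f} mm")                     # ~1.9 mm
print(f"beams inside the pupil: {pupil_mm / beam_spacing_mm:.1f}")   # >2 -> SMV met
```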
  • the colored beams can be combined with the propagating wave in the grating foil. If the wave movement frequency is ~60 Hz, the diffracted beams may be scanned over the LCD pixel apertures, and colored sub-pixels can be combined into three-color beams inside the POV timeframe.
  • the described display structure and optical methods are well suited for dynamically creating high resolution full-color 3D voxels.
  • a tabletop 3D LF display device with a curved 50" screen is placed at 1000 mm distance from a single viewer.
  • the display forms a light field image to a volumetric virtual image zone, which covers the distance from 500 mm from the viewer position to the display surface.
  • the display is able to generate multiple views both in the horizontal and vertical directions with the presented LF pixel structure.
  • Light is emitted from LED arrays where the component size is 2 µm x 2 µm and the pitch 3 µm.
  • the array contains red, green, and blue components assembled to a matrix where the differently colored components are interlaced in alternating rows and columns with 9 µm spacing between two same-color chips.
  • the total width and height of the matrix in each LF pixel is ~0.4 mm, which accommodates the 44 x 44 LEDs of each color described below.
  • Rotationally symmetric collimator lenses are placed at ~2.3 mm distance from the LEDs, and they have a focal length of 2.0 mm.
  • the lens array is made from polycarbonate as a hot-embossed 0.5 mm thick microlens sheet. Aperture size of the collimating lenses is 500 ⁇ , which is also the size of the single LF display pixel.
  • Two 350 µm thick polycarbonate foils or films are placed between the LEDs and collimator lenses.
  • the first foil or film is positioned at an approximate distance between 0.55 - 0.75 mm from the LED matrix and it has a propagating waveform in the horizontal direction.
  • the second foil or film is positioned at minimum distance of 0.2 mm from the first foil or film and it has a propagating waveform in the vertical direction.
  • Foils or films are driven with piezoelectric actuators.
  • the propagating waves are synchronized with opposite phase to each other by synchronizing the actuators causing the movement.
  • the whole optical structure is only ~3 mm thick, and the LF pixels are capable of projecting multiple beams that can be focused to multiple focal surface layers in front of the display with the foils or films that have propagating waves in both horizontal and vertical directions.
  • a full LF 3D display device can be created in which the whole device surface is filled with the described LF pixel structures.
  • the whole display is curved with a radius of 1000 mm in the horizontal direction. This arrangement makes the single LF pixel FOVs overlap, and a ~200 mm wide viewing window is formed for a single user at the designated 1 m viewing distance.
  • the viewing window height is also ~200 mm, determined by the total height (0.4 mm) of the LED rows and the LF pixel optics magnification to the viewing distance (500:1).
  • In each LF pixel LED cluster in this example, the red, green, and blue components have the same size and are bonded as interlaced matrices. Their colors are combined in the projected beams on the different focal layers when the crossing beams are combined into voxels.
  • the collimator lens array has integrated diffractive structures that compensate color dispersion in the polycarbonate material.
  • Each LF pixel has 44 red, green and blue rows of 44 LEDs, which are used for projecting 44 x 44 unique full-color views in both horizontal and vertical directions.
  • a single LF pixel has a total FOV of ~11.4° x 11.4°, and it covers the ~200 mm wide viewing window at the 1 m viewing distance, where the different horizontal view directions are spatially separated from each other by a distance of ~4.5 mm.
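The viewing geometry follows directly from the collimator focal length and LED matrix size given above (illustrative sketch):

```python
import math

matrix_mm = 0.4       # LED matrix width/height per LF pixel
f_mm = 2.0            # collimator focal length
viewing_mm = 1000.0   # designated viewing distance
views = 44            # unique views per direction

fov_deg = 2.0 * math.degrees(math.atan((matrix_mm / 2.0) / f_mm))
magnification = viewing_mm / f_mm
window_mm = matrix_mm * magnification
print(f"LF pixel FOV: {fov_deg:.1f} deg")           # ~11.4 deg
print(f"magnification: {magnification:.0f}:1")       # 500:1
print(f"viewing window: {window_mm:.0f} mm")         # ~200 mm
print(f"view spacing: {window_mm / views:.1f} mm")   # ~4.5 mm
```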
  • the polycarbonate foil or film propagating waveforms have maximum tilt angles of ±6°, which are able to generate maximum beam tilts of ~±0.4° from the nominal directions. This makes it possible to overlap the neighboring colored pixels and to sweep intermediate view directions between the main view directions determined by the LED positions. This also means that at least two views can be projected into the ~4 mm diameter eye pupils simultaneously, fulfilling the SMV condition if temporal multiplexing is utilized in the 3D image rendering.
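The ~±0.4° figure can be reproduced with the standard lateral-displacement formula for a tilted plane-parallel plate (illustrative sketch; the refractive index of 1.585 is a typical value for polycarbonate and is our assumption):

```python
import math

t_mm = 0.35       # foil/film thickness
n = 1.585         # assumed polycarbonate refractive index
tilt_deg = 6.0    # maximum waveform tilt
f_mm = 2.0        # collimator focal length

a = math.radians(tilt_deg)
# Lateral ray displacement through a tilted plane-parallel plate:
shift_mm = t_mm * math.sin(a) * (1.0 - math.cos(a) / math.sqrt(n**2 - math.sin(a)**2))
beam_tilt_deg = math.degrees(math.atan(shift_mm / f_mm))
print(f"virtual source shift: {shift_mm * 1000:.0f} um")  # ~14 um
print(f"beam tilt: +/-{beam_tilt_deg:.2f} deg")           # ~0.4 deg
```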
  • the created viewing zone around the viewing window allows the viewer to move his/her head ~65 mm left and right as well as ~125 mm forward and ~180 mm backwards from the nominal position. Both eye pupils of an average person will stay inside the viewing zone with these measurements, and the 3D image can be seen in the whole display image zone.
  • This tolerance for the head position is achieved by making the display tilt angle in the vertical direction, or the display stand height, adjustable; it can be considered adequate for a single viewer sitting in front of the display in a fairly stable setting.
  • the real-world display structure can be used in the creation of a virtual 3D image zone between the viewer and display with only three focal surfaces, located at distances of 500 mm, 670 mm, and 1000 mm from the viewer. With these focal surfaces, the 3D image content looks practically continuous, as the human visual system is not able to resolve the different focal layers as discrete.
  • When the refractive foil or film geometry is in the propagating wave configuration with the maximum 6° tilting angles, the waveform has a surface wavelength of 5 mm, corresponding to ten LF pixel widths, and the maximum tilting angle gives a maximum wave amplitude of only ~0.14 mm. This gives a minimum bending radius of ~11 mm, which can be considered feasible for the 0.35 mm thick polycarbonate foil or film considering long-term use. The total thickness of the foil or film with the waveform stays below 0.5 mm, making the structure fairly compact.
  • One limiting factor for creation of multiple views in this embodiment is the refresh frequency of the pixelated light emitting layer.
  • OLED and LCD displays can be considered as suitable components that are available today.
  • LED matrices may replace these display types in some applications, and they offer a better alternative with faster modulation rates.
  • the highest reported display refresh frequencies for an LCD are in the range of 600 Hz. With such frequency, the display pixels can be modulated 10 times inside the 60 Hz limit commonly considered suitable for a flicker free image for the human eye.
  • FIG. 40 presents the simulation results as a table where the columns show simulated images at different distances from the viewer and rows mark the tilt angles. All pictures show a square detector area of 3 mm x 3 mm.
  • the LF pixel produces circular images that are the same size (0.5 mm) and shape as the aperture of the collimating lens. The result shows that on the display surface, the minimum voxel size is determined by the size of the LF pixel. Only two LEDs inside the pixel are used for creation of one voxel at this distance, one for each eye direction. Spatial resolution on the 50" display surface corresponds to Full HD.
  • the display can be used in a regular 2D mode by activating all the sub-pixels inside the LF pixel simultaneously for one pixel of the 2D image. This increases the visibility of each LF pixel into the full FOV of ~11.4°, and all the created voxels are positioned on the display surface.
  • On the virtual image plane at 670 mm distance from the viewer, a clear image of the four rectangular light sources can be seen in FIG. 40 when the foil or film tilt angles are 0° and the foils or films are parallel. When the foils or films are tilted by 6°, the images become clearly blurred, making the image out of focus.
  • Each rectangle is ~0.33 mm wide, which corresponds well to the geometric magnification ratio of 165:1 to this distance.
  • the image-blurring Airy disc caused by diffraction from the 500 µm diameter LF pixel aperture has a radius of ~1.0 mm with red (650 nm), 0.9 mm with green (550 nm) and 0.7 mm with blue (450 nm) light.
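These radii follow the first-null Airy relation r ≈ 1.22 λ L / D (illustrative sketch; the exact values depend on which propagation distance L is taken, and the 500 mm display-to-focal-surface distance used here is our assumption, giving radii of the same order as those quoted):

```python
D_mm = 0.5     # LF pixel aperture diameter
L_mm = 500.0   # assumed propagation distance (display to closest focal surface)

for name, wl_nm in (("red", 650), ("green", 550), ("blue", 450)):
    r_mm = 1.22 * (wl_nm * 1e-6) * L_mm / D_mm   # wavelength converted from nm to mm
    print(f"{name}: Airy radius ~{r_mm:.2f} mm")
# red ~0.79 mm, green ~0.67 mm, blue ~0.55 mm
```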
  • the ~0.5–1.0 mm minimum voxel size means that the closest part of the 3D image zone is only capable of producing a resolution that is approximately comparable to SDTV (standard definition) on a surface size that corresponds to a 25" TV. Performance is clearly limited by diffraction.
  • the last column of FIG. 40 shows the simulated images at the viewing window (viewer distance 0 mm) at the designated 1000 mm distance from the display. These images show the eyebox size for each beam case when four neighboring sub-pixels are activated inside one LF pixel. It can be seen from the images that when the foils or films are tilted, the eyebox size does not change, but the images are just blurred. This does not mean that the voxel images themselves are blurred, as the eye is focused to the surface where the voxel is created and the eye lens makes the final imaging by adjusting its focal length. All of the emitters are visible to the single eye only, and the beam sizes are ~1 mm at the eyebox.
  • This size is not adequate for covering the whole eye pupil, and it would likely be beneficial for correct focal cues to use more than one LF sub-pixel for the creation of voxels.
  • This beam size also makes it possible to fulfill the SMV condition and make the display 3D image quality higher.
  • several active LF pixels and crossing beams may be used in all focal surface distances except for the display surface itself.
  • the simulation results of FIG. 40 show that the refractive tilting elements method can be effectively used for setting the distance of focal surfaces.
  • the presented optical structure allows a 3D LF display that is capable of actively controlling the focus distance of projected beams that form voxels, which can induce the correct focusing response from the human eye.
  • the real life example case shows that adequate resolution and 3D image viewing volume can be achieved with a display structure that is fairly simple and thin.
  • an embodiment removing the diffraction effects may be implemented, such as described in U.S. Provisional Patent Application No.
  • FIG. 41 shows schematically the functionality of a display structure that combines an LF display 4110 based on tilting refractive elements and an aperture expander 4150 based on a double grating 4142, 4144 and an SLM 4148.
  • a light source image can be generated either close to the display structure or further away.
  • the aperture expander 4150 widens the beams by splitting them in the first grating layer 4142.
  • the second grating layer 4144 directs the beams back to the original directions and the beams focus again and form an image of the emitter, but to a longer focal distance. This focal distance change can be compensated with the collimator lens design.
  • Examples of computer-readable storage media include a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs).
  • a processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.

Abstract

Display devices may utilize flexible light bending layers to generate 3D images and/or light fields. These flexible layers may be components of volumetric display devices or flat form-factor display devices. Flexible light bending layers may comprise flexible layers of lenses or light emitting elements, and may include refractive and/or diffractive flexible layers. Flexible light bending layers may be controlled by actuators to generate propagating or traveling waves that, in synchronization with light emission, generate virtual depth of projected 3D images. One display apparatus may comprise: a light-emitting layer; an array of collimating microlenses; a flexible light bending layer; and an actuator operative to generate a traveling wave in the flexible light bending layer to generate an oscillation in the orientation of the flexible light bending layer relative to each of the collimating microlenses.

Description

SYSTEMS AND METHODS FOR 3D DISPLAYS WITH FLEXIBLE OPTICAL LAYERS
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present application is a non-provisional filing of, and claims benefit under 35 U.S.C. §119(e) from, U.S. Provisional Patent Application Serial No. 62/489,393, filed April 24, 2017, entitled "METHOD AND APPARATUS FOR DYNAMIC WAVE DISPLAY FOR PRESENTING 3D CONTENT"; U.S. Provisional Patent Application Serial No. 62/522,842, filed June 21, 2017, entitled "METHOD AND APPARATUS FOR PRESENTING 3D IMAGE CONTENT WITH A LIGHT FIELD DISPLAY CREATED BY DYNAMIC WAVE"; U.S. Provisional Patent Application Serial No. 62/564,913, filed September 28, 2017, entitled "SYSTEMS AND METHODS FOR GENERATING 3D LIGHT FIELDS USING TILTING REFRACTIVE PLATES"; U.S. Provisional Patent Application Serial No. 62/564,908, filed September 28, 2017, entitled "SYSTEMS AND METHODS FOR 3D LIGHT FIELD DISPLAY WITH DIRECTED BACKLIGHT USING DIFFRACTIVE FOIL"; and U.S. Provisional Patent Application Serial No. 62/633,042, filed February 20, 2018, entitled "SYSTEMS AND METHODS FOR GENERATING PROJECTED 3D LIGHT FIELDS USING DOUBLE REFRACTIVE ELEMENTS", each of which is incorporated herein by reference in its entirety.
BACKGROUND
[0002] Overall, the current stereoscopic displays that are used in home theatres and cinema are an unnatural way of making 3D images. There is a neural connection in the human brain between light- sensitive cells on the eye retinas and the cells sensing eye muscle movement. The associated areas work together when the perception of depth is created. Autostereoscopic 3D displays lack the correct retinal focus cues due to the fact that the image information is limited to the plane of the display as illustrated in FIGS. 2A-2B. When the eyes focus to a different point than where they converge, physiological signals in the brain get mixed up. Depth cue mismatch of convergence and accommodation leads to, for example, eye strain, fatigue, nausea and slower eye accommodation to object distance. This phenomenon is called vergence-accommodation conflict (VAC) and it sometimes calls for non-proportional depth squeezing in artificial 3D images and allows only short-time viewing of scenes with large depth. As shown in FIGS. 2A- 2B, looking at a real world object (FIG. 2A) and at an autostereoscopic 3D display (FIG. 2B) may have different focal distances and eye convergence angles or distances. This may result in portions of a view in the real world being blurred or out of focus, while all parts of a display may be in focus. [0003] Three types of 3D displays are able to provide the correct focus cues for natural 3D image perception. The first category is volumetric display techniques that can produce 3D images in true 3D space. Each "voxel" of a 3D image is located physically at the spatial position where it is supposed to be and reflects or emits light from that position toward the observers to form a real image in the eyes of viewers. The main problems in 3D volumetric displays are low resolution, large physical size and complexity of the systems that make them expensive to manufacture and too cumbersome outside special use cases like product displays, museums, etc. A second existing 3D display device category capable of providing correct retinal focus cues is holographic displays. These aim to reconstruct the whole light wavefronts scattered from objects in natural settings. The main problem in this field of technology is the lack of suitable Spatial Light Modulator (SLM) components that could be used in the creation of the extremely detailed wavefronts. Some research groups have reported progress in this field, but they are still very far from commercialization of the technology. A holographic display technique has also been developed to a commercial prototype level by utilizing additional eye tracking technology, which has made it possible to use commercially available SLM components for creation of the wavefronts. However, such systems are still quite complex and require a large system, making it too expensive for average consumers. A third 3D display technology category capable of providing natural retinal focus cues is called Light Field (LF) displays.
[0004] Vergence-accommodation conflict is one driver for moving from the current stereoscopic 3D displays to more advanced light field systems. A flat form-factor LF 3D display may produce both the eye convergence and focus angles simultaneously. FIGS. 3A-3D show these angles in four different 3D image content cases. In FIG. 3A, an image point 320 lies on the surface of the display 305, and only one illuminated pixel visible to both eyes 310 is needed. Both eyes focus (angle 322) and converge (angle 324) to the same point. In FIG. 3B, a virtual image point (e.g., voxel) 330 is behind the display 305, and two clusters of pixels 332 on the display are illuminated. In addition, the direction of the light rays from these two illuminated pixel clusters 332 is controlled such that the emitted light is visible only to the correct eye, thus enabling the eyes to converge to the same single virtual point 330. In FIG. 3C, a virtual image 340 is at an infinite distance behind the display screen 305, and only parallel light rays are emitted from the display surface from two illuminated pixel clusters 342. In this case, the minimum size for the pixel clusters 342 is the size of the eye pupil, and the size of the cluster also represents the maximum size of pixel clusters needed on the display surface. In FIG. 3D, a virtual image point (e.g., voxel) 350 is in front of the display 305, and two pixel clusters 352 are illuminated with the emitted beams crossing at the same point, where they focus. In the generalized cases of FIGS. 3B-3D, both spatial and angular control of emitted light is used from the LF display device in order to create both the convergence and focus angles for natural eye responses to the 3D image content. [0005] With current relatively low density multiview displays, the views change in a stepwise fashion as the viewer moves in front of the device. This feature lowers the quality of 3D experience and can even cause a breakup of the 3D perception. In order to mitigate this problem (together with the VAC), some Super Multi View (SMV) techniques have been tested with as many as 512 views. SMV techniques generate an extremely large number of views that make the transition between two viewpoints very smooth. If the light from at least two images from slightly different viewpoints enters the eye pupil simultaneously, a much more realistic visual experience follows. In this case, motion parallax effects resemble the natural conditions better (FIGS. 4A-4C) as the brain unconsciously predicts the image change due to motion. The SMV condition can be met by reducing the interval between two views at the correct viewing distance to a smaller value than the size of the eye pupil. At normal illumination conditions the human pupil is generally estimated to be ~4 mm in diameter. If the ambient light levels are high (sunlight), the diameter can be as small as 1.5 mm and in dark conditions as large as 8 mm. The maximum angular density that can be achieved with SMV displays is limited by diffraction, and there is an inverse relationship between spatial resolution (pixel size) and angular resolution. Diffraction increases the angular spread of a light beam passing through an aperture, and this effect may be taken into account in the design of very high density SMV displays.
[0006] Light field displays call for a high amount of multiplexing from the optical hardware as all the different viewing directions and focal surfaces need to be presented through a single display surface.
Multiplexing can be done either spatially or temporally. A limiting factor in temporally multiplexed systems is component switching speed or refresh rate. Different tuneable optical components such as "liquid lenses" are available and can be used in temporally multiplexed systems, but due to their complex structure, the lens-based systems easily become very large and expensive. They may be suitable for multiple-user LF projection systems or for single users as parts of head mounted or table-top displays. In general, the spatial multiplexing approach uses more hardware than the temporal approach, as the multiple views are generated at the same time with parallel hardware components. This is especially problematic for systems that are intended for multiple users - the more viewers there are, the more different views will be generated and the more hardware is called for to realize this.
[0007] The systems and methods disclosed herein address these issues, and others.

SUMMARY
[0008] Systems and methods set forth herein utilize flexible optical layers as a component of a display device to generate 3D images and/or light fields. Some embodiments comprise volumetric display devices, and some embodiments comprise flat form-factor display devices. [0009] Flexible optical layers may comprise flexible layers of lenses or light emitting elements, in various embodiments. In some embodiments, a flexible optical layer may comprise a flexible light bending layer.
[0010] In some embodiments, flexible optical layers may be controlled by actuators to generate propagating or traveling waves that, in synchronization with light emission, generate virtual depth of projected 3D images.
[0011] In some embodiments, a flexible optical layer may comprise a flexible diffractive foil. In some embodiments, a flexible optical layer may comprise an array of tilting refractive plates. In some
embodiments, a flexible optical layer may comprise a pair of arrays of tilting refractive plates. In some embodiments, a flexible optical layer may comprise a pair of flexible refractive foils.
[0012] In an embodiment, a display apparatus may comprise: a light-emitting layer disposed within the display apparatus; a collimating microlens array disposed between the light-emitting layer and an outer surface of the display apparatus, the microlens array comprising a plurality of collimating microlenses; a first flexible light bending layer disposed between the light-emitting layer and the outer surface of the display apparatus; and at least one actuator operative to generate a traveling wave in the first flexible light bending layer to generate an oscillation in the orientation of the first flexible light bending layer relative to each of the collimating microlenses. The display apparatus may include wherein a portion of the light- emitting layer including a plurality of sub-pixels is associated with a single microlens of the collimating microlens array to define one of a plurality of projector cells, and wherein a portion of the first flexible light bending layer spans a cell aperture of each projector cell. The display apparatus may further comprise a controller configured to control at least a first projector cell and the at least one actuator to: based on a location of at least one voxel of 3D content to be displayed by the display apparatus, illuminating a subset of the plurality of sub-pixels of the first projector cell in synchrony with the orientation of the first flexible light bending layer relative to the microlens of the first projector cell to generate a 3D image. The display apparatus may include wherein the generated 3D image comprises a plurality of independent views of the 3D content projected at a plurality of viewing angles. The display apparatus may include wherein the first flexible light bending layer comprises a flexible diffractive foil, and the collimating microlens array is disposed between the light-emitting layer and the flexible diffractive foil. The display apparatus may include wherein the flexible diffractive foil is disposed between the collimating microlens array and the microprism array. The display apparatus may further comprise a spatial light modulator (SLM), wherein the flexible diffractive foil is disposed between the SLM and the light-emitting layer, and the SLM is configured to be controlled in synchronization with the at least one actuator and the light-emitting layer to modulate light diffracted by the flexible diffractive foil. The display apparatus may include wherein the first flexible light bending layer comprises a first array of tilting refractive plates, and the first array of tilting refractive plates is disposed between the light-emitting layer and the collimating microlens array. The display apparatus may include wherein each refractive plate in the plate array is connected to one or more adjacent plates via a flexible joint. The display apparatus may include wherein the first flexible light bending layer comprises a first flexible refractive foil, the first flexible refractive foil disposed between the light-emitting layer and the collimating microlens array. The display apparatus may further comprise a second flexible light bending layer, wherein the first flexible light bending layer is disposed between the light-emitting layer and the collimating microlens array, and the second flexible light bending layer is disposed between the first flexible light bending layer and the collimating microlens array. 
The display apparatus may include wherein the first and second flexible light bending layers comprise a first and a second array of tilting refractive plates. The display apparatus may include wherein the first and second flexible light bending layers comprise a first and a second flexible refractive foil.
[0013] In an embodiment, a method comprises: controlling at least a first actuator to generate a traveling wave in a first flexible light bending layer, said traveling wave generating an oscillation in the orientation of the first flexible light bending layer relative to a plurality of projector cells, each projector cell having (i) a subset of light sources of a light-emitting layer, and (ii) a focusing microlens; and controlling illumination of the light sources of the projector cells, based on 3D content to be projected as voxels, wherein the control of illumination is in synchronization with the traveling wave in the first flexible light bending layer. The method may include wherein the first flexible light bending layer comprises a flexible diffractive foil, wherein the microlens of each projector cell is disposed between the light-emitting layer and the first flexible light bending layer, and further comprising: modulating the diffracted light from the flexible diffractive foil with a spatial light modulator, disposed between the first flexible diffractive foil and a second microlens, that is synchronized with the light sources of the projector cells and the traveling wave; and projecting the modulated light through a third microlens array to project the 3D content. The method may include wherein the first flexible light bending layer comprises a first array of refractive tilting plates disposed between the light-emitting layer and microlens of each projector cell, and further comprising: modulating the light sources of the projector cells and synchronizing the emitted modulated light with the orientations of the refractive tilting plates; and passing the emitted modulated light through the microlens of each projector cell to project the 3D content.
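To make the synchronization described in the method concrete, the following illustrative-only sketch (all names are hypothetical and not from the claims) shows the implied control loop: track the traveling-wave phase, derive the local tilt of the flexible layer at each projector cell, and flash the sub-pixels whose momentary projection direction intersects a voxel of the 3D content:

```python
import math

def local_tilt_deg(cell_x_mm, t_s, wavelength_mm=8.0, freq_hz=60.0, max_tilt_deg=3.0):
    """Local tilt of the flexible layer at one cell for a traveling sine wave."""
    phase = 2.0 * math.pi * (cell_x_mm / wavelength_mm - freq_hz * t_s)
    return max_tilt_deg * math.cos(phase)  # slope peaks at the zero crossings

def render_frame(cells, voxels, t_s):
    # Hypothetical cell/voxel objects; 'subpixel_for' would invert the cell's
    # optics (nominal sub-pixel direction plus the wave-induced tilt offset).
    for cell in cells:
        tilt = local_tilt_deg(cell.x_mm, t_s)
        for voxel in voxels:
            direction = cell.direction_to(voxel)
            subpixel = cell.subpixel_for(direction, tilt_correction=tilt)
            if subpixel is not None:
                subpixel.flash()  # emit in sync with the traveling-wave phase
```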
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] A more detailed understanding may be had from the following description, presented by way of example in conjunction with the accompanying drawings in which like reference numerals in the figures indicate like elements, and wherein: [0015] FIG. 1A is a system diagram illustrating an example communications system in which one or more disclosed embodiments may be implemented;
[0016] FIG. 1 B is a system diagram illustrating an example wireless transmit/receive unit (WTRU) that may be used within the communications system illustrated in FIG. 1A according to an embodiment;
[0017] FIGS. 2A-2B illustrate different focal distances and eye convergence angles when looking at a real world object (FIG. 2A) and an autostereoscopic 3D display (FIG. 2B).
[0018] FIGS. 3A-3D illustrate eye focus angles and convergence angles together with pixel clusters on a flat LF display in four generalized cases.
[0019] FIGS. 4A-4C illustrate different occlusion effects of three different light fields directed inside an eye pupil.
[0020] FIG. 5 illustrates an exemplary generation of dynamic movement in a wave display.
[0021] FIG. 6A depicts an example of a combination of viewing angle and wave slope angle creating a self-occlusion of a display foil at a first wave phase; and FIG. 6B depicts an alternative wave phase where there is no occlusion at the same position.
[0022] FIG. 7 depicts an example of a virtual image of an object formed for a viewer by switching light emitting pixels on and off.
[0023] FIG. 8 illustrates one embodiment of a wave display comprising a plurality of layers.
[0024] FIG. 9A depicts an example of non-Lambertian emitters resulting in uneven display illumination to a viewing direction; and FIG. 9B depicts how the uneven display illumination may be corrected by adding a diffusing optical layer on top of the emitters.
[0025] FIG. 10 illustrates one embodiment of an optical layer used as diffuser for evening out different sized gaps between pixels positioned at different parts of a propagating wave display.
[0026] FIG. 11 illustrates one embodiment of an optical layer used as a partial reflector to make a wave display viewable on a front side and a back side.
[0027] FIG. 12 illustrates one embodiment of an optical layer utilizing deformable lens structures on the optical layer.
[0028] FIG. 13 illustrates an exemplary embodiment of the structural elements of a 3D wave display.
[0029] FIG. 14 illustrates light emission angles of a light field display.
[0030] FIGS. 15A and 15B are schematic illustrations of two example structures: a flexible optical layer on top of a rigid light emitting layer (FIG. 15A) and a rigid optical layer on top of a flexible light emitting layer (FIG. 15B). [0031] FIG. 16 illustrates generation of multiple viewing directions and virtual image depths for an array of projectors.
[0032] FIGS. 17A and 17B illustrate generation of an angular sweep through the eye box with one projector of the array of projectors of FIG. 16.
[0033] FIGS. 18A-18F are cross sectional views of a light field display illustrating display of a voxel.
[0034] FIG. 19A is a schematic presentation of the basic structure of a single LF projector cell; and FIG. 19B is a schematic presentation of how the angular sweep is generated in a display structure comprising a plurality of projector cells as in FIG. 19A.
[0035] FIG. 20 is a schematic presentation of an exemplary structure in use as a display.
[0036] FIG. 21 illustrates the optical structure of a single projector cell with holographic/standard grating film.
[0037] FIG. 22 illustrates the structure of a single projector cell using a prism-grating-prism optical element.
[0038] FIG. 23A depicts a schematic presentation of an exemplary structure of a single LF projector cell, with a refractive tilting plate; and FIG. 23B depicts a schematic presentation of sweeping through beam scanning angles in an exemplary display structure comprising a plurality of projector cells as in FIG. 23A.
[0039] FIG. 24A depicts an overview of various exemplary standing wave states of a tilting plate array, in accordance with an embodiment; and FIG. 24B illustrates an exemplary standing wave with nodes and anti-nodes.
[0040] FIG. 25 depicts a schematic presentation of an exemplary structure for generating 3D Light Fields using tilting refractive plates, in accordance with an embodiment.
[0041] FIGS. 26A and 26B are schematic cross-sectional views of a portion of a display device in an exemplary embodiment.
[0042] FIG. 27A depicts a schematic presentation of an exemplary structure of a single LF projector cell, with a diffractive foil and a spatial light modulator; and FIG. 27B depicts a schematic presentation of a similar projector cell to FIG. 27A, with multiple light emitting elements.
[0043] FIG. 28 depicts an overview of beam angle change using a diffractive foil, in accordance with an embodiment.
[0044] FIG. 29 depicts a schematic presentation of an exemplary internal structure of a 3D Light Field display with directed backlight using a diffractive foil, in accordance with an embodiment.
[0045] FIG. 30 depicts an overview of an exemplary 3D Light Field display with directed backlight using a diffractive foil, in accordance with an embodiment. [0046] FIG. 31 is a schematic presentation of the basic structure of a single LF projector cell with double tilting refractive elements.
[0047] FIGS. 32A-32C illustrate optical functions of LF pixels with tilting elements that are in opposite phase, according to an embodiment.
[0048] FIGS. 33A-33C illustrate optical functions of LF pixels with tilting elements that are in different phases, according to an embodiment.
[0049] FIG. 34A is a schematic presentation of a display structure with multiple LF pixels and two refractive foils or films with synchronized propagating waveforms, according to an embodiment; and FIG. 34B is a schematic presentation of the same display structure as FIG. 34A with propagating waveforms that are not in the same phase, according to an embodiment.
[0050] FIG. 35 is a schematic presentation of a light field display with dual flexible foils, according to an embodiment.
[0051] FIG. 36 is a schematic presentation of an alternative display structure utilizing an exemplary method in a head mounted device.
[0052] FIG. 37 is a schematic presentation of an alternative display structure with wave modules, according to an embodiment.
[0053] FIG. 38A illustrates viewing geometry of an embodiment of a LF display with a 3D image zone in front of a curved display; and FIG. 38B illustrates a viewing geometry for an embodiment with a flat screen display.
[0054] FIGS. 39A and 39B illustrate viewing geometry for two different scenarios using a curved display (as in FIG. 38A).
[0055] FIG. 40 illustrates simulated illumination patterns on different image distances with two different tilt angles and inside 3 mm x 3 mm sized sensor areas.
[0056] FIG. 41 illustrates an exemplary LF display structure with a diffractive aperture expander.
EXAMPLE NETWORKS FOR IMPLEMENTATION OF THE EMBODIMENTS
[0057] FIG. 1 A is a diagram illustrating an example communications system 100 in which one or more disclosed embodiments may be implemented. The communications system 100 may be a multiple access system that provides content, such as voice, data, video, messaging, broadcast, etc., to multiple wireless users. The communications system 100 may enable multiple wireless users to access such content through the sharing of system resources, including wireless bandwidth. For example, the communications systems 100 may employ one or more channel access methods, such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), single-carrier FDMA (SC-FDMA), zero-tail unique-word DFT-Spread OFDM (ZT UW DTS-s OFDM), unique word OFDM (UW-OFDM), resource block-filtered OFDM, filter bank multicarrier (FBMC), and the like.
[0058] As shown in FIG. 1A, the communications system 100 may include wireless transmit/receive units (WTRUs) 102a, 102b, 102c, 102d, a RAN 104/113, a CN 106/115, a public switched telephone network (PSTN) 108, the Internet 110, and other networks 112, though it will be appreciated that the disclosed embodiments contemplate any number of WTRUs, base stations, networks, and/or network elements. Each of the WTRUs 102a, 102b, 102c, 102d may be any type of device configured to operate and/or communicate in a wireless environment. By way of example, the WTRUs 102a, 102b, 102c, 102d, any of which may be referred to as a "station" and/or a "STA", may be configured to transmit and/or receive wireless signals and may include a user equipment (UE), a mobile station, a fixed or mobile subscriber unit, a subscription-based unit, a pager, a cellular telephone, a personal digital assistant (PDA), a smartphone, a laptop, a netbook, a personal computer, a wireless sensor, a hotspot or Mi-Fi device, an Internet of Things (loT) device, a watch or other wearable, a head-mounted display (HMD), a vehicle, a drone, a medical device and applications (e.g., remote surgery), an industrial device and applications (e.g., a robot and/or other wireless devices operating in an industrial and/or an automated processing chain contexts), a consumer electronics device, a device operating on commercial and/or industrial wireless networks, and the like. Any of the WTRUs 102a, 102b, 102c and 102d may be interchangeably referred to as a UE.
[0059] The communications systems 100 may also include a base station 114a and/or a base station 114b. Each of the base stations 114a, 114b may be any type of device configured to wirelessly interface with at least one of the WTRUs 102a, 102b, 102c, 102d to facilitate access to one or more communication networks, such as the CN 106/115, the Internet 110, and/or the other networks 112. By way of example, the base stations 114a, 114b may be a base transceiver station (BTS), a Node-B, an eNode B, a Home Node B, a Home eNode B, a gNB, a NR NodeB, a site controller, an access point (AP), a wireless router, and the like. While the base stations 114a, 114b are each depicted as a single element, it will be appreciated that the base stations 114a, 114b may include any number of interconnected base stations and/or network elements.
[0060] The base station 114a may be part of the RAN 104/113, which may also include other base stations and/or network elements (not shown), such as a base station controller (BSC), a radio network controller (RNC), relay nodes, etc. The base station 114a and/or the base station 114b may be configured to transmit and/or receive wireless signals on one or more carrier frequencies, which may be referred to as a cell (not shown). These frequencies may be in licensed spectrum, unlicensed spectrum, or a combination of licensed and unlicensed spectrum. A cell may provide coverage for a wireless service to a specific geographical area that may be relatively fixed or that may change over time. The cell may further be divided into cell sectors. For example, the cell associated with the base station 114a may be divided into three sectors. Thus, in one embodiment, the base station 114a may include three transceivers, i.e., one for each sector of the cell. In an embodiment, the base station 114a may employ multiple-input multiple output (MIMO) technology and may utilize multiple transceivers for each sector of the cell. For example, beamforming may be used to transmit and/or receive signals in desired spatial directions.
[0061] The base stations 114a, 114b may communicate with one or more of the WTRUs 102a, 102b, 102c, 102d over an air interface 116, which may be any suitable wireless communication link (e.g., radio frequency (RF), microwave, centimeter wave, micrometer wave, infrared (IR), ultraviolet (UV), visible light, etc.). The air interface 116 may be established using any suitable radio access technology (RAT).
[0062] More specifically, as noted above, the communications system 100 may be a multiple access system and may employ one or more channel access schemes, such as CDMA, TDMA, FDMA, OFDMA, SC-FDMA, and the like. For example, the base station 114a in the RAN 104/113 and the WTRUs 102a, 102b, 102c may implement a radio technology such as Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access (UTRA), which may establish the air interface 115/116/117 using wideband CDMA (WCDMA). WCDMA may include communication protocols such as High-Speed Packet Access (HSPA) and/or Evolved HSPA (HSPA+). HSPA may include High-Speed Downlink (DL) Packet Access (HSDPA) and/or High-Speed UL Packet Access (HSUPA).
[0063] In an embodiment, the base station 114a and the WTRUs 102a, 102b, 102c may implement a radio technology such as Evolved UMTS Terrestrial Radio Access (E-UTRA), which may establish the air interface 116 using Long Term Evolution (LTE) and/or LTE-Advanced (LTE-A) and/or LTE-Advanced Pro (LTE-A Pro).
[0064] In an embodiment, the base station 114a and the WTRUs 102a, 102b, 102c may implement a radio technology such as NR Radio Access, which may establish the air interface 116 using New Radio (NR).
[0065] In an embodiment, the base station 114a and the WTRUs 102a, 102b, 102c may implement multiple radio access technologies. For example, the base station 114a and the WTRUs 102a, 102b, 102c may implement LTE radio access and NR radio access together, for instance using dual connectivity (DC) principles. Thus, the air interface utilized by WTRUs 102a, 102b, 102c may be characterized by multiple types of radio access technologies and/or transmissions sent to/from multiple types of base stations (e.g., an eNB and a gNB). [0066] In other embodiments, the base station 114a and the WTRUs 102a, 102b, 102c may implement radio technologies such as IEEE 802.11 (i.e., Wireless Fidelity (WiFi)), IEEE 802.16 (i.e., Worldwide Interoperability for Microwave Access (WiMAX)), CDMA2000, CDMA2000 1X, CDMA2000 EV-DO, Interim Standard 2000 (IS-2000), Interim Standard 95 (IS-95), Interim Standard 856 (IS-856), Global System for Mobile communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), GSM EDGE (GERAN), and the like.
[0067] The base station 114b in FIG. 1 A may be a wireless router, Home Node B, Home eNode B, or access point, for example, and may utilize any suitable RAT for facilitating wireless connectivity in a localized area, such as a place of business, a home, a vehicle, a campus, an industrial facility, an air corridor (e.g., for use by drones), a roadway, and the like. In one embodiment, the base station 114b and the WTRUs 102c, 102d may implement a radio technology such as IEEE 802.11 to establish a wireless local area network (WLAN). In an embodiment, the base station 114b and the WTRUs 102c, 102d may implement a radio technology such as IEEE 802.15 to establish a wireless personal area network (WPAN). In yet another embodiment, the base station 114b and the WTRUs 102c, 102d may utilize a cellular-based RAT (e.g., WCDMA, CDMA2000, GSM, LTE, LTE-A, LTE-A Pro, NR etc.) to establish a picocell or femtocell. As shown in FIG. 1A, the base station 114b may have a direct connection to the Internet 110. Thus, the base station 114b may not be required to access the Internet 110 via the CN 106/115.
[0068] The RAN 104/113 may be in communication with the CN 106/115, which may be any type of network configured to provide voice, data, applications, and/or voice over internet protocol (VoIP) services to one or more of the WTRUs 102a, 102b, 102c, 102d. The data may have varying quality of service (QoS) requirements, such as differing throughput requirements, latency requirements, error tolerance requirements, reliability requirements, data throughput requirements, mobility requirements, and the like. The CN 106/115 may provide call control, billing services, mobile location-based services, pre-paid calling, Internet connectivity, video distribution, etc., and/or perform high-level security functions, such as user authentication. Although not shown in FIG. 1A, it will be appreciated that the RAN 104/113 and/or the CN 106/115 may be in direct or indirect communication with other RANs that employ the same RAT as the RAN 104/113 or a different RAT. For example, in addition to being connected to the RAN 104/113, which may be utilizing a NR radio technology, the CN 106/115 may also be in communication with another RAN (not shown) employing a GSM, UMTS, CDMA 2000, WiMAX, E-UTRA, or WiFi radio technology.
[0069] The CN 106/115 may also serve as a gateway for the WTRUs 102a, 102b, 102c, 102d to access the PSTN 108, the Internet 110, and/or the other networks 112. The PSTN 108 may include circuit- switched telephone networks that provide plain old telephone service (POTS). The Internet 110 may include a global system of interconnected computer networks and devices that use common communication protocols, such as the transmission control protocol (TCP), user datagram protocol (UDP) and/or the internet protocol (IP) in the TCP/IP internet protocol suite. The networks 112 may include wired and/or wireless communications networks owned and/or operated by other service providers. For example, the networks 112 may include another CN connected to one or more RANs, which may employ the same RAT as the RAN 104/113 or a different RAT.
[0070] Some or all of the WTRUs 102a, 102b, 102c, 102d in the communications system 100 may include multi-mode capabilities (e.g., the WTRUs 102a, 102b, 102c, 102d may include multiple transceivers for communicating with different wireless networks over different wireless links). For example, the WTRU 102c shown in FIG. 1A may be configured to communicate with the base station 114a, which may employ a cellular-based radio technology, and with the base station 114b, which may employ an IEEE 802 radio technology.
[0071] FIG. 1 B is a system diagram illustrating an example WTRU 102. As shown in FIG. 1 B, the WTRU 102 may include a processor 118, a transceiver 120, a transmit/receive element 122, a
speaker/microphone 124, a keypad 126, a display/touchpad 128, non-removable memory 130, removable memory 132, a power source 134, a global positioning system (GPS) chipset 136, and/or other peripherals 138, among others. It will be appreciated that the WTRU 102 may include any sub-combination of the foregoing elements while remaining consistent with an embodiment.
[0072] The processor 118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor 118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 102 to operate in a wireless environment. The processor 118 may be coupled to the transceiver 120, which may be coupled to the transmit/receive element 122. While FIG. 1 B depicts the processor 118 and the transceiver 120 as separate components, it will be appreciated that the processor 118 and the transceiver 120 may be integrated together in an electronic package or chip.
[0073] The transmit/receive element 122 may be configured to transmit signals to, or receive signals from, a base station (e.g., the base station 114a) over the air interface 116. For example, in one embodiment, the transmit/receive element 122 may be an antenna configured to transmit and/or receive RF signals. In an embodiment, the transmit/receive element 122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, for example. In yet another embodiment, the transmit/receive element 122 may be configured to transmit and/or receive both RF and light signals. It will be appreciated that the transmit/receive element 122 may be configured to transmit and/or receive any combination of wireless signals.
[0074] Although the transmit/receive element 122 is depicted in FIG. 1 B as a single element, the WTRU 102 may include any number of transmit/receive elements 122. More specifically, the WTRU 102 may employ MIMO technology. Thus, in one embodiment, the WTRU 102 may include two or more
transmit/receive elements 122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 116.
[0075] The transceiver 120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 122 and to demodulate the signals that are received by the transmit/receive element 122. As noted above, the WTRU 102 may have multi-mode capabilities. Thus, the transceiver 120 may include multiple transceivers for enabling the WTRU 102 to communicate via multiple RATs, such as NR and IEEE 802.11, for example.
[0076] The processor 118 of the WTRU 102 may be coupled to, and may receive user input data from, the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit). The processor 118 may also output user data to the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128. In addition, the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132. The non-removable memory 130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device. The removable memory 132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like. In other embodiments, the processor 118 may access information from, and store data in, memory that is not physically located on the WTRU 102, such as on a server or a home computer (not shown).
[0077] The processor 118 may receive power from the power source 134, and may be configured to distribute and/or control the power to the other components in the WTRU 102. The power source 134 may be any suitable device for powering the WTRU 102. For example, the power source 134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and the like.
[0078] The processor 118 may also be coupled to the GPS chipset 136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 102. In addition to, or in lieu of, the information from the GPS chipset 136, the WTRU 102 may receive location information over the air interface 116 from a base station (e.g., base stations 114a, 114b) and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 102 may acquire location information by way of any suitable location- determination method while remaining consistent with an embodiment.
[0079] The processor 118 may further be coupled to other peripherals 138, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity. For example, the peripherals 138 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs and/or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, a Virtual Reality and/or Augmented Reality (VR/AR) device, an activity tracker, and the like. The peripherals 138 may include one or more sensors, the sensors may be one or more of a gyroscope, an accelerometer, a hall effect sensor, a magnetometer, an orientation sensor, a proximity sensor, a temperature sensor, a time sensor; a geolocation sensor; an altimeter, a light sensor, a touch sensor, a magnetometer, a barometer, a gesture sensor, a biometric sensor, and/or a humidity sensor.
[0080] The WTRU 102 may include a full duplex radio for which transmission and reception of some or all of the signals (e.g., associated with particular subframes for both the UL (e.g., for transmission) and the downlink (e.g., for reception)) may be concurrent and/or simultaneous. The full duplex radio may include an interference management unit to reduce and/or substantially eliminate self-interference via either hardware (e.g., a choke) or signal processing via a processor (e.g., a separate processor (not shown) or via processor 118). In an embodiment, the WTRU 102 may include a half-duplex radio for which transmission and reception of some or all of the signals (e.g., associated with particular subframes for either the UL (e.g., for transmission) or the downlink (e.g., for reception)) are not concurrent.
DETAILED DESCRIPTION
[0081] A detailed description of illustrative embodiments will now be provided with reference to the various Figures. Although this description provides detailed examples of possible implementations, it should be noted that the provided details are intended to be by way of example and in no way limit the scope of the application.
[0082] Note that various hardware elements of one or more of the described embodiments are referred to as "modules" that carry out (i.e., perform, execute, and the like) various functions that are described herein in connection with the respective modules. As used herein, a module includes hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more memory devices) deemed suitable by those of skill in the relevant art for a given implementation. Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module, and it is noted that those instructions could take the form of or include hardware (i.e., hardwired) instructions, firmware instructions, software instructions, and/or the like, and may be stored in any suitable non-transitory computer-readable medium or media, such as those commonly referred to as RAM, ROM, etc.
Volumetric Dynamic Wave Display for Displaying 3D Content
[0083] In some embodiments, a 3D volumetric display can be made with a single light emitting flexible layer by generating a propagating wave through the pixelated sheet and by synchronizing the light emittance of each pixel to the wave phase and 3D image content. Pixels create virtual images of 3D objects in the air volume while moving on the fast propagating wave, as the human visual system integrates the image due to the persistence of vision (POV) phenomenon.
[0084] The propagating display sheet wave sweeping through the 3D volume fills the whole depth dimension on the display area. The wave display sheet oscillates and the user is looking at the device from the wave amplitude direction. The wave propagates through the device display area in a perpendicular direction to the user's line of sight, and this makes it possible for the display to create occlusion behind the pixels as the whole area is filled with material all the time, unlike in the case where, for example, a rotating blade with light emitting diodes (LEDs) is used.
[0085] The propagating wave moving in one direction (e.g., horizontal) is generated by back-and-forth movement of the sheet edges or along the display area. Propagation of the display sheet wave ensures that the emitting surface sweeps through every voxel in the 3D volume during one refresh cycle. Each voxel on the 3D volumetric display is created by display elements whose intermittent light emission is synchronized to the wave propagation speed and the 3D image content.
[0086] The wave display sheet can be a stack of flexible layers with one or more light emitting layers and optical layers. The light emitting elements can be positioned only on one side of the display sheet in order to cover a hemisphere or on both sides in order to cover a full sphere.
[0087] Usually light emitting elements like LEDs radiate light diffusely in every direction in a hemisphere with a so-called Lambertian emission distribution, which is advantageous for this use case. If the distributions differ from Lambertian, the optical layers enable constant voxel illumination and better display uniformity from all viewing directions by altering the illumination distribution of the light emitting elements.
[0088] An optical layer can also be used, for example, as a partially reflecting element, which may be used in a two-sided display with a single sparse emitter matrix. If the optical layer shapes are made from a gel-like deformable material, the waviness of the whole foil can be used to advantage, as the optical layer shapes are deformed by the foil bending radius. In this case, the deformed optical shapes can be used to somewhat extend the visual depth of the structure without increasing the actual wave amplitude.
[0089] In some embodiments, the wave display may be a fairly simple, compact, inexpensive, and robust structure. As the 3D display volume is filled with a wavy sheet, it can also be safer than the current 3D volumetric displays that are based on fast rotating elements, which need to be shielded with, for example, a glass or acrylic sphere. The wave display can be made with an improved form factor compared to rotating displays, as the wave display may be made flatter, whereas the rotating displays generally need a cylindrical or spherical volume.
[0090] The propagating wave also makes a displayed 3D volume that may have a generally cubic shape, instead of the cylindrical volume created with some rotating displays. Because the propagating wave sweeps every part of the 3D volume equally, there is as much depth at the edges as there is at the center. In exemplary embodiments, there are also no "dead zones" on the display area, as there are no areas that move at different speeds from each other throughout the display area, in contrast to the case of a rotating display axis.
[0091] The wavy display may also enable creation of occlusion behind the display pixels, as the whole area is filled with material (e.g., black background) all the time, unlike in the case of many other POV displays, for example when a rotating structure is used.
[0092] Flexible displays are already available (e.g., Organic Light Emitting Diode (OLED) displays) and under intense technical development leading to more robust and higher resolution displays.
[0093] The herein-disclosed display structures and methods may be used with both 3D and 2D image content without loss of resolution by making the propagating wave amplitude zero.
[0094] The systems and methods disclosed herein may, in some embodiments, use a flexible light emitting sheet (or display foil) to create a 3D volumetric display for presenting 3D image content. A propagating wave may be generated in a pixelated sheet, and by synchronizing the light emittance of each pixel to the wave phase, amplitude, and 3D image content, an image may be formed in the air volume covered by the propagating wave. Pixels may create virtual images of 3D objects while moving on the fast propagating wave, as the human visual system integrates the image due to the persistence-of-vision (POV) phenomenon.
[0095] Dynamic wave propagation (propagating wave 510) in one direction (e.g., horizontal) on a flexible display sheet 505 can be generated by moving one or both sheet ends linearly (linear motion 515) in the amplitude direction of the wave, as shown in FIG. 5. The display sheet 505 may be longer than the dimension of a display apparatus in order to cover the whole display area when the waveform is used. The desired length may be determined by the number of waves and the wave amplitude used. If the display is used in a 2D mode and the foil is flat, the extra foil may be collected inside a display frame. Either or both of linear motion and bending movement (e.g., angular movement 520 of a sheet end) may be used at the sheet ends in order to generate the propagating wave through the device width.
[0096] If linear motion is generated at both ends of the flexible display foil, the movements may be synchronized in order to avoid a standing wave by controlling the sheet length. Continuous oscillation and correct synchronization cause the propagating wave to travel through the display width. The light emitting display may be self-emitting and flexible. Exemplary display technologies that can fulfill these criteria include, but are not limited to, OLED displays and LED matrices bonded to a flexible substrate. The OLED may be the better candidate of these two options, as the structures can be printed and made more flexible. One example OLED display structure has been described in J. Wallace, "Highly flexible OLED light source has 10 micron bend radius," Laser Focus World, July 31, 2013. This structure is only 2 μm thick and can have a minimum 10 μm bending radius. The sheet's mass is 3 g/m² and its achieved brightness 100 cd/m². While some commercially available OLED displays are built by using glass as a barrier material, exemplary embodiments make use of flexible barrier layers. In some embodiments, a two-sided oscillating wave display is provided to enable 3D volumetric viewing from the front and back sides of the display.
[0097] As the propagating wave travels through the display volume, there will generally be pixels over the whole display area at all times, visible from all viewing directions. However, the display sheet may create self-occlusion at certain wave phases for a viewer positioned at a viewing angle that is larger than the maximum slope angle of the wave. As shown in FIG. 6A, there is a first viewer 602 and a second viewer 604, both looking at the display sheet 605, which may comprise a light emitting foil. There may be a light emitting element 610 which is not visible to both viewers, given the slope angle 612 of the display sheet 605 and the viewing angle 614 of the second viewer 604. With another wave phase (at slope angle 617 and viewing angle 619) (as compared to that of FIG. 6A), the same position of the display sheet 605 becomes visible, as shown in FIG. 6B, where the light emitting element 615 may be visible to both viewers.
[0098] In some embodiments, the oscillations of the light emitting substrate may be fast enough to enable the POV effect of the human visual system. At the central field of view, this means that a ~60 Hz refresh frequency for the display may be adequate, with no visible flicker. As the display is volumetric, in some embodiments each pixel sweeps through the whole depth of the volume during one refresh cycle. This means that movement of half a wave is used for the ~60 Hz refresh rate, and the full wave form should oscillate at a ~30 Hz frequency.
[0099] The shape of the propagating wave may be formed so as to be optimal for a given use case. For example, sinusoidal, saw, triangle, or pulse waves may be generated by controlling the linear motors (or other sheet movement generators, such as electrical conductors, electromagnets, angular motors, etc.) on both ends of the foil appropriately. Waveforms with fast slope angle changes (e.g., triangle) may require a smaller bend radius from the flexible light emitting sheet than waveforms which have smoother slope angle transitions, making them more demanding from the material durability point-of-view. However, for example, triangular waves may lead to a more linear distance change and constant tilt angle for the light emitting pixels, which may benefit the overall system design. A non-periodic waveform may also be used as a content-adaptive way to, among other things, save energy, but this may come at the cost of making the system more complex. Considering the volumetric display application, an even sweep depth for the foil movement through the display area may be most desirable. The physical properties of the light emitting foil (e.g., stiffness) may be such that the waveform amplitude or shape degrades as the distance to the linear motor increases. In some embodiments, such amplitude degradation is compensated for by gradually changing the foil thickness in the direction of the wave propagation.
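By way of illustration only (and not part of the original disclosure), the following minimal Python sketch shows one way the drive waveforms named above could be sampled for a sheet movement generator. The function name, amplitude, and frequency values are hypothetical; the 30 Hz default mirrors the full-wave oscillation frequency derived in paragraph [0098].

```python
import math

def motor_displacement(t, waveform="sine", amplitude=0.01, frequency=30.0):
    """Displacement (m) of one sheet end at time t (s) for a chosen
    periodic drive waveform; values here are illustrative only."""
    phase = (t * frequency) % 1.0  # normalized wave phase in [0, 1)
    if waveform == "sine":
        return amplitude * math.sin(2.0 * math.pi * phase)
    if waveform == "triangle":
        # Linear ramps; sharper slope changes demand a smaller bend radius.
        return amplitude * (4.0 * abs(phase - 0.5) - 1.0)
    if waveform == "saw":
        return amplitude * (2.0 * phase - 1.0)
    if waveform == "pulse":
        return amplitude if phase < 0.5 else -amplitude
    raise ValueError("unknown waveform: " + waveform)
```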
[0100] The minimum bending radius of the flexible light emitting foil may also set a limit for the minimum wave amplitude and propagation speed. If the example OLED display structure described in J. Wallace is used, the 10 μm bend radius results in a minimum 20 μm wave amplitude and 40 μm wavelength with a waveform that comprises two semicircles. This kind of volumetric display may be used for showing shallow relief patterns, as described in further detail below. An adequate propagation speed for this waveform can be calculated from the refresh frequency (60 Hz) and wavelength (40 μm) to be ~2.4 mm/s (i.e., 0.04 mm ÷ 1/60 s). With large-scale volumetric displays the bending radius can be much larger, and material properties do not set a strict limit to the amplitude. For example, if the display width is 500 mm (~25-inch display) and the wave amplitude (volumetric depth) is 250 mm, the bending radius would be around 125 mm. In this case, a speed of ~15 m/s (i.e., 0.25 m ÷ 1/60 s) can be calculated for the propagating wave.
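For illustration, a small sketch (not part of the disclosure) that reproduces the micro-scale figures above under the stated two-semicircle assumption, where the minimum amplitude is twice the bend radius and the wavelength four times it; the function name is hypothetical.

```python
def semicircle_wave_parameters(bend_radius_m, refresh_hz=60.0):
    """For a waveform built from two semicircles: minimum amplitude is
    2 * bend radius, wavelength is 4 * bend radius, and propagation
    speed follows from wavelength and refresh frequency."""
    amplitude_m = 2.0 * bend_radius_m
    wavelength_m = 4.0 * bend_radius_m
    speed_m_s = wavelength_m * refresh_hz
    return amplitude_m, wavelength_m, speed_m_s

# The 10 um bend radius OLED foil: 20 um amplitude, 40 um wavelength,
# and 40 um * 60 Hz = 2.4 mm/s propagation speed, as in the text.
print(semicircle_wave_parameters(10e-6))  # (2e-05, 4e-05, 0.0024)
```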
[0101] The brightness of the image producing substrate may be selected so as to be adequate considering the use case and, for example, the desired refresh frequency. Currently available POV displays use either projection systems or LED rows. An LED matrix on a flexible substrate can already produce adequate brightness for the volumetric wave display in normal indoor lighting conditions.
[0102] If the light emitting foil material does not stretch, only transverse waves are generated and single pixels of the display sheet move only in one direction, as determined by the wave amplitude. This makes two pixel coordinates (e.g., x and y) fixed as the third coordinate (e.g., z) varies between values limited by the wave amplitude. As the wave is dynamic and it travels through the display area, its phase changes with time. Position and surface normal angle of each pixel at all times can be calculated from the wave amplitude, shape, and phase, which are all controlled by the wave generation mechanism (or a wave generator, or the like). A 3D picture can be generated by switching on and off the single pixels at the right moments when each pixel is positioned in the correct coordinate position, as determined by the 3D geometry to be presented. The depth coordinates of each "voxel" may be continuous as the pixels sweep through the whole volume. Each voxel of a 3D image is located physically at the spatial position where it is supposed to be, and the corresponding pixel emits light from that position toward the viewer. As the pixels of the light emitting foil 705 are located in their correct 3D positions, the eyes 702 of the viewer 700 both focus and converge naturally to the virtual 3D image 710 as shown in FIG. 7. Natural 3D perception occurs as the two eyes see two different views with correct retinal blur and eye convergence depth cues. The same viewing condition is also present at different viewing angles for a single user or for multiple users.
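By way of illustration only, the following minimal Python sketch shows the synchronization logic just described, under the simplifying assumption of an ideal sinusoidal propagating wave whose amplitude, wavelength, and speed are known from the wave generator; the function names and the tolerance parameter are hypothetical and not taken from the disclosure.

```python
import math

def foil_depth(x_m, t_s, amplitude_m, wavelength_m, speed_m_s):
    """Depth coordinate z of the foil at lateral position x and time t,
    assuming an idealized sinusoidal propagating wave."""
    return amplitude_m * math.sin(
        2.0 * math.pi * (x_m - speed_m_s * t_s) / wavelength_m)

def should_emit(x_m, t_s, voxel_z_m, tol_m,
                amplitude_m, wavelength_m, speed_m_s):
    """Switch a pixel on when the foil carries it through the depth of
    the voxel assigned to its fixed (x, y) position by the 3D content."""
    z_m = foil_depth(x_m, t_s, amplitude_m, wavelength_m, speed_m_s)
    return abs(z_m - voxel_z_m) <= tol_m
```

Because the wave phase advances continuously, each pixel passes through every depth once per half wave, so the same test run at the display refresh rate yields the time-synchronized on-off switching described above.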
[0103] Pixel density visible to a single viewer changes with the slope angle of the wave. The lowest density is seen when the emitting element surface normal points in the same direction as the viewing direction, and the highest density is visible when the surface normal is furthest from the viewing direction. This resolution difference at different wave phase positions can be mitigated by switching several neighboring pixels on at the same time when the pixel density is larger, making the visible area of this clustered pixel closer in size to a pixel in the low-density area. With several viewers and viewing directions, the effect is balanced to accommodate the fact that the visible size of the pixels will be different from different viewing angles.
[0104] In some embodiments, the display sheet itself is a stack of flexible layers, as shown in FIG. 8. A base layer 810 may comprise an array of light emitting elements which are activated according to the 3D content. In front of the emitting layer there may be one or more optical layers 820 controlling light direction, enabling better display uniformity if the emitting elements do not have an ideal Lambertian illumination angular distribution. The whole stack of layers may move as a single wave.
[0105] Optical layers used in different embodiments include arrays of refractive, reflective, diffractive, dichroic, absorbing, or scattering elements. The optical elements change the angular distribution of light emitted by the display's active elements. Normal OLED or LED emitters 910 (on a substrate foil 905) radiate light according to Lambert's law, which dictates that the emitted power is smaller at larger angles from the emitter surface normal (see FIG. 9A). In this case, the intensity distribution follows a cosine relation between the angular directions of the observer's line of sight and the emitter 910 surface normal. Such a
Lambertian emitter would appear to be evenly bright at all angles as the surface area is smaller with the same cosine relation. However, some emitters do not have perfect Lambertian distributions, and in these cases the optical layer 915 may be configured so as to diffuse the light towards the more ideal distribution. Such a use case for the optical layer 915 is illustrated in FIG. 9B.
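For illustration, a one-line Python expression of Lambert's cosine law as used above; this is a generic textbook relation, not code from the disclosure, and the function name is hypothetical.

```python
import math

def lambertian_intensity(theta_rad, peak_intensity=1.0):
    """Radiant intensity of an ideal Lambertian emitter,
    I(theta) = I0 * cos(theta), with theta measured from the
    emitter surface normal."""
    return peak_intensity * max(0.0, math.cos(theta_rad))

# Perceived brightness stays constant with angle because the projected
# emitting area shrinks by the same cos(theta) factor.
```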
[0106] In some embodiments, the optical layer operates to provide diffusion of the boundaries between pixels as illustrated in FIG. 10. The visual pixel density is different along the display surface (which is backed by the substrate foil 1005) depending on the position of the pixels 1010 (or light emitting elements) on the propagating wave as well as on the viewer position (e.g., at one point along the surface there may be a large gap 1012 between pixels, whereas at another point along the surface there may be a small gap 1014 between pixels). The diffusing optical layer 1015 can be used for evening out the visual differences.
[0107] Another use case for an optical layer is illustrated in FIG. 11. The optical layer 1115 can also be used for reflection of emitted light from pixels 1110 (e.g., light emitting elements 1110 on substrate foil 1105) towards the back side of the display, and in this way a single emitter layer can be used for creation of a double-sided display. For example, the optical layer 1115 may include reflective surfaces 1120, and also transmissive surfaces 1125.
[0108] Another functionality of an optical layer is illustrated in FIG. 12. In the embodiment of FIG. 12, the optical layer 1215 comprises small lenses that are deformable such that they may have different focal distances when they are bent at the trough of the wave (such as squeezed lens 1225) or flattened at the top of it (such as stretched lens 1220). This feature may be used to advantage by extending the visible depth of the display (by adding virtual depth), as the emitters 1210 of the light emitting layer 1205 appear to be further away (e.g., virtual image 1230) when the lens has a smaller focal length.
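By way of illustration only, the standard thin-lens relation below sketches why a squeezed (shorter focal length) lens pushes the emitter's virtual image further back; the numeric values are hypothetical and the model ignores lens thickness and aberrations.

```python
def virtual_image_distance(emitter_distance_m, focal_length_m):
    """Thin-lens estimate for an emitter inside the focal length
    (d < f): the virtual image lies at d * f / (f - d) behind the
    lens. Squeezing the lens at the wave trough shortens f toward d,
    pushing the virtual image further back and adding apparent depth."""
    d, f = emitter_distance_m, focal_length_m
    if not d < f:
        raise ValueError("virtual image regime requires d < f")
    return d * f / (f - d)

# Emitter 0.8 mm behind the lens: flattened lens (f = 2.0 mm) vs.
# squeezed lens (f = 1.0 mm).
print(virtual_image_distance(0.8e-3, 2.0e-3))  # ~1.33 mm
print(virtual_image_distance(0.8e-3, 1.0e-3))  # 4.0 mm
```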
[0109] A 3D wave display system, one embodiment of which is illustrated in FIG. 13, may comprise a playback device 1305 that provides 3D content to a display device. The display device may have support mechanics 1320 for a flexible wave display sheet 1335 inside a display frame 1330. The support mechanism 1320 may include linear and/or angular momentum motors or moving supports at the vertical ends of the sheets (e.g., sheet movement generators 1325). Instead of motors, there may be electrical conductors or electromagnets along the display width which generate dynamic propagating wave movement (such as propagating wave 1340) for the flexible sheet 1335 with electromagnetic force. The playback device 1305 may calculate the correct control signals 1315 to the linear and angular motors (or sheet movement generators 1325) and send them to the display apparatus, which has control electronics which activate the motors according to the control signals. The playback device 1305 may also calculate the right timing for synchronized on-off switching of each pixel, and send these as a display signal 1310 to the display apparatus, which may have display control electronics which activate the pixels according to the display signal 1310. Both the control signal 1315 and display signal 1310 may be calculated in the playback device 1305 on the basis of the 3D content to be displayed, and with the known physical parameters of the actuator motors (or sheet generator motors 1325) and display sheet 1335.
[0110] In some embodiments, rather than a display sheet, an array of moving display rods may be used to create a wave path instead of a single continuous sheet. Light emitting elements may reflect ambient or projected light. They may also reflect white light or colors when turned on and absorb light when switched off.

[0111] In some embodiments, the moving wave on the sheet may act like an air pump which causes air flow in front of the device. This may be avoided in some embodiments, for example, by placing the waving element in a vacuum or a thin and/or light gas within a closed device.
[0112] In additional embodiments, further variations of the systems and methods set forth above may also be utilized, for example as discussed below.
[0113] 2D display. In some embodiments, the wave display geometric form may have zero amplitude or frequency, which generates a conventional flat 2D display surface. In this case, the extra foil (or wave display sheet) may be gathered at the frame around the display device (or otherwise retained).
[0114] Deformable lens at lenticular optical layer in 3D wave display. In some embodiments, the wave display optical layer lenticular array is made from a deformable, gel-like material. In such embodiments, the lens focal length is dependent on the display wave phase and lens location. A single lens may be "squeezed" to have a smaller radius at the bottom of the wave and "flattened" at the top of the wave (see FIG. 12). This means that the lens power at different positions on the wave may be changed, and the emitting pixel below the lens appears to be either closer to or further away from the viewer or observer. The apparent distance of emitters can be used in extending the apparent range of depth without increasing the physical wave amplitude of the display.
[0115] 2D plus relief wave display. In the case of, for example, e-book reader devices, display content may be enhanced by displaying shallow 3D reliefs. Examples of this may include a select button with clearly defined edges or certain words that are emphasized. The 3D depth does not need to be high, and a wave display with small amplitude may be implemented in this environment. The wave display optical layers can be utilized here to increase the 3D image depth beyond the physical wave amplitude.
[0116] 2D plus relief wave display with haptic feedback. In some embodiments, the display includes an array of small wave displays with shallow wave amplitudes suitable for relief presentation. In this embodiment, the wave propagates in both X and Y directions that are orthogonal to the observer's line of sight. Wave propagation may be generated on two sides, or every side, of the array sub-element. The sub-elements of the matrix may be the size of a few surface-wavelengths (e.g., between one and five surface-wavelengths, between one and ten surface-wavelengths, and/or the like). The waves in a sub-element generate an illusion of movement, texture, or relief form by means of discrete cosine transforms (DCT), as sketched below. There may be one or multiple wave peaks on one sub-element so that the DCT method can generate enough variations of 3D patterns recognizable by the user. A relief wave display user is able to touch a true 3D volumetric relief image on the display surface of the reader device, making it a haptic feedback device. This may also enable blind people to read documents on reader devices, such as where the letters of a document are in the form of a 3D relief or Braille pattern. Such embodiments may be used as mobile device touch screens or other user interfaces.
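For illustration only, a minimal sketch of relief synthesis from DCT coefficients in the spirit of the paragraph above; the patent does not specify this algorithm, and the function name, grid size, and coefficient values are hypothetical.

```python
import numpy as np

def relief_from_dct(coeffs, size=16):
    """Synthesize a small relief height map as a sum of 2D DCT basis
    functions; `coeffs` maps (u, v) frequency indices to weights."""
    x = (np.arange(size) + 0.5) / size  # sample positions in [0, 1)
    relief = np.zeros((size, size))
    for (u, v), w in coeffs.items():
        relief += w * np.outer(np.cos(np.pi * u * x),
                               np.cos(np.pi * v * x))
    return relief

# One horizontal wave peak plus a weaker diagonal component:
height_map = relief_from_dct({(1, 0): 1.0, (1, 1): 0.5})
```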
[0117] Multiview autostereoscopic 3D wave display. In an embodiment of a multiview autostereoscopic 3D wave display, a lenticular sheet or absorbing parallax barrier grating is used as an optical layer in front of the light emitting layer. With this optical layer, the observer sees different images with the two eyes and perception of 3D content is created. An autostereoscopic wave display can be designed for multiple viewers or viewing angles with different images for the separate surface tilt angles. As the propagating wave scans through the display surface, different images are scanned to different viewing angles by synchronizing the image content with the wave surface normal directions. The optical layer blocks all other visible angles making the view unique to the specific direction. This approach has the benefit of using only two pixels as a stereo pair for all multiview directions instead of using several pixels, and a lower pixel count light emitting layer can be used. These embodiments may combine spatial and temporal multiplexing, making it possible to optimize the autostereoscopic 3D display structure better for an available display and signal processing hardware.
[0118] In an exemplary real world scenario, there may be a wave display device on a table. FIG. 13 illustrates one embodiment of the structure of a table-top wave display system and device. There may be a 3D content playback device connected with a cable to a wave display device placed on a table. The playback device sends data that is used for switching on and off the light emitting elements on the display device's light emitting flexible layer (e.g., OLED-display foil). The playback device also provides control signals for the motors that generate movement and angular moment for the wave propagating on the flexible display sheet. The linear motors move the flexible display sheet back and forth with an amplitude appropriate for generating the 3D image content. Depending on the stiffness of the display sheet, angular momentum may also be applied to initiate wave propagation through the device area. In some
embodiments there are movement actuators at both ends of the waving sheet. Display elements on the propagating wave are switched on and off with control electronics according to their location and in synchronization with the 3D content. An optical layer, positioned on top of the emitting layer, provides even illumination through the device width by diffusing the illumination patterns of the light emitting elements to ideal Lambertian distributions. An observer looks at the device from a direction that is perpendicular to the wave propagation, from a typical display viewing distance.
[0119] In one embodiment, there is a method of displaying 3D images on a wave display, comprising: generating a propagating wave in a flexible display sheet comprising a plurality of light emitting elements, with a wave generator; performing temporal tracking of the propagating wave, by at least one of: a priori knowledge based on a state of the wave generator; and monitoring of the propagating wave; and processing and driving video content to the flexible display sheet in a time synchronized manner based on a dynamic location of each light emitting element and content in a render buffer. The method may include wherein the flexible display sheet further comprises at least one optical element added to provide a diffuser effect. The method may include wherein the flexible display sheet further comprises at least one deformable optical element to extend a visible depth of the flexible display sheet. The method may include wherein the wave display is configured as a two-dimensional array of small wave displays each having amplitudes suitable for relief presentation. The method may include wherein the wave generator propagates the wave in directions orthogonal to an observer's line of sight. The method may include wherein wave propagation is generated on two sides of each small wave display of the array, or wherein wave propagation is generated on all sides of each small wave display of the array. The method may include wherein each small wave display of the array is the size of a few surface-wavelengths, or the size of ten or fewer surface-wavelengths, or the size of five or fewer surface-wavelengths. The method may include wherein propagated waves in each small wave display uses discrete cosine transforms (DCT) to generate a relief image. The method may include wherein each small wave display is configured to have at least one wave peak, such that DCT methods generate variations of 3D patterns recognizable by a user. The method may include wherein each small wave display is configured to display letters of a document in either 3D relief or a Braille pattern. The method may include wherein the wave display is further configured for haptic feedback. The method may include wherein the wave display is configured with an absorbing parallax barrier grating as an optical layer to be a Multiview autostereoscopic 3D display. The method may further comprise synchronizing image content with the wave surface normal directions to configure the wave display such that different images are displayed at separate surface tilt angles. The method may include wherein the wave display is configured with a lenticular sheet as an optical layer to be a Multiview autostereoscopic 3D display. The method may further comprise synchronizing image content with the wave surface normal directions to configure the wave display such that different images are displayed at separate surface tilt angles. The method may include wherein groups of the plurality of light emitting elements are organized and controlled as voxel elements.
[0120] In one embodiment, there is a 3D wave display, comprising: a playback device; and a display device, comprising a flexible wave display sheet disposed within a display frame, the flexible wave display sheet supported and driven by a sheet movement generator within the display frame. The display may include wherein the playback device is configured to calculate a control signal for a sheet movement generator. The display may include wherein the playback device is configured to calculate a timing for synchronized on-off switching of each of a plurality of pixels of the flexible wave display sheet, and communicate said calculated timing as a display signal to the display device. The display may include wherein the display device further comprises display control electronics configured to activate the plurality of pixels of the flexible wave display sheet according to the display signal from the playback device. The display may include wherein the playback device is configured to calculate a control signal and a display signal based on 3D content to be displayed at the display device and physical parameters of the sheet movement generator. The display may include wherein the flexible wave display sheet is disposed within the display frame in a vacuum region of a closed display device, or disposed within the display frame in a thin gas region of a closed display device, or disposed within the display frame in a light gas region of a closed display device. The display may include wherein the sheet movement generator comprises a linear motor, or an angular momentum motor; or at least one moving support at a vertical end of the flexible wave display sheet; or an electrical conductor disposed along a width of the display frame, the electrical conductor configured to generate dynamic propagating wave movement for the flexible wave display sheet by an electromagnetic force; or an electromagnet disposed along a width of the display frame, the electromagnet configured to generate dynamic propagating wave movement for the flexible wave display sheet by an electromagnetic force. The display may include wherein the flexible wave display sheet comprises a plurality of LED or OLED emitters.
[0121] In one embodiment, there is a 3D wave display, comprising: a playback device; and a display device, comprising an array of moving display rods disposed within a display frame, the array of moving display rods supported and driven by a wave movement generator within the display frame.
[0122] In one embodiment, there is a 3D wave display system comprising a processor and a non-transitory storage medium storing instructions operative, when executed on the processor, to perform functions including: generating a propagating wave in a flexible display sheet comprising a plurality of light emitting elements, with a wave generator; performing temporal tracking of the propagating wave, by at least one of: a priori knowledge based on a state of the wave generator; and monitoring of the propagating wave; and processing and driving video content to the flexible display sheet in a time synchronized manner based on a dynamic location of each light emitting element and content in a render buffer.
Light Fields and Flat Form Display
[0123] In some embodiments, rather than a volumetric 3D display, a flat form-factor display device may take advantage of the properties of a flexible sheet, as in the previously discussed volumetric display.
[0124] Several different kinds of rendering schemes may be used together with the disclosed display structures and optical methods. Depending on the selected rendering scheme, the particular embodiment of the display device may be either a multiview display with a very dense grid of angular views or a true light field display with multiple views and focal surfaces. In addition, the structure may function as a regular 2D display by activating all the sub-pixels inside a LF pixel simultaneously.

[0125] Exemplary methods are able to provide both the large light emission angles that are useful for eye convergence and the small emission angles that are desirable for natural eye retinal focus cues. In addition, some such methods make it possible to create multiple focal surfaces outside the display surface to address the VAC problem. Such embodiments present a way to simultaneously scan the small light emission angles and focus the voxel-forming beams.
Light Field Display Geometry.
[0126] FIG. 14 is a schematic view of the geometry involved in the creation of the light emission angles associated with a LF display 1405 capable of producing retinal focus cues and multiple views of 3D content with a single flat form-factor panel. A single 3D display surface 1405 is preferably able to generate at least two different views to the two eyes of a single user in order to create the coarse 3D perception effect already utilized in current 3D stereoscopic displays. The brain uses these two different eye images for the calculation of 3D distance based on the triangulation method and the interpupillary distance. This means that at least two views are preferably projected into the Single-user Viewing Angle (SVA) shown in FIGS. 3A-3D. In addition, a true LF display is preferably able to project at least two different views inside a single eye pupil in order to provide the correct retinal focus cues. For optical design purposes, an "eye-box" is usually defined around the viewer's eye pupil when determining the volume of space within which a viewable image is formed (e.g., "eye-box" width 1425). In the case of the LF display 1405, at least two partially overlapping views are preferably projected inside the Eye-Box Angle (EBA) covered by the eye-box at a certain viewing distance 1420. If the display 1405 is intended to be used by multiple viewers (e.g., viewers 1401, 1402, 1403) looking at the display 1405 from different viewing angles, several views of the same 3D content (e.g., virtual object point 1410) are preferably projected to all viewers, covering the whole intended Multi-user Viewing Angle (MVA).
[0127] If a LF display is positioned at a 1 m distance from a single viewer and the eye-box width is set to 10 mm, then the value for the EBA would be ~0.6 degrees, and one view of the 3D image content should be produced for each ~0.3 degree angle. As the standard human interpupillary distance is ~64 mm, the SVA would be ~4.3 degrees, and around 14 different views would be called for just for a single viewer positioned in the direction of the display normal (if the whole facial area of the viewer is covered). If the display is intended to be used with multiple users, all positioned inside a moderate MVA of 90 degrees, then a total of 300 different views are called for. A similar calculation for a display positioned at a 30 cm distance (e.g., a mobile phone display) would result in only 90 different views for a horizontal multi-view angle of 90 degrees. And if the display is positioned 3 m away (e.g., a television screen) from the viewers, a total of 900 different views would be called for to cover the same 90 degree multi-view angle.

[0128] These sample calculations show that a true LF multi-view system may be easier to create for use cases where the display is closer to the viewers than when the users are further away. Furthermore, FIG. 14 illustrates that three different angular ranges should be covered simultaneously by the LF display: one for covering the pupil of a single eye (e.g., EBA), one for covering the two eyes of a single user (e.g., SVA), and one for covering the multiuser case (e.g., MVA). Of these three angular ranges, the last two are usually covered in existing systems by using either several light emitting pixels under a lenticular or parallax barrier structure, or by using several projectors with a common screen. These techniques are suitable for the creation of fairly large light emission angles that can be utilized in the creation of multiple views. However, these systems lack the angular range dedicated to covering the eye pupil, and as a result they are not capable of producing the correct retinal focus cues and are susceptible to the VAC.
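For illustration, a back-of-envelope Python version of the view-count estimates above (not from the disclosure; the function name and the two-views-per-eye-box assumption follow the reasoning in the text).

```python
import math

def views_needed(viewing_distance_m, eye_box_m=0.010, mva_deg=90.0):
    """The eye-box angle (EBA) at the viewing distance sets the angular
    spacing of views (two views inside each eye-box); the multi-user
    viewing angle (MVA) sets the total angular range to fill."""
    eba_deg = math.degrees(2.0 * math.atan(0.5 * eye_box_m
                                           / viewing_distance_m))
    views_per_degree = 2.0 / eba_deg
    return round(mva_deg * views_per_degree)

# Yields ~314, ~94, and ~942 views at 1 m, 0.3 m, and 3 m - on the
# order of the ~300, ~90, and ~900 figures cited above, which are rounded.
print(views_needed(1.0), views_needed(0.3), views_needed(3.0))
```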
[0129] Functioning of currently available, flat-panel-type multiview displays is generally based on spatial multiplexing only. A row or matrix of light emitting pixels (LF sub-pixels) is placed behind a lenticular lens sheet or microlens array and each pixel is projected to a unique view direction in front of the display structure. The more pixels there are on the light emitting layer behind each lenticular feature, the more views can be generated. This leads to a direct trade-off situation between number of unique views generated and spatial resolution. If smaller LF pixel size is desired from the 3D display, the size of individual sub-pixels may be reduced or a smaller number of viewing directions can be generated. A high quality LF display should have both high spatial and angular resolutions in order to provide the user a natural view, and the current flat form-factor displays are limited in this respect.
[0130] One problem to be addressed in LF displays aimed at consumer use is how to create all the desired light emission angles and multiple views used for the complex light fields with a system that is not overly complicated and large. A complex optical system would more likely use high-cost components and accurate alignment between them, making such systems easily too expensive and difficult to handle for average consumers. A large form-factor system would require a lot of space, which is usually not easily available in home settings, making, for example, the volumetric 3D display types much less desirable for consumers than the flat panel 3D display types.
Optical Features and Resolution in Flat Form Light Field Displays.
[0131] In order to create good resolution 3D LF images at different focal surfaces with crossing beams, each beam is preferably very well collimated and has a narrow diameter. Furthermore, the beam waist should ideally be positioned at the same spot where the beams cross in order to avoid contradicting focus cues for the eye. If the beam diameter is large, the voxel formed at the beam crossing is imaged onto the eye retina as a large spot. A large divergence value means that the beam becomes wider as the distance between voxel and eye decreases, so the spatial resolution of the virtual focal surface worsens at the same time that the eye's resolution is improving due to the close distance.
[0132] In the case of an ideal lens, the achievable light beam collimation is dependent on two geometrical factors: size of the light source and focal length of the lens. Perfect collimation without any beam divergence can only be achieved in the theoretical case in which a single-color point source (PS) is located exactly at focal length distance from an ideal positive lens. However, all real-life light sources have some surface area from which the light is emitted making them extended sources (ES). As each point of the source is separately imaged by the lens, the total beam ends up formed from a group of collimated sub-beams that propagate to somewhat different directions after the lens. And as the source grows larger, the total beam divergence increases. This geometrical factor cannot be avoided with any optical means and it is the dominating feature causing beam divergence with relatively large light sources.
[0133] Another, non-geometrical, feature causing beam divergence is diffraction. The term refers to various phenomena that occur when a wave (of light) encounters an obstacle or a slit. It can be conceptualized as the bending of light around the corners of an aperture into the region of geometrical shadow. Diffraction effects can be found from all imaging systems and they cannot be removed even with a perfect lens design that is able to balance out all optical aberrations. In fact, a lens that is able to reach the highest optical quality is often called "diffraction limited" as most of the blurring remaining in the image comes from diffraction. The angular resolution achievable with a diffraction limited lens can be calculated from the formula sin Θ = 1.22 * λ / D, where λ is the wavelength of light and D the diameter of the entrance pupil of the lens. It can be seen from the equation that the color of light and lens aperture size have an influence on the amount of diffraction, where beam divergence is increased when the lens aperture size is reduced. This effect can actually be formulated into a general rule in imaging optics design: if the design is diffraction limited, the only way to improve resolution is to make the aperture larger. Diffraction is the dominating feature causing beam divergence with relatively small light sources.
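For illustration, the diffraction-limit formula from the paragraph above expressed as a short Python helper (the formula is as stated in the text; the function name and example values are hypothetical).

```python
import math

def diffraction_limited_angle_rad(wavelength_m, aperture_m):
    """Angular resolution of a diffraction-limited lens,
    sin(theta) = 1.22 * lambda / D."""
    return math.asin(min(1.0, 1.22 * wavelength_m / aperture_m))

# Green light (550 nm) through 1.0 mm and 0.5 mm apertures: halving
# the aperture roughly doubles the diffraction blur (~0.04 vs ~0.08 deg).
for d in (1.0e-3, 0.5e-3):
    print(d, math.degrees(diffraction_limited_angle_rad(550e-9, d)))
```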
[0134] The size of an extended source has a big effect on the achievable beam divergence. The source geometry or spatial distribution is actually mapped to the angular distribution of the beam and this can be seen in the resulting "far field pattern" of the source-lens system. In practice this means that if the collimating lens is positioned at the focal distance from the source, the source is actually imaged to a relatively large distance from the lens and the size of the image can be determined from the system "magnification ratio". In the case of a simple imaging lens, this ratio can be calculated by dividing the distance between lens and image with the distance between source and lens. If the distance between source and lens is fixed, different image distances can be achieved by changing the optical power of the lens with the lens curvature. But when the image distance becomes larger and larger in comparison to the lens focal length, the desired changes in lens optical power become smaller and smaller, approaching the situation where the lens is effectively collimating the emitted light into a beam that has the spatial distribution of the source mapped into the angular distribution and source image is formed without focusing.
[0135] In flat form factor goggleless LF displays, the LF pixel projection lenses may have very small focal lengths in order to achieve the flat structure and the beams from a single LF pixel are projected to a relatively large viewing distance. This means that the sources are effectively imaged with high
magnification when the beams of light propagate to the viewer. For example, if the source size is 50 μm x 50 μm, the projection lens focal length is 1 mm, and the viewing distance is 1 m, the resulting magnification ratio is 1000:1 and the geometric image of the source will be 50 mm x 50 mm in size. This means that the single light emitter can be seen only with one eye inside this 50 mm diameter eyebox. If the source has a diameter of 100 μm, the resulting image would be 100 mm wide, and the same pixel could be visible to both eyes simultaneously, as the average distance between eye pupils is only 64 mm. In the latter case the stereoscopic 3D image would not be formed, as both eyes would see the same images. The example calculation shows how geometrical parameters like light source size, lens focal length, and viewing distance are tied to each other.
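By way of illustration (not part of the disclosure), the magnification calculation above as a small Python helper; the function name is hypothetical.

```python
def source_image_width_m(source_width_m, focal_length_m, viewing_distance_m):
    """Geometric image of the source at the viewing distance, using the
    magnification ratio = viewing distance / focal length."""
    return source_width_m * (viewing_distance_m / focal_length_m)

# 50 um source, 1 mm lens, 1 m distance -> 0.05 m (50 mm) eyebox image;
# a 100 um source -> 0.1 m (100 mm), wider than the ~64 mm interpupillary
# distance, so both eyes would see the same pixel and stereo would break.
print(source_image_width_m(50e-6, 1e-3, 1.0),
      source_image_width_m(100e-6, 1e-3, 1.0))
```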
[0136] As the beams of light are projected from the LF display pixels, divergence causes the beams to expand. This applies not only to the actual beam emitted from the display towards the viewer but also to the virtual beam that appears to be emitted behind the display, converging to the single virtual focal point close to the display surface. In the case of a multiview display this is a good thing as the divergence expands the size of the eyebox and one only has to take care that the beam size at the viewing distance does not exceed the distance between the two eyes as that would break the stereoscopic effect. However, if the intent is to create a voxel to a virtual focal surface with two or more crossing beams anywhere outside the display surface, the spatial resolution achievable with the beams will get worse as the divergence increases. Note also that if the beam size at the viewing distance is larger than the size of the eye pupil, the pupil will become the limiting aperture of the whole optical system.
[0137] Both geometric and diffraction effects work in unison in all optical systems, and they are considered in the display LF pixel design in order to achieve an advantageous voxel resolution. This is emphasized with very small light sources, as the optical system measurements become closer to the wavelength of light and diffraction effects start to dominate the performance. The following discussion details how the geometric and diffraction effects work together in cases where one and two extended sources are imaged to a fixed distance with a fixed magnification. In a first case, the used lens aperture size is relatively small, and a Geometric Image (GI) is surrounded by blur that comes from diffraction, making the Diffracted Image (DI) much larger. In a second case, two extended sources are placed side-by-side and imaged with the same small aperture lens. Even though the GIs of both sources are clearly separated, the two source images cannot be resolved because the diffracted images overlap. In practice this means that a reduction of the light source size would not improve the achievable voxel resolution, as the resulting source image size would be the same with two separate light sources as with one larger source that covers the area of both separate emitters. In order to resolve the two source images as separate pixels/voxels, the aperture size of the imaging lens should be increased. In a third case, the same focal length lens but with a larger aperture is used in imaging the extended source. Now the diffraction is reduced, and the DI is only slightly larger than the GI, which has remained the same as the magnification is fixed. In a fourth case, two sources are used with the larger aperture, and the two spots can be resolved, as the DIs are no longer overlapping, making it possible to use two different sources and improve the spatial resolution of the voxel grid.

μLEDs
[0138] One emerging display technology that may be used in some embodiments is based on the use of so-called micro-LEDs (μLEDs). These are LED chips that are manufactured with the same basic techniques and from the same materials as the standard LED chips in use today. However, μLEDs are miniaturized versions of the commonly available components and can be made as small as 1 μm to 10 μm in size. A matrix has been manufactured with a density of 2 μm x 2 μm chips assembled with a 3 μm pitch. μLEDs have been used so far as backlight components in TVs, but they are also expected to challenge OLEDs in the μ-display markets. When compared to OLEDs, μLEDs can be more stable components and can reach very high light intensities, making them useful for many applications from head mounted display systems to adaptive car headlamps (LED matrix) and TV backlights. μLEDs can also be seen as a high-potential technology for 3D displays, which use very dense matrices of individually addressable light emitters that can be switched on and off very fast.
[0139] One bare μLED chip emits a specific color with a spectral width of ~20-30 nm. A white source can be created by coating the chip with a layer of phosphor, which converts the light emitted by blue or UV LEDs into a wider white light emission spectrum. A full-color source can also be created by placing separate red, green, and blue LED chips side-by-side, as the combination of these three primary colors creates the sensation of a full color pixel when the separate color emissions are combined by the human visual system. The previously mentioned very dense matrix would allow the manufacturing of self-emitting full-color pixels that have a total width below 10 μm (3 x 3 μm pitch).
[0140] Light extraction efficiency from the semiconductor chip is one of the parameters that determine the electricity-to-light efficiency of LED structures. There are several approaches that aim to enhance the extraction efficiency and thus make it possible to build LED-based light sources that use the available electric energy as efficiently as possible, which is especially important with mobile devices that have a limited power supply. One approach, as discussed in US7994527, is based on the use of a shaped plastic optical element that is integrated directly on top of an LED chip. Due to the lower refractive index difference, integration of the plastic shape extracts more light from the chip material in comparison to a case where the chip is surrounded by air. The plastic shape also directs the light in a way that enhances light extraction from the plastic piece and makes the emission pattern more directional. Another approach, as discussed in US7518149, enhances light extraction from an LED chip by shaping the chip itself into a form that favors light emission angles that are more perpendicular to the front facet of the semiconductor chip and makes it easier for the light to escape the high refractive index material. These structures also direct the light emitted from the chip. In the latter case, the extraction efficiency was substantially greater when compared to regular LEDs, and considerably more light was emitted into an emission cone of 30° in comparison to the standard chip Lambertian distribution, where light is distributed evenly to the surrounding hemisphere.
Dynamic Wave Light Field Display with a Single Waving Layer
[0141] Systems and methods set forth herein use a flexible or rigid optical layer and a flexible or rigid light emitting layer to create a dense 3D light field display for presenting 3D image content. A propagating wave is generated in one or more of the flexible layers. As one of the layers is rigid and the other has a propagating wave, the distances between layers change locally. By synchronizing the light emittance of each light emitting element of the light emitting layer to the wave phase, amplitude and 3D image content, a virtual 3D image is formed on one or both sides of the display surface. The varying distance between layers is used for altering locally the image virtual distance and light emission angles.
[0142] One exemplary structure forms an array of tiny projectors with a microlens sheet on the top and multiple display pixels below. Each microlens and the array of sub-pixels below it form a tiny projector system that acts as one pixel in creation of the LF 3D image. The propagating wave crest and trough bring the display elements near or far from the projector lens. This range of distances may be selected to cover the whole virtual depth range desired for the LF display.
[0143] Each sub-pixel in the tiny projectors corresponds to a certain viewing angle. By combining the output from several pixels from the different pixel arrays, different views of the same 3D image content are created for different viewing directions. Exemplary embodiments add the possibility to scan virtual image distances and projected light angles as the distances between the optical layer and light emitting layer in each tiny projector change with the propagating wave. Such methods also add the possibility to provide the correct focus cues to the eyes.

[0144] Exemplary embodiments provide both the large light emission angles useful for eye convergence and the small emission angles that provide natural eye retinal focus cues. This may be accomplished by scanning the small light emission angles with the help of a propagating wave form in a stack of display layers. The structure may be built into a device that has a flat form-factor that is preferred for consumer use.
[0145] An exemplary embodiment utilizes a combination of spatial and temporal multiplexing in creation of a dense light field that can be used for displaying 3D content. A micro-optical active component can be used for high-resolution temporal scanning of light rays, enabling the creation of a true dense light field with depth information instead of having just a set of multiple views.
[0146] Exemplary embodiments use a flexible or rigid optical layer and a flexible or rigid light emitting layer to create a dense 3D light field display for presenting 3D image content. A propagating wave 1550 may be generated in one or more of the flexible layers (e.g., flexible optical layer 1515 or flexible substrate 1530). As one of the layers is rigid (e.g., rigid substrate 1505 or rigid optical layer 1535) and the other has a propagating wave 1550, the distances between layers change locally (e.g., short distance 1520 and long distance 1522 in FIG. 15A, or long distance 1537 and short distance 1539 in FIG. 15B), as shown in FIGS. 15A-15B. By synchronizing the light emittance of each light emitting element 1510 of the light emitting layer to the wave phase, amplitude, and 3D image content, a virtual 3D image is formed on one or both sides of the display surface. The varying distance between layers is used for locally altering the image virtual distance and light emission angles.
[0147] FIG. 16 is a schematic view of an example structure that has a rigid light emitting layer (rigid substrate 1605 and arrays of light emitting elements 1610) and, on top of that, a flexible optical layer 1615 with microlenses. The whole structure forms an array of tiny projectors with a microlens sheet on the top and multiple virtual display pixels 1625 below that. Each microlens and the array of sub-pixels below it form a tiny projector system that acts as one pixel in creation of the LF 3D image. The propagating wave crest (with amplitude 1622) and trough bring the display elements nearer to or farther from the projector lens. If, in a single tiny projector, the distance of the light emitting element from the optical element is near the focal distance of the optical element, a nearly collimated beam is generated. This corresponds to the case presented in FIG. 3C. An image of a light emitting element positioned at the trough section can be focused between the viewer and the display. This range of virtual distances can be designed to cover the whole virtual depth range 1630 used for the LF display. The structure can also be configured to cover multiple viewing directions (e.g., directions 1601, 1602, and 1603).
[0148] Each sub-pixel in the tiny projectors corresponds to a certain viewing angle. By combining the output from several pixels from the different pixel arrays, it is possible to create different views of the same 3D image content for the different viewing directions. Exemplary embodiments add the possibility to scan virtual image distances and projected light angles as the distances between the optical layer and light emitting layer in each tiny projector change with the propagating wave. As shown in FIG. 16, the virtual distance of each pixel changes as the microlens structure is closer to or further away from the sub-pixel matrix. In addition to switching the sub-pixels on and off according to the correct view, the pixels can be switched on the basis of image depth content, adding the possibility to also provide the correct focus cues to the eyes.
[0149] The continuous 3D object virtual distances can be presented as a narrow range of ray angles at certain spatial positions on the display. These angle-distribution and spatial-position pairs generate the dense light field in front of the flat display. Exemplary embodiments are capable of generating both the larger angles useful for eye convergence as shown in FIG. 16 and also the smaller angles useful to provide the correct retinal focus cues. The smaller angles are generated as the wavy foil changes the distance and angle (relative to the rigid substrate 1705) between the optical element 1715 and the light emitting sub-pixel (e.g., array of light emitting elements 1710). This change is illustrated schematically in FIGS. 17A-17B. In FIG. 17B, the optical element 1715 of the tiny projector (one of the array of projectors in FIG. 16) is farther away from the sub-pixel than in FIG. 17A, and the element 1715 is also slightly tilted as the wave amplitude changes the local shape of the flexible optical foil (of which the optical element 1715 is a portion). The distance change makes the light rays from one sub-pixel 1710 focus to a different depth by expanding and contracting the Source Angle (SA). The change in element 1715 tilt causes the Direction Angle (DA) to change (for example, relative to the optical axis 1750). As the propagating wave scans through the whole flexible element, each projector (e.g., microlens 1715 and associated array of light emitting elements 1710) is also able to scan through a set of source angles and direction angles. A very small wave amplitude is adequate for the generation of the small light angles used for covering the eye boxes of the viewer. In other words, by synchronizing the sub-pixel on and off durations to the image content, fine angular details of the whole light field may be generated.
[0150] Simulation raytraces were prepared for four cases where the distance between a row of pixels and lens is a) 2 x the Back Focal Length (BFL) of the lens, b) between 1 x BFL and 2 x BFL, c) closer than BFL and d) at the BFL. In each case, five light emitting pixels were simulated. When the distance between emitter and lens was two times the BFL, simulated rays focused near the display surface and when the distance was between BFL and 2 x BFL, the simulated rays focused between the observer and display. When the lens distance was less than BFL, the simulated rays diverged and the virtual focus point was behind the display surface. When the lens distance from the pixels was exactly the same as BFL, the simulated rays were close to parallel and the focus was at infinity. These exemplary raytrace simulations demonstrated that by varying the lens distance from the source pixels it is possible to position the visual focal point of a pixel or pixel group both in front of and behind the display surface, as well as to infinity.
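The four simulated cases follow directly from the thin-lens relation. The following minimal sketch is illustrative only (it is not the original simulation code): the BFL is idealized as equal to the focal length f of a thin lens, and all numeric values are arbitrary units:

```python
# Thin-lens check of the four raytrace cases. Sign convention: d_i > 0 is a
# real focus in front of the lens (between display and observer); d_i < 0 is
# a virtual focus behind the display surface. BFL is idealized as f.

def image_distance(d_o: float, f: float) -> float:
    """Image distance for an emitter at distance d_o from a thin lens of focal length f."""
    if d_o == f:
        return float("inf")  # collimated output, focus at infinity (case d)
    return 1.0 / (1.0 / f - 1.0 / d_o)

f = 1.0  # focal length in arbitrary units
cases = [("a) 2 x BFL", 2.0 * f),          # d_i = 2f: focus near the display surface
         ("b) BFL..2 x BFL", 1.2 * f),     # d_i = 6f: focus between observer and display
         ("c) closer than BFL", 0.7 * f),  # d_i < 0: virtual focus behind the display
         ("d) at BFL", 1.0 * f)]           # collimated, focus at infinity
for label, d_o in cases:
    print(f"{label}: d_o = {d_o:.1f} -> d_i = {image_distance(d_o, f):.2f}")
```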
[0151] Similarly, raytraces were simulated for cases where a lens remains at the same distance (the BFL of the lens) from the light emitting pixels, but was tilted by a) 25°, b) 15° and c) 0° from the optical axis. In each case, five pixels were simulated. When the lens was tilted by 25°, the direction of the nearly collimated beam emitted by the central pixel was tilted by ~1.7°. At the same time, the beam from the edge pixel in the tilt direction was diverted by only ~0.3°, whereas the opposite side edge pixel beam was diverted by as much as ~5°. When the lens was tilted by 15°, the corresponding angular shifts in beam directions were less than the values simulated with the larger lens tilt. This simulation illustrated that the more the lens was tilted, the more the projected image was tilted from the optical path, and that lens tilting may be used for scanning through a small angular image projection range. Such small tilt angles are adequate for covering the eye pupils of a nearby observer with more than one image, which may fulfill the super-multi-view condition and provide even more realistic focus cues, especially for fast moving 3D image content. The small scan angles can also be used for generation of a very dense field of multiple viewing directions that simultaneously create the stereoscopic effect for multiple viewers positioned at a larger distance. In such a case, the spatial multiplexing made with a row of sub-pixels may be enhanced with the temporal multiplexing made with the propagating wave foil that tilts the projection angles by tilting the individual lens shapes on top of the sub-pixels.
[0152] In some embodiments, dynamic wave propagation in one direction (e.g., horizontal) on the flexible sheet is generated by moving one or both sheet ends linearly in the amplitude direction of the wave, as previously shown in, and discussed in relation to, FIG. 5. In some embodiments, the optical element may be, for example, a flexible foil of microlenses or, in the case of a flexible display element, a rigid lenslet panel. As the projector array optical components are small, a small wave amplitude may be adequate for the wave generation, and in some embodiments piezo-electric actuators may be employed for wave generation. An exemplary light field wave display system may generally be as previously discussed in relation to FIG. 13. In some embodiments, as an alternative to motors, electrical conductors or electromagnets along the display width may be used to generate a dynamic propagating wave movement of the flexible sheet with electromagnetic force.
Exemplary Use Case: Smartphone 3D LF Display Device.
[0153] In an exemplary use case, a 3D LF smartphone display may be in front of a user at ~500 mm distance from the user's eyes. Generally, the interpupillary distance between a person's eyes may be around 64 mm, and the eye pupil size may be around 7 mm. A person's eye lens focal length may be around 17 mm. The user sees different virtual distances through the device surface as it creates a dense light field with the help of a propagating wave micro-optical foil. When the user sees a virtual distance that is closer to or further away from the user than the actual device distance, their eyes converge toward two different spatial areas on the display surface. The display surface emits light from these two areas to the user's eyes by activating the correct sub-pixels under the microlenses positioned on the flexible foil. This happens when the display surface sends light at the correct angle from each of those spatial positions. When looking at an infinite distance through the display, the display sends parallel collimated light to the user's eyes from two spatial areas at (or about) the interpupillary distance from each other. At all other virtual distances, the distance between the two display spatial areas is smaller. Generally, there may be two areas illuminated on the light field display, each at least about the eye pupil diameter in size.
[0154] Each separate microlens in the flexible lenticular sheet positioned on top of a high-resolution rigid OLED display may act as the objective of a small projector. Thus, there are many sub-pixels behind each lens, and together they form one 3D display pixel. The initial distance between lens and sub-pixels may be close to the objective focal length distance of each microlens. Undulating movement of the flexible microlens sheet modulates this distance. At an appropriate voxel distance, the small projector on the display surface sends light to the observer's eye. The small projector spatial position is dependent on voxel distance. If a larger area on the display is to be used for the 3D image generation, the neighboring projectors are also illuminated. Different angles are used from different spatial positions to reach the observer's eyes. The angles are calculated and pixels activated according to the 3D image content in order to create the whole high-density light field in front of the display.
| Quantity | Config. 1 | Config. 2 | Config. 3 | Config. 4 |
|---|---|---|---|---|
| Voxel distance (mm) | 300 | 1000 | 3000 | 10000 |
| One-eye convergence angle (°) | 6.17 | 1.87 | 0.62 | 0.19 |
| Eye accommodation (focal length, mm) | 16.10 | 16.72 | 16.90 | 16.97 |
| Accommodation: angles from display surface (± angle, °) | 0.50 | 0.17 | 0.08 | 0.04 |
| Convergence: min angle from display surface (°) | 5.65 | 1.69 | 0.52 | 0.12 |
| Convergence: max angle from display surface (°) | 6.68 | 2.05 | 0.72 | 0.25 |
| Convergence: ± angle (°) | 1.03 | 0.36 | 0.20 | 0.13 |
| Emission area, left: X min (mm) | -19.5 | -14.7 | -24.7 | -28.3 |
| Emission area, left: X max (mm) | -23.5 | -17.7 | -29.5 | -33.5 |
| Emission area, right: X min (mm) | 19.5 | 14.7 | 24.7 | 28.3 |
| Emission area, right: X max (mm) | 23.5 | 17.7 | 29.5 | 33.5 |

Table 1. Simulated approximate values for angular and spatial areas for an embodiment (display surface distance 500 mm; voxel on the optical axis; spatial areas are where the light is emitted for convergence, with the origin at the display center).
[0155] Table 1 shows simulated values for the angular and spatial areas where the display surface sends light in order to present the correct voxels at four different virtual distances. The eye accommodation angles are inside the convergence angles. One of the tiny projector sub-pixels is lit in order to project the light to the right convergence angle from the projector objective. The rough value of the convergence angle is set by selecting the right sub-pixel. The flexible layer may move as a dynamic wave and sweep through denser angular values around the rough convergence angle values (e.g., super resolution). The sub-pixels are lit only when the angular sweep is at the correct viewing direction. The small projector can sweep ±1° angles for a voxel that is at 300 mm distance from the two eyes. Larger sweeps may be used for a larger eye-box, which may assist in device usability.
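The geometry behind the tabulated quantities can be sketched with elementary trigonometry. The following illustrative Python fragment assumes the 500 mm viewing distance, 64 mm interpupillary distance, and 7 mm pupil quoted above; because Table 1 comes from a fuller optical simulation, this sketch reproduces its values only approximately:

```python
import math

# Geometry sketch behind the Table 1 quantities (illustrative only, not the
# original simulation): a voxel on the display's optical axis, the display
# surface at 500 mm from the eyes, and the IPD/pupil values quoted above.

display_dist = 500.0  # mm, eye-to-display distance
ipd = 64.0            # mm, interpupillary distance
pupil = 7.0           # mm, eye pupil diameter

for voxel_dist in (300.0, 1000.0, 3000.0, 10000.0):
    # One-eye convergence angle: angle at one eye between the optical axis
    # and the line of sight to the on-axis voxel.
    conv = math.degrees(math.atan2(ipd / 2.0, voxel_dist))
    # Where the left-eye line of sight crosses the display plane (origin at
    # display center). For a voxel in front of the display (the 300 mm case)
    # the sight lines cross the axis first, so the illuminated areas swap sides.
    x_cross = -(ipd / 2.0) * (1.0 - display_dist / voxel_dist)
    # Half-angle subtended by the eye pupil at the voxel distance; this sets
    # the much smaller angular range used for accommodation cues.
    acc = math.degrees(math.atan2(pupil / 2.0, voxel_dist))
    print(f"voxel {voxel_dist:6.0f} mm: convergence ~{conv:4.2f} deg, "
          f"emission area near |x| ~ {abs(x_cross):4.1f} mm, "
          f"accommodation ~ +/-{acc:4.2f} deg")
```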
[0156] The dynamic wave amplitude in the stack of optical layers may be used for generating distance variation between the light emitting elements and the optical layer. This distance variation is converted to angles by the small projector lenses if the pixel or lens is riding the wave and the other layers are flat. The pixel X-Y position is also converted to a rough set of angles in user space. If the user looks at the display through the small projector objectives, the pixel spatial position may seem to move together with the lens when the wave phase is changing. The propagating wave may create continuous angular sweeps or narrow changes in angles more accurately than a simple sub-pixel structure. The wave may propagate such that every slope angle, crest, and trough of the wave sweeps through the whole small projector width. A projector pixel is lit when the distance change sweeps angles within the defined eye-box according to the virtual distance information. A sub-pixel is lit during the wave phase change when the ray angle hits one of the eye boxes. The rays sweep a set of angles during the period the pixel is turned on. These angular sweeps are generated also from neighboring small projectors on larger source areas. The angular and spatial sweeps on the display surface should happen during a time interval that is adequate for the observer's persistence of vision.
[0157] The angles from the display surface to be used for eye convergence are larger at closer distances when compared to the zero angles for an infinite virtual distance. The user's interpupillary distance and display distance determine the appropriate convergence angles. Angles used for accommodation depend on the eye pupil diameter at the same display distance. These accommodation angles are much smaller than the convergence angles, and their range is within the convergence angles. An arc with its center on the voxel at the virtual distance can be drawn from one eye to the other. The arc normal represents the angle of light emission that is called for from the 3D LF display.
[0158] The display device may generate the correct convergence and accommodation angles in order to provide a realistic 3D experience for the user. When the display surface comprises an array of small projectors each with an objective lens and an array of sub-pixels behind it, the display device may generate virtual 3D distances by switching on and off the light emitting elements of the small projectors in synchronization to the 3D content. Thus, the 3D angles are generated on a 2D display surface in order to provide the user's eyes with virtual distance cues. Two projectors' spatial positions on the display surface create the right convergence angle for the two user eyes. The user looks at the 3D object "through" the display surface, and continuous movement or structure in 3D content can be perceived.
[0159] Generally, 3D content may be synthesized or recorded by two cameras at interpupillary distance from each other to obtain a realistic depth experience. Near and far object points on two camera images fall spatially on different positions on the camera sensors. Infinity distance images are similar in both cameras. Near objects are decentered more from each other in camera images. This separation represents the 3D virtual distance. Also, camera lens focus information can be used for obtaining the 3D virtual distance if multiple focus distances are recorded. In recent years, light field cameras (e.g., Lytro) have also emerged, which are based on a microlens array between camera lens and sensor and capable of recording multiple focus distances simultaneously. This focus data may be used for user accommodation distances in the exemplary display device.
[0160] In a handheld device, there may preferably be some tolerance for the display viewing distance, eye X-Y-movement and tilt. If the small projectors on the LF 3D display aim their light rays directly to the user's eyes, a narrow range of light field angles is adequate, but apparatus usability may be limited. A somewhat wider angle distribution for close-range virtual distances and spatial areas for far distances on the display surface may result in better user comfort. The box-shaped 3D volume, called the "eye-box" around user eye pupil, can be used for the 3D light field angle rendering. If the voxel is at display physical distance, only large angles are swept from one small projector. At further distances, the light should be emitted from two spatial areas on the display surface. At infinite virtual distance, the area on the display surface may be the same as eye box width, and collimated light is sent towards the viewer.
[0161] In some embodiments, the wave movement may be generated and flexible sheet length controlled only at the vertical edges of the display device, if the observer's eyes are at horizontal plane. The small mechanical movement may be generated, for example, with piezoelectric actuators positioned at the frame of the device, such as on the vertical edges of the display frame. The wave amplitude may also be controlled by rolling or pulling the sheet from the vertical edges.
[0162] Another embodiment of a display device is illustrated in cross section in FIGS. 18A and 18B. As illustrated in FIG. 18A, the display 1800 may include a plurality of projection cells 1802. Each projection cell 1802 includes a set of controllable light-emitting sub-pixels 1810 and a microlens 1815. Each set of light-emitting sub-pixels 1810 may be arranged in a two-dimensional pattern of sub-pixels (e.g. a 128x128 pixel array, among other possibilities), and each sub-pixel may be capable of displaying a full gamut of different colors. The projection cells 1802 may also be arranged in a two-dimensional array, for example in a square or hexagonal array. In the example of FIG. 18A, the microlenses 1815 are supported by (or are integral with) a membrane 1817 that is capable of serving as the medium of travel of a propagating wave. In its rest state (without a traveling wave), the membrane 1817 and microlenses 1815 (collectively the microlens array) may be supported at a predetermined distance from the sub-pixels (e.g., a distance of one focal length) by a plurality of resilient supports 1821. A piezoelectric actuator 1825 (or other linear or rotational actuator) and appropriate power source 1827 are provided to generate a propagating wave in the microlens array. One or more actuators 1825 may be positioned along one or more edges of the display 1800. In alternative embodiments, a plurality of actuators 1825 may be distributed throughout the display 1800. For example, each projection cell 1802 may be provided with a corresponding actuator 1825. In some embodiments, the resilient supports 1821 may include a position sensor used to determine the position of each microlens 1815 relative to the corresponding sub-pixel array 1810. In alternative embodiments, other types of position sensors may be used, and/or microlens position sensing may not be used in cases where lens position can be calculated based on the input to the driving actuator (using, for example, the appropriate traveling wave equation).
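Where lens position is calculated from the actuator input rather than sensed, a simple traveling-wave model suffices. A minimal sketch follows; the rest distance, amplitude, wavelength, and drive frequency are illustrative assumptions, not values from this disclosure:

```python
import math

# Minimal traveling-wave model of the microlens height: each lens rides the
# membrane, so its distance from the sub-pixel array is
#     d_o(x, t) = d_rest + A * sin(k*x - w*t).
# All numeric values below are illustrative assumptions.

d_rest = 0.50      # mm, rest distance set by the resilient supports (~one focal length)
amplitude = 0.05   # mm, wave amplitude (the text notes small amplitudes suffice)
wavelength = 10.0  # mm, spatial period of the traveling wave
freq_hz = 60.0     # Hz, temporal frequency of the actuator drive

k = 2.0 * math.pi / wavelength
w = 2.0 * math.pi * freq_hz

def lens_distance(x_mm: float, t_s: float) -> float:
    """Distance between the microlens at lateral position x and its sub-pixel array."""
    return d_rest + amplitude * math.sin(k * x_mm - w * t_s)

# Sample one projection cell (at x = 2.5 mm) over a wave period to see the
# range of lens-to-pixel distances swept by the wave.
for i in range(5):
    t = i / (4.0 * freq_hz)
    print(f"t = {1000.0 * t:5.1f} ms -> d_o = {lens_distance(2.5, t):.3f} mm")
```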
[0163] While FIG. 18A illustrates a display 1800 in a rest state (with no traveling wave), FIG. 18B illustrates a portion of a similar display 1800 at a frozen moment in time during passage of a traveling wave across the projection cells 1802. As illustrated in FIG. 18B, the passage of the traveling wave causes the distance between each microlens 1815 and its respective set of subpixels 1810 to change as a function of time.
[0164] The illumination of sub-pixels in each projection cell 1802 is performed according to the distance of the microlens 1815 from the sub-pixels 1810. An example of one technique of determining when to illuminate particular sub-pixels is provided with reference to FIG. 18C. The time-varying distance of a microlens 1815 from its respective sub-pixel array 1810 may be represented by d_o(t). The focal length of the microlens may be represented by f. Consider a case where it is desired to display a particular virtual voxel 1830, where the virtual voxel 1830 has a depth of z_i behind the sub-pixel array 1810 and a horizontal offset of x_i from the center of the projector cell 1802. The difference in depth between the microlens 1815 and the depth of the virtual voxel 1830 may be represented by the value d_i(t), where d_i(t) = z_i + d_o(t). Consider a sub-pixel 1811 of interest with a position at a horizontal offset x_o from the center of the projector cell 1802. To display the virtual voxel 1830, the sub-pixel 1811 at offset x_o is illuminated (with the appropriate brightness and color) if and when the following conditions, based on the lens equations for a thin lens forming a virtual image, are satisfied:

1/d_o(t) - 1/d_i(t) = 1/f

and

x_i / x_o = d_i(t) / d_o(t)

[0165] Substituting d_i(t) = z_i + d_o(t), the conditions may be expressed as:

1/d_o(t) - 1/(z_i + d_o(t)) = 1/f

and

x_i / x_o = (z_i + d_o(t)) / d_o(t)

[0166] For cases where d_o(t) (the distance between the microlens 1815 and the sub-pixel 1811) is small compared to z_i (the depth of the voxel 1830), the above conditions may be simplified to the following:

1/d_o(t) - 1/z_i = 1/f (Eq. 5)

and

x_i / x_o = z_i / d_o(t) (Eq. 6)

[0167] Combining these equations, it follows that the sub-pixel to be illuminated for a voxel with position x_i and z_i (relative to the projector cell) is the sub-pixel that is at (or that best approximates) the position x_o such that the following condition is met:

x_o = x_i * f / (z_i + f)

And this pixel is illuminated when the following condition is met:

d_o(t) = f * z_i / (z_i + f)
[0168] Note that the foregoing equations are provided for a virtual image appearing behind the microlens array from the perspective of the viewer. Display systems as described herein may also be used to display images that appear to be in front of the microlens array. The use of appropriate sign conventions will be apparent to those of skill in the art when dealing with images in front of the microlens array ("real" images). In addition, it is noted that in practical embodiments it may be desirable to illuminate a sub-pixel not only when the above conditions are precisely met but also when the conditions are approximately satisfied (e.g., so long as the magnitude of the error is within a threshold). Appropriate thresholds may be tested to balance viewing quality considerations. For example, a stricter tolerance is likely to lead to greater depth-convergence agreement and more precise voxel positioning, but it also results in a shorter amount of time during which a pixel is illuminated, potentially resulting in dimming of the display.
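A thresholded version of these conditions might be implemented as follows. This is an illustrative sketch, not the claimed method itself: the function name, tolerance value, and example lens parameters are assumptions, and the test mirrors the focus and magnification conditions reconstructed above for a virtual image:

```python
def should_illuminate(x_o: float, d_o: float, voxel_x: float, voxel_z: float,
                      f: float, tol: float = 0.02) -> bool:
    """Decide whether the sub-pixel at lateral offset x_o should be lit while the
    microlens sits at distance d_o from the sub-pixel array, so that its virtual
    image falls (approximately) on the voxel at (voxel_x, voxel_z).

    Tests the two conditions within a tolerance, mirroring the note that the
    conditions need only be approximately satisfied:
        1/d_o - 1/(z_i + d_o) = 1/f        (focus condition)
        x_i / x_o = (z_i + d_o) / d_o      (magnification condition)
    """
    d_i = voxel_z + d_o
    focus_err = abs((1.0 / d_o - 1.0 / d_i) - 1.0 / f) * f  # dimensionless error
    if focus_err > tol:
        return False
    if x_o == 0.0:
        return abs(voxel_x) < tol  # an on-axis sub-pixel images on-axis voxels
    mag_err = abs(voxel_x / x_o - d_i / d_o) / (d_i / d_o)
    return mag_err < tol

# Example: f = 0.5 mm lens, voxel 100 mm behind the array and 1 mm off-axis.
# The focus condition is met near d_o = f*z_i/(z_i + f) ~ 0.4975 mm, and the
# sub-pixel to light sits near x_o = x_i*f/(z_i + f) ~ 0.005 mm.
print(should_illuminate(x_o=0.00497, d_o=0.4975, voxel_x=1.0, voxel_z=100.0, f=0.5))
```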
[0169] FIG. 18D is an example ray-tracing diagram illustrating the generation of a virtual image when the conditions discussed in relation to FIG. 18C are met. Specifically, under these conditions, for a focal point 1813 the virtual image 1831 of the illuminated sub-pixel 1811 substantially corresponds to the position of the virtual voxel 1830. It may be noted that other projection cells 1802 in addition to the one illustrated in FIG. 18D may operate (and generally do operate) to display the same virtual voxel. However, the values of the horizontal offset x_o and/or of a vertical offset y_o will be different for different projector cells.
[0170] FIGS. 18E and 18F illustrate a situation in which the same voxel is displayed by different projector cells at different times. (It should be noted, however, that in exemplary embodiments, the turning on and off of different subpixels occurs sufficiently rapidly that, due to persistence of vision effects, the voxel may appear to be displayed using multiple projector cells simultaneously.) In FIG. 18E, the microlens array is in a position such that one of the sub-pixels 1811 of the projector cell on the right is illuminated to generate a virtual image 1831 at the desired voxel position. No subpixel 1812 is illuminated in the projector cell on the left because none of the resulting virtual images 1832 would correspond to the position of the desired voxel. In FIG. 18F, after the traveling wave has changed the position of the microlenses, the microlens in the right-side projector cell is no longer in a position to accurately reproduce the desired voxel (illumination would place the resulting virtual images 1832 out of position), so no subpixel 1812 of that projector cell is illuminated. However, the microlens in the left-side projector cell is now in a position to reproduce the desired voxel (the same voxel as in FIG. 18E), and the appropriate pixel 1811 in the left-side projector cell is illuminated.
[0171] While some of the foregoing examples use the thin-lens approximation for purposes of illustration, other embodiments operate to take into consideration other factors (e.g., lens thickness, chromatic aberration, non-constant focal length due to flexing of the microlens array, and/or the like) in determining if and when different sub-pixels are to be illuminated. In some embodiments, ray-tracing or other simulation techniques may be used to determine when and whether (and with what intensity and hue) to illuminate different sub-pixels.
[0172] For ease of illustration, some of the above examples describe a situation in which only a single voxel is displayed. Extension to display of a plurality of voxels may be implemented by illuminating each sub-pixel if and when the corresponding microlens is in a position such that any one (or more) of the voxels to be displayed would be accurately reproduced by illuminating that sub-pixel.
[0173] Exemplary embodiments described above use a traveling wave in the microlens array to periodically alter the configuration of the microlens with respect to the corresponding sub-pixels. In other embodiments, other techniques are used to alter the configuration of the microlens with respect to the sub-pixels. For example, the distance between each microlens and its corresponding set of sub-pixels may be adjusted with a piezoelectric actuator, microelectromechanical (MEMS) actuator, linear actuator, magnetic actuator, or other actuator. Such an actuator may operate on a cell-by-cell basis or on a group of projection cells.
Light Field Display with Wavy Diffractive Foil
Displays Using Undulating Diffractive Foil.
[0174] An undulating diffractive foil may be used in some embodiments as part of an optical method and basic construction for an enhanced multi-view 3D light field display. The performance of a multi-view display (such as previously discussed based on lenticular sheets and dense pixel matrices) may be enhanced by introducing a flexible diffractive foil manipulated by a propagating wave into the structure (similar to the flexible layers discussed above). As the wave propagates in the grating foil, the angle of incidence between the grating and the light emitted from a pixel changes constantly. As the angle changes, the diffraction orders also change their propagation direction slightly. This small change in propagation angle is used for additional temporal multiplexing of view directions. The propagating wave allows sweeping of spatially multiplexed view directions through small angles, and by synchronizing the wave movement to the activation of the pixel matrix a much denser LF display is created with a higher-quality 3D picture.
[0175] A grating film bends light rays that pass through it. For example, the diffractive grating orders 1 and -1 bend the light rays symmetrically to two directions. The zeroth order goes directly through the grating and may be blocked if not needed. The bending angle depends on the grating period and the light wavelength. If there is a tilted grating in the ray path, the incoming light sees an effectively tighter grating period than when the grating is not tilted. A tilted grating therefore bends rays more than a non-tilted one.
[0176] An exemplary LF display using a wavy diffractive foil includes an array of small projector cells. The light in a single projector cell is emitted from a pixelated layer, and a microlens collimates the emitted light into a set of beams that exit the lens aperture at different propagation directions. The beam directions create the stereoscopic 3D effect when unique views of the same 3D image are projected to the different directions by modulating the sub-pixels according to the image content. This projector cell functionality is similar to previously discussed approaches for flat form-factor autostereoscopic displays based on lenticular sheets. The next layer in the present projector cell structure is a grating foil that alters the propagation direction of the emitted beams by diffraction. An additional prism structure is positioned after the diffraction grating (or grating foil) in order to make another alteration to the beam propagation direction, compensating for the angular tilt made by the grating. Some embodiments may operate without the prism layer, but use of the prism layer helps keep the central view parallel to the display surface normal so that the other view directions can be positioned symmetrically around it.
[0177] In different embodiments, various rendering schemes may be used with the presented display structure and optical method. Depending on the particular rendering scheme selected, the display device may be either a multi-view display with a very dense grid of angular views or a true light field display with multiple views and focal surfaces.
[0178] Amplitudes for the propagating waveform can be kept below 1 mm even in fairly large-scale displays.
[0179] Unlike embodiments described above utilizing an undulating foil with optical components, the diffractive optical foil may have flat surfaces and even thickness, which may be beneficial features for a dynamic undulating component in the display optical stack. The flat sheet may be more robust and less susceptible to wearing. In some embodiments, the principle can be scaled by use case or designed in a product for different LF display view angles, voxel distance range, and resolution.
[0180] Embodiments using a diffraction grating may be applied to hardware constructions that are found in previously discussed 3D multi-view displays, such as utilizing lenticular sheets or other integral imaging approaches. Such construction may be beneficial for the reliability, setup, and calibration of the whole system, as very few components may be fitted together in some embodiments. Activation of the propagating wave may employ additional actuators and control electronics as well as alteration of the rendering scheme, but these may be added to the structures, electronics, and rendering functions of existing hardware.
[0181] An exemplary embodiment provides an optical method and construction of an enhanced multi-view 3D light field display. The performance of a multi-view display based on lenticular sheets and dense pixel matrices may be enhanced by introducing a flexible diffractive foil with a propagating wave into the structure. As the wave propagates in the grating foil, the angle of incidence between the grating and the light emitted from a pixel changes constantly. As the angle changes, the diffraction orders change their propagation direction slightly. This small change in propagation angle is used in exemplary embodiments for additional temporal multiplexing of view directions. The propagating wave allows sweeping of spatially multiplexed view directions through small angles and, by synchronizing the wave movement to the activation of the pixel matrix, a much denser multi-view display is created with a higher quality 3D picture.
Optical Structure of a Light Field Display with Wavy Diffractive Foil.
[0182] FIG. 19A shows the structure of a single projector cell 1900 that forms one basic unit of a whole LF display using a flexible diffractive foil (where projector cells 1900 may be separated by baffles 1907). The light is emitted from a pixelated Light Emitting Layer (LEL) 1911 (which may comprise a substrate 1905 and arrays of light emitting elements 1910), and a microlens 1915 collimates the emitted light into a set of beams that exit the lens aperture at different propagation directions. The beam directions create a stereoscopic 3D effect when unique views of the same 3D image are projected to the different directions by modulating the pixels according to the image content. If only two pixels are used, the result is a stereoscopic image for a single user standing in the middle of the Field-Of-View (FOV): the image from the right half of the LF pixels enters the left eye, and the left half pixels are visible only to the right eye. If more than two pixels are used, the result is a set of unique views spread across the FOV, and multiple users can see the stereoscopic images at different positions inside the predefined viewing zone. This effectively generates a multi-view light field for a 3D scene; each viewer has his/her own stereoscopic view of the same 3D content, and perception of a three-dimensional image is generated. As the viewer moves around the display, the image changes for each new viewing angle. This first part of the projector cell functionality is identical to the method used in current flat form-factor autostereoscopic displays based on, e.g., lenticular sheets.
[0183] The next layer in an exemplary projector cell 1900 structure is a grating foil 1920 that alters the propagation direction of the emitted beams by diffraction. The change in propagation direction is related to the grating parameters and follows the relation θ_m = arcsin(m * λ / d - sin θ_i), where θ_m is the propagation direction of the beam (in relation to the grating surface normal) after the grating at diffraction order m, λ is the light wavelength, d is the distance from the center of one grating slit to the center of the adjacent slit (the grating period), and θ_i is the angle of incidence of the beam in relation to the grating surface normal. A prism structure 1930 positioned after the diffraction grating 1920 may make another alteration to the beam propagation direction, compensating for the angular tilt made by the grating 1920. The system may also operate without the prism 1930, but the prism may assist in keeping the central view parallel to the display surface normal and in positioning the other view directions symmetrically around it.
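The view-direction sweep produced by the undulating foil can be estimated from this grating relation by treating the local foil tilt as a change in the incidence angle θ_i. A small illustrative sketch follows; the wavelength, grating period, and tilt range are assumptions, and the static offset of the order-1 beam is what the prism structure 1930 compensates:

```python
import math

# Sweep of the order-1 beam direction as the foil tilt changes the local
# incidence angle, using theta_m = arcsin(m * lambda / d - sin(theta_i)).
# Wavelength, grating period, and tilt range are illustrative assumptions.

wavelength_nm = 550.0  # green light
period_nm = 2000.0     # grating period d
m = 1                  # a blazed design favours a single order, e.g. m = 1

def diffracted_angle_deg(incidence_deg: float) -> float:
    s = m * wavelength_nm / period_nm - math.sin(math.radians(incidence_deg))
    return math.degrees(math.asin(s))

# Angles below are relative to the local (tilted) grating normal; referencing
# them back to the display normal re-adds the tilt, and the static offset of
# the order-1 direction is what the downstream prism structure compensates.
for tilt in (-5.0, -2.5, 0.0, 2.5, 5.0):
    print(f"foil tilt {tilt:+5.1f} deg -> order-1 beam at "
          f"{diffracted_angle_deg(tilt):+6.2f} deg")
```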
[0184] Functioning of an exemplary optical method. FIG. 19B illustrates the operation of an exemplary method, which utilizes a propagating wave motion that is introduced to the grating foil. Four projector cells 1952, 1954, 1956, 1958 (each as in FIG. 19A) are emphasized representing four different wave phases. Projector cells are separated by the baffles 1907 of a baffle array 1908. As the waveform propagates in the foil 1920, the different phases of the wave tilt the projection directions differently, making the views change direction slightly. The result is an angular sweep of each view direction through a small angle. The length of this sweep may be designed to be the same as the angular spacing between two adjacent views generated with different pixels. If the pixels are modulated in synchronization to the sweep angle, a dense set of different views can be generated in between the main directions, determined by the pixel positions and projector cell lens.
[0185] In FIG. 19B, the projector cells 1954 and 1958 are generating views centered to the surface normal direction of the display. This is because the grating foil wave is at the trough and crest of the wave amplitude, making the incidence angles close to the grating surface normal. As the prism structure 1932 compensates for the grating tilt, the projected views become symmetrical around the central direction. In projector cell 1952, the grating 1920 is tilted in a counter-clockwise direction, altering the view propagation more to the clockwise direction. A similar tilt effect, but with a somewhat larger angle, is introduced to the propagation directions in cell 1956, where the grating 1920 is tilted in a clockwise direction with respect to the display normal. This means that angular sweeps go back and forth between one position determined by the wave trough and crest, and two positions determined by the grating rotating to the left and right during propagation of one full waveform of the grating foil 1920 across the projector cell aperture. The wavelength and amplitude used for a specific sweep angle range can be determined from the previously mentioned grating equation.
Optical Hardware of a Light Field Display with Wavy Diffractive Foil.
[0186] FIG. 20 shows a schematic presentation of a whole display structure 2000. The light is emitted from a pixelated layer that can be, for example, a LED matrix, OLED display, or LCD display with backlight. A matrix of baffles 2008 in the form of, for example, a punctured sheet may be placed on top of the light emitting layer 2011, optically isolating the projector cells from each other. Light collimation optics 2016 is placed on top of the baffles, and may be, for example, a microlens/lenticular lens sheet or a foil with diffractive structures. Actuators 2025 for controlling the linear (and/or in some embodiments angular) motion for generating the propagating wave motion in the grating foil 2020 may be placed at the frame 2030 of the whole display 2000. Wave amplitude below 1 mm may be sufficient for generating the desired angular sweep ranges even in fairly large displays if the grating period and projector cell size are small enough. Microprisms 2032 that make the final adjustment of the view directions can be, for example, integrated in a display protective front window as an extruded groove structure or made as diffractive elements with microstructures. The wavy grating foil 2020 is positioned in between the collimating lenses 2016 and angle adjusting prisms 2032. The material of the grating film 2020 may be, for example, polyester, and may have a thickness of ~0.1 mm. Such foils are manufactured by embossing or holographic methods and are readily available in the form of large rolls. A blazed grating structure may be used in some embodiments, as it allows the use of, for example, diffraction order 1 for the beam tilting, and other diffraction orders naturally present in gratings can be attenuated. The other diffraction orders (especially the zeroth order) may cause cross-talk or lowered image contrast if not attenuated properly either by the grating design or by an additional baffle structure positioned after the grating foil 2020. Optical structures can be one-dimensional (e.g., cylindrical lenses) if only horizontal views are needed, or two-dimensional (e.g., rotationally symmetric microlenses) if views are desired in both directions. In the latter case, two orthogonal diffractive wavy foils can be used for the two-dimensional angular scan.
[0187] If linear motion is generated at both ends of the wavy diffractive foil 2020, the movements may be synchronized by considering also the sheet length, in order to avoid a standing wave. Exemplary structures may generally be the same as previously discussed in relation to FIG. 13, where continuous oscillation and correct synchronization may cause a propagating wave to travel through the display width. In some embodiments, piezo-electric actuators may be used to generate small wave amplitudes sufficient for wave generation, when projector array optical components are small. In some embodiments, instead of motors there may be electrical conductors or electromagnets along the display width that generate dynamic propagating wave movement to the wavy diffractive foil with the force based on electric and/or magnetic fields. In some embodiments, the conductors may be integrated between projector cells in the wavy grating film.
[0188] In some embodiments, the grating foil and microprism elements may be built on top of existing display structures. However, in some embodiments, given the use of additional time multiplexing, the light emitting layer preferably has faster refresh rates than those used by some existing multi-view displays. In such embodiments, a display using a matrix of currently available LED components may be an example of a light emitting structure capable of the very fast switching speeds called for by these embodiments.
Further Diffraction Embodiments.
[0189] Holographic/standard grating film. In some embodiments, readily available holographic/standard grating film is used instead of the more complex blazed grating. Unlike the blazed grating, these gratings diffract equal amounts of light into both orders 1 and -1. Light also propagates through the zeroth order to the original direction, but with reduced intensity. In this variation, diffraction orders 0 and -1 are blocked mechanically. FIG. 21 shows an exemplary structure of a single projector cell 2100. The optical paths before and after the wavy grating 2120 may be offset. Transmission of light from the LEL 2110 and collimating MLA1 (2115) through the wavy grating 2120 to order 1 is about 30% for a single color band for a standard grating design. This light may pass through a prism 2130 after the wavy grating foil 2120, as previously discussed. All the other orders are blocked (such as with the block for orders 2137) at the physical aperture 2135 placed after the grating 2120. In order to keep light transmission at reasonable levels, relay optics are added, which focus the different diffraction orders to the aperture 2135 and then re-collimate the beams from diffraction order 1 that are allowed through the structure. The focal length difference between relay optics lens elements MLA2 (2132) and MLA3 (2145) enables scaling of the LF display viewing angle and voxel depth range. Crosstalk baffles 2140 may separate projector cells from one another.
[0190] Straight projector cell design based on PGP modification. FIG. 22 shows the structure of a single projector cell 2200 that uses a Prism-Grating-Prism (PGP) structure. One example of a PGP structure is described in WO1993021548A1. The projector cell 2200 of FIG. 22 is similar to that shown in FIG. 21, but in FIG. 22 there is no shift in the projector cell 2200 optical axis, and the whole design is straight. Crosstalk baffles 2240 still separate projector cells, which each have a LEL 2210, and optical lens elements MLA1 (2215), MLA2 (2232), and MLA3 (2245) are utilized. In the path of the collimated beam, there are two wedge prisms 2230, and the wavy grating 2220 is between them. This arrangement enables a straight symmetric design, and the different diffraction orders coming from a holographic/standard grating can be blocked with a simpler aperture 2235 that is positioned only on one side of the space between relay lenses 2232 and 2245. The arrangement may also allow looser tolerances for the blocking aperture alignment.

[0191] Vision correction and HUD display with LF display. In some embodiments, an active mask (e.g., an LCD display) may be incorporated in the structure with holographic/regular gratings, and the additional diffraction orders utilized in LF image generation. If grating order 0 is passed, the viewer may see the LF display maximum viewing distance through the screen. If order 1 is passed, the LF display can show the display screen at a user-controlled distance, which may be independent of the device distance. In some cases, the LF display may be configured to render the furthest voxel layer distance to provide presbyopia correction, such as for an older viewer. For example, a mobile phone with a LF display on a car dashboard may work as a HUD display via windscreen reflection, and a driver may not have to focus their eyes to a near distance when looking at the display.
[0192] LF display as camera: active tracking of the observer eye direction. Different viewer locations can be detected, for example, by an active near-infrared (NIR) camera system detecting the viewer direction from a LF display device. NIR wavelength light (e.g., 840 nm) is reflected from the human eye retina, and viewer locations can be detected on the basis of this reflection. For example, if every second cell in the LF display is used as a low-resolution camera instead of a light emitting pixel, the display structure itself may track viewer locations and where on the display the viewer is converging or even focusing their eyes. This may substantially reduce the amount of data processing needed for a true LF display, as only the actual viewer eye locations and focal surfaces are then considered in 3D image rendering.
[0193] Flat and solid grating blocks on standing wave. If the wavy grating has noticeable curvature across the projector cell aperture diameter, it may degrade the collimation quality and furthest showable voxel distance. The wavy grating wavelength is preferably long enough to reduce this effect. An alternative approach may be to dispose flat rigid grating pieces on a flexible sheet. Between grating pieces, the sheet material may be resilient so as to behave like a spring when the sheet is oscillated. In such an embodiment, there may be a standing waveform, where the grating pieces tilt on wave nodes.
[0194] In alternative embodiments, the undulating or wavy diffraction grating may be replaced with an alternative optical element, such as a stretchable soft grating, an ultrasonic grating, a moving pixel layer, or shaking of one lenslet layer.
[0195] In one embodiment, there is a method comprising: providing a light-emitting layer having a plurality of sub-pixels; providing a microlens array over the light-emitting layer, the microlens array comprising a plurality of lenses, each lens corresponding to a subset of the sub-pixels; generating a traveling wave in at least one of the light-emitting layer and the microlens array to generate an oscillation in the distance between each microlens and the light-emitting layer; and illuminating selected sub-pixels in synchrony with the distance between each microlens and the light-emitting layer to generate a 3D image. The method may include wherein the traveling wave is generated in the microlens array, or generated in the light-emitting layer. The method may further comprise selecting a voxel location, wherein illuminating selected sub-pixels in synchrony with the distance comprises: selecting a plurality of sub-pixels such that, for at least one corresponding microlens distance, illumination of a selected sub-pixel generates an image of the selected sub-pixel substantially at the selected voxel location; and illuminating each of the selected sub-pixels at a time when the corresponding microlens is substantially at the corresponding microlens distance.
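As an illustrative and purely hypothetical sketch of how the claimed synchronization might be orchestrated in software, the loop below reuses the lens_distance and should_illuminate helpers sketched earlier; none of these names come from the disclosure, and the extension to many voxels follows the approach described above of lighting a sub-pixel whenever any voxel is reproduced:

```python
from typing import Iterable, List, Tuple

Voxel = Tuple[float, float]  # (x offset from display center [mm], depth behind array [mm])

def render_tick(t_s: float,
                cell_positions_mm: List[float],
                subpixel_offsets_mm: List[float],
                voxels: Iterable[Voxel],
                f_mm: float) -> List[Tuple[int, int]]:
    """Return (cell index, sub-pixel index) pairs to illuminate at time t_s.

    For each projection cell, estimate the current lens height from the
    traveling-wave model, then light exactly those sub-pixels whose virtual
    image currently lands on one of the voxels to be displayed.
    """
    voxel_list = list(voxels)
    lit: List[Tuple[int, int]] = []
    for ci, cell_x in enumerate(cell_positions_mm):
        d_o = lens_distance(cell_x, t_s)  # current lens-to-pixel distance in this cell
        for si, x_o in enumerate(subpixel_offsets_mm):
            # Voxel x is taken relative to this projector cell's center.
            if any(should_illuminate(x_o, d_o, vx - cell_x, vz, f_mm)
                   for (vx, vz) in voxel_list):
                lit.append((ci, si))
    return lit
```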
[0196] In one embodiment, there is a method comprising: providing a light-emitting layer having a plurality of sub-pixels; providing a collimating microlens array over the light-emitting layer, the microlens array comprising a plurality of collimating microlenses; providing a diffractive layer over the collimating microlens array; generating a traveling wave in the diffractive layer to generate an oscillation in the orientation of the diffractive layer over each of the collimating microlenses; and illuminating selected sub-pixels in synchrony with the orientation of the diffractive layer over each of the collimating microlenses to generate a 3D image. The method may further comprise blocking a zeroth-order transmission emitted from the diffractive layer. The method may include wherein the 3D image is generated using first-order emissions from the diffractive layer.
[0197] In one embodiment, there is a display apparatus comprising: a light-emitting layer having a plurality of sub-pixels; a microlens array mounted over the light-emitting layer, the microlens array comprising a plurality of lenses, each lens corresponding to a subset of the sub-pixels; and at least one actuator operative to generate a traveling wave in at least one of the light-emitting layer and the microlens array to generate an oscillation in the distance between each microlens and the light-emitting layer. The display apparatus may include wherein the actuator is operative to generate the traveling wave in the microlens array, or to generate the traveling wave in the light-emitting layer. The display apparatus may further comprise control circuitry operative to illuminate selected sub-pixels in synchrony with the distance between each microlens and the light-emitting layer to generate a 3D image. The display apparatus may include wherein the control circuitry comprises a processor and a non-transitory computer storage medium storing instructions operative to perform functions comprising: selecting a plurality of sub-pixels such that, for at least one corresponding microlens distance, illumination of a selected sub-pixel generates an image of the selected sub-pixel substantially at the selected voxel location; and illuminating each of the selected sub-pixels at a time when the corresponding microlens is substantially at the corresponding microlens distance.
[0198] In one embodiment, there is a display apparatus comprising: a light-emitting layer having a plurality of sub-pixels; a collimating microlens array over the light-emitting layer, the microlens array comprising a plurality of collimating microlenses; a diffractive layer over the collimating microlens array; and at least one actuator operative to generate a traveling wave in the diffractive layer to generate an oscillation in the orientation of the diffractive layer over each of the collimating microlenses. The display apparatus may further comprise control circuitry operative to illuminate selected sub-pixels in synchrony with the orientation of the diffractive layer over each of the collimating microlenses to generate a 3D image. The display apparatus may include wherein the 3D image is generated using first-order emissions from the diffractive layer. The display apparatus may further comprise an element for blocking a zeroth-order transmission emitted from the diffractive layer. The display apparatus may include wherein the element for blocking a zeroth-order transmission is configured to transmit a first-order transmission emitted from the diffractive layer.
Light Field Display with Refractive Tilting Plates
[0199] In some embodiments, rather than flexible optical or substrate layers or a wavy diffraction foil, an exemplary structure (as illustrated in FIGS. 23A-25) may scan the small light emission angles through use of tilting refractive plates. Such systems and methods utilize a combination of spatial and temporal multiplexing in the creation of a dense light field that can be used for displaying 3D content. As in other embodiments, the properties of a more traditional autostereoscopic multiview display are extended by introducing an active optical component to the structure that can be used for high-resolution temporal scanning of light rays, enabling the creation of a true dense light field with depth information instead of having just a set of multiple views.
[0200] Tilting refractive plates may be easier to manufacture than some other active optical components. A refractive plate sheet has flat surfaces and small thickness, which may be beneficial features for any component in the display optical stack. A flat sheet is robust and less prone to wear. As the associated functionality is based on refraction and not on diffraction, there may be reduced need for specialized optical component manufacturing. Furthermore, tilting plates as described below may be placed in between the light emitting layer and the collimating lens, which may not be an available option with some other approaches, and may consequently result in more compact structures. Additionally, there may be minimal cross-talk between successive projected views, as there are no optical structures in the tilting plates that might cause light leakage from one view to another.
Optical Structure and Operation of a Light Field Display with Tilting Plates.
[0201] A display with tilting plates can use similar hardware constructions that are found for previously discussed 3D multiview displays utilizing lenticular sheets or other integral imaging approaches. Activation and modulation of the tilting plates makes use of additional actuators and control electronics as well as alteration of standard rendering schemes. In exemplary embodiments, these components are added to the structures, electronics and rendering functions of existing hardware. Different embodiments may be adapted for different LF display view angles, voxel distance range and resolution.
[0202] FIG. 23A depicts a structural overview of a single projector cell (or LF pixel) 2302 that is one basic unit of a whole LF display 2300 utilizing tilting refractive plates 2320. Light is emitted from a pixelated LEL 2310 and a microlens 2315 collimates the emitted light into a set of beams that exit a lens (cell boundary) aperture 2340 at different propagation directions. Unique views of the same 3D image are projected to the different directions by modulating the pixels according to the image content. These unique views sent via various beam directions create a stereoscopic 3D effect.
[0203] As an example, if only two pixels are used, the result is a stereoscopic image for a single user standing in the middle of a Field-Of-View (FOV): the image from the right half of the LF pixels enters the left eye, and the left half pixels are visible only to the right eye. If more than two pixels are used, the result is a set of unique views spread across the FOV, and multiple users can see the stereoscopic images at different positions within the predefined viewing zone. This is a multiview light field for a 3D scene: each viewer has their own stereoscopic view of the same 3D content, and natural perception of a three-dimensional image is provided. As a viewer moves around the display, the observed image changes at each new viewing angle. This projector cell functionality is analogous to the operation of previously discussed flat form-factor autostereoscopic displays based on, for example, lenticular sheets.
[0204] In the projector cell structure illustrated in FIG. 23A, a tilting refractive plate 2320 is placed between the LEL 2310 and the microlens 2315. When the plate 2320 is parallel to the light emitting surface, the optical path of emitted light beams is not altered, but when the plate 2320 is tilted, the optical path is bent inside the plate. Bending of the light path occurs as the light rays are refracted at the first interface between air and plate material. This angular shift is compensated when the light exits the plate from the other side and the rays are refracted again with an angular shift of the same magnitude but in the opposite direction. The plate is flat, so it has no optical power and causes only a minor effect on beam focus. However, a small lateral shift (also called parallel shift in optics) between the beam paths before and after the tilting plate is introduced, and this shift causes the beams exiting the projector cell to have slightly shifted propagation directions. From the point-of-view of the projector microlens 2315, it may appear as if the light emitting pixel 2310 position is shifting together with the tilting of the plate 2320. The amount of apparent pixel positional shift (and with it the amount of propagation-angle change introduced) is related to three parameters of the tilting plate 2320: 1) tilt angle, 2) material refractive index, and 3) thickness. The systems set forth herein may be highly tunable, based on the selection of materials with different refractive indexes and thicknesses during manufacturing. A given tilt angle produces a different apparent positional shift depending on the selected plate materials and dimensions.

[0205] FIG. 23B depicts a schematic presentation of an exemplary structure 2300 (comprising a plurality of projector cells 2302) for sweeping through beam scanning angles, in accordance with an embodiment. The structure may comprise an array of tilting flat plates 2320 and an array of microlenses 2315 that together with the light emitting layer 2310 form a full light field display. The plates 2320 are optically clear and may be reasonably light weight. They can be made from, for example, standard plastic optics materials like PMMA or polycarbonate, or glass materials like float, crown, flint, or fused silica. It is preferable that the plates keep their flat form and do not bend in use. A bending plate may degrade the beam collimation level, which would lower the quality of the generated light field. In some embodiments, the plate thickness is at least as great as the plate diameter divided by 6. As a real-world example, the use of a glass plate with a diameter of 250 μm would lead to a minimum thickness of about 42 μm. However, the tolerances for a refractive plate are less demanding when compared to a mirror, and somewhat thinner plates could be utilized without sacrificing too much optical quality. A rotating plate array structure may be made by connecting the plates together at their edges (such as at connections 2322). The connections may comprise, for example, a soft material like silicone, rubber, or nylon. The connections 2322 between the plates 2320 allow modulation of the plate array as a single sheet. The rotating/tilting motion of each plate 2320 can be introduced to all plates at the same time by introducing linear motion with the appropriate synchronization to the edges of the array or to just a few contact points along the sheet. The connecting structures between plates act like springs, and they may be hidden in the space between two projector cells.
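The dependence of the apparent pixel shift on these three parameters can be illustrated with the standard plane-parallel-plate displacement formula. The following is a sketch under assumed values (the refractive index, lens focal length, and small-angle beam estimate are not from this disclosure), using the ~42 μm plate of the example above:

```python
import math

# Lateral ("parallel") shift of a ray passing through a tilted flat plate,
# using the standard plane-parallel-plate relation. The resulting apparent
# pixel shift is converted to a beam-direction change by the collimating
# microlens (small-angle estimate). Parameter values are illustrative.

def plate_lateral_shift_mm(tilt_deg: float, thickness_mm: float, n: float) -> float:
    th = math.radians(tilt_deg)
    return thickness_mm * math.sin(th) * (
        1.0 - math.cos(th) / math.sqrt(n * n - math.sin(th) ** 2))

thickness = 0.042  # mm (the ~42 um glass plate of the example above)
n_glass = 1.52     # refractive index assumption (typical crown glass)
f_lens = 0.5       # mm, collimating microlens focal length (assumption)

for tilt in (0.0, 5.0, 10.0, 15.0):
    s = plate_lateral_shift_mm(tilt, thickness, n_glass)
    beam = math.degrees(math.atan2(s, f_lens))
    print(f"plate tilt {tilt:4.1f} deg -> lateral shift {1000.0 * s:5.2f} um, "
          f"beam sweep ~{beam:4.2f} deg")
```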
[0206] In FIG. 23B, two projector cells 2304 and 2306 are emphasized, representing two extreme plate tilt angles. As the plate inside projector cell 2304 tilts in the clockwise direction, the beams exiting projector cell 2304 sweep in the counter-clockwise direction. Similarly, when the plate in cell 2306 rotates in the counter-clockwise direction, the projected beams sweep in the clockwise direction. As the plates are tilted back-and-forth between the two extreme angles, a small angular range, symmetric with respect to the display normal, is scanned with the exiting beams. With suitable selection of tilt plate material, thickness, and tilt angle, the angular sweep range can be designed to complement the angular spacing between two adjacent views generated with the different pixels. As the pixels are modulated in synchronization with the plate tilting, a dense set of different views is generated in between the broader directions determined by the pixel positions and projector cell lens.
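As a rough numeric illustration of how the plate parameters combine into a sweep angle, the sketch below applies the standard parallel-plate displacement relation and converts the resulting apparent pixel shift into a beam-direction change through the projector microlens. All values (plate thickness, refractive index, tilt, and the 1 mm focal length) are illustrative assumptions, not parameters taken from the embodiments.

```python
import math

def plate_lateral_shift(thickness_um, n, tilt_deg):
    """Lateral (parallel) shift of a ray passing through a tilted flat plate.

    Standard flat-plate displacement: the ray refracts at the first air/plate
    interface, crosses the thickness, and exits parallel to its original
    direction but laterally offset.
    """
    ti = math.radians(tilt_deg)
    tr = math.asin(math.sin(ti) / n)  # refraction angle inside the plate (Snell)
    return thickness_um * math.sin(ti - tr) / math.cos(tr)

# Illustrative parameters: 42 um plate, n = 1.49 (PMMA-like), 10 degree tilt,
# 1 mm focal-length projector microlens.
shift_um = plate_lateral_shift(42.0, 1.49, 10.0)
sweep_deg = math.degrees(math.atan(shift_um / 1000.0))  # shift seen through the lens
print(f"apparent pixel shift {shift_um:.2f} um -> beam sweep {sweep_deg:.3f} deg")
```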
[0207] FIG. 24A depicts an overview of various standing wave states of an exemplary tilting plate array 2425, in accordance with an embodiment. The array 2425 comprises tilting plates 2420 which are connected to each other with a flexible material 2422, and the whole array 2425 can be treated as one sheet. As the refractive plates 2420 rotate back-and-forth between two maximum tilt angles, the total plate array 2425 forms a dynamic shape that may resemble a standing wave, as shown in FIG. 24B. The connection structures 2422 between plates 2420 function as the antinodes of the standing wave. The nodes of the wave are positioned at the centers of the transparent plates 2420, which can be tilted around a virtual axis without introducing a shift in the distance between the plate 2420 and LEL 2410. The optical raytrace pictures (for particular cells 2404 and 2406) show that when the standing wave is at phase 0, the plates 2420 are parallel to the LEL 2410 surface and no lateral shift is introduced to the optical paths. When the standing wave is at phase 1 (with stretched flexible material 2422), the plates 2420 are tilted and the pixels appear to be slightly shifted from their original positions. This virtual shift causes the beams exiting the projector cells (at apertures 2440 in the cell boundary 2442) to have a slight angular shift as well. This structure can sweep separate beams through small angles with the continuous movement of the standing wave.
Optical Hardware of a Light Field Display with Tilting Plates.
[0208] FIG. 25 depicts a schematic presentation of an exemplary display structure 2500 for generating 3D light fields using tilting refractive plates, in accordance with an embodiment. In FIG. 25, light is emitted from a pixelated layer 2510 that can be, for example, an LED matrix, OLED display, or LCD display with backlight. A tilting plate sheet (e.g., array of tilting plates) 2525 is on top of the pixelated LEL 2510. The display 2500 includes actuators 2550 providing the linear and/or angular motion for generating standing waves in the plate array 2525. The actuators 2550 may be secured to the frame 2570 of the display 2500. Light collimation optics are placed on top of the plate array 2525. In the embodiment illustrated in FIG. 25, the collimation optics comprise a microlens/lenticular lens sheet 2515. In other embodiments, the collimation optics 2515 can be a foil with diffractive structures. An array of apertures 2540 comprising, for example, a punctured sheet is placed on top of the microlens array 2515, optically isolating the projector cells from each other. Optical structures may be one-dimensional (e.g., cylindrical lenses) if only horizontal views are used, and may be two-dimensional (e.g., rotationally symmetric microlenses) if views are desired in both horizontal and vertical directions. In the latter case, two orthogonal scanning plate arrays may be positioned in series to facilitate two-dimensional angular scanning. Standing waves are set up along both the horizontal and vertical directions of the plate arrays, and further temporal multiplexing is used to scan the second dimension. In such embodiments, rendering schemes align timing between sub-pixel activation and the standing wave frequencies of the horizontal and vertical arrays to scan the desired projection angles.
[0209] Different methods are available for manufacturing a tilting plate array, in accordance with various embodiments. It is possible to use thick rigid plates (e.g., glass) that are joined together with elastomer materials such as silicone rubber or thermoplastic urethane. In other embodiments, there may be a continuous elastic foil on top of which an array of more rigid plates is laminated with optically transparent glue, such as, for example, polyethylene. The foil itself can also be used as a functional optical component by making a series of small grooves in the foil, such as by embossing, and the grooves may act as hinges between the more rigid parts that have the full foil thickness. In the two latter cases, the foil material may be optically transparent and ductile, as well as have sufficiently high fatigue strength so as to endure repeated bending movements. Suitable polymer materials for this purpose include, but are not limited to, polycarbonate and polyamide.
[0210] Continuous oscillation at both ends of the plate sheet and correct synchronization cause the standing waveform to appear across the array width. As the sheet array optical components are small, only very small wave amplitudes are used. As such, actuators such as piezo-electric actuators positioned along the sheet length may be used. The amplitude to be used is related to the projector cell aperture size. For example, if the aperture size is 250 μm and the desired level of tilt is ~10 degrees, the plate edges move ~±20 μm. A display device frame may have support features for the rigid and flexible display components. The support mechanism may include, for example, linear and/or angular momentum motors or moving supports at the sheet's vertical ends. Instead of or in addition to motors, there may be electrical conductors or electromagnets along the width of the display that generate dynamic wave movement in the plate sheet with a force based on electric and/or magnetic fields. The conductors may be integrated between projector cells in the plate sheet, such as by screen printing (silver paste ink) or by using etched copper wiring on foil. Graphene has mechanical and optical properties that are suitable for this kind of display. It is conductive and it can be stretched about 20% without damage, so it may be used both as hinge material between the plates and as a conductor for electrostatic actuation. Additionally, there are several different types of micro electro-mechanical systems (MEMS) that may be used for actuation of tilting plates at the small scale called for in the present systems and methods. In some embodiments, the movement of the tilting array may be generated by coupling the array with sound waves generated with speakers below or above the array.
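The ~±20 μm figure in the example above can be sanity-checked with one line of geometry. The sketch below assumes the plate pivots about its center, so each edge travels roughly half the aperture times the tangent of the tilt; the pivot location is an assumption, not stated in the source.

```python
import math

# Hedged check of the example above: a plate spanning a 250 um aperture and
# pivoting about its centre moves each edge by (aperture / 2) * tan(tilt).
aperture_um = 250.0
tilt_deg = 10.0
edge_travel_um = (aperture_um / 2.0) * math.tan(math.radians(tilt_deg))
print(f"plate edge travel ~ +/-{edge_travel_um:.0f} um")  # ~ +/-22 um
```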
[0211] The discussed methods utilize time multiplexing, and therefore the light emitting layer should have fast refresh rates. An LED matrix based on currently available components or on μLEDs is one example of a suitable light emitting structure capable of the very fast switching speeds sufficient for the present systems and methods.
[0212] In some embodiments, standing waves in the flexible sheet of plates are induced via actuators on the edges of the display device. The small mechanical movement may be generated with piezoelectric actuators positioned at the frame of the device on the edges of the display.
Tilting Plates with a Membrane.
[0213] An exemplary display device using tilting plates is illustrated in cross section in FIGS. 26A-26B.
As illustrated in FIG. 26A, the display includes a plurality of projection cells 2602. Each projection cell 2602 includes a set of controllable light-emitting sub-pixels 2610 and a tilting refractive plate 2620. The microlenses are omitted from these figures to focus attention on the disclosed plate structure. Each set of light-emitting sub-pixels 2610 may be arranged in a two-dimensional pattern of sub-pixels (e.g., a 128x128 pixel array, among other possibilities), and each sub-pixel may be capable of displaying a full gamut of different colors. The projection cells 2602 may also be arranged in a two-dimensional array, for example in a square or hexagonal array. In the example of FIG. 26A, the tilting refractive plates 2620 are supported by (or are integral with) a membrane 2630 that is capable of serving as the medium of a standing wave. In its rest state (without a standing wave), the membrane 2630 and tilting plates 2620 (collectively the tilting refractive plate array 2625) may be supported at a predetermined distance from the sub-pixels 2610 (e.g. a distance of one focal length) by a plurality of resilient supports 2635. A piezoelectric actuator 2650 (or other linear or rotational actuator) and appropriate power source 2655 are provided to generate a standing wave in the tilting refractive plate array 2625. One or more actuators 2650 may be positioned along one or more edges of the display 2600. In alternative embodiments, a plurality of actuators 2650 is distributed throughout the display 2600, including along the edges and within the perimeter. For example, each projection cell 2602 may be provided with a corresponding actuator 2650. In some embodiments, the resilient supports 2635 may include a position sensor used to determine the angle of each tilting refractive plate 2620 relative to the corresponding sub-pixel array 2610. In alternative embodiments, other types of position sensors may be used, and/or tilting refractive plate position sensing may not be used in cases where plate position can be calculated based on the input to the driving actuator (using, for example, the appropriate standing wave equation).
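Where plate angle is computed from the drive input rather than sensed, a simple standing-wave model may suffice. A minimal sketch, assuming a membrane displacement of the form y(x, t) = A·sin(kx)·sin(ωt) with plates centered on the nodes; the function name and all parameter values are illustrative, not from the embodiments.

```python
import math

def node_plate_tilt_deg(amplitude_um, wavelength_um, freq_hz, t_s):
    """Tilt of a plate centered on a standing-wave node at time t_s.

    For y(x, t) = A*sin(k*x)*sin(w*t), the slope at a node (where
    sin(k*x) = 0) is +/- A*k*sin(w*t); the plate tilt follows that slope.
    """
    k = 2.0 * math.pi / wavelength_um  # spatial angular frequency
    w = 2.0 * math.pi * freq_hz        # temporal angular frequency
    return math.degrees(math.atan(amplitude_um * k * math.sin(w * t_s)))

# Illustrative drive: 250 um cell pitch (500 um wave), 15 um amplitude, 1 kHz.
for t in (0.0, 0.00025, 0.0005):
    print(f"t = {t * 1e3:.2f} ms: tilt = {node_plate_tilt_deg(15.0, 500.0, 1000.0, t):+.1f} deg")
```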
[0214] While FIG. 26A illustrates a display in a rest state (with no standing wave), FIG. 26B illustrates a portion of a similar display 2600 at a frozen moment in time while a standing wave is active in the projection cells 2602. As illustrated in FIG. 26B, the standing wave causes the angle between each tilting refractive plate 2620 and its respective set of subpixels 2610 to change as a function of time.
Further Embodiments
[0215] In some embodiments, ray-tracing or other simulation techniques may be used to determine when and whether (and with what intensity and hue) to illuminate different sub-pixels.
[0216] For ease of illustration, some of the above examples describe a situation in which only a single voxel is displayed. Extension to display of a plurality of voxels may be implemented by illuminating each sub-pixel if and when the corresponding tilting refractive plate is in a position such that any one (or more) of the voxels to be displayed would be accurately reproduced by illuminating that sub-pixel.
[0217] Exemplary embodiments described above use a traveling wave or a standing wave in the tilting refractive plate array to periodically alter the configuration of the plates with respect to the corresponding sub-pixels. In other embodiments, other techniques are used to alter the angle of incidence at the microlens layer with respect to the sub-pixels. For example, the angle of each tilting refractive plate and its corresponding set of sub-pixels may be adjusted with a piezoelectric actuator, microelectromechanical (MEMS) actuator, linear actuator, magnetic actuator, or other actuator. Such an actuator may operate on a cell-by-cell basis or on a group of projection cells.
[0218] In different embodiments, various kinds of rendering schemes can be used together with the presented display structure and optical method. Depending on the selected rendering scheme, the realized display device can be either a multi-view display with a very dense grid of angular views or a more complex light field display with multiple views and focal surfaces.
[0219] Various embodiments employ a refractive film in place of the tilting plates. In these embodiments, a refractive film is used instead of the more complex sheet with connected refractive plates. The film may offer a simpler approach to the plate components and generation of dynamic movement, but the desired optical effect of shifting the apparent pixel location can be more difficult to achieve. The film may be comparatively thick, and the continuous curvature of a wavy homogeneous material will degrade beam collimation, limiting, for example, the possible rendered voxel distance. The film may be employed in use cases that allow very small pixel sizes and a short viewing distance.
[0220] Various embodiments employ refractive plates/film on top of a multiview display. In these embodiments, the tilting plate sheet or refractive film is placed on top of a regular multiview display structure with lenticular lenses. As the plates tilt, they cause small spatial shifts between the beams exiting the projector cell structures. These shifts can be used for enhancing the spatial resolution of such displays with temporal multiplexing. The structure and method are different from the embodiments described previously in this document, as the scanning is done in the spatial domain instead of the angular domain. However, by placing the refractive plates on top of the display structure, current slanted lenticular structures could be utilized much better, and balancing between vertical and horizontal resolutions could be done in a more flexible manner. This structure may also resolve the problem of low horizontal spatial resolution associated with current multiview displays.
[0221] Various embodiments employ a double tilted plate structure. In these embodiments, two separate tilting plates are used per cell instead of just one as previously described herein. The double plate structure is discussed in more detail below.
[0222] Various embodiments employ an array of tilting mirror elements in place of the tilting plate array. In these embodiments, the tilting elements are reflective, as in Digital Micromirror Devices (DMDs). As the mirrors tilt, the images of the light emitting elements are scanned through a spatial range. This spatial shift is transferred by the collimating lens into an angular shift in the projected view direction. In such a case, the optical path may be folded, as the light is reflected from the planar mirror surfaces instead of being refracted, making the system geometry different from the ones presented in this disclosure thus far.
[0223] In one embodiment, there is an apparatus comprising: a light emitting layer having an array of pixels, wherein each pixel comprises a set of sub-pixels; an array of tilting refractive plates, wherein (i) each refractive plate in the plate array is connected to one or more adjacent plates via a flexible joint and (ii) each set of sub-pixels is projected through a tilting refractive plate in the plate array; a microlens array, wherein each set of sub-pixels is collimated by a microlens in the microlens array; and a control circuit for rendering a 3D light field that is projected via the microlens array, wherein the control circuit synchronizes activation of sub-pixels with tilt angles of the refractive plates. The apparatus may include wherein the light emitting layer is an LED panel, or an OLED panel, or an LCD panel. The apparatus may include wherein the microlens array is a lenticular sheet. The apparatus may include wherein the flexible joint is a silicone connection, or a clear adhesive film affixed to the array of plates. The apparatus may further comprise actuators connected to the array of tilting plates, wherein the actuators are controlled by the control circuit and drive the angular motion of each tilting plate. The apparatus may include wherein the actuators are linear actuators, or are angular actuators. The apparatus may include wherein the actuators set up a standing wave in the array of tilting plates. The apparatus may include wherein multiple independent and binocular views of content are projected at multiple different viewing angles. The apparatus may include wherein light from a given pixel is small in size so as to not create a false focal surface at the light source.
[0224] In one embodiment, there is a method for producing a light field using a plurality of projector cells, each cell having (i) multi-colored light sources on a light emitting layer; (ii) a blocking partition between projector cells; and (iii) a rocking refractive plate optical element, the method comprising: modulating the multi-colored light sources and synchronizing the emitted modulated light with the rocking refractive plate optical element; and passing the emitted modulated light through a collimating microlens. The method may include wherein passing the emitted modulated light through the collimating microlens comprises projecting multiple independent and binocular views of content at different viewing angles. The method may include wherein light from a given multi-colored light source is small enough in size to not create a false focal surface at the light source.
[0225] In one embodiment, there is a light field display comprising: an array of small projector cells, each small projector cell having (i) a pixelated layer that emits sub-pixel light beams, (ii) a tilting plate that alters the propagation direction of the emitted beams, and (iii) a light collimating microlens; an array of actuators for driving an angular motion of each tilting plate; and a control means in communication with both the array of small projector cells and the array of actuators for synchronizing timing between the emitted sub-pixels and the angle of the tilting plates. The light field display may include wherein the tilting plate of each cell is connected to the tilting plate of at least one neighboring cell with a bendable connector to form a tilting plate array. The light field display may include wherein the array of actuators drives continuous oscillations at both ends of the tilting plate array to set up a standing waveform within the tilting plate array. The light field display may include wherein the control means synchronizes timing based on a rendering schema.
Light Field Display with Wavy Diffraction Foil and Spatial Light Modulator
Optical Structure of a Light Field Display with Wavy Diffraction Foil and Spatial Light Modulator.
[0226] In some embodiments, an optical method and basic construction of an enhanced multi-view 3D light field display may extend the capabilities of a multi-view display using lenticular sheets with a Spatial Light Modulator (SLM) and a backlight structure based on a flexible diffractive grating foil with a propagating wave. The number of light emitting elements on the backplane is optically multiplied in the foil layer as the light is diffracted to different grating orders, making it possible to use clusters of physical light emitting elements instead of filling the whole backlight panel with smaller light emitting components. As the waveform propagates in the grating foil, the angle of incidence between the grating and the light emitted from a pixel changes constantly. As the angle changes, the diffraction orders change their propagation direction slightly. This small change in propagation angle is used for additional temporal multiplexing of view directions. The propagating wave allows sweeping of spatially multiplexed view directions through small angles. By synchronizing the wave movement to the activation of the illumination pixels and SLM, a much denser multi-view display is created.
[0227] FIG. 27A depicts a schematic presentation of an exemplary structure of a single LF projector cell 2702 that forms one basic unit of a whole LF display backlight system, in accordance with an embodiment. In some embodiments, a projector cell 2702 comprises a single light emitting element 2705 on a light emitting layer 2710. The light is emitted from a small component placed on a pixelated Light Emitting Layer (LEL) 2710, and a microlens 2715 collimates the emitted light into a beam. The beam hits a grating foil 2720, which diffracts light into different diffraction orders, and the original beam is divided into several new beams that propagate in different directions. Propagation directions of the new beams are related to the grating parameters and follow the relation:
θm = arcsin(m · λ / d − sin θi)    (Eq. 9)

where θm is the propagation direction of the new beam (in relation to a grating surface normal vector) after the grating at diffraction order m, λ is the light wavelength, d is the distance from the center of one grating slit to the center of the adjacent slit (grating period), and θi is the angle of incidence the original beam has in relation to the grating surface normal vector. A focusing lens 2725 positioned after the grating foil 2720 re-focuses the beams into images 2727 of the original light emitting component. As the original beam is split, it generates a set of light emitting pixel images corresponding to the different diffraction orders. A transmissive diffuser foil 2730 placed at the pixel image location mixes the angular distribution of the beams but maintains the spatial distribution. The above-described elements of the projector cell 2702 form a backlight structure that generates a very dense array of small spots of light that can be individually activated. The mixed beams are modulated via a SLM 2735 (e.g., a liquid crystal array) and then passed through a microlens array 2740.
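A minimal numeric reading of Eq. 9, with an illustrative wavelength and grating period (not values from the embodiments), shows the −1, 0, and +1 order directions for a normally incident beam:

```python
import math

def order_angle_deg(m, wavelength_nm, period_nm, incidence_deg=0.0):
    """Propagation direction of diffraction order m per Eq. 9:
    theta_m = arcsin(m * lambda / d - sin(theta_i))."""
    s = m * wavelength_nm / period_nm - math.sin(math.radians(incidence_deg))
    return math.degrees(math.asin(s))  # ValueError -> the order is evanescent

# Illustrative: 550 nm light, 2 um grating period, normal incidence.
for m in (-1, 0, 1):
    print(f"order {m:+d}: {order_angle_deg(m, 550.0, 2000.0):+.2f} deg")
```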
[0228] In some embodiments, a projector cell comprises a plurality of light emitting elements on a light emitting layer. FIG. 27B depicts a schematic presentation of an exemplary structure of a single LF projector cell 2702 with multiple light emitting elements 2705, in accordance with an embodiment. When the light emitting pixels are imaged (2729) to a diffuser surface 2730 behind a SLM layer 2735, the structure may selectively transmit or block the light coming from individual LED images. A microlens or lenticular lens sheet 2740 positioned in front of the SLM 2735 projects collimated beams to different directions out of the display. A particular projection direction is based on the horizontal/vertical position of a pixel image 2729 on the diffuser foil 2730 behind the lens sheet 2740. These microlenses 2740 together with the SLM 2735 form LF display pixels. In some embodiments, different beam directions are used to create a stereoscopic 3D effect. Unique views of the same 3D image are projected to different directions by modulating the SLM pixels and light emitting elements 2705 according to the image content. In some embodiments, two image projections are used, and the result is a stereoscopic image for a single user standing in the middle of a display Field-Of-View (FOV). In such embodiments, the image from the right half of the LF pixels enters the left eye and the image from the left half of the pixels is visible only to the right eye. In cases wherein more than two pixels are used, the result is a set of unique views spread across the FOV. Multiple users are able to see the stereoscopic images at different positions inside a predefined viewing zone. This represents a multi-view light field for a 3D scene, and each viewer may have their own stereoscopic view of the same 3D content. Accordingly, natural perception of a 3D image is generated. As any viewer moves around the display, the image is changed for each new viewing angle to show a correct scene, thereby emulating natural parallax and natural depth blur.
[0229] In embodiments where the SLM 2735 has a very high resolution, diffuser foil 2730 spots corresponding to each separate light emitting pixel can be blocked individually. However, as the light emitting elements 2705 can also be individually activated, in some embodiments the system modulates a whole cluster of cell-exiting beams cotemporally with a single SLM pixel. This makes it possible to utilize lower resolution SLMs in the design of the display. In certain embodiments, the LEL pixels are modulated faster than the SLM, and the temporal synchronization between light emitting components and the SLM can be modified to take advantage of this. The SLM 2735 can be, for example, an LCD screen with polarizers, and the light emitting elements 2705 can be clusters of LEDs bonded to a backplane. Using the diffractive structure, the light emitting elements have multiple images, which means that a single element can be used for illuminating multiple LF pixels on the SLM. This may lower manufacturing costs of the light emitting layer, as fewer components are needed, and they can be bonded as clusters to the backplane instead of employing a dense matrix. The number of images generated is dependent on the particular diffraction grating design, mainly on how many grating orders are created with even illumination intensity.
[0230] FIG. 28 depicts an overview of beam angle change using a diffractive foil 2820, in accordance with an embodiment. A propagating waveform in a grating/diffractive foil 2820 can be used for additional temporal multiplexing of projection angles. The figure shows a situation in which the wave in the grating foil 2820 has propagated to a position where the foil 2820 is clearly tilted with respect to the direction of a light beam emitted from the LEL 2810. The tilt causes an additional angular spreading of grating order beams, which is followed by spatial spreading of the pixel images 2827 on the SLM 2835. The amount of additional angular spreading can be calculated from the previously noted grating equation. The 0th order beam travels through the grating 2820 unaltered, except for a small lateral shift caused by refraction at the foil and air interfaces. As all the other grating order beams spread further away from the center, the pixel images also move further away from a centerline of the diffuser foil 2830, and a microlens array 2840 projects the resulting beams to an angle directed towards the 0th order. As the propagating wave moves through the projector cell aperture, the alternating trough, crest, and slope parts of the wave are used for scanning of small angles. From the point-of-view of the observer, it appears as if the SLM pixels are filled with virtual sub-pixels that travel across the LF pixel aperture defined by the single microlenses of the microlens/lenticular sheet. This is used for the creation of an angularly denser light field as the SLM pixels and LEL elements are modulated in synchronization with the grating foil propagating wave. A monochromatic system, not needing color combination, can be implemented without the propagating wave movement. In those embodiments, the system may rely more on spatial multiplexing.
[0231] FIG. 29 depicts a schematic presentation of an exemplary internal structure of a 3D Light Field display 2900 with directed backlight using a diffractive foil 2920, in accordance with an embodiment. The depicted internal structure of a LF display highlights the functionality of a method used with the structure, which utilizes a propagating wave motion in the grating foil 2920. Four successive projector cells 2901, 2902, 2903, 2904 are pictured, representing four different wave phases. As the waveform propagates in the foil 2920, the different phases of the wave tilt the projection directions differently, making the views change direction slightly. The result is a set of angular sweeps of view direction through small angles. In one embodiment, the lengths of the sweeps equal the angular spacing between two adjacent views generated with the different pixels. If the pixels are modulated in synchronization with the sweep angle, a dense set of different views is generated in between the directions addressable by the pixel positions and LF pixel front lens alone. Similar to the structure of FIG. 28, the display 2900 includes a LEL 2910, collimating microlenses 2915, a diffractive grating foil 2920, focusing microlenses 2925, a light diffuser 2930, a SLM 2935, and collimating microlenses 2940.
[0232] In the embodiment of FIG. 29, projector cells 2902 and 2904 generate views that are symmetric with respect to a normal vector of the display. This is due to the fact that the grating foil wave is at the trough and crest of the wave amplitude, making the incident angles align with the grating surface normal vector. As the grating 2920 diffracts the different orders symmetrically on both sides of the 0th order, the beams emitted from the LF pixels are symmetric. In projector cell 2901, the grating 2920 is tilted in a counterclockwise direction, altering the beam directions such that they propagate at an angle towards the 0th order beam. The 0th order beams, however, are not affected by the grating tilt, as can also be seen from the previous grating equation. This means that the 0th order beams will always be symmetric with respect to the normal vector and the angular tilts occur only on, for example, the −1st and +1st order beams. The same tilt directions with the same angles are introduced to the propagation directions in cell 2903, where the grating 2920 is tilted in a clockwise direction with respect to the display normal. Angular sweeps go back and forth between (i) the positions determined by the wave trough and crest and (ii) the positions determined by the grating tilted to the left and right during propagation of one full waveform of the grating foil 2920 across a projector cell aperture. In addition to this angular sweep, the beam bundle emitted from the LF pixels alternates between two states wherein the total beam bundle divergence angle is made larger and smaller. The smaller angles occur when the wave trough or crest is used, and the larger bundle divergence occurs when the grating foil 2920 is tilted. The particular foil wavelength and amplitude used for a specific sweep angle range can be determined using the previously mentioned grating equation. The line of small arrows 2960 above the lens 2940 signifies the whole-beam angular sweeping actions that follow when the waveform is propagating in the grating foil 2920.
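The tilt-dependent part of the sweep can be estimated by feeding the local foil slope into Eq. 9 and rotating the result back into the lab frame. The sketch below does this for the steepest point of a sinusoidal wave; the sign conventions and all numeric values are simplifying assumptions for illustration only.

```python
import math

def lab_order_angle_deg(m, tilt_deg, wavelength_nm, period_nm):
    """Lab-frame direction of order m where the foil is locally tilted.

    The local tilt sets the angle of incidence on the grating; Eq. 9 gives
    the order angle relative to the foil normal, which is then rotated back
    by the tilt.
    """
    ti = math.radians(tilt_deg)
    tm = math.asin(m * wavelength_nm / period_nm - math.sin(ti))
    return math.degrees(tm + ti)

# Illustrative sinusoidal wave: 0.2 mm amplitude, 10 mm wavelength; the
# steepest local slope of y = A*sin(2*pi*x / L) gives tilt arctan(2*pi*A / L).
max_tilt_deg = math.degrees(math.atan(2.0 * math.pi * 0.2 / 10.0))
for m in (-1, 1):
    flat = lab_order_angle_deg(m, 0.0, 550.0, 2000.0)
    tilted = lab_order_angle_deg(m, max_tilt_deg, 550.0, 2000.0)
    print(f"order {m:+d}: {flat:+.2f} deg (flat) -> {tilted:+.2f} deg (max tilt)")
```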
Optical Hardware of a Light Field Display with Wavy Diffraction Foil and Spatial Light Modulator.
[0233] FIG. 30 illustrates a schematic overview of an exemplary 3D Light Field display structure 3000 with directed backlight using a diffractive foil 3020, in accordance with an embodiment. Light is emitted from a pixelated layer 3010 that has clusters of light emitting elements that can be, for example, LED matrices or printed OLEDs. An array of light collimation optics 3015 is placed on top of the light emitting pixels 3010 and may comprise, for example, a microlens/lenticular lens sheet (e.g., PMMA or
polycarbonate material) or a foil with diffractive structures. A wavy grating foil 3020 that splits the illumination beams is positioned on top of the collimation optics 3015. The device includes actuators 3022 for providing the linear (and/or angular) motion to generate the propagating wave motion (arrows 3017) in the grating foil 3020. In some embodiments, the actuators 3022 are mounted on or in the frame 3070 of the whole display 3000. In some embodiments, actuators 3022 are positioned throughout the display structure. A wave amplitude below 1 mm is adequate for generating the desired angular sweep ranges even in fairly large displays if the grating period and projector cell size are kept small. As an example, the grating film 3020 may comprise polyester and have a thickness of ~0.1 mm. Such foils are manufactured via embossing or holographic methods, and are presently available in the form of large rolls. A grating structure that distributes incident light intensity equally to the different grating orders may be used for this component, as it may allow lower complexity rendering schemes to be applied. A commonly used sinusoidal grating pattern diffracts light evenly to the −1, 0, and +1 orders, but fine-tuning of the pattern may provide a more consistent luminance output and performance. Re-focusing of the beams to a diffusing foil 3030 (e.g., to a light diffuser 3030 located behind the SLM 3035 apertures) may be performed by using a focusing microlens sheet 3025 comprising, for example, PMMA or polycarbonate material. The diffusing foil 3030 mixes up the angular distribution of light rays hitting the foil but maintains the spatial distribution of the beams. This foil 3030 may be thin (e.g., a 50 μm polycarbonate sheet), and therefore the diffusing property resulting from surface structures in the foil does not blur the spot sizes excessively. Mixing of the angular distribution opens up the possibility to use different SLMs and lenticular lenses in the latter parts of the display structure 3000, as the backlight part of the system is "optically decoupled" from the front part that finally generates the LF beams. The SLM 3035 may comprise, for example, an LCD screen with polarizers on both sides. Above the SLM 3035 is a collimating microlens/lenticular lens sheet 3040 that generates the multiple views from the sweeping virtual pixels in the diffuser 3030 behind the SLM 3035 layer.
[0234] Generally, the structure and operation of a display device using a diffractive foil and a SLM may be similar to those previously discussed, such as in relation to FIGS. 5 and 13. Optical structures in the display device can be one-dimensional (e.g., cylindrical lenses) if only horizontal views are used or two-dimensional (e.g., rotationally symmetric microlenses) if views are desired in both directions. In the latter case, two orthogonal diffractive foils may be used for the two-dimensional angular scan. Each foil may carry its own propagating wave, and the two waves may propagate in orthogonal directions. Each oscillating grating foil is driven by actuators, and the actuator mounting positions can be selected to achieve a desired propagation direction for each foil. Spatial resolution in the direction of the propagating wave is multiplexed both (i) by using several successive components (spatial multiplexing) and (ii) by sweeping the apparent positions of these pixels (temporal multiplexing), whereas the spatial separation of components alone in the orthogonal direction determines the achievable resolution. Spatial resolution achievable with the whole structure may be limited by the topmost microlens/lenticular sheet 3040 apertures. However, if this structure is used only for creation of unique horizontal views, the lenticular sheet can be slanted or the vertical array of lenses can have a small offset in the horizontal direction. In this manner, it is possible to manage trade-offs between horizontal and vertical spatial resolutions in order to balance an overall display spatial resolution.
[0235] The generation and propagation of a wave in a flexible sheet, such as the diffractive foil, may be as discussed in relation to FIG. 5. At any given point of the grating foil, the propagating wave causes a linear motion on the sheet in the direction of a display normal vector. Because of the wave, the sheet also carries an angular momentum. The propagating wave may be generated/driven using rotational and/or linear actuators coupled to a horizontal end of the sheet.
[0236] Linear (or angular) motion is generated at both ends of the wavy diffractive grating foil. The movements of the actuators which drive the foil are selected, at least in part by considering the grating sheet length, in order to avoid standing waves. With continuous oscillation and correct synchronization, the propagating wave travels across the display width (as depicted in FIG. 5). Because the projector array optical components are small, small wave amplitudes are adequate for the wave generation. Piezoelectrics are one example class of actuators that are sufficient for this task. In some embodiments, the actuators comprise piezo-electric devices. In other embodiments, the actuators may comprise motors, electromagnets, or any other system/component capable of setting up propagating waves within the grating foil. A display device frame may have supports for the rigid and flexible display components. The support mechanism may include, for example, linear and/or angular momentum motors or moving supports at the sheets' ends. Instead of or in addition to motors, electrical conductors or electromagnets along the display width may be employed to generate the dynamic propagating wave movement. In some embodiments, the conductors are integrated between projector cells in the wavy grating film.
Further Embodiments
[0237] In some embodiments, elements of embodiments of the present systems may be built as a backlight which can be integrated into existing display structures, potentially easing commercial exploitation of the disclosed systems and methods. As such, some embodiments set forth herein may comprise an enhanced backlight structure rather than a complete display structure. These embodiments may omit, for example, the SLM and final lens array, and may also omit the diffusive foil layer, frame, and various control circuitry (e.g., circuits to control and synchronize the omitted SLM). However, as the present methods are based on additional time multiplexing, the light emitting layer utilizes faster refresh rates than used by current multi-view displays. Examples of suitable light emitting structures capable of the very fast switching speeds adequate for embodiments disclosed herein may comprise an LED matrix based on traditional components or μLEDs.

[0238] In some embodiments, the display comprises a white backlight with LEDs, and the diffractive foil is used as a wide color gamut light engine. In this embodiment, a LED matrix of blue chips is coated with a uniform phosphor material layer. The phosphor layer converts the blue light into a wider white light spectrum. As the light emitting component size is small and the phosphor material layer is thin, the spatial resolution of the matrix can be maintained. The diffractive foil separates the white illumination beam into a continuous spectrum imaged to the diffuser layer. An LCD SLM blocks parts of the single LED spots, which are much wider than the original image due to color diffraction spreading, and now exhibit a full spectrum of colors. With this blocking, it is possible to generate different colors through the (LCD) display pixels and also expand the overall display color gamut as the available color space is continuous. This color generation method may be used for LF displays or even for current LCD-type 2D displays for improving the color range and accuracy.
[0239] Various embodiments comprise a directable backlight structure within a Head Mounted Display (HMD) device. The presented directed backlight structure may be used in a HMD. With the presented backlight structure based on LEDs and the diffractive foil, a device equipped with, for example, a common LCD display becomes a LF system. The structure generates several focal surfaces instead of the single focal surface usually present in current HMD devices. The optical method may be applied to all HMDs using SLM displays, including augmented reality (AR), virtual reality (VR), and mixed reality (MR) goggles. Fast switching speeds of LEDs may be combined with currently available LCD technologies, as the illuminating elements could handle most of the temporal multiplexing for the LF system. The fact that the eyes are very close to the display in a HMD device relaxes the system requirements; for example, the beams do not have to be as well collimated. The thickness of the backlight structure may be designed to fit the head mounted use. Also, the system could function in various manners without the propagating wave movement, making it simpler to implement.
[0240] In one embodiment, a grating film bends light rays. For example, the diffractive grating orders 1 and -1 bend the light rays symmetrically to two directions. The zeroth order goes directly through the grating, and it can be obscured if not needed. The bending angle depends on grating period and light wavelength. When there is a tilted grating in the ray path, the incoming light sees a tighter grating period than when the grating is not tilted. A tilted grating bends rays more than a non-tilted one.
[0241] In an embodiment, a display apparatus includes a diffraction grating layer, and a propagating wave is generated in the diffraction grating layer. The diffraction grating tilts the beam more if the grating is tilted on top of the light emitting element. When the grating is orthogonal to the light beam at the crest or trough of the wave, ray bending power is at its minimum. The changes in ray bending angles generated by a wavy grating are small compared to those generated by projector lens - pixel position pairs. These small angles may be used for changing the observer accommodation-focus distance in the case when the observer is far away from the display and the desired angles to the eye-box are very small. The wavy grating may also reduce the need for high pixel density on the display's lenticular sub-projectors by introducing a super-resolution phenomenon on large light emitting elements. This kind of flexible grating foil may also be used in between flat display layers.
[0242] In an alternative embodiment, the performance of a normal multi-view display based on lenticular lens sheets and dense pixel matrices is enhanced by introducing a flexible diffractive foil with a propagating wave into the structure. As the wave propagates in the grating foil, the angle of incidence between the grating and light emitted from a pixel changes constantly. As the angle changes, the diffraction orders also change their propagation direction slightly. This small change in propagation angle is used for additional temporal multiplexing of view directions. The propagating wave allows sweeping of spatially multiplexed view directions through small angles. By synchronizing the wave movement to the activation of the pixel matrix, a much denser LF display is created.
[0243] In another embodiment, the system comprises an array of small projector cells. The light in a single projector cell is emitted from a pixelated layer and a microlens collimates the emitted light into a set of beams that exit the lens aperture at different propagation directions. The beam directions create a stereoscopic 3D effect when unique views of the same 3D image are projected to the different directions via modulating the sub-pixels according to the image content. This first part of the projector cell functionality is not dissimilar from the methods used in flat form-factor autostereoscopic displays based on e.g. lenticular sheets. The next layer in the exemplary projector cell structure is a grating foil that alters the propagation direction of the emitted beams by diffraction.
[0244] In an embodiment, there is an apparatus comprising: a light emitting layer having an array of pixels, wherein each pixel comprises a set of sub-pixels; a diffractive grating foil layer, wherein each pixel is projected through the diffractive foil layer and the diffractive foil layer carries a propagating wave across its surface; a focusing microlens array, wherein each pixel is focused by a microlens in the focusing microlens array; a diffusive layer; a spatial light modulator; a collimating microlens array, wherein each pixel is collimated by a microlens in the collimating microlens array; and a control circuit for rendering a 3D light field that is projected via the collimating microlens array, wherein the control circuit synchronizes activation of sub-pixels with the propagating wave in the diffractive grating foil layer. The apparatus may include wherein the light emitting layer is a LED panel, or a microLED panel. The apparatus may include wherein the set of sub-pixels includes at least one red, green, and blue sub-pixel, or includes a white-light sub-pixel. The apparatus may include wherein the spatial light modulator is an LCD panel. The apparatus may include wherein the collimating microlens array is a lenticular sheet. The apparatus may include wherein the diffractive grating foil layer is made from polystyrene. The apparatus may further comprise actuators connected to the diffractive grating foil layer, wherein the actuators are controlled by the control circuit and drive the propagating wave. The apparatus may include wherein the actuators are linear actuators, or are angular actuators. The apparatus may include wherein the actuators are mounted to a display frame.
[0245] In an embodiment, there is a method comprising: activating a plurality of projector cells according to a rendering schema, each projector cell having (i) multi-colored light sources on a light emitting layer, and (ii) a focusing microlens; exciting a grating foil with a traveling wave, wherein light from each projector cell is diffracted by the grating foil; modulating the diffracted light with a Spatial Light Modulator that is synchronized with the multi-colored light sources and the traveling wave; and projecting the modulated light through a collimating microlens array. The method may further comprise a control means in communication with both the plurality of projector cells and the grating foil for synchronizing timing between the multi-colored light sources on a light emitting layer and an angle of incidence at the grating foil excited with the traveling wave, in accordance with the rendering schema. The method may include wherein projecting the emitted modulated light through the collimating microlens comprises projecting multiple independent and binocular views of content at different viewing angles. The method may include wherein light from a given multi-colored light source is small enough in size to not create a false focal surface at the light source. The method may include wherein exciting the grating foil with the traveling wave comprises using an array of actuators to drive oscillations in the grating foil. The method may include wherein the grating foil is made of polystyrene. The method may include wherein the grating foil diffracts the light from each projector cell into −1, 0, and +1 orders. The method may include wherein a number of focal surfaces created and displayed is determined by the rendering schema.
Light Field Display with Double Refractive Optical Elements
[0246] Optical Structure of a Light Field Display with Double Refractive Optical Elements. In some embodiments of optical methods and structures for multiview 3D light field displays, two arrays of elements or foils for carrying a propagating wave may be utilized, rather than a single array or foil.
[0247] In some embodiments, two arrays of tilting refractive optical components may be incorporated into the structure (such as that previously discussed in relation to tilting plate embodiments). As the components tilt, two different optical functions can be accomplished either separately or simultaneously depending on the tilting phases: 1) the apparent point of emission on the light emitting layer is slightly shifted due to bending of the optical path inside the optical elements, and the projected beam directions change slightly, and 2) the optical path is made longer or shorter as light goes through the tilting components, and the projected beams focus to different distances from the display surface.

[0248] The small change in pixel projection angle is used for additional temporal multiplexing of view directions. The tilting motion of the components allows sweeping of spatially multiplexed view directions through small angles, and by synchronizing the movement to the activation of the pixel matrix, a much denser multiview display is created with a higher quality 3D picture. The focusing function is used for creation of multiple focal surfaces, which can be used for addressing the VAC problem. As the separate beams that form voxels of the 3D image both cross at and focus on the same focal surface, the eyes are able to obtain better focal cues.
[0249] Several different kinds of rendering schemes may be used together with the disclosed display structures and optical methods. Depending on the selected rendering scheme, the particular embodiment of the display device may be either a multiview display with a very dense grid of angular views or a true light field display with multiple views and focal surfaces. In addition, the structure may function as a regular 2D display by activating all the sub-pixels inside a LF pixel simultaneously.
[0250] Exemplary methods are able to provide both the large light emission angles that are useful for eye convergence and the small emission angles that are desirable for natural eye retinal focus cues. In addition, some such methods make it possible to create multiple focal surfaces outside the display surface to address the VAC problem. Such embodiments present a way to simultaneously scan the small light emission angles and focus the voxel-forming beams with the help of tilting refractive components.
[0251] In an embodiment, a method utilizes a combination of spatial and temporal multiplexing in creation of a dense light field that can be used for displaying 3D content. The properties of a more traditional autostereoscopic multiview display are extended by introducing simple active optical components to the structure that can be used for high-resolution temporal scanning of light rays, enabling the creation of a dense light field with depth information instead of having just a set of multiple views.
[0252] Construction techniques for exemplary embodiments can be adapted from hardware constructions found in current 3D multiview displays utilizing lenticular sheets or other integral imaging approaches. Activation of the tilting (or foil) components calls for additional actuators and control electronics as well as alteration of the rendering scheme, but these can be added to the structures, electronics, and rendering functions of existing hardware. One advantage of some embodiments is that the principle can be scaled by use case, or designed into products with different LF display view angles, voxel distance ranges, and resolutions.
Operation of a Single Pixel in a Light Field Display with Double Refractive Optical Elements.
[0253] FIG. 31 shows the structure of a single projector cell or LF pixel 3102 that forms one basic unit of a LF display. The light is emitted from a pixelated LEL 3110, and a microlens 3130 collimates the emitted light into a set of beams that exit the lens aperture at different propagation directions. The beam directions create the stereoscopic 3D effect when unique views of the same 3D image are projected to the different directions by modulating the LEL sub-pixels according to the image content. If only two sub-pixels are used, the result is a stereoscopic image for a single user standing in the middle of the FOV: the image from the right half of the LF pixels enters the left eye, and the left half pixels are visible only to the right eye. If more than two sub-pixels are used, the result is a set of unique views spread across the FOV, and multiple users can see the stereoscopic images at different positions inside the predefined image zone. This effectively generates a multiview light field for a 3D scene in which each viewer has their own stereoscopic view of the same 3D content, and perception of a three-dimensional image is generated. As the viewer moves around the display, the image is changed for each new viewing angle.
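For reference, the baseline mapping from sub-pixel position to view direction in such a cell is simply the geometry of the collimating lens. A sketch with assumed values (the sub-pixel pitch and focal length are illustrative, not from the embodiments):

```python
import math

def view_direction_deg(subpixel_offset_um, focal_length_um):
    """View direction projected by a collimating microlens for a sub-pixel
    at a given lateral offset from the lens axis (paraxial geometry)."""
    return math.degrees(math.atan(subpixel_offset_um / focal_length_um))

# Illustrative: nine sub-pixels on a 30 um pitch under a 1 mm focal-length lens.
for i in range(-4, 5):
    print(f"sub-pixel {i:+d}: {view_direction_deg(i * 30.0, 1000.0):+.2f} deg")
```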
[0254] In the presented projector cell structure, two additional tilting refractive components 3120, 3125, which can be, for example, polycarbonate plates, are placed between the LEL 3110 and microlens 3130. When the plates 3120, 3125 are parallel to the light emitting surface 3110, the emitted light beam directions are not altered, but when the plates 3120, 3125 are tilted, the beam optical path is bent inside the plates 3120, 3125. Bending of the light path occurs when the light rays are refracted at the first interface between air and plate material. This angular shift is compensated when the light exits the plate from the other side and the rays are refracted again with the same angular shift but in the opposite direction. As the plates 3120, 3125 are flat, with parallel opposing faces, they have no optical power and cause only a minor shift to the beam focus. However, a small lateral shift (also called parallel shift in optics) between the beam paths before and after one tilting plate is introduced, and this shift causes the beams exiting the projector cell to have a slightly shifted propagation direction. From the point-of-view of the projector microlens 3130, it appears that the light emitting pixel position is shifting together with the tilting of the plate. If the two plates 3120, 3125 are tilted by the same amount, but in opposite directions, the two lateral shifts compensate each other and the beam directions are not altered. However, in this case the small longitudinal shift caused by the changed optical path through the tilted components 3120, 3125 changes the apparent distance of the light source 3110 from the collimating lens 3130. As the optical distance changes, the collimating lens 3130 starts to focus the beams to a different distance from the display than in the case when the plates 3120, 3125 are parallel to the LEL 3110.
[0255] The amount of apparent pixel positional shift in both lateral and longitudinal directions (and with it the amount of angular and/or focal distance change introduced) is related to three main parameters of the tilting components 3120, 3125: 1) the tilt angle difference between the two components, 2) the material refractive indices, and 3) the thicknesses. Larger tilt angle differences between the two components result in larger lateral shifts. Higher refractive indices result in elements that can introduce larger shifts with smaller tilt values, as the light is bent more at the air-material interfaces. Larger thickness values allow greater lateral and longitudinal shifts to be introduced, as the light propagates a longer distance inside the components.
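A rough feel for the focusing function can be had by propagating the apparent source position through the thin-lens equation. The sketch below uses a common paraxial approximation for the axial shift of the apparent source behind a flat plate (reducing to t·(1 − 1/n) at zero tilt) and doubles it for the two oppositely tilted plates; all names and numbers are illustrative assumptions, not parameters from the embodiments.

```python
import math

def axial_source_shift_um(thickness_um, n, tilt_deg):
    """Approximate shift of the apparent source toward the lens caused by one
    flat plate (paraxial; equals t * (1 - 1/n) at zero tilt)."""
    ti = math.radians(tilt_deg)
    tr = math.asin(math.sin(ti) / n)  # refraction angle inside the plate
    return thickness_um * (1.0 - math.cos(ti) / (n * math.cos(tr)))

def image_distance_mm(f_mm, source_mm):
    """Thin-lens equation: 1/s' = 1/f - 1/s."""
    return 1.0 / (1.0 / f_mm - 1.0 / source_mm)

# Illustrative: two 100 um plates, n = 1.59 (polycarbonate-like), a 1 mm
# focal-length microlens, and a source nominally 1.1 mm behind the plates.
f_mm, s0_mm = 1.0, 1.1
for tilt in (0.0, 10.0, 20.0):
    ds_mm = 2.0 * axial_source_shift_um(100.0, 1.59, tilt) / 1000.0  # both plates
    print(f"tilt {tilt:4.1f} deg -> beam focus at {image_distance_mm(f_mm, s0_mm - ds_mm):.1f} mm")
```

With these assumed values, the focus moves from roughly 40 mm to roughly 49 mm as the opposite tilts grow from 0 to 20 degrees, matching the qualitative behavior described for FIGS. 32A-32C.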
[0256] When the two tilting refractive elements 3120, 3125 are tilted by the same amount, but in opposite directions, they can be considered to be in opposite phases. FIGS. 32A-32C illustrate three LF pixel optical function cases. In FIG. 32A, the tilt angle 3242 is zero and the tilting elements 3220, 3225 are parallel to each other and parallel to the LEL 3210. The collimating microlens 3230 is positioned at a distance from the LEL 3210, where the focal length of the lens 3230 is almost equal to the optical distance between the lens 3230 and LEL 3210. The lens curvature is selected for the particular embodiment such that the resulting projected beam is focused (3250) to the desired minimum focal surface distance from the lens 3230. In FIG. 32B, the two refractive active elements 3220, 3225 are tilted (tilt angle 3244) by the same amount in opposite directions. In this case the optical path is changed as the light propagates a longer distance through the plates 3220, 3225. The opposite tilts balance each other, and there is no lateral shift remaining in the beam after the two elements 3220, 3225. However, there is some longitudinal shift, which causes the beam to focus (3252) to a longer distance from the collimating lens 3230. The tilting elements 3220, 3225 may cause some coma and astigmatism in the projected beams, but some of these effects are compensated by the double element structure. Tilting angles may be limited to relatively low values in order to keep the off-axis optical aberrations adequately low.
[0257] FIG. 32C depicts a case where the two plates 3220, 3225 are tilted (tilt angle 3246) more than in FIG. 32B, causing the optical path length to change more, and the focal point (3254) is now much further away from the LF pixel and the beam is almost collimated. However, as the LEL 3210 has a finite surface area, the beam has some divergence 3260 due to geometric optical factors. As shown, the distance of the focal point can be changed by changing the tilt angles of the refractive elements 3220, 3225. If the angular change is made continuously, the change in focal length becomes continuous and the number of focal surfaces can be very high. FIGS. 32A-32C also illustrate that different geometric magnification ratios are obtained at the different focal distances. This means that the projected source images have different sizes depending on the distance from the LF pixels. Voxels that are created further away from the structure are bigger than those that are located closer, meaning that the achievable spatial resolution is also a function of the focal distance. When designing such a system, the focal surface that is closest to the viewer and furthest away from the display determines the largest voxel size inside the whole image zone, and this size can be used for balancing the spatial resolution over the whole image volume.
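The magnification trade-off mentioned above reduces to a simple ratio: the projected source image (and hence the voxel) scales with image distance over source distance. Illustrative values only:

```python
# Geometric magnification at each focal surface: a source of finite size,
# imaged to distance s_img by a cell with source distance s_src, appears
# roughly source_size * (s_img / s_src) wide.
source_size_um, s_src_mm = 10.0, 1.0
for s_img_mm in (50.0, 200.0, 1000.0):
    voxel_um = source_size_um * (s_img_mm / s_src_mm)
    print(f"focal surface at {s_img_mm:6.0f} mm -> voxel ~ {voxel_um:7.0f} um")
```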
[0258] The two tilting refractive elements inside the LF pixel can also be in different phases, if the tilting is actuated with different frequencies or with a relative phase shift. This introduces a small lateral shift in the apparent position of the light emitter, causing the emitted beams to tilt away from the optical axis. FIGS. 33A-33C show three example cases. In FIG. 33A, the tilting plates 3320, 3325 are again parallel, but both of them are tilted (tilt angle 3342) in the same direction. This makes the focused beam (3350) tilt from the optical axis. In FIG. 33B, the first active element 3320 is tilted with a smaller angle than the second active element 3325, which is tilted in the same direction. The angular difference (angle 3344) between the elements 3320, 3325 changes the optical path length, making the beam nearly collimated (3352), and the tilts cause the beam to propagate at an off-axis angle. Some divergence 3360 caused by the finite source size remains in the beam. FIG. 33C shows a case where two parallel active elements 3320, 3325 are tilted in the opposite direction from FIG. 33A. Here, the beam is again tilted off-axis, but in the opposite direction, to focus at location 3354. Overall, the example cases shown in FIGS. 33A-33C illustrate that two tilting elements can be used effectively for both tilting and focusing of the beams simultaneously.
Operation of Display Optics in a Light Field Display with Double Refractive Optical Elements.
[0259] In some embodiments, a full LF display is created by building a large panel containing an array or matrix of the presented LF pixels. FIG. 34A shows the functionality and display structure of such an embodiment of a full LF display, where light is emitted from a matrix of components 3410 and an array of focusing microlenses 3430 collimates the beams. Two layers of tilting elements 3420, 3425 are placed in between these two layers and controlled for tilting motion. In the embodiment illustrated in FIG. 34A, the refractive elements 3420, 3425 comprise continuous foils or films of even thickness that oscillate with a propagating waveform, rather than the previously discussed tilting plates. They are optically clear and reasonably light weight. When foils or films are used, it is desirable for the wavelength of the waveform to be long enough (~10 × the LF pixel aperture width) in order not to introduce aberrations to the projected beams. Tilting occurs as the waves propagate through the structures, and the foils or films are bent locally at each LF pixel position to form the tilted refractive elements 3421, 3426.
[0260] In the depicted structure of FIG. 34A, the tilting elements 3420, 3425 have opposite phases. At the troughs and crests of the waves, the two foils or films 3420, 3425 are practically parallel to each other and parallel to the LEL 3410. When the foils or films 3420, 3425 are at an angle, the emitted beams become collimated with slight divergence, as shown for LF pixels 3402 and 3404. At the position of the central LF pixel 3403, the foils or films 3420, 3425 are parallel, causing the emitted beams to focus on an image surface 3450 outside the display structure. The source images are magnified to this surface 3450 with a geometric magnification ratio determined by the distance of the surface 3450 from the collimating lens array 3430. Two beams emitted from neighboring LF pixels can be overlapped on this surface 3450 as the focused emitter array images are larger than the LF pixel apertures, and single emitter beams start to cross each other. In order to create a voxel, the beams may be crossed at the same distance where they focus, allowing for unambiguous focal cues to a viewer's eyes.
[0261] As the two refractive elements 3420, 3425 shown in FIG. 34A are synchronized but with opposite phases, all the lateral shifts are compensated and only longitudinal shifts remain. This causes the beams to go back and forth between states of collimation and focusing, without additional off-axis tilting when the waves travel across the LF pixel apertures. As the foil or film waveforms make the local tilts smooth and continuous, the focal distances also change smoothly and continuously. By synchronizing this wavy movement to the light emitter activation, the system may project multiple different images in different directions simultaneously, and create the different 3D image depth layers sequentially.
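The synchronization between the propagating waveform and emitter activation can be sketched as follows; the sinusoidal waveform, wave speed, amplitude, and depth-layer names are illustrative assumptions only:

```python
import math

def local_tilt_deg(x_mm, t_s, wavelength_mm, amplitude_mm, speed_mm_s):
    # Local surface slope (deg) of a propagating sinusoidal wave
    # y = A * sin(2*pi*(x - v*t)/L) at LF pixel position x and time t.
    k = 2 * math.pi / wavelength_mm
    slope = amplitude_mm * k * math.cos(k * (x_mm - speed_mm_s * t_s))
    return math.degrees(math.atan(slope))

def frame_for_tilt(tilt_deg, depth_frames, max_tilt_deg):
    # Zero tilt -> nearest focal surface; maximum tilt -> collimated state.
    idx = round(abs(tilt_deg) / max_tilt_deg * (len(depth_frames) - 1))
    return depth_frames[min(idx, len(depth_frames) - 1)]

# Drive each LF pixel with the depth layer matching its momentary tilt:
frames = ["near", "mid", "far", "collimated"]
for t in (0.0, 0.002, 0.004):
    tilt = local_tilt_deg(10.0, t, wavelength_mm=5.0, amplitude_mm=0.14, speed_mm_s=500.0)
    print(f"t={t * 1000:.0f} ms: tilt={tilt:+.1f} deg -> '{frame_for_tilt(tilt, frames, 10.0)}'")
```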
[0262] FIG. 34B shows the same display structure of FIG. 34A, but with a phase shift in the two propagating waves of the refractive elements 3420, 3425. The phase shift creates an angular difference between the tilting elements 3420, 3425, causing the emitted beams to tilt away from the optical axis as lateral shifts are introduced. The same focusing function is present as the optical path lengths change dynamically with the waveform in the tilting elements 3420, 3425. As the waveforms propagate in the foils or films, the collimated and focused beams sweep back and forth through an angular range, which may in some embodiments be the same as the angular distance between two beams emitted from neighboring sub-pixels inside each LF pixel. In such instances, the angular density can be increased with temporal multiplexing. As both the focusing and angular sweeping functions can be created simultaneously, such systems and methods may represent versatile hardware for rendering true 3D LF images.
Optical Hardware of a Light Field Display with Double Refractive Optical Elements.
[0263] Different methods may be used for manufacturing the tilting elements. In some embodiments, the tilting elements may comprise continuous sheet structures, which may provide a simple overall construction. In some embodiments, a plastic (e.g., polycarbonate or PMMA) foil or film may be used, in which case tilting can be introduced with a propagating waveform that has a relatively long wavelength in comparison to the LF pixel aperture size. In some embodiments, the tilting elements may comprise rigid plates (e.g., glass), which are joined together into a sheet with elastomer materials, such as silicone rubber or thermoplastic urethane. In other embodiments, the tilting elements may comprise a continuous elastic foil or film, on top of which an array of more rigid plates is laminated with optically transparent glue, such as polyethylene. The foil or film itself may also be used as a functional optical component by providing a series of small grooves in the foil or film, such as by embossing, with the grooves acting as hinges between more rigid parts having the full foil or film thickness. The foils or films may comprise materials that are optically transparent and ductile, and that have good fatigue strength in order to endure repeated bending movement. Suitable polymer materials include, but are not limited to, polycarbonate and polyamide. [0264] FIG. 35 shows a schematic presentation of an embodiment of a display structure 3500 with dual tilting elements 3520, 3525. The light is emitted from a pixelated layer 3510, which may comprise for example a LED matrix, OLED display, or LCD display with backlight. The tilting elements 3520, 3525 comprise refractive flexible sheets which are disposed above the pixelated layer 3510. Actuators 3527 provide the linear (and/or angular) motion for generating propagating wave motion in the tilting elements 3520, 3525, and may be disposed at, on, or in the frame 3570 of the whole display 3500. Light collimation optics 3530 are disposed on top of the display structure, and may comprise, for example, a
microlens/lenticular lens polycarbonate sheet, or a foil or film with embossed diffractive structures. An array of apertures 3540, such as a punctured plastic sheet, is placed on top of the microlens array 3530, optically isolating the LF pixels from each other. Optical structures may be one-dimensional (e.g., cylindrical lenses) if only horizontal views are desired, or two-dimensional (e.g., rotationally symmetric microlenses) if views are desired in both directions. In the latter case, two orthogonal refractive sheets may be used for the two-dimensional angular scan and focusing.
[0265] In the embodiment shown in FIG. 35, multiple view directions for multiple focal surfaces may be generated for a plurality of users.
[0266] Continuous oscillation at both ends of the refractive sheets 3520, 3525 and appropriate synchronization cause propagating waveforms to appear throughout the display width. As the projector array optical components are small, very small wave amplitudes are sufficient, and actuators, such as piezoelectric actuators, positioned along the sheet length may be adequate to generate them. The amplitude used may be selected based on the projector cell aperture size. For example, if the aperture size is 500 μm and the desired tilt is ~10°, the wavy motion amplitude can be as small as ~±110 μm. The display device frame 3570 may have support features for the rigid and flexible display components. The support mechanism may include, for example, linear and/or angular momentum motors or moving supports at the vertical ends of the sheets. Instead of motors, there may be electrical conductors or electromagnets along the display width that generate dynamic wave movement in the sheets, foils, or films with forces based on electric and/or magnetic fields. The conductors may be integrated between projector cells in the sheet or film, such as by screen printing (e.g., using silver paste ink) or by using etched copper wiring. Graphene is also a promising material that has mechanical and optical properties suitable for display uses such as those set forth herein. It is conductive and it can be stretched about 20% without damage, so it may be used both as a hinge material between refractive components and as a conductor for electrostatic actuation. There are also several different types of Micro Electro Mechanical Systems (MEMS) that may be used for the actuation of the tilting elements, given the small scale of movement. One example is bimorph actuators, which may produce well-controlled tilting action of ±30° in an array of mirrors that have an aperture size of ~1.5 mm. Some further options for movement generation include sound waves, generated with speakers at frequencies below or above the human audible range, or memory metals that can be activated by heat or electricity.
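The quoted amplitude can be cross-checked with a simple waveform model. Assuming a sinusoidal surface wave with a wavelength of ~10× the aperture, the steepest-slope condition gives an amplitude of the same order as the ~±110 μm figure; the exact constant depends on the actual waveform shape:

```python
import math

def sine_wave_amplitude_um(max_tilt_deg: float, wavelength_mm: float) -> float:
    # Max slope of y = A*sin(2*pi*x/L) is 2*pi*A/L; solve for A
    # so that the steepest local tilt equals max_tilt_deg.
    return math.tan(math.radians(max_tilt_deg)) * wavelength_mm / (2 * math.pi) * 1000

wavelength = 10 * 0.5   # ~10x the 500 um aperture, in mm
print(f"A ~ +/-{sine_wave_amplitude_um(10, wavelength):.0f} um")
# -> ~±140 um for a pure sinusoid, the same order as the ~±110 um above.
```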
[0267] Tilting element actuation may utilize accurate actuators that are small and efficient. In exemplary embodiments, environmental factors such as temperature changes are taken into account, and dedicated calibration modules and/or routines may be employed. For example, if a propagating waveform in a flexible polycarbonate foil or film is used for the tilting elements, the amplitude and frequency of the wavy motion may be selected based on the operating temperature. With polycarbonate, a temperature change of 10°C around standard room temperature induces a ~1% change in the elastic modulus of the material. This change may be detected, and a feedback signal may be sent to the actuators in order to compensate for the slightly changed foil or film stiffness.
Light Field Image Generation Module.
[0268] Some exemplary embodiments are used in a display structure in which a LF image generation module, as described herein, is combined with a separate projection lens or lens array. The LF image generation module 3605 in this case may produce intermediate images at different focal surfaces in between some layers of the display system, and the final image may be formed by the front projection lens as shown in the example system in FIG. 36. The modular construction approach of such embodiments may make it possible to use one image generation module for different kinds of viewing scenarios by changing a front projector lens.
[0269] Some such embodiments may be employed in a head mounted display (HMD) device. In such instances, the light emitters may comprise, for example, a very dense matrix of LEDs positioned behind microlens arrays with short focal lengths. The LED sub-pixel images 3610 can then be formed at distances of a few millimeters from the focusing lens array, and a pair of injection-molded magnifying lenses 3615 may be used for projecting the images to the two eyes 3620 separately (FIG. 36). Such lenses 3615 are commonly used in current virtual reality (VR) display devices. As the images are projected to the two eyes separately, two different sections of the display may be used for producing the stereoscopic image pairs separately. This spatial division may simplify design and manufacture of the hardware and software, as there is reduced need for spatial multiplexing at the LF pixel level. Also, as the eye pupils 3625 are closer to the display, it can be easier to project more than two views inside them, which may be achieved using only a few light sources inside each projector cell. Some of the sub-pixel images may be focused (3640) away from the retina 3630 of the eye 3620, while other sub-pixel images may be focused (3650) on the retina 3630. [0270] The micro-optical components may be manufactured with the high-quality and high-volume wafer-scale manufacturing methods used for making mobile phone camera optics today. The systems and methods set forth herein also permit the addition of multiple focal surfaces, making the device more user friendly than most VR and AR devices available today.
Light Field Display Structure with Wave Modules.
[0271] In some embodiments, as noted above, sound waves are used as an effective method for generating the tilting motion in the refractive elements. FIG. 37 depicts an exemplary display structure 3700 in which flexible foils or films are used as the tilting refractive elements 3720, 3725, which are actuated with sound waves in wave modules 3758. The modules contain a sound generator 3760 (loudspeaker) on one side and a sound sensor 3762 (microphone) on the other. Air pressure differences generated with the sound generator 3760 actuate a propagating wavy motion in the foils or films 3720, 3725, and the amplitude and wavelength of the motion can be adjusted by controlling the emitted sound frequency and volume. In some embodiments, the frequency of the sound source is kept above or below the human hearing range, which is commonly given as 20 to 20,000 Hz. The sound sensor 3762 on the other side can be used for monitoring possible changes in, for example, the foil or film flexibility due to temperature changes, and/or the like. The feedback signal from the sensor 3762 can be used for adjusting the sound generator 3760 output, and thus actively calibrate the wave module(s) 3758 according to environmental changes. The two modules 3758 may be separated from each other with rigid transparent structures, such as glass sheets, making it possible to, for example, generate the wavy motion in orthogonal directions without excessive crosstalk between the modules, which would induce errors in the wavy motion and in the 3D image voxel registration.
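A minimal sketch of the microphone-to-loudspeaker calibration loop described above, assuming simple proportional control; the target amplitude, gain, and measured values are hypothetical:

```python
def adjust_drive(target_um: float, measured_um: float, drive: float, gain: float = 0.5) -> float:
    # One step of a proportional feedback loop: compare the wave amplitude
    # seen by the sound sensor against the target and nudge the loudspeaker
    # drive level to compensate (e.g., for temperature-changed foil stiffness).
    error = target_um - measured_um
    return drive * (1.0 + gain * error / target_um)

drive = 1.00
for measured in (130.0, 136.0, 139.0):   # foil stiffened; amplitude fell below the 140 um target
    drive = adjust_drive(140.0, measured, drive)
    print(f"measured {measured:.0f} um -> new relative drive {drive:.3f}")
```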
Properties of Light Field Displays with Diffractive or Refractive Elements
[0272] A particular balance between temporal and spatial multiplexing may be chosen on the basis of a particular use case in the embodiments set forth herein by selecting the light emitting components accordingly. If, for example, very small LEDs are used, spatial multiplexing may be emphasized by creating more views with a multitude of light emitting components per LF pixel, which may lead to a reduction in angular sweep ranges and an extension of the time a single component can be in the on-state per single view direction. Alternatively, if larger LED chips are used, there may be a lower number of physical light emitting elements for the same size LF pixel; the requirements for the switching speed of single components become more demanding, and the angular sweep range is extended to cover larger gaps between the spatially initiated views. The various embodiments set forth herein use a propagating wave in a diffractive or refractive element(s) to provide additional freedom for LF display device optimization, as the performance is not limited by the spatial separation between the pixels.
[0273] Diffractive Foil. In an exemplary embodiment, the convergence beams sweep over a viewer or observer's eyes in order to create the correct focus cues. The sweeping beam entering the eye should have the same direction as it would have if it were emitted from the voxel, in order to generate the correct in-eye focus for the correct voxel distance. Both the beam angular sweep and the spatial change in the beam starting point on the display surface are used simultaneously. The beam angular sweep may be realized by temporally switching LEL pixels on and off in sync with the wave propagation in the flexible diffractive grating, smoothing out the discrete directional output of the pixel grid. The beam starting point on the display surface may have the appearance of jumping temporally from one cell to another in the cell array while sweeping the beam. The orientation of the sweeping beams may follow the voxel-centric arc normal. Such an arc may ensure eye convergence, accommodation, and even multi-view angles, if the display viewing angle is wide enough for multiple observers.
[0274] A factor to be considered in the design of the display structure using a wavy diffraction grating is that gratings diffract light with different wavelengths to different angles. This means that if three colored pixels (e.g., red, green, and blue) are used, the different colored beams are tilted to somewhat different directions from the grating foil. However, a prism structure positioned after the grating film may compensate for this effect, as it also tilts the different colored beam directions differently, but may do so in the opposite direction. As the colored sub-pixels are usually spatially separated on the LEL, the collimating lens may also cause some small angular differences to the colored beam projection angles. With the correct arrangement of the different sub-pixels on the LEL, this effect can be used to further compensate for color separation caused by the grating. In addition to these optical hardware methods which can be applied for color correction, special rendering schemes can also be used for combining the different colored beams into mixed color pixels in the viewer's eye.
[0275] A propagating wave in the diffractive foil may create a situation where the projected beams are tilted when the wave passes the cell aperture. As the wave has a continuously curved shape, the tilting angle can be different throughout the projector cell aperture as some parts of the beam hit the foil at slightly different incidence angles. In order to ensure good beam collimation, the foil surface wavelength is preferably large enough that the resulting additional beam divergence does not limit the achievable voxel depth range. However, the wavelength should not be too long as the resulting tilt is used for the beam angular scan, and with higher slope values and smaller wavelength values the angular range is larger. This means that there is a trade-off situation between beam collimation level and angular sweep length, and the furthest achievable voxel distance is balanced with the desired angular scan range connected to pixel density.
[0276] Refractive Tilting Plates. In embodiments herein using refractive plates or foils, the optical materials refract light with different wavelengths to different angles (color dispersion). This means that if three colored pixels (e.g. red, green and blue) are activated, the different colored beams are tilted to somewhat different directions from the tilting plates. As the colored sub-pixels are usually spatially separated on the LEL, the collimating lens will also cause some small angular differences to the colored beam projection angles. In one embodiment, a rendering scheme combines the different colored beams into mixed color pixels in the eye by activating the differently colored pixels with a slight delay from each other so that the sweeping motion of the tilting plate negates the angular differences.
[0277] In another embodiment, the collimated beams sweep over the observer's eyes in order to create the correct focus cues. The sweeping beam entering the eye has the same direction it would have if it were emitted from a voxel. Both the beam angular sweep and the spatial change in the beam starting point on the display surface are controlled simultaneously. The beam angular sweep is controlled by temporally switching LEL pixels on and off in sync with the tilting plates, which smooths out the discrete directional output of the pixel grid. The beam starting point on the display surface jumps temporally from one cell to another in the cell array while sweeping the beam. The sweeping beams' orientation follows a voxel-centric arc normal. The arc allows for eye convergence, accommodation and also multiview angles if the display viewing angle is wide enough for multiple observers.
[0278] Generally. The depth range achievable with the LF display may be connected to the quality of beam collimation coming from each sub-pixel. Collimation is achieved when the lens focal length is equal to the distance between the lens and the light emitter. The size of the light emitting pixel, the diameter of the lens aperture, and the lens focal length are three parameters that determine collimation quality. The lens focal length is preferably much larger than the lens aperture size (large F#) in order to achieve good collimation. However, if the focal length is very long, diffraction will start to limit the collimation quality. In some embodiments, in order to reach an adequate Airy disc diameter caused by diffraction, the F# of the lens should equal the pixel size in microns. Eye resolution is about 1 arc minute, corresponding to about ±0.0083° collimation quality. This can be used as the upper boundary when designing the light emitting layer pixel size and lens focal length.
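Both contributions can be estimated numerically. The sketch below assumes a thin lens, takes the geometric divergence from the emitter half-width, and uses the Airy-disc angular radius 1.22λ/D for the diffraction term; the F#=2 optics and green wavelength are illustrative:

```python
import math

def geometric_half_angle_deg(pixel_um: float, focal_mm: float) -> float:
    # Divergence half angle from the finite emitter size: atan((p/2) / f).
    return math.degrees(math.atan(pixel_um / 2000.0 / focal_mm))

def airy_half_angle_deg(wavelength_um: float, aperture_mm: float) -> float:
    # Diffraction term: angular radius of the Airy disc, 1.22 * lambda / D.
    return math.degrees(1.22 * wavelength_um / (aperture_mm * 1000.0))

aperture = 0.5            # mm (illustrative)
f = 2 * aperture          # mm, i.e. F# = 2
for pixel in (2, 3, 5, 10):   # um
    print(f"{pixel} um pixel: geometric +/-{geometric_half_angle_deg(pixel, f):.3f} deg, "
          f"diffraction +/-{airy_half_angle_deg(0.55, aperture):.3f} deg")
# For F# = 2 the two terms cross near p = 4.88 * lambda ~ 2.7 um (independent
# of the aperture size), consistent with the 2-3 um diffraction limit quoted below.
```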
[0279] Displays of different sizes can be realized with the optical methods set forth herein. Pixel size is one limiting factor on the achievable beam collimation level and should be considered carefully when designs for different use cases are created. Table 1 presents some example calculated values for the achievable beam collimation level (shown as beam divergence half angles) for three different exemplary display cases. All the displays are considered to have FullHD spatial resolution mapped with the projector cell structure. The "TV" display has a projector cell pitch of 0.5 mm, which means that the full horizontal width is ~960 mm (~46" screen). The desktop display has a projector cell and screen size half that of the TV case, and the mobile display refers to a ~5.3" display with a 65 μm projector cell size. The calculations assume that all three display types are positioned at a 1 m distance from the observer. The first column on the left shows possible pixel sizes and the other three columns show the achievable collimation angles with the different displays. The table shows clearly that as the pixel size decreases, the collimation quality improves as beam divergence goes down. The table also shows that if the pixel size is kept the same, collimation quality decreases as the projector cell width is reduced. Optical analysis also shows that with very small pixel sizes, diffraction becomes a limiting factor. The numbers in the table are calculated with a projector cell that has an F# of 2, and with these structures the diffraction limit corresponds to a pixel size between 2 μm and 3 μm.
[Table 1: Achievable beam collimation levels (beam divergence half angles) versus pixel size for the TV, desktop, and mobile display cases.]
[0280] Geometric calculations show that if a voxel is positioned at 500 mm distance from the observer, one beam would fill a 4 mm diameter pupil of one eye if the beam has a divergence of 0.23° (half angle). Similarly, the beam divergence angle for a voxel positioned at 1000 mm distance is 0.11° and at 1500 mm distance 0.08°. These angles represent the upper limits of projector cell beam divergence that can be allowed for these voxel distances. Now the values shown in Table 1 can be used as a tool for some further analysis of the trade-offs between design parameters for the projector cell. For example, the last column on the right shows that a mobile device LF display would need a pixel size below 2 μm in order to generate voxel distances beyond 500 mm; here diffraction becomes a clear limiting factor, making it impossible to reach this voxel distance without disturbing effects from excessive beam divergence. However, a desktop display could reach the 500 mm voxel distances with ~4 μm pixels, and the TV set may render voxels at 1500 mm distance with ~3 μm pixels, as the projector cell optics would be just above the diffraction limit.
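These divergence limits follow directly from the pupil diameter and the voxel-to-eye distance; a minimal check reproducing the quoted values:

```python
import math

def max_half_divergence_deg(pupil_mm: float, voxel_to_eye_mm: float) -> float:
    # Largest beam divergence half angle for which a single beam still
    # fits inside the eye pupil at the given voxel-to-eye distance.
    return math.degrees(math.atan(pupil_mm / 2.0 / voxel_to_eye_mm))

for d in (500, 1000, 1500):
    print(f"voxel at {d} mm: max half angle {max_half_divergence_deg(4.0, d):.2f} deg")
# -> 0.23, 0.11 and 0.08 degrees, matching the figures above.
```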
[0281] Without any light scattering media between the display and the viewer, all of the LF pixels in the display may project emitter images towards both eyes of the viewer. However, in order to create the stereoscopic image, one emitter inside the LF pixel should not be visible to both eyes simultaneously if the created voxel is located outside the display surface. This means that the FOV of one LF pixel may cover both eyes, but the sub-pixels inside the LF pixels may have FOVs that make the beams narrower than the distance between two eye pupils (~64 mm on average) at the viewing distance. The FOV of one LF pixel and also the FOVs of the single emitters are determined by the width of the emitter row/emitter and the magnification of the whole imaging optics. It may be noted that one voxel created with a focusing beam is visible to the eye only if the beam continues its propagation after the focal point and enters the eye pupil at the designated viewing distance. The FOV of a voxel is preferably adequate for covering both eyes simultaneously. If the voxel were visible to a single eye only, the stereoscopic effect would not be formed and the 3D image could not be seen. As one LF pixel emitter can be visible to only one eye at a time, the voxel FOV may be increased by directing multiple crossing beams from more than one LF pixel to the same voxel inside the human persistence-of-vision (POV) time frame. In this case, the total voxel FOV is the sum of the individual emitter beam FOVs.
[0282] In order to make LF pixel FOVs overlap at a specified viewing distance, in some embodiments, the display is, for example, curved with a certain radius, or the projected beam directions are turned towards a specific point, such as with a flat Fresnel lens sheet. If the FOVs do not overlap, the LF pixels cannot be seen and some parts of the 3D image cannot be formed. Due to the limited size of the display and practical limits for possible focal distances, an image zone is formed in front of and/or behind the display device where the 3D image is visible. FIG. 38A shows a schematic presentation of an example viewing geometry that can be achieved with an exemplary 3D LF display structure. In front of the display, there is a 3D image zone limited by the furthest focal distance from the display with reasonable spatial resolution and by the whole display FOV. In the pictured case, the display surface is curved with a radius which is the same as the designated viewing distance. The overlapping LF pixel FOVs form a viewing zone around the facial area of the viewer. The size of this viewing zone determines the amount of movement allowed for the viewer's head. When both eye pupils are inside the zone simultaneously, a stereoscopic effect is visible. Similarly, for a flat display screen, viewing zones for one or multiple viewers may be generated, as shown in FIG. 38B (for an exemplary flat display as discussed below in relation to a display with a directional backlight). In FIG. 38B, the portion of a viewing zone in front of a viewer's eyes is shaded.
[0283] The size of the viewing zone may be selected on the basis of the use case by altering the LF pixel FOVs. FIGS. 39A-39B show schematic presentations of two different example viewing geometry cases with a curved display. In FIG. 39A, a single viewer is sitting in front of the display and both eye pupils are covered with a small viewing zone achieved with narrow LF pixel FOVs. The minimum functional width of the zone is determined by the eye pupil distance (~64 mm on average). A small width also means a small tolerance for viewing distance changes, as the narrow FOVs start to separate from each other very fast both in front of and behind the optimal viewing location. FIG. 39B shows a viewing geometry where the LF pixel FOVs are quite wide, making it possible to have multiple viewers inside the viewing zone and at different viewing distances. In this case the positional tolerances are also large.
[0284] The viewing zone can be increased by increasing the FOV of each LF pixel in the display. This can be done either by increasing the width of the light emitter row or by making the focal length of the collimating optics shorter. The maximum width for the emitter row is determined by the width of the projector cell (LF pixel aperture): there cannot be more components in a single projector cell than can be bonded to the surface area directly below the collimating lens. If the focal length of the collimating lens is decreased, the geometric magnification increases, making it more difficult to achieve a specific voxel spatial resolution. For example, if the collimator lens focal length is halved, the LF pixel FOV is doubled, but the source image magnification to all focal surfaces increases by a factor of two, and it follows that the voxel size on a given focal surface is also doubled. The resolution reduction can be compensated by decreasing the highest magnification ratio by bringing the edge of the image zone closer to the display surface.
Unfortunately this will make the total volume where the 3D image is visible shallower and the visual experience more restricted. Overall, this connection between the different design parameters means that there is a trade-off situation between 3D image spatial resolution and sizes of the image and viewing zones. If the viewing zone is increased, the result is either lower resolution on the focal surface closest to the viewer or decrease in the size of the 3D image zone.
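A short numerical sketch of this trade-off, with illustrative values (0.4 mm emitter row, 3 μm source, thin-lens geometry):

```python
import math

def lf_pixel_fov_deg(emitter_row_mm: float, focal_mm: float) -> float:
    # Full FOV of one LF pixel from the emitter-row width and focal length.
    return math.degrees(2 * math.atan(emitter_row_mm / 2.0 / focal_mm))

def voxel_width_mm(source_um: float, focal_mm: float, surface_mm: float) -> float:
    # Voxel width on a focal surface from geometric magnification.
    return source_um / 1000.0 * surface_mm / focal_mm

# Halving the focal length roughly doubles the FOV but exactly doubles
# the voxel size on any given focal surface:
for f in (1.0, 0.5):
    print(f"f={f} mm: FOV {lf_pixel_fov_deg(0.4, f):.1f} deg, "
          f"voxel at 500 mm {voxel_width_mm(3, f, 500):.1f} mm")
```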
Light Field Rendering Schemes
[0285] LF rendering schemes. Several different kinds of rendering schemes can be used together with the disclosed display structure(s) and optical method(s). Depending on the selected rendering scheme, a particular display device may be a multi-view display with a very dense grid of angular views, or a true LF display with multiple views and focal surfaces. [0286] In the simpler multi-view rendering scheme, each LF pixel projects one pixel of each 2D view from the same 3D scene. This leads to a situation where all pixels in one 2D view image are created with the sub-pixels that are at the same positions inside the projector cells (e.g., upper-right-corner LF sub-pixels projected towards a view left of and below the display center line). One 2D image, representing one 3D view at one view direction, may be created and shown on the matrix of LF pixels simply by activating the same sub-pixel inside each projector cell. The multi-view image field may then be made much denser by modulating these images in synchronization with the diffractive foil or tilting plate wave that initiates scanning of additional view directions in between the main view directions. This rendering scheme would not be able to provide the correct focus cues for the eyes, as there would be only one focal surface at the surface of the display. However, this scheme may be much simpler to implement, as the rendering may call for only a series of 2D views at small angular intervals.
[0287] In a true LF rendering scheme, several focal points or planes are created in front of the viewer(s) in front of or behind the physical display surface, in addition to the multiple viewing directions. This employs a different rendering approach as at least two projected beams are generated for each 3D object point or voxel. For all voxels that are between the display and viewer, the convergence beams should cross in front of the display at the correct voxel distance. In a similar way, the voxels positioned at a further distance from the viewer than the display should have a beam pair virtually crossing behind the display. When the 3D object pixels are exactly at the display surface, only one beam is used. The crossing of the (at least) two beams is able to generate a focal point (or plane) that is not restricted to only the display surface. In other words, the beams can create the desired true light field.
[0288] As the true LF rendering calls for heavy calculations, the 3D data may be reduced to certain discrete depth layers that are just close enough to each other for the viewer's visual system to have a continuous 3D depth experience. Covering the visual range from 50 cm to infinity would take ~27 different depth layers, based on the estimated human visual system average depth resolution. The depth layers can be displayed temporally in sequence according to distance, or they can be mixed and adapted on the basis of the image content. In some embodiments, viewer positions are actively detected by the system and voxels are rendered only to those directions where the viewers are located. Active viewer eye tracking, such as using near infrared light with cameras around or in the display structure, may be used for this viewer position detection.
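One way to place such layers is uniformly in dioptre space, which mirrors the roughly constant dioptric depth resolution of the eye; covering 0.5 m to infinity spans 2 dioptres, so 27 layers sit ~0.077 D apart. A minimal sketch:

```python
def depth_layers_mm(near_mm: float = 500.0, n_layers: int = 27) -> list:
    # Layers spaced uniformly in dioptres between 1/near (2.0 D for 0.5 m)
    # and 0 D (infinity).
    near_dpt = 1000.0 / near_mm
    layers = []
    for i in range(n_layers):
        d = near_dpt * (1.0 - i / (n_layers - 1))   # dioptres
        layers.append(float("inf") if d == 0.0 else 1000.0 / d)
    return layers

layers = depth_layers_mm()
print([round(x) for x in layers[:4]], "...", layers[-1])   # [500, 520, 542, 565] ... inf
```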
[0289] Selection of the appropriate rendering scheme is dependent on the particular hardware limitations and use case. For example, in a wall-sized advertisement display that is used in a well-lit area, the desired high light intensity easily leads to relatively large light emission layer pixel size as high intensity light emitting components are not readily available in very small sizes. The display may be configured to be viewable from a large distance by multiple simultaneous viewers (and not necessarily to be viewable by a nearby viewer). In this case, a multi-view rendering scheme may be more appropriate, as the long distance between the viewers and the display means that the viewer perception of depth is less accurate and a dense multi-view display can create the 3D effect well enough. The relatively large pixels also do not allow the fine-tuning used for a true LF display with multiple focal surfaces. Another example case is a smaller display for a single user created with a light emitting layer that has a large number of very small pixels with lower light intensity. In this case, a more complex true LF rendering scheme may be utilized, as the spatial resolution may be adequate and the large number of focal surfaces can be calculated for a single user direction and eyebox without excessive calculation power and/or data transfer speeds.
[0290] Exemplary methods may be applied to many different sized displays with different numbers of pixels. The single view direction sweeping angle is dependent on the LEL pixel pitch and size, which means that it will also be considered during the rendering as the timing between the propagating wave and pixel activation is synchronized. There is also a trade-off situation between spatial and temporal multiplexing. If more pixels are available, there can be more simultaneously projected beam directions and pixels can be activated with a slower pace than when there are only a small number of real pixels to be swept over larger angular ranges. If additional eye/face tracking is used for viewer detection, the positional information can be used for adapting the rendering scheme to produce, among other benefits, better resolution views in the correct direction or energy savings by not activating views which are not visible to the viewer(s).
[0291] Another trade-off situation associated with the rendering scheme exists between spatial/angular and depth resolutions. With a limited number of pixels and switching speeds, there is a balance between emphasizing high spatial/angular resolution at the cost of a lower number of focal surfaces, or having more focal surfaces for a better depth effect at the cost of a more pixelated image. The same applies to the data processing at the system level, as more focal surfaces call for more calculations and higher data transfer speeds. The human visual system allows reduction of depth information when the objects are further away, as the depth resolution decreases logarithmically. At the same time, the eyes can resolve only larger details as the image plane goes further away. This makes it possible to optimize rendering schemes that produce, for example, different pixel resolutions at different distances from the viewer, lowering the processing speed requirement for image rendering. All of these trade-offs connected to the rendering scheme can also be adapted based on the presented image content, for example enabling higher resolution or image brightness.
[0292] In some embodiments, in order to create a full-color picture, three differently colored pixels are used on the LEL. The color rendering scheme adapts to the fact that different colors are diffracted to different angular directions at the grating foil (or other flexible light bending layer). Some of this effect can be compensated with the hardware as previously discussed (e.g., by integrating diffractive structures into the focusing lens sheet to make it color corrected, so as to compensate for the different focus distances of the collimating lens), but the remaining color separation may be handled using special color rendering. One rendering scheme uses the movement of the propagating foil or tilting plates (or other flexible light bending layer) to advantage, and activates the differently colored sub-pixels at slightly different times from each other. If the time intervals between red, green, and blue pixel activation are chosen appropriately, the wave has sufficient time to propagate (or the plates to tilt) to positions where all three colored pixels are projected to the same direction. This results in the colors being combined in the single projector cell by introducing a short time shift between the different colored image projections.
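A minimal sketch of such time-shifted color activation; the wave speed and the per-color spatial offsets below are hypothetical placeholders, not values from the embodiments:

```python
def activation_times_ms(base_ms: float, wave_speed_um_per_ms: float,
                        color_offsets_um: dict) -> dict:
    # Fire each colored sub-pixel when the propagating wave (or plate tilt)
    # has moved far enough that all three colors project to the same
    # direction; offsets are the residual color separations to cancel.
    return {color: base_ms + offset / wave_speed_um_per_ms
            for color, offset in color_offsets_um.items()}

# Hypothetical: the sweep shifts the pixel image 4 um per millisecond,
# and dispersion leaves red and blue displaced around green by 4 um.
print(activation_times_ms(10.0, 4.0, {"green": 0.0, "red": 4.0, "blue": -4.0}))
# -> {'green': 10.0, 'red': 11.0, 'blue': 9.0} (activation times in ms)
```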
Exemplary Use Cases
Exemplary Grating Film.
[0293] In an exemplary case, a 1 meter wide LF display with the discussed wavy diffraction foil projector cell structure (flexible diffractive foil disposed over the projector cells) is positioned at 1 meter distance from multiple viewers. The display generates collimated beams to the observer directions. A rendering scheme with multiple light emitting points for each voxel (a 3D pixel) is used in order to create a true LF display with correct eye convergence and focus cues. For each voxel, at least two beams are emitted from at least two points on the display surface. Those convergence beams are created by selectively activating different projector cell sub-pixels corresponding to the correct coarse angular directions. The pixel projections are swept through the small angular ranges with the wavy diffractive foil approaches discussed herein, and as the pixels are modulated in synchronization with this angular scan, a very dense light field is generated. The beam sweeps also create virtual focal surfaces for the eyes at different depths calculated from the 3D data with the true LF rendering scheme.
[0294] Eye spatial resolution at 1 m distance can be as high as 0.29 mm, which means that a 1 m wide display has a maximum of 3448 horizontal LF pixels. Each of these LF pixels may also have several sub-pixels in them. As an example, if one projector cell sub-pixel is ~3 μm in size, one LF pixel may have around 100 sub-pixels generating around 100 spatially multiplexed beam propagation directions. This may be done with a display that has the appearance of a surface without pixels, as the projector cell size would be at the eye resolution limit. The number of unique directions may be increased by the diffractive foil and propagating wave discussed herein, as additional projection directions can be packed in between these main directions. [0295] Table 2 lists the beam angles that a single projector cell on the display surface should be able to provide for a single user at the central direction. On average, the eye pupil diameter can be estimated to be around 5 mm. If a voxel is rendered at 500 mm distance from the eyes, a maximum angular sweep of ±0.28° may be used from the display surface for one eye focus accommodation. The beam starting point on the display surface changes from cell to cell when a LF 3D image is rendered for one observer direction. The LF can be rendered for other eye pairs in the multi-view configuration by adding the new view direction angles to the convergence and accommodation angles mentioned in the table.
Table 2. Simulated approximate values for convergence and accommodation angles that a LF display provides when showing three different voxel distance layers.
[0296] Collimated beam quality, and with it the furthest visible voxel distance for a true LF display, are determined by the pixel size on the light emitting layer, the collimating lens F#, and diffraction. As an example, pixels on a currently available OLED display are as small as ~9.3 μm and pixels on an LCoS display are as small as ~3.74 μm. It can be seen from Table 2 that if the observer looks at a voxel positioned at 1500 mm distance, their eyes converge 1.24° towards the center line, and ray bundles hitting the eye should have ±0.0985° collimation to achieve realistic focus-accommodation information. This collimation level can be achieved with a projector cell that has ~3 μm sub-pixels and a collimation lens with 0.9 mm focal length. A 1.24° convergence angle means a ~10 μm distance between pixels on the LEL surface.
[0297] In a true LF display design, the projected beam diameter should be smaller than the eye pupil size, as the collimated beam should not create a disturbing extra focus in the eye that interferes with the artificially generated swept-beam focus. This means an exemplary display is capable of displaying 3D views inside this depth range when the beam diameter condition is met. The beam generated in a single projector cell may have a diameter less than 0.27 mm, which corresponds to about one tenth of the eye pupil diameter. This cell size may provide 4K spatial resolution for a 46" display, or Full HD resolution for a smaller 500 mm wide desktop display.
[0298] For an exemplary display case with 3 μm pixels, about 10° of grating tilt change may be used to sweep the beam angle between two adjacent pixels on the LEL if the grating has 1000 lines/mm. With this angle, a continuous array of directions can be created, as the projection angles of neighboring pixels start to overlap with each other. Such a propagating wave slope maximum angle can be considered quite moderate. In embodiments where the grating foil surface wavelength is at least 10 times the projector cell aperture size (so that the aperture does not see the curvature of the grating wave), the propagating wavelength minimum is 2.9 mm at a 0.29 mm projector cell pitch. The desired wave amplitude can be calculated from the waveform, wavelength, and maximum grating tilt. A propagating wave fulfilling these conditions may, for example, have a surface wavelength of 2.9 mm, such that the 10° maximum angle shift is achieved with a wave amplitude of 0.13 mm. If the foil thickness is 0.1 mm, the total minimum space for the foil with the propagating wave form is only 0.23 mm.
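The quoted 0.13 mm amplitude is consistent with a waveform whose flanks have a constant slope (a triangular profile); a short check under that assumption:

```python
import math

def triangle_wave_amplitude_mm(max_tilt_deg: float, wavelength_mm: float) -> float:
    # A triangular wave rises by its amplitude A over a quarter wavelength,
    # so the flank slope is A / (L/4); solve for A at the desired tilt.
    return math.tan(math.radians(max_tilt_deg)) * wavelength_mm / 4.0

L = 10 * 0.29   # surface wavelength = 10 x the 0.29 mm cell pitch = 2.9 mm
print(f"A = {triangle_wave_amplitude_mm(10.0, L):.2f} mm")   # -> 0.13 mm, as above
```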
Exemplary Display with Tilting Plates
[0299] In an exemplary embodiment, a 5" mobile phone LF display is viewed from 0.5 meter distance by a single observer. The display projects collimated beams towards the observer into a viewing box that is 200 mm wide. This box size can easily accommodate the width of a human face. A rendering scheme with multiple light emitting points for each voxel (a 3D pixel) is used in order to create a multi focal surface LF with correct eye convergence and focus cues. For each voxel, at least two beams are emitted from at least two points on the display surface. The beams are created by selectively activating different projector cell sub-pixels corresponding to the correct angular directions. The pixel beams are also swept through small angular ranges with the tilting plate method and as the pixels are modulated in synchronization to this angular scan, a very dense light field is generated in the horizontal direction where the angular scans are made. The beam sweeps create virtual focal surfaces for the eyes at different depths.
[0300] The display projector cells project the LF pixel images into a viewing angle of 22 degrees, which covers a ~200 mm wide area at the 500 mm viewing distance. In order to reach adequately small beam divergence values for good eye accommodation, a reasonable full-color pixel size of 12 μm is selected and the projector cell pitch is set to 250 μm. Pixels on a currently available OLED display can be as small as ~9.3 μm and on an LCoS display as small as ~3.74 μm. The pixels are divided into 3 individually addressable sub-pixels with red, green and blue filters, making each separate colored emitter surface only 4 μm wide. The desired ±11 degree FOV can be covered with a cell structure that has 600 μm focal length microlenses made from the optical plastic material Zeonex E48R and an aperture mask with 200 μm diameter holes in front of the lenses. Such a projector cell or LF pixel is able to produce ~21 unique full-color pixel projections in the horizontal direction even without the plate tilting action.
[0301] Table 3 shows calculated figures for projected beam divergence values for voxels positioned at four different distances from the observer.
Table 3. Simulated approximate values for convergence and accommodation angles that a LF display provides when showing four different voxel distance layers.
[0302] The described projector cell structure is able to generate individual beams with ±0.2 degree divergence from the 4 μm sized colored sub-pixels. It can be seen from the table that this collimation level is adequate for presenting voxels that are somewhat further away from the viewer than the display surface (>500 mm), but voxels at 1 m distance are already too far away to be accurately rendered for the human eye. The individual sub-pixel beams create a spot that is ~3.5 mm in diameter at the 500 mm viewing distance from the display. This means that the light from two sub-pixels can enter the ~5 mm diameter eye pupil simultaneously. It also means that with a static display, the pixels would appear colored, because only two of the three neighboring color pixels can enter the eye at the same point in time.
However, as the described tilting plate method employs temporal multiplexing, this color separation problem can be solved with a hardware-based color rendering scheme described later on.
[0303] The plate array of the exemplary embodiment may vary from all refractive plates being normal to the display surface to all refractive plates being at maximum tilt magnitudes (for example, with alternating orientation) with respect to the display surface. In the exemplary display, about 12 degrees of plate tilt is used for creating a lateral shift corresponding to one 4 μm colored sub-pixel width, with a plate thickness of 50 μm and a plate made from optically clear polystyrene. With this maximum tilting angle, a continuous array of directions can be created as the projection angles of neighboring pixels start to overlap with each other. A standing wave can have a surface wavelength of 0.5 mm corresponding to two projector cell widths, and the maximum tilting angle gives a maximum total thickness of only ~0.1 mm to the structure. The plate sheet can be manufactured by embossing V-grooves on both sides of a 50 μm thick polystyrene foil with 250 μm spacing. The thinner areas then function as hinges between the thicker foil parts that act as the tilting plates when the standing wave movement is activated.
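The stated shift can be checked with the same plane-parallel-plate displacement formula used earlier, assuming a typical polystyrene refractive index of ~1.59:

```python
import math

def plate_lateral_shift_um(t_um: float, n: float, tilt_deg: float) -> float:
    # Lateral ray displacement through a tilted plane-parallel plate.
    th = math.radians(tilt_deg)
    return t_um * math.sin(th) * (1 - math.cos(th) / math.sqrt(n * n - math.sin(th) ** 2))

# 50 um plate, n ~ 1.59 (assumed for polystyrene), 12 degree tilt:
print(f"{plate_lateral_shift_um(50, 1.59, 12):.1f} um")   # ~3.9 um, about one 4 um sub-pixel
```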
[0304] One limiting factor related to the creation of multiple views in this example is the refresh frequency of the pixelated light emitting layer. Refresh frequencies for an LCD LEL can be as high as 600 Hz. At 600 Hz, the display pixels can be modulated 10 times inside the 60 Hz limit commonly considered suitable for a flicker-free image for the human eye. It follows that if the plates tilt back-and-forth at a rate of 30 Hz, full angular scans can be performed at the 60 Hz rate, and 210 (21 spatial × 10 temporal) unique full-color horizontal pixel beams could be generated with each projector cell inside the human visual system POV timeframe.
[0305] In the exemplary display, achievable spatial resolutions are different in all of the three orthogonal dimensions. Pixel size in the vertical direction can be 12 μm if the pixels have a square aperture, as the multi-view scanning occurs only in the horizontal direction. The selected rendering scheme determines the achievable spatial resolution in both the horizontal and depth directions. Projector cells form an array on top of the display surface with 480 side-by-side LF pixels that are all capable of projecting 210 multiplexed beams into the space in front of the display without flicker. Each voxel is created with at least two crossing beams, and as the display is used at close range, several different focal surfaces (e.g., 10) may be created in order to give the appearance of continuous depth. The number of selected beams per voxel and the number of selected focal surfaces are parameters of the rendering scheme with a clear trade-off relationship. Together they determine the final number of pixels that may be projected in the horizontal direction and, with it, the horizontal pixel resolution.
[0306] In some embodiments, the plate tilting may be used for the creation of full-color beams projected from the projector cell, and there is no need for a separate color rendering scheme. As the pixel virtual positions are scanned across the width of three 4 μm sub-pixels, the full-color rendering can be achieved by introducing a small time delay between the activation of the sub-pixels. This time delay causes the different color pixels to combine into one full-color pixel that is only 4 μm wide. The position where the full-color pixel is activated is controlled separately by a LF rendering scheme that determines the suitable pixel projection directions. If the plate back-and-forth scanning is done at a 30 Hz rate, two angular sweeps are made in each cycle. As only one 12-degree tilt is sufficient from the tilting plate to bring two sub-pixels on top of each other, a suitable time delay value of ~8 ms is obtained between successive red, green and blue sub-pixels.
Exemplary Display with Directional Backlight
[0307] An exemplary 3D LF display comprises the disclosed directional backlight method and projector cell structure. The display has a width of ~576 mm in the horizontal direction, which corresponds to a 26" monitor having a standard aspect ratio. With a single viewer positioned at 1 m distance from the display, the display fills an area corresponding to a 32 degree angular field-of-view. Multiple images from different view directions of the same 3D content are projected to a viewing zone covering the single user's facial area, and the user is able to see a 3D stereoscopic image. A true LF rendering scheme may be used for the creation of the full-color images.
[0308] The exemplary 3D LF display may use a single cell projector structure as in FIG. 27A. The light is emitted from a cluster of μLEDs that are arranged into a matrix with horizontally alternating red, green and blue components. Each single μLED component has a width of 2.5 μm, and the components are bonded to a backplane with a horizontal pitch of 3.5 μm. This means that one three-color light emitter pixel is fitted into a horizontal width of ~10 μm. In the vertical direction, the different colored sub-pixels may be arranged in slanted columns, which opens up the possibility of increasing the horizontal spatial resolution at the cost of lowered vertical resolution. However, as the vertical light emitting pixel size can also be smaller than 10 μm, there is ample space available between pixel rows and the spatial resolutions can be balanced in order to achieve the appearance of uniform spatial resolution in both directions. In the exemplary scenario, the total width of the light emitting pixel cluster is ~0.8 mm, which means that there are a total of 229 sub-pixels or 76 three-color pixels placed side-by-side.
[0309] A collimating lens made of the material Zeonex E48R is placed at a 5 mm distance in front of the μLED cluster. The collimating lens has a focal length of 5 mm and an aperture diameter of 0.8 mm. This lens effectively collimates the light emitted by the μLEDs into narrow directional beams. The beams hit a transmissive grating foil that is placed close (e.g., ~0.1 mm to 0.2 mm) to the collimating lens. The grating foil is made from polystyrene in this example. It has 500 grating lines/mm that are designed to diffract the incident light intensity evenly into orders −1, 0 and 1. Alternatively, the grating may be designed to diffract the light into more than just three orders, in which case the single illumination beam would be split into more than three child-beams. In the present exemplary case, the three child-beams hit a focusing lens (made of the material Zeonex E48R) that has a focal length of 2.8 mm and an aperture diameter of 2.4 mm. This lens is placed 2.65 mm in front of the beam collimation lens and it re-focuses the separated beams into pixel images at a diffuser sheet positioned at a 2.5 mm distance from the focusing lens. The magnification ratio of this two-lens system is ~2:1. Diffraction blurs the images, and the resulting μLED image sizes (Airy disc size considered) are the following: red ~6.3 μm, green ~5.3 μm and blue ~4.8 μm. This first part of the projector cell forms a backlight structure that is capable of producing an array of very small, color controlled virtual light emitting pixels that can be individually turned on and off with control electronics.
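The quoted image sizes are consistent with a rough model that adds the geometrically demagnified emitter width to an Airy-disc diameter computed with the 0.8 mm collimating-lens aperture as the limiting beam diameter; both of those modeling choices are assumptions made here for illustration:

```python
def spot_size_um(emitter_um: float, mag: float, wavelength_um: float,
                 image_dist_mm: float, beam_diameter_mm: float) -> float:
    # Geometric image width plus the Airy disc diameter,
    # 2.44 * lambda * (image distance / beam diameter).
    airy_um = 2.44 * wavelength_um * image_dist_mm / beam_diameter_mm
    return emitter_um * mag + airy_um

mag = 2.8 / 5.0   # ~2:1 demagnification from the two focal lengths
for name, wl in (("red", 0.65), ("green", 0.55), ("blue", 0.45)):
    print(f"{name}: ~{spot_size_um(2.5, mag, wl, 2.5, 0.8):.1f} um")
# -> ~6.4, ~5.6 and ~4.8 um, close to the quoted ~6.3 / ~5.3 / ~4.8 um.
```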
[0310] A more complete LF display projector cell is formed when an LCD with polarizing films on both sides is placed in contact with the diffuser foil and a lenticular sheet is placed in front of the display. The pixel size of the LCD may be the size of the individual μLED sub-pixel images. However, this is not mandatory, as the μLEDs themselves are components that can be individually addressed and the pixel images can be turned on and off (with the limitation that all diffraction orders connected to a single μLED work in unison). The smallest pixel sizes in currently available LCDs can produce DPI (dots per inch) values between 2000 and 3000, which corresponds to ~10 μm pixel sizes. This presently available pixel size is suitable for modulating images generated with the presented backlight system. At this present minimum size, the ~10 μm pixel can block ~2 side-by-side full-color virtual pixels, where the differently colored spots are overlaid on top of each other. As the beams are generated with separate colored components, the LCD pixels do not need any color filters, and may be used as simple light valves that either block or pass the illumination beams. In this exemplary scenario, three side-by-side lenticular lenses are used for one projector cell structure corresponding to the three diffraction orders. In the exemplary display, the focal length of the lenticular sheet microlenses (made of a material such as PMMA) is ~0.7 mm and each microlens is 0.8 mm wide. When positioned at approximately the focal length distance from the LCD layer, the lens projects well collimated beams towards the viewer. Of course, other lens focal lengths and aperture sizes may be used for achieving different full display spatial resolution and beam divergence values. In the exemplary display structure, there are 720 lenses on the topmost layer in the horizontal direction, which corresponds to 720 LF pixels.
[0311] Collimated beam quality, and resultantly the furthest visible voxel distance for a true LF display, are determined by at least the pixel size on the light emitting layer, the collimating lens F#, and diffraction. Table 4 lists the beam angles that a single projector cell on the display surface provides for a single user at the central direction. The table provides convergence and accommodation angles that the LF display provides for four different voxel distance layers in the exemplary scenario.
Display surface distance: 1000 mm. Voxels and eyes are on the optical axis going through the display at the origin (center).

                                                    Config. 1   Config. 2   Config. 3   Config. 4
Voxel distance (mm)                                   250.00      333.00      500.00     1000.00
One eye convergence angle (°)                          -7.40       -5.59       -3.72       -1.86
Eye lens accommodation focal length (mm)               15.93       16.18       16.44       16.72
  (relaxed f = 17 mm)
Ray angles from display surface used to initiate      ±0.591      ±0.457      ±0.312      ±0.173
  eye accommodation (± angle °), for eye pupil
  D = 5 mm

Table 4. Simulated approximate values for convergence and accommodation angles that an LF display provides when showing four different voxel distance layers.
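The convergence and accommodation rows of Table 4 follow from simple viewing geometry and a thin-lens eye model. The sketch below reproduces them, assuming a 65 mm interpupillary distance (not stated in the table); the ray-angle row roughly follows atan(pupil radius / voxel distance), although the tabulated values are simulated and slightly larger:

```python
import math

ipd = 65.0        # mm, assumed interpupillary distance
f_relaxed = 17.0  # mm, relaxed eye focal length (from Table 4)

for d_voxel in (250.0, 333.0, 500.0, 1000.0):
    # One-eye convergence: each eye rotates by atan((ipd/2) / voxel distance).
    convergence = math.degrees(math.atan((ipd / 2) / d_voxel))
    # Thin-lens eye: image plane (retina) fixed at f_relaxed, object at the voxel.
    f_eye = 1 / (1 / f_relaxed + 1 / d_voxel)
    print(f"{d_voxel:6.0f} mm: convergence -{convergence:.2f} deg, eye f {f_eye:.2f} mm")
# 250 mm -> -7.40 deg / 15.92 mm ... 1000 mm -> -1.86 deg / 16.72 mm, as in Table 4.
```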
[0312] On average, an eye pupil diameter can be estimated to be around 5 mm, and, for example, if a voxel is rendered at a 1000 mm distance from the eyes, a maximum angle of ±0.17 degrees from the display surface is sufficient for one-eye focus accommodation. The previously described projector cell structure produces ~4.8-6.3 μm light-emitting virtual pixels at the diffusing foil. These beams can be further collimated into illumination beams, using the described lens, with divergences of ~±0.12 to ±0.14 degrees. These values are limited by diffraction relating to the cell lens design, and they limit the longest voxel distance that can be rendered without conflicting eye focus to a value around 1 m. This means that, considering the beam quality alone, the 3D image space is constricted to a depth range with a maximum distance approximately at the display surface and a minimum distance approximately 25 cm from the viewer.
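The ~1 m limit can be checked by inverting the accommodation-angle criterion: a beam of divergence θ fills a 5 mm pupil out to roughly d = r_pupil / tan(θ). A sketch under that purely geometric assumption:

```python
import math

r_pupil = 2.5  # mm, half of the assumed 5 mm eye pupil
for divergence_deg in (0.12, 0.14):
    d_max_mm = r_pupil / math.tan(math.radians(divergence_deg))
    print(f"±{divergence_deg} deg divergence -> voxels out to ~{d_max_mm / 1000:.1f} m")
# ~1.0-1.2 m, consistent with the ~1 m limit stated above.
```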
[0313] Because the LED components are physically separated on the backplane, the illumination beams also have separated colors. The colors can be combined into three-color beams by utilizing two different methods. The first method is based on the propagating wave in the grating foil and a time delay between pixel activations. If the separately colored LEDs are activated at different time intervals, the propagating wave shifts the positions of the different color component images to exactly the same position on the LCD, and the colors can be combined. An angular local tilting range of ±3 degrees between the grating foil (over the collimating lens aperture) and the illumination beam is sufficient for sweeping the images of single-colored sub-pixels over the adequate range on the diffuser foil. For an exemplary grating foil wave form fulfilling this condition, this grating tilting range results in a total spatial sweep length of ~8 μm with blue pixels (wavelength ~450 nm), ~12 μm with green pixels (wavelength ~550 nm), and ~17 μm with red pixels (wavelength ~650 nm). These are adequate ranges, as the neighboring colored pixel images start to overlap and even cross each other inside the LCD pixel aperture. The second color combination method is based on the horizontal location of the colored LEDs and the diffraction occurring in the diffractive foil. As the different colors are diffracted to somewhat different angles from the grating, the colored pixel images can be combined by compensating for this angular difference by activating red, green, and blue pixels that are at somewhat different locations on the backplane. In the example case, a distance of 0.25 mm between red and green as well as between blue and green pixels is adequate for the color combination. These two methods can also be used together, for example by first selecting the right LEDs on the backplane for the crude positional adjustment and then utilizing the time-based method for fine adjustment of the single beam color. This color combination may employ a calibration routine for the displays, as the colored spots are so small that, for example, hardware manufacturing tolerances can have an effect on the combination capability.
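The quoted sweep lengths can be reproduced from the first-order grating equation. For a grating of period d tilted by a small angle α, the diffracted beam direction in the fixed frame changes by approximately (1/cos θ₁ − 1)·α, where sin θ₁ = λ/d; projecting this angular sweep through the 2.8 mm focusing lens gives the spot shift. A sketch of that calculation (one plausible derivation, not necessarily the inventors'):

```python
import math

period_um = 1000 / 500  # 2.0 um groove period for 500 lines/mm
f_focus_mm = 2.8        # re-focusing lens focal length from the example above
full_tilt = math.radians(2 * 3.0)  # total sweep for a +/-3 deg grating tilt

for name, lam_um in [("blue", 0.450), ("green", 0.550), ("red", 0.650)]:
    theta1 = math.asin(lam_um / period_um)  # first-order diffraction angle
    sensitivity = 1 / math.cos(theta1) - 1  # d(theta_out)/d(tilt) at zero tilt
    sweep_um = sensitivity * full_tilt * f_focus_mm * 1000
    print(f"{name}: spatial sweep ~{sweep_um:.0f} um")
# blue ~8 um, green ~12 um, red ~17 um - matching the values quoted above.
```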
[0314] In the exemplary display, about a ±3° grating tilt range is used to sweep the beam angle between two adjacent pixels on the LEL when the grating has 500 lines/mm. With this angular range, a continuous array of directions may be addressed, as the projection angles of neighboring pixels start to overlap with each other. This propagating-wave-slope maximum angle can be considered quite moderate. An example grating foil surface wavelength is 10 times the projector cell aperture size (so that the aperture does not see the curvature of the grating wave). The minimum propagating wavelength is 8.0 mm for a 0.8 mm collimator lens aperture. The wave amplitude can be calculated from the wave form, wavelength, and maximum grating tilt. For a diffractive foil that has a propagating wave fulfilling these conditions, if the surface wavelength is 8.0 mm, the ±3 degree maximum angle shift is achieved with a wave amplitude of 0.1 mm. If the foil thickness is 0.1 mm, the total thickness of the space taken up by the foil with the propagating wave form is ~0.2 mm.
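For a sinusoidal surface wave y(x) = A·sin(2πx/Λ), the maximum slope is 2πA/Λ, so the amplitude needed for a given tilt follows directly. The text does not specify the waveform, so the sinusoidal figure below is only indicative:

```python
import math

def amplitude_for_tilt(wavelength_mm, tilt_deg):
    # Max slope of A*sin(2*pi*x/L) is 2*pi*A/L; solve for A at the target tilt.
    return wavelength_mm * math.tan(math.radians(tilt_deg)) / (2 * math.pi)

L = 8.0  # mm, surface wavelength (10x the 0.8 mm cell aperture)
A = amplitude_for_tilt(L, 3.0)
print(f"sinusoidal amplitude ~{A:.2f} mm")
# ~0.07 mm for a pure sinusoid; the ~0.1 mm quoted above suggests some design
# margin or a non-sinusoidal waveform.
```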
[0315] Due to the limitation of different color diffraction angles coming from the wavy foil, the full LED cluster width with single-color components cannot be used simultaneously at any single time without introducing some crosstalk between LF pixels. A suitable region width is ~0.6 mm, which means that a single projector cell can produce ~171 separately addressable colored pixels per lenticular sheet lens, and the single LF pixel can produce 171 separate colored beams in the horizontal direction at any given moment. The number of unique view directions can be multiplied by utilizing the propagating wave in the diffractive foil. As the wave propagates, the different diffraction order LED images are shifted inside the LCD screen pixel aperture, making it possible to scan the angular space between two beam directions set by the LED matrix. The temporal multiplexing ability is connected to the refresh frequencies of the LEDs and LCD and to the persistence-of-vision property of the human eye. If the LCD has a refresh frequency of, for example, 240 Hz, the image could be updated 4 times inside the 60 Hz refresh rate that is commonly used as the threshold for flicker-free video. The LED matrix, however, could be modulated much faster, and a series of several virtual pixels with slightly different locations could be generated inside one LCD refresh cycle. In some embodiments, this faster rate is also used for the creation of different light intensity levels. The overall number of views that could be generated with this method depends on the above-mentioned parameters and on the final rendering scheme that is selected for creating the colors and 3D image voxels.
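The beam count and the temporal multiplexing headroom follow from the stated numbers; a minimal tally (the ~3.5 μm LED pitch is inferred from 0.6 mm / 171 and is not stated explicitly):

```python
# Beams per projector cell and LCD sub-frames per persistence-of-vision frame.
region_um = 600             # usable single-color region width (~0.6 mm)
led_pitch_um = 3.5          # inferred pitch: 600 um / 171 beams
beams = round(region_um / led_pitch_um)  # ~171 colored beams
subframes = 240 // 60                    # 4 LCD updates per 60 Hz frame
print(beams, "beams per cell,", subframes, "sub-frames per POV frame")
```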
[0316] Considering the viewing geometry of the exemplary scenario, single LF pixels are able to project the image-creation beams to a maximum angle of 18 degrees. This means that at the 1 m distance of the single viewer, a view-box is created with a width of ~320 mm. This width can easily cover both eyes of a single viewer, as the average distance between eye pupils is ~64 mm. The view region is so wide that it also provides a lot of tolerance for head positioning, making the display more comfortable to use and more robust against head movements. As the single projection beam divergence values are ~±0.13 degrees, the beam widths at the 1 m distance are ~5 mm, which is suitable given the average eye pupil diameter. The spatial separation between two neighboring beams at this distance is only ~1.9 mm (320 mm / 171 beams), which means that two side-by-side beams could be swept over the eye pupil simultaneously, fulfilling the Super-Multi-View (SMV) condition. And as described previously, the colored beams can be combined with the propagating wave in the grating foil. If the wave movement frequency is ~60 Hz, the diffracted beams may be scanned over the LCD pixel apertures and the colored sub-pixels can be combined into three-color beams inside the POV timeframe. Overall, the described display structure and optical methods are well suited for dynamically creating high-resolution full-color 3D voxels.
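The view-box figures follow from plane geometry at the 1 m viewing distance; a sketch:

```python
import math

view_distance_mm = 1000.0
viewbox = 2 * view_distance_mm * math.tan(math.radians(18.0 / 2))  # ~317 mm
beam_width = 2 * view_distance_mm * math.tan(math.radians(0.13))   # ~4.5 mm
spacing = viewbox / 171                                            # ~1.9 mm
print(f"view-box ~{viewbox:.0f} mm, beam ~{beam_width:.1f} mm, "
      f"spacing ~{spacing:.1f} mm")
# Beam spacing is well under the ~5 mm pupil, so two or more beams enter one
# pupil at a time - the Super-Multi-View condition discussed above.
```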
Exemplary Display with Double Refractive Elements
[0317] In an exemplary embodiment utilizing a double flexible refractive element method and structure, a tabletop 3D LF display device with a curved 50" screen is placed at a 1000 mm distance from a single viewer. The display forms a light field image in a volumetric virtual image zone, which covers the distance from 500 mm from the viewer position to the display surface. The display is able to generate multiple views in both the horizontal and vertical directions with the presented LF pixel structure.
[0318] Light is emitted from LED arrays where the component size is 2 μm x 2 μm and the pitch is 3 μm. The array contains red, green, and blue components assembled into a matrix where the differently colored components are interlaced in alternating rows and columns with 9 μm spacing between two same-color chips. The total width and height of the matrix in each LF pixel is ~0.4 mm, and there is a total of 132 x 132 components, which can be used to create 44 x 44 full-color beams emitted from each LF pixel.
Rotationally symmetric collimator lenses are placed at a ~2.3 mm distance from the μLEDs, and they have a focal length of 2.0 mm. The lens array is made from polycarbonate as a hot-embossed 0.5 mm thick microlens sheet. The aperture size of the collimating lenses is 500 μm, which is also the size of a single LF display pixel. Two 350 μm thick polycarbonate foils or films are placed between the LEDs and the collimator lenses. The first foil or film is positioned at an approximate distance of 0.55-0.75 mm from the LED matrix, and it has a propagating waveform in the horizontal direction. The second foil or film is positioned at a minimum distance of 0.2 mm from the first foil or film, and it has a propagating waveform in the vertical direction. The foils or films are driven with piezoelectric actuators. The propagating waves are synchronized in opposite phase to each other by synchronizing the actuators causing the movement. The whole optical structure is only ~3 mm thick, and the LF pixels are capable of projecting multiple beams that can be focused to multiple focal surface layers in front of the display with the foils or films that have propagating waves in both the horizontal and vertical directions.
[0319] A full LF 3D display device can be created in which the whole device surface is filled with the described LF pixel structures. In an exemplary embodiment, the whole display is curved with a radius of 1000 mm in the horizontal direction. This arrangement makes the single LF pixel FOVs overlap, and a ~200 mm wide viewing window is formed for a single user at the designated 1 m viewing distance. In the vertical direction, the viewing window height is also ~200 mm, determined by the total height (0.4 mm) of the LED rows and the LF pixel optics magnification to the viewing distance (500:1). A cylindrical polycarbonate Fresnel lens sheet with a 1000 mm focal length is used for overlapping the vertical views.
[0320] In each LF pixel LED cluster in this example, the red, green, and blue components have the same size and are bonded as interlaced matrices. Their colors are combined in the projected beams on the different focal layers when the crossing beams are combined into voxels. The collimator lens array has integrated diffractive structures that compensate color dispersion in the polycarbonate material. Each LF pixel has 44 red, green and blue rows of 44 LEDs, which are used for projecting 44 x 44 unique full-color views in both horizontal and vertical directions. A single LF pixel has a total FOV of -11.4° x 11.4° and it covers the -200 mm wide viewing window at the 1 m viewing distance where the different horizontal view directions are spatially separated from each other by the distance of -4.5 mm. The polycarbonate foil or film propagating waveforms have maximum tilt angles of ±6°, which are able to generate maximum beam tilts of -±0.4° from the nominal directions. This makes it possible to overlap the neighboring colored pixels and to sweep intermediate view directions between the main view directions determined by the LED positions. This also means that at least two views can be projected into the -4 mm diameter eye pupils simultaneously fulfilling the SMV condition if temporal multiplexing is utilized in the 3D image rendering.
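Several of the quoted figures can be reproduced with elementary optics: the LF pixel FOV from the matrix half-width and collimator focal length, the view spacing from the 500:1 magnification, and the ~±0.4° beam tilt from the lateral ray displacement of a tilted plane-parallel plate. A sketch assuming n ≈ 1.59 for polycarbonate (an assumed index, not stated in the text):

```python
import math

f_mm, matrix_mm = 2.0, 0.4   # collimator focal length and LED matrix width
fov = 2 * math.degrees(math.atan((matrix_mm / 2) / f_mm))  # ~11.4 deg
window_mm = matrix_mm * 500  # viewing window at 1 m (500:1 magnification)
spacing_mm = window_mm / 44  # ~4.5 mm between horizontal view directions

def plate_shift(t_mm, tilt_deg, n):
    # Lateral displacement of a ray through a tilted plane-parallel plate.
    i = math.radians(tilt_deg)
    return t_mm * math.sin(i) * (1 - math.cos(i) / math.sqrt(n**2 - math.sin(i)**2))

shift = plate_shift(0.35, 6.0, 1.59)              # ~0.014 mm source-image shift
beam_tilt = math.degrees(math.atan(shift / f_mm)) # ~0.4 deg after the collimator
print(f"FOV ~{fov:.1f} deg, window ~{window_mm:.0f} mm, "
      f"spacing ~{spacing_mm:.1f} mm, beam tilt ~±{beam_tilt:.1f} deg")
```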
[0321] The created viewing zone around the viewing window allows the viewer to move his/her head ~65 mm left and right as well as ~125 mm forward and ~180 mm backwards from the nominal position. Both eye pupils of an average person will stay inside the viewing zone with these measurements, and the 3D image can be seen in the whole display image zone. This tolerance for the head position is achieved by making it possible to adjust the display tilt angle in the vertical direction or the display stand height, and it can be considered adequate for a single viewer sitting in front of the display in a fairly stable setting.
[0322] In the example display case, about ±6° refractive foil or film tilts across the LF pixel apertures are used for creating a longitudinal shift corresponding to the shift of beam focus from the nominal position of 670 mm from the viewer to a distance of 500 mm from the viewer. With this tilting angle range, a continuous zone of possible focal surfaces can be created between the front part of the image zone and the intermediate image surface. The distance between these two surfaces corresponds to a change in optical power of 0.5 diopters, which is well below the ~0.6 D limit of human depth resolution. The distance between the intermediate image surface and the display surface also corresponds to an optical power change of 0.5 D. This means that the real-world display structure can be used in the creation of a virtual 3D image zone between the viewer and the display with only three focal surfaces, located at distances of 500 mm, 670 mm, and 1000 mm from the viewer. With these focal surfaces, the 3D image content looks practically continuous, as the human visual system is not able to resolve the different focal layers as discrete.
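The 0.5 D spacing follows from converting the focal surface distances to diopters:

```python
# Optical power (diopters) of each focal surface, measured from the viewer.
powers = [1000 / d_mm for d_mm in (500, 670, 1000)]   # 2.00, 1.49, 1.00 D
steps = [a - b for a, b in zip(powers, powers[1:])]
print([round(s, 2) for s in steps])  # [0.51, 0.49] - both under the ~0.6 D limit
```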
[0323] When the refractive foil or film geometry is in the propagating wave configuration with the maximum 6° tilting angles, the waveform has a surface wavelength of 5 mm, corresponding to ten LF pixel widths, and the maximum tilting angle gives a maximum wave amplitude of only ~0.14 mm. This gives a minimum bending radius of ~11 mm, which can be considered feasible for the 0.35 mm thick polycarbonate foil or film considering long-term use. The total thickness of the foil or film with the waveform stays below 0.5 mm, making the structures fairly compact.
[0324] One limiting factor for the creation of multiple views in this embodiment is the refresh frequency of the pixelated light-emitting layer. As the pixel size should be small, OLED and LCD displays can be considered suitable components that are available today. Alternatively, LED matrices may replace these display types in some applications, and they offer a better alternative with faster modulation rates. The highest reported display refresh frequencies for an LCD are in the range of 600 Hz. With such a frequency, the display pixels can be modulated 10 times inside the 60 Hz limit commonly considered sufficient for a flicker-free image for the human eye. This means that if the foil or film waveforms tilt back-and-forth at a rate of 30 Hz, full angular and focal point scans can be performed at the 60 Hz rate, and ~19,000 (44 x 44 spatial * 10 temporal) unique full-color pixel beams could be generated with each LF pixel inside the human visual system persistence-of-vision (POV) timeframe. This huge number of views would call for a very large bitrate for the image rendering, and in practice the number of views can be limited, such as by eye tracking and the selection of only a few view directions.
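The ~19,000 figure is simply the product of the spatial and temporal view counts:

```python
spatial_views = 44 * 44          # LED matrix view directions per LF pixel
temporal = 600 // 60             # 10 sub-frames inside the 60 Hz POV window
print(spatial_views * temporal)  # 19360, i.e. ~19,000 beams per LF pixel
```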
[0325] In order to test the structure functionality and achievable voxel resolution on different image planes between the display and the viewer, a set of simulations was performed with the optical simulation software OpticsStudio 17. As previously noted, the display optical structure was placed at a 1000 mm distance from the viewing window, and two intermediate focal surfaces were placed between the device and the viewer at distances of 500 mm and 670 mm from the viewer. Four rectangular light emitters were created with the LED measurements of 2 μm x 2 μm surface area and 3 μm pitch. These were used in simulations where the light distributions were traced from the display optical structure to the intermediate focal surfaces and to the viewing window. Two refractive foil or film tilt cases were simulated separately in order to show the focusing function achievable with the structure. Simulations were made only with green 550 nm light, which represents an average wavelength in the visible light range. The illumination distribution simulation results show only the geometric imaging effects, and diffraction effects are estimated separately for each case based on the simulated Airy disc radius. With blue sources the diffraction effects are somewhat smaller, and with red sources larger, than with the green sources.
[0326] FIG. 40 presents the simulation results as a table in which the columns show the simulated images at different distances from the viewer and the rows mark the tilt angles. All pictures show a square detector area of 3 mm x 3 mm. At the analysis surface located 1000 mm from the viewer, directly on top of the display surface itself, the LF pixel produces circular images that have the same size (0.5 mm) and shape as the aperture of the collimating lens. The result shows that on the display surface, the minimum voxel size is determined by the size of the LF pixel. Only two LEDs inside the pixel are used for the creation of one voxel at this distance, one for each eye direction. Spatial resolution on the 50" display surface corresponds to Full HD. The display can be used in a regular 2D mode by activating all the sub-pixels inside the LF pixel simultaneously for one pixel of the 2D image. This expands the visibility of each LF pixel to the full FOV of ~11.4°, and all the created voxels are positioned on the display surface.
[0327] On the virtual image plane at a 670 mm distance from the viewer, a clear image of the four rectangular light sources can be seen in FIG. 40 when the foil or film tilt angles are 0° and the foils or films are parallel. When the foils or films are tilted by 6°, the images become clearly blurred, putting the image out of focus. Each rectangle is ~0.33 mm wide, which corresponds well to the geometric magnification ratio of 165:1 at this distance. The image-blurring Airy disc caused by diffraction from the 500 μm diameter LF pixel aperture has a radius of ~1.0 mm with red (650 nm), 0.9 mm with green (550 nm), and 0.7 mm with blue (450 nm) light. In practice this means that the four geometric image spots shown in FIG. 40 are fused together, and it is not possible to determine clearly how many sources there are. The positive side of this fusing effect is the fact that four neighboring sub-pixels could be grouped and shown together without potentially disturbing edge effects coming from the gaps between light emitters. Such a feature is beneficial when the spatial voxel sizes are balanced through the different focal surfaces. In order to make the voxels uniform in size through the 500 mm thick image zone, a matrix of ~2 x 2 neighboring emitters may be used together for the creation of one voxel on the 670 mm distance image surface.
[0328] On the virtual image plane at a 500 mm distance from the viewer, a clear image of the four rectangular light sources can be seen when the foil or film tilt angles are 6° and the angle difference is at its maximum. When the foils or films are parallel (tilted by 0°), the squares become blurred, putting the image out of focus. In comparison to the 670 mm virtual image surface case, the optimal tilt angle for the refractive elements is now different, showing that the idea of using the tilting elements for focus adjustment works. Each rectangle is now ~0.5 mm wide, which corresponds well to the geometric magnification ratio of 250:1 at this distance. As the aperture size of an LF pixel remains the same, the Airy disc radius values at this distance are the same as in the previous case. This means that the images are still clearly blurred by diffraction, but the relative effect is not as large, as the geometric magnification is larger at this distance. The ~0.5-1.0 mm minimum voxel size means that the closest part of the 3D image zone is only capable of producing a resolution approximately comparable to SDTV (standard definition) on a surface size that corresponds to a 25" TV. Performance is clearly limited by diffraction.
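Both magnification ratios follow from projecting a source located near the ~2 mm focal plane of the LF pixel optics out to the image planes (magnification ≈ image distance / focal length, a thin-lens approximation):

```python
f_mm, source_um = 2.0, 2.0
for plane_mm in (670, 500):          # image plane distance from the viewer
    image_dist = 1000 - plane_mm     # distance from the display surface
    m = image_dist / f_mm            # 165:1 and 250:1
    print(f"{plane_mm} mm plane: ~{m:.0f}:1 -> "
          f"{source_um * m / 1000:.2f} mm rectangle")
# 0.33 mm and 0.50 mm, matching the simulated widths in FIG. 40.
```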
[0329] The last column of FIG. 40 shows the simulated images at the viewing window (viewer distance 0 mm) at the designated 1000 mm distance from the display. These images show the eyebox size for each beam case when four neighboring sub-pixels are activated inside one LF pixel. It can be seen from the images that when the foils or films are tilted, the eyebox size does not change; the images are just blurred. This does not mean that the voxel images themselves are blurred, as the eye is focused on the surface where the voxel is created, and the eye lens performs the final imaging by adjusting its focal length. All of the emitters are visible to a single eye only, and the beam sizes are ~1 mm at the eyebox. This size is not adequate for covering the whole eye pupil, and it would likely be beneficial for the correct focal cues to use more than one LF sub-pixel for the creation of voxels. This beam size also makes it possible to fulfill the SMV condition and improve the display's 3D image quality. In order to widen the voxel FOVs to cover both eyes, several active LF pixels and crossing beams may be used at all focal surface distances except for the display surface itself.
[0330] Overall, the simulation results of FIG. 40 show that the refractive tilting elements method can be effectively used for setting the distance of focal surfaces. The presented optical structure allows a 3D LF display that is capable of actively controlling the focus distance of the projected beams that form voxels, which can induce the correct focusing response from the human eye. Furthermore, the real-life example case shows that adequate resolution and 3D image viewing volume can be achieved with a display structure that is fairly simple and thin. In order to further improve the resolution of such displays, an embodiment removing the diffraction effects may be implemented, such as described in U.S. Provisional Patent Application No. 62/580,797, filed January 26, 2018, entitled "Method and System for Aperture Expansion in Light Field Displays." Such approaches may be beneficial when smaller display sizes with smaller LF pixels are desired. One such embodiment is depicted in FIG. 41, which shows schematically the functionality of a display structure that combines an LF display 4110 based on tilting refractive elements and an aperture expander 4150 based on a double grating 4142, 4144 and an SLM 4148. A light source image can be generated both close to the display structure and further away. The aperture expander 4150 widens the beams by splitting them in the first grating layer 4142. The second grating layer 4144 directs the beams back to their original directions, and the beams focus again and form an image of the emitter, but at a longer focal distance. This focal distance change can be compensated for in the collimator lens design.
[0331] Although features and elements are described above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element can be used alone or in any combination with the other features and elements. In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks, and digital versatile disks (DVDs). A processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.

Claims

CLAIMS

What is Claimed:
1. A display apparatus comprising:
a light-emitting layer disposed within the display apparatus;
a collimating microlens array disposed between the light-emitting layer and an outer surface of the display apparatus, the microlens array comprising a plurality of collimating microlenses;
a first flexible light bending layer disposed between the light-emitting layer and the outer surface of the display apparatus; and
at least one actuator operative to generate a traveling wave in the first flexible light bending layer to generate an oscillation in the orientation of the first flexible light bending layer relative to each of the collimating microlenses.
2. The apparatus of claim 1, wherein a portion of the light-emitting layer including a plurality of sub-pixels is associated with a single microlens of the collimating microlens array to define one of a plurality of projector cells, and wherein a portion of the first flexible light bending layer spans a cell aperture of each projector cell.
3. The apparatus of claim 2, further comprising a controller configured to control at least a first projector cell and the at least one actuator to:
based on a location of at least one voxel of 3D content to be displayed by the display apparatus, illuminate a subset of the plurality of sub-pixels of the first projector cell in synchrony with the orientation of the first flexible light bending layer relative to the microlens of the first projector cell to generate a 3D image.
4. The apparatus of claim 3, wherein the generated 3D image comprises a plurality of independent views of the 3D content projected at a plurality of viewing angles.
5. The apparatus of any of claims 1-4, wherein the first flexible light bending layer comprises a flexible diffractive foil, and the collimating microlens array is disposed between the light-emitting layer and the flexible diffractive foil.
6. The apparatus of claim 5, further comprising a microprism array, wherein the flexible diffractive foil is disposed between the collimating microlens array and the microprism array.
7. The apparatus of claim 5, further comprising a spatial light modulator (SLM), wherein the flexible diffractive foil is disposed between the SLM and the light-emitting layer, and the SLM is configured to be controlled in synchronization with the at least one actuator and the light-emitting layer to modulate light diffracted by the flexible diffractive foil.
8. The apparatus of any of claims 1-4, wherein the first flexible light bending layer comprises a first array of tilting refractive plates, and the first array of tilting refractive plates is disposed between the light-emitting layer and the collimating microlens array.
9. The apparatus of claim 8, wherein each refractive plate in the plate array is connected to one or more adjacent plates via a flexible joint.
10. The apparatus of any of claims 1-4, wherein the first flexible light bending layer comprises a first flexible refractive foil, the first flexible refractive foil disposed between the light-emitting layer and the collimating microlens array.
11. The apparatus of any of claims 1-4, further comprising a second flexible light bending layer, wherein the first flexible light bending layer is disposed between the light-emitting layer and the collimating microlens array, and the second flexible light bending layer is disposed between the first flexible light bending layer and the collimating microlens array.
12. The apparatus of claim 11, wherein the first and second flexible light bending layers comprise a first and a second array of tilting refractive plates.
13. The apparatus of claim 11, wherein the first and second flexible light bending layers comprise a first and a second flexible refractive foil.
14. A method comprising:
controlling at least a first actuator to generate a traveling wave in a first flexible light bending layer, said traveling wave generating an oscillation in the orientation of the first flexible light bending layer relative to a plurality of projector cells, each projector cell having (i) a subset of light sources of a light-emitting layer, and (ii) a focusing microlens; and
controlling illumination of the light sources of the projector cells, based on 3D content to be projected as voxels, wherein the control of illumination is in synchronization with the traveling wave in the first flexible light bending layer.
15. The method of claim 14, wherein the first flexible light bending layer comprises a flexible diffractive foil, wherein the microlens of each projector cell is disposed between the light-emitting layer and the first flexible light bending layer, and further comprising: modulating the diffracted light from the flexible diffractive foil with a spatial light modulator, disposed between the first flexible diffractive foil and a second microlens, that is synchronized with the light sources of the projector cells and the traveling wave; and
projecting the modulated light through a third microlens array to project the 3D content.
16. The method of claim 14, wherein the first flexible light bending layer comprises a first array of refractive tilting plates disposed between the light-emitting layer and the microlens of each projector cell, and further comprising:
modulating the light sources of the projector cells and synchronizing the emitted modulated light with the orientations of the refractive tilting plates; and
passing the emitted modulated light through the microlens of each projector cell to project the 3D content.
PCT/US2018/028949 2017-04-24 2018-04-23 Systems and methods for 3d displays with flexible optical layers WO2018200417A1 (en)

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US201762489393P 2017-04-24 2017-04-24
US62/489,393 2017-04-24
US201762522842P 2017-06-21 2017-06-21
US62/522,842 2017-06-21
US201762564908P 2017-09-28 2017-09-28
US201762564913P 2017-09-28 2017-09-28
US62/564,908 2017-09-28
US62/564,913 2017-09-28
US201862633042P 2018-02-20 2018-02-20
US62/633,042 2018-02-20

Publications (1)

Publication Number Publication Date
WO2018200417A1 true WO2018200417A1 (en) 2018-11-01

Family

ID=62148492

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/028949 WO2018200417A1 (en) 2017-04-24 2018-04-23 Systems and methods for 3d displays with flexible optical layers

Country Status (1)

Country Link
WO (1) WO2018200417A1 (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1993021548A1 (en) 1992-04-08 1993-10-28 Valtion Teknillinen Tutkimuskeskus Optical component comprising prisms and a grating
EP1069454A1 (en) * 1998-03-27 2001-01-17 Hideyoshi Horimai Three-dimensional image display
WO2001044858A2 (en) * 1999-12-16 2001-06-21 Reveo, Inc. Three-dimensional volumetric display
US7518149B2 (en) 2003-05-02 2009-04-14 University College Cork - National University Of Ireland, Cork Light emitting mesa structures with high aspect ratio and near-parabolic sidewalls
US7994527B2 (en) 2005-11-04 2011-08-09 The Regents Of The University Of California High light extraction efficiency light emitting diode (LED)
WO2013163468A1 (en) * 2012-04-25 2013-10-31 Fleck Rod G Direct view augmented reality eyeglass-type display
US20140028663A1 (en) * 2012-07-25 2014-01-30 Disney Enterprises, Inc. Volumetric display with rim-driven, varifocal beamsplitter and high-speed, addressable backlight
US20150097756A1 (en) * 2013-10-07 2015-04-09 Resonance Technology, Inc. Wide angle personal displays

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
J. WALLACE: "Highly flexible OLED light source has 10 micron bend radius", LASER FOCUS WORLD, 31 July 2013 (2013-07-31)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109375381A (en) * 2018-11-27 2019-02-22 浙江理工大学 A kind of 3 D displaying method and system of high information flux low crosstalk
EP3938845A4 (en) * 2019-03-14 2023-03-15 Light Field Lab, Inc. Systems for directing energy with energy directing surface with non-zero deflection angle
CN114270816A (en) * 2019-03-20 2022-04-01 M·E·瓦尔德 MEMS-driven optical package with micro LED array
WO2020210361A1 (en) * 2019-04-12 2020-10-15 Pcms Holdings, Inc. Optical method and system for light field displays having light-steering layers and periodic optical layer
US11846790B2 (en) 2019-04-12 2023-12-19 Interdigital Madison Patent Holdings, Sas Optical method and system for light field displays having light-steering layers and periodic optical layer
CN113767307B (en) * 2019-04-12 2023-08-29 Pcms控股公司 Optical methods and systems for light field displays having a light diverting layer and a periodic optical layer
CN113767307A (en) * 2019-04-12 2021-12-07 Pcms控股公司 Optical methods and systems for light field displays having light diverting layers and periodic optical layers
DE102019118985A1 (en) * 2019-07-12 2021-01-14 Bayerische Motoren Werke Aktiengesellschaft 3D autostereoscopic display device and method of operating it
WO2021040700A1 (en) 2019-08-27 2021-03-04 Leia Inc. Multiview backlight, display, and method employing an optical diffuser
US11906758B2 (en) 2019-08-27 2024-02-20 Leia Inc. Multiview backlight, display, and method employing an optical diffuser
EP4022217A4 (en) * 2019-08-27 2023-04-26 LEIA Inc. Multiview backlight, display, and method employing an optical diffuser
CN114424110A (en) * 2019-08-30 2022-04-29 Pcms控股公司 Creating 3D multiview displays with elastic optical layer buckling
WO2021041329A1 (en) * 2019-08-30 2021-03-04 Pcms Holdings, Inc. Creating a 3d multiview display with elastic optical layer buckling
CN113219656A (en) * 2020-01-21 2021-08-06 未来(北京)黑科技有限公司 Vehicle-mounted head-up display system
US20210306612A1 (en) * 2020-03-24 2021-09-30 Beijing Boe Optoelectronics Technology Co., Ltd. Displaying device, device and method for generating data, and displaying system
US11831860B2 (en) * 2020-03-24 2023-11-28 Beijing Boe Optoelectronics Technology Co., Ltd. Displaying device, device and method for generating data, and displaying system
TWI733498B (en) * 2020-06-19 2021-07-11 錼創顯示科技股份有限公司 Display panel and head mounted device
CN114967170A (en) * 2021-02-18 2022-08-30 清华大学 Display processing method and device based on flexible naked-eye three-dimensional display equipment
CN114967170B (en) * 2021-02-18 2023-07-18 清华大学 Display processing method and device based on flexible naked eye three-dimensional display equipment
WO2022219335A1 (en) * 2021-04-13 2022-10-20 Disguise Technologies Limited Light-emitting diodes
CN114815241A (en) * 2021-12-16 2022-07-29 北京灵犀微光科技有限公司 Head-up display system and method and vehicle-mounted system
WO2023172285A1 (en) * 2022-03-07 2023-09-14 Leia Inc. 2d/multiview switchable lenticular display, system, and method
WO2024058916A1 (en) * 2022-09-14 2024-03-21 Microsoft Technology Licensing, Llc Optical array panel translation

Similar Documents

Publication Publication Date Title
WO2018200417A1 (en) Systems and methods for 3d displays with flexible optical layers
EP3673319B1 (en) Light field image engine method and apparatus for generating projected 3d light fields
TWI813681B (en) Apparatus and method for displaying a three-dimensional content
CN114175627B (en) Optical methods and systems for distributed aperture-based light field displays
US11624934B2 (en) Method and system for aperture expansion in light field displays
WO2019164745A1 (en) Multifocal optics for light field displays
US11917121B2 (en) Optical method and system for light field (LF) displays based on tunable liquid crystal (LC) diffusers
EP3987346A1 (en) Method for enhancing the image of autostereoscopic 3d displays based on angular filtering
US20220357591A1 (en) Method for creating a 3d multiview display with elastic optical layer buckling
WO2021076424A1 (en) Method for projecting an expanded virtual image with a small light field display

Legal Events

Date Code Title Description

121  Ep: the EPO has been informed by WIPO that EP was designated in this application
     Ref document number: 18724046; Country of ref document: EP; Kind code of ref document: A1

NENP Non-entry into the national phase
     Ref country code: DE

122  Ep: PCT application non-entry in the European phase
     Ref document number: 18724046; Country of ref document: EP; Kind code of ref document: A1