WO2019135087A1 - Multi-angle light-field display system - Google Patents

Multi-angle light-field display system

Info

Publication number
WO2019135087A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
lens
mirror
free
display
Prior art date
Application number
PCT/GB2019/050025
Other languages
French (fr)
Inventor
Ali Özgür YONTEM
Kun Li
Original Assignee
Cambridge Enterprise Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cambridge Enterprise Limited filed Critical Cambridge Enterprise Limited
Publication of WO2019135087A1 publication Critical patent/WO2019135087A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/307Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using fly-eye lenses, e.g. arrangements of circular lenses
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/0808Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more diffracting elements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/34Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
    • G02B30/35Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers using reflective optical elements in the optical path between the images and the observer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/346Image reproducers using prisms or semi-transparent mirrors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/363Image reproducers using image projection screens
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B17/00Systems with reflecting surfaces, with or without refracting elements
    • G02B17/08Catadioptric systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/08Mirrors
    • G02B5/10Mirrors with curved faces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2213/00Details of stereoscopic systems
    • H04N2213/001Constructional or mechanical details

Definitions

  • the present disclosure relates to a light field display system. Particularly, but not exclusively, the disclosure relates to apparatus for displaying multi-depth images.
  • 3D displays have been studied extensively for a long time.
  • Several key methods have been developed, such as holography, integral imaging/light field imaging and stereoscopy.
  • many existing display systems are either bulky and reliant on special eye-wear (e.g. 3D cinema), or unsuitable for out-of-lab applications (e.g. holography).
  • the systems should be glasses free, multi-view, and have a large viewing angle. Ideally, an observer should be able to view a 3D object from all angles, 360° all around. Most of the volumetric 3D displays, which can provide glasses-free images, are based on projection in a diffusive medium.
  • Integral imaging is an alternative to holography and can provide 3D images under incoherent illumination leading to out-of-lab implementations. However, such a system still produces a coarse representation of the original 3D scene.
  • a light-field system can provide high resolution images, but current systems mostly feature the capture process.
  • experimental light field 3D display systems only provide a limited viewing angle from a fixed point of view. Captured light field images using commercially available cameras can only be observed on a 2D display with computational refocusing.
  • Experimental 3D light-field displays are extremely bulky setups which require multiple projection sources.
  • the imaging planes are limited to a planar configuration. As such, objects are imaged from one plane to a second, parallel plane. Therefore, these systems can only provide a planar field of view with a fixed viewing angle. Furthermore, the 3D reconstruction will be within the range of the fixed focused plane defined in a depth of field of a rectangular volume. It is an aim of the present invention to overcome at least some of these disadvantages.
  • an image generation system for generating a three dimensional image
  • the image generation system comprising a light field display for projecting a first 3D image, a free-form mirror, and a field lens for rendering a second image, said second image reflected by the free-form mirror and formed at a distance from the surface of the free-form mirror, wherein the second image is a real image corresponding to the first image, and wherein the light field display projects the first 3D image in an intermediate imaging volume between the light field display and the field lens, such that the second image appears three dimensional.
  • This provides for a compact set up capable of generating a three-dimensional image for simultaneous viewing from multiple angles.
  • the image generation system can work on its own to display 3D images by feeding digitally (numerically) generated light field or holographic images.
  • the free-form mirror is a convex mirror.
  • the free-form mirror is a conical mirror.
  • the free-form mirror is a concave mirror.
  • the field lens is one of a hemispherical, spherical or ball lens.
  • the field lens comprises at least one Fresnel lens.
  • a plurality of Fresnel lenses are provided in a stacked configuration. This results in a shorter total focal length, providing a lower f-number (higher numerical aperture) and thus a larger imaging angle. This is especially useful when it is desirable that the light rays reach the edges of the free-form mirror.
  • the light field display comprises a picture generation unit and a lens array.
  • the lens arrays comprise liquid crystal lens arrays.
  • the picture generation unit comprises one of a laser scanner, a hologram generator, a pixelated display or a projector, wherein the projector comprises a light source and a spatial light modulator.
  • the pixelated display comprises one of an OLED, a QLED or a micro-LED display.
  • the pixelated display comprises either a rectangular or a circular pixel configuration.
  • the spatial light modulator is one of a digital micromirror device or a liquid crystal on silicon device.
  • the image generation system further comprises an image processor in communication with the light-field display, wherein the image processor is configured to account for distortions caused by the optical set up such that the second image appears undistorted.
  • the image generation system further comprises a phased array of ultrasonic transducers configured to provide a tactile output corresponding to the dimensions of the second image.
  • This provides haptic feedback to a user interacting with the displayed image, thereby improving the perceived reality of the displayed object.
  • The transducer array is flexible and reconfigurable so as to conform to the exact size and shape of the displayed image.
  • a tracking system is used to track hand gestures, head, eye, hand and finger positions in order to increase the accuracy of the tactile system.
  • the phased array of ultrasonic transducers is located around the periphery of the free-form mirror.
  • Figure 1 is a schematic illustration of the image generation apparatus according to an aspect of the invention.
  • Figure 2 is a schematic illustration of the image generation apparatus according to an aspect of the invention.
  • Figure 3 is a schematic illustration of the image generation apparatus according to an aspect of the invention.
  • Figure 4 is a schematic illustration of the image generation apparatus according to an aspect of the invention.
  • Figure 5 is a schematic illustration of a portion of the image generation apparatus according to an aspect of the invention.
  • Figure 6 is a schematic illustration of the image generation apparatus according to an aspect of the invention.
  • Figure 7 is a schematic illustration of the image generation apparatus according to an aspect of the invention.
  • Figure 8 is a schematic illustration of a portion of the image generation apparatus according to an aspect of the invention.
  • Figure 9 is a schematic illustration of the free-form mirror according to an aspect of the invention.
  • Figure 10 is a schematic illustration of the free-form mirror according to an aspect of the invention.
  • Figure 11 is a schematic illustration of the image generation apparatus according to an aspect of the invention.
  • Figure 12 is a schematic illustration of the field lens according to an aspect of the invention.
  • Figure 13 is a schematic illustration of the free-form mirror according to an aspect of the invention.
  • Figure 14 is a schematic illustration of the free-form mirror according to an aspect of the invention.
  • Figure 15 is a schematic illustration of the image generation apparatus according to an aspect of the invention.
  • Figure 16 is a flow chart according to an aspect of the invention.
  • the disclosure relates to an apparatus for projecting multi-dimensional true 3D images.
  • the system can further be configured to provide 3D augmented reality images.
  • Example applications can be, but are not limited to, automotive, telepresence, entertainment (gaming, 3DTV, museums and marketing), and education.
  • Figure 1 shows an image generation system 100, made up of a light field display 120, a field lens 130, and a free-form mirror 140.
  • the light field display 120 is formed by a 2D display device 121 and a lens array 122.
  • the 2D display device 121 is an LCD screen (either a single device or multiple, tiled devices).
  • the 2D display device 121 comprises a circular pixel configuration instead of the conventional rectangular configuration, wherein the relevant scaling and image generation is achieved by known image processing means and processes.
  • the 2D display device 121 comprises a scanning mirror and a digital micromirror device (DMD), or a liquid crystal on silicon (LCoS) device, though the skilled person would appreciate that any suitable light source and imaging means (including 3D holographic display devices) may be used provided they were capable of operating in the manner described below.
  • the lens array 122 comprises an array of diffractive optical elements (DOEs), such as photon sieves.
  • the lens array 122 is provided by a liquid crystal lens array.
  • the lens array 122 is provided by reconfigurable DOE.
  • the lens array 122 comprises phase Fresnel lens patterns on phase-only LCoS. In an alternative embodiment, the lens array 122 comprises amplitude Fresnel lens patterns on a digital micromirror device (DMD) or amplitude only LCoS. In an alternative embodiment, the lens array 122 comprises conventional lenses. The skilled person would appreciate that any suitable image generation means, and lens array may be employed to provide the light field display 120.
  • the field lens 130 may be provided by any form of suitable lens, including, but not limited to, a hemispherical, spherical or ball lens.
  • the field lens 130 comprises at least one Fresnel lens.
  • a plurality of Fresnel lenses are provided in a stacked configuration, as shown in Figure 12.
  • the mirror 140 is a hemi-spherical, parabolic convex mirror, with a 360° field of view.
  • the skilled person would appreciate that any curved or multi-angled reflective surface may be employed, and that the field of view of the mirror 140 need not be limited to 2π steradians.
  • the volume around the mirror in which a resulting 3D image is displayed can be cylindrical, spherical, conical or any arbitrary shape.
  • the free-form mirror 140 is formed by a Fresnel lens 142 on top of a flat surface mirror 141, as shown in Figure 13. This allows a parabolic mirror to be simulated by a setup which beneficially has a thinner form factor.
  • the free-form mirror 140 is formed by a holographic reflective plate 143 with an equivalent phase profile encoded, as shown in Figure 14.
  • the path of the light through the system is referred to as the optical path.
  • any suitable number of intervening reflectors/lens or other optical components are included so as to manipulate the optical path as necessary (for example, to minimize the overall size of the image generation system 100).
  • a series of 2D perspective images (elemental images) 192 are displayed on the 2D display 121 and each of the 2D perspective images are imaged through a corresponding lens of the lens array 122, such that an intermediate 3D image 190 is formed in an imaging volume between the light field display 120 and the field lens 130.
  • This image is relayed through the field lens 130 before being reflected by the free-form mirror 140 towards one or more users 2, who in turn see a reconstructed real 3D image 180 projected at a distance from the free-form mirror surface.
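As a rough numerical sketch of the elemental-image generation step described above, the following models each lens of the array 122 as a pinhole and projects 3D scene points onto the display panel behind it. This is an illustrative simplification under assumed parameters (lens pitch, panel gap, elemental-image resolution), not the patent's actual rendering pipeline:

```python
import numpy as np

def elemental_images(points, lens_pitch, gap, n_lenses, res):
    """Build one elemental image per lens by projecting each 3D scene
    point through a pinhole placed at the lens centre.
    points: (N, 3) scene points (x, y, z), z > 0 in front of the array.
    lens_pitch: centre-to-centre lens spacing.
    gap: distance between display panel and lens array.
    n_lenses: lenses per side (square array).
    res: pixels per side of each elemental image.
    Returns an (n_lenses, n_lenses, res, res) binary image stack."""
    stack = np.zeros((n_lenses, n_lenses, res, res))
    centres = (np.arange(n_lenses) - (n_lenses - 1) / 2) * lens_pitch
    for i, cy in enumerate(centres):
        for j, cx in enumerate(centres):
            for x, y, z in points:
                # Pinhole projection through lens centre (cx, cy) onto
                # the panel, a distance `gap` behind the lens plane.
                u = cx - gap * (x - cx) / z
                v = cy - gap * (y - cy) / z
                # Map panel coordinates to pixels of this elemental image.
                px = int(round((u - cx) / lens_pitch * res + res / 2))
                py = int(round((v - cy) / lens_pitch * res + res / 2))
                if 0 <= px < res and 0 <= py < res:
                    stack[i, j, py, px] = 1.0
    return stack
```

Displaying the resulting stack behind the lens array reconstructs the point cloud as an intermediate 3D image, in the manner of integral imaging.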
  • Figure 2 shows an embodiment of the image generation system 100 in which a spatial light modulator (SLM) 125 is used in place of the 2D display 121.
  • the remaining components and their arrangement are otherwise identical to those described above in relation to Figure 1, the common reference numerals of Figures 1 and 2 referring to the same components. Accordingly, a series of holographic 3D elemental images 193 are generated for transmission through the lens array 122, in place of the 2D perspective images 192.
  • Figure 3 shows the different ways a 3D image can be projected via the mirror 140 of the image generation system 100 of Figures 1 and 2. Whilst the field lens 130 and light field display 120 of the system 100 are not shown, they are arranged as described with reference to Figure 1. A large, single image can be reconstructed around the mirror 140 such that an observer can view different parts of the same 3D image. In an alternative embodiment, multiple different 3D images can be displayed around the mirror 140. Observers will view different objects at different locations around the mirror. In a further embodiment, different observers can view the same portion of a common object.
  • Figures 4 and 5 show a further embodiment of the image generation system 100 which includes a phased array of ultrasonic transducers 150 arranged around the periphery of the free-form mirror 140. Whilst Figure 4 does not depict the display 121, it is present and arranged as depicted in either of Figures 1 and 2, the phased ultrasonic array 150 being compatible with both the 2D display device and the holographic display device embodiments.
  • the phased array may be provided by any suitable means.
  • the phased array is an array of ultrasonic transducers 150.
  • the phased array is configured to provide haptic feedback to the user in order to allow the user to interact with the displayed object.
  • In Figure 5, whilst the field lens 130 and light field display 120 of the system 100 are not shown, they are arranged as described with reference to Figures 1 and 2. Accordingly, the embodiment of Figures 4 and 5 is identical to that of Figures 1 and 2 with the addition of the phased ultrasonic array 150, with the common reference numerals of Figures 1, 2, 4 and 5 referring to the same components.
  • In use, the phased ultrasonic array 150 generates a pressure field configured to emulate the physical sensation of the 3D object being displayed, thus providing haptic feedback to the user. Accordingly, the system 100 provides both visual and tactile cues of the reconstructed image.
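The focusing behaviour of such a phased array can be illustrated with a simple delay calculation: each transducer is fired with a delay chosen so that all wavefronts arrive at the target point in phase, producing a localised pressure maximum there. This is a sketch assuming free-field propagation of sound in air; the function and parameter names are illustrative, not taken from the patent:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, sound in air at ~20 °C

def focus_delays(transducer_positions, focal_point):
    """Per-transducer firing delays (seconds) that make all wavefronts
    arrive at `focal_point` simultaneously, creating a pressure focus.
    transducer_positions: (N, 3) positions in metres.
    focal_point: (3,) target point in metres."""
    d = np.linalg.norm(np.asarray(transducer_positions)
                       - np.asarray(focal_point), axis=1)
    # Fire the farthest element first; all delays are non-negative.
    return (d.max() - d) / SPEED_OF_SOUND
```

Sweeping the focal point over the surface of the displayed 3D object, synchronised with the rendered image, is what lets the pressure field trace out a touchable shape.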
  • a synchronised sound system is used to generate a 3D surrounding and directional audible acoustic field. Such a system allows for the three components of the human sensory system (namely vision, somatosensation and audition) to be stimulated such that it is possible to completely simulate the sensation and interaction with physical objects.
  • Figures 6 and 7 show an embodiment of the invention in which the image generation system 100 is installed in a vehicle.
  • the multi-direction projection capabilities of the image generation system 100 enable a first real 3D image 180a to be generated at a first region and displayed towards the driver 2a of the vehicle, whilst a second real image 180b is generated at a second region and displayed for the front passenger 2b.
  • virtual images 180c and 180d are displayed to the driver 2a and passenger 2b respectively by projecting images onto the windscreen 199 of the vehicle which are then reflected back towards the driver 2a and passenger 2b.
  • Additional projection optics (a combination of mirrors and lenses) may be used.
  • the image generation system operates as a head-up display (HUD).
  • the real images will be in focus at an apparent depth within reach of the observer, whereas the virtual images will be seen as augmented/mixed reality images overlaid on physical objects outside the vehicle.
  • the phased array such as ultrasonic array 150, generates a pressure field configured to emulate the physical sensation of the 3D object being displayed to provide haptic feedback.
  • the system 100 provides both visual and tactile cues of the reconstructed image.
  • Figure 7 shows a further embodiment with multiple image generation systems 100 for projecting real images 180e and 180f to passengers 2c and 2d in the back seats of the vehicle. This embodiment functions in the manner described above.
  • Figures 9 and 10 depict an alternative configuration for the free-form mirror and field lens.
  • the illustrated mirror is formed by two sub-mirrors 401 and 402.
  • a first section 401 is convex, whilst a second section 402 is a truncated hemisphere.
  • the mirror pair can assume any matching surface shape to create the required geometry for the 3D image reconstruction.
  • the intermediate 3D image 190 is relayed through the field lens 130 and is incident on the first mirror section 401 before being reflected towards the second section 402, which redirects the light so as to form a reconstructed real 3D image 180 projected at a distance from the free-form mirror surface.
  • This arrangement reduces the size of the system by allowing optical components to be at least partially accommodated in the volume defined by the curve of the first mirror section 401.
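The redirection of light between the two mirror sections follows the standard vector law of reflection. As an illustrative helper (not part of the patent), a ray direction d reflected off a surface with unit normal n is r = d − 2(d·n)n:

```python
import numpy as np

def reflect(direction, normal):
    """Reflect a ray direction off a surface: r = d - 2 (d . n) n.
    `normal` is normalised internally, so any non-zero surface
    normal may be passed."""
    direction = np.asarray(direction, dtype=float)
    normal = np.asarray(normal, dtype=float)
    normal = normal / np.linalg.norm(normal)
    return direction - 2.0 * np.dot(direction, normal) * normal
```

Applying this twice, once per mirror section with the local surface normal at each incidence point, traces a ray from the field lens to the reconstructed image volume.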
  • Figure 11 depicts a further embodiment of the image generation system 100 in which the lens array 122 is removed and replaced with a second lens array 123 which surrounds the free-form mirror 140.
  • the illustrated embodiment is compatible with both the 2D display device 121 of Figure 1 and the SLM 125 of Figure 2.
  • a diffusive screen 160 is positioned around the periphery of the mirror 140.
  • the 2D perspective images are formed on the diffusive screen before being relayed through the second surrounding lens array 123 to generate the real 3D images 180.
  • whilst the depicted diffusive screen 160 is cylindrical, any suitable shape may be used.
  • no diffusive screen is employed.
  • Figures 15 and 16 depict a further embodiment of the image generation system 100 which includes an interactive hand tracking system 500. Whilst Figure 15 does not depict the display 121, 125, it is present and arranged as depicted in either of Figures 1 and 2, the hand tracking system 500 being compatible with both the 2D display device and the holographic display device embodiments.
  • the interactive hand tracking system 500 comprises a controller 510 in communication with the 2D display device 225 (or its equivalent) and the phased ultrasonic array 150.
  • the controller 510 includes an image processing unit configured to recognise and track a user’s hands in a known manner.
  • the position and movement of one or more hands is captured by controller 510, whilst the display system 100 projects the relevant 3D object with which the user interacts.
  • the interactive hand tracking system 500 is used in conjunction with the phased ultrasonic array 150 in order to provide the sensation of tactile feedback corresponding to the displayed object and the detection of the user’s hands.
  • the process of recognising the user’s hands, determining their position and prompting the appropriate response is carried out by a controller 510 in a known manner.
  • Figure 16 sets out an example of the operational steps of the interactive hand tracking system 500.
  • step S501 the position of the user’s hand is recorded by the controller 510 using any known suitable means.
  • the controller 510 includes a camera that feeds into the image processing unit.
  • the background is removed and the general shape of the hand is determined by known background removal methods performed at the controller 510.
  • the overall hand shape is registered. Individual feature points of the hand image are identified and extracted for analysis. In an embodiment, this is achieved by comparing the extracted image of the hand to a database of known hand gestures accessible by the controller 510.
  • the individual portions of the hand, including the fingers are recognised via standard edge detection and shape recognition techniques that would be apparent to the skilled person.
  • recognisable gestures are stored as a series of feature points in the database. Detectable gestures include a button press, swipe, pick, pull and pinch zoom. The recorded feature points of the hand shape are then compared to those in the database so as to identify the most likely gesture being performed.
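A minimal sketch of this matching step, assuming each stored gesture is reduced to a fixed-length feature vector and using nearest-neighbour Euclidean matching; the database contents and feature encoding below are invented purely for illustration:

```python
import numpy as np

# Hypothetical gesture database: each gesture is a feature vector
# summarising the extracted feature points of the hand shape.
GESTURES = {
    "press": np.array([0.9, 0.1, 0.1, 0.0]),
    "swipe": np.array([0.1, 0.9, 0.2, 0.1]),
    "pinch": np.array([0.2, 0.1, 0.9, 0.8]),
}

def classify_gesture(features):
    """Return the database gesture whose feature vector lies closest
    (Euclidean distance) to the observed hand features."""
    features = np.asarray(features, dtype=float)
    return min(GESTURES, key=lambda g: np.linalg.norm(GESTURES[g] - features))
```

In a real system the feature vectors would come from the edge-detection and shape-recognition stage, and a learned classifier could replace the nearest-neighbour rule.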
  • the controller is trained to better recognise points on hands and fingers using a machine learning/deep learning algorithm, preferably running in real time.
  • the controller 510 comprises various models of hand and finger shapes which are cross-correlated with observations of the user’s hand and a database of images of known hand and finger positions so as to enable identification and tracking of fingers and hands.
  • the position of the hands within the 3D volume around the mirror is calculated.
  • the determination of the hand’s exact location is performed by the controller 510 in a known manner.
  • multiple perspective images of the hand are used to determine the 3D locations of the points on the hand uniquely.
  • a plurality of known scanning means are used to determine their respective distance from the hand, thereby providing its location in multiple dimensions.
  • the hand location is estimated from the observed size of the hand as compared to one or more images of a hand at a known distance stored in memory.
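Under a pinhole-camera model, apparent size scales inversely with distance, so the size-based estimate above reduces to a single proportionality against the stored reference image. A minimal sketch, with all parameter names assumed:

```python
def hand_distance(observed_px, ref_px, ref_distance):
    """Estimate hand distance from its apparent size in the camera image.
    With a pinhole camera, apparent size is inversely proportional to
    distance, so: distance = ref_distance * ref_px / observed_px.
    observed_px: current hand width in pixels.
    ref_px: hand width in pixels at the known reference distance.
    ref_distance: reference distance in metres."""
    return ref_distance * ref_px / observed_px
```

This gives only the range along the optical axis; combining it with the hand's image-plane position (or with the multi-view and scanning approaches described above) yields a full 3D location.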
  • the position of the hands is correlated with the known virtual position of the one or more displayed 3D objects.
  • the appropriate visual, haptic and audio feedback is presented to the user.
  • the controller 510 being configured to adjust the shape, size and/or orientation of the displayed 3D objects and the output of the phased ultrasonic array to respond to the detected position and movements of the user’s hand.
  • Figure 8a depicts a conventional display device 121 arranged relative to a lens array 122 as described in the image generation system 100. For simplicity, only a single lens of the lens array 122 is shown.
  • the image generation process relies on light from one portion of the display device 121 passing through a single corresponding lens of the lens array 122. If light from one pixel leaks into a neighbouring lens in the array (as shown in Figure 8a), this creates aliases around the generated 3D image.
  • a holographic plate 350 is used to redistribute the light coming from the backlight 300 and display device 121 such that every elemental image appears behind its corresponding lens in the array 122. This replaces the baffle (which is bulky and hard to manufacture) with a thin diffractive optical element plate.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Holography (AREA)

Abstract

An image generation system (100) comprising a free-form mirror (140), a first field lens (130) and a light field display (120), wherein the light field display is configured to project a first 3D image (190) into an intermediate imaging volume between the light field display and the field lens, such that a second 3D image (180) is reflected by the free-form mirror and formed at a distance from the surface of the free-form mirror, the second image being a real image corresponding to the first image.

Description

MULTI-ANGLE LIGHT-FIELD DISPLAY SYSTEM
TECHNICAL FIELD
The present disclosure relates to a light field display system. Particularly, but not exclusively, the disclosure relates to apparatus for displaying multi-depth images.
BACKGROUND
3D displays have been studied extensively for a long time. Several key methods have been developed, such as holography, integral imaging/light field imaging and stereoscopy. However, many existing display systems are either bulky and reliant on special eye-wear (e.g. 3D cinema), or unsuitable for out-of-lab applications (e.g. holography).
Furthermore, to have a natural feeling of 3D perception, the systems should be glasses free, multi-view, and have a large viewing angle. Ideally, an observer should be able to view a 3D object from all angles, 360° all around. Most of the volumetric 3D displays, which can provide glasses-free images, are based on projection in a diffusive medium.
The limitations of conventional 3D display methods such as holography can be overcome by techniques such as integral imaging. Integral imaging is an alternative to holography and can provide 3D images under incoherent illumination leading to out-of-lab implementations. However, such a system still produces a coarse representation of the original 3D scene.
A light-field system can provide high resolution images, but current systems mostly feature the capture process. In addition, experimental light field 3D display systems only provide a limited viewing angle from a fixed point of view. Captured light field images using commercially available cameras can only be observed on a 2D display with computational refocusing. Experimental 3D light-field displays are extremely bulky setups which require multiple projection sources.
Similarly, in integral imaging/light field systems, the imaging planes are limited to a planar configuration. As such, objects are imaged from one plane to a second, parallel plane. Therefore, these systems can only provide a planar field of view with a fixed viewing angle. Furthermore, the 3D reconstruction will be within the range of the fixed focused plane defined in a depth of field of a rectangular volume. It is an aim of the present invention to overcome at least some of these disadvantages.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide apparatus as claimed in the appended claims.
According to an aspect of the invention there is provided an image generation system for generating a three dimensional image, the image generation system comprising a light field display for projecting a first 3D image, a free-form mirror, and a field lens for rendering a second image, said second image reflected by the free-form mirror and formed at a distance from the surface of the free-form mirror, wherein the second image is a real image corresponding to the first image, and wherein the light field display projects the first 3D image in an intermediate imaging volume between the light field display and the field lens, such that the second image appears three dimensional.
This provides for a compact set up capable of generating a three-dimensional image for simultaneous viewing from multiple angles.
Optionally, the image generation system can work on its own to display 3D images by feeding digitally (numerically) generated light field or holographic images.
Optionally, the free-form mirror is a convex mirror.
Optionally, the free-form mirror is a conical mirror.
Optionally, the free-form mirror is a concave mirror.
Optionally, the field lens is one of a hemispherical, spherical or ball lens.
Optionally, the field lens comprises at least one Fresnel lens. In an embodiment a plurality of Fresnel lenses are provided in a stacked configuration. This results in a shorter total focal length, providing a lower f-number (higher numerical aperture) and thus a larger imaging angle. This is especially useful when it is desirable that the light rays reach the edges of the free-form mirror.
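For thin lenses stacked in contact, the combined focal length follows 1/f_total = Σ 1/f_i, which is why stacking Fresnel lenses shortens the total focal length of the field lens. A minimal illustration (the function name is ours, not the patent's):

```python
def stacked_focal_length(focal_lengths):
    """Combined focal length of thin lenses stacked in contact:
    1/f_total = sum(1/f_i). Stacking identical Fresnel lenses of
    focal length f therefore gives f_total = f / n."""
    return 1.0 / sum(1.0 / f for f in focal_lengths)
```

For example, two stacked 0.2 m Fresnel lenses behave like a single 0.1 m lens, halving the focal length without adding the thickness of a conventional short-focus element.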
Optionally, the light field display comprises a picture generation unit and a lens array. Optionally, the lens arrays comprise liquid crystal lens arrays.
Optionally, the picture generation unit comprises one of a laser scanner, a hologram generator, a pixelated display or a projector, wherein the projector comprises a light source and a spatial light modulator.
Optionally, the pixelated display comprises one of an OLED, a QLED or a micro-LED display.
Optionally, the pixelated display comprises either a rectangular or a circular pixel configuration.
Optionally, the spatial light modulator is one of a digital micromirror device or a liquid crystal on silicon device.
Optionally, the image generation system further comprises an image processor in communication with the light-field display, wherein the image processor is configured to account for distortions caused by the optical set up such that the second image appears undistorted. This obviates the need for any post-image generation corrections as well as bulky correction optics. Furthermore, it provides a higher degree of flexibility which can adapt to different display/mirror surfaces.
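One common way such a correction can be implemented (a sketch, not necessarily the patent's method) is to pre-warp the source image with the inverse of the optical system's distortion map, so that the distortion introduced by the lens and mirror cancels the warp and the viewer sees an undistorted image:

```python
import numpy as np

def predistort(image, inverse_map):
    """Warp the source image with the inverse of the optical system's
    distortion, using nearest-neighbour sampling.
    image: (H, W) or (H, W, C) source array.
    inverse_map: (H, W, 2) array; inverse_map[y, x] holds the (row, col)
    in the source image to sample for output pixel (y, x). In practice
    this map would be measured or computed from the lens/mirror model."""
    h, w = image.shape[:2]
    rows = np.clip(inverse_map[..., 0].round().astype(int), 0, h - 1)
    cols = np.clip(inverse_map[..., 1].round().astype(int), 0, w - 1)
    return image[rows, cols]
```

Because the map is just a lookup table, the same routine adapts to different display/mirror surfaces by swapping in a different measured map, which is the flexibility the paragraph above refers to.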
Optionally, the image generation system further comprises a phased array of ultrasonic transducers configured to provide a tactile output corresponding to the dimensions of the second image. This provides haptic feedback to a user interacting with the displayed image, thereby improving the perceived reality of the displayed object. The transducer array is flexible and reconfigurable so as to conform to the exact size and shape of the displayed image.
Optionally, a tracking system is used to track hand gestures and head, eye, hand and finger positions in order to increase the accuracy of the tactile system.
Optionally, the phased array of ultrasonic transducers is located around the periphery of the free-form mirror.
Other aspects of the invention will be apparent from the appended claim set.

BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic illustration of the image generation apparatus according to an aspect of the invention;
Figure 2 is a schematic illustration of the image generation apparatus according to an aspect of the invention;
Figure 3 is a schematic illustration of the image generation apparatus according to an aspect of the invention;
Figure 4 is a schematic illustration of the image generation apparatus according to an aspect of the invention;
Figure 5 is a schematic illustration of a portion of the image generation apparatus according to an aspect of the invention;
Figure 6 is a schematic illustration of the image generation apparatus according to an aspect of the invention;
Figure 7 is a schematic illustration of the image generation apparatus according to an aspect of the invention;
Figure 8 is a schematic illustration of a portion of the image generation apparatus according to an aspect of the invention;
Figure 9 is a schematic illustration of the free-form mirror according to an aspect of the invention;
Figure 10 is a schematic illustration of the free-form mirror according to an aspect of the invention;
Figure 11 is a schematic illustration of the image generation apparatus according to an aspect of the invention;
Figure 12 is a schematic illustration of the field lens according to an aspect of the invention;
Figure 13 is a schematic illustration of the free-form mirror according to an aspect of the invention;
Figure 14 is a schematic illustration of the free-form mirror according to an aspect of the invention;
Figure 15 is a schematic illustration of the image generation apparatus according to an aspect of the invention; and
Figure 16 is a flow chart according to an aspect of the invention.
DETAILED DESCRIPTION
Particularly, but not exclusively, the disclosure relates to an apparatus for projecting multi-dimensional, true 3D images. The system can further be configured to provide 3D augmented-reality images. Example applications include, but are not limited to, automotive, telepresence, entertainment (gaming, 3DTV, museums and marketing) and education.
Figure 1 shows an image generation system 100, made up of a light field display 120, a field lens 130, and a free-form mirror 140.
The light field display 120 is formed by a 2D display device 121 and a lens array 122. In an embodiment, the 2D display device 121 is an LCD screen (either a single device or multiple, tiled devices). In a further embodiment, the 2D display device 121 comprises a circular pixel configuration instead of the conventional rectangular configuration, wherein the relevant scaling and image generation is achieved by known image processing means and processes. In an alternative embodiment, the 2D display device 121 comprises a scanning mirror and a digital micromirror device (DMD), or a liquid crystal on silicon (LCoS) device, though the skilled person would appreciate that any suitable light source and imaging means (including 3D holographic display devices) may be used, provided they are capable of operating in the manner described below. In an embodiment, the lens array 122 comprises an array of diffractive optical elements (DOEs), such as photon sieves. In a further embodiment, the lens array 122 is provided by a liquid crystal lens array. In a further embodiment, the lens array 122 is provided by reconfigurable DOEs.
In an alternative embodiment, the lens array 122 comprises phase Fresnel lens patterns on a phase-only LCoS. In an alternative embodiment, the lens array 122 comprises amplitude Fresnel lens patterns on a digital micromirror device (DMD) or an amplitude-only LCoS. In an alternative embodiment, the lens array 122 comprises conventional lenses. The skilled person would appreciate that any suitable image generation means and lens array may be employed to provide the light field display 120.
The field lens 130 may be provided by any form of suitable lens, including, but not limited to, a hemispherical, spherical or ball lens.
In an embodiment, the field lens 130 comprises at least one Fresnel lens. In a particular embodiment, a plurality of Fresnel lenses are provided in a stacked configuration, as shown in Figure 12.
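The benefit of stacking follows from the standard thin-lens combination formula (an idealisation that treats the Fresnel lenses as thin and in contact):

```latex
% Effective focal length of N stacked thin lenses in contact:
\frac{1}{f_{\text{eff}}} = \sum_{i=1}^{N} \frac{1}{f_i}
% e.g. two identical lenses of focal length f give f_eff = f/2.
% For an aperture of diameter D, the numerical aperture grows as
\mathrm{NA} \approx \frac{D}{2 f_{\text{eff}}},
% so halving the focal length widens the cone of rays, helping them
% reach the edges of the free-form mirror.
```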
In the illustrated embodiment, the mirror 140 is a hemi-spherical, parabolic convex mirror, with a 360° field of view. The skilled person would appreciate that any curved or multi-angled reflective surface may be employed, and that the field of view of the mirror 140 need not be limited to 2π steradians. As a result, the volume around the mirror in which a resulting 3D image is displayed can be cylindrical, spherical, conical or any arbitrary shape.
In an embodiment, the free-form mirror 140 is formed by a Fresnel lens 142 on top of a flat surface mirror 141, as shown in Figure 13. This allows a parabolic mirror to be simulated by a setup which beneficially has a thinner form factor.
In a further embodiment, the free-form mirror 140 is formed by a holographic reflective plate 143 with an equivalent phase profile encoded, as shown in Figure 14.
The path of the light through the system is referred to as the optical path. The skilled person would understand that in further embodiments, any suitable number of intervening reflectors/lenses or other optical components are included so as to manipulate the optical path as necessary (for example, to minimise the overall size of the image generation system 100). In use, a series of 2D perspective images (elemental images) 192 are displayed on the 2D display 121 and each of the 2D perspective images is imaged through a corresponding lens of the lens array 122, such that an intermediate 3D image 190 is formed in an imaging volume between the light field display 120 and the field lens 130. This image is relayed through the field lens 130 before being reflected by the free-form mirror 140 towards one or more users 2, who in turn see a reconstructed real 3D image 180 projected at a distance from the free-form mirror surface.
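The elemental-image geometry can be sketched with a pinhole approximation of each lenslet (the pitch, gap and scene point below are invented for illustration; a real system would use the measured optics):

```python
# Pinhole-model sketch of elemental image formation behind a lens array.
# All dimensions are illustrative; units are consistent (e.g. mm).

def elemental_image_offsets(point, lens_centers, gap):
    """For a 3D point (x, y, z) in front of the array (z > 0, measured
    from the lens plane), return the (u, v) offset of its image within
    each elemental image, one per lenslet (pinhole approximation)."""
    x, y, z = point
    offsets = []
    for lx, ly in lens_centers:
        # A ray from the point through the lenslet centre hits the
        # display plane a distance `gap` behind the array.
        u = -gap * (x - lx) / z
        v = -gap * (y - ly) / z
        offsets.append((u, v))
    return offsets

# 3x3 lens array, 1 mm pitch, 2 mm lens-to-display gap (assumed values)
pitch, gap = 1.0, 2.0
centers = [(i * pitch, j * pitch) for i in (-1, 0, 1) for j in (-1, 0, 1)]
disparities = elemental_image_offsets((0.0, 0.0, 50.0), centers, gap)
```

The small per-lenslet disparities encode depth: a point at a different z produces a different disparity pattern, which is what the lens array reverses on replay to form the intermediate 3D image.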
Figure 2 shows an embodiment of the image generation system 100 in which a spatial light modulator (SLM) 125 is used in place of the 2D display 121. The remaining components and their arrangement are otherwise identical to those described above in relation to Figure 1, the common reference numerals of Figures 1 and 2 referring to the same components. Accordingly, a series of holographic 3D elemental images 193 are generated for transmission through the lens array 122, in place of the 2D perspective images 192.
The resulting 3D image(s) can be displayed in different ways. Figure 3 shows the different ways a 3D image can be projected via the mirror 140 of the image generation system 100 of Figures 1 and 2. Whilst the field lens 130 and light field display 120 of the system 100 are not shown, they are arranged as described with reference to Figure 1. A large, single image can be reconstructed around the mirror 140 such that an observer can view different parts of the same 3D image. In an alternative embodiment, multiple different 3D images can be displayed around the mirror 140, so that observers view different objects at different locations around the mirror. In a further embodiment, different observers can view the same portion of a common object.
Figures 4 and 5 show a further embodiment of the image generation system 100 which includes a phased array of ultrasonic transducers 150 arranged around the periphery of the free-form mirror 140. Whilst Figure 4 does not depict the display 121, it is present and arranged as depicted in either of Figures 1 and 2, the phased ultrasonic array 150 being compatible with both the 2D display device and the holographic display device embodiments.
The phased array may be provided by any suitable means. In a preferred embodiment the phased array is an array of ultrasonic transducers 150. The phased array is configured to provide haptic feedback to the user in order to allow the user to interact with the displayed object. In Figure 5, whilst the field lens 130 and light field display 120 of the system 100 are not shown, they are arranged as described with reference to Figures 1 and 2. Accordingly, the embodiment of Figures 4 and 5 is identical to that of Figures 1 and 2 with the addition of the phased ultrasonic array 150, with the common reference numerals of Figures 1, 2, 4 and 5 referring to the same components.
In use, the phased ultrasonic array 150 generates a pressure field configured to emulate the physical sensation of the 3D object being displayed thus providing haptic feedback to the user. Accordingly, the system 100 provides both visual and sensational cues of the reconstructed image. In a further embodiment, a synchronised sound system is used to generate a 3D surrounding and directional audible acoustic field. Such a system allows for the three components of the human sensory system (namely vision, somatosensation and audition) to be stimulated such that it is possible to completely simulate the sensation and interaction with physical objects.
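The focusing principle behind such an array can be sketched as follows (the ring geometry, radius and drive frequency are assumptions for illustration): each transducer is delayed so that all wavefronts arrive at the focal point in phase, producing a localised pressure maximum that the skin can feel.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air
DRIVE_FREQ = 40e3       # Hz; a common choice for ultrasonic haptics (assumed)

def focus_delays(transducers, focus):
    """Per-transducer firing delays (seconds) so that emissions from
    every element arrive at `focus` at the same time."""
    dists = [math.dist(t, focus) for t in transducers]
    farthest = max(dists)
    # The farthest element fires first (zero delay); nearer ones wait.
    return [(farthest - d) / SPEED_OF_SOUND for d in dists]

# A ring of 8 transducers around the mirror periphery (radius 0.1 m, assumed)
ring = [(0.1 * math.cos(k * math.pi / 4), 0.1 * math.sin(k * math.pi / 4), 0.0)
        for k in range(8)]
# Focus 20 cm above the centre of the mirror
delays = focus_delays(ring, (0.0, 0.0, 0.2))
# Equivalent phase offsets at the drive frequency
phases = [(2 * math.pi * DRIVE_FREQ * t) % (2 * math.pi) for t in delays]
```

Sweeping the focal point over the surface of the displayed 3D object, synchronised with the image, is what makes the object feel solid.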
Figures 6 and 7 show an embodiment of the invention in which the image generation system 100 is installed in a vehicle.
For simplicity, only the mirror 140 and the phased ultrasonic array 150 of the system are pictured. Whilst the field lens 130 and light field display 120 of the system 100 are not shown, they are arranged as described with reference to Figure 1.
In both Figures 6 and 7 the multi-direction projection capabilities of the image generation system 100 enable a first real 3D image 180a to be generated at a first region and displayed towards the driver 2a of the vehicle, whilst a second real image 180b is generated at a second region and displayed for the front passenger 2b. At the same time, virtual images 180c and 180d are displayed to the driver 2a and passenger 2b respectively by projecting images onto the windscreen 199 of the vehicle, which are then reflected back towards the driver 2a and passenger 2b. Additional projection optics (a combination of mirrors and lenses) between the windscreen 199 and the mirror 140 can be used to scale the size of the virtual images. Accordingly, the image generation system operates as a head-up display (HUD). In an embodiment, the real images will be in focus at an apparent depth within reach of the observer, whereas the virtual images will be seen as augmented/mixed reality images overlaid on physical objects outside the vehicle. This allows the occupants in the front seats to observe HUD images and infotainment images and interact with them. The phased array, such as ultrasonic array 150, generates a pressure field configured to emulate the physical sensation of the 3D object being displayed, to provide haptic feedback. Accordingly, the system 100 provides both visual and sensational cues of the reconstructed image. Though it is not shown in the figures, it is envisaged that the synchronised sound system described in relation to Figure 5 is optionally incorporated into the vehicle.
Figure 7 shows a further embodiment with multiple image generation systems 100 for projecting real images 180e and 180f to passengers 2c and 2d in the back seats of the vehicle. This embodiment functions in the manner described above.
Figures 9 and 10 depict an alternative configuration for the free-form mirror and field lens. The illustrated mirror is formed by two sub-mirrors 401 and 402. A first section 401 is convex, whilst a second section 402 is a truncated hemisphere. However, the mirror pair can assume any matching surface shapes to create the required geometry for the 3D image reconstruction.
In use, the intermediate 3D image 190 is relayed through the field lens 130 and is incident on the first mirror section 401 before being reflected towards the second section 402, which redirects the light so as to form a reconstructed real 3D image 180 projected at a distance from the free-form mirror surface.
This arrangement reduces the size of the system by allowing optical components to be at least partially accommodated in the volume defined by the curve of the first mirror section 401.
Figure 11 depicts a further embodiment of the image generation system 100 in which the lens array 122 is removed and replaced with a second lens array 123 which surrounds the free-form mirror 140.
The illustrated embodiment is compatible with both the 2D display device 121 of Figure 1 and the SLM 125 of Figure 2.
When the light field display 120 is configured to generate a series of 2D perspective images 192, a diffusive screen 160 is positioned around the periphery of the mirror 140. In use, the 2D perspective images are formed on the diffusive screen before being relayed through the second, surrounding lens array 123 to generate the real 3D images 180. While the depicted diffusive screen 160 is cylindrical, any suitable shape may be used. When the light field display 120 utilises an SLM 125 to generate a series of 3D perspective images 193, no diffusive screen is employed.
Figures 15 and 16 depict a further embodiment of the image generation system 100 which includes an interactive hand tracking system 500. Whilst Figure 15 does not depict the display 121, 125, it is present and arranged as depicted in either of Figures 1 and 2, the hand tracking system 500 being compatible with both the 2D display device and the holographic display device embodiments.
The interactive hand tracking system 500 comprises a controller 510 in communication with the display device 121, 125 (or its equivalent) and the phased ultrasonic array 150. The controller 510 includes an image processing unit configured to recognise and track a user's hands in a known manner.
In use, the position and movement of one or more hands is captured by controller 510, whilst the display system 100 projects the relevant 3D object with which the user interacts. In an embodiment, the interactive hand tracking system 500 is used in conjunction with the phased ultrasonic array 150 in order to provide the sensation of tactile feedback corresponding to the displayed object and the detection of the user’s hands.
The process of recognising the user’s hands, determining their position and prompting the appropriate response is carried out by a controller 510 in a known manner.
Figure 16 sets out an example of the operational steps of the interactive hand tracking system 500.
In step S501, the position of the user's hand is recorded by the controller 510 using any suitable known means. In an embodiment, the controller 510 includes a camera that feeds into the image processing unit.
At step S502, the background is removed and the general shape of the hand is determined by known background removal methods performed at the controller 510.
At step S503, the overall handshape is registered. Individual feature points of the hand image are identified and extracted for analysis. In an embodiment, this is achieved by comparing the extracted image of the hand to a database of known hand gestures accessible by the controller 510.
At step S504, the individual portions of the hand, including the fingers, are recognised via standard edge detection and shape recognition techniques that would be apparent to the skilled person. In an embodiment, recognisable gestures are stored as a series of feature points in the database. Detectable gestures include a button press, swipe, pick, pull and pinch zoom. The recorded feature points of the hand shape are then compared to those in the database so as to identify the most likely gesture being performed.
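One minimal way to realise this comparison (the feature points and gesture database below are invented for illustration) is a nearest-neighbour match on the summed squared distance between corresponding feature points:

```python
def match_gesture(observed, database):
    """Return the database gesture whose stored feature points are
    closest (sum of squared distances) to the observed ones."""
    def score(stored):
        return sum((ox - sx) ** 2 + (oy - sy) ** 2
                   for (ox, oy), (sx, sy) in zip(observed, stored))
    return min(database, key=lambda name: score(database[name]))

# Toy database of 2D fingertip positions per gesture (illustrative only)
GESTURES = {
    "pinch": [(0.45, 0.5), (0.55, 0.5)],  # thumb and index close together
    "swipe": [(0.2, 0.5), (0.8, 0.5)],    # fingers spread wide
}
best = match_gesture([(0.44, 0.52), (0.56, 0.49)], GESTURES)  # → "pinch"
```

A real implementation would normalise for hand size and orientation and match over a temporal window rather than a single frame.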
At step S505, the controller is trained to better recognise points on hands and fingers, preferably using a real-time machine learning/deep learning algorithm. In an embodiment, the controller 510 comprises various models of hand and finger shapes which are cross-correlated with observations of the user's hand and a database of images of known hand and finger positions, so as to enable identification and tracking of fingers and hands.
At step S506, the position of the hands within the 3D volume around the mirror is calculated. The determination of the hand's exact location is performed by the controller 510 in a known manner. In an embodiment, multiple perspective images of the hand are used to determine the 3D locations of the points on the hand uniquely. In an embodiment, a plurality of known scanning means are used to determine their respective distance from the hand, thereby providing its location in multiple dimensions. In another embodiment, the hand location is estimated from the observed size of the hand as compared to one or more images of a hand at a known distance stored in memory.
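The last of these estimates follows from simple pinhole proportionality: apparent size scales inversely with distance. A sketch (the calibration numbers are assumed):

```python
def distance_from_size(observed_px, ref_px, ref_dist):
    """Estimate hand distance from its apparent width in pixels, given a
    calibration image of the same hand at a known distance (pinhole
    proportionality; assumes the same hand and the same camera)."""
    return ref_dist * ref_px / observed_px

# Calibration (assumed): the hand spans 200 px at 0.5 m
d = distance_from_size(observed_px=100, ref_px=200, ref_dist=0.5)  # → 1.0 m
```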
At step S507, the position of the hands is correlated with the known virtual position of the one or more displayed 3D objects.
At step S508, the appropriate visual, haptic and audio feedback is presented to the user. The controller 510 is configured to adjust the shape, size and/or orientation of the displayed 3D objects, and the output of the phased ultrasonic array, in response to the detected position and movements of the user's hand.

CROSSTALK
Figure 8a depicts a conventional display device 121 arranged relative to a lens array 122 as described in the image generation system 100. For simplicity, only a single lens of the lens array 122 is shown.
The image generation process relies on light from one portion of the display device 121 passing through a single corresponding lens of the lens array 122. If light from one pixel leaks into a neighbouring lens in the array (as shown in Figure 8a), this creates aliases around the generated 3D image.
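A toy model of the leak condition (the pitch, gap and angle values are invented): a ray leaving a pixel at some lateral offset from its lenslet's axis reaches the lens plane displaced by the gap times the ray slope, and crosses into a neighbouring lenslet once that displacement exceeds half the lens pitch.

```python
import math

def leaks(pixel_offset, ray_angle_deg, gap, pitch):
    """True if a ray from a display pixel escapes its own lenslet's
    aperture and enters a neighbour (simple geometric model)."""
    x_at_lens = pixel_offset + gap * math.tan(math.radians(ray_angle_deg))
    return abs(x_at_lens) > pitch / 2

# 1 mm pitch lenslets, 2 mm display-to-lens gap (assumed values, in mm)
assert not leaks(0.0, 0.0, gap=2.0, pitch=1.0)  # axial ray stays put
assert leaks(0.4, 10.0, gap=2.0, pitch=1.0)     # steep ray from an edge pixel
```

This is the geometry the baffle array, or the holographic plate described below, is designed to suppress.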
It is known to place a baffle array between the lens array 122 and the display device 121 to block the light from neighbouring regions.
In an alternative embodiment, depicted in Figure 8b, a holographic plate 350 is used to redistribute the light coming from the backlight 300 and display device 121 such that every elemental image appears behind its corresponding lens in the array 122. This replaces the baffle (which is bulky and hard to manufacture) with a thin diffractive optical element plate.

Claims

1. An image generation system for generating a three-dimensional image, the image generation system comprising:
a light field display for projecting a first 3D image,
a free-form mirror,
and a field lens for rendering a second image, said second image reflected by the free-form mirror and formed at a distance from the surface of the free-form mirror,
wherein the second image is a real image corresponding to the first image, and
wherein the light field display projects the first 3D image in an intermediate imaging volume between the light field display and the field lens, such that the second image appears three-dimensional.
2. The system of claim 1 wherein the field lens is one of a hemispherical, spherical or ball lens or any combination of reciprocal/reversible optics.
3. The system of claim 1 wherein the field lens is provided by at least one Fresnel lens.
4. The system of claim 3 wherein the field lens is provided by a plurality of Fresnel lenses in a stacked configuration.
5. The system of any preceding claim wherein the light field display comprises a picture generation unit and a lens array.
6. The system of claim 5 wherein the picture generation unit comprises one of a laser scanner, a hologram generator, a pixelated display or a projector, wherein the projector comprises a light source and a spatial light modulator.
7. The system of claim 6 wherein the spatial light modulator is a digital micromirror device.
8. The system of any of claims 3-7 wherein the lens arrays comprise diffractive optical elements.
9. The system of any preceding claim further comprising an image processor in communication with the light-field display, wherein the image processor is configured to account for distortions caused by the optical set up such that the second image appears undistorted.
10. The system of any preceding claim further comprising monitoring means for tracking the hands, fingers, heads and eyes of users.
11. The system of any preceding claim wherein the free-form mirror is one of a convex mirror and a conical mirror.
12. The system of any preceding claim wherein the free-form mirror comprises a truncated hemispherical portion and a separate second convex portion.
PCT/GB2019/050025, filed 2019-01-04, priority date 2018-01-05: Multi-angle light-field display system (WO2019135087A1)

Applications Claiming Priority (3)

- GB1800173.5, filed 2018-01-05: Multi-angle light capture display system
- GB1802801.9, filed 2018-02-21: Multi-angle light capture display system
- GB1815029.2, filed 2018-09-14: Multi-angle light-field display system
