GB2444301A - Autostereoscopic projection display - Google Patents

Autostereoscopic projection display

Info

Publication number
GB2444301A
GB2444301A (Application GB0624196A)
Authority
GB
United Kingdom
Prior art keywords
screen
image
projector
projectors
projection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0624196A
Other versions
GB0624196D0 (en)
Inventor
Colin David Cameron
Christopher Paul Wilson
Anton De Braal
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
F Poszat HU LLC
Original Assignee
F Poszat HU LLC
Qinetiq Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by F Poszat HU LLC, Qinetiq Ltd filed Critical F Poszat HU LLC
Priority to GB0624196A priority Critical patent/GB2444301A/en
Publication of GB0624196D0 publication Critical patent/GB0624196D0/en
Publication of GB2444301A publication Critical patent/GB2444301A/en
Withdrawn legal-status Critical Current


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/388Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • G02B27/225
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/327Calibration thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/363Image reproducers using image projection screens

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A projection display system includes a plurality of projectors 1 to project light relating to an image 6 through a screen 2 to a viewing region, the screen having a narrow angle of dispersion in at least one axis. Computation means 4 provide image information to the projectors and is adapted to render an image for each projector to produce an autostereoscopic image for a viewer 3 viewing the screen. The rendering is carried out using one or more virtual cameras 12 located on the projection side of the screen. The image rendering includes incorporating a factor to pre-distort the image data received by each projector to compensate for the projection distortion arising from incorrect perspective, due to projector positioning. By rendering the images in this fashion, from the projector side of the screen, the processing requirements are significantly reduced. Also disclosed is another method for producing an autostereoscopic image, wherein a plurality of projectors form adjacent image projection frusta which combine to form the image.

Description

Projection Display System

This invention relates to display systems, and in particular to projection display systems that comprise a plurality of projectors arranged to illuminate a screen. Such displays may be used to generate autostereoscopic three dimensional (3D) displays.
Display systems incorporating a plurality of projectors are used in both 2D and 3D display systems. Those used to create 3D displays take various forms.
One form uses a plurality of projectors to create a tiled image of high resolution onto a projection screen, and puts an array of lenses in front of the screen, with each lens being arranged to image a small part of the screen.
The lenses in such a system are often arranged in a single axis lenticular array. The viewer then sees, due to the action of the lenses, a different set of pixels depending on his viewpoint, thus giving a 3D-like appearance to suitably projected image data. This method does not rely on a plurality of projectors, but will benefit from the additional pixel count provided by them.
A 3D image may also be formed by arranging a plurality of projectors in relation to a screen such that an observer looking at different parts of the screen will see components of images from different projectors, the components co-operating such that a 3D effect is achieved. This does not require an array of lenses, and can give a better 3D effect, as the resultant image can have an appreciable depth as well as merely looking different from different viewpoints. Better results are achieved with more projectors, as this provides for both a larger angle of view, and a more natural 3D image.
Traditionally, the image rendering requirements of such a display become quite onerous as the number of projectors increases, leading to an economic limitation on the quality obtainable. Also, as the number of projectors increases, the setup of each projector in relation to each other projector becomes more difficult.
The rendering carried out by these systems is relatively simple in concept, but requires relatively significant processing resources, as data to be displayed on each projector is rendered using a virtual image generation camera located in a viewing volume from which the autostereoscopic image may be seen. The virtual image generation camera is a point from which the rendering takes place. In ray tracing terms it is the point from which all rays are assumed to emanate, and traditionally represents the point from which the image is viewed. For autostereo displays, rendering is traditionally carried out for several virtual image generation camera positions in the viewing volume, and is a computationally intensive task as stated in the paragraph above.
It is an object of the present invention to at least mitigate the problems of the
prior art.
According to the present invention there is provided a method of generating an image on a three dimensional (3D) autostereoscopic display comprising the steps of:
i. providing a plurality of projectors and a screen, wherein the projectors are arranged to illuminate the screen, and further wherein the screen is arranged to have a narrow angle of dispersion in at least one axis;
ii. receiving image information to be displayed, in a form representative of a 3D object;
iii. for each projector, pre-distorting the received information to compensate for incorrect perspective of the projector by transforming it into a viewing region perspective;
iv. projecting light rays corresponding to the pre-distorted image information from each of the projectors through the screen to a viewing region.
The current invention has been termed Projector Space Image Generation (PSIG), as it approaches the rendering from the point of view of the projector, as opposed to the viewer-oriented rendering of the prior art. Effectively, the PSIG approach of the current invention carries out the image rendering from the projector, co-locating a virtual image generation viewpoint, or virtual camera (which, in ray tracing terms, would be the eye of a viewer or camera) with the projector itself. Of course, this does not mean that the actual viewpoint of the resultant image is co-located with the projector; the term "virtual image generation viewpoint" refers to an effective viewpoint taken for the purposes of the image computation, or rendering. Prior art systems take as this "virtual image generation viewpoint" the actual viewpoint of a viewer of the resultant image, as is normally done in ray tracing applications. The actual positions of the virtual cameras may be exactly co-located with the projector positions, or may be positions relatively close to the actual projector positions, in which case a correction factor would be used to account for the positional difference.
The technique of the present invention has the advantage of reduced (virtually zero) post rendering information transfer requirements, and therefore simplifies the camera to projector mapping phase.
The present invention provides a means for generation of an autostereoscopic image of high quality, but with a much reduced requirement for processing power as compared to the prior art when rendering the images to be projected. The current invention effectively computes the correct light rays to be projected from the projector side to the screen, and through to an imaginary observer, to generate a geometrically accurate image to be displayed. It has been found that such a ray trace method allows the rendering of an image frame from a single projector to be carried out in a single pass. In contrast, the prior art method of rendering from the viewer side of the screen can result in an orders of magnitude increase in the number of mathematical operations required.
The current invention has been implemented on a Horizontal Parallax Only (HPO) autostereo projection system, although it could equally be applied to a vertical parallax only system, or a full parallax system as required, by making the appropriate changes to the configuration of the projection system and rendering software.
The screen is adapted for HPO use by means of being asymmetric in terms of its angle of diffusion. Light hitting the screen from a projector is scattered widely, approximately 60°, in the vertical plane to provide a large viewing angle, but relatively very narrowly in the horizontal plane. Typically the horizontal scattering may be approximately 1.5°, 2° or 3°, although the angle may be adapted to suit a given system's design parameters. This diffusion property means that the system is able to control the propagation direction of the light emitted by the projectors very precisely, and in this way the system is able to provide different images to each of a viewer's eyes in a large volume to produce a 3D effect. The angle of dispersion of the screen may be chosen according to other parameters such as the number of projectors used, the optimal viewing distance chosen, and the spacing between projectors. A larger number of projectors, or projectors that are spaced closer together, will typically use a screen with a smaller dispersion angle. This will lead to a better quality image, but at the cost of either more projectors or a smaller viewing volume.
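The relationship between projector spacing and the screen's horizontal dispersion angle can be expressed as a simple spacing bound: adjacent projectors must subtend no more than the dispersion angle at the screen. The following sketch is illustrative only; the small-angle geometry and parameter names are assumptions, not taken from the patent:

```python
import math

def max_projector_spacing(screen_distance, dispersion_deg):
    """Largest spacing between adjacent projector apertures such that the
    angle they subtend at the screen does not exceed the screen's
    horizontal dispersion angle (so no un-illuminable gaps appear)."""
    return screen_distance * math.tan(math.radians(dispersion_deg))

# Projectors 1 m behind the screen with 1.5 degree horizontal dispersion:
# adjacent apertures should be no more than roughly 26 mm apart.
spacing_mm = max_projector_spacing(1000.0, 1.5)
```

A wider dispersion angle relaxes the bound, at the cost of coarser control over the ray directions, which matches the image-quality trade-off described above.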
When using a screen material that has horizontal-parallax-only (HPO) properties, certain distortions may be noticeable. These distortions are common to all HPO systems, and involve an image that lacks the correct vertical perspective. Such effects include the foreshortening of an object, and the apparent tracking of objects with the vertical motion of the eye.
Due to the decrease in processing power needed to implement the current invention as compared to viewer space image generation systems, the display of complex real-time computer animations is possible whilst still utilising relatively cheap off-the-shelf computer systems. The possible inclusion of live video feeds also opens up the use of the invention with a suitable camera system to produce a 3D autostereo television system.
The information received in step ii above will include information relating to the shape of an object to be displayed, and may further include information relating to colour, texture, brightness levels or any other feature capable of being displayed on the system.
Note that for the purposes of this specification the, or each, projector may comprise a traditional and commonly available projector system having a light source, a spatial light modulator (SLM) of some sort, and a lens. Alternatively, the or each projector may comprise an individual optical aperture with a SLM shared with a neighbouring optical aperture. The light source and SLM may be coincident.
As a second aspect, the present invention provides a method of producing an autostereoscopic image on a screen, the screen comprising a material having a narrow angle of dispersion in at least one axis, the method comprising the steps of:
i. providing a plurality of projectors each arranged to illuminate the screen from a different angle;
ii. providing processing means for each projector;
iii. receiving image information to be displayed, in a form representative of a 3D object, and distributing at least part of this information to the processing means associated with each projector;
iv. arranging each projector to project an image in a projection frustum to the screen, wherein differing parts of the projected image within each projector's frustum are rendered, using the projector's processing means, to represent a predetermined view of the overall image, the images from each projector combining to produce an autostereo image in a view volume;
wherein the rendering carried out for a given projector uses a virtual image generation camera co-located with the image projector.
The invention will now be described in more detail, by way of example only, with reference to the following diagrammatically illustrative Figures, of which: Figure 1 shows a representation of the projection hardware upon which the current invention may be implemented; Figure 2 shows the frusta of the projector, and of the rendering; Figures 3 and 4 show a point p in system-space being projected through a point p' to appear in the correct spatial location for an observer; Figure 5 illustrates the complications of using a curved screen with the current invention; and Figure 6 illustrates a potential distortion problem with images produced by certain embodiments of the present invention.
Figure 1 shows the type of projection system on which the current invention may be implemented. The projection system is an HPO system, although the principle of operation of the invention can be applied to other systems. Thus Figure 1 is a plan view of the projection system. A plurality of projectors 1 are each arranged to project an image onto a screen 2. The screen 2 has dispersion properties such that in the horizontal plane the dispersion angle is very small, at around 1.5°, whereas in the vertical plane the dispersion angle is rather wider, at around 60°. The projectors 1 are preferably arranged such that the angle θ between two adjacent projectors and the screen is no more than the horizontal dispersion angle of the screen 2. This arrangement ensures that a viewer 3 on the other side of the screen 2 will not see any gaps in the image which cannot be illuminated by at least one projector 1.
An advantage of the present invention is that the projectors 1 do not have to be lined up with respect to each other or with respect to the screen with any great precision, as a calibration step (described below) can be carried out to compensate for projector positioning or optical irregularities, and for irregularities in the screen.
A computer cluster comprising a plurality of networked computers 4 is used to carry out the graphical processing, or rendering, of images to be displayed.
More specialised hardware could be used, which would reduce the number of separate computers needed. Each computer 4 contains a processor, memory, and a consumer level graphics card having two output ports, with each port on the graphics card connected to a separate projector. One of the computers 4 acts as a master controller for the remaining computers.
Figure 1 shows a series of light rays 5 being projected from projectors 1 towards the screen 2. A single ray is shown for each projector 1, although in reality each projector will be emitting projections from a grid of pixels within its projection frustum. Each ray 5 shown is directed towards producing a single point, e.g. 7, in the displayed image. This point is not on the surface of screen 2, but appears to an observer to be some distance in front of it. This leads to the requirement that each projector has to send a ray of light corresponding to the image to a different part of the screen. This action would, if no corrective action were taken, lead to a distorted image appearing on the screen. The present invention therefore pre-distorts the vertices of the 3D object to be displayed, in a manner described below, to correct for this distortion.
Display of a 3D image according to the present invention takes place in the following manner.
1. Application data comprising 3D image information is received in ** the master computer as a series of vertices. This may be for example information from a CAD package such as AUTOCAD, or may be scene information derived from a plurality of cameras. The master computer (or process) sends the data across the network to the rendering computers (or processes).
2. Each rendering process receives the vertices and carries out the rendering for each of its allotted projectors, compensating for certain distortions added to the image by the system. This is the pre-distortion step mentioned above.
3. Once the 3D data has been properly projected into the 2D framebuffer of the graphics card, further calibration data is applied to correct for misalignments of the projectors, mirror and screen surface, by further manipulating the pre-distorted vertices.
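The three steps above can be sketched as a data flow from the master process to per-projector frames. All names and the placeholder transforms below are hypothetical stand-ins; the real pre-distortion and calibration warps are described later in the specification:

```python
def pre_distort(vertex, projector):
    # Placeholder for step 2: stand-in for the perspective pre-distortion,
    # here just offsetting by the projector's horizontal position.
    x, y, z = vertex
    return (x - projector["x"], y, z)

def apply_calibration(vertex, calib):
    # Placeholder for step 3: correct a measured vertical misalignment.
    x, y, z = vertex
    return (x, y + calib["dy"], z)

def render_frame(vertices, projectors):
    """Steps 1-3: distribute the shared vertex list to every projector's
    rendering process, pre-distort it per projector, then apply that
    projector's calibration data to the pre-distorted vertices."""
    frames = {}
    for projector in projectors:
        distorted = (pre_distort(v, projector) for v in vertices)
        frames[projector["name"]] = [
            apply_calibration(v, projector["calib"]) for v in distorted
        ]
    return frames
```

In the described system the distribution in step 1 happens over a network to separate rendering computers; here it is collapsed into a simple loop.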
Details of the pre-distortion that is carried out on the vertices making up the 3D image will now be presented.
The pre-distortion must take into account the projection frusta. The rendering (or 'camera') frustum for each projector in the system may not necessarily be identical to the physical projector's frustum. Each projector 1 should be set up such that it addresses the full height of the back of the screen (i.e. it covers the top and bottom regions). Due to the HPO characteristics of the screen, there is a restriction on the rendering frusta: each frustum's origin must be co-located with its associated projector in the ZX plane, and its orientation in the YZ plane is defined by the chosen viewer locations.
Figure 2 shows a part of a screen 2, along with an 'ideal' rendering frustum 8 (hatched region), and the physical projector frustum 9. The projector frustum 9 is produced by a projector typically misaligned from an 'ideal' projector position 10. Note that the ideal projector position 10 is co-located with the actual position 10' in the ZX plane.

The extents of the rendering frusta should be chosen such that all possible rays may be replicated by the corresponding physical projectors; or that the rendering frusta in system-space intersect the physically addressed portions of the screen.
Certain misalignments of the physical projectors, such as rotations and vertical offsets, are corrected by calibration and image warping, as explained later.
Looking again at Figure 1, it can be seen that by placing mirrors 11 along the sides of the bank of projectors 1, a plurality of "virtual projectors" are formed by the reflected portions of the projectors' (1) frusta. This gives the effect of increasing the number of projectors, and so increases the size of the view volume in which the image 6 may be observed by observer 3. By computing both the real and virtual projector frusta for a mirrored physical projector, the correct partial frusta will be projected onto the screen. (For instance, in the autostereo display example described here, this typically involves loading each computer's (4) graphics card's frame-buffer with the two rendered images side-by-side; the division aligned to the mirror boundary.)

In order to correct for the HPO distortions mentioned above, and to present the viewer with a geometrically correct world-space from all viewpoints, the image geometry must be pre-distorted prior to rendering. There is still the need to define the viewer's locus with respect to the screen for a completely accurate distortion correction, and we have defined here an arbitrary motion of the eye.
For the multi-viewer multi-viewpoint autostereo system described here, it is inherently not possible to track every viewer simultaneously. Therefore a compromise is accepted where the viewer loci are defined to be the most common (such as choosing a depth that lies at the centre of the viewing volume). However, this method allows for a real-time update of a viewer's position through varying the co-ordinates in the following calculations.
When displaying imagery from an external 3D application, it is important to truthfully represent its eye-space (or application-space), and that includes preserving the central viewpoint and producing perspectively correct objects.
The mathematical representation of the system according to the current invention is now presented. Here the user's viewpoint (from an external application) is defined as being mapped to the central axis of the eye (i.e. along the Z-axis in eye-space). This allows the user's main viewpoint to resemble the application's, and gives users the ability to look around the objects displayed, by moving around in the view-volume.
Let the 4x4 matrix required to transform the application's eye-space into the application's projector-space be M_A. Once in projector-space, let the projection matrix P_A represent the projection into the application's homogeneous clip space.

We now "un-project" into the display eye-space by applying an inverse eye projection matrix P_E^-1, and further map into the system-space using the inverse eye transformation matrix M_E^-1. Once in system-space, a general transformation matrix T (e.g. to allow the containment of an application in a sub-volume of the display) can be applied, and then the rendering camera transformation matrix M_P to map into projector-space for geometric pre-distortion.
Having pre-distorted the geometry in projector-space, we then perform a pseudoscopic projection P_P into our camera's pseudoscopic homogeneous clip space, where the pseudoscopic transformation H_P takes the form:

H_P = diag( (±1), (±1), -1, 1 )   (Eqn. 1)

(The signs in brackets represent flipping or flopping of the image; these may be used to compensate for the projection mode of the projectors, for instance.) Thus a homogeneous point p = <p_x, p_y, p_z, 1> in application-space, before mapping into normalised device co-ordinates, becomes:

p' = H_P · P_P · D(x, y, z; E) · M_P · T · M_E^-1 · P_E^-1 · P_A · M_A · p   (Eqn. 2)

where D(x, y, z; E) represents the pre-distortion as a function based on the co-ordinates of the point, and the position of the eye, in projector-space, as is described below.
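The chain of Eqn. 2 can be sketched with plain 4x4 matrices. Only the pseudoscopic matrix H_P is populated below; the remaining matrices (M_A, P_A, T, the pre-distortion D, and so on) are supplied by the caller, and all helper names are illustrative:

```python
def matmul(a, b):
    """Product of two 4x4 matrices (row-major lists of lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(m, p):
    """Apply a 4x4 matrix to a homogeneous point <px, py, pz, pw>."""
    return [sum(m[i][k] * p[k] for k in range(4)) for i in range(4)]

def identity():
    return [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]

def pseudoscopic(flip_x=False, flip_y=False):
    """H_P of Eqn. 1: diag(+/-1, +/-1, -1, 1); the optional sign flips
    mirror the image to suit the projectors' projection mode."""
    h = identity()
    if flip_x:
        h[0][0] = -1.0
    if flip_y:
        h[1][1] = -1.0
    h[2][2] = -1.0
    return h

def transform_chain(matrices, p):
    """Apply Eqn. 2 given as a left-to-right matrix list, e.g.
    [H_P, P_P, D, M_P, T, ME_inv, PE_inv, P_A, M_A]; the rightmost
    matrix acts on the point first, as in the written product."""
    for m in reversed(matrices):
        p = apply(m, p)
    return p
```

Applying H_P twice restores the original geometry, which is a convenient sanity check on the sign convention.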
Figures 3 and 4 illustrate the calculations to be performed in pre-distorting the image before it is displayed by a given projector. A projector 13 wishes to project a ray of light to contribute to a point p, sitting a short distance back from the screen 14. An observer looking at the point p of a 3D image sees a ray 15 that passes through the screen 14 at point 16. The projector 13, in projecting a ray of light 17 to make up point p, must in fact direct the ray 17 not directly at point p, but at the part of the screen 14 at which point p appears to the observer (i.e. through the point p'). This is the required pre-distortion for point p, and such pre-distortion is carried out generally for all points, or vertices, that make up the 3D image (although points that are on the screen 14 will in fact not be altered by the pre-distortion process).
In order to pre-distort the point p in projector-space, one needs to find the distance d from the projector origin to the eye origin in the YZ plane, and the Z co-ordinate of the projector ray's intersection with the screen, z_i.

The eye's view of the height of a point p in projector-space, y_e, at a given depth z_p, is mapped to the required height y_p that projects through the common point at the screen. Thus, due to the HPO nature of the screen, the projected point p' appears at the correct position to the viewer.
It can be seen that for a given projector ray, based on the height E_y and X-axis rotation E_θ of the eye, the effective height P_y and orientation of the projector origin may be calculated.
Thus the pre-distorted height of a point, y_p, may be calculated:

y_p = [ z_p (d + z_i) / ( z_i (d + z_p) ) ] · y   (Eqn. 3)

Figure 5 shows a correction to be made to the calculated pre-distortion projection co-ordinates when a curved screen is used. In such a case, the value of z_i may be found from the intersection of a particular ray (defined by the equation x = mz) and the screen.
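The height mapping of Eqn. 3, taken here in the reconstructed form y_p = [z_p(d + z_i)] / [z_i(d + z_p)] · y (which, consistent with the text above, leaves points lying on the screen unchanged), can be sketched directly; the variable names are illustrative:

```python
def pre_distort_height(y, z_p, z_i, d):
    """Pre-distorted height y_p for a point of height y at depth z_p,
    where z_i is the depth at which the projector ray crosses the screen
    and d is the projector-to-eye distance in the YZ plane (Eqn. 3 as
    reconstructed above).  A point on the screen (z_p == z_i) maps to
    itself, so on-screen vertices are unchanged by the pre-distortion."""
    return (z_p * (d + z_i)) / (z_i * (d + z_p)) * y
```

In a full renderer this would run per vertex, with z_i found per ray (and, for a curved screen, from the ray/screen intersection described with Figure 5).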
The general transformation matrix T may, as stated above, be used to provide independent image information to different regions of the viewing volume.
The independent image information may comprise, for example, one image that is visible from one half of the viewing region, and a second image that is viewable from the other half of the viewing region. Alternatively, the independent image information may be arranged such that a first image is projected to a viewer in a first location, and a second image is projected to a viewer in a second location. The viewer locations may be tracked by using head tracking means, and, by making suitable changes to the value of T corresponding to the tracked locations, each viewer will maintain a view of their chosen image where possible as they move within the viewing region.
In contrast to the above, the prior art method of rendering, from the viewpoint of the viewer, is as follows. For each viewer-side virtual camera spacing:
a. render the camera frustum (from the viewer side of the screen);
b. determine which columns of the screen appear in the frustum of the camera;
c. determine which projectors will illuminate one or more of these screen columns;
d. for each of these projectors: for each projector column, determine whether the scatter sector of the pixel projection illuminates the camera. If so, store the image column in this projector column.
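A direct transliteration of the viewer-side loop above, with toy fixtures standing in for real rendering; the dictionary fields and the angular scatter test are illustrative assumptions:

```python
def viewer_side_render(cameras, projectors):
    """Prior-art viewer-space rendering (steps a-d): for each virtual
    camera, find the screen columns in its frustum, then every projector
    whose pixel column scatters toward that camera, and copy the rendered
    column into that projector's image."""
    out = {p["name"]: {} for p in projectors}
    for cam in cameras:
        image = cam["image"]                 # (a) rendered camera view
        for col in cam["columns"]:           # (b) columns in the frustum
            for p in projectors:             # (c) candidate projectors
                # (d) does the scatter sector of this projector column
                # illuminate the camera position?
                hit = (col in p["columns"] and
                       abs(p["angle_deg"] - cam["angle_deg"]) <= p["scatter_deg"])
                if hit:
                    out[p["name"]][col] = image[col]
    return out
```

Note the nesting over cameras, columns and projectors: this per-camera fan-out, repeated for every virtual camera position, is the cost that the projector-space (PSIG) method avoids.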
As is said above, the projectors and screen of a system implementing the present invention may be positioned without concerns for extreme positional accuracy. A software calibration phase can be carried out such that deviations in projector position and orientation, such as can be seen in the difference between positions 10 and 10' in Figure 2, can be compensated for.
Note again that the rendering frustum origin should be co-located with the projector's frustum in the ZX plane. The calibration is done in the present embodiment by means of the following procedure:
1. place over the screen a transparent sheet onto which has been printed a grid of reference lines;
2. for the first projector, arrange for the computer that controls the projector to display a pre-programmed grid pattern;
3. adjust display parameters such as extent and curvature of the projection frustum in the x and y axes such that the displayed grid is closely aligned with the printed grid;
4. store the extent of the adjustments made in relation to the projector in a calibration file;
5. repeat steps 2 to 4 for each of the projectors in the system.
The calibration files so produced contain calibration data that is used both before and after the pre-distortion rendering phase to apply transformations to the pre-distorted image data to compensate for the positional and orientation errors previously identified.
A further calibration stage may be carried out to correct differing colour and intensity representation between the projectors. Colour and intensity non-uniformity across the projector images may be corrected at the expense of dynamic range, by applying RGB weightings to each pixel.
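The per-pixel RGB weighting described above can be sketched as follows; the weight range and the clamping behaviour are assumptions:

```python
def apply_rgb_weights(pixel, weights):
    """Scale each 8-bit channel of a pixel by a per-pixel calibration
    weight (0.0 to 1.0) to even out colour and intensity across the
    projectors, at the cost of dynamic range."""
    return tuple(min(255, round(c * w)) for c, w in zip(pixel, weights))
```

Because the brightest projector cannot be boosted, all projectors are scaled down toward the dimmest, which is the dynamic-range trade-off the text mentions.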
Other embodiments of the invention may usefully utilise other facilities of modern graphics cards whilst still being able to produce real-time moving displays. For example, the geometric pre-distortion outlined above may be enhanced to include a full treatment for non-linear optics. Modern graphics cards can utilise a texture map in the vertex processing stage, which allows one to compute off-line corrections for very complicated and imperfect optics.
Examples of such optics include curvilinear mirrors and radial lens distortions.
The invention has utility in many different areas. These include, but are not limited to, volume data (MRI/NMR, stereolithography, PET scans, CAT scans, etc.) and 3D computer geometry (from CAD/CAM, 3D games, animations, etc.). Multiple 2D data sources may also be displayed by mapping them to planes at arbitrary depths in the 3D volume.

A further application of the current invention is to replace computer generated images with those from multiple video cameras, to allow true "Autostereo 3D Television" with live replay. By either using multiple cameras at different locations, or one camera moved to different locations in time to build up an image, multiple views of a scene may be collected. These separate views may be used to extract depth information. In order to reproduce this 3D video feed, the data needs to be re-projected pseudoscopically with the correct pre-distortion outlined above. Other methods of depth information gathering may be used to complement the multiple video images, such as laser range-finding and other 3D camera techniques.
With the advent of relatively low cost programmable graphics hardware, the necessary pre-distortion has been successfully implemented within the graphics processing unit's (GPU's) vertex processing stage in the graphics card of each computer. By pre-distorting each vertex, the subsequent interpolation of fragments approximates to the required pre-distortion. It must be noted, however, that there should be a sufficient number of vertices, fairly evenly spaced, throughout the geometry, to ensure that the resultant image is rendered correctly. By offloading the pre-distortion of each vertex onto the GPU, real-time frame rates may be achieved with very large 3D datasets.
Some embodiments of the present invention can sometimes exhibit image artefacts that manifest themselves as a bending phenomenon, as shown in Figure 6a. This occurs in images having elements that stretch from the front of the view volume to the back, or which occupy a significant part of the view volume either side of the screen This occurs primarily if a perspective projection is used in the image rendering. The embodiment of the invention as described above uses a perspective projection, with one or more vanishing points. By changing the projection to an orthographic projection, which does : not have vanishing points (or may otherwise be regarded as effectively having all vanishing points at infinity), the bending phenomenon is much reduced. *S..
However, this can lead to an unnatural appearance of objects in itself. For this reason the projection of different parts of the same object can be adapted according to the apparent distance of each part of the object from the screen. For example, those parts of the displayed object that are close to the screen may be displayed in perspective projection, whilst those parts at a maximum distance from the screen may be displayed using an orthographic projection, with intermediate parts being displayed using some combination of both perspective and orthographic projections. This change in projection preferably occurs in a graduated manner as the apparent object distance increases, leading to a more pleasing image. Figure 6b shows such a corrected image, with reduced bending.
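One way to realise the graduated change in projection is a per-point blend of the two projections, weighted by distance from the screen plane. This sketch is hypothetical; the patent does not give an explicit blend formula, and the matrices, names, and linear blend are assumptions:

```python
import numpy as np

def blended_project(point, persp, ortho, z_screen, z_max):
    """Project a 3D point using a distance-dependent mix of perspective
    and orthographic projection matrices (both 4x4).

    Points at the screen plane (z == z_screen) use pure perspective;
    points at |z - z_screen| >= z_max use pure orthographic; in between,
    the two projected results are blended linearly, which suppresses the
    bending artefact for geometry that spans the view volume.
    """
    x, y, z = point
    t = min(abs(z - z_screen) / z_max, 1.0)  # 0 at screen, 1 at extremes
    p = np.array([x, y, z, 1.0])
    a = persp @ p
    a = a / a[3]                             # perspective divide
    b = ortho @ p
    b = b / b[3]
    return (1.0 - t) * a[:3] + t * b[:3]
```

A smoother easing function (rather than the linear ramp in `t`) could be substituted to avoid a visible kink where the blend begins.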
Glossary of some of the terms used in this specification:
Application-space: The eye-space of an external application to be mapped into our display.
Autostereo: Binocular disparity (and potentially motion parallax) without the need for special glasses.
Camera-space: See projector-space.
Eye-space: The co-ordinate system of a viewer.
Full Parallax (FP): Showing parallax in both the horizontal and vertical dimensions.
Frustum (pl. frusta): A projection volume; typically resembling a truncated square-based (four-sided) pyramid.
Homogeneous Clip Space (HCS): The co-ordinate system after the perspective projection into a cube.
Homogeneous Co-ordinates: Representation of vectors in four dimensions, where the fourth component becomes the w co-ordinate.
Horizontal Parallax Only (HPO): Only showing parallax in the horizontal plane.
Object-space: The local co-ordinate system in which 3D objects are defined.
Projector-space: The rendering or 'camera' co-ordinate system.
System Geometry: A property of the system including: relative positions and orientations of the components, projection frusta and screen geometry.
System-space: The co-ordinate system in which the display hardware is defined.
View(ing) volume: The volume in which users may see imagery generated by a display system. (Typically clipped by a particular field-of-view and usable depth range.)
Virtual Projectors: The reflection of a projector in a side mirror (for example), with the partial frustum appearing to originate from the projector image.
World-space: The global co-ordinate system in which all 3D objects and corresponding object-spaces are defined.

Claims (12)

  1. A method of generating an image on a three dimensional (3D) autostereoscopic display comprising the steps of: i. providing a plurality of projectors and a screen, wherein the projectors are arranged to illuminate the screen, and further wherein the screen is arranged to have a narrow angle of dispersion in at least one axis; ii. receiving image information to be displayed, in a form representative of a 3D object; iii. for each projector, pre-distorting the received image information to compensate for the incorrect perspective of the projector by transforming it into a viewing region perspective, the pre-distortion being carried out by means of one or more virtual cameras positioned on the same side of the screen as the projectors; iv. projecting light rays corresponding to the pre-distorted image information from each of the projectors through the screen to a viewing region.
  2. A method as claimed in claim 1 wherein an additional step is incorporated of adjusting the image displayed by at least one of the projectors to compensate for positional, orientational or optical variations between projectors.
  3. A method as claimed in claim 1 or claim 2 wherein at least one mirror is positioned such that light projected from at least one projector is reflected from the mirror before proceeding on towards the screen, to produce at least one virtual frustum.
  4. A method as claimed in any of the above claims wherein an additional step is incorporated of apportioning the viewing region into individual sub-regions, wherein imagery associated with each sub-region may be controlled independently of other sub-regions.
  5. A method as claimed in any of the above claims wherein the screen has a wide angle of dispersion in one axis.
  6. A method as claimed in any of the above claims wherein the information to be displayed comes from at least one digital image data file.
  7. A method as claimed in any of claims 1 to 4 wherein the information to be displayed comes from one or more video cameras.
  8. A method as claimed in any of the above claims wherein image information representative of an object and projected by a projector is projected by different projections according to an apparent distance from the screen of different parts of the object.
  9. A method as claimed in claim 8 wherein parts of the object relatively close to the screen are displayed using a perspective projection, and parts of the object relatively distant from the screen are displayed using an orthographic projection.
  10. A method as claimed in claim 9 wherein projection parameters are varied according to the apparent distance of a part of an object from the screen, to provide a smooth transition from the perspective projection to the orthographic projection.
  11. A display system comprising a plurality of projectors, a screen arranged to have a narrow angle of dispersion in at least one axis, and computer processing means, characterised in that the computer processing means is arranged to incorporate a computer program designed to implement the method as claimed in claim 1.
  12. A method of producing an autostereoscopic image on a screen, the screen comprising a material having a narrow angle of dispersion in at least one axis, the method comprising the steps of: i. providing a plurality of projectors each arranged to illuminate the screen from a different angle; ii. providing processing means for each projector; iii. receiving image information to be displayed, in a form representative of a 3D object, and distributing at least part of this information to the processing means associated with each projector; iv. arranging each projector to project an image in a projection frustum to the screen, wherein differing parts of the projected image within each projector's frustum are rendered, using the projector's processing means, to represent a predetermined view of the overall image, the images from each projector combining to produce an autostereo image in a view volume; wherein the rendering carried out for a given projector uses a virtual image generation camera co-located with the image projector.
GB0624196A 2006-11-30 2006-11-30 Autostereoscopic projection display Withdrawn GB2444301A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0624196A GB2444301A (en) 2006-11-30 2006-11-30 Autostereoscopic projection display

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0624196A GB2444301A (en) 2006-11-30 2006-11-30 Autostereoscopic projection display

Publications (2)

Publication Number Publication Date
GB0624196D0 GB0624196D0 (en) 2007-01-10
GB2444301A true GB2444301A (en) 2008-06-04

Family

ID=37671826

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0624196A Withdrawn GB2444301A (en) 2006-11-30 2006-11-30 Autostereoscopic projection display

Country Status (1)

Country Link
GB (1) GB2444301A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2570903A (en) * 2018-02-08 2019-08-14 Holoxica Ltd Projection array light field display
EP3485330A4 (en) * 2016-07-15 2020-07-29 Light Field Lab, Inc. System and methods of holographic sensory data generation, manipulation and transport
US11650354B2 (en) 2018-01-14 2023-05-16 Light Field Lab, Inc. Systems and methods for rendering data from a 3D environment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000310826A (en) * 1999-02-23 2000-11-07 Matsushita Electric Works Ltd Virtual environmental experience display device
JP2002148711A (en) * 2000-11-08 2002-05-22 Matsushita Electric Works Ltd Spherical wide field angle video display device
US6483534B1 (en) * 1997-05-23 2002-11-19 D'ursel Wauthier Stereoscopic image production method and device for implementing same
US20060066810A1 (en) * 2004-09-24 2006-03-30 Sergey Shestak Multi-view autostereoscopic projection system using single projection lens unit


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3485330A4 (en) * 2016-07-15 2020-07-29 Light Field Lab, Inc. System and methods of holographic sensory data generation, manipulation and transport
US11874493B2 (en) 2016-07-15 2024-01-16 Light Field Lab, Inc. System and methods of universal parameterization of holographic sensory data generation, manipulation and transport
US11650354B2 (en) 2018-01-14 2023-05-16 Light Field Lab, Inc. Systems and methods for rendering data from a 3D environment
GB2570903A (en) * 2018-02-08 2019-08-14 Holoxica Ltd Projection array light field display
GB2570903B (en) * 2018-02-08 2020-03-04 Holoxica Ltd Projection array light field display

Also Published As

Publication number Publication date
GB0624196D0 (en) 2007-01-10

Similar Documents

Publication Publication Date Title
KR101094118B1 (en) Three dimensional projection display
EP1143747B1 (en) Processing of images for autostereoscopic display
US7643025B2 (en) Method and apparatus for applying stereoscopic imagery to three-dimensionally defined substrates
US6366370B1 (en) Rendering methods for full parallax autostereoscopic displays
AU2010202382B2 (en) Parallax scanning through scene object position manipulation
Schmidt et al. Multiviewpoint autostereoscopic displays from 4D-Vision GmbH
US20120182403A1 (en) Stereoscopic imaging
CN102450001A (en) Multi-projector system and method
EP1953702A2 (en) Apparatus and method for generating CG image for 3-D display
EP3607530B1 (en) System, method and software for producing virtual three dimensional images that appear to project forward of or above an electronic display
JPH11511316A (en) 3D image display drive
WO2012140397A2 (en) Three-dimensional display system
CN111869202B (en) Method for reducing crosstalk on autostereoscopic displays
CA2540538C (en) Stereoscopic imaging
US20140300713A1 (en) Stereoscopic three dimensional projection and display
GB2444301A (en) Autostereoscopic projection display
Annen et al. Distributed rendering for multiview parallax displays
JP7394566B2 (en) Image processing device, image processing method, and image processing program
Katayama et al. A method for converting three-dimensional models into auto-stereoscopic images based on integral photography
Harish et al. A view-dependent, polyhedral 3D display
Prévoteau et al. Real 3D video capturing for multiscopic rendering with controlled distortion
Lucas et al. Shooting and viewing geometries in 3DTV
Xue et al. Tile-Based 3D Display Using A Reconfigurable Display Matrix
CN113597758A (en) Method and apparatus for correcting lenticular lens distortion

Legal Events

Date Code Title Description
COOA Change in applicant's name or ownership of the application

Owner name: F. POSZAT HU., LLC

Free format text: FORMER APPLICANT(S): QINETIQ LIMITED

WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)