WO2002009442A1 - Autostereoscopic display - Google Patents

Autostereoscopic display

Info

Publication number
WO2002009442A1
WO2002009442A1 PCT/US2001/022961
Authority
WO
WIPO (PCT)
Prior art keywords
phases
display screen
image
observer
shutter
Prior art date
Application number
PCT/US2001/022961
Other languages
English (en)
Inventor
Kenneth Perlin
Salvatore Paxia
Joel S. Kollin
Original Assignee
New York University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by New York University filed Critical New York University
Priority to AU2001277939A priority Critical patent/AU2001277939A1/en
Publication of WO2002009442A1 publication Critical patent/WO2002009442A1/fr

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/363Image reproducers using image projection screens
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/275Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N13/279Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/22Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type
    • G02B30/24Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type involving temporal multiplexing, e.g. using sequentially activated left and right shutters
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/30Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving parallax barriers
    • G02B30/31Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving parallax barriers involving active parallax barriers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/31Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/324Colour aspects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/371Image reproducers using viewer tracking for tracking viewers with different interocular distances; for tracking rotational head movements around the vertical axis
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/373Image reproducers using viewer tracking for tracking forward-backward translational head movements, i.e. longitudinal movements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/257Colour aspects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/286Image signal generators having separate monoscopic and stereoscopic modes
    • H04N13/289Switching between monoscopic and stereoscopic modes

Definitions

  • the present invention is related to a display device which solves a long-standing problem: to give a true stereoscopic view of simulated objects, without artifacts, to a single unencumbered observer, while allowing the observer to freely change position and head rotation by using three phases of stripes of the image.
  • stereo display uses shuttered or passively polarized eyewear, in which the observer wears eyewear that blocks one of the two displayed images from each eye. Examples include passively polarized glasses and rapidly alternating shuttered glasses.
  • a graphical display is termed autostereoscopic when all of the work of stereo separation is done by the display [J. Eichenlaub, Lightweight Compact 2D/3D Autostereoscopic LCD Backlight for Games, Monitor, and Notebook Applications. Proc. SPIE Vol. 3295, p. 180-185, in Stereoscopic Displays and Virtual Reality Systems V, Mark T. Bolas; Scott S. Fisher; John O. Merritt; Eds. April 1998, incorporated by reference herein] , so that the observer need not wear special eyewear.
  • a number of researchers have developed displays which present a different image to each eye, so long as the observer remains fixed at a particular location in space.
  • Holographic and pseudo-holographic displays output a partial light-field, computing many different views simultaneously. This has the potential to allow many observers to see the same object simultaneously, but of course it requires far greater computation than is required by two-view stereo for a single observer. Generally only a 3D lightfield is generated, reproducing only horizontal, not vertical parallax.
  • Direct volumetric displays have been created by a number of researchers, such as [Elizabeth Downing et al. A Three-Color, Solid-State, Three-Dimensional Display. Science 273, 5279 (Aug. 30, 1996), pp. 1185-118; R. Williams. Volumetric Three Dimensional Display Technology in D. McAllister (Ed.) Stereo Computer Graphics and other True 3D Technologies, 1993; and G. J. Woodgate, D. Ezra, et al. Observer-tracking Autostereoscopic 3D display systems. Proc. SPIE Vol. 3012, p. 187-198, Stereoscopic Displays and Virtual
  • volumetric display does not create a true lightfield, since volume elements do not block each other.
  • the effect is of a volumetric collection of glowing points of light, visible from any point of view as a glowing ghostlike image.
  • the goals of the present invention have been to present a single observer with an artifact-free autostereoscopic view of simulated or remotely transmitted three dimensional scenes.
  • the observer should be able to move or rotate their head freely in three dimensions, while always perceiving proper stereo separation.
  • the subjective experience should simply be that the monitor is displaying a three dimensional object.
  • the present invention provides a solution that could be widely adopted without great expense and that would not suffer from the factor-of-two loss of horizontal resolution which is endemic to parallax barrier systems.
  • the user responsive adjustment could not contain mechanically moving parts, since that would introduce unacceptable latency.
  • the mechanism could not rely on very high cost components and needed to be able to migrate to a flat screen technology.
  • the significance of the present invention is in that it enables a graphic display to assume many of the properties of a true three dimensional object. An unencumbered observer can walk up to an object and look at it from an arbitrary distance and angle, and the object will remain in a consistent spatial position. For many practical purposes, the graphic display subjectively becomes a three dimensional object. When combined with haptic response, this object could be manipulated in many of the ways that a real object can.
  • Ubiquitous non-invasive stereo displays hold the promise of fundamentally changing the graphical user interface, allowing CAD program designers, creators of educational materials, and authors of Web interfaces (to cite only some application domains) to create interfaces which allow users to interact within a true three dimensional space .
  • the present invention pertains to an apparatus for displaying an image to an observer.
  • the apparatus comprises a display screen upon which stripes of the image appear in at least three distinct phases.
  • the apparatus comprises a light blocking shutter disposed in front of the display screen forming a stripe pattern which lets through only 1/3 of each stripe of the image on the display screen during each of the at least three distinct phases.
  • the apparatus comprises a computer connected to the display screen and the light blocking shutter which changes the phases so in each phase the stripe pattern is shifted laterally, which renders two 3D scenes corresponding to the eyes of the observer, which produces a proper left/right orientation pattern for each of the three phases and which interleaves the left/right orientations into three successive time phases as red, green and blue, respectively.
  • the apparatus comprises an eye tracker for identifying the locations of the observer's eyes and providing the location to the computer.
  • the present invention pertains to a method for displaying an image to an observer.
  • the method comprises the steps of identifying locations of the observer's eyes with an eye tracker.
  • There is the step of forming a stripe pattern which lets through only 1/3 of each stripe of the image on the display screen during each of the at least three distinct phases with a light blocking shutter disposed in front of the screen.
  • Figures 1a and 1b show an observer's eyes seeing half of the respective image through each eye, and the other half of each respective image, respectively.
  • Figures 2a, 2b and 2c show the use of three phases.
  • Figures 3a and 3b show the observer far and near, respectively, from the shutter.
  • Figure 4 shows that the stripes vary in width in a perspective linear pattern.
  • Figures 5a and 5b show the process of the present invention after 1 iteration and 3 iterations, respectively.
  • Figures 6a and 6b are computer generated illustrations which show separate left and right images, respectively.
  • Figures 7a, 7b and 7c are computer generated illustrations which show the red, green and blue components, respectively.
  • Figure 8 is a flow chart of the present invention.
  • Figure 9 is a computer generated illustration which shows an image displayed on an unenhanced monitor.
  • Figures 10a and 10b are computer generated illustrations which show what the left and right eyes, respectively, would see with the present invention in place.
  • Figure 11 is a computer generated illustration which shows the apparatus of the present invention.
  • Figures 12a and 12b are computer generated illustrations which show a pi-cell.
  • Figure 13 shows a stereo embodiment of the present invention.
  • an apparatus 10 for displaying an image to an observer comprises a display screen 12 upon which stripes of the image appear in at least three distinct phases.
  • the apparatus 10 comprises a light blocking shutter 14 disposed in front of the display screen 12 forming a stripe pattern which lets through only 1/3 of each stripe of the image on the display screen 12 during each of the at least three distinct phases.
  • the apparatus 10 comprises a computer 16 connected to the display screen 12 and the light blocking shutter 14 which changes the phases so in each phase the stripe pattern is shifted laterally, which renders two 3D scenes corresponding to the eyes of the observer, which produces a proper left/right orientation pattern for each of the three phases and which interleaves the left/right orientations into three successive time phases as red, green and blue, respectively.
  • the apparatus 10 comprises an eye tracker 18 for identifying the locations of the observer's eyes and providing the location to the computer 16.
  • the display screen 12 includes a rear projection screen 20.
  • the display screen 12 preferably includes a field programmable gate array 22 in communication with the projection screen and the shutter which synchronizes the phases between the shutter and the projection screen.
  • the display screen 12 includes a digital light processor projector 24 in communication with the array and the projection screen which displays the three phases of images sequentially and controls the timing of the phases.
  • the display screen 12 preferably includes a ferroelectric liquid crystal 26 in communication with the array, the light processor, and the projection screen which shutters the start and stop of each phase.
  • the shutter includes a pi-cell.
  • the present invention pertains to a method for displaying an image to an observer.
  • the method comprises the steps of identifying locations of the observer's eyes with an eye tracker 18.
  • There is the step of forming a stripe pattern which lets through only 1/3 of each stripe of the image on the display screen 12 during each of the at least three distinct phases with a light blocking shutter 14 disposed in front of the screen.
  • the forming step includes the step of encoding into three one-dimensional bit-maps the three phases of stripes for the light shutter, each indicating an on-off pattern for the shutter micro-stripes at one of the three phases; and sending these bit-maps to a field programmable gate array 22 of the display screen 12.
  • the forming step preferably includes the step of sending with the field programmable gate array 22 the three bit-patterns to a pi-cell light shutter in rotating sequence.
  • the forming step includes the step of controlling with a digital light processor projector 24 of the display screen 12 timing of the rotating sequence of the three bit-patterns to the pi-cell.
  • the displaying step preferably includes the step of displaying with the digital light processor projector 24 the three image phases in succession.
  • a modified parallax barrier was created that combines spatial multiplexing and temporal multiplexing. Since no fixed parallax barrier geometry could accommodate arbitrary observer position and orientation, a dynamically varying parallax barrier was created, one that continually changes the width and positions of its stripes as the observer moves.
  • the use of a virtual dynamic parallax barrier is reminiscent of work by [J.R. Moore, N.A. Dodgson, A.R.L. Travis and S.R. Lang. Time-Multiplexed Color Autostereoscopic Display. Proc.
  • Each dynamic stripe needs to be highly variable in its width, in order to accommodate many different positions and orientations of the observer. For this reason, the dynamic stripes were made rather large, and use a correspondingly large gap between the display screen 12 and the light-blocking parallax barrier. Because the stripes are large enough to be easily visible, they needed to be made unnoticeable somehow. To do this, they were rapidly animated in a lateral direction. The observer then cannot perceive the individual stripes, just as a passenger in a car speeding alongside a picket fence cannot see the individual fence posts.
  • This large-stripe approach requires each stripe to be composed from some number of very slender microstripes, each of which is an individually switchable liquid crystal 26 display element.
  • a dynamic parallax barrier was used consisting of very large stripes, which are made out of many slender ones, and these large stripes are moved so rapidly across the image that the observer cannot perceive them.
  • a temporally multiplexed system could be made from just two alternating phases.
  • Parallax barrier systems depend on the distance E between an observer's two eyes (generally about 2.5 inches).
  • a display screen 12 D inches away from the observer showed alternating stripes of a left and a right image.
  • a light-blocking shutter were placed G inches in front of this display screen 12 in a "picket fence" stripe pattern.
  • If the width of each shutter stripe were chosen as E*G/D, and the width of each image stripe as E*G/(D-G), then during phase 1 the observer's left eye would be able to see half of one image through the clear stripes, and the observer's right eye would be able to see half of the other image through the clear stripes [Figure 1a]. If the light-blocking shutter were then flipped, and the display screen 12 pattern simultaneously changed, then the observer would see the remainder of each respective image [Figure 1b]. If this flipping were done fast enough, then the observer would perceive two complete independent images, each visible only to one eye. The problem with this scenario is that the observer would need to be in precisely the correct position; the slightest deviation to the left or right would result in the wrong eye seeing a sliver of the wrong image.
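As a numerical illustration of the geometry above, the following sketch (not from the patent; the sample values of E, D and G are assumptions) computes the two stripe widths:

```python
# Illustrative sketch of the stripe-width formulas above.
# E = distance between the observer's eyes (inches),
# D = observer-to-display distance, G = shutter-to-display gap.

def stripe_widths(E=2.5, D=30.0, G=1.0):
    shutter_stripe = E * G / D        # width of each clear shutter stripe
    image_stripe = E * G / (D - G)    # width of each image stripe on the display
    return shutter_stripe, image_stripe
```

The image stripes come out slightly wider than the shutter stripes, because the display screen sits G inches farther from the observer than the shutter does.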
  • the stripes are animated in three phases.
  • the light-blocking shutter lets through only one third of each stripe.
  • the stripe pattern is shifted laterally.
  • the observer's left eye sees one entire image, and the observer's right eye sees a different entire image.
  • the use of three phases guarantees that there is room for error in the observer's lateral position [ Figures 2a, 2b, 2c].
  • the observer can be at a wide range of distances, since the stripe width can always be varied so as to equal E*G/D, as described above.
  • Figure 3a shows the observer relatively far;
  • Figure 3b shows the observer much closer.
  • Microstripe resolution puts a practical upper limit on the observer distance, since the stripes become narrower as the observer's distance to the screen increases.
  • This upper limit increases linearly both with the gap between the display and shutter, and with the shutter resolution. In practice, these have been set so as to be able to handle an observer up to about five feet away.
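The distance limit can be sketched as follows; the gap value and the one-microstripe minimum are assumptions, with the 20-per-inch microstripe pitch taken from the prototype described later:

```python
# Hypothetical sketch: the shutter stripe width E*G/D shrinks as D grows,
# so D is bounded by the narrowest stripe the shutter can represent.
def max_observer_distance(E=2.5, G=1.0, pitch=1/20, min_micro=1):
    # require E*G/D >= min_micro * pitch  =>  D <= E*G/(min_micro * pitch)
    return E * G / (min_micro * pitch)
```

Note how the limit scales linearly with both the gap G and the shutter resolution (1/pitch), as the text states.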
  • Figures 5a, 5b show how to construct a sequence of stripe positions from two eye positions (shown as a green and red dot, respectively) , a display surface (shown as the bottom of the two horizontal lines) and a shutter surface
  • the next such location x_{n+1} is given first by finding the location on the shutter f_p(x_n) in the line-of-sight from p, and then finding the corresponding location on the display screen 12 which is in the line-of-sight from q: x_{n+1} = A x_n + B, where
  • A = (1 − p_y⁻¹) / (1 − q_y⁻¹)
  • B = (p_x p_y⁻¹ − q_x q_y⁻¹) / (1 − q_y⁻¹)
  • the even terms locate the centers of those portions of the image visible from the right eye
  • the odd terms locate the centers of those portions of the image visible from the left eye.
  • the openings in the shutter are centered at
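The recurrence x_{n+1} = A x_n + B above can be iterated directly. In this sketch the display screen is taken to lie at y = 0 and the shutter plane at y = 1 (a normalization consistent with the A and B formulas, but assumed here), with eye positions p and q given as (x, y) pairs:

```python
# Sketch of the stripe-boundary construction: project x_n from eye p onto the
# shutter plane (y = 1), then back onto the display plane (y = 0) from eye q.

def stripe_sequence(p, q, x0=0.0, n=8):
    px, py = p
    qx, qy = q
    A = (1 - 1/py) / (1 - 1/qy)
    B = (px/py - qx/qy) / (1 - 1/qy)
    xs = [x0]
    for _ in range(n):
        xs.append(A * xs[-1] + B)    # x_{n+1} = A*x_n + B
    return xs
```

As the text describes, even-indexed terms then bound the image portions visible from the right eye and odd-indexed terms those visible from the left eye.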
  • a custom pi-cell liquid crystal 26 screen built to our specifications by [LXD: http://www.lxdinc.com/, incorporated by reference herein] was used, which was driven from power ICs mounted on a custom-made Printed Circuit Board (PCB).
  • PCB Printed Circuit Board
  • FPGA Field Programmable Gate Array
  • An eye tracker 18 locates the observer's eyes, and sends the information to the CPU.
  • the main CPU uses the eye tracker 18 info to render two scenes: one as seen from each eye.
  • the main CPU also uses the eye tracker 18 info to compute, for each of three phases, the proper left/right alternation pattern. These are interleaved into three successive time phases as red, green, and blue, respectively.
  • the main CPU also uses the eye info to compute the three phases of stripes on the light shutter. These are encoded into three one-dimensional bit-maps, each indicating an on-off pattern for the shutter micro-stripes at one of the three phases. These bit-maps are shipped to the FPGA.
  • the FPGA sends the three bit-patterns to the pi-cell light shutter in rotating sequence, every 1/180 second.
  • The timing for this is controlled by the DLP projector, which produces a signal every time its color wheel advances.
  • the DLP projector displays the three image phases in succession.
  • the color wheel on the projector is removed, so that each of the red, green, and blue components displays a gray scale image.
  • the FLC element is modulated by the FPGA to block the light from the DLP projector lens in a 180 Hz square wave pattern. This allows finer control over timing.
  • a rear projection screen 20 diffuses the image from DLP projector.
  • Steps (5) through (9) above are part of the "realtime subsystem", which is monitored continuously by the FPGA to synchronize all the events which must occur simultaneously 180 times per second.
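The rotating three-phase sequence can be illustrated in miniature. This is a software stand-in for the FPGA behavior described above, and the function and argument names are hypothetical:

```python
from itertools import cycle

# Each color-wheel tick (about 1/180 s) latches the next of the three shutter
# bit-patterns, cycling red-phase, green-phase, blue-phase indefinitely.
def run_realtime(phase_bitmaps, ticks):
    order = cycle(range(3))
    return [phase_bitmaps[next(order)] for _ in range(ticks)]
```

Over six ticks the three patterns each appear twice, in order, mirroring two full color-wheel revolutions.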
  • OpenGL is used to encode the red/green/blue sub-images which the DLP projector will turn into time sequential phases. To do this, first render separate left and right images in OpenGL, into off-screen buffers, as shown in Figures 6a, 6b.
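The interleaving step can be sketched with NumPy instead of OpenGL (an illustrative substitution; `masks[k]` is an assumed per-column boolean mask saying which display columns show the left image during phase k):

```python
import numpy as np

# Pack the left/right images into one RGB frame whose red, green and blue
# channels hold the three time phases that the DLP projector will sequence.
def interleave(left, right, masks):
    frame = np.empty(left.shape + (3,), dtype=left.dtype)
    for k in range(3):  # k = 0, 1, 2 -> red, green, blue phase
        frame[..., k] = np.where(masks[k][None, :], left, right)
    return frame
```

With the color wheel removed, each channel of this composite is displayed as a gray scale image in its own 1/180-second slot.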
  • the real-time subsystem maintains a more stringent schedule: a synchronous 180 Hz cycle.
  • the pattern on the light-shutter needs to switch at the same moment that the DLP projector begins its red, green, or blue component.
  • This timing task is handled by the FPGA, which reads a signal produced by the projector every time the color wheel cycles (about once every 1/180 second) and responds by cycling the light shutter pattern.
  • the FPGA modulates a ferro-electric optical switch which is mounted in front of the projector lens.
  • the main CPU is not involved at all in this finegrained timing.
  • the only tasks required of the CPU are to produce left/right images, to interleave them to create a red/green/blue composite, and to put the result into an onscreen frame buffer, ideally (but not critically) at 60 frames per second.
  • Figure 11 is a computer generated illustration showing the components of the apparatus. Each is described in some detail.
  • an ISA interface board was built with a non volatile Xilinx 95C108 PLD and a reconfigurable Xilinx XC4005E FPGA.
  • the PLD is used to generate the ISA Bus Chip Select signals and to reprogram the FPGA.
  • the XC4005E is large enough to contain six 256-bit Dual Ported RAMs (to double buffer the shutter masks needed for our three phases), the ISA Bus logic, and all the hardware needed to process the DLP signals and drive the pi-cell.
  • When loaded with the three desired patterns from the main CPU, this chip continually monitors the color wheel signals from the DLP projector. Each time it detects a change from red to green, green to blue, or blue to red, it sends the proper signals to the Supertex HV57708 high voltage serial-to-parallel converters mounted on the pi-cell, switching each of the light shutter's 256 microstripes on or off.
  • a standard twisted nematic liquid crystal 26 display (such as is widely used in notebook computers) does not have the switching speed needed, requiring about 20 msec to relax from its on state to its off state after charge has been removed. Instead, a pi-cell is used, which is a form of liquid crystal 26 material in which the crystals twist by 180° (hence the name) rather than the 90° twist used for twisted nematic LC displays.
  • Pi-cells have not been widely used partly because they tend to be bistable - they tend to snap to either one polarization or another. This makes it difficult to use them for gray scale modulation. On the other hand, they will relax after a charge has been removed far more rapidly than will twisted nematic - a pi-cell display can be driven to create a reasonable square wave at 200 Hz. This is precisely the characteristic needed - an on-off light blocking device that can be rapidly switched. Cost would be comparable to that of twisted nematic LC displays, if produced at comparable quantities.
  • Figures 12a and 12b, which are computer generated illustrations, show the pi-cell device that was manufactured by [LXD: http://www.lxdinc.com/, incorporated by reference herein].
  • the image to the left shows the size of the screen, the close-up image to the right shows the individual microstripes and edge connectors.
  • the active area is 14" x 12", and the microstripes run vertically, 20 per inch.
  • the microstripe density could easily have exceeded 100 per inch, but the density chosen required driving only 256 microstripes and was sufficient for a first prototype. Edge connectors for the even microstripes run along the bottom; edge connectors for the odd microstripes run along the top.
  • a ferro-electric liquid crystal 26 (FLC) will switch even faster than will a pi-cell, since it has a natural bias that allows it to be actively driven from the on-state to the off-state and back again.
  • a ferro-electric element can be switched in 70 microseconds.
  • ferro-electric elements are very delicate and expensive to manufacture at large scales, and would therefore be impractical to use as the light shutter. However, at small sizes they are quite practical and robust to work with.
  • a small ferro-electric switch was used over the projector lens, manufactured by Displaytech [Displaytech: http://www.displaytech.com/shutters.html, incorporated by reference herein], to provide a sharper cut-off between the three phases of the shutter sequence.
  • This element is periodically closed between the respective red, green, and blue phases of the DLP projector's cycle. While the FLC is closed, the pi-cell microstripe transitions (which require about 1.2 ms) are effected.
  • a system based on this principle sends a small infrared light from the direction of a camera during only the even video fields.
  • the difference image between the even and odd video fields will show only two glowing spots, locating the observer's left and right eyes, respectively.
  • the system is able to simultaneously capture two parallax displaced images of the glowing eye spots.
  • the lateral shift between the respective eye spots in these two images is measured, to calculate the distance of each eye.
  • the result is two (x,y,z) triplets, one for each eye, at every video frame.
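The distance calculation from the lateral shift is a standard stereo-disparity relation; the pinhole-camera model and the focal-length and baseline values below are assumptions, not the patent's calibration:

```python
# Depth of a glowing eye spot from the lateral shift (disparity) between the
# two parallax-displaced camera images: z = focal_length * baseline / disparity.
def eye_depth(disparity_px, focal_px=800.0, baseline_in=4.0):
    if disparity_px <= 0:
        raise ValueError("eye spot must appear shifted between the two images")
    return focal_px * baseline_in / disparity_px

eye_depth(80.0)  # -> 40.0 (inches)
```

Applying this to each of the two eye spots, together with the (x, y) position in the image, yields the two (x, y, z) triplets per video frame described above.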
  • a Kalman filter [M. Grewal, A. Andrews, Kalman Filtering: Theory and Practice, Prentice Hall, 1993, incorporated by reference herein] is used to smooth out these results and to interpolate eye position during the intermediate fields.
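A minimal constant-velocity Kalman filter in the spirit of the cited Grewal & Andrews text might smooth one tracked coordinate as below; this generic formulation and its noise parameters are assumptions, not the patent's implementation:

```python
import numpy as np

# Smooth one coordinate of the tracked eye position across 60 Hz video fields
# with a constant-velocity Kalman filter (state = [position, velocity]).
def kalman_1d(zs, dt=1/60, q=50.0, r=0.05):
    x = np.array([zs[0], 0.0])                          # initial state
    P = np.eye(2)                                       # state covariance
    F = np.array([[1.0, dt], [0.0, 1.0]])               # constant-velocity model
    H = np.array([[1.0, 0.0]])                          # we observe position only
    Q = q * np.array([[dt**3/3, dt**2/2], [dt**2/2, dt]])
    out = []
    for z in zs:
        x = F @ x                                       # predict
        P = F @ P @ F.T + Q
        y = z - (H @ x)[0]                              # innovation
        S = (H @ P @ H.T)[0, 0] + r
        K = (P @ H.T)[:, 0] / S                         # Kalman gain, shape (2,)
        x = x + K * y                                   # update
        P = (np.eye(2) - np.outer(K, H[0])) @ P
        out.append(x[0])
    return out
```

The same filter, run on each of the six tracked coordinates, also supplies interpolated positions between video fields via the predict step.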
  • a number of groups are planning commercial deployment of retroreflective-based tracking in some form, including IBM [M. Flickner: http://www.almaden.ibm.com/cs/blueeyes/find.html, incorporated by reference herein].
  • the user tracking provides a pair of 3D points, one for each eye.
  • this information is used in three ways: (i) each of these points is used by OpenGL as the eye point from which to render the virtual scene into an offscreen buffer; (ii) the proper succession of lateral locations for left/right image interleaving is calculated, which is used to convert the left/right offscreen images into the three temporally phased images; (iii) the proper positions for the light shutter transitions are calculated. This information is converted to three one-dimensional bitmaps, each indicating an on-off pattern for the shutter micro-stripes at one of the three phases.
  • This information is sent to the FPGA, which then sends the proper pattern to the light shutter every 1/180 second, synchronously with the three phases of the DLP projector.
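Encoding one phase's shutter openings into a microstripe bitmap might look like this sketch (the helper name and the opening-list representation are assumptions; the 20-per-inch pitch and 256-stripe count come from the prototype):

```python
# Turn one phase's shutter-opening centers (inches from the left edge) and a
# common opening width into a 256-entry on/off bitmap, one bit per microstripe.
def phase_bitmap(centers_in, width_in, n_stripes=256, pitch_in=1/20):
    bits = [0] * n_stripes
    for c in centers_in:
        lo = int((c - width_in / 2) / pitch_in)
        hi = int((c + width_in / 2) / pitch_in)
        for i in range(max(lo, 0), min(hi + 1, n_stripes)):
            bits[i] = 1            # this microstripe is clear in this phase
    return bits
```

Three such bitmaps, one per phase, form the payload the CPU ships to the FPGA.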
  • the goals of the present invention were (i) low latency and (ii) absence of artifacts.
  • the most important question to answer is: "does it work?" The answer is yes.
  • the experience is most compelling when objects appear to lie near the distance of the display screen 12, so that stereo disparity is reasonably close to focus (which is always in the plane of the projection screen) .
  • the experience is compelling; as an observer looks around an object, it appears to float within the viewing volume. The observer can look around the object, and can position himself or herself at various distances from the screen as well. Special eyewear is not required.
  • the software-implemented renderer did not achieve a consistent 60 frames per second, but rather something closer to 30 frames per second. In practice this meant that if the observer darted his/her head about too quickly, the tracker could not properly feed the display subsystem.
  • This display platform can be used for teleconferencing. With a truly non-invasive stereoscopic display, two people having a video conversation can perceive the other as though looking across a table. Each person's image is transmitted to the other via a video camera that also captures depth [T. Kanade, et al . Development of a Video Rate Stereo Machine. Proc. of International Robotics and Systems Conference (IROS-95) , Pittsburgh, PA, August 7-9, 1995, incorporated by reference herein] . At the recipient end, movements of the observer's head are tracked, and the transmitted depth-enhanced image is interpolated to create a proper view from the observer's left and right eyes, as in [S. Chen and L. Williams. View Interpolation for Image Synthesis. Computer Graphics (SIGGRAPH 93 Conference Proc.) p.279-288, incorporated by reference herein] . Head movements by each participant reinforce the sense of presence and solidity of the other, and proper eye contact is always maintained.
  • An API for game developers could be implemented so that two-person games can use the two-view hardware support already provided on accelerator boards to accelerate the left and right views in the display simultaneously. Variants of this system for two observers are also possible.
  • Figure 13 shows two cameras with active IR illumination used to detect a "red-eye" image and subtract it from a "normal" image.
  • IR polarizers separate the optical illumination paths of the two cameras, making the system far less prone to errors in stereo mode.
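The red-eye pupil detection described in the last two bullets can be sketched as a simple frame difference: the camera frame captured with on-axis IR illumination (where pupils glow) minus the frame captured with off-axis illumination, thresholded so that only the pupil highlights remain. This is a minimal sketch of the idea, not the patent's implementation; the function name `pupil_mask` and the `threshold` value are illustrative assumptions.

```python
import numpy as np

def pupil_mask(red_eye_frame, normal_frame, threshold=40):
    """Locate bright-pupil highlights by frame subtraction.

    `red_eye_frame` is captured with on-axis IR illumination (pupils glow);
    `normal_frame` is captured with off-axis illumination.  Everything but
    the pupils looks nearly identical in both frames, so thresholding the
    difference isolates candidate pupil pixels.
    """
    # Widen to a signed type so the subtraction cannot wrap around.
    diff = red_eye_frame.astype(np.int16) - normal_frame.astype(np.int16)
    return diff > threshold  # boolean mask of candidate pupil pixels
```

In the two-camera arrangement of Figure 13, a mask like this from each camera would give two pupil positions per eye, from which the eye positions fed to the tracker can be triangulated.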

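The depth-based view interpolation mentioned in the teleconferencing bullet above can be sketched as follows. This is a minimal sketch, not the Chen-Williams algorithm itself: it reprojects a single scanline by shifting each pixel in proportion to `eye_offset / depth`, and the names `reproject_row` and `disparity_scale` are illustrative assumptions.

```python
import numpy as np

def reproject_row(colors, depths, eye_offset, disparity_scale=8.0):
    """Synthesize one scanline as seen from a horizontally shifted eye.

    Each pixel is shifted by eye_offset * disparity_scale / depth, a crude
    stand-in for full view interpolation.  Holes left by disocclusion are
    filled with the nearest valid color to the left.
    """
    width = len(colors)
    out = np.zeros(width, dtype=colors.dtype)
    filled = np.zeros(width, dtype=bool)
    # Paint far-to-near so nearer pixels overwrite farther ones.
    for x in np.argsort(-depths):
        shift = int(round(eye_offset * disparity_scale / depths[x]))
        nx = x + shift
        if 0 <= nx < width:
            out[nx] = colors[x]
            filled[nx] = True
    # Simple hole filling: carry the last filled value rightward.
    last = colors[0]
    for x in range(width):
        if filled[x]:
            last = out[x]
        else:
            out[x] = last
    return out
```

Running this twice, once with the left-eye offset and once with the right-eye offset derived from the tracked head position, yields the two views the display then interleaves.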
Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The invention concerns an apparatus for displaying an image (Fig. 8) to an observer. The apparatus comprises a display screen (20) on which image stripes appear in at least three distinct phases. It also includes a light shutter (14) placed in front of the display screen, forming a stripe pattern through which only 1/3 of each image stripe on the display screen is visible during each of said distinct phases, and a computer (16) connected to the display screen and the light shutter. The computer changes the phases so that in each phase the stripe pattern is shifted laterally, in order to match two 3D scenes to the observer's eyes; it produces a distinct left/right orientation pattern for each of the three phases and interleaves the left/right orientations in three successive time phases as red, green and blue, respectively. Finally, the apparatus includes an eye tracker (18) for identifying the positions of the observer's eyes and transmitting those positions to the computer. The invention also concerns a method for displaying the image to an observer.
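The three-phase shutter scheduling described in the abstract can be sketched as follows. Assuming each shutter stripe is three columns wide with exactly one transparent column, the open column shifts laterally by one position per phase, so every display column is revealed in exactly one of the three phases. The function name and parameters are illustrative, not taken from the patent.

```python
def shutter_open_columns(num_columns, phase, stripe_width=3):
    """Columns of the light shutter that are transparent in a given phase.

    One column per stripe of `stripe_width` columns is open; advancing the
    phase shifts the open column laterally by one position, so three
    successive phases together reveal every column exactly once.
    """
    return [c for c in range(num_columns)
            if c % stripe_width == phase % stripe_width]
```

Over phases 0, 1 and 2 the open sets are disjoint and jointly cover all columns, which is why cycling the three phases at the display's field rate shows the full image while the display pixels behind each open slit carry the left- or right-eye view aimed at the tracked eye positions.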
PCT/US2001/022961 2000-07-21 2001-07-20 Affichage autostereoscopique WO2002009442A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2001277939A AU2001277939A1 (en) 2000-07-21 2001-07-20 Autostereoscopic display

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US21984500P 2000-07-21 2000-07-21
US60/219,845 2000-07-21

Publications (1)

Publication Number Publication Date
WO2002009442A1 true WO2002009442A1 (fr) 2002-01-31

Family

ID=22821012

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/022961 WO2002009442A1 (fr) 2000-07-21 2001-07-20 Affichage autostereoscopique

Country Status (2)

Country Link
AU (1) AU2001277939A1 (fr)
WO (1) WO2002009442A1 (fr)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5959664A (en) * 1994-12-29 1999-09-28 Sharp Kabushiki Kaisha Observer tracking autostereoscopic display and method of tracking an observer
US5991073A (en) * 1996-01-26 1999-11-23 Sharp Kabushiki Kaisha Autostereoscopic display including a viewing window that may receive black view data


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005106563A2 (fr) * 2004-05-05 2005-11-10 Spielberger Juergen Systeme pour observer des images stereoscopiques
WO2005106563A3 (fr) * 2004-05-05 2006-03-16 Juergen Spielberger Systeme pour observer des images stereoscopiques
JP2006154809A (ja) * 2004-11-30 2006-06-15 Samsung Sdi Co Ltd バリア装置,立体映像表示装置及びその駆動方法
EP1662808B1 (fr) * 2004-11-30 2012-12-19 Samsung Display Co., Ltd. Dispositif d'affichage d'images tridimensionnelles utilisant des bandes parallaxes générées électriquement avec unité d'affichage pouvant être pivotée
US8373617B2 (en) 2004-11-30 2013-02-12 Samsung Display Co., Ltd. Barrier device and stereoscopic image display using the same
CN1845612B (zh) * 2005-04-08 2010-05-12 三星电子株式会社 使用混合位置跟踪系统的三维显示装置和方法
WO2007136348A1 (fr) * 2006-05-18 2007-11-29 Bracco Imaging S.P.A. Procédés et appareils pour un affichage stéréographique
WO2009034196A1 (fr) * 2007-09-14 2009-03-19 Cash Out Système de visualisation scénique
ES2365086A1 (es) * 2007-09-14 2011-09-22 Cash Out Sistema de visualización escénica.
WO2009036758A1 (fr) * 2007-09-20 2009-03-26 Visumotion Gmbh Procédé pour raccourcir ou allonger la distance d'observation entre des observateurs et un dispositif afin d'obtenir une représentation perceptible en trois dimensions

Also Published As

Publication number Publication date
AU2001277939A1 (en) 2002-02-05

Similar Documents

Publication Publication Date Title
US7239293B2 (en) Autostereoscopic display
Perlin et al. An autostereoscopic display
US20080024598A1 (en) Autostereoscopic display
Peterka et al. Advances in the dynallax solid-state dynamic parallax barrier autostereoscopic visualization display system
EP0744872B1 (fr) Méthode et dispositif d'affichage d'images stéréoscopiques
US8179424B2 (en) 3D display method and apparatus
US7190518B1 (en) Systems for and methods of three dimensional viewing
CA2236329C (fr) Systeme et procede de dessin tridimensionnel
US20050275942A1 (en) Method and apparatus to retrofit a display device for autostereoscopic display of interactive computer graphics
US6239830B1 (en) Displayer and method for displaying
WO1996022660A1 (fr) Systeme intelligent et procede associe de creation et presentation d'images stereo multiplexees dans des environnements de realite virtuelle
WO1997019423A9 (fr) Systeme et procede de dessin tridimensionnel
McAllister Display technology: stereo & 3D display technologies
US6674463B1 (en) Technique for autostereoscopic image, film and television acquisition and display by multi-aperture multiplexing
JP2000115812A (ja) 三次元表示方法および装置
US6061084A (en) Displayer and a method for displaying
Dodgson Autostereo displays: 3D without glasses
WO2002009442A1 (fr) Affichage autostereoscopique
Peterka et al. Dynallax: solid state dynamic parallax barrier autostereoscopic VR display
EP2244170A1 (fr) Dispositif tactile d'imagerie stéréo
Surman et al. Glasses-free 3-D and augmented reality display advances: from theory to implementation
WO2009040717A2 (fr) Dispositif d'affichage en 3d modulaire et son procédé de commande
US11595628B2 (en) Projection system and method for three-dimensional images
Lipton Selection devices for field-sequential stereoscopic displays: a brief history
Surman et al. Towards the reality of 3D imaging and display

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP