WO2006070499A1 - Stereoscopic Image Display Method - Google Patents
Stereoscopic image display method
- Publication number
- WO2006070499A1 (PCT/JP2005/011738)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- dimensional
- light
- display
- image
- emitting element
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/26—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
- G02B30/27—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/26—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
- G02B30/30—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving parallax barriers
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/50—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
- G02B30/54—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels the 3D volume being generated by moving a 2D surface, e.g. by vibrating or rotating the 2D surface
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/02—Stereoscopic photography by sequential recording
- G03B35/04—Stereoscopic photography by sequential recording with movement of beam-selecting members in a system defining two or more viewpoints
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/18—Stereoscopic photography by simultaneous viewing
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B37/00—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
- H04N13/31—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/388—Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
- H04N13/393—Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume the volume being generated by a moving, e.g. vibrating or rotating, surface
Definitions
- the present invention relates to a stereoscopic image display method that does not require wearing of stereoscopic glasses.
- Non-Patent Documents 1, 2, and 3 propose a three-dimensional display device with a structure in which a one-dimensional light-source array aligned in the vertical direction is rotated inside a cylindrical parallax barrier.
- This device has the feature that, by rotating the cylindrical parallax barrier in the direction opposite to the light-source array, an image can be displayed with a finer parallax interval than in the prior art.
- A stereoscopic image can actually be displayed using this device and observed from the entire circumference.
- Patent Document 1 also proposes a stereoscopic display device using a circularly scanned light-emitting array and a parallax barrier.
- Patent Document 2 discloses that a plurality of pixels form a cylindrical display surface, and each pixel can emit light of different colors and brightness depending on the angle in the horizontal direction.
- A stereoscopic image display device is shown that uses a three-dimensional display device configured as described above to present, to a person outside the display surface, a stereoscopic image that can be viewed with the naked eye.
- Non-Patent Document 1: Tomohiro Kusumi and Makoto Sato, "Real-time 3D display by rotating 1D light source array", Journal of the Institute of Image Information and Television Engineers, Vol. 53, No. 3, pp. 399-404 (1999)
- Non-Patent Document 2: Tomohiro Todo, Yoshihiro Kashiwagi, Ikuo Honda, Makoto Sato, "A 3D Video Display", 3D Image Conference '99 Proceedings, pp. 99-104 (1999)
- Non-Patent Document 3: Tomohiro Todo, Yoshihiro Kashiwagi, Ikuo Honda, Makoto Sato, "Two-Dimensional 3D Display", IEICE Transactions D-II, Vol. J84-D-II, No. 6, pp. 1003-1011 (2001)
- Non-Patent Document 4: Tomohiro Kusudo and Ikuo Honda, "All-around 3D Display Color Video Display System", 3D Image Conference 2002, pp. 89-92 (2002)
- Patent Document 1: Japanese Patent Application Laid-Open No. 2003-195214 (Applicant: Seiko Epson Corporation)
- Patent Document 2: JP-A-10-97013 (Applicant: Futaba Corporation)
Disclosure of the Invention
- The data given to the 3D display device is three-dimensional data that determines the color and brightness of the emitted light for three parameters: the two parameters that specify a pixel position, plus the horizontal-angle parameter giving the direction in which the light is emitted. Generating this type of data so as to achieve the desired image display is called rendering here.
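The three-parameter data described above can be pictured as an array indexed by the two pixel coordinates plus the discretized horizontal emission angle. The sketch below illustrates this; the dimensions and names are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Hypothetical dimensions: l pixel columns around the cylinder,
# m pixel rows along its axis, and a discretized horizontal emission
# angles at which each pixel can show an independent colour.
l, m, a = 128, 96, 360

# One RGB colour per (column, row, emission angle): the three-parameter
# data that rendering must fill in.
display_data = np.zeros((l, m, a, 3), dtype=np.uint8)

# Rendering assigns a colour to pixel P(j, k) for emission-angle index t.
j, k, t = 10, 20, 45
display_data[j, k, t] = (255, 128, 0)
```

Each viewpoint position corresponds to one slice of emission-angle indices, which is why the patent's rendering loops over viewpoints and visible pixels.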
- An object of the present invention is to provide a stereoscopic image display method capable of performing stereoscopic display of a live-action image and a live-action simulated image.
- Another object of the present invention is to provide a stereoscopic image display method capable of supporting real-time display of a photographed image and a photographed simulated image.
- the source of the image information is a plurality of two-dimensional images having parallax (also referred to as parallax images).
- When these are displayed as a stereoscopic image, it is necessary to realize a state in which one of the plurality of two-dimensional images can be selectively viewed according to the position of the viewpoint. This is the same as in conventional multi-view stereoscopic images such as the lenticular method.
- a plurality of pixels form a cylindrical display surface, and each pixel can be configured to emit light of different colors and brightness depending on the angle in the horizontal direction.
- a stereoscopic image that can be viewed with the naked eye is presented to a person outside the display surface.
- As the three-dimensional display device, for example, the above-described three-dimensional display device developed by the inventors can be used.
- a plurality of one-dimensional light-emitting element arrays each having a plurality of light-emitting elements arranged in rows are arranged at predetermined intervals in the circumferential direction.
- Outside the light-emitting element array structure is a light-shielding portion structure in which a plurality of light-shielding portions are arranged at predetermined intervals in the circumferential direction. Under the condition that the rotation speed of the light-emitting element array structure is slower than the rotation speed of the light-shielding portion structure, the two structures are rotated in opposite directions.
- A plurality of pixels are thereby formed in the space between the light-emitting element array structure and the light-shielding portion structure.
- the light emitting element array structure and the light shielding part structure may be rotated in the same direction.
- they may be rotated while maintaining a constant speed ratio.
- If, instead of a light-emitting element array structure composed of a plurality of one-dimensional light-emitting element arrays, a structure in which a plurality of light-emitting elements are arranged two-dimensionally on a cylindrical surface is used, the light-emitting element array structure may be fixed and only the light-shielding portion structure rotated.
- The method of the present invention is not limited to the above three-dimensional display device and can naturally be applied when another known three-dimensional display device, such as that of Japanese Patent Application Laid-Open No. 10-097013, is used.
- For a subject to be displayed as a stereoscopic image, one subject center point is defined, and the subject is photographed with a photographing device over the entire circumference around that point; or
- a plurality of two-dimensional pseudo images equivalent to the plurality of two-dimensional images that would be obtained in this way are created using computer graphics technology. These are acquired as the plurality of two-dimensional images (first step).
- The acquired two-dimensional images are stored in memory when the data processing is performed later; they need not be stored in memory when the data is processed in real time.
- The subject center point is the single origin for measuring distances on the subject side when, for example, the subject is photographed with a photographing device from the same distance over the entire circumference.
- one 2D image is selected from the plurality of 2D images.
- From the viewpoint position corresponding to the selected 2D image, one pixel that can be seen is selected from among the plurality of pixels (second step).
- the viewpoint position is a position corresponding to the lens principal point of the photographing device (camera) that captured the selected two-dimensional image when the subject center point is made to correspond to the cylinder center of the display surface.
- A virtual plane is assumed on which the selected two-dimensional image is pasted. The virtual plane is arranged so that the image center point corresponding to the subject center point in the two-dimensional image coincides with the center of the cylindrical display surface, and
- so that the angle formed between the virtual plane and the straight line connecting the viewpoint position and the center of the display surface
- matches the angle formed between the imaging surface of the camera and the straight line connecting the subject center point and the lens principal point of the camera (third step).
- The image center point is the point at which the subject center point appears on the 2D image.
- From the viewpoint position, a virtual extension line extending through the selected pixel to the virtual plane is assumed (fourth step).
- The display color of the one pixel in the direction from that pixel toward the viewpoint position is determined based on the color on the two-dimensional image assumed to be pasted on the virtual plane at the intersection of this virtual extension line and the virtual plane (fifth step). In the simplest case, the color on the two-dimensional image at that intersection may simply be determined as the display color of the one pixel in the direction toward the viewpoint position. A problem occurs, however, when the two-dimensional image assumed to be pasted on the virtual plane contains components higher than the maximum spatial frequency that the display device can display.
- In that case, it is preferable to determine the display color of the one pixel by taking the color on the two-dimensional image assumed to be pasted on the virtual plane at the intersection of the virtual extension line and the virtual plane, together with the colors on that image corresponding to a plurality of points around the intersection, and performing a weighted-average calculation according to the distances between the intersection and those points.
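The weighted-average step can be sketched as inverse-distance weighting of the sampled colours. This is an illustrative interpretation, not code from the patent; the function name and sample format are invented for the sketch.

```python
def weighted_average_color(samples):
    """Blend (distance, (r, g, b)) samples with weights proportional to 1/distance.

    `samples` holds colours read from the pasted 2-D image at points
    around the intersection, paired with each point's distance from
    the intersection.  A (near-)zero distance returns that colour as-is.
    """
    eps = 1e-9
    total_weight = 0.0
    acc = [0.0, 0.0, 0.0]
    for dist, color in samples:
        if dist < eps:
            return color  # sample sits exactly on the intersection
        w = 1.0 / dist
        total_weight += w
        for c in range(3):
            acc[c] += w * color[c]
    return tuple(ch / total_weight for ch in acc)

# Two equally distant samples blend to their midpoint colour.
blended = weighted_average_color([(1.0, (100, 0, 0)), (1.0, (200, 0, 0))])
```

Any low-pass filter over the pasted image (e.g. bilinear sampling) would serve the same anti-aliasing purpose; inverse-distance weighting is just one concrete choice.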
- The maximum spatial frequency that the display device can display is determined by the spacing between two adjacent pixels and by the discretization interval of the horizontal angle at which one pixel can independently control color and brightness.
- the second to fifth steps are executed for a plurality of pixels visible from one viewpoint position to determine display colors for the plurality of pixels (sixth step). Then, the second to sixth steps are executed for all of the corresponding viewpoint positions for all of the plurality of two-dimensional images (seventh step).
- The 3D display device is controlled so that the display colors determined by the first to seventh steps are obtained, i.e., so that the color of the light emitted by the plurality of pixels changes according to the horizontal angle at which the light is emitted (eighth step).
- A two-dimensional image corresponding to each viewpoint position can be displayed independently using a three-dimensional display device in which pixels are two-dimensionally arranged on a virtual cylindrical display surface, so that stereoscopic display of a live-action image or a simulated live-action image can be executed.
- The display of a photographed image can also be performed in real time.
- As the plurality of light-emitting elements included in the light-emitting element array, light-emitting diodes, laser diodes, organic EL elements, plasma display elements, FEDs, SEDs, and CRTs can be used, as can spatial light modulators such as liquid crystal display elements and DMD elements combined with a suitable light source.
- FIG. 1 is a diagram used for explaining a conventional method.
- FIG. 2 is a diagram showing the basic structure of a three-dimensional display device used in the embodiment of the present invention.
- FIG. 3 is a diagram used for explaining the principle of the method of the present invention.
- FIG. 4 is a diagram showing a specific method of identifying pixels used in explaining the principle of the method of the present invention with reference to FIG.
- FIG. 5 is a diagram used as an auxiliary when explaining the principle of the method of the present invention with reference to FIG. 3.
- FIG. 6 is a flowchart showing the contents of an example of a software algorithm used when steps 2 to 8 of the method according to the embodiment of the present invention are realized using a computer.
- FIG. 2 shows a basic structure of the three-dimensional display device 1 used in the embodiment of the present invention.
- the three-dimensional display device 1 includes a composite rotating structure 3.
- the composite rotating structure 3 has a structure including a light emitting element array structure 5 and a light shielding portion structure 7.
- the light emitting element array structure 5 has a structure in which a plurality of one-dimensional light emitting element arrays 9 are arranged at predetermined intervals in the circumferential direction.
- The one-dimensional light-emitting element array 9 has a structure in which a plurality of light-emitting diodes LED are attached to a support member so as to form a vertical line.
- The plurality of one-dimensional light-emitting element arrays 9 constituting the light-emitting element array structure 5 each have single-color light-emitting diodes arranged in the vertical direction; the three types (red, green, and blue) of one-dimensional light-emitting element arrays are arranged repeatedly in the circumferential direction to form the light-emitting element array structure 5.
- Each one-dimensional light-emitting element array may instead have a structure in which a plurality of light-emitting elements, each containing the three color emitters in one package, are arranged vertically.
- The plurality of one-dimensional light-emitting element arrays 9 are connected by thin ring-shaped connecting frames (not shown) arranged at the upper and lower ends of each array.
- The light-shielding portion structure 7, called a parallax barrier, is arranged outside the light-emitting element array structure 5. It has a plurality of slits 10 for presenting a stereoscopic image to a plurality of people outside the three-dimensional display device 1, and a plurality of light-shielding portions 8 arranged at predetermined intervals in the circumferential direction.
- The light-emitting element array structure 5 and the light-shielding portion structure 7 rotate with a constant speed ratio relative to each other.
- As long as they rotate while maintaining a constant speed ratio, they may rotate in opposite directions as shown in FIG. 2, or in the same direction.
- a description of the drive structure for rotating the light emitting element array structure 5 and the light shielding portion structure 7 is omitted.
- Both the light-shielding portion structure 7 (parallax barrier) and the light-emitting element array structure 5 inside it rotate.
- Their directions of rotation are opposite.
- The relative position between the two changes rapidly because both the light-emitting element array structure 5 and the light-shielding portion structure 7 (parallax barrier) rotate.
- The direction of the thin light beam exiting through a slit 10 of the light-shielding portion structure 7 is thereby scanned, and light rays are reproduced by time division by changing the luminance of the light-emitting diodes LED constituting the one-dimensional light-emitting element arrays in synchronization with this scan.
- Specifically, the light-shielding portion structure (parallax barrier) 7 rotates at high speed, for example 1800 rpm in a prototype.
- The light-emitting element array structure 5 rotates at about 100 rpm in the prototype.
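A back-of-the-envelope figure for the resulting beam scan rate follows from the prototype speeds quoted above, under the assumption (suggested by the counter-rotating arrangement, not stated in the text) that the slit sweeps past the array at the sum of the two speeds:

```python
# Prototype rotation speeds quoted in the text.
barrier_rpm = 1800  # light-shielding portion structure (parallax barrier)
array_rpm = 100     # light-emitting element array structure

# For counter-rotation, the relative speed is the sum of the two
# (an assumption of this sketch).
relative_rpm = barrier_rpm + array_rpm
relative_deg_per_sec = relative_rpm * 360 / 60

print(relative_deg_per_sec)  # prints 11400.0
```

This illustrates why the LED luminance must be modulated very quickly in synchronization with the slit positions.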
- The three-dimensional display device 1 is used to perform stereoscopic display as follows.
- a rendering method for carrying out the method of the present invention will be described with reference to FIGS.
- In the 3D display device 1, it is assumed that l pixels in the circumferential direction and m pixels in the axial direction are arranged two-dimensionally on a cylindrical surface (the cylindrical display surface), and each pixel is expressed as P(j, k).
- The first subscript j of P represents the position in the circumferential direction,
- and the second subscript k represents the position in the axial direction of the cylinder. All pixels are therefore represented by P(1, 1) through P(l, m).
- A subject to be displayed as a three-dimensional image is determined; one subject center point is defined, and the subject is photographed with a photographing device (camera) from the outside over the entire circumference centered on that point. Alternatively, a plurality of two-dimensional pseudo images, equivalent to the plurality of two-dimensional images that could be obtained by photographing the subject over the entire circumference around that point, are created using computer graphics technology. These are acquired as the plurality of two-dimensional images (first step). The plurality of two-dimensional images are stored in a memory as image data that can be processed by a computer. If the 2D images are live-action images, the input to the 3D display device 1 is the n images taken by the camera.
- shooting may be performed with a single camera or a plurality of cameras.
- The type of camera is arbitrary; taking the later data processing into account, it is preferable to shoot with a digital camera.
- One two-dimensional image I(i) is selected from the plurality of stored two-dimensional images [I(1) to I(n)]. From the viewpoint position V(i) corresponding to I(i), one pixel P(j, k) that can be seen is selected from among the plurality of pixels (second step).
- The viewpoint position V(i) is the position corresponding to the lens principal point of the camera that captured the two-dimensional image I(i) when the subject center point is made to correspond to the cylinder center O. If the magnification differs between when the 2D image was actually taken and when the 3D image is displayed, the position of V(i) can be changed according to the magnification.
- The virtual plane B(i) is arranged so that the image center point corresponding to the subject center point in the 2D image I(i) coincides with the center of the cylindrical display surface, i.e., the cylinder center O, and
- so that the angle formed between the virtual plane B(i) and the straight line connecting the viewpoint position V(i) and the cylinder center O
- coincides with the angle formed between the imaging plane of the camera and the straight line connecting the subject center point and the camera lens principal point (third step).
- Next, a virtual extension line PL extending from the viewpoint position V(i) through the selected pixel P(j, k) to the virtual plane B(i) is assumed (fourth step). These assumptions are actually carried out as data operations on a computer.
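In the horizontal plane, the fourth step reduces to intersecting a ray with the (flat) virtual plane B(i). The sketch below uses invented names and 2-D coordinates to illustrate the geometry; it assumes B(i) is flat, as in the basic case described above.

```python
def extend_to_plane(viewpoint, pixel, plane_point, plane_normal):
    """Intersect the ray from `viewpoint` through `pixel` with a plane.

    The ray models the virtual extension line PL of the fourth step;
    the plane models the virtual plane B(i).  All arguments are 2-D
    (x, y) tuples in the horizontal plane; names are illustrative.
    Returns the intersection point, or None if the ray is parallel
    to the plane.
    """
    dx = pixel[0] - viewpoint[0]
    dy = pixel[1] - viewpoint[1]
    denom = dx * plane_normal[0] + dy * plane_normal[1]
    if abs(denom) < 1e-12:
        return None  # ray never meets the plane
    # Solve (V + t*d - plane_point) . n == 0 for t.
    t = ((plane_point[0] - viewpoint[0]) * plane_normal[0]
         + (plane_point[1] - viewpoint[1]) * plane_normal[1]) / denom
    return (viewpoint[0] + t * dx, viewpoint[1] + t * dy)

# Viewpoint on the -x axis, pixel on the display cylinder, plane at x = 2.
hit = extend_to_plane((-3.0, 0.0), (-1.0, 0.5), (2.0, 0.0), (1.0, 0.0))
```

The colour of the pasted 2-D image is then read at the returned intersection point (fifth step).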
- Depending on the shooting conditions, the virtual plane B(i) may instead have a shape such as a curved surface; in that case, B(i) is arranged at an appropriate position with respect to the viewpoint position V(i) according to the shooting conditions.
- The maximum spatial frequency that the display device can display is determined from the pixel interval and from the discretization interval of the horizontal angle at which one pixel can independently control color and brightness.
- The second to fifth steps described above are executed for the plurality of pixels visible from one viewpoint position V(i), determining the display colors of those pixels (sixth step). The second to sixth steps are then executed for all the viewpoint positions V(i) corresponding to all the two-dimensional images (seventh step), so that display colors are determined for all of the pixels as viewed from all of the viewpoint positions.
- The light-emission timings of the plurality of light-emitting elements LED included in the one-dimensional light-emitting element arrays 9 are controlled so that, when the display surface is viewed, the display colors determined in the first to seventh steps are obtained (eighth step).
- FIG. 6 is a flowchart showing the contents of an example of a software algorithm used when the above steps 2 to 8 are implemented using a computer.
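The loop structure that the flowchart of FIG. 6 formalizes (steps 2 to 7) can be sketched as nested iteration over viewpoints and visible pixels. Every helper below is a placeholder standing in for an operation described in the text; none of these names come from the patent.

```python
def render(images, visible_pixels, sample_color):
    """Fill the per-pixel, per-direction colour table (steps 2 to 7).

    `images` maps viewpoint index i to its 2-D image; `visible_pixels(i)`
    yields the pixels seen from viewpoint V(i); `sample_color(i, pixel)`
    stands in for steps 3-5: placing the virtual plane B(i), extending
    the line PL through the pixel, and reading the colour at the
    intersection.  All three are assumed callables, not patent code.
    """
    colors = {}
    for i in images:                      # seventh step: every 2-D image
        for pixel in visible_pixels(i):   # sixth step: every visible pixel
            # second to fifth steps: one pixel seen from one viewpoint
            colors[(pixel, i)] = sample_color(i, pixel)
    return colors

# Toy run: 2 viewpoints, 3 pixels visible from each, a constant
# colour per viewpoint in place of real image sampling.
table = render(
    images={0: "img0", 1: "img1"},
    visible_pixels=lambda i: [(j, 0) for j in range(3)],
    sample_color=lambda i, p: (i * 100, 0, 0),
)
```

The eighth step would then translate this table into LED emission timings synchronized with the rotation.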
- As described above, a plurality of pixels form a cylindrical display surface, with each pixel configured to emit light of different colors and brightness depending on the angle in the horizontal direction,
- so that stereoscopic display of a live-action image or a simulated live-action image can be realized in a form visible to the naked eye from outside the display over the entire circumference.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Optics & Photonics (AREA)
- Computer Graphics (AREA)
- Theoretical Computer Science (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Stereoscopic And Panoramic Photography (AREA)
- Processing Or Creating Images (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/722,846 US7961182B2 (en) | 2004-12-28 | 2005-06-27 | Stereoscopic image display method |
CA2599833A CA2599833C (en) | 2004-12-28 | 2005-06-27 | Stereoscopic image display method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004381985A JP4033859B2 (ja) | 2004-12-28 | 2004-12-28 | 立体画像表示方法 |
JP2004-381985 | 2004-12-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006070499A1 true WO2006070499A1 (ja) | 2006-07-06 |
Family
ID=36614626
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/011738 WO2006070499A1 (ja) | 2004-12-28 | 2005-06-27 | 立体画像表示方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US7961182B2 (ja) |
JP (1) | JP4033859B2 (ja) |
CA (1) | CA2599833C (ja) |
WO (1) | WO2006070499A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010137646A1 (ja) * | 2009-05-29 | 2010-12-02 | 独立行政法人科学技術振興機構 | スリット視を利用した3次元情報提示装置 |
Families Citing this family (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005117541A2 (en) * | 2004-05-06 | 2005-12-15 | The Regents Of The University Of California | Method and system for aligning and classifying images |
US20090002289A1 (en) * | 2007-06-28 | 2009-01-01 | Boundary Net, Incorporated | Composite display |
US20090323341A1 (en) * | 2007-06-28 | 2009-12-31 | Boundary Net, Incorporated | Convective cooling based lighting fixtures |
DE102007045834B4 (de) * | 2007-09-25 | 2012-01-26 | Metaio Gmbh | Verfahren und Vorrichtung zum Darstellen eines virtuellen Objekts in einer realen Umgebung |
JP5083052B2 (ja) * | 2008-06-06 | 2012-11-28 | ソニー株式会社 | 立体視画像生成装置、立体視画像生成方法およびプログラム |
US20100019993A1 (en) * | 2008-07-23 | 2010-01-28 | Boundary Net, Incorporated | Calibrating pixel elements |
US20100019997A1 (en) * | 2008-07-23 | 2010-01-28 | Boundary Net, Incorporated | Calibrating pixel elements |
US20100020107A1 (en) * | 2008-07-23 | 2010-01-28 | Boundary Net, Incorporated | Calibrating pixel elements |
US8933998B2 (en) * | 2008-12-12 | 2015-01-13 | Sony Corporation | Three-dimensional image display device, method of manufacturing the same, and three-dimensional image display method |
TWI428631B (zh) * | 2008-12-12 | 2014-03-01 | Sony Corp | Dimensional image display device, a manufacturing method thereof, and a stereoscopic image display method |
EP2430112B1 (en) | 2009-04-23 | 2018-09-12 | The University of Chicago | Materials and methods for the preparation of nanocomposites |
JP2011082622A (ja) * | 2009-10-02 | 2011-04-21 | Sony Corp | 画像信号処理装置、画像信号処理方法、画像表示装置、画像表示方法、プログラム、および画像表示システム |
JP2011082675A (ja) * | 2009-10-05 | 2011-04-21 | Sony Corp | 画像信号処理装置、画像信号処理方法、画像表示装置、画像表示方法、プログラム、および画像表示システム |
US8587498B2 (en) * | 2010-03-01 | 2013-11-19 | Holovisions LLC | 3D image display with binocular disparity and motion parallax |
JP2011259373A (ja) * | 2010-06-11 | 2011-12-22 | Sony Corp | 立体画像表示装置及び立体画像表示方法 |
JP5398015B2 (ja) * | 2010-06-29 | 2014-01-29 | 独立行政法人情報通信研究機構 | 立体ディスプレイおよび立体画像提示方法 |
JP5593911B2 (ja) * | 2010-07-20 | 2014-09-24 | 富士通株式会社 | 映像生成装置、映像生成プログラム及び映像生成方法 |
US20120162216A1 (en) * | 2010-12-22 | 2012-06-28 | Electronics And Telecommunications Research Institute | Cylindrical three-dimensional image display apparatus and method |
JP2013017006A (ja) * | 2011-07-04 | 2013-01-24 | Sony Corp | 表示装置 |
WO2013168503A1 (ja) * | 2012-05-07 | 2013-11-14 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
BE1019941A3 (nl) * | 2012-06-05 | 2013-02-05 | Tait Technologies Bvba | Inrichting voor de weergave van driedimensionale beelden, systeem voor de creatie van driedimensionale beelden, en werkwijze voor de creatie van driedimensionale beelden. |
WO2014144989A1 (en) | 2013-03-15 | 2014-09-18 | Ostendo Technologies, Inc. | 3d light field displays and methods with improved viewing angle depth and resolution |
KR102128395B1 (ko) | 2014-02-06 | 2020-07-01 | 삼성디스플레이 주식회사 | 라이트 유닛 및 이를 포함하는 표시 장치 |
DE102015012662A1 (de) * | 2015-10-01 | 2017-04-06 | Martin Mang | Bildanzeigevorrichtung |
WO2018116580A1 (ja) | 2016-12-19 | 2018-06-28 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
EP3704531B1 (en) | 2017-11-02 | 2023-12-06 | InterDigital Madison Patent Holdings, SAS | Method and system for aperture expansion in light field displays |
KR20210019998A (ko) | 2018-05-17 | 2021-02-23 | 피씨엠에스 홀딩스, 인크. | 회절 소자에 기초한 3d 디스플레이 지향성 백라이트 |
CN110264905B (zh) * | 2019-05-24 | 2020-08-14 | 亿信科技发展有限公司 | 一种光场显示系统 |
CN114175627B (zh) | 2019-06-07 | 2024-04-12 | 交互数字Vc控股公司 | 用于基于分布式光孔的光场显示器的光学方法和系统 |
EP3990972A1 (en) | 2019-06-28 | 2022-05-04 | PCMS Holdings, Inc. | Optical method and system for light field (lf) displays based on tunable liquid crystal (lc) diffusers |
JPWO2021075118A1 (ja) * | 2019-10-15 | 2021-04-22 | ||
CN111261027B (zh) * | 2020-01-22 | 2022-07-12 | 京东方科技集团股份有限公司 | 旋转显示设备及其控制方法、旋转显示系统以及存储介质 |
CN111447433A (zh) * | 2020-03-24 | 2020-07-24 | 京东方科技集团股份有限公司 | 显示装置、数据生成装置及方法、显示系统 |
CN111338097A (zh) * | 2020-04-18 | 2020-06-26 | 彭昊 | 一种球形三维显示器 |
US11977244B2 (en) | 2022-04-29 | 2024-05-07 | Sony Interactive Entertainment Inc. | Method and system for generating a visual representation of a real-world object in three physical dimensions |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1097013A (ja) * | 1996-09-20 | 1998-04-14 | Futaba Corp | 立体表示装置 |
JP2003195214A (ja) * | 2001-12-26 | 2003-07-09 | Seiko Epson Corp | 立体表示装置 |
JP2004177709A (ja) * | 2002-11-27 | 2004-06-24 | Toshiba Corp | 立体画像表示装置及び立体画像表示方法 |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3815979A (en) * | 1973-05-17 | 1974-06-11 | R Collender | Unaided three dimensional aiming point photography and reproduction method and apparatus |
US4089597A (en) * | 1976-03-11 | 1978-05-16 | Robert Bruce Collender | Stereoscopic motion picture scanning reproduction method and apparatus |
US4158487A (en) * | 1978-02-17 | 1979-06-19 | Collender Robert B | Stereoscopic real image scanning reproduction method and apparatus |
US4367486A (en) * | 1980-03-18 | 1983-01-04 | Jesse B. Eichenlaub | Three dimensional imaging system |
JPH0918897A (ja) * | 1995-07-03 | 1997-01-17 | Canon Inc | Stereoscopic image display device |
GB9513658D0 (en) * | 1995-07-05 | 1995-09-06 | Philips Electronics Uk Ltd | Autostereoscopic display apparatus |
JP2000172878A (ja) * | 1998-12-09 | 2000-06-23 | Sony Corp | Information processing device, information processing method, and providing medium |
US7271803B2 (en) * | 1999-01-08 | 2007-09-18 | Ricoh Company, Ltd. | Method and system for simulating stereographic vision |
US7525567B2 (en) * | 2000-02-16 | 2009-04-28 | Immersive Media Company | Recording a stereoscopic image of a wide field of view |
US7277121B2 (en) * | 2001-08-29 | 2007-10-02 | Sanyo Electric Co., Ltd. | Stereoscopic image processing and display system |
JP3879510B2 (ja) * | 2001-10-11 | 2007-02-14 | Seiko Epson Corp | Stereoscopic display device |
JP4075418B2 (ja) * | 2002-03-15 | 2008-04-16 | Sony Corp | Image processing device and method, printed matter production device and method, and printed matter production system |
JP3897712B2 (ja) * | 2003-02-14 | 2007-03-28 | Canon Inc | Stereoscopic image display device |
US7791638B2 (en) * | 2004-09-29 | 2010-09-07 | Immersive Media Co. | Rotating scan camera |
WO2006096660A2 (en) * | 2005-03-05 | 2006-09-14 | Wag Display Corporation, Inc. | Display system with moving pixels for 2d and 3d image formation |
- 2004
  - 2004-12-28 JP JP2004381985A patent/JP4033859B2/ja not_active Expired - Fee Related
- 2005
  - 2005-06-27 WO PCT/JP2005/011738 patent/WO2006070499A1/ja not_active Application Discontinuation
  - 2005-06-27 CA CA2599833A patent/CA2599833C/en not_active Expired - Fee Related
  - 2005-06-27 US US11/722,846 patent/US7961182B2/en not_active Expired - Fee Related
Non-Patent Citations (3)
Title |
---|
ENDO T. ET AL.: "Jinbutsu Tobu Gazo no Zenshu Rittai Hyoji System" [All-around 3D display system for human head images], 3D IMAGE CONFERENCE 2004 KOEN RONBUNSHU [Proceedings of the 3D Image Conference 2004], 29 June 2004 (2004-06-29), pages 2 - 6 and 177 - 180, XP002998163 * |
ENDO T. ET AL.: "SeeLINDER: Zenshu kara Kansatsu kano na Entogata 3 Jigen Display" [SeeLINDER: A cylindrical 3D display viewable from all directions], THE VIRTUAL REALITY SOCIETY OF JAPAN DAI 9 KAI TAIKAI RONBUNSHU [Proceedings of the 9th Annual Conference of the Virtual Reality Society of Japan], 8 September 2004 (2004-09-08), pages 613 - 614, XP002999566 * |
ENDO T. ET AL.: "Sotai o nasu Kaiten gata Camera-Display-kei ni yoru Enkaku Communication System no Teian" [Proposal of a remote communication system using a paired rotating camera-display system], THE VIRTUAL REALITY SOCIETY OF JAPAN DAI 9 KAI TAIKAI RONBUNSHU [Proceedings of the 9th Annual Conference of the Virtual Reality Society of Japan], 8 September 2004 (2004-09-08), pages 465 - 468, XP002998164 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010137646A1 (ja) * | 2009-05-29 | 2010-12-02 | Japan Science and Technology Agency | Three-dimensional information presentation device using slit viewing |
JP5449342B2 (ja) * | 2009-05-29 | 2014-03-19 | Japan Science and Technology Agency | Three-dimensional information presentation device using slit viewing |
US9159255B2 (en) | 2009-05-29 | 2015-10-13 | Japan Science And Technology Agency | Three-dimensional information presentation device using slit viewing |
Also Published As
Publication number | Publication date |
---|---|
US20080043014A1 (en) | 2008-02-21 |
JP4033859B2 (ja) | 2008-01-16 |
US7961182B2 (en) | 2011-06-14 |
JP2006189962A (ja) | 2006-07-20 |
CA2599833A1 (en) | 2006-07-06 |
CA2599833C (en) | 2011-12-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006070499A1 (ja) | Stereoscopic image display method | |
CN107678722B (zh) | Multi-screen splicing method and device, and multi-projection spliced large screen | |
JP3948489B2 (ja) | Method of creating a luminance filter, and virtual space generation system | |
US20100033479A1 (en) | Apparatus, method, and computer program product for displaying stereoscopic images | |
US20080030573A1 (en) | Volumetric panoramic sensor systems | |
US20110216171A1 (en) | Screen and method for representing picture information | |
JP5176718B2 (ja) | Spatial image display device | |
JPWO2004043079A1 (ja) | Stereoscopic video processing method and stereoscopic video display device | |
JP2008228199A (ja) | Stereoscopic image display device, stereoscopic image display method, and data structure for stereoscopic images | |
CN107632403B (zh) | Three-dimensional stereoscopic imaging display | |
JP2005347813A (ja) | Image conversion method, image conversion device, and multi-projection system | |
KR20070103347A (ko) | High-resolution, wide-viewing-angle, large-format display method using video content from a 360-degree panoramic camera | |
WO2015018152A1 (zh) | Cloaking device | |
TW201323928A (zh) | Autostereoscopic display device | |
US20100302136A1 (en) | Method and apparatus for displaying three-dimensional stereo images viewable from different angles | |
TWI516091B (zh) | Multi-view three-dimensional display and method for generating multi-view three-dimensional image data thereof | |
JP2009003166A (ja) | Texture video display device | |
CN115272047A (zh) | System and method for generating image frames, and display system | |
US9648314B2 (en) | Method of glasses-less 3D display | |
US20090262181A1 (en) | Real-time video signal interweaving for autostereoscopic display | |
JP2008292666A (ja) | Viewing-direction image data generation device, directional-display image data generation device, directional display, directional display system, viewing-direction image data generation method, and directional-display image data generation method | |
JP5771096B2 (ja) | Image processing method | |
WO2001020386A9 (en) | An autostereoscopic display and method of displaying three-dimensional images, especially color images | |
US20130088485A1 (en) | Method of storing or transmitting auto-stereoscopic images | |
JP5249733B2 (ja) | Video signal processing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 11722846 Country of ref document: US |
| WWE | Wipo information: entry into national phase | Ref document number: 2599833 Country of ref document: CA |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 05765157 Country of ref document: EP Kind code of ref document: A1 |
| WWW | Wipo information: withdrawn in national office | Ref document number: 5765157 Country of ref document: EP |
| WWP | Wipo information: published in national office | Ref document number: 11722846 Country of ref document: US |