WO2009148038A1 - Stereoscopic image generation device, stereoscopic image generation method, and program - Google Patents
- Publication number
- WO2009148038A1 (application PCT/JP2009/060028)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display surface
- eyes
- eye
- horopter
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/34—Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/363—Image reproducers using image projection screens
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
- H04N13/373—Image reproducers using viewer tracking for tracking forward-backward translational head movements, i.e. longitudinal movements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/388—Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
Definitions
- the present invention relates to a stereoscopic image generation apparatus, and more particularly to a stereoscopic image generation apparatus that generates a stereoscopic image from a non-stereo image, a processing method thereof, and a program that causes a computer to execute the method.
- display devices have become larger and offer wider viewing angles, making it possible to display images with a greater sense of reality than before.
- however, the viewer is forced to perceive the image as lying on the display surface of the display device, which can suppress the stereoscopic effect produced by pictorial depth cues such as shading and composition.
- this is considered to be due to physiological stereoscopic elements, such as the change in the convergence angle when both eyes fixate the display surface of the display device and the distortion caused by binocular parallax.
- a stereoscope called a synopter is known as an optical device for removing the influence of such physiological stereoscopic elements.
- a synopter uses a combination of half mirrors to deliver light received at a single position to both eyes separately. With a synopter, the retinal images of the two eyes become identical, and it is known that a sense of stereoscopic depth can be obtained even from a non-stereo image (see, for example, Non-Patent Document 1).
- a stereoscopic depth can be obtained from a non-stereoscopic image by removing the influence of physiological stereoscopic elements and making the retinal images of both eyes the same.
- stereoscopic vision can be realized by a simple mechanism.
- with such an optical device, however, the display device side has no degrees of freedom, and it is difficult to obtain further visual effects.
- an object of the present invention is to remove the influence of physiological stereoscopic elements by image processing.
- the present invention has been made to solve the above problems, and a first aspect of the present invention is a stereoscopic image generation apparatus, a stereoscopic image generation method, or a program therefor, comprising: a cylindrical surface projection unit that generates a cylindrical image by projecting a two-dimensional input image onto a cylindrical surface containing a virtual circle in contact with both eyes; and a display surface projection unit that projects the cylindrical image onto a display surface with each eye as a reference and generates a display image projected onto each eye.
- this supplies the same retinal image to each eye, and brings about the effect of removing the influence arising from physiological stereoscopic elements.
- in the first aspect, the radius of the virtual circle may be set according to an assumed observation distance or display size. This brings about the effect of projecting the image in accordance with the assumed viewing conditions.
- the first aspect may further include an observation distance measurement unit that measures the distance between the display surface and the observation position, and the radius of the virtual circle may be set according to the observation distance measured by the observation distance measurement unit. This brings about the effect of projecting the image in accordance with the actually measured observation distance.
- the radius of the virtual circle may be set so that the skewness of the display image is smaller than a predetermined threshold. This brings about the effect that the image is displayed within a range where the distortion is allowable.
- the first aspect may further include a depth information generation unit that generates depth information from the two-dimensional input image and a depth information synthesis unit that synthesizes the depth information with the cylindrical image, and the display surface projection unit may generate the display image by projecting the cylindrical image synthesized with the depth information onto the display surface. This brings about the effect of displaying an image with a further enhanced stereoscopic effect.
- a second aspect of the present invention is a stereoscopic image generation apparatus, a stereoscopic image generation method, or a program therefor, comprising: an irradiation surface projection unit that projects a two-dimensional input image onto a two-dimensional plane orthogonal to the line of sight of each eye and generates an irradiation image corresponding to each eye; and a display surface projection unit that projects the corresponding irradiation image onto the display surface with each eye as a reference and generates a display image projected onto each eye.
- in the second aspect, the position of the irradiation image may be set according to an assumed observation distance. This brings about the effect of projecting the image in accordance with the assumed observation distance.
- an observation distance measurement unit that measures the distance between the display surface and the observation position may further be provided, and the position of the irradiation image may be set according to the observation distance measured by the observation distance measurement unit. This brings about the effect of projecting the image in accordance with the actually measured observation distance.
- a third aspect of the present invention is a stereoscopic image generation apparatus that generates a right-eye image and a left-eye image by converting a two-dimensional input image so that the images projected from the display surface onto the right eye and the left eye are the same, a stereoscopic image generation method therefor, or a program. This supplies the same retinal image to each eye, and brings about the effect of removing the influence arising from physiological stereoscopic elements.
- FIG. 1 is a diagram illustrating a configuration example of a stereoscopic image generation apparatus according to an embodiment of the present invention.
- FIG. 2 is a diagram illustrating a first example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 3 is a diagram showing an aspect of projection onto the horopter plane in the first example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 4 is a diagram illustrating a specific example of the projection onto the horopter plane in the first example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 5 is a diagram showing an aspect of projection onto the display surface in the first example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 6 is a diagram illustrating a specific example of the projection onto the display surface in the first example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 7 is a diagram illustrating a processing procedure example according to the first example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 8 is a diagram illustrating a second example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 9 is a diagram illustrating the relationship between the size of the horopter circle and the position of the convergence point.
- FIG. 10 is a diagram illustrating an aspect of projection onto the horopter plane in the second example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 11 is a diagram illustrating a specific example of the projection onto the horopter plane in the second example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 12 is a diagram illustrating a third example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 13 is a diagram illustrating an example of the relationship between the horopter circle and the degree of distortion of the image.
- FIG. 14 is a diagram illustrating an example of the relationship between the angle θ and the image skewness Q.
- FIG. 15 is a diagram illustrating an example of a relationship between a horopter circle and a circumferential angle.
- FIG. 16 is a diagram illustrating a fourth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 17 is a diagram illustrating an example of the relationship between the horopter circle and the display surface.
- FIG. 18 is a diagram illustrating a fifth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 19 is a diagram illustrating an aspect of the projection onto the tilt surface of the fifth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 20 is a diagram illustrating a specific example of the projection onto the tilt surface of the fifth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 21 is a diagram illustrating a specific example of the projection onto the display surface in the fifth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 22 is a diagram illustrating a processing procedure example according to the fifth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 23 is a diagram illustrating a sixth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 24 is a diagram illustrating an outline of processing according to the sixth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 25 is a diagram illustrating an example of a depth map of the horopter plane according to the sixth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 26 is a diagram illustrating an example of estimating depth information according to the sixth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 27 is a diagram illustrating a configuration example of the depth map synthesis unit 363 of the sixth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 28 is a diagram illustrating a stereoscopic image generation example according to the sixth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 29 is a diagram illustrating another example of generating a stereoscopic image according to the sixth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 1 is a diagram illustrating a configuration example of a stereoscopic image generation apparatus according to an embodiment of the present invention.
- the stereoscopic image generation apparatus includes an image signal input unit 110, a signal processing unit 120, a three-dimensional conversion unit 130, a parameter setting unit 140, an observation distance measurement unit 150, a post-processing unit 160, a format conversion unit 170, a source selection unit 180, and a display unit 190.
- the image signal input unit 110 receives an image signal input of a non-stereo image.
- the input non-stereo image is not limited to a still image and may be a moving image.
- as a non-stereo image source device, a television broadcast receiver, a video playback device (player), an imaging device (camcorder), or the like is assumed.
- the signal processing unit 120 performs predetermined signal processing on the input non-stereo image.
- as this signal processing, for example, white balance adjustment, noise reduction processing, level correction processing, gamma correction processing, and the like are assumed.
- the three-dimensional conversion unit 130 is a characteristic part of the present invention, and converts a two-dimensional non-stereo image into three dimensions.
- a three-dimensional image based on the non-stereo image is generated.
- as this three-dimensional image, for example, a left-eye image and a right-eye image are obtained.
- the parameter setting unit 140 sets parameters necessary for the three-dimensional conversion process in the three-dimensional conversion unit 130.
- as such a parameter, for example, the radius specifying the horopter circle described later is assumed.
- the observation distance measurement unit 150 measures the distance between the display unit 190 and the viewer's observation position. The three-dimensional conversion in the three-dimensional conversion unit 130 can be performed based on the observation distance measured by the observation distance measurement unit 150, although an observation distance assumed in advance may be used instead of an actual measurement.
- the post-processing unit 160 performs post-processing for preventing aliasing on the three-dimensional image obtained by the three-dimensional conversion processing in the three-dimensional conversion unit 130. For example, assuming that the display unit 190 alternately displays a left-eye image and a right-eye image for each line, there is a possibility that jaggy (stepped jaggedness) due to aliasing may be displayed. In order to prevent this, the post-processing unit 160 applies a filter in the vertical direction to smooth the image change.
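The vertical smoothing described above can be sketched as a small low-pass filter applied along each image column; the 3-tap kernel below is an illustrative choice, not one specified in the text:

```python
def smooth_vertical(img, k=(0.25, 0.5, 0.25)):
    """Apply a small vertical low-pass filter to each column of a grayscale
    image (a list of rows) to soften line-to-line transitions, as a sketch of
    the anti-aliasing post-processing; the kernel weights are illustrative."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for i, kv in enumerate(k):
                yy = min(max(y + i - 1, 0), h - 1)  # clamp at image borders
                acc += kv * img[yy][x]
            out[y][x] = acc
    return out

# Alternating bright/dark lines, as when left and right images are interleaved.
img = [[0.0] * 4, [1.0] * 4, [0.0] * 4, [1.0] * 4]
sm = smooth_vertical(img)
# interior rows are pulled toward the local mean, softening the alternation
assert 0.0 < sm[1][0] < 1.0
```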
- the format conversion unit 170 converts the three-dimensional image into a format corresponding to the display unit 190.
- the format conversion unit 170 can perform conversion so that, for example, the left-eye image and the right-eye image are alternately arranged for each line in accordance with the corresponding format of the display unit 190.
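Line-by-line arrangement of the two eye images can be sketched as follows; the assignment of left to even rows and right to odd rows is an assumption for illustration, since the text does not fix it:

```python
def interleave_lines(left, right):
    """Interleave left- and right-eye images line by line (left on even rows,
    right on odd rows), one conceivable output format for a line-by-line
    polarized display; the row assignment here is an assumption."""
    assert len(left) == len(right)
    return [left[y] if y % 2 == 0 else right[y] for y in range(len(left))]

left = [["L"] * 3 for _ in range(4)]
right = [["R"] * 3 for _ in range(4)]
out = interleave_lines(left, right)
assert out[0][0] == "L" and out[1][0] == "R"
```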
- the source selection unit 180 selects the image to be displayed as the source. That is, it selects the output of the signal processing unit 120 when displaying the non-stereo image as it is, and selects the output of the format conversion unit 170 when displaying a three-dimensional image for stereoscopic viewing.
- the display unit 190 is a display that displays an image.
- the display unit 190 has a function of displaying a three-dimensional image for stereoscopic viewing, but means for realizing the function is not particularly limited.
- for example, it is conceivable to provide a patterned wave plate divided line by line so that the linearly polarized light from the even lines and the odd lines of the display screen is converted into mutually orthogonal polarizations, causing a different image to enter each eye.
- FIG. 2 is a diagram illustrating a first example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- the first example of the three-dimensional conversion unit 130 includes a horopter plane image projection unit 311, a display plane right eye projection unit 316, and a display plane left eye projection unit 317.
- the horopter plane image projection unit 311 projects the non-stereo image supplied from the signal processing unit 120 via the signal line 129 onto a cylindrical plane including a horopter circle (horopter).
- the horopter circle is a circle passing through both eyes, and it is known that the binocular retinal images corresponding to points on the horopter circle are identical.
- This cylindrical surface is called a horopter surface, and an image projected on the horopter surface is called a horopter image.
- the intersection of the lines of sight of both eyes is referred to as a convergence point, and the angle formed by the intersection is referred to as a convergence angle or a circumferential angle.
- for every convergence point on the horopter circle, the convergence angles are equal.
- the size of the horopter circle is specified by the horopter circle information. Further, the relative positional relationship with both eyes is specified by the interocular distance “2a”.
- the horopter circle information and the interocular distance “2a” are supplied from the parameter setting unit 140 via the signal line 149.
- here, the size of the horopter circle is specified using the radius "r" as the horopter circle information, but the distance from the center of both eyes to the apex of the horopter circle, the circumferential angle, or the like may be used to specify it instead.
- the horopter plane image projection unit 311 is an example of a cylindrical plane projection unit described in the claims.
- the display surface right-eye projection unit 316 projects the horopter image onto the right-eye display surface.
- the display surface left-eye projection unit 317 projects the horopter image onto the left-eye display surface.
- the display surface right-eye projection unit 316 and the display surface left-eye projection unit 317 project onto the right-eye and left-eye display surfaces based on the interocular distance "2a", the radius "r" of the horopter circle, and the assumed observation distance "d".
- the image projected on the right-eye display surface is called a right-eye image
- the image projected on the left-eye display surface is called a left-eye image.
- These right eye image and left eye image are supplied to the post-processing unit 160 via the signal line 139.
- the display surface right-eye projection unit 316 and the display surface left-eye projection unit 317 are examples of the display surface projection unit described in the claims.
- FIG. 3 is a diagram showing an aspect of projection onto the horopter plane in the first example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- the horopter circle 520 is a circle that passes through the right eye 511, the left eye 512, and the convergence point 531 (vertex) or 532.
- the right eye 511 and the left eye 512 are separated by “a” from the center of both eyes. That is, the interocular distance is “2a”.
- the radius of the horopter circle 520 is “r”.
- the binocular retinal images of the right eye 511 and the left eye 512 for the points on the horopter circle 520 are the same. This is because the convergence angles are always equal when the point on the horopter circle 520 is the convergence point. For example, the convergence angle 533 for the convergence point 531 is equal to the convergence angle 534 for the convergence point 532.
- therefore, the right eye 511 and the left eye 512 can form the same retinal image without binocular parallax.
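The equal-convergence-angle property above is the inscribed angle theorem: the chord formed by the two eyes subtends the same angle at every point on the far arc of the circle. A small numeric check, with illustrative values for r and a (not taken from the text):

```python
import math

def convergence_angle(eye_l, eye_r, point):
    """Angle subtended at `point` by the two eye positions (the convergence angle)."""
    ax, ay = eye_l[0] - point[0], eye_l[1] - point[1]
    bx, by = eye_r[0] - point[0], eye_r[1] - point[1]
    dot = ax * bx + ay * by
    return math.acos(dot / (math.hypot(ax, ay) * math.hypot(bx, by)))

# Horopter circle of radius r centred at the origin; the eyes lie on the
# circle, separated by 2a, symmetrically below the centre (2-D top view).
r, a = 0.5, 0.03
z_eye = -math.sqrt(r * r - a * a)
eye_l, eye_r = (-a, z_eye), (a, z_eye)

# Sample several points on the far arc: the convergence angle is the same at all.
angles = []
for deg in (60, 75, 90, 105, 120):
    t = math.radians(deg)
    angles.append(convergence_angle(eye_l, eye_r, (r * math.cos(t), r * math.sin(t))))

assert max(angles) - min(angles) < 1e-9
```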
- FIG. 4 is a diagram illustrating a specific example of the projection onto the horopter plane in the first example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 4A shows a coordinate system of the input image I (p, q) supplied from the signal processing unit 120 via the signal line 129.
- FIG. 4B shows a coordinate system of the horopter plane onto which the horopter image 530 is projected. Since the horopter plane is three-dimensional, a three-dimensional coordinate system (x, y, z) is used here. The origin of the coordinate system is set at the center of the horopter circle 520.
- note that FIG. 4B shows the horopter plane viewed from directly above, that is, along the y-axis.
- the horopter image H (x, y, z) is obtained by projecting the input image I (p, q) onto a horopter circle having a radius r, and is represented by the following equation.
- H(x, y, z) = I((π/2 − θ)·r, y), where the point on the horopter circle is written as (x, z) = (r·cos θ, r·sin θ), so that the horizontal image coordinate corresponds to the arc length from the vertex of the circle
- here, the input image size (width) "2L" is assumed to be the same as the display surface size (width), but the physical size of the image may be changed by providing a scaling function for the input image in a preceding stage.
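The wrapping of image columns onto the cylinder can be sketched as follows, using the arc-length correspondence in the equation above; the parametrization (x, z) = (r·cos θ, r·sin θ) is the assumption stated there:

```python
import math

def project_to_horopter(width_px, half_width_L, r):
    """For each pixel column of the input image, return the (x, z) point on the
    horopter cylinder in circle-centred coordinates (a top-view sketch).
    The arc length from the vertex equals the horizontal image coordinate p,
    so p = (pi/2 - theta) * r with (x, z) = (r*cos(theta), r*sin(theta))."""
    points = []
    for col in range(width_px):
        # horizontal image coordinate p in [-L, L]
        p = (2 * col / (width_px - 1) - 1) * half_width_L
        theta = math.pi / 2 - p / r
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

pts = project_to_horopter(width_px=5, half_width_L=0.2, r=0.5)
# the centre column lands at the vertex of the horopter circle, (0, r)
assert abs(pts[2][0]) < 1e-12 and abs(pts[2][1] - 0.5) < 1e-12
```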
- FIG. 5 is a diagram showing one aspect of the projection onto the display surface of the first example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- the convergence point 532 on the horopter circle 520 is projected on the display surface 540.
- An image formed from the convergence point 532 to the right eye 511 is displayed at the display position 541 on the display surface 540.
- an image formed from the convergence point 532 to the left eye 512 is displayed at the display position 542 on the display surface 540. That is, even for the same horopter image 530, the images to be displayed on the display surface 540 are basically different images for the right eye 511 and the left eye 512.
- FIG. 6 is a diagram illustrating a specific example of the projection onto the display surface of the first example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 6A shows the coordinate system of the horopter surface and the display surface.
- a three-dimensional coordinate system (x, y, z) is used, but unlike the case of FIG. 4B, the origin is set at the midpoint between the right eye 511 and the left eye 512.
- FIG. 6B shows a coordinate system of the display image J (s, t) projected on the display surface.
- Display images are obtained for each of the right eye 511 and the left eye 512. Since each is a two-dimensional image, a two-dimensional coordinate system is used. The origin of the coordinate system is set at the center point of the display image.
- the position D(x_R, y_R, z_R) on the display surface 540, projected from the right eye 511 through the position H(x_0, y_0, z_0) on the horopter image, is given by the following equation, taking the right eye at (a, 0, 0) and the display surface as the plane z = d:
- x_R = a + (d/z_0)·(x_0 − a), y_R = (d/z_0)·y_0, z_R = d
- although the position D(x_R, y_R, z_R) projected from the right eye 511 has been described, the position D(x_L, y_L, z_L) on the display surface 540 projected from the left eye 512 through the position H(x_0, y_0, z_0) on the horopter image can be obtained in the same way.
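The per-eye projection is the intersection of the eye-to-point ray with the display plane. A minimal sketch, assuming eyes at (±a, 0, 0) and the display in the plane z = d (coordinate choices not fixed by this excerpt):

```python
def project_to_display(eye_x, h, d):
    """Project a horopter-image point h = (x0, y0, z0) onto the display plane
    z = d along the ray from the eye at (eye_x, 0, 0). Returns (x, y, d)."""
    x0, y0, z0 = h
    s = d / z0                       # ray parameter where the ray meets z = d
    return (eye_x + s * (x0 - eye_x), s * y0, d)

# Right eye at (a, 0, 0), left eye at (-a, 0, 0); a point straight ahead but
# beyond the display plane projects to a different position for each eye.
a, d = 0.03, 1.0
h = (0.0, 0.0, 1.4)
right = project_to_display(+a, h, d)
left = project_to_display(-a, h, d)
assert left[0] < 0 < right[0]        # the two display images differ, as in FIG. 5
```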
- here, the size of the horopter circle is specified by the radius "r", but the distance from the center of both eyes to the apex of the horopter circle, the circumferential angle, or the like may be used to specify the size of the horopter circle instead.
- in this case, the distance f from the center of both eyes to the convergence point is given by the following equation: f = r + √(r² − a²)
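Since the eyes form a chord of length 2a on the circle of radius r, the circle centre lies at depth √(r² − a²) from the eye midpoint, giving f = r + √(r² − a²); this also inverts in closed form to r = (f² + a²)/(2f). A quick round-trip check with illustrative values:

```python
import math

def f_from_r(r, a):
    """Distance from the midpoint of the eyes to the horopter-circle vertex."""
    return r + math.sqrt(r * r - a * a)

def r_from_f(f, a):
    """Invert f = r + sqrt(r^2 - a^2) to recover the radius."""
    return (f * f + a * a) / (2 * f)

r, a = 0.5, 0.03
f = f_from_r(r, a)
assert abs(r_from_f(f, a) - r) < 1e-12
```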
- FIG. 7 is a diagram illustrating an example of a processing procedure according to the first example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- Step S912 is an example of the cylindrical surface projection procedure described in the claims.
- display images are generated as follows for each of the right eye and the left eye (loop L901).
- perspective transformation onto the display surface 540 is performed, projecting from the right eye 511 through the position H(x_0, y_0, z_0) on the horopter image, and the three-dimensional position D(x_R, y_R, z_R) is obtained (step S913).
- a two-dimensional display image J(x_R, y_R) on the display surface is obtained from this three-dimensional position (step S914).
- in step S913, perspective transformation onto the display surface 540 is likewise performed, projecting from the left eye 512 through the position H(x_0, y_0, z_0) on the horopter image, and the three-dimensional position D(x_L, y_L, z_L) is obtained (step S913). Then, a two-dimensional display image J(x_L, y_L) on the display surface is obtained from this three-dimensional position (step S914). Steps S913 and S914 are an example of the display surface projection procedure described in the claims.
- a non-stereo image is projected as a horopter image onto the horopter circle 520 specified by the horopter circle information.
- a stereoscopic image for the right eye 511 and the left eye 512 can be generated by projecting the horopter image onto the display surface at the actually observed or estimated observation distance.
- FIG. 8 is a diagram illustrating a second example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- the second embodiment of the three-dimensional conversion unit 130 includes a horopter plane image projection unit 321, a convergence point setting unit 322, a display plane right eye projection unit 326, and a display plane left eye projection unit 327.
- the horopter plane image projection unit 321 projects the non-stereo image supplied from the signal processing unit 120 via the signal line 129 onto the cylindrical plane including the horopter circle, similarly to the horopter plane image projection unit 311.
- the horopter circle is specified using the radius “r” based on the convergence point set by the convergence point setting unit 322.
- the horopter plane image projection unit 321 is an example of a cylindrical plane projection unit described in the claims.
- the convergence point setting unit 322 sets a convergence point and supplies a radius “r” based on the convergence point.
- the convergence point setting unit 322 sets the convergence point based on the interocular distance “2a”, the observation distance “d”, the display surface size “2M”, and the input image size “2L”.
- the display surface right-eye projection unit 326 and the display surface left-eye projection unit 327 project the horopter image onto the right-eye or left-eye display surface, as in the first embodiment.
- the display surface right-eye projection unit 326 and the display surface left-eye projection unit 327 are examples of the display surface projection unit described in the claims.
- FIG. 9 is a diagram showing the relationship between the size of the horopter circle and the position of the convergence point.
- the horopter circle is uniquely identified by the interocular distance and radius. Therefore, when the radius is not fixed, a number of horopter circles passing through both eyes can be assumed as shown in FIG. Generally, the closer the convergence point is, the wider the convergence angle and the smaller the radius.
- the convergence point of the horopter circle is set by calculating backward from the projection size on the display surface for the purpose of displaying the input image on the entire display surface.
- FIG. 10 is a diagram illustrating an aspect of projection onto the horopter plane in the second example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- the input image size (width) is set to “2L”, and an expression representing the size (width) “2m” projected on the display surface is considered.
- the angle θ is the angle formed by the two sides of lengths q and r. This angle θ is expressed by the following equation.
- FIG. 11 is a diagram illustrating a specific example of the projection onto the horopter plane in the second example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- in the second example, the convergence point is set so that the projection fills the display surface 540. Therefore, as shown in the figure, the width "2M" of the display surface 540 itself and the projection width "2m" on the display surface obtained by the above equation are made to coincide.
- the convergence point setting unit 322 is provided with the size (width) “2M” of the display surface 540, the input image size (width) “2L”, the interocular distance “2a”, and the observation distance “d”.
- the radius r of the horopter circle is obtained so that the projection size (width) "2m" on the display surface obtained by the above equation matches "2M".
- in this way, by assuming the size of the display surface 540 in advance, it is possible to set a convergence point such that the projected image fills the display surface 540, and the horopter circle is thereby uniquely identified.
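The equation for the projected width 2m is not reproduced in this excerpt, so the following is only a sketch of the back-calculation: it projects the edge of the wrapped image from the midpoint of the eyes (a simplifying assumption, rather than per-eye projection) onto the display plane, and bisects on the radius r until the projected half-width m matches the display half-width M. All numeric values are illustrative.

```python
import math

def projected_half_width(r, a, d, L):
    """Half-width m of the wrapped image's projection on the display plane z = d,
    projecting the image edge (arc length L from the vertex) from the eye midpoint."""
    c = math.sqrt(r * r - a * a)      # circle-centre depth in viewer coordinates
    theta = math.pi / 2 - L / r       # edge of the image on the circle
    x = r * math.cos(theta)
    z = c + r * math.sin(theta)
    return d * x / z

def solve_radius(a, d, L, M, lo, hi, iters=80):
    """Bisect on r so that the projected half-width equals M."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if projected_half_width(mid, a, d, L) > M:
            lo = mid                  # m shrinks as r grows in this bracket
        else:
            hi = mid
    return 0.5 * (lo + hi)

r = solve_radius(a=0.03, d=1.0, L=0.4, M=0.35, lo=0.3, hi=1.0)
assert abs(projected_half_width(r, 0.03, 1.0, 0.4) - 0.35) < 1e-6
```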
- FIG. 12 is a diagram illustrating a third example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- in the third example, the convergence point is determined so that the degree of distortion of the image on the display surface does not become excessive. Since the skewness between the center and the periphery of the image increases with distance from the front of each eye, the convergence point is set so that the skewness falls within an allowable range.
- the third embodiment of the three-dimensional conversion unit 130 includes a horopter plane image projection unit 331, a convergence point setting unit 332, a display plane right eye projection unit 336, and a display plane left eye projection unit 337.
- the horopter plane image projection unit 331 projects the non-stereo image supplied from the signal processing unit 120 via the signal line 129 onto the cylindrical plane including the horopter circle, like the horopter plane image projection unit 311.
- the horopter circle is specified using the circumferential angle "φ" based on the convergence point set by the convergence point setting unit 332.
- the horopter plane image projection unit 331 is an example of a cylindrical plane projection unit described in the claims.
- the convergence point setting unit 332 sets a convergence point and supplies the circumferential angle "φ" based on the convergence point.
- the convergence point setting unit 332 sets the convergence point based on the interocular distance "2a", the observation distance "d", the maximum skewness "Qmax", and the minute angle "Δθ". Details of the setting will be described with reference to the following figures.
- the display surface right-eye projection unit 336 and the display surface left-eye projection unit 337 project the horopter image onto the right-eye or left-eye display surface, as in the first embodiment.
- the projection onto the display surface is performed using the circumferential angle “τ” supplied from the convergence point setting unit 332. The radius “r” used in the first example and the circumferential angle “τ” used in the present example are equivalent in the sense of setting a convergence point, so either may be used as appropriate.
- the display surface right-eye projection unit 336 and the display surface left-eye projection unit 337 are examples of the display surface projection unit described in the claims.
- FIG. 13 is a diagram showing an example of the relationship between the horopter circle and the degree of distortion of the image.
- the skewness is obtained for the left eye 512, but the same applies to the right eye 511.
- the distance produced on the display surface 540 by a shift of the minute angle “δ” is compared between the position 535 on the horopter circle seen directly in front of the left eye 512 and the position 536 rotated by θ from that position.
- the angle subtended at the center point o of the horopter circle is “2δ”; therefore, a shift of “2δr” occurs at the position 535 on the horopter circle.
- the position 536 on the horopter circle is likewise displaced by “2δr”.
- on the display surface, the displacement width Q1 corresponding to the position 535 is Q1 = 2d·tan(δ/2).
- FIG. 14 is a diagram illustrating an example of the relationship between the angle θ and the image skewness Q.
- the skewness Q is shown for the case where the minute angle δ in FIG. 13 is “0.01” and the angle θ is varied from “−70°” to “+70°”.
- the skewness Q between the center and the periphery of the image increases as the distance from the position directly in front of the eye increases. Therefore, in the third example, the maximum skewness “Qmax” is given as a threshold, and the angle θ is set so that the skewness remains smaller than this value.
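The threshold check above can be sketched numerically. A minimal Python sketch using the skewness formula Q = (tanθ − tan(θ − δ/2))/tan(δ/2) given later in the description; the function names are illustrative, not from the patent:

```python
import math

def skewness(theta, delta):
    # Q = (tan(theta) - tan(theta - delta/2)) / tan(delta/2)
    return (math.tan(theta) - math.tan(theta - delta / 2.0)) / math.tan(delta / 2.0)

def max_angle_within(q_max, delta, step=0.0001):
    # Largest angle theta (radians, scanned upward from 0)
    # whose skewness stays below the threshold q_max.
    theta = 0.0
    while skewness(theta + step, delta) < q_max:
        theta += step
    return theta
```

Directly in front (θ = 0) the skewness is 1, and it grows monotonically toward the periphery, so a simple scan suffices to find the admissible angular range for a given Qmax.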
- FIG. 15 is a diagram showing an example of the relationship between the horopter circle and the circumferential angle. Assuming that the circumferential angle is “τ”, the angle at which the center point o of the horopter circle 520 is viewed from the right eye 511 and the left eye 512 is “2τ”. This angle is divided into two equal parts, each “τ”, by the perpendicular from the center point o to the midpoint of both eyes. The figure shows “τ” with the left eye 512 as the reference.
- the angles at the intersection 537 and at the left eye 512 are both “τ/2”. Since the straight line connecting the left eye 512 and the position 535, seen directly in front of the left eye 512 on the horopter circle, is parallel to the center line between the right eye 511 and the left eye 512, the angle formed at the intersection 537 toward the end point 538 of the input image is “τ/2”. Here, the angle θ is the angle formed at the left eye 512 between the position 535 and the end point 538 of the input image.
- the angle subtending the center point o of the horopter circle 520 from the intersection 537 and the end point 538 of the input image is “2·(θ − τ/2)”. Since the arc in this case coincides with the size (width) “L” of the input image, the relation 2·(θ − τ/2)·r = L holds.
- in this way, in the third example, the convergence point can be set so that the distortion between the center and the periphery of the screen is equal to or less than the maximum skewness, and the horopter circle can thereby be identified.
- FIG. 16 is a diagram illustrating a fourth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- in the fourth example, the convergence point is determined so that the distortion of the image on the display surface does not become large, and the size (width) of the input image is scaled so that the input image is displayed on the entire display surface.
- the fourth example of the three-dimensional conversion unit 130 includes a horopter plane image projection unit 341, a convergence point setting unit 342, a scaling unit 343, a display surface right-eye projection unit 346, and a display surface left-eye projection unit 347.
- the convergence point setting unit 342 sets a convergence point and supplies the radius “r” and the angle “τ” based on the convergence point.
- the convergence point setting unit 342 sets the convergence point based on the interocular distance “2a”, the observation distance “d”, the maximum skewness “Qmax”, the minute angle “δ”, and the display surface size “2M”.
- the scaling unit 343 enlarges or reduces (scales) the non-stereo image supplied from the signal processing unit 120 via the signal line 129 according to the convergence point set by the convergence point setting unit 342.
- the horopter plane image projection unit 341 projects the non-stereo image scaled by the scaling unit 343 onto a cylindrical plane including a horopter circle.
- the horopter plane image projection unit 341 is an example of a cylindrical plane projection unit described in the claims.
- the display surface right-eye projection unit 346 and the display surface left-eye projection unit 347 project the horopter image onto the display screen for the right eye or the left eye, as in the first embodiment.
- the display surface right-eye projection unit 346 and the display surface left-eye projection unit 347 are examples of the display surface projection unit described in the claims.
- FIG. 17 is a diagram showing an example of the relationship between the horopter circle and the display surface.
- the angle “θ” is determined so that the skewness for the minute angle “δ” is smaller than the maximum skewness “Qmax”. Based on this angle “θ”, the horopter circle is determined and the size of the input image is determined.
- the angle at which the end point 538 of the input image is viewed from the right eye 511 and the left eye 512 is “θ”.
- the angle formed from the left eye 512 to the intersection 548 is “ ⁇ / 2 ⁇ ”.
- the angle formed from the vertex T at the intersection 549 is “tan⁻¹(x/a)”.
- the inner angle at the intersection 549 is “θ − tan⁻¹(x/a)”.
- the angle “τ” is set so that the following relation holds; the horopter circle is thereby determined.
- θ ≧ (τ/2) + tan⁻¹(x/a)
- as calculated in the third example, the angle subtending the center point o of the horopter circle 520 from the intersection 537 and the end point 538 of the input image is “2·(θ − τ/2)”. Since the arc in this case coincides with the size (width) “L” of the input image, the relation 2·(θ − τ/2)·r = L holds.
- the convergence point is set so that the distortion at the center and the periphery of the screen is equal to or less than the maximum distortion, and the horopter circle is specified.
- the input image can be scaled to display the input image on the full display surface.
- FIG. 18 is a diagram illustrating a fifth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- in the fifth example, a stereoscopic image is generated by projecting the non-stereo image onto a tilt plane corresponding to each eye and then onto the display surface, without going through the procedure of first projecting the non-stereo image onto the horopter circle.
- the image displayed by the fifth embodiment is equivalent to the image displayed through the horopter circle described in the first to fourth embodiments. Therefore, here, a horopter circle is assumed when generating a stereoscopic image, and the radius of the horopter circle is given as a parameter.
- the fifth embodiment of the three-dimensional conversion unit 130 includes a tilt surface right eye setting unit 354, a tilt surface left eye setting unit 355, a display surface right eye projection unit 356, and a display surface left eye projection unit 357.
- the tilt plane right-eye setting unit 354 assumes a line of sight from the right eye to a convergence point equidistant from both eyes, sets a tilt plane that perpendicularly intersects this line of sight, and projects the non-stereo image supplied from the signal processing unit 120 via the signal line 129 onto it as a tilt image for the right eye.
- the tilt plane left-eye setting unit 355 assumes a line of sight from the left eye to a convergence point equidistant from both eyes, sets a tilt plane that perpendicularly intersects this line of sight, and projects the non-stereo image supplied from the signal processing unit 120 via the signal line 129 onto it as a tilt image for the left eye.
- the tilt plane right-eye setting unit 354 and the tilt plane left-eye setting unit 355 perform the projection onto the right-eye and left-eye tilt planes based on the interocular distance “2a”, the horopter circle radius “r”, the assumed observation distance “d”, and the tilt plane distance “k”.
- the tilt surface is an example of the irradiation surface described in the claims. Further, the tilt surface right eye setting unit 354 and the tilt surface left eye setting unit 355 are examples of the irradiation surface projection unit described in the claims.
- the display surface right eye projection unit 356 projects the tilt image for the right eye onto the display surface for the right eye.
- the display surface left-eye projection unit 357 projects the left-eye tilt image onto the left-eye display surface.
- the display surface right-eye projection unit 356 and the display surface left-eye projection unit 357 project onto the display surfaces for the right eye and the left eye based on the interocular distance “2a”, the horopter circle radius “r”, and the assumed observation distance “d”.
- An image projected on the right-eye display surface is referred to as a right-eye image
- an image projected on the left-eye display surface is referred to as a left-eye image.
- These right eye image and left eye image are supplied to the post-processing unit 160 via the signal line 139.
- the display surface right-eye projection unit 356 and the display surface left-eye projection unit 357 are examples of the display surface projection unit described in the claims.
- FIG. 19 is a diagram illustrating an aspect of the projection onto the tilt plane of the fifth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- the intersection of the center line of the right eye 511 and the left eye 512 and the horopter circle 520 is a convergence point 527 that is equidistant from both eyes.
- the right eye tilt plane 550 is a plane that intersects perpendicularly at a point 551 with respect to the viewpoint when the convergence point 527 is viewed from the right eye 511.
- the left-eye tilt surface 560 is a surface that intersects perpendicularly at a point 561 with respect to the viewpoint when the convergence point 527 is viewed from the left eye 512.
- the distance between the line segment connecting the position of the right eye 511 and the position of the left eye 512 and the line segment connecting the points 551 and 561 is defined as the tilt surface distance “k”.
- images on the right eye tilt surface 550 and the left eye tilt surface 560 are projected on the display surface 570.
- the distance between the right eye 511 and the left eye 512 and the display surface 570 is taken as an observation distance “d”.
- An image formed at a point 552 that is a distance “S” away from the point 551 on the right eye tilt surface 550 is displayed at the display position 571 on the display surface 570.
- An image formed at a point 562 that is a distance “S” away from the point 561 on the left-eye tilt surface 560 is displayed at the display position 572 on the display surface 570.
- a straight line connecting the right eye 511 and the point 571 and a straight line connecting the left eye 512 and the point 572 intersect at an intersection 522 on the horopter circle 520. That is, the image displayed by the fifth embodiment is equivalent to the image displayed through the horopter circle described in the first to fourth embodiments.
- FIG. 20 is a diagram illustrating a specific example of the projection onto the tilt plane of the fifth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 20A shows a coordinate system of the input image I (p, q) supplied from the signal processing unit 120 via the signal line 129.
- FIG. 20B shows the coordinate system of the tilt plane on which the tilt image is projected. Since the tilt plane is three-dimensional, a three-dimensional coordinate system (x, y, z) is used here. The origin of the coordinate system is set at the center of the right eye 511 and the left eye 512.
- FIG. 20B is a view as seen from the direction perpendicular to the tilt plane, that is, from the direction perpendicular to the y-axis.
- the right eye tilt surface 550 and the left eye tilt surface 560 each make an angle of “τ/2” with the horizontal line.
- the distance between the left eye 512 and the intersection point 582 is “k·tan(τ/2)”. Therefore, the distance between the point 589, located a tilt plane distance “k” from the origin along the z-axis, and the point 561 on the left eye tilt surface 560 is “a − k·tan(τ/2)”. Accordingly, the relationship between the left-eye tilt image L(x, y, z) on the left eye tilt surface 560 and the input image I(p, q) is expressed by the following equations.
- L(x, y, z) = I((x + a − k·tan(τ/2))/cos(τ/2), y)
- where z = k − ((x + a − k·tan(τ/2))/sin(τ/2)).
- Similarly, R(x, y, z) = I((x − a + k·tan(τ/2))/cos(τ/2), y)
- where z = k + ((x − a + k·tan(τ/2))/sin(τ/2)).
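The two coordinate mappings above can be sketched directly. A minimal Python sketch (function names are illustrative, not from the patent) that returns the input-image abscissa p and the depth z for a given x on each tilt plane, reproducing the equations as stated:

```python
import math

def left_tilt(x, a, k, tau):
    # L(x, y, z) = I((x + a - k*tan(tau/2)) / cos(tau/2), y),
    # with z = k - (x + a - k*tan(tau/2)) / sin(tau/2)
    dx = x + a - k * math.tan(tau / 2.0)
    return dx / math.cos(tau / 2.0), k - dx / math.sin(tau / 2.0)

def right_tilt(x, a, k, tau):
    # R(x, y, z) = I((x - a + k*tan(tau/2)) / cos(tau/2), y),
    # with z = k + (x - a + k*tan(tau/2)) / sin(tau/2)
    dx = x - a + k * math.tan(tau / 2.0)
    return dx / math.cos(tau / 2.0), k + dx / math.sin(tau / 2.0)
```

At x = −a + k·tan(τ/2), i.e. the point 561, the left-eye mapping gives p = 0 and z = k, matching the geometry of FIG. 20; the two planes are mirror images of each other about the center line.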
- FIG. 21 is a diagram illustrating a specific example of the projection onto the display surface of the fifth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- FIG. 21 shows the coordinate system of the tilt plane and the display plane.
- a three-dimensional coordinate system of (x, y, z) is used.
- FIG. 21 shows a coordinate system of the display image J (s, t) projected on the display surface.
- Display images are obtained for each of the right eye 511 and the left eye 512. Since each is a two-dimensional image, a two-dimensional coordinate system is used. The origin of the coordinate system is set at the center point of the display image.
- the point D_L(x_L, y_L, z_L) on the display surface 570, projected from the left eye 512 through L(x_0, y_0, z_0) on the left-eye tilt image, is given by the following equations.
- although the projection D_L(x_L, y_L, z_L) from the left eye 512 has been described, the point D_R(x_R, y_R, z_R) on the display surface 570, projected from the right eye 511 through R(x_0, y_0, z_0) on the right-eye tilt image, can be obtained in the same manner.
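The projection is an ordinary perspective transformation: a ray cast from the eye through the tilt-plane point is intersected with the display plane. A minimal sketch, assuming the eye at (±a, 0, 0) and the display plane at z = d; `project_to_display` is an illustrative name, and the patent's exact closed-form expressions are what this computes:

```python
def project_to_display(eye_x, point, d):
    # Cast a ray from the eye at (eye_x, 0, 0) through the tilt-plane
    # point (x0, y0, z0) and intersect it with the display plane z = d.
    x0, y0, z0 = point
    t = d / z0  # ray parameter at which the ray reaches z = d
    return (eye_x + t * (x0 - eye_x), t * y0, d)
```

A point already lying on the display plane maps to itself, and every projected point satisfies z = d, which is the property used when reducing D_L and D_R to the two-dimensional display image J.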
- FIG. 22 is a diagram illustrating a processing procedure example according to the fifth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- when the input image I(p, q) is input from the signal processing unit 120 via the signal line 129 (step S921), it is projected as a tilt image.
- separate tilt surfaces are provided for the right eye and the left eye, and a display image is generated as follows (loop L902).
- when the input image I(p, q) is projected as the right-eye tilt image R(x, y, z) onto the right eye tilt surface 550 (step S922), it is perspective-transformed onto the display surface 570 to obtain the three-dimensional position D_R(x_R, y_R, z_R) (step S923). Then, the two-dimensional display image J(x_R, y_R) on the display surface is obtained from this three-dimensional position (step S924).
- similarly, when the input image I(p, q) is projected as the left-eye tilt image L(x, y, z) onto the left eye tilt surface 560 (step S922), it is perspective-transformed onto the display surface 570 to obtain the three-dimensional position D_L(x_L, y_L, z_L) (step S923). Then, the two-dimensional display image J(x_L, y_L) on the display surface is obtained from this three-dimensional position (step S924).
- Step S922 is an example of an irradiation surface projection procedure described in the claims.
- Steps S923 and S924 are an example of a display surface projection procedure described in the claims.
- in this way, in the fifth example, the non-stereo image is projected as right-eye and left-eye tilt images onto the right-eye and left-eye tilt planes, respectively. Then, by projecting the right-eye and left-eye tilt images onto the display surface at the actually measured or assumed observation distance, a stereoscopic image for the right eye 511 and the left eye 512 can be generated.
- FIG. 23 is a diagram illustrating a sixth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- in the sixth example, a stereoscopic image corresponding to a depth map based on depth information is generated.
- the sixth example of the three-dimensional conversion unit 130 includes an input image depth map generation unit 361, a horopter plane depth map generation unit 362, a depth map synthesis unit 363, a display surface right-eye projection unit 366, and a display surface left-eye projection unit 367.
- the input image depth map generation unit 361 generates a depth map for the non-stereo image (input image) supplied from the signal processing unit 120 via the signal line 129.
- the depth map holds information about the depth for each pixel, and is estimated based on, for example, luminance, high frequency components, motion, saturation, and the like.
- Japanese Unexamined Patent Application Publication No. 2007-502454 describes a multi-view image generation unit that generates a depth map based on edges detected in an input image.
- the input image depth map generation unit 361 is an example of a depth information generation unit described in the claims.
- the horopter plane depth map generator 362 generates a depth map for the horopter plane.
- the size of the horopter circle is specified by the horopter circle information, and the relative positional relationship with both eyes is specified by the interocular distance “2a”, as in the first embodiment.
- the depth map synthesis unit 363 synthesizes the depth map of the input image generated by the input image depth map generation unit 361 and the depth map of the horopter plane generated by the horopter plane depth map generation unit 362.
- the depth map synthesis unit 363 is an example of a depth information synthesis unit described in the claims.
- the display surface right-eye projection unit 366 projects the non-stereo image supplied from the signal processing unit 120 via the signal line 129 onto the display surface for the right eye, taking into account the synthesized depth map produced by the depth map synthesis unit 363, to generate a right-eye stereoscopic image.
- the display surface left-eye projection unit 367 projects the non-stereo image supplied from the signal processing unit 120 via the signal line 129 onto the display surface for the left eye, taking into account the synthesized depth map produced by the depth map synthesis unit 363, to generate a left-eye stereoscopic image.
- FIG. 24 is a diagram showing an outline of processing according to the sixth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- the depth information is estimated by the input image depth map generation unit 361 and synthesized with the horopter plane.
- since the synthesized point 621 lies in front of the horopter circle, it is perceived as being at a nearer position when projected onto the display surface.
- since the synthesized point 622 lies behind the horopter circle, it is perceived as being at a farther position when projected onto the display surface.
- FIG. 25 is a diagram illustrating an example of a depth map of a horopter plane according to a sixth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- the depth map of the horopter plane is represented by dp_H.
- the horopter plane is a three-dimensional shape including a horopter circle, and is specified by a distance from the x plane. That is, the depth map of the horopter plane is a function of x and y and is expressed by the following equation.
- FIG. 26 is a diagram illustrating an example of estimating depth information according to the sixth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- the depth map of the depth information is represented by dp_I.
- Depth information indicates the depth corresponding to each pixel, and is represented as three-dimensional information. That is, the depth map of the depth information is a function of x and y and is expressed by the following equation.
- FIG. 27 is a diagram illustrating a configuration example of the depth map synthesis unit 363 of the sixth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- the depth map synthesis unit 363 synthesizes the depth map of the input image generated by the input image depth map generation unit 361 and the depth map of the horopter plane generated by the horopter plane depth map generation unit 362 as described above. It is.
- the depth map synthesis unit 363 includes an average value calculation unit 3631, a subtracter 3632, and an adder 3633.
- the average value calculation unit 3631 calculates the average value of the depth map for each input image.
- the subtracter 3632 subtracts the depth map's per-image average value from the depth map value at each pixel of the input image. This yields the AC component of the depth map, centered on the average value.
- the adder 3633 adds the AC component of the depth map of the input image supplied from the subtractor 3632 to the depth map of the horopter plane. This provides a combined depth map on the horopter plane.
- FIG. 28 is a diagram illustrating a stereoscopic image generation example according to the sixth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- in this example, the input non-stereo image is projected as the left-eye image 630 onto the curved surface 620 corresponding to the depth map, and each point on the curved surface 620 is then projected onto the right-eye image 640.
- a point 631 on the left-eye image 630 viewed from the left eye 512 is projected to a point 621 on the curved surface 620.
- when this point 621 is viewed by the right eye 511, it is projected onto the point 641 on the right-eye image 640.
- a point 632 on the left-eye image 630 viewed from the left eye 512 is projected to a point 622 on the curved surface 620.
- when this point 622 is viewed by the right eye 511, it is projected onto the point 642 on the right-eye image 640.
- the left-eye image 630 and the right-eye image 640 are shown as being shifted from each other in the z direction, but in actuality they are located on the same plane.
- FIG. 29 is a diagram illustrating another example of generating a stereoscopic image according to the sixth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
- in this example, the input non-stereo image is projected, as an image viewed from the center 513 between the right eye and the left eye (input image 650), onto the curved surface 620 corresponding to the depth map, and each point on the curved surface 620 is then projected onto the left-eye image 630 and the right-eye image 640.
- a point 651 on the input image 650 viewed from the center 513 is projected to a point 621 on the curved surface 620.
- when this point 621 is viewed with the left eye 512, it is projected onto the point 631 on the left-eye image 630; when viewed with the right eye 511, it is projected onto the point 641 on the right-eye image 640.
- a point 652 on the input image 650 viewed from the center 513 is projected to a point 622 on the curved surface 620.
- when this point 622 is viewed with the left eye 512, it is projected onto the point 632 on the left-eye image 630; when viewed with the right eye 511, it is projected onto the point 642 on the right-eye image 640.
- the left-eye image 630, the right-eye image 640, and the input image 650 are shown with their positions shifted in the z-direction, but they are actually all located on the same plane.
- the embodiment of the present invention is an example for embodying the present invention, and has a corresponding relationship with the invention specifying items in the claims as described above.
- the present invention is not limited to the embodiments, and various modifications can be made without departing from the scope of the present invention.
- the processing procedure described in the embodiment of the present invention may be regarded as a method having this series of procedures, as a program for causing a computer to execute the series of procedures, or as a recording medium storing that program.
- as this recording medium, for example, a CD (Compact Disc), an MD (MiniDisc), a DVD (Digital Versatile Disc), a memory card, or a Blu-ray Disc (registered trademark) can be used.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- Computing Systems (AREA)
- Optics & Photonics (AREA)
- Processing Or Creating Images (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Image Processing (AREA)
Abstract
Description
where
z² + x² = r²
ψ = tan⁻¹(z/x).
Thus, c = (r² − a²)^(1/2).
Also, since the distance to the viewer's observation position is the observation distance d, z_R = d, and the following holds:
x₀² + (z₀ − c)² = r²
z₀ > 0
Thus, the image to be projected at the position D(x_R, y_R, z_R) on the display surface 540 can be obtained from the position H(x₀, y₀, z₀); that is, {x₀, y₀, z₀} is obtained from {x_R, y_R, z_R}.
f = r + (r² − a²)^(1/2)
Moving r to the left side and squaring both sides gives
f² − 2rf + r² = r² − a²
so that
r = (f² + a²)/(2f)
Therefore, the radius “r” can be obtained from the distance “f” from the center of both eyes to the convergence point and the interocular distance “2a”. The relationship between the circumferential angle and the radius will be described in the next example (FIG. 15).
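The derivation above reduces to a one-line computation. A minimal sketch (the function name is illustrative) computing the horopter-circle radius from the convergence distance f and half the interocular distance a:

```python
def horopter_radius(f, a):
    # r = (f^2 + a^2) / (2 f), obtained from f = r + sqrt(r^2 - a^2)
    return (f * f + a * a) / (2.0 * f)
```

For a = 0 the formula degenerates to r = f/2, consistent with f = 2r when the eyes coincide, which is a quick sanity check on the algebra.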
q = r·cos φ
Here, the angle φ is the angle formed between the sides of lengths q and r. This angle φ is expressed by the following equation.
Of the size (width) “2m” projected on the display surface, let x be the portion passing through the above right triangle and y be the portion to its right. From the similarity relation in the right triangle,
p : x = q : (d − c)
Therefore, x = p·(d − c)/q.
Also, from the similarity relations in the right triangles at the vertex T,
t : a = (t + c) : s
t : a = (t + c + q) : p
Therefore, s is obtained as
s = (a·q − c·p)/(c + q)
Also, from the similarity relation in the triangle formed by the sides of length s and radius r,
s : y = q : (q − (d − c))
Therefore, y is given by
y = ((q − d + c)·(a·q − c·p))/(q·(c + q))
The sum of x and y obtained in this way is half “m” of the size (width) projected on the display surface:
m = p·(d − c)/q + ((q − d + c)·(a·q − c·p))/(q·(c + q))
FIG. 11 is a diagram showing a specific example of the projection onto the horopter plane in the second example of the three-dimensional conversion unit 130 in the embodiment of the present invention. In the second example, as described above, the convergence point is set so that the projection width on the display surface 540 fills the display surface 540. Therefore, as shown in the figure, the width “2M” of the display surface 540 itself is made to coincide with the projection width “2m” on the display surface obtained from the above equation.
On the other hand, at the position 546 on the display surface 540 corresponding to the position 536 on the horopter circle, assuming a right triangle having the observation distance “d” as one side and the angle “θ”, the displacement width Q2 is Q2 = 2d·(tanθ − tan(θ − δ/2)).
Therefore, the skewness Q is obtained as
Q = (2d·(tanθ − tan(θ − δ/2)))/(2d·tan(δ/2))
= (tanθ − tan(θ − δ/2))/tan(δ/2)
FIG. 14 is a diagram showing an example of the relationship between the angle θ and the image skewness Q. Here, the skewness Q is shown for the case where the minute angle δ in FIG. 13 is “0.01” and the angle θ is varied from “−70°” to “+70°”.
Also, focusing on the right triangle formed by the center point between the right eye 511 and the left eye 512, the center point o of the horopter circle 520, and the left eye 512, the radius r is expressed as r = a/sin τ; that is, the radius “r” can be obtained from the circumferential angle “τ” and the interocular distance “2a”.
Therefore, eliminating the radius r from the above two equations yields the following equation.
From this, it can be seen that the circumferential angle “τ” is obtained if the angle “θ”, half the interocular distance “a”, and the size “L” of the input image are known.
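Combining the two relations above (r = a/sin τ from the right triangle, and the arc condition 2·(θ − τ/2)·r = L from the third example), τ can be found numerically. A hedged Python sketch using bisection; the function name and the explicit combined form 2·(θ − τ/2)·a/sin τ = L are reconstructions from the surrounding text, not the patent's own notation:

```python
import math

def solve_tau(theta, a, L, iters=200):
    # Bisection on f(tau) = 2*(theta - tau/2)*a/sin(tau) - L, which is
    # large and positive as tau -> 0 and negative near tau = 2*theta
    # (valid while 2*theta < pi so that sin(tau) stays positive).
    f = lambda t: 2.0 * (theta - t / 2.0) * a / math.sin(t) - L
    lo, hi = 1e-9, 2.0 * theta - 1e-9
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

A closed-form solution is not needed here; since the left-hand side decreases monotonically in τ over the relevant range, the bisection converges to the unique circumferential angle satisfying the relation.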
Since
x : a = (x + d) : M
is satisfied, x is given by
x = a·d/(M − a).
Also, as calculated in the third example, the angle subtending the center point o of the horopter circle 520 from the intersection 537 and the end point 538 of the input image is “2·(θ − τ/2)”. Since the arc in this case coincides with the size (width) “L” of the input image, the relation 2·(θ − τ/2)·r = L holds.
While the size of the input image was fixed in the third example, in this fourth example the size of the input image is variable, and the scaling unit 343 scales the input image so as to satisfy the above equation.
However, for this left-eye tilt image L(x, y, z),
z = k − ((x + a − k·tan(τ/2))/sin(τ/2)).
However, for this right-eye tilt image R(x, y, z),
z = k + ((x − a + k·tan(τ/2))/sin(τ/2)).
Also, since the distance to the viewer's observation position is the observation distance d, z_R = d, and the following holds:
z₀ > 0
where, as described with reference to FIG. 20,
z₀ = k − ((x₀ + a − k·tan(τ/2))/sin(τ/2)).
Here, letting d_i be the distance of the horopter plane from the x plane at coordinates (x_i, y_i), the function z(x, y) is given by the following equation.
FIG. 26 is a diagram showing an example of estimating depth information in the sixth example of the three-dimensional conversion unit 130 in the embodiment of the present invention.
Here, letting e_i be the value of the depth information from the x plane at coordinates (x_i, y_i), the function z(x, y) is given by the following equation.
FIG. 27 is a diagram showing a configuration example of the depth map synthesis unit 363 in the sixth example of the three-dimensional conversion unit 130 in the embodiment of the present invention. As described above, the depth map synthesis unit 363 synthesizes the depth map of the input image generated by the input image depth map generation unit 361 with the depth map of the horopter plane generated by the horopter plane depth map generation unit 362. The depth map synthesis unit 363 includes an average value calculation unit 3631, a subtracter 3632, and an adder 3633.
120 Signal processing unit
130 Three-dimensional conversion unit
140 Parameter setting unit
150 Observation distance measurement unit
160 Post-processing unit
170 Format conversion unit
180 Source selection unit
190 Display unit
311, 321, 331, 341 Horopter plane image projection unit
316, 326, 336, 346, 356, 366 Display surface right-eye projection unit
317, 327, 337, 347, 357, 367 Display surface left-eye projection unit
322, 332, 342 Convergence point setting unit
343 Scaling unit
354 Tilt plane right-eye setting unit
355 Tilt plane left-eye setting unit
361 Input image depth map generation unit
362 Horopter plane depth map generation unit
363 Depth map synthesis unit
520 Horopter circle
530 Horopter image
540, 570 Display surface
620 Post-synthesis curved surface
630 Left-eye image
640 Right-eye image
650 Input image
Claims (13)
- A stereoscopic image generation apparatus comprising: a cylindrical surface projection unit that projects a two-dimensional input image onto a cylindrical surface including a virtual circle in contact with both eyes to generate a cylindrical image; and a display surface projection unit that projects the cylindrical image onto a display surface with reference to each of the two eyes to generate display images projected to each of the two eyes.
- The stereoscopic image generation apparatus according to claim 1, wherein the radius of the virtual circle is set according to an assumed observation distance or display size.
- The stereoscopic image generation apparatus according to claim 2, further comprising an observation distance measurement unit that measures the distance between the display surface and the observation position, wherein the radius of the virtual circle is set according to the observation distance measured by the observation distance measurement unit.
- The stereoscopic image generation apparatus according to claim 1, wherein the radius of the virtual circle is set so that the skewness of the display image is smaller than a predetermined threshold.
- The stereoscopic image generation apparatus according to claim 1, further comprising: a depth information generation unit that generates depth information from the two-dimensional input image; and a depth information synthesis unit that synthesizes the depth information with the cylindrical image, wherein the display surface projection unit projects the cylindrical image with the synthesized depth information onto the display surface to generate the display images.
- A stereoscopic image generation apparatus comprising: an irradiation surface projection unit that projects a two-dimensional input image onto two-dimensional planes orthogonal to the respective lines of sight of the two eyes to generate irradiation images corresponding to each of the two eyes; and a display surface projection unit that projects the corresponding irradiation image onto a display surface with reference to each of the two eyes to generate display images projected to each of the two eyes.
- The stereoscopic image generation apparatus according to claim 6, wherein the position of the irradiation image is set according to an assumed observation distance.
- The stereoscopic image generation apparatus according to claim 7, further comprising an observation distance measurement unit that measures the distance between the display surface and the observation position, wherein the position of the irradiation image is set according to the observation distance measured by the observation distance measurement unit.
- A stereoscopic image generation apparatus that converts a two-dimensional input image so that the pictures projected from the display surface to the right eye and to the left eye become identical, thereby generating a right-eye image and a left-eye image.
- A stereoscopic image generation method comprising: a cylindrical surface projection procedure of projecting a two-dimensional input image onto a cylindrical surface including a virtual circle in contact with both eyes to generate a cylindrical image; and a display surface projection procedure of projecting the cylindrical image onto a display surface with reference to each of the two eyes to generate display images projected to each of the two eyes.
- A stereoscopic image generation method comprising: an irradiation surface projection procedure of projecting a two-dimensional input image onto two-dimensional planes orthogonal to the respective lines of sight of the two eyes to generate irradiation images corresponding to each of the two eyes; and a display surface projection procedure of projecting the corresponding irradiation image onto a display surface with reference to each of the two eyes to generate display images projected to each of the two eyes.
- A program causing a computer to execute: a cylindrical surface projection procedure of projecting a two-dimensional input image onto a cylindrical surface including a virtual circle in contact with both eyes to generate a cylindrical image; and a display surface projection procedure of projecting the cylindrical image onto a display surface with reference to each of the two eyes to generate display images projected to each of the two eyes.
- A program causing a computer to execute: an irradiation surface projection procedure of projecting a two-dimensional input image onto two-dimensional planes orthogonal to the respective lines of sight of the two eyes to generate irradiation images corresponding to each of the two eyes; and a display surface projection procedure of projecting the corresponding irradiation image onto a display surface with reference to each of the two eyes to generate display images projected to each of the two eyes.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP09758305A EP2172906A4 (en) | 2008-06-06 | 2009-06-02 | DEVICE, METHOD AND PROGRAM FOR GENERATING STEREOSCOPIC IMAGES |
RU2010103049/08A RU2519518C2 (ru) | 2008-06-06 | 2009-06-02 | Stereoscopic image generation device, stereoscopic image generation method, and program |
CN2009801000468A CN101796548B (zh) | 2008-06-06 | 2009-06-02 | Stereoscopic image generation device and stereoscopic image generation method |
US12/671,367 US9507165B2 (en) | 2008-06-06 | 2009-06-02 | Stereoscopic image generation apparatus, stereoscopic image generation method, and program |
BRPI0903910-4A BRPI0903910A2 (pt) | 2008-06-06 | 2009-06-02 | Stereoscopic image generation apparatus, stereoscopic image generation method, and program |
HK10108896.6A HK1142427A1 (en) | 2008-06-06 | 2010-09-20 | Stereoscopic image generation device and stereoscopic image generation method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-149123 | 2008-06-06 | ||
JP2008149123A JP5083052B2 (ja) | 2008-06-06 | Stereoscopic image generation device, stereoscopic image generation method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009148038A1 true WO2009148038A1 (ja) | 2009-12-10 |
Family
ID=41398115
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/060028 WO2009148038A1 (ja) | 2009-06-02 | Stereoscopic image generation device, stereoscopic image generation method, and program |
Country Status (10)
Country | Link |
---|---|
US (1) | US9507165B2 (ja) |
EP (1) | EP2172906A4 (ja) |
JP (1) | JP5083052B2 (ja) |
KR (1) | KR20110034575A (ja) |
CN (2) | CN101796548B (ja) |
BR (1) | BRPI0903910A2 (ja) |
HK (1) | HK1142427A1 (ja) |
RU (1) | RU2519518C2 (ja) |
TW (1) | TW201011689A (ja) |
WO (1) | WO2009148038A1 (ja) |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5235976B2 (ja) * | 2010-05-31 | 2013-07-10 | Sony Computer Entertainment Inc. | Video playback method and video playback device |
KR20120051308A (ko) * | 2010-11-12 | 2012-05-22 | Samsung Electronics Co., Ltd. | Method and apparatus for improving the 3D stereoscopic effect and reducing viewing fatigue |
KR20120052091A (ko) * | 2010-11-15 | 2012-05-23 | Samsung Electronics Co., Ltd. | Computer, monitor, recording medium, and method for providing three-dimensional images thereon |
US9210322B2 (en) | 2010-12-27 | 2015-12-08 | Dolby Laboratories Licensing Corporation | 3D cameras for HDR |
US20130147793A1 (en) * | 2011-12-09 | 2013-06-13 | Seongyeom JEON | Mobile terminal and controlling method thereof |
CN102692806B (zh) * | 2012-06-04 | 2015-08-05 | University of Jinan | Acquisition and formation method for free-viewpoint four-dimensional space video sequences |
CN104012088B (zh) * | 2012-11-19 | 2016-09-28 | Panasonic Intellectual Property Management Co., Ltd. | Image processing device and image processing method |
RU2600524C2 (ru) * | 2014-07-15 | 2016-10-20 | Saint Petersburg State University of Aerospace Instrumentation | Method for converting a 2D image into a quasi-stereoscopic 3D image |
KR102172388B1 (ko) * | 2014-09-11 | 2020-10-30 | LG Display Co., Ltd. | Curved display and image processing method therefor |
CN104240294B (zh) * | 2014-09-28 | 2017-10-20 | South China University of Technology | Three-dimensional reconstruction method based on the binocular horopter |
DE102014115331A1 (de) * | 2014-10-21 | 2016-04-21 | Isra Surface Vision Gmbh | Method and device for determining a three-dimensional distortion |
US9818021B2 (en) * | 2014-10-21 | 2017-11-14 | Isra Surface Vision Gmbh | Method for determining a local refractive power and device therefor |
WO2016202837A1 (en) * | 2015-06-16 | 2016-12-22 | Koninklijke Philips N.V. | Method and apparatus for determining a depth map for an image |
US9934615B2 (en) * | 2016-04-06 | 2018-04-03 | Facebook, Inc. | Transition between binocular and monocular views |
US10326976B2 (en) * | 2016-06-17 | 2019-06-18 | Industry-Academic Cooperation Foundation, Yonsei University | Method and apparatus for providing personal 3-dimensional image using convergence matching algorithm |
US9609197B1 (en) | 2016-08-19 | 2017-03-28 | Intelligent Security Systems Corporation | Systems and methods for dewarping images |
US9547883B1 (en) | 2016-08-19 | 2017-01-17 | Intelligent Security Systems Corporation | Systems and methods for dewarping images |
WO2021236468A1 (en) * | 2020-05-19 | 2021-11-25 | Intelligent Security Systems Corporation | Technologies for analyzing behaviors of objects or with respect to objects based on stereo imageries thereof |
Family Cites Families (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4875034A (en) * | 1988-02-08 | 1989-10-17 | Brokenshire Daniel A | Stereoscopic graphics display system with multiple windows for displaying multiple images |
US5644324A (en) * | 1993-03-03 | 1997-07-01 | Maguire, Jr.; Francis J. | Apparatus and method for presenting successive images |
JP3089306B2 (ja) * | 1993-08-26 | 2000-09-18 | Matsushita Electric Industrial Co., Ltd. | Stereoscopic image capturing and display device |
AUPN732395A0 (en) | 1995-12-22 | 1996-01-25 | Xenotech Research Pty Ltd | Image conversion and encoding techniques |
JP2846856B2 (ja) * | 1996-07-19 | 1999-01-13 | Sanyo Electric Co., Ltd. | Stereoscopic video display device |
US6229562B1 (en) * | 1997-07-08 | 2001-05-08 | Stanley H. Kremen | System and apparatus for the recording and projection of images in substantially 3-dimensional format |
ATE420528T1 (de) * | 1998-09-17 | 2009-01-15 | Yissum Res Dev Co | System und verfahren zur erzeugung und darstellung von panoramabildern und filmen |
US7271803B2 (en) | 1999-01-08 | 2007-09-18 | Ricoh Company, Ltd. | Method and system for simulating stereographic vision |
JP2000228748A (ja) * | 1999-02-05 | 2000-08-15 | Ricoh Co Ltd | Image input device |
US6507665B1 (en) * | 1999-08-25 | 2003-01-14 | Eastman Kodak Company | Method for creating environment map containing information extracted from stereo image pairs |
JP2001141417A (ja) * | 1999-11-11 | 2001-05-25 | Fuji Photo Film Co Ltd | Parallax amount correction device |
WO2001039512A1 (en) * | 1999-11-26 | 2001-05-31 | Sanyo Electric Co., Ltd. | Device and method for converting two-dimensional video to three-dimensional video |
JP3425402B2 (ja) | 2000-01-31 | 2003-07-14 | Communications Research Laboratory | Device and method for displaying stereoscopic images |
JP2001245322A (ja) * | 2000-03-01 | 2001-09-07 | Inst Of Physical & Chemical Res | Method and device for inputting and outputting stereoscopic images |
JP2002223458A (ja) * | 2001-01-26 | 2002-08-09 | Nippon Hoso Kyokai <Nhk> | Stereoscopic video creation device |
IL159537A0 (en) * | 2001-06-28 | 2004-06-01 | Omnivee Inc | Method and apparatus for control and processing of video images |
WO2003046832A1 (de) * | 2001-11-24 | 2003-06-05 | Tdv Technologies Corp. | Generation of a stereo image sequence from a 2D image sequence |
RU2287858C2 (ru) * | 2001-11-24 | 2006-11-20 | TDV Technologies Corp. | Creation of a sequence of stereoscopic images from a sequence of two-dimensional images |
JP3929346B2 (ja) | 2002-04-24 | 2007-06-13 | Hitachi Zosen Corporation | Stereo image display method and stereo image display device |
US7489812B2 (en) * | 2002-06-07 | 2009-02-10 | Dynamic Digital Depth Research Pty Ltd. | Conversion and encoding techniques |
US6890077B2 (en) * | 2002-11-27 | 2005-05-10 | The Boeing Company | Method and apparatus for high resolution video image display |
JP2004186863A (ja) * | 2002-12-02 | 2004-07-02 | Amita Technology Kk | Stereoscopic video display device and stereoscopic video signal processing circuit |
WO2005062105A1 (en) * | 2003-12-12 | 2005-07-07 | Headplay, Inc. | Optical arrangements for head mounted displays |
GB0412651D0 (en) * | 2004-06-07 | 2004-07-07 | Microsharp Corp Ltd | Autostereoscopic rear projection screen and associated display system |
JP4033859B2 (ja) * | 2004-12-28 | 2008-01-16 | Japan Science and Technology Agency | Stereoscopic image display method |
CN1330928C (zh) * | 2005-12-29 | 2007-08-08 | Tsinghua Unisplendour Co., Ltd. | Method for measuring object contours using dual-wavelength structured light |
CN100552539C (zh) * | 2006-01-10 | 2009-10-21 | Zhong Ming | Method for producing circular-screen stereoscopic movie images |
JP4614456B2 (ja) | 2006-07-07 | 2011-01-19 | The University of Tokyo | Retroreflective material, projection device, aircraft, and aircraft simulator |
US10908421B2 (en) * | 2006-11-02 | 2021-02-02 | Razer (Asia-Pacific) Pte. Ltd. | Systems and methods for personal viewing devices |
US8330801B2 (en) * | 2006-12-22 | 2012-12-11 | Qualcomm Incorporated | Complexity-adaptive 2D-to-3D video sequence conversion |
TWI331872B (en) * | 2006-12-29 | 2010-10-11 | Quanta Comp Inc | Method for displaying stereoscopic image |
2008
- 2008-06-06 JP JP2008149123A patent/JP5083052B2/ja not_active Expired - Fee Related
2009
- 2009-06-02 US US12/671,367 patent/US9507165B2/en not_active Expired - Fee Related
- 2009-06-02 KR KR1020107002176A patent/KR20110034575A/ko not_active Application Discontinuation
- 2009-06-02 CN CN2009801000468A patent/CN101796548B/zh not_active Expired - Fee Related
- 2009-06-02 RU RU2010103049/08A patent/RU2519518C2/ru not_active IP Right Cessation
- 2009-06-02 WO PCT/JP2009/060028 patent/WO2009148038A1/ja active Application Filing
- 2009-06-02 BR BRPI0903910-4A patent/BRPI0903910A2/pt not_active IP Right Cessation
- 2009-06-02 EP EP09758305A patent/EP2172906A4/en not_active Withdrawn
- 2009-06-02 CN CN201210034197.1A patent/CN102789058B/zh not_active Expired - Fee Related
- 2009-06-04 TW TW098118610A patent/TW201011689A/zh unknown
2010
- 2010-09-20 HK HK10108896.6A patent/HK1142427A1/xx not_active IP Right Cessation
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09102052A (ja) * | 1995-10-06 | 1997-04-15 | Olympus Optical Co Ltd | Stereoscopic image display device |
JP2001175885A (ja) * | 1999-12-20 | 2001-06-29 | Tomohiko Hattori | 2D-3D image conversion method and device for stereoscopic image display devices |
JP2002365593A (ja) | 2001-06-08 | 2002-12-18 | Sony Corp | Display device, position adjustment pattern display program, recording medium, polarized glasses, and filter position adjustment method for a display device |
JP2007502454A (ja) | 2003-08-05 | 2007-02-08 | Koninklijke Philips Electronics N.V. | Generation of multi-view images |
JP2005339313A (ja) * | 2004-05-28 | 2005-12-08 | Toshiba Corp | Image presentation method and apparatus |
WO2006017771A1 (en) * | 2004-08-06 | 2006-02-16 | University Of Washington | Variable fixation viewing distance scanned light displays |
Non-Patent Citations (2)
Title |
---|
JAN J KOENDERINK ET AL.: "Perception", vol. 23, 1997, PION PUBLICATION, article "On so-called paradoxical monocular stereoscopy", pages: 583 - 594
See also references of EP2172906A4 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130021446A1 (en) * | 2011-07-20 | 2013-01-24 | GM Global Technology Operations LLC | System and method for enhanced sense of depth video |
CN114390271A (zh) * | 2020-10-19 | 2022-04-22 | 苏州佳世达光电有限公司 | 判别连续影像顺序的系统及其方法 |
CN114390271B (zh) * | 2020-10-19 | 2023-08-18 | 苏州佳世达光电有限公司 | 判别连续影像顺序的系统及其方法 |
Also Published As
Publication number | Publication date |
---|---|
CN102789058B (zh) | 2015-03-18 |
TW201011689A (en) | 2010-03-16 |
US9507165B2 (en) | 2016-11-29 |
US20100201783A1 (en) | 2010-08-12 |
HK1142427A1 (en) | 2010-12-03 |
CN101796548A (zh) | 2010-08-04 |
CN101796548B (zh) | 2012-10-31 |
RU2010103049A (ru) | 2011-08-10 |
BRPI0903910A2 (pt) | 2015-06-30 |
RU2519518C2 (ru) | 2014-06-10 |
JP2009294988A (ja) | 2009-12-17 |
EP2172906A4 (en) | 2012-08-08 |
KR20110034575A (ko) | 2011-04-05 |
EP2172906A1 (en) | 2010-04-07 |
JP5083052B2 (ja) | 2012-11-28 |
CN102789058A (zh) | 2012-11-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5083052B2 (ja) | Stereoscopic image generation device, stereoscopic image generation method, and program | |
KR101629479B1 (ko) | High-density multi-view image display system and method using active sub-pixel rendering | |
TWI523488B (zh) | Method of processing parallax information comprised in a signal | |
JP5450330B2 (ja) | Image processing device and method, and stereoscopic image display device | |
WO2015161541A1 (zh) | Parallel synchronous scaling engine and method for multi-view naked-eye 3D display | |
US20120128234A1 (en) | System for Generating Images of Multi-Views | |
US9154762B2 (en) | Stereoscopic image system utilizing pixel shifting and interpolation | |
JP2011090400A (ja) | Image display device and method, and program | |
US5764236A (en) | Image data processing apparatus and image reproduction apparatus | |
JP2013065951A (ja) | Display device, display method, and program | |
US8976171B2 (en) | Depth estimation data generating apparatus, depth estimation data generating method, and depth estimation data generating program, and pseudo three-dimensional image generating apparatus, pseudo three-dimensional image generating method, and pseudo three-dimensional image generating program | |
JP5931062B2 (ja) | Stereoscopic image processing device, stereoscopic image processing method, and program | |
JP2013090129A (ja) | Image processing device, image processing method, and program | |
JP5423845B2 (ja) | Stereoscopic image generation device, stereoscopic image generation method, and program | |
KR101192121B1 (ko) | Method and apparatus for generating anaglyph images using binocular disparity and depth information | |
EP4030752A1 (en) | Image generation system and method | |
JP5708395B2 (ja) | Video display device and video display method | |
JP5780214B2 (ja) | Depth information generation device, depth information generation method, depth information generation program, and pseudo-stereoscopic image generation device | |
TW201332338A (zh) | Stereoscopic image processing device and stereoscopic image processing method | |
Johnson | Practical Stereo Rendering | |
JP2012105169A (ja) | Stereoscopic video imaging device and stereoscopic video imaging method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980100046.8 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 7368/CHENP/2009 Country of ref document: IN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009758305 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 20107002176 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010103049 Country of ref document: RU Ref document number: 12671367 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09758305 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: PI0903910 Country of ref document: BR Kind code of ref document: A2 Effective date: 20100129 |