WO2012169173A1 - Parallax image generating device, parallax image generating method, program, and integrated circuit - Google Patents
Parallax image generating device, parallax image generating method, program, and integrated circuit
- Publication number
- WO2012169173A1 (PCT/JP2012/003681)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- depth
- parallax
- depth value
- value
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
- H04N13/305—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/341—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
Definitions
- the present invention relates to a parallax image generating device for generating a parallax image for representing a three-dimensional image, a parallax image generating method, a program, and an integrated circuit.
- Patent Document 1 discloses a technique (hereinafter referred to as prior art A) for eliminating an unnatural appearance at the end of a three-dimensional image.
- In prior art A, a frame image is arranged so that the end of the three-dimensional image is covered by the image of a three-dimensional frame. Prior art A therefore has a problem in that the display size of the three-dimensional image is reduced.
- the present invention has been made to solve such a problem, and aims to provide a parallax image generating device and the like capable of suppressing an unnatural appearance at the edge of a three-dimensional image without reducing the size of the three-dimensional image.
- a parallax image generating device performs processing using a depth image for generating, from a two-dimensional image to be processed, two parallax images having parallax with each other that are necessary for expressing a three-dimensional image. The depth image is composed of a plurality of depth values.
- the parallax image generation apparatus corrects each depth value so that, among the plurality of depth values constituting the depth image, the closer a depth value is to the end of the depth image, the closer the position corresponding to that depth value approaches the display surface on which the parallax images are displayed.
- FIG. 1 is a diagram showing an example of the configuration of a three-dimensional image viewing system according to a first embodiment of the present invention.
- FIG. 2 is a diagram showing an example of the configuration of a parallax image generating device according to the first embodiment of the present invention.
- FIG. 3 is a diagram for explaining a depth image.
- FIG. 4 is a diagram for explaining a three-dimensional image.
- FIG. 5 is a view showing an image for left eye and an image for right eye as an example.
- FIG. 6 is a diagram showing the arrangement of objects in a three-dimensional area.
- FIG. 7 is a view of the arrangement position of each object in a three-dimensional area as viewed from the ZX plane.
- FIG. 8 is a flowchart of parallax image generation processing.
- FIG. 9 is a flowchart of depth image correction processing.
- FIG. 10 is a diagram for explaining the correction target area in the depth image.
- FIG. 11 is a diagram showing the state of the depth value after correction.
- FIG. 12 is a view showing an example of a depth image.
- FIG. 13 is a diagram for explaining one line in the two-dimensional image to be processed.
- FIG. 14 is a diagram showing the arrangement of pixel groups in a three-dimensional area.
- FIG. 15 is a diagram for explaining the shift of the pixel.
- FIG. 16 is a diagram showing an image for the left eye and an image for the right eye generated by the parallax image generation process according to the first embodiment of the present invention.
- FIG. 17 is a diagram showing the arrangement of objects in a three-dimensional area.
- FIG. 18 is a perspective view showing the position of an object represented by an image for the left eye and an image for the right eye in a three-dimensional area.
- FIG. 19 is an external view of a parallax image generating device as a display.
- FIG. 20A is an external view of a parallax image generating device as a digital still camera.
- FIG. 20B is an external view of a parallax image generating device as a digital video camera.
- FIG. 21A is a diagram showing an example of a physical format of a recording medium according to the second embodiment of the present invention.
- FIG. 21B is a view showing the configuration of a recording medium according to the second embodiment of the present invention.
- FIG. 21C is a diagram showing the configuration of a computer system according to the second embodiment of the present invention.
- a parallax image generating device performs processing using a depth image for generating, from a two-dimensional image to be processed, two parallax images having parallax with each other that are necessary for expressing a three-dimensional image. The depth image is composed of a plurality of depth values.
- the parallax image generation apparatus corrects each depth value so that, among the plurality of depth values constituting the depth image, the closer a depth value is to the end of the depth image, the closer the position corresponding to that depth value approaches the display surface on which the parallax images are displayed.
- the object A is displayed so as to overlap the end of the three-dimensional image.
- the object A appears to break off at the end of the three-dimensional image.
- the depth value is corrected so that the closer the depth value is to the end of the depth image, the closer the position corresponding to that depth value approaches the display surface on which the parallax images are displayed. Then, the first and second parallax images are generated using the depth image corrected by the correction processing.
- the unnatural appearance of an object displayed at the end of the three-dimensional image can thus be suppressed without reducing the size of the three-dimensional image.
- the depth value correction unit may further perform the correction processing on the plurality of depth values corresponding to a correction target area, which is an area extending from the end of the depth image to a position L (an integer of 1 or more) pixels away.
- the depth value correction unit may perform the correction processing on, among the plurality of depth values corresponding to the correction target area, depth values for expressing pixels in front of the display surface on which the generated first and second parallax images are displayed.
- the correction processing may be performed on depth values for expressing a part of pixels in a two-dimensional image.
- the depth value correction unit may increase L as the size of the two-dimensional image in the horizontal direction is larger.
- the depth value correction unit may extract, from among the depth values included in the correction target area, the depth value whose corresponding position is farthest on the near side of the display surface, and increase the value of L as the position corresponding to the extracted depth value is farther on the near side of the display surface.
- the depth value correction unit may perform the correction processing on at least one of the correction target area at the left and right ends and the correction target area at the upper and lower ends of the depth image.
- when the two-dimensional image was captured while panning the imaging device, the depth value correction unit may increase the value of L of the correction target areas at the left and right ends of the depth image.
- when the two-dimensional image was captured while tilting the imaging device, the depth value correction unit may increase the value of L of the correction target areas at the upper and lower ends of the depth image.
- the depth value correction unit may increase the value of L of the corresponding correction target area as the pan or tilt speed of the imaging device increases.
- the depth value correction unit may make the correction target area positioned in the direction in which the imaging device faces, of the end of the depth image, larger than the correction target area positioned on the opposite side.
- the depth value correction unit may correct the depth value closest to the end of the depth image, among the plurality of depth values constituting the depth image, to a value for representing a pixel on the display surface on which the generated first and second parallax images are displayed.
- a parallax image generation method performs processing using a depth image for generating, from a two-dimensional image to be processed, two parallax images having parallax with each other that are necessary for expressing a three-dimensional image.
- the depth image is composed of a plurality of depth values.
- the parallax image generation method includes a step of performing correction processing to correct each depth value so that, among the plurality of depth values constituting the depth image, the closer a depth value is to the end of the depth image, the closer the position corresponding to that depth value approaches the display surface on which the parallax images are displayed, and a step of generating the first and second parallax images having parallax with each other using the two-dimensional image and the depth image corrected by the correction processing.
- a program causes a computer to perform processing using a depth image for generating, from a two-dimensional image to be processed, two parallax images having parallax with each other that are necessary for expressing a three-dimensional image.
- The depth image is composed of a plurality of depth values.
- the program causes the computer to execute a step of performing correction processing to correct each depth value so that, among the plurality of depth values constituting the depth image, the closer a depth value is to the end of the depth image, the closer the position corresponding to that depth value approaches the display surface on which the parallax images are displayed, and a step of generating the first and second parallax images having parallax with each other using the two-dimensional image and the depth image corrected by the correction processing.
- An integrated circuit performs processing using a depth image for generating, from a two-dimensional image to be processed, two parallax images having parallax with each other that are necessary for expressing a three-dimensional image.
- the depth image is composed of a plurality of depth values.
- the integrated circuit includes a depth value correction unit that performs correction processing to correct each depth value so that, among the plurality of depth values constituting the depth image, the closer a depth value is to the end of the depth image, the closer the position corresponding to that depth value approaches the display surface on which the parallax images are displayed, and a parallax image generating unit that generates the first and second parallax images having parallax with each other using the two-dimensional image and the depth image corrected by the correction processing.
- FIG. 1 is a diagram showing an example of the configuration of a three-dimensional image viewing system 1000 according to the first embodiment of the present invention.
- the X, Y, and Z directions are orthogonal to one another.
- Each of the X, Y, and Z directions shown in the following figures is likewise orthogonal to the others.
- the three-dimensional image viewing system 1000 includes a parallax image generating device 100 and active shutter glasses 200.
- the parallax image generation device 100 is, for example, a plasma display, a liquid crystal display, an organic EL display, or the like.
- the parallax image generation device 100 is not limited to the above display, and may be a digital video camera, a digital still camera, or the like.
- the parallax image generation device 100 may be a device incorporated in a display or a camera.
- the parallax image generation device 100 includes a display surface 101 for displaying an image.
- the display surface 101 is assumed to be parallel to the XY plane. As an example, it is assumed that the display surface 101 can display an image composed of a plurality of pixels arranged in m (natural number) rows and n (natural number) columns.
- m and n are assumed to be 1080 and 1920, respectively. That is, it is assumed that the display surface 101 can display an image having a size of 1920 × 1080 pixels (hereinafter also referred to as full HD size). In the following, the size of an image that can be displayed on the display surface 101 is also referred to as the displayable size.
- the displayable size is not limited to the full HD size, and may be, for example, 1366 × 768 pixels.
- parallax image generation apparatus 100 is an apparatus that displays parallax images for representing a three-dimensional image by, for example, a frame sequential method.
- the size of the parallax image displayed on the display surface 101 is equal to the displayable size.
- the display method of the three-dimensional image in the parallax image generation apparatus 100 is not limited to a frame sequential method.
- the display method of the three-dimensional image in the parallax image generation device 100 may be, for example, a lenticular method.
- the size of the three-dimensional image represented by the image displayed on the display surface 101 is smaller than the displayable size.
- the left-eye image 21L is an image to be shown to the left eye (hereinafter, also referred to as a first viewpoint) of the user (viewer).
- the right-eye image 21R is an image to be shown to the right eye (hereinafter, also referred to as a second viewpoint) of the user.
- the left-eye image 21L and the right-eye image 21R are two-dimensional images having parallax.
- the parallax image generation device 100 alternately displays the left-eye image 21L and the right-eye image 21R on the display surface 101.
- When the left-eye image 21L is displayed on the display surface 101, the active shutter glasses 200 show the left-eye image 21L only to the user's left eye by shielding the user's right eye. Similarly, when the right-eye image 21R is displayed on the display surface 101, the active shutter glasses 200 show the right-eye image 21R only to the user's right eye by shielding the user's left eye.
- the user wearing the active shutter glasses 200 having such a configuration can view the left-eye image 21L with the left eye, and can view the right-eye image 21R with the right eye.
- the user can view a three-dimensional image represented by the left-eye image 21L and the right-eye image 21R.
- the display method of the three-dimensional image is not limited to the frame sequential method using the active shutter glasses 200.
- the display method of the three-dimensional image may be a method using polarized glasses.
- the display method of the three-dimensional image may be a method using a parallax barrier, a lenticular sheet or the like.
- FIG. 2 is a diagram showing an example of the configuration of the parallax image generating device 100 according to the first embodiment of the present invention.
- the parallax image generation device 100 includes a depth value correction unit 110 and a parallax image generation unit 120.
- the depth value correction unit 110 performs processing using a depth image, the details of which will be described later.
- the depth image corresponds to, for example, a depth map.
- the depth image is an image used to generate an image for the left eye and an image for the right eye as parallax images from the two-dimensional image to be processed. That is, the depth image is an image for generating two parallax images having parallax from each other from the two-dimensional image to be processed.
- the two parallax images (an image for the left eye and an image for the right eye) are images necessary to represent a three-dimensional image.
- FIG. 3 is a diagram for explaining a depth image.
- the depth image is composed of a plurality of depth values.
- the plurality of depth values correspond to pixel values of a plurality of pixels forming the depth image.
- the plurality of depth values constituting the depth image are arranged in a matrix.
- z[mn] indicates the depth value of the pixel corresponding to row m, column n in the depth image. That is, z[mn] indicates the depth value of the pixel at coordinates (n, m) in the depth image. Also, for example, z[12] indicates the depth value of the pixel corresponding to row 1, column 2 in the depth image.
- the depth value is represented in the range of −1 to 1 as an example.
- the depth value is not limited to the range of -1 to 1, and may be represented, for example, in the range of 0 to 255.
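As a hedged illustration (not part of the patent text), a depth image can be held as a two-dimensional array whose entry at row m, column n plays the role of z[mn], and a depth map stored in the 0-to-255 range can be remapped to the −1-to-1 range; the direction of the mapping (0 nearest, 255 farthest) is an assumption chosen for this sketch:

```python
def normalize_depth(d8):
    """Remap an 8-bit depth value (0..255) to the -1..1 range.

    Mapping direction is an assumption for illustration: 0 -> -1
    (nearest to the viewer), 255 -> 1 (farthest behind the screen).
    """
    return (d8 / 255.0) * 2.0 - 1.0

# A tiny 2-row x 3-column depth image; depth_image[m][n] plays the
# role of z[mn] in the text (row m, column n).
depth_image = [
    [normalize_depth(v) for v in row]
    for row in [[0, 128, 255], [64, 128, 192]]
]
```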
- FIG. 4 is a diagram for explaining a three-dimensional image.
- (a) of FIG. 4 shows the two-dimensional image 10 as an example.
- the two-dimensional image 10 is an image to be processed when generating a parallax image for expressing a three-dimensional image.
- Three objects 11, 12, and 13 are arranged in the two-dimensional image 10 shown in (a) of FIG. 4.
- the depth image D10 is an image for generating two parallax images having parallax from each other from the two-dimensional image 10 to be processed.
- the two parallax images are the left-eye image 20L and the right-eye image 20R, or the left-eye image 21L and the right-eye image 21R described later.
- the size (resolution) of the two-dimensional image 10 is the same as the size (resolution) of the depth image D10.
- each of the plurality of pixels constituting the depth image is also referred to as a depth pixel.
- the depth pixel indicates a depth value. That is, the depth image is composed of a plurality of depth values.
- the depth image D10 is configured of a plurality of depth pixels indicating depth values.
- Each depth pixel constituting the depth image D10 indicates the depth value of the pixel at the same coordinate as the coordinate of the depth pixel in the two-dimensional image 10.
- the pixel at coordinates (x, y) in the depth image D10 indicates the depth value of the pixel at coordinates (x, y) in the two-dimensional image 10. That is, the two-dimensional image 10 is an image corresponding to the depth image D10. Further, the pixel at the coordinate (x, y) in the two-dimensional image 10 is a pixel corresponding to the depth value at the coordinate (x, y) in the depth image D10.
- In the depth image D10, as an example, a depth pixel closer to white indicates a depth value for expressing that the corresponding pixel in the three-dimensional image is positioned closer to the near side of the display surface 101, and a depth pixel closer to black indicates a depth value for expressing that the corresponding pixel is positioned farther behind the display surface 101.
- the depth image D10 includes depth images D11, D12, and D13.
- the plurality of pixels forming the depth image D11 indicate the depth values of the plurality of pixels forming the object 11, respectively.
- the depth images D12 and D13 are also similar to the depth image D11.
- an image for the left eye and an image for the right eye are generated from the two-dimensional image by a DIBR (Depth Image Based Rendering) method or the like using a depth image.
- FIG. 5 is a view showing an image 20L for the left eye and an image 20R for the right eye as an example. It is assumed that the left-eye image 20L and the right-eye image 20R are images that have not been subjected to the processing of the present invention.
- (a) of FIG. 5 shows the left-eye image 20L as an example.
- the left-eye image 20L includes objects 11, 12, and 13 in which each pixel of the two-dimensional image 10 is moved (shifted) according to a plurality of corresponding depth values.
- (b) of FIG. 5 shows the right-eye image 20R as an example.
- the right-eye image 20R includes objects 11, 12, and 13 in which each pixel of the two-dimensional image 10 is moved (shifted) according to a plurality of corresponding depth values.
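The shifting just described can be sketched in code. The following is a minimal, illustrative take on DIBR-style rendering for a single scan line, not the patent's actual implementation: each pixel moves horizontally by a disparity proportional to its depth value, in opposite directions for the two eyes; the `gain` scale and the hole handling (`None`) are assumptions for this sketch:

```python
def shift_line(line, depths, eye, gain=1.0):
    """Sketch of a DIBR-style horizontal shift for one scan line.

    line   : list of pixel values for one row of the 2-D image
    depths : matching depth values in -1..1 (negative = in front of
             the display surface, positive = behind it, as in the text)
    eye    : +1 for the left-eye image, -1 for the right-eye image
    gain   : assumed disparity scale in pixels (not from the patent)

    Pixels with depth 0 (the zero-parallax plane) stay in place;
    other pixels move horizontally, in opposite directions for each
    eye. Holes left behind by the shift keep the value None.
    """
    out = [None] * len(line)
    for x, (p, z) in enumerate(zip(line, depths)):
        d = int(round(eye * gain * z))  # signed disparity in pixels
        if 0 <= x + d < len(line):
            out[x + d] = p
    return out

# One row with a single "pop-out" pixel (depth -1) at index 2:
left = shift_line([10, 20, 30, 40], [0.0, 0.0, -1.0, 0.0], eye=+1)
right = shift_line([10, 20, 30, 40], [0.0, 0.0, -1.0, 0.0], eye=-1)
```

The pop-out pixel lands at different columns in the two outputs, which is exactly the parallax the viewer fuses into depth.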
- the user can feel a three-dimensional effect as if each of the objects 11, 12, and 13 were disposed at the positions shown in FIG. 6.
- the three-dimensional area R10 is an area capable of expressing a three-dimensional image to the user by a plurality of parallax images (for example, an image for the left eye and an image for the right eye).
- the user feels that the stereoscopic effect of the object 11 has suddenly disappeared near the left end of the object 11 (that is, the portion outside the three-dimensional area R10 of the object 11).
- to the user, the portion of the object 11 located outside the three-dimensional area R10 appears to flicker.
- the Z-axis direction of the three-dimensional region R10 indicates a depth value. That is, the depth value indicates a position for expressing each pixel of the three-dimensional image in the three-dimensional area R10 (three-dimensional space).
- the three-dimensional region R10 is represented by depth values in the range of −1 to 1, for example.
- the display surface 101 is a parallax zero surface.
- the parallax zero plane is a plane in which the parallax of the pixels at the same position of the left eye image and the right eye image displayed on the parallax zero plane is zero.
- the depth value corresponding to the zero parallax surface is also referred to as a zero parallax depth value.
- the parallax zero depth value at the position of the display surface 101 (parallax zero surface) in the Z-axis direction is represented by 0 as an example.
- the depth value of the position of the zero parallax surface may be represented by a numerical value other than zero.
- the depth value on the near side of the display surface 101 in the Z-axis direction is expressed as a negative value as an example.
- the depth value on the back side of the display surface in the Z-axis direction is expressed as a positive value as an example.
- FIG. 7 is a view of the arrangement position of each object in the three-dimensional area R10 as viewed from the ZX plane.
- FIG. 7 shows, as an example, an arrangement relationship between each viewpoint and each object when the left eye and the right eye of the user are arranged on the X axis.
- the viewpoint S0 is a position obtained by projecting the center position of the display surface 101 (parallax zero surface) on the X axis.
- the viewpoint S1 corresponds to the position of the user's left eye.
- the viewpoint S2 corresponds to the position of the right eye of the user.
- the region between the line L11 and the line L12 is an image (for example, the left-eye image 20L) that is expressed when the display surface 101 is viewed from the viewpoint S1.
- an area between the line L21 and the line L22 is an image (for example, the right-eye image 20R) which is expressed when the display surface 101 is viewed from the viewpoint S2.
- Next, processing for generating parallax images in the present embodiment (hereinafter referred to as parallax image generation processing) will be described.
- the parallax image generation unit 120 acquires the two-dimensional image 10 to be processed.
- the depth value correction unit 110 in FIG. 2 acquires a depth image D10 corresponding to the two-dimensional image 10.
- FIG. 8 is a flowchart of parallax image generation processing.
- the parallax image generation process corresponds to a parallax image generation method.
- In step S110, depth image correction processing is performed.
- FIG. 9 is a flowchart of depth image correction processing.
- In step S111, the depth value correction unit 110 sets one of the plurality of pixels forming the depth image to be processed as a pixel to be processed (hereinafter also referred to as the processing target depth pixel).
- Hereinafter, the depth value indicated by the processing target depth pixel is also referred to as the depth value z, or simply z.
- the depth value correction unit 110 determines whether the depth value indicated by the processing target depth pixel is a projection value.
- the pop-out value is a value for representing the pixel in the three-dimensional image corresponding to the depth value indicated by the processing target depth pixel at a position in front of the zero parallax surface.
- As an example, the parallax zero depth value corresponding to the parallax zero plane is 0, and the depth value is represented in the range of −1 to 1. In this case, the pop-out value is a value in the range −1 ≤ pop-out value < 0.
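Under those conventions (zero-parallax depth 0, depth range −1 to 1), the pop-out determination reduces to a one-line range check; the following is a minimal sketch, not the patent's implementation:

```python
ZERO_PARALLAX = 0.0  # depth value of the display surface (parallax zero plane)

def is_popout(z):
    """True if depth value z represents a pixel in front of the
    zero-parallax surface, i.e. -1 <= z < 0 in the text's range."""
    return -1.0 <= z < ZERO_PARALLAX
```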
- If YES in step S111, the process proceeds to step S112. On the other hand, if NO in step S111, the process for the current processing target depth pixel ends. In the process of step S111, a different pixel is set as the processing target depth pixel each time.
- In step S112, the depth value correction unit 110 determines whether the processing target depth pixel is a pixel within the correction target area.
- the correction target area is an area in the depth image.
- FIG. 10 is a diagram for explaining the correction target area in the depth image.
- the depth image shown in FIG. 10 is a depth image D10.
- the image which the depth image D10 shows is not shown for the simplification of a figure.
- the width L of the correction target area is calculated by the depth value correction unit 110 multiplying the width W of the depth image by a predetermined coefficient k (0 < k < 1). It is assumed that k is, for example, 0.1 (or 0.05). When the width of the depth image is, for example, 1920 pixels, L is 192 pixels (or 96 pixels).
- the width L of the correction target area may be calculated by the depth value correction unit 110 multiplying the width of the two-dimensional image to be processed by the coefficient k.
- the width of the two-dimensional image to be processed is equal to the width W of the depth image to be processed. That is, the depth value correction unit 110 calculates the value of L based on the size of the two-dimensional image or the depth image in the horizontal direction. More specifically, the depth value correction unit 110 increases the value of L as the size in the horizontal direction of the two-dimensional image or depth image is larger.
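The calculation just described is straightforward; a minimal sketch, with the constraint 0 < k < 1 and the example coefficients taken from the text:

```python
def correction_width(image_width, k=0.1):
    """Width L of the correction target area: L = k * W.

    image_width : horizontal size W of the depth image (or of the
                  two-dimensional image, which has the same width)
    k           : predetermined coefficient, 0 < k < 1 (e.g. 0.1 or 0.05)
    """
    if not 0.0 < k < 1.0:
        raise ValueError("k must satisfy 0 < k < 1")
    return int(image_width * k)
```

For a full-HD-width image this reproduces the figures in the text: L = 192 pixels for k = 0.1, or 96 pixels for k = 0.05.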
- Correction target areas R21 and R22 are arranged in the depth image.
- the correction target area R21 is an area in the depth image extending from the left end of the depth image to a position separated by the distance L.
- the distance L is equal to the width of L (an integer of 1 or more) pixels continuously arranged in the X direction. That is, the correction target area R21 is an area to a position away from the left end of the depth image by L pixels in the depth image.
- the correction target area R22 is an area from the right end of the depth image to a position separated by the distance L. That is, the correction target area R22 is an area of the depth image to a position away from the right end of the depth image by L pixels.
- correction target areas R21 and R22 are areas extending from the left and right ends of the depth image to positions L (an integer of 1 or more) pixels away.
- the correction target areas R21 and R22 are determined based on the size of the two-dimensional image or the depth image in the horizontal direction.
- the correction target regions R21 and R22 can each have a width of about 5% of the size of the two-dimensional image in the horizontal direction.
- the method of determining the correction target regions R21 and R22 is not limited to the above, and may be determined by the following method, for example.
- the correction target areas R21 and R22 may be predetermined areas. That is, L may be a predetermined value.
- the value of L may be determined according to the depth values included in the correction target areas R21 and R22. More specifically, the depth value correction unit 110 extracts, from among the depth values included in the correction target area, the depth value whose corresponding position is farthest on the near side of the display surface (in the above example, the value closest to −1). Then, the depth value correction unit 110 may increase the value of L as the position corresponding to the extracted depth value is farther on the near side of the display surface (that is, as the extracted depth value is closer to −1).
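That depth-dependent widening can be sketched as follows. The linear scaling (up to doubling L when the extracted depth reaches −1) is an assumption made for illustration; the text only requires that L increase monotonically with the protrusion:

```python
def widen_by_depth(base_width, area_depths):
    """Grow L as the most protruding depth in the correction target
    area approaches -1 (farthest on the near side of the display).

    base_width  : initial width L of the correction target area
    area_depths : depth values (-1..1) inside the area
    """
    nearest = min(area_depths)          # depth value closest to -1
    protrusion = max(0.0, -nearest)     # 0..1; back-side depths don't widen
    return int(base_width * (1.0 + protrusion))
```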
- the correction target areas R21 and R22 are provided at the left and right ends of the depth image, but instead of or in addition to this, correction target areas may be provided at the upper and lower ends of the depth image. That is, the correction target area may be provided on at least one of the left and right ends and the upper and lower ends of the depth image.
- the depth value correction unit 110 may change the widths (that is, the values of L) of the correction target areas at the left and right ends and the upper and lower ends of the depth image based on features of the corresponding two-dimensional image.
- For example, when the two-dimensional image was captured while panning (moving the orientation of the imaging device in the horizontal direction) or tilting (moving the orientation of the imaging device in the vertical direction), the depth value correction unit 110 changes the width of the corresponding correction target area of the depth image.
- Specifically, when the two-dimensional image was captured while panning, the depth value correction unit 110 increases the width (that is, the value of L) of the correction target areas at the left and right ends of the depth image compared to the case where the image was captured without panning.
- Similarly, when the two-dimensional image was captured while tilting, the depth value correction unit 110 increases the width (that is, the value of L) of the correction target areas at the upper and lower ends of the depth image compared to the case where the image was captured without tilting.
- the depth value correction unit 110 may change the width of the correction target area according to the speed of panning or tilting (scrolling speed). Specifically, the depth value correction unit 110 increases the width of the correction target area at the left and right ends of the depth image as the panning speed is higher. Similarly, the depth value correction unit 110 increases the width of the correction target area at the upper and lower ends of the depth image as the tilt speed is higher.
- the depth value correction unit 110 may make the widths of the correction target regions at the left and right ends (or the upper and lower ends) of the depth image asymmetric according to the direction the imaging device faces (the pan or tilt direction). That is, the depth value correction unit 110 makes the correction target area at the end toward which the imaging device is moving (the side on which objects are framed in) larger than the correction target area at the opposite end.
- for example, when the imaging device pans to the right (left), the depth value correction unit 110 makes the correction target area at the right (left) end of the depth image larger than the correction target area at the left (right) end.
- similarly, when the imaging device tilts up (down), the depth value correction unit 110 makes the correction target area at the upper (lower) end of the depth image larger than the correction target area at the lower (upper) end.
- the parallax image generation device 100 may be given the above-mentioned imaging conditions (pan/tilt, the direction of movement of the imaging device (up, down, left, right), the speed of movement of the imaging device, and so on), or may estimate them from motion amounts that can be calculated by comparing a plurality of two-dimensional images captured at different times.
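The width-selection rules above can be sketched as a small helper. The base width, gain `k`, and the sign conventions for pan/tilt speed are illustrative assumptions, not values from the specification:

```python
def correction_widths(base_l, pan_speed=0.0, tilt_speed=0.0, k=4.0):
    """Return correction-region widths (left, right, top, bottom) in pixels.

    pan_speed > 0 means the camera pans right (objects frame in at the right
    edge); tilt_speed > 0 means it tilts up. Faster motion widens the region
    on the side being framed in, following the rules described in the text.
    base_l and the gain k are illustrative assumptions.
    """
    left = right = top = bottom = base_l
    if pan_speed > 0:            # panning right: enlarge the right-end region
        right += int(k * pan_speed)
    elif pan_speed < 0:          # panning left: enlarge the left-end region
        left += int(k * -pan_speed)
    if tilt_speed > 0:           # tilting up: enlarge the upper-end region
        top += int(k * tilt_speed)
    elif tilt_speed < 0:         # tilting down: enlarge the lower-end region
        bottom += int(k * -tilt_speed)
    return left, right, top, bottom
```

With no camera motion all four widths stay at the base value; faster motion widens only the edge on the framed-in side, which matches the asymmetry described above.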
- if YES in step S112, the process proceeds to step S113. On the other hand, if NO in step S112, the process for the current processing target depth pixel ends.
- the depth value indicated by a processing target depth pixel determined as YES in both steps S111 and S112 is a depth value for expressing some pixels of the three-dimensional image on the near side of the display surface 101.
- in step S113, depth value correction processing is performed.
- specifically, the depth value correction unit 110 performs correction processing that corrects each depth value so that, among the plurality of depth values constituting the depth image, the closer a depth value is to the edge of the depth image, the closer the position corresponding to that depth value approaches the display surface 101 on which the parallax images are displayed.
- the depth value correction unit 110 gradually corrects each depth value included in the correction target region R21 of FIG. 10 to a value closer to 0 as it approaches the left end of the depth image D10. Similarly, the depth value correction unit 110 gradually corrects each depth value included in the correction target region R22 of FIG. 10 to a value closer to 0 as it approaches the right end of the depth image D10.
- the “position corresponding to the depth value” is a position in the Z-axis direction specified (indicated) by the depth value in the three-dimensional area R10 (three-dimensional space).
- the position corresponding to the depth value is a position specified (indicated) by the depth value on an axis orthogonal to the display surface 101.
- hereinafter, the corrected depth value is referred to as the corrected depth value z′.
- the depth value correction unit 110 calculates the corrected depth value z′ by Equation 1 and Equation 2 shown in FIG.
- in Equation 1, l is the distance (in number of pixels) from the edge of the depth image to the processing target depth pixel.
- when the processing target depth pixel is a pixel in the correction target region R21, l is the distance from the left end of the depth image to the processing target depth pixel.
- when the processing target depth pixel is a pixel in the correction target region R22, l is the distance from the right end of the depth image to the processing target depth pixel.
- the corrected depth value z′ is then calculated by Equation 2.
- in other words, the depth value correction unit 110 corrects the depth value closest to the edge of the depth image, among the plurality of depth values constituting the depth image, so that it becomes a value for expressing a pixel on the display surface 101 (the zero-parallax surface).
- note that the corrected depth value z′ may be a predetermined value other than 0.
- in that case, the predetermined value is a value whose absolute value is close to zero.
- the corrected depth value z′ is calculated using the depth value z indicated by the processing target depth pixel.
- the equation for calculating the corrected depth value z′ is not limited to Equation 2. That is, any other equation may be used as long as it calculates the corrected depth value z′ so that the closer a depth value is to the edge of the depth image, the closer the corresponding position approaches the display surface 101.
- the corrected depth value z ′ may be calculated, for example, by the following Equation 3.
- the processes of steps S111 to S113 are performed on all the pixels constituting the depth image.
- note that the process of step S112 is performed only when the result of step S111 is YES, and the process of step S113 is performed only when the result of step S112 is YES.
- as a result, a depth image (hereinafter also referred to as a corrected depth image) is generated in which at least some of the plurality of depth values constituting the correction target regions R21 and R22 have been corrected.
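The per-pixel flow of steps S111 to S113 can be sketched as below. Equations 1 to 3 appear only in the figures, so the linear ramp z′ = z·l/L used here is an assumed stand-in that satisfies the stated property (the corrected value approaches 0 toward the image edge); depth values are signed, with negative values representing pop-out in front of the display surface:

```python
def correct_depth_row(row, L):
    """Apply edge correction to one row of signed depth values.

    row: list of depths in [-1, 1]; negative = in front of the display
    surface (pop-out), 0 = on the zero-parallax surface.
    Only pop-out values (step S111: z < 0) inside the correction target
    regions R21/R22 (step S112: within L pixels of an edge) are corrected
    (step S113), using an assumed linear ramp z' = z * l / L.
    """
    w = len(row)
    out = list(row)
    for i, z in enumerate(row):
        if z >= 0:                 # S111: not a pop-out value, skip
            continue
        l = min(i, w - 1 - i)      # distance to the nearer edge, in pixels
        if l >= L:                 # S112: outside R21 and R22, skip
            continue
        out[i] = z * l / L         # S113: ramp the value toward 0 at the edge
    return out
```

For example, `correct_depth_row([-1.0, -1.0, -1.0, 0.5], L=2)` leaves the behind-the-surface value untouched and ramps the pop-out values to 0 at the edge.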
- FIG. 11 is a diagram showing, as an example, the state of the depth values after correction when each of the correction target areas R21 and R22 contains depth values subjected to the process of step S113.
- in that case, the corrected depth values follow the curves in the two regions of width L at the ends of the graph of FIG. 11.
- when only the correction target area R21 contains depth values subjected to the process of step S113, the corrected depth values follow the curve in the region of width L on the left side of the graph of FIG. 11.
- FIG. 12 is a view showing an example of a depth image.
- FIG. 12 shows the depth image D10.
- in this case, the corrected depth image generated by the depth image correction process is the corrected depth image D10A shown in (b) of FIG. 12.
- in this way, the depth value correction unit 110 performs the correction processing on those depth values, among the plurality of depth values corresponding to the correction target areas, that express some pixels of the three-dimensional image on the near side of the display surface 101.
- after the above processes have been performed on all the pixels constituting the depth image, the process of step S114 is performed.
- in step S114, the depth value correction unit 110 transmits the corrected depth image D10A to the parallax image generation unit 120.
- the depth image correction process then ends, the flow returns to the parallax image generation process of FIG. 8, and the process proceeds to step S120.
- in step S120, parallax images are generated.
- the parallax image generation unit 120 uses the two-dimensional image 10 to be processed and the corrected depth image D10A corrected by the correction processing (depth image correction processing) to generate first and second parallax images.
- the first and second parallax images are images necessary to represent a three-dimensional image.
- the first and second parallax images are an image for the left eye and an image for the right eye, respectively.
- the image for the left eye and the image for the right eye are generated from one two-dimensional image by the DIBR (Depth Image Based Rendering) method using a depth image or the like.
- the DIBR method is a known technique, and therefore the detailed description will not be repeated.
- in the following, the processing by the parallax image generation unit 120 is described, where the two-dimensional image to be processed is the two-dimensional image 10 and, to simplify the description, depth values are expressed in the range of 0 to 255 as an example.
- FIG. 13 is a diagram for explaining one line in the two-dimensional image 10 to be processed.
- FIG. 13A is a diagram showing the processing target line LN 10 in the two-dimensional image 10.
- the processing target line LN10 is one line to be processed in the two-dimensional image 10.
- (B) of FIG. 13 shows depth values of a plurality of pixels forming the processing target line LN10.
- the numerical values shown in (b) of FIG. 13 are depth values corresponding to each region (pixel group).
- the depth value shown in (b) of FIG. 13 is an example, and is not a depth value corresponding to the corrected depth image D10A.
- the pixel group 11 a is a pixel group constituting an area corresponding to the processing target line LN 10 in the object 11.
- the pixel group 12 a is a pixel group constituting an area corresponding to the processing target line LN 10 in the object 12.
- the pixel group 13 a is a pixel group constituting an area corresponding to the processing target line LN 10 in the object 13.
- Each of the pixel groups 14 a and 14 b is a pixel group that constitutes an area corresponding to the processing target line LN 10 among the areas other than the objects 11, 12 and 13.
- the depth value of each pixel constituting the pixel group 11 a is zero.
- the depth value of each pixel constituting the pixel group 12a is 128.
- the depth value of each pixel constituting the pixel group 13a is 192.
- the depth value of each pixel constituting each of the pixel groups 14a and 14b is 255.
- FIG. 14 is a diagram showing the arrangement of pixel groups in the three-dimensional region R10.
- Offset is a predetermined offset value (a viewing-distance offset).
- the Offset may be zero.
- the distance D is a distance between the viewpoint S0 and the viewpoint S1.
- the zero-parallax distance Z0 is the distance, along a straight line passing through the viewpoint S0 and the display surface 101, between the X axis and the display surface 101.
- the zero parallax distance Z0 is 128 as an example.
- the shift amount (movement amount) x of each pixel constituting the processing target line LN 10 is calculated by the parallax image generation unit 120 according to the following Expression 4.
- FIG. 15 is a diagram for explaining the shift of the pixel.
- (A) of FIG. 15 illustrates an example of the shift amount of the pixel group calculated by Expression 4.
- the shift amount of each pixel constituting the pixel group 11a is -5.
- the shift amount of each pixel constituting the pixel group 12a is zero.
- the shift amount of each pixel constituting the pixel group 13a is +2.
- the shift amount of each pixel constituting each of the pixel groups 14a and 14b is +5.
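Expression 4 itself appears only in a figure, but a simple linear candidate that reproduces the example shift amounts above is x = round(D·(z − Z0)/Z0) with D = 5 and Z0 = 128. This formula is an assumption for illustration, not the expression from the specification:

```python
def shift_amount(z, z0=128, d=5):
    """Candidate per-pixel shift (an assumed stand-in for Expression 4).

    z: depth value in 0..255 (0 = front, 255 = back); z0: zero-parallax
    distance; d: a parallax scale in pixels. Pixels at z == z0 do not move;
    nearer pixels shift left (negative), farther pixels shift right.
    """
    return round(d * (z - z0) / z0)
```

With the depth values of (b) of FIG. 13, this gives −5 for the pixel group 11a (z = 0), 0 for 12a (z = 128), +2 for 13a (z = 192), and +5 for 14a/14b (z = 255), matching the shift amounts listed above (note that Python's `round` rounds the tie 2.5 down to 2).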
- Each pixel constituting the processing target line LN10 is shifted in the X-axis direction as shown in (b) of FIG. 15 based on the corresponding shift amount.
- each pixel constituting the pixel group 11a is shifted to the left by 5 pixels.
- Each pixel constituting the pixel group 12a is not shifted.
- Each pixel constituting the pixel group 13a is shifted by two pixels to the right.
- Each pixel constituting each of the pixel groups 14a and 14b is shifted to the right by 5 pixels.
- by this shifting, each pixel forming the processing target line LN10 is updated. That is, the processing target line LN10 is updated. Note that blank regions R31 and R32 in which no pixels exist may occur in the processing target line LN10 after the update.
- for the blank region R31, linear interpolation processing is performed so that the value of the pixel adjacent to the left end of the blank region R31 and the value of the pixel adjacent to the right end of the blank region R31 are smoothly connected.
- similarly, linear interpolation processing is performed on the blank region R32.
- new pixels constituting the processing target line LN10 are as shown in (d) of FIG.
- the process for interpolating pixels in the blank area is not limited to the linear interpolation process, and may be another process.
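The per-line shift-and-interpolate procedure described above can be sketched as follows. The shift formula used here is an assumed placeholder proportional to (z − Z0), since Expression 4 is given only in a figure, and the blank positions are filled by the linear interpolation described in the text:

```python
def warp_line(pixels, depths, z0=128, gain=5.0 / 128.0):
    """Shift one image line per-pixel to synthesize a parallax view.

    pixels: list of intensity values; depths: per-pixel depth in 0..255.
    The shift amount is an assumed stand-in for Expression 4: pixels behind
    the zero-parallax distance z0 move right, pixels in front move left.
    Blank positions left after shifting (the regions R31/R32 in the text)
    are filled by linear interpolation between their nearest neighbours.
    """
    w = len(pixels)
    out = [None] * w
    for i, (p, z) in enumerate(zip(pixels, depths)):
        j = i + int(gain * (z - z0))   # destination after the shift
        if 0 <= j < w:
            out[j] = p                 # later pixels overwrite earlier ones;
                                       # a full DIBR would resolve by depth
    for i in range(w):                 # fill blank regions
        if out[i] is not None:
            continue
        lo = next((k for k in range(i - 1, -1, -1) if out[k] is not None), None)
        hi = next((k for k in range(i + 1, w) if out[k] is not None), None)
        if lo is None and hi is None:
            out[i] = 0                 # whole line blank: nothing to copy
        elif lo is None:
            out[i] = out[hi]
        elif hi is None:
            out[i] = out[lo]
        else:
            t = (i - lo) / (hi - lo)   # linear interpolation between neighbours
            out[i] = out[lo] * (1 - t) + out[hi] * t
    return out
```

A line whose depth is everywhere equal to z0 is returned unchanged; a single pixel shifted out of range leaves a blank that is filled from its two neighbours.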
- the processing on the processing target line LN10 described above is performed on all the lines (rows) constituting the two-dimensional image 10. Thereby, an image for the right eye as a second parallax image is generated from the two-dimensional image 10.
- the method of generating the left-eye image as the first parallax image is the same as the method of generating the right-eye image described above, and thus the detailed description will not be repeated.
- in step S120, the parallax image generation unit 120 generates an image for the left eye and an image for the right eye having parallax with each other from the two-dimensional image 10 to be processed and the corrected depth image D10A corrected by the correction processing (depth image correction processing), by the same processing as the method of generating the right-eye image described above.
- the depth value of each pixel constituting the two-dimensional image 10 is the depth value of the pixel in the corrected depth image D10A corresponding to the pixel.
- the image for the left eye and the image for the right eye generated using the two-dimensional image to be processed and the depth image corrected by the depth image correction processing are referred to as the processed left-eye image and the processed right-eye image, respectively.
- FIG. 16 is a view showing a left-eye image 21L and a right-eye image 21R generated by the parallax image generation process according to the first embodiment of the present invention.
- the left-eye image 21L is a processed left-eye image.
- the right-eye image 21R is a processed right-eye image.
- (A) of FIG. 16 is a diagram showing the left-eye image 21L as an example.
- (B) of FIG. 16 is a diagram showing the right-eye image 21R as an example.
- Each of the left-eye image 21L and the right-eye image 21R is an image generated from the two-dimensional image 10 to be processed using the corrected depth image D10A in the process of step S120.
- the positions of the objects shown by the left-eye image 21L and the right-eye image 21R shown in FIG. 16 are not necessarily accurate.
- the parallax image generating device 100 alternately displays the generated left-eye image 21L and the generated right-eye image 21R on the display surface 101. That is, the display surface 101 displays the generated first parallax image (image for left eye 21L) and the second parallax image (image for right eye 21R).
- FIG. 17 is a diagram showing an arrangement state of each object in the three-dimensional area R10.
- FIG. 17A is a diagram showing, in the three-dimensional region R10, the positions of the objects represented by the left-eye image 20L and the right-eye image 20R, which have not been subjected to the processing of the present invention. As can be seen from (a) of FIG. 17, a user viewing the left-eye image 20L and the right-eye image 20R with the active shutter glasses 200 perceives the stereoscopic effect of the object 11 as suddenly disappearing near the left end of the object 11.
- FIG. 17B shows the position of an object represented by the left-eye image 21L and the right-eye image 21R in the three-dimensional region R10.
- specifically, (b) of FIG. 17 is a diagram showing the positions and shapes of the objects perceived, in the three-dimensional region R10, by a user who views the left-eye image 21L with the left eye and the right-eye image 21R with the right eye using the above-described active shutter glasses 200.
- FIG. 18 is a perspective view showing the position of an object represented by the left-eye image 21L and the right-eye image 21R in the three-dimensional region R10. Specifically, FIG. 18 is a diagram showing the position and the shape of each object shown in (b) of FIG.
- in the corrected depth image D10A, the depth value of the pixels at the left end of the object 11 is zero. Therefore, the amount by which the object 11 pops out approaches 0 toward the left end of the object 11.
- thus, the size of the three-dimensional image represented by the left-eye image 21L and the right-eye image 21R generated by the parallax image generation process of the present embodiment is not reduced as in related art A. Further, in that three-dimensional image, the phenomenon shown in (a) of FIG. 17, in which the stereoscopic effect suddenly disappears near the edge, can be prevented.
- in the above description, the depth value correction process is performed only on processing target depth pixels that indicate a pop-out value and are included in one of the correction target regions R21 and R22, but the process is not limited to this.
- for example, the determination of step S111 may be omitted.
- in that case, the depth value correction process is also performed on processing target depth pixels that do not indicate a pop-out value and are included in one of the correction target regions R21 and R22. That is, the correction process may be performed on processing target depth pixels that indicate a depth value for expressing pixels behind the display surface 101 and are included in one of the correction target regions R21 and R22.
- in this case, the depth value correction unit 110 performs the correction processing (depth value correction processing) on all of the plurality of depth values corresponding to the correction target areas R21 and R22.
- as a result, the right end of the object 13 can be expressed at the position of the right end of the display surface 101 in the three-dimensional area R10. That is, the phenomenon in which the depth of the right end of the object 13, located behind the display surface 101, suddenly disappears near the right end of the object 13 can be prevented.
- in the above description, the zero-parallax depth value corresponding to the zero-parallax surface is a predetermined value, but the present invention is not limited to this.
- the parallax zero depth value may be appropriately changed, for example, according to a parallax parameter given from the outside of the parallax image generating device 100.
- in the above description, the depth image is an image prepared in advance, but the depth image is not limited to this.
- the depth image may be generated from the shift amount of the left-eye image and the right-eye image obtained by the imaging process of the 3D camera.
- the parallax image generation device 100 may be realized, for example, as a display.
- FIG. 19 is an external view of the parallax image generation device 100 realized as a display.
- the parallax image generation device 100 is not limited to the above display, and may be a digital video camera, a digital still camera, or the like.
- FIG. 20A is an external view of a parallax image generating apparatus 100 as a digital still camera.
- FIG. 20B is an external view of a parallax image generating device 100 as a digital video camera.
- the present invention may be realized as a parallax image generation method in which the operation of the characteristic configuration unit included in the parallax image generation device 100 is a step. Furthermore, the present invention may be realized as a program that causes a computer to execute the steps included in such a parallax image generation method. Also, the present invention may be realized as a computer readable recording medium storing such a program.
- the processing shown in the embodiment can easily be implemented on an independent computer system by recording a program for realizing it on a recording medium.
- FIGS. 21A to 21C are explanatory views of the case where the parallax image generation method according to the embodiment is implemented by a computer system using a program recorded on a recording medium such as a flexible disk.
- FIG. 21A shows an example of the physical format of a flexible disk which is a recording medium main body.
- FIG. 21B shows the front appearance and the cross-sectional structure of the flexible disk, and the flexible disk itself.
- the flexible disk FD is contained in the case F. On the surface of the disk, a plurality of tracks Tr are formed concentrically from the outer periphery toward the inner periphery, and each track is divided into 16 sectors Se in the angular direction. Therefore, on the flexible disk FD storing the program, the program is recorded in an area allocated on the flexible disk FD.
- FIG. 21C shows a configuration for recording and reproducing the program on the flexible disk FD.
- when the program for realizing the parallax image generation method is to be recorded on the flexible disk FD, the computer system Cs writes the program onto the flexible disk FD via the flexible disk drive FDD.
- when the parallax image generation method is to be built into the computer system by the program on the flexible disk FD, the program is read from the flexible disk FD by the flexible disk drive FDD and transferred to the computer system Cs.
- the recording medium is not limited to this, and any recording medium such as an IC card, a ROM cassette, and the like can be used as long as the program can be recorded.
- in the above description, the correction target areas are areas at the left end and the right end of the depth image, but the present invention is not limited to this.
- the correction target area may be an area at the upper end and the lower end of the depth image.
- the correction target area may be an area of the left end portion, the right end portion, the upper end portion, and the lower end portion of the depth image.
- the correction target area is not limited to an area in the depth image, and may be defined, for example, as an area in the display surface 101.
- the parallax image generation method according to the present invention corresponds to the parallax image generation processing of FIG. 8 and the depth image correction processing of FIG.
- the parallax image generation method according to the present invention does not necessarily include all the corresponding steps in FIG. 8 or 9. That is, the parallax image generating method according to the present invention may include only the minimum steps capable of realizing the effects of the present invention.
- the order in which the steps in the parallax image generation method are performed is an example for specifically explaining the present invention, and may be an order other than the above. Also, some of the steps in the parallax image generation method and the other steps may be performed independently and in parallel.
- each component of the parallax image generation device 100 is typically realized as an LSI (Large Scale Integration), which is an integrated circuit. These components may be individually formed into single chips, or some or all of them may be integrated into a single chip.
- the parallax image generation device 100 may be configured as an integrated circuit.
- although the term LSI is used here, it may also be called an IC (Integrated Circuit), a system LSI, a super LSI, or an ultra LSI depending on the degree of integration.
- the method of circuit integration is not limited to LSI, and may be implemented using a dedicated circuit or a general-purpose processor.
- a field programmable gate array (FPGA) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
- the present invention can be used as a parallax image generating device capable of suppressing unnatural expression at the end of a three-dimensional image without reducing the size of the three-dimensional image.
- 100 Parallax image generation device
- 101 Display surface
- 110 Depth value correction unit
- 120 Parallax image generation unit
- 200 Active shutter glasses
- 1000 Three-dimensional image viewing system
Abstract
Description
FIG. 1 is a diagram showing an example of the configuration of a three-dimensional image viewing system 1000 according to the first embodiment of the present invention. In FIG. 1, the X, Y, and Z directions are mutually orthogonal. The X, Y, and Z directions shown in the subsequent figures are likewise mutually orthogonal.
The present invention may be realized as a parallax image generation method whose steps are the operations of the characteristic components included in the parallax image generation device 100. The present invention may also be realized as a program that causes a computer to execute the steps included in such a parallax image generation method, or as a computer-readable recording medium storing such a program.
The parallax image generation device and the parallax image generation method according to the present invention have been described above based on the embodiments, but the present invention is not limited to these embodiments. Embodiments obtained by applying modifications conceived by those skilled in the art without departing from the gist of the present invention are also included in the present invention.
101 Display surface
110 Depth value correction unit
120 Parallax image generation unit
200 Active shutter glasses
1000 Three-dimensional image viewing system
Claims (14)
- A parallax image generation device that performs processing, on a two-dimensional image to be processed, using a depth image for generating two parallax images having parallax with each other that are necessary to represent a three-dimensional image, wherein the depth image is composed of a plurality of depth values, the device comprising: a depth value correction unit that performs correction processing to correct each depth value so that, among the plurality of depth values constituting the depth image, the closer a depth value is to an edge of the depth image, the closer the position corresponding to that depth value approaches a display surface for displaying parallax images; and a parallax image generation unit that generates first and second parallax images having parallax with each other using the two-dimensional image and the depth image corrected by the correction processing.
- The parallax image generation device according to claim 1, wherein the depth value correction unit performs the correction processing on a plurality of depth values corresponding to a correction target area, which is an area of the depth image extending from an edge of the depth image to a position L (an integer of 1 or more) pixels away from the edge.
- The parallax image generation device according to claim 2, wherein the depth value correction unit performs the correction processing on those depth values, among the plurality of depth values corresponding to the correction target area, that are for expressing some pixels of the three-dimensional image on the near side of the display surface for displaying the generated first and second parallax images.
- The parallax image generation device according to claim 2 or 3, wherein the depth value correction unit increases the value of L as the horizontal size of the two-dimensional image increases.
- The parallax image generation device according to claim 2 or 3, wherein the depth value correction unit extracts, from the depth values included in the correction target area, the depth value whose corresponding position is farthest in front of the display surface, and increases the value of L as the position corresponding to the extracted depth value is farther in front of the display surface.
- The parallax image generation device according to any one of claims 2 to 5, wherein the depth value correction unit performs the correction processing on at least one of the correction target areas at the left and right edges of the depth image and the correction target areas at the upper and lower edges of the depth image.
- The parallax image generation device according to claim 6, wherein, when the two-dimensional image is an image captured while panning the imaging device, the depth value correction unit increases the value of L of the correction target areas at the left and right edges of the depth image.
- The parallax image generation device according to claim 6 or 7, wherein, when the two-dimensional image is an image captured while tilting the imaging device, the depth value correction unit increases the value of L of the correction target areas at the upper and lower edges of the depth image.
- The parallax image generation device according to claim 7 or 8, wherein the depth value correction unit increases the value of L of the corresponding correction target area as the pan or tilt speed of the imaging device increases.
- The parallax image generation device according to any one of claims 7 to 9, wherein the depth value correction unit makes the correction target area located, among the edges of the depth image, in the direction the imaging device faces larger than the correction target area located on the opposite side.
- The parallax image generation device according to any one of claims 1 to 10, wherein the depth value correction unit corrects the depth value closest to the edge of the depth image, among the plurality of depth values constituting the depth image, so that the closest depth value becomes a value for expressing a pixel on the display surface for displaying the first and second parallax images.
- A parallax image generation method for performing processing, on a two-dimensional image to be processed, using a depth image for generating two parallax images having parallax with each other that are necessary to represent a three-dimensional image, wherein the depth image is composed of a plurality of depth values, the method comprising: a step of performing correction processing to correct each depth value so that, among the plurality of depth values constituting the depth image, the closer a depth value is to an edge of the depth image, the closer the position corresponding to that depth value approaches a display surface for displaying parallax images; and a step of generating first and second parallax images having parallax with each other using the two-dimensional image and the depth image corrected by the correction processing.
- A program for performing processing, on a two-dimensional image to be processed, using a depth image for generating two parallax images having parallax with each other that are necessary to represent a three-dimensional image, wherein the depth image is composed of a plurality of depth values, the program causing a computer to execute: a step of performing correction processing to correct each depth value so that, among the plurality of depth values constituting the depth image, the closer a depth value is to an edge of the depth image, the closer the position corresponding to that depth value approaches a display surface for displaying parallax images; and a step of generating first and second parallax images having parallax with each other using the two-dimensional image and the depth image corrected by the correction processing.
- An integrated circuit that performs processing, on a two-dimensional image to be processed, using a depth image for generating two parallax images having parallax with each other that are necessary to represent a three-dimensional image, wherein the depth image is composed of a plurality of depth values, the integrated circuit comprising: a depth value correction unit that performs correction processing to correct each depth value so that, among the plurality of depth values constituting the depth image, the closer a depth value is to an edge of the depth image, the closer the position corresponding to that depth value approaches a display surface for displaying parallax images; and a parallax image generation unit that generates first and second parallax images having parallax with each other using the two-dimensional image and the depth image corrected by the correction processing.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201280002047.0A CN103004217B (zh) | 2011-06-08 | 2012-06-05 | Parallax image generation device, parallax image generation method, program, and integrated circuit |
US13/814,514 US9147278B2 (en) | 2011-06-08 | 2012-06-05 | Parallax image generation device, parallax image generation method, program, and integrated circuit |
JP2012548295A JP6008322B2 (ja) | 2011-06-08 | 2012-06-05 | Parallax image generation device, parallax image generation method, program, and integrated circuit |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011128704 | 2011-06-08 | ||
JP2011-128704 | 2011-06-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012169173A1 true WO2012169173A1 (ja) | 2012-12-13 |
Family
ID=47295758
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/003681 WO2012169173A1 (ja) | 2011-06-08 | 2012-06-05 | 視差画像生成装置、視差画像生成方法、プログラムおよび集積回路 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9147278B2 (ja) |
JP (1) | JP6008322B2 (ja) |
CN (1) | CN103004217B (ja) |
WO (1) | WO2012169173A1 (ja) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10158847B2 (en) * | 2014-06-19 | 2018-12-18 | Vefxi Corporation | Real—time stereo 3D and autostereoscopic 3D video and image editing |
CN107767412A (zh) * | 2017-09-11 | 2018-03-06 | 西安中兴新软件有限责任公司 | Image processing method and device |
US11570418B2 (en) | 2021-06-17 | 2023-01-31 | Creal Sa | Techniques for generating light field data by combining multiple synthesized viewpoints |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004221699A (ja) * | 2003-01-09 | 2004-08-05 | Sanyo Electric Co Ltd | 立体画像処理方法および装置 |
JP2009533897A (ja) * | 2006-04-07 | 2009-09-17 | リアル・ディ | 垂直周囲視差の補正 |
JP2011078109A (ja) * | 2010-11-04 | 2011-04-14 | Fujifilm Corp | 3次元表示装置および方法並びにプログラム |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE3921061A1 (de) * | 1989-06-23 | 1991-01-03 | Hertz Inst Heinrich | Reproduction device for three-dimensional perception of images |
US6108005A (en) | 1996-08-30 | 2000-08-22 | Space Corporation | Method for producing a synthesized stereoscopic image |
CA2496353C (en) * | 2002-08-20 | 2011-10-18 | Kazunari Era | Method and apparatus for generating a stereographic image |
EP1931150A1 (en) | 2006-12-04 | 2008-06-11 | Koninklijke Philips Electronics N.V. | Image processing system for processing combined image data and depth data |
JP4625517B2 (ja) * | 2008-10-27 | 2011-02-02 | 富士フイルム株式会社 | Three-dimensional display device, method, and program |
JP2010268097A (ja) | 2009-05-13 | 2010-11-25 | Fujifilm Corp | Three-dimensional display device and three-dimensional display method |
US8472704B2 (en) | 2009-10-30 | 2013-06-25 | Fujifilm Corporation | Image processing apparatus and image processing method |
JP2011164202A (ja) | 2010-02-05 | 2011-08-25 | Sony Corp | Image display device, image display system, and image display method |
JP2011166285A (ja) * | 2010-02-05 | 2011-08-25 | Sony Corp | Image display device, image display observation system, and image display method |
JP5238767B2 (ja) * | 2010-07-26 | 2013-07-17 | 株式会社東芝 | Parallax image generation method and device |
JP4989760B2 (ja) * | 2010-12-21 | 2012-08-01 | 株式会社東芝 | Transmission device, reception device, and transmission system |
JP2012138787A (ja) * | 2010-12-27 | 2012-07-19 | Sony Corp | Image processing device, image processing method, and program |
JP4892105B1 (ja) * | 2011-02-21 | 2012-03-07 | 株式会社東芝 | Video processing device, video processing method, and video display device |
- 2012-06-05 WO PCT/JP2012/003681 patent/WO2012169173A1/ja active Application Filing
- 2012-06-05 CN CN201280002047.0A patent/CN103004217B/zh not_active Expired - Fee Related
- 2012-06-05 US US13/814,514 patent/US9147278B2/en active Active
- 2012-06-05 JP JP2012548295A patent/JP6008322B2/ja not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
US20130127846A1 (en) | 2013-05-23 |
JPWO2012169173A1 (ja) | 2015-02-23 |
JP6008322B2 (ja) | 2016-10-19 |
CN103004217B (zh) | 2016-08-03 |
US9147278B2 (en) | 2015-09-29 |
CN103004217A (zh) | 2013-03-27 |
Legal Events

Date | Code | Title | Description
---|---|---|---
 | WWE | Wipo information: entry into national phase | Ref document number: 201280002047.0; Country of ref document: CN
 | ENP | Entry into the national phase | Ref document number: 2012548295; Country of ref document: JP; Kind code of ref document: A
 | WWE | Wipo information: entry into national phase | Ref document number: 13814514; Country of ref document: US
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12797572; Country of ref document: EP; Kind code of ref document: A1
 | NENP | Non-entry into the national phase | Ref country code: DE
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 12797572; Country of ref document: EP; Kind code of ref document: A1