WO2014083752A1 - Different viewpoint image generation device and different viewpoint image generation method - Google Patents
Different viewpoint image generation device and different viewpoint image generation method
- Publication number
- WO2014083752A1 (PCT/JP2013/006141, JP2013006141W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- viewpoint image
- image
- viewpoint
- hole
- different
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/111—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
- G06T15/205—Image-based rendering
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
Definitions
- The present disclosure relates to an image processing technique for three-dimensional (3D) display, and in particular to a different viewpoint image generation device that generates, from two or more images captured at different viewpoint positions, another image having a viewpoint position different from those of the two or more images.
- A technique is known in which an image having parallax (hereinafter also referred to as a stereo image) is presented to the left and right eyes of a viewer so that the viewer perceives a flat image as a stereoscopic image.
- Patent Document 1 describes a technique for generating a different viewpoint image by using a depth map indicating the distance in the depth direction of an image and moving each pixel in the horizontal direction according to that distance. Such processing is called DIBR (Depth Image Based Rendering).
- An image generated by DIBR may include an area that does not appear in the original stereo image. Since such a region is a region to which no pixel value is assigned (hereinafter referred to as a hole region), it needs to be interpolated by some processing.
- the present disclosure provides a different viewpoint image generation device that can interpolate a hole region generated in a different viewpoint image with high quality.
- The different viewpoint image generation device includes: a different viewpoint image generation unit that, based on distance information indicating the depth of each pixel in an image, generates one by one, from each of two or more images acquired at each of two or more viewpoint positions, a different viewpoint image, that is, an image corresponding to an image acquired at a virtual viewpoint position different from the two or more viewpoint positions and including a hole region, which is a region in which pixel values are missing; a composition ratio calculation unit that calculates a composition ratio of each of the two or more different viewpoint images for each processing unit; and a different viewpoint image synthesis unit that synthesizes the two or more different viewpoint images based on the calculated composition ratio.
- These general or specific aspects may be implemented as a system, a method, an integrated circuit, a computer program, or a computer-readable recording medium such as a CD-ROM, or may be implemented as any combination of a system, a method, an integrated circuit, a computer program, and a recording medium.
- According to the different viewpoint image generation device of the present disclosure, it is possible to interpolate the hole region generated in a different viewpoint image with high quality.
- FIG. 1 is an overall configuration diagram of another viewpoint image generation apparatus according to an embodiment.
- FIG. 2 is a diagram illustrating an example of an image and a depth map input to the different viewpoint image generation unit.
- FIG. 3 is a diagram illustrating a different viewpoint image generated by the different viewpoint image generation unit.
- FIG. 4 is a diagram illustrating a hole density calculation method of the hole density calculation unit.
- FIG. 5 is a schematic diagram for explaining a specific example of the hole density calculation method.
- FIG. 6 is a diagram illustrating a composite ratio map generated by the composite ratio calculation unit.
- FIG. 7 is a diagram illustrating the relationship between the difference between the hole density of the left viewpoint image and the hole density of the right viewpoint image and the composition ratio α.
- FIG. 8 is a schematic diagram showing the left-side viewpoint image after the hole filling process and the right-side viewpoint image after the hole filling process.
- FIG. 9 is a diagram illustrating an output image generated by the different viewpoint image synthesis unit.
- FIG. 10 is a flowchart of the operation of the different viewpoint image generation device according to the embodiment.
- FIG. 11 is an overall configuration diagram of another viewpoint image generation apparatus according to another embodiment.
- FIG. 12 is a flowchart of the operation of the different viewpoint image generation device according to another embodiment.
- FIG. 13 is a first diagram illustrating an application example of the different viewpoint image generation device.
- FIG. 14 is a second diagram illustrating an application example of the different viewpoint image generation device.
- Generally, the amount of parallax of a stereo image is determined by the distance between the lenses of the stereo camera that captures it, but it is desirable that the amount of parallax can be changed after shooting in order to adjust the stereoscopic effect.
- There is also a use in which a stereo image with a different viewpoint position is displayed according to the position of the viewer. For these purposes, a technique called DIBR, which generates an image of a viewpoint different from those of a stereo image from the stereo image, is used.
- An image generated by DIBR may include a hole area which is an area to which no pixel value is assigned, and the hole area needs to be interpolated by some processing.
- The hole area can be simply interpolated by linear interpolation processing using the pixel values of the area surrounding the hole area. However, such a method has the problem that image quality deterioration due to the interpolation is conspicuous.
- Patent Document 1 describes a technique in which the pixel value of the corresponding region of the right viewpoint image is applied to the hole region of the different viewpoint image generated from the left viewpoint image of a stereo image, thereby interpolating the hole region.
- However, since the technique described in Patent Document 1 applies the pixel value of the image at the opposite viewpoint as it is to the hole region, there is a problem that the boundary between the interpolated hole region and its peripheral region is not smoothly connected.
- A different viewpoint image generation device according to one aspect of the present disclosure includes: a different viewpoint image generation unit that, based on distance information indicating the depth of each pixel in an image, generates one by one, from each of two or more images acquired at each of two or more viewpoint positions, a different viewpoint image, that is, an image corresponding to an image acquired at a virtual viewpoint position different from the two or more viewpoint positions and including a hole region, which is a region in which pixel values are missing; a hole density calculation unit that calculates, for each processing unit, a hole density that is the ratio of the hole region in each of the two or more different viewpoint images; a composition ratio calculation unit that calculates, for each processing unit, a composition ratio of each of the two or more different viewpoint images based on the hole density of each of the two or more different viewpoint images; and a different viewpoint image synthesis unit that synthesizes the two or more different viewpoint images based on the calculated composition ratio.
- For example, the device may further include a hole region interpolation unit that interpolates each hole region of the two or more different viewpoint images using pixel values in that different viewpoint image, and the different viewpoint image synthesis unit may synthesize the two or more different viewpoint images whose hole regions have been interpolated, based on the calculated composition ratio.
- For example, the device may include a hole region interpolation unit that interpolates a hole region in the image synthesized by the different viewpoint image synthesis unit, using pixel values in that image.
- For example, the hole density calculation unit may calculate, for each of the two or more different viewpoint images and for each processing unit, the ratio of the hole region within a window, which is a predetermined area centered on the processing unit, as the hole density.
- For example, the hole density calculation unit may calculate the hole density by giving different weights to a hole region located in the central portion of the window and a hole region located in the peripheral portion of the window.
- For example, the composition ratio calculation unit may calculate, for each processing unit, the composition ratio of each of the two or more different viewpoint images so that a different viewpoint image with a smaller hole density is given a larger composition ratio.
- For example, the composition ratio calculation unit may calculate, for each processing unit, the composition ratio of each of the two or more different viewpoint images so that a different viewpoint image whose viewpoint position is closer to the virtual viewpoint position is given a larger composition ratio.
- For example, the processing unit may be a pixel; the hole density calculation unit may calculate a hole density for each pixel for each of the two or more different viewpoint images; and the composition ratio calculation unit may calculate the composition ratio of each of the two or more different viewpoint images for each pixel.
- FIG. 1 is an overall configuration diagram of another viewpoint image generation apparatus 100 according to the embodiment.
- The different viewpoint image generation device 100 includes a left different viewpoint image generation unit 101, a right different viewpoint image generation unit 102, a hole density calculation unit 103, a composition ratio calculation unit 104, a hole region interpolation unit 105, and a different viewpoint image synthesis unit 106.
- another viewpoint image generation apparatus 100 generates an image of another viewpoint from a left viewpoint image and a right viewpoint image that are stereo images.
- Based on the left viewpoint image and the depth map (left depth map) of the left viewpoint image, the left different viewpoint image generation unit 101 generates a left different viewpoint image by moving each pixel included in the left viewpoint image in the horizontal direction according to the depth of the pixel.
- Similarly, based on the right viewpoint image and the depth map (right depth map) of the right viewpoint image, the right different viewpoint image generation unit 102 generates a right different viewpoint image by moving each pixel included in the right viewpoint image in the horizontal direction according to the depth of the pixel.
- the left viewpoint image is an image photographed at the left viewpoint position
- the right viewpoint image is an image photographed at a right viewpoint position different from the left viewpoint position.
- The left different viewpoint image and the right different viewpoint image are both images corresponding to images acquired at the same virtual viewpoint position (a position different from both the left viewpoint position and the right viewpoint position).
- the depth map is distance information indicating the depth of each pixel in the image (the distance from the viewpoint position to the subject displayed on each pixel).
- Note that one different viewpoint image generation unit may generate the left different viewpoint image and the right different viewpoint image from the left and right viewpoint images and the left and right depth maps, respectively.
- That is, based on distance information indicating the depth of each pixel in the image, the different viewpoint image generation unit generates one by one, from each of the two images acquired at each of the two viewpoint positions, a different viewpoint image corresponding to an image acquired at a virtual viewpoint position different from the two viewpoint positions.
- The hole density calculation unit 103 generates, for each of the left different viewpoint image and the right different viewpoint image generated by the left different viewpoint image generation unit 101 and the right different viewpoint image generation unit 102, a hole density map indicating the distribution of the hole region for each processing unit. Specifically, for each of the two generated different viewpoint images, the hole density calculation unit 103 calculates, for each processing unit including one or more pixels, the hole density, which is the ratio of the hole region in a predetermined area including the processing unit. Note that the hole region is a region in which pixel values in the different viewpoint image are missing.
- In the present embodiment, the processing unit is a pixel; that is, the hole density calculation unit 103 calculates the hole density for each pixel for each of the two different viewpoint images. A window, described later, is used as the predetermined area.
- The composition ratio calculation unit 104 generates a composition ratio map indicating the ratio at which the left different viewpoint image and the right different viewpoint image are combined, based on the hole density map generated by the hole density calculation unit 103. Specifically, the composition ratio calculation unit 104 compares the hole densities of the two different viewpoint images for each corresponding pixel (processing unit), and calculates the composition ratio of each of the two different viewpoint images for each pixel (processing unit) according to the compared hole densities.
- The hole region interpolation unit 105 performs hole filling processing on the hole region of each of the left different viewpoint image and the right different viewpoint image generated by the left different viewpoint image generation unit 101 and the right different viewpoint image generation unit 102, interpolating each hole region using pixel values located in its periphery within the same image. That is, the hole region interpolation unit 105 interpolates each hole region of the two different viewpoint images using the pixel values in that different viewpoint image.
- The pixel value is information indicating at least the luminance and color of a pixel; for example, information including luminance values of RGB color components, or information including a luminance value and color differences. Further, the pixel value may include additional information such as a depth value in addition to the information regarding color.
- The different viewpoint image synthesis unit 106 synthesizes the left different viewpoint image and the right different viewpoint image that have been subjected to the hole filling processing by the hole region interpolation unit 105, based on the composition ratio indicated by the composition ratio map generated by the composition ratio calculation unit 104, and generates the synthesized output image (output different viewpoint image).
- the different viewpoint image combining unit 106 combines two different viewpoint images in which the hole regions are interpolated based on the calculated combination ratio.
- each component of the different viewpoint image generation device 100 may be configured by dedicated hardware or may be realized by executing a software program suitable for each component.
- Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
- FIG. 2 is a diagram illustrating an example of an image and a depth map input to the left-side viewpoint image generation unit 101 and the right-side viewpoint image generation unit 102.
- the left viewpoint image 211 is an image obtained by photographing the subject 201 and the subject 202 at the left viewpoint position 210.
- the right viewpoint image 221 is an image obtained by photographing the subject 201 and the subject 202 at the right viewpoint position 220.
- the left viewpoint image 211 and the right viewpoint image 221 are stereoscopic images obtained by photographing the same subject 201 and subject 202, and the relative positions of the subject 201 and the subject 202 in the images are different.
- the left depth map 212 is an image showing the background of the left viewpoint image 211 and the depth of the subject 201.
- the right depth map 222 is an image showing the background of the right viewpoint image 221 and the depth of the subject 201. That is, the left depth map 212 indicates the distance from the left viewpoint position 210 to the subject 201, and the right depth map 222 indicates the distance from the right viewpoint position 220 to the subject 201.
- the depth of the subject 202 is also shown in the left depth map 212 and the right depth map 222, but in this embodiment, the depth of the subject 202 is omitted for the sake of simplicity of explanation.
- In the depth map of the present embodiment, pixels in which a subject located near the viewpoint position is photographed have a brighter (larger) luminance value, and pixels in which a subject located far from the viewpoint position is photographed have a darker (smaller) luminance value.
- the left another viewpoint image generation unit 101 generates another left viewpoint image corresponding to an image acquired at the virtual viewpoint position from the left viewpoint image and the left depth map as shown in FIG.
- the right-specific viewpoint image generation unit 102 generates a right-specific viewpoint image corresponding to an image acquired at the virtual viewpoint position from the right viewpoint image and the right depth map as shown in FIG.
- FIG. 3 is a diagram illustrating a left another viewpoint image generated by the left another viewpoint image generating unit 101 and a right another viewpoint image generated by the right another viewpoint image generating unit 102.
- the subject 201 closer to the viewpoint position has a larger movement amount due to the difference in viewpoint position than the subject 202 far from the viewpoint position.
- The left different viewpoint image generation unit 101 and the right different viewpoint image generation unit 102 move the pixels of the input images (the left viewpoint image 211 and the right viewpoint image 221) horizontally in consideration of the distance from the viewpoint position of each input image to the virtual viewpoint position. At this time, the two units use the depth maps indicating the distance in the depth direction of the images, and adjust the movement amount of each pixel according to the distance in the depth direction.
- the left another viewpoint image generation unit 101 generates a left another viewpoint image 302 corresponding to an image obtained when the subjects 201 and 202 are photographed at the virtual viewpoint position 301.
- Similarly, the right different viewpoint image generation unit 102 generates a right different viewpoint image 303 corresponding to an image obtained when the subjects 201 and 202 are photographed at the virtual viewpoint position 301.
- the movement amount of the subject 201 on the near side is larger than the movement amount of the subject 202 on the back side.
- The same applies when the viewpoint position moves from the right viewpoint position 220 to the virtual viewpoint position 301, except that the movement direction is reversed. That is, the movement amount Δx of the pixel value of each pixel accompanying the movement of the viewpoint position is obtained by the following (Equation 1).
- d is a depth value (depth map value) of each pixel, and becomes a smaller value on the far side farther from the viewpoint position, and a larger value on the near side closer to the viewpoint position.
- Δb represents the movement amount of the viewpoint position (the movement amount from the left viewpoint position 210 to the virtual viewpoint position 301, or the movement amount from the right viewpoint position 220 to the virtual viewpoint position 301).
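The horizontal pixel movement described above can be sketched as follows. Since (Equation 1) itself is not reproduced in this text, the sketch assumes the simple proportional form Δx = d · Δb suggested by the surrounding description; the function `warp_row` and all names are illustrative, not from the patent.

```python
def warp_row(row, depth, db, hole=None):
    """Shift each pixel of a 1-D image row horizontally by d * db.

    Target positions that no source pixel maps onto stay `hole`;
    together they form the hole region that later stages must fill.
    """
    out = [hole] * len(row)
    # write far pixels first so that nearer pixels (larger d) win conflicts
    for x in sorted(range(len(row)), key=lambda i: depth[i]):
        nx = x + round(depth[x] * db)  # assumed movement amount: dx = d * db
        if 0 <= nx < len(row):
            out[nx] = row[x]
    return out

row = ['bg', 'bg', 'fg', 'fg', 'bg', 'bg']
depth = [0, 0, 2, 2, 0, 0]         # foreground is nearer, so d is larger
print(warp_row(row, depth, db=1))  # ['bg', 'bg', None, None, 'fg', 'fg']
```

The hole opens on the side the foreground moved away from, matching the hole regions 310 and 311 described for FIG. 3.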
- When the pixels are moved in this way, a hole region, which is a region to which no pixel value is assigned, is generated. Specifically, in the left different viewpoint image 302 generated from the left viewpoint image 211, a hole region 310 is generated on the right side of the subject 201 in the image, and in the right different viewpoint image 303 generated from the right viewpoint image 221, a hole region 311 is generated on the left side of the subject 201 in the image.
- Strictly speaking, unless the subject 202 exists at the same distance as the background, a hole region is also formed beside the subject 202 in the image; however, it is omitted in FIG. 3 for the sake of simplicity.
- FIG. 4 is a diagram illustrating a hole density calculation method performed by the hole density calculation unit 103.
- The hole density calculation unit 103 generates a hole map 401 ((a) in FIG. 4) in which only the hole region is extracted from the left different viewpoint image 302, and a hole map 402 ((b) in FIG. 4) in which only the hole region is extracted from the right different viewpoint image 303.
- The window 403 is then scanned over each hole map, and the ratio of the hole region to the entire window 403 when the center of the window 403 overlaps a pixel is calculated as the hole density Den of that pixel.
- the hole density calculation unit 103 obtains hole density maps 406 and 407 including the calculated hole density by the following (Equation 2).
- In (Equation 2), N is the total number of pixels in the window, and H[x + dx, y + dy] is the component at coordinates [x + dx, y + dy] of the hole map.
- dx and dy mean relative positions from the window center.
- That is, the hole density calculation unit 103 calculates the hole density Den of the pixel located at coordinates (x, y) by adding up the number of positions H[x + dx, y + dy] that belong to the hole region and dividing by N. For example, in FIG. 4, the hole density in the window 404 of the hole map 401 is smaller than the hole density in the window 405 of the hole map 402.
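As a minimal sketch of (Equation 2), the hole density of one pixel can be computed as the ratio of hole pixels inside a window centered on it. The function name and the border convention (window positions outside the map counted as non-hole) are assumptions; the text does not specify border handling.

```python
def hole_density(hole_map, x, y, half):
    """Den(x, y) = (1/N) * sum of H[x+dx, y+dy] over a
    (2*half+1) x (2*half+1) window centred on (x, y)."""
    h, w = len(hole_map), len(hole_map[0])
    n = (2 * half + 1) ** 2  # N: total number of pixels in the window
    hits = sum(
        hole_map[y + dy][x + dx]
        for dy in range(-half, half + 1)
        for dx in range(-half, half + 1)
        if 0 <= y + dy < h and 0 <= x + dx < w
    )
    return hits / n

hole_map = [  # H: 1 inside the hole region, 0 elsewhere
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
print(hole_density(hole_map, 1, 1, half=1))  # 4 hole pixels in a 3x3 window: 4/9
```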
- FIG. 5 is a schematic diagram for explaining a specific example of the hole density calculation method.
- In the hole map 408 shown in (a) in FIG. 5, there is a hole region 409 having a size of 4 × 5 pixels.
- the window shape described above may be square or rectangular. Further, the shape of the window is not limited to a rectangle, and may be a circle or an ellipse.
- the hole density calculation unit 103 may calculate the hole density Den by giving different weights to the hole region located in the central part in the window and the hole region located in the peripheral part in the window.
- For example, in calculating the hole density of the pixel B, the hole density calculation unit 103 gives a larger weight to the hole region located at the central part of the window 410b, while counting one pixel as one point for the hole region located in the peripheral part of the window 410b. Thereby, the composition ratio described later is calculated in consideration of the position of the hole region in the window.
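The weighted variant described above can be sketched as follows. The concrete weights are not given in the text; the 2-points-for-the-centre, 1-point-elsewhere scheme below is purely illustrative.

```python
def weighted_hole_density(hole_map, x, y, half):
    """Hole density where central hole pixels count more than peripheral ones."""
    h, w = len(hole_map), len(hole_map[0])
    total = hits = 0.0
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            # assumed weights: 2 points near the window centre, 1 elsewhere
            weight = 2.0 if max(abs(dx), abs(dy)) <= half // 2 else 1.0
            total += weight
            if 0 <= y + dy < h and 0 <= x + dx < w and hole_map[y + dy][x + dx]:
                hits += weight
    return hits / total

hole_map = [
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 0],
]
# central hole pixel (1, 1) carries weight 2, corner hole (0, 0) weight 1
print(weighted_hole_density(hole_map, 1, 1, half=1))  # (2 + 1) / 10 = 0.3
```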
- The window size may be configured to be user-specifiable, with the user inputting the parameter to the different viewpoint image generation device 100.
- A hole density map 406 ((c) in FIG. 4) is an image of the hole density calculated for each pixel of the left different viewpoint image 302, and a hole density map 407 ((d) in FIG. 4) is an image of the hole density calculated for each pixel of the right different viewpoint image 303.
- As the hole density maps 406 and 407 show, a pixel located near the edge of a hole region has a low hole density, and a pixel located near the center of a hole region has a high hole density.
- The composition ratio calculation unit 104 compares the hole densities of the hole density maps 406 and 407 for each pixel at the same position, and obtains the composition ratio α (the composition weight of the pixel values) used when combining the left different viewpoint image and the right different viewpoint image. As a result, the composite ratio map 510 shown in FIG. 6 is generated.
- The composition ratio α is obtained by the following (Equation 3).
- FIG. 7 is a diagram illustrating the relationship between the difference between the hole density of the left different viewpoint image 302 and the hole density of the right different viewpoint image 303 and the composition ratio α determined by (Equation 3).
- For pixels with Den(L) - Den(R) equal to or less than 0, the composition ratio calculation unit 104 sets the composition ratio α of the left different viewpoint image to 1. For pixels with Den(L) - Den(R) equal to or greater than the threshold T, the composition ratio calculation unit 104 sets the composition ratio α to 0. For pixels with 0 < Den(L) - Den(R) < T, the composition ratio is determined linearly according to the magnitudes of Den(L) and Den(R).
- the composition ratio calculation unit 104 calculates the composition ratio of each of the two different viewpoint images for each pixel so that the composition ratio of the different viewpoint images having a smaller hole density is larger.
- The threshold T may be configured to be user-changeable, with the user inputting the parameter to the different viewpoint image generation device 100. Further, the value of the composition ratio α for pixels satisfying 0 < Den(L) - Den(R) < T may be determined nonlinearly.
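The piecewise-linear mapping described above (α = 1 when the left image has no more holes, α = 0 at or above the threshold T, a linear ramp in between) can be sketched as follows; `composition_ratio` returns the weight α of the left image, and the name is illustrative.

```python
def composition_ratio(den_l, den_r, t):
    """Weight alpha of the left image as a function of Den(L) - Den(R),
    following the shape of (Equation 3) described in the text."""
    diff = den_l - den_r
    if diff <= 0:
        return 1.0  # left image has no more holes than the right
    if diff >= t:
        return 0.0  # left image has at least T more hole density
    return 1.0 - diff / t  # linear ramp for 0 < diff < T

print(composition_ratio(0.1, 0.4, t=0.5))  # 1.0 (left has fewer holes)
print(composition_ratio(0.4, 0.1, t=0.5))  # 0.4 (diff 0.3 on a ramp of width 0.5)
```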
- the distance from the left viewpoint position 210 to the virtual viewpoint position 301 is closer than the distance from the right viewpoint position 220 to the virtual viewpoint position 301. Therefore, in principle, the pixel value of the left another viewpoint image 302 with a small hole area is used as the pixel value of the different viewpoint image after synthesis.
- For this reason, the horizontal axis of the graph 500 shown in FIG. 7 (the quantity on which (Equation 3) is based) is Den(L) - Den(R), so that the composition ratio for the pixel values of the left different viewpoint image 302 tends to be high. Conversely, when the right viewpoint position 220 is the one closer to the virtual viewpoint position, the horizontal axis of the graph shown in FIG. 7 is expressed as Den(R) - Den(L).
- That is, the composition ratio calculation unit 104 calculates, for each pixel, the composition ratio of each of the two different viewpoint images so that a different viewpoint image whose viewpoint position is closer to the virtual viewpoint position 301 is given a larger composition ratio.
- Accordingly, pixel values from the image with the smaller hole area are preferentially used, so that the different viewpoint image generation device 100 can generate an output image with higher accuracy.
- Note that the calculation method of the composition ratio is not limited to (Equation 3); other calculation methods may be used.
- The hole region interpolation unit 105 acquires the left different viewpoint image 302 and the right different viewpoint image 303 from the left different viewpoint image generation unit 101 and the right different viewpoint image generation unit 102, respectively, and performs hole filling processing (interpolation processing) on the hole regions in the left different viewpoint image 302 and the right different viewpoint image 303.
- FIG. 8 is a schematic diagram showing a left-side viewpoint image after the hole filling process (hereinafter also referred to as an image after left interpolation) and a right-side viewpoint image after the hole filling process (hereinafter also referred to as an image after right interpolation).
- For example, the interpolated region 601 (the region that was originally a hole region) of the post-left interpolation image 600 shown in (a) in FIG. 8 is interpolated by stretching the pixel values of the pixels located to the left and right of the hole region. The same applies to the interpolated region 611 of the post-right interpolation image 610 shown in (b) in FIG. 8.
- the interpolation processing (in-plane interpolation processing) of the hole region interpolation unit 105 may be processing other than linear interpolation.
- the depth values of the depth maps at coordinates adjacent to the hole region in the horizontal direction may be compared, and extrapolation processing may be performed using the pixel values on the side farther from the viewpoint.
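The linear hole filling described above can be sketched for one image row as follows; the helper name and the border handling (a hole run touching the row boundary copies its single valid neighbour) are assumptions.

```python
def fill_holes_row(row, hole=None):
    """Fill each run of hole pixels with a linear ramp between its
    left and right non-hole neighbours (copy the neighbour if the
    run touches the row boundary)."""
    out = list(row)
    x = 0
    while x < len(out):
        if out[x] is hole:
            start = x
            while x < len(out) and out[x] is hole:
                x += 1  # find the end of this hole run
            left = out[start - 1] if start > 0 else None
            right = out[x] if x < len(out) else None
            for i in range(start, x):
                if left is None:
                    out[i] = right  # run starts at the border
                elif right is None:
                    out[i] = left   # run ends at the border
                else:
                    w = (i - start + 1) / (x - start + 1)
                    out[i] = left * (1 - w) + right * w
        else:
            x += 1
    return out

print(fill_holes_row([10, None, 40]))  # [10, 25.0, 40]
print(fill_holes_row([None, 7]))       # [7, 7]
```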
- The different viewpoint image synthesis unit 106 obtains the post-left interpolation image 600 and the post-right interpolation image 610, and obtains the composition ratio map 510 from the composition ratio calculation unit 104. Then, the different viewpoint image synthesis unit 106 generates an output image obtained by synthesizing the two post-interpolation images (the post-left interpolation image 600 and the post-right interpolation image 610).
- FIG. 9 is a diagram illustrating an output image generated by the different viewpoint image composition unit 106.
- O(x, y) = L(x, y) × α(x, y) + R(x, y) × {1 - α(x, y)} ... (Equation 4)
- the right area 701 of the subject 201 in the output image 700 shown in FIG. 9 corresponds to the area 511 in the composition ratio map 510.
- The pixel values of the post-right interpolation image 610 are assigned to the pixels located in the central portion of the right region 701. The pixels outside the right region 701 are assigned the pixel values of the post-left interpolation image 600. The pixels located in the peripheral portion of the right region 701 are assigned pixel values obtained by mixing the pixel value of the post-left interpolation image 600 and the pixel value of the post-right interpolation image 610 according to the composition ratio α.
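The per-pixel blending described above, output = left × α + right × (1 - α), can be sketched as follows; the names are illustrative.

```python
def blend(left, right, alpha):
    """O(x, y) = L(x, y) * alpha(x, y) + R(x, y) * (1 - alpha(x, y))."""
    return [
        [l * a + r * (1 - a) for l, r, a in zip(lrow, rrow, arow)]
        for lrow, rrow, arow in zip(left, right, alpha)
    ]

left = [[100, 100]]
right = [[0, 200]]
alpha = [[1.0, 0.25]]  # 1.0: take the left pixel; 0.25: mostly the right pixel
print(blend(left, right, alpha))  # [[100.0, 175.0]]
```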
- FIG. 10 is a flowchart of the operation of the different viewpoint image generation apparatus 100.
- the left another viewpoint image generation unit 101 and the right another viewpoint image generation unit 102 generate the left another viewpoint image 302 and the right another viewpoint image 303 from the input image (the left viewpoint image 211 and the right viewpoint image 221) and the depth map. (S101).
- the hole density calculation unit 103 calculates the hole density for each pixel with respect to the hole area in each of the left viewpoint image 302 and the right viewpoint image 303 (S102). Then, the composition ratio calculation unit 104 calculates a composition ratio when the left viewpoint image 302 and the right viewpoint image 303 are combined (S103).
- the hole area interpolation unit 105 interpolates the hole areas in the left-side viewpoint image 302 and the right-side viewpoint image 303 (S104).
- The different viewpoint image synthesis unit 106 then synthesizes the left different viewpoint image and the right different viewpoint image interpolated in step S104, based on the composition ratio calculated in step S103 (S105).
- The order of step S102, step S103, and step S104 is not particularly limited.
- the different viewpoint image generation apparatus 100 may perform processing in the order of step S102, step S103, and step S104, or may perform processing in the order of step S104, step S102, and step S103.
- the process of step S102, step S103, and step S104 may be performed in parallel.
- The hole density calculation unit 103 and the composition ratio calculation unit 104 calculate the composition ratio map 510. Then, the different viewpoint image synthesis unit 106 synthesizes the two different viewpoint images generated from the left viewpoint image and the right viewpoint image in accordance with the calculated composition ratio.
- The interpolation process is performed only on the hole areas that actually need interpolation, and the composition reflects the composition ratio calculated from the hole density, so the hole areas are interpolated smoothly. That is, according to the different viewpoint image generation device 100, the hole areas generated in a different viewpoint image can be interpolated with high quality. In other words, the different viewpoint image generation device 100 can generate a high-quality different viewpoint image (output image).
- FIG. 11 is an overall configuration diagram of the different viewpoint image generation apparatus 100a according to another embodiment.
- FIG. 12 is a flowchart of the operation of the different viewpoint image generation device 100a. In the following description, differences from the different viewpoint image generation device 100 will be mainly described, and description overlapping with the above embodiment will be omitted.
- The different viewpoint image composition unit 106 combines the left different viewpoint image 302 and the right different viewpoint image 303 according to the composition ratio calculated by the composition ratio calculation unit 104 (S106 in FIG. 12). At this time, when the corresponding pixels (pixels located at the same coordinates) of both the left different viewpoint image 302 and the right different viewpoint image 303 are hole regions, the pixel is treated as a hole region. Therefore, the image combined by the different viewpoint image composition unit 106 includes hole areas.
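The composition rule of S106, under which a pixel is treated as a hole only when the corresponding pixels in both images are holes, might be sketched as follows (grayscale images and all the names are illustrative assumptions):

```python
import numpy as np

def compose_with_holes(left, right, alpha, left_hole, right_hole):
    """Combine two different-viewpoint images by the composition
    ratio alpha, propagating a hole only where BOTH inputs are holes.

    The pixels flagged in the returned hole mask carry no valid
    value and are filled by a later interpolation step (S107).
    """
    out = alpha * left + (1.0 - alpha) * right
    # Where only one image has a hole, take the other image's pixel.
    out = np.where(left_hole & ~right_hole, right, out)
    out = np.where(right_hole & ~left_hole, left, out)
    remaining_hole = left_hole & right_hole
    return out, remaining_hole
```

This ordering (compose first, interpolate the surviving holes afterward) is what distinguishes apparatus 100a from apparatus 100, which interpolates each image before composition.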
- The hole area interpolation unit 105a interpolates the hole areas in the image combined by the different viewpoint image composition unit 106, using pixel values within that image (S107 in FIG. 12).
- For this interpolation, any existing method may be used, as described in the above embodiment.
- the different viewpoint image generation device 100a can generate a high-quality different viewpoint image (output image).
- the different viewpoint image generation devices 100 and 100a generate different viewpoint images having different viewpoint positions from the two images captured at different viewpoint positions.
- the different viewpoint image generation devices 100 and 100a may generate different viewpoint images having different viewpoint positions from the two or more images captured at different viewpoint positions.
- the different viewpoint image generation apparatuses 100 and 100a may generate different viewpoint images having different viewpoint positions from the three images captured at different viewpoint positions.
- In this case, the composition ratio calculation unit 104 calculates, for example, a composition ratio for each viewpoint image in accordance with the magnitudes of the hole densities of the three images.
- In the above embodiment, the hole density is calculated for each pixel.
- However, the hole density may instead be calculated for each processing unit composed of one or more pixels (for example, a processing unit having a block size of 4×4 pixels).
- In this case, a window is set for each processing unit, and the hole density is likewise calculated and compared for each processing unit.
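A sketch of this block-based variant, computing one hole density per processing unit instead of per pixel (the 4×4 block size follows the example in the text; the function name and edge handling are illustrative):

```python
import numpy as np

def block_hole_density(hole_map, block=4):
    """Hole density per (block x block) processing unit.

    Returns one density value per block; any partial blocks at the
    right/bottom edge are trimmed off in this simplified sketch.
    """
    h, w = hole_map.shape
    hb, wb = h // block, w // block
    trimmed = hole_map[:hb * block, :wb * block].astype(float)
    # Fold each (block x block) tile into its own axis pair and average.
    return trimmed.reshape(hb, block, wb, block).mean(axis=(1, 3))
```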
- In the above embodiments, the different viewpoint image generation devices 100 and 100a generate a different viewpoint image at a position between the two viewpoint positions of the two captured images, but they can also generate a different viewpoint image at a viewpoint position outside that range. Further, the two images need not have been taken for stereoscopic viewing; they may be any images of the same subject.
- the different viewpoint image generation devices 100 and 100a are realized, for example, as a television 800 shown in FIG.
- the different viewpoint image generation devices 100 and 100a generate different viewpoint images from two images for stereoscopic viewing that have been taken in advance.
- In this case, the different viewpoint image generation devices 100 and 100a can display to the user a combination of the generated different viewpoint image and the two images taken in advance for stereoscopic viewing.
- a configuration in which the user adjusts the parallax of the image displayed on the television 800 with the remote controller can be realized.
- the different viewpoint image generation apparatuses 100 and 100a are realized as a Blu-Ray (registered trademark) player 810, for example.
- the different viewpoint image generation apparatuses 100 and 100a generate different viewpoint images from the two images for stereoscopic viewing recorded on the inserted Blu-Ray (registered trademark) disc.
- the different viewpoint image generation apparatuses 100 and 100a may be realized as the set top box 820. In this case, the different viewpoint image generation apparatuses 100 and 100a generate different viewpoint images from two images for stereoscopic viewing acquired from cable television broadcasting or the like.
- The different viewpoint image generation apparatuses 100 and 100a may also be realized as a DSC (Digital Still Camera) having a 3D shooting function, shown in FIG. 14A, or as a digital video camera having a 3D shooting function, shown in FIG. 14B.
- the different viewpoint image generation devices 100 and 100a generate different viewpoint images from two images for stereoscopic viewing taken in advance.
- the different viewpoint image generation devices 100 and 100a may be realized by a server / client method.
- the different viewpoint images generated by the different viewpoint image generation apparatuses 100 and 100a are mainly used for parallax adjustment as described above, but may be used for other purposes.
- the different viewpoint image generation device is specifically a computer system including a microprocessor, a ROM, a RAM, a hard disk unit, a display unit, a keyboard, a mouse, and the like.
- a computer program is stored in the RAM or hard disk unit.
- the different viewpoint image generation apparatus achieves its function by the microprocessor operating according to the computer program.
- the computer program is configured by combining a plurality of instruction codes indicating instructions for the computer in order to achieve a predetermined function.
- a part or all of the components constituting the above-described another viewpoint image generation device may be configured by a single system LSI (Large Scale Integration).
- The system LSI is an ultra-multifunctional LSI manufactured by integrating a plurality of components on a single chip, and is specifically a computer system including a microprocessor, a ROM, a RAM, and the like.
- a computer program is stored in the RAM.
- the system LSI achieves its functions by the microprocessor operating according to the computer program.
- a part or all of the components constituting the different viewpoint image generation device may be configured as an IC card that can be attached to and detached from the different viewpoint image generation device or a single module.
- the IC card or the module is a computer system including a microprocessor, ROM, RAM, and the like.
- the IC card or the module may include the super multifunctional LSI described above.
- the IC card or the module achieves its function by the microprocessor operating according to the computer program. This IC card or this module may have tamper resistance.
- The present disclosure may be the methods described above. Further, the present disclosure may be a computer program that realizes these methods by a computer, or a digital signal composed of the computer program.
- The computer program or the digital signal may be recorded on a computer-readable recording medium, such as a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD (Blu-ray (registered trademark) Disc), or a semiconductor memory.
- the digital signal may be recorded on these recording media.
- the computer program or the digital signal may be transmitted via an electric communication line, a wireless or wired communication line, a network represented by the Internet, a data broadcast, or the like.
- the present disclosure may be a computer system including a microprocessor and a memory, and the memory may store the computer program, and the microprocessor may operate according to the computer program.
- The program or the digital signal may be recorded on the recording medium and transferred, or may be transferred via the network or the like, so that it can be executed by another independent computer system.
- According to the different viewpoint image generation device and the different viewpoint image generation method of the present disclosure, it is possible to generate a high-quality different viewpoint image from an image with depth map information captured by an imaging device.
- These configurations can be applied to devices such as consumer or commercial imaging devices (digital still cameras and video cameras) or portable terminals.
Abstract
Description
There are various techniques for displaying stereoscopic video such as 3D movies and 3D TV. These techniques have in common that a stereo image having parallax is presented to the viewer's left and right eyes, so that the viewer perceives a flat image as a stereoscopic image. The larger the amount of parallax between the stereo images displayed to the viewer's left and right eyes, the stronger the stereoscopic effect perceived by the viewer. Conversely, when the parallax amount of the stereo image is small, the perceived stereoscopic effect is weaker.
FIG. 1 is an overall configuration diagram of the different viewpoint image generation apparatus 100 according to the embodiment.
… (Equation 2)

α = 0 : Den(L) − Den(R) ≥ T
α = 1 − (Den(L) − Den(R)) / T : 0 < Den(L) − Den(R) < T
… (Equation 3)

… (Equation 4)
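The piecewise rule for the composition ratio α shown above can be written as a small function. Note that the branch for Den(L) − Den(R) ≤ 0 is not shown in the excerpt and is assumed here to return 1; the function and parameter names are illustrative:

```python
def composition_ratio(den_l, den_r, threshold):
    """Composition ratio alpha for the left image, following the
    piecewise rule of the excerpt:
        alpha = 0                 if Den(L) - Den(R) >= T
        alpha = 1 - diff / T      if 0 < Den(L) - Den(R) < T
    The remaining branch (diff <= 0) is an assumption, not shown
    in the source text.
    """
    diff = den_l - den_r
    if diff >= threshold:
        return 0.0
    if diff > 0:
        return 1.0 - diff / threshold
    return 1.0  # assumed branch: left image has no more holes than right
```

Intuitively, the more the left image's hole density exceeds the right image's (relative to the threshold T), the less weight the left image receives.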
As described above, the embodiments have been presented as examples of the technology disclosed in the present application. However, the technology of the present disclosure is not limited to these, and can also be applied to embodiments in which changes, replacements, additions, omissions, and the like are made as appropriate. New embodiments may also be formed by combining the components described in the above embodiments.
101 left different viewpoint image generation unit
102 right different viewpoint image generation unit
103 hole density calculation unit
104 composition ratio calculation unit
105, 105a hole area interpolation unit
106 different viewpoint image composition unit
201, 202 subject
210 left viewpoint position
211 left viewpoint image
212 left depth map
220 right viewpoint position
221 right viewpoint image
222 right depth map
301 virtual viewpoint position
302 left different viewpoint image
303 right different viewpoint image
310, 311, 409 hole area
401, 402, 408 hole map
403, 404, 405, 410a, 410b window
406, 407 hole density map
500 graph
510 composition ratio map
511 region
600 left interpolated image
610 right interpolated image
601, 611 region to be interpolated
700 output image
701 right region
800 television
810 Blu-Ray (registered trademark) player
820 set top box
Claims (11)
- A different viewpoint image generation device comprising: a different viewpoint image generation unit that generates, from each of two or more images acquired at two or more respective viewpoint positions, based on distance information indicating the depth of each pixel in the images, one different viewpoint image that corresponds to an image acquired at a virtual viewpoint position different from the two or more viewpoint positions and that includes a hole area, which is a region where pixel values are missing;
a hole density calculation unit that calculates, for each of the two or more generated different viewpoint images and for each processing unit composed of one or more pixels, a hole density, which is the proportion of the hole area of the different viewpoint image occupying a predetermined region including the processing unit;
a composition ratio calculation unit that calculates, for each processing unit, a composition ratio of each of the two or more different viewpoint images based on the hole density of each of the two or more different viewpoint images; and
a different viewpoint image composition unit that combines the two or more different viewpoint images based on the calculated composition ratios.
- The different viewpoint image generation device according to claim 1, further comprising a hole area interpolation unit that interpolates the hole area of each of the two or more different viewpoint images using pixel values in the different viewpoint image,
wherein the different viewpoint image composition unit combines the two or more different viewpoint images whose hole areas have been interpolated, based on the calculated composition ratios.
- The different viewpoint image generation device according to claim 1, further comprising a hole area interpolation unit that interpolates a hole area in the image combined by the different viewpoint image composition unit, using pixel values in that image.
- The different viewpoint image generation device according to any one of claims 1 to 3, wherein the hole density calculation unit calculates, for each of the two or more different viewpoint images and for each processing unit, the proportion of the hole area occupying a window, which is the predetermined region centered on the processing unit, as the hole density.
- The different viewpoint image generation device according to claim 4, wherein the hole density calculation unit calculates the hole density by giving different weights to a hole area located in the central portion of the window and a hole area located in the peripheral portion of the window.
- The different viewpoint image generation device according to any one of claims 1 to 5, wherein the composition ratio calculation unit calculates the composition ratio of each of the two or more different viewpoint images for each processing unit such that a different viewpoint image with a smaller hole density is given a larger composition ratio.
- The different viewpoint image generation device according to any one of claims 1 to 6, wherein the composition ratio calculation unit calculates the composition ratio of each of the two or more different viewpoint images for each processing unit such that, when the hole densities are equal, a different viewpoint image whose viewpoint position is closer to the virtual viewpoint position is given a larger composition ratio.
- The different viewpoint image generation device according to any one of claims 1 to 7, wherein the processing unit is a pixel, the hole density calculation unit calculates the hole density for each pixel of each of the two or more different viewpoint images, and the composition ratio calculation unit calculates the composition ratio of each of the two or more different viewpoint images for each pixel.
- A different viewpoint image generation method comprising: a different viewpoint image generation step of generating, from each of two or more images acquired at two or more respective viewpoint positions, based on distance information indicating the depth of each pixel in the images, one different viewpoint image that corresponds to an image acquired at a virtual viewpoint position different from the two or more viewpoint positions and that includes a hole area, which is a region where pixel values are missing;
a hole density calculation step of calculating, for each of the two or more generated different viewpoint images and for each processing unit composed of one or more pixels, a hole density, which is the proportion of the hole area of the different viewpoint image occupying a predetermined region including the processing unit;
a composition ratio calculation step of calculating, for each processing unit, a composition ratio of each of the two or more different viewpoint images based on the hole density of each of the two or more different viewpoint images; and
a different viewpoint image composition step of combining the two or more different viewpoint images based on the calculated composition ratios.
- A program for causing a computer to execute the different viewpoint image generation method according to claim 9.
- An integrated circuit comprising: a different viewpoint image generation unit that generates, from each of two or more images acquired at two or more respective viewpoint positions, based on distance information indicating the depth of each pixel in the images, one different viewpoint image that corresponds to an image acquired at a virtual viewpoint position different from the two or more viewpoint positions and that includes a hole area, which is a region where pixel values are missing;
a hole density calculation unit that calculates, for each of the two or more generated different viewpoint images and for each processing unit composed of one or more pixels, a hole density, which is the proportion of the hole area of the different viewpoint image occupying a predetermined region including the processing unit;
a composition ratio calculation unit that calculates, for each processing unit, a composition ratio of each of the two or more different viewpoint images based on the hole density of each of the two or more different viewpoint images; and
a different viewpoint image composition unit that combines the two or more different viewpoint images based on the calculated composition ratios.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/374,379 US9596445B2 (en) | 2012-11-30 | 2013-10-16 | Different-view image generating apparatus and different-view image generating method |
CN201380006913.8A CN104081768B (zh) | 2012-11-30 | 2013-10-16 | 异视点图像生成装置以及异视点图像生成方法 |
JP2014513852A JP6195076B2 (ja) | 2012-11-30 | 2013-10-16 | 別視点画像生成装置および別視点画像生成方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-263627 | 2012-11-30 | ||
JP2012263627 | 2012-11-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014083752A1 true WO2014083752A1 (ja) | 2014-06-05 |
Family
ID=50827417
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/006141 WO2014083752A1 (ja) | 2012-11-30 | 2013-10-16 | 別視点画像生成装置および別視点画像生成方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9596445B2 (ja) |
JP (1) | JP6195076B2 (ja) |
CN (1) | CN104081768B (ja) |
WO (1) | WO2014083752A1 (ja) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6000520B2 (ja) * | 2011-07-25 | 2016-09-28 | キヤノン株式会社 | 撮像装置およびその制御方法およびプログラム |
JP2015129846A (ja) * | 2014-01-07 | 2015-07-16 | キヤノン株式会社 | 撮像装置およびその制御方法 |
JP6381266B2 (ja) * | 2014-04-15 | 2018-08-29 | キヤノン株式会社 | 撮像装置、制御装置、制御方法、プログラム、および、記憶媒体 |
TWI573433B (zh) * | 2014-04-30 | 2017-03-01 | 聚晶半導體股份有限公司 | 優化深度資訊的方法與裝置 |
JP6555056B2 (ja) * | 2015-09-30 | 2019-08-07 | アイシン精機株式会社 | 周辺監視装置 |
CN105847782A (zh) * | 2016-04-15 | 2016-08-10 | 乐视控股(北京)有限公司 | 一种三维图像生成方法和装置 |
US11348269B1 (en) | 2017-07-27 | 2022-05-31 | AI Incorporated | Method and apparatus for combining data to construct a floor plan |
US10915114B2 (en) | 2017-07-27 | 2021-02-09 | AI Incorporated | Method and apparatus for combining data to construct a floor plan |
US10521963B1 (en) * | 2018-06-08 | 2019-12-31 | Verizon Patent And Licensing Inc. | Methods and systems for representing a pre-modeled object within virtual reality data |
US10510155B1 (en) * | 2019-06-11 | 2019-12-17 | Mujin, Inc. | Method and processing system for updating a first image generated by a first camera based on a second image generated by a second camera |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012501494A (ja) * | 2008-08-29 | 2012-01-19 | トムソン ライセンシング | ヒューリスティックなビューブレンディングによるビュー合成 |
WO2012147329A1 (ja) * | 2011-04-28 | 2012-11-01 | パナソニック株式会社 | 立体視強度調整装置、立体視強度調整方法、プログラム、集積回路、記録媒体 |
WO2012153513A1 (ja) * | 2011-05-12 | 2012-11-15 | パナソニック株式会社 | 画像生成装置、及び画像生成方法 |
WO2013005365A1 (ja) * | 2011-07-01 | 2013-01-10 | パナソニック株式会社 | 画像処理装置、画像処理方法、プログラム、集積回路 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8395642B2 (en) | 2009-03-17 | 2013-03-12 | Mitsubishi Electric Research Laboratories, Inc. | Method for virtual image synthesis |
JP4939639B2 (ja) | 2010-09-28 | 2012-05-30 | シャープ株式会社 | 画像処理装置、画像処理方法、プログラム及び記録媒体 |
CN102325259A (zh) * | 2011-09-09 | 2012-01-18 | 青岛海信数字多媒体技术国家重点实验室有限公司 | 多视点视频中虚拟视点合成方法及装置 |
CN102447925B (zh) * | 2011-09-09 | 2014-09-10 | 海信集团有限公司 | 一种虚拟视点图像合成方法及装置 |
-
2013
- 2013-10-16 WO PCT/JP2013/006141 patent/WO2014083752A1/ja active Application Filing
- 2013-10-16 US US14/374,379 patent/US9596445B2/en active Active
- 2013-10-16 JP JP2014513852A patent/JP6195076B2/ja not_active Expired - Fee Related
- 2013-10-16 CN CN201380006913.8A patent/CN104081768B/zh active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012501494A (ja) * | 2008-08-29 | 2012-01-19 | トムソン ライセンシング | ヒューリスティックなビューブレンディングによるビュー合成 |
WO2012147329A1 (ja) * | 2011-04-28 | 2012-11-01 | パナソニック株式会社 | 立体視強度調整装置、立体視強度調整方法、プログラム、集積回路、記録媒体 |
WO2012153513A1 (ja) * | 2011-05-12 | 2012-11-15 | パナソニック株式会社 | 画像生成装置、及び画像生成方法 |
WO2013005365A1 (ja) * | 2011-07-01 | 2013-01-10 | パナソニック株式会社 | 画像処理装置、画像処理方法、プログラム、集積回路 |
Also Published As
Publication number | Publication date |
---|---|
US20150109409A1 (en) | 2015-04-23 |
CN104081768B (zh) | 2017-03-01 |
JPWO2014083752A1 (ja) | 2017-01-05 |
CN104081768A (zh) | 2014-10-01 |
JP6195076B2 (ja) | 2017-09-13 |
US9596445B2 (en) | 2017-03-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6195076B2 (ja) | 別視点画像生成装置および別視点画像生成方法 | |
US8488869B2 (en) | Image processing method and apparatus | |
US9219911B2 (en) | Image processing apparatus, image processing method, and program | |
JP4939639B2 (ja) | 画像処理装置、画像処理方法、プログラム及び記録媒体 | |
WO2012176431A1 (ja) | 多視点画像生成装置、多視点画像生成方法 | |
JP2014056466A (ja) | 画像処理装置及び方法 | |
JP5755571B2 (ja) | 仮想視点画像生成装置、仮想視点画像生成方法、制御プログラム、記録媒体、および立体表示装置 | |
JP2012227797A (ja) | 立体視画像生成方法および立体視画像生成システム | |
JP2012244396A (ja) | 画像処理装置、画像処理方法、およびプログラム | |
JP2017510092A (ja) | オートステレオスコピックマルチビューディスプレイのための画像の生成 | |
WO2019050038A1 (ja) | 画像生成方法および画像生成装置 | |
JP6033625B2 (ja) | 多視点画像生成装置、画像生成方法、表示装置、プログラム、及び、記録媒体 | |
JP2013223008A (ja) | 画像処理装置及び方法 | |
JP6148154B2 (ja) | 画像処理装置及び画像処理プログラム | |
JP5691965B2 (ja) | 奥行き推定データの生成装置、生成方法及び生成プログラム、並びに疑似立体画像の生成装置、生成方法及び生成プログラム | |
JP5861114B2 (ja) | 画像処理装置、及び画像処理方法 | |
JP2014042238A (ja) | 3dビジュアルコンテントのデプスベースイメージスケーリング用の装置及び方法 | |
JP5627498B2 (ja) | 立体画像生成装置及び方法 | |
US20130187907A1 (en) | Image processing apparatus, image processing method, and program | |
JP4815004B2 (ja) | 多視点画像符号化装置 | |
US20150334364A1 (en) | Method and device for generating stereoscopic video pair | |
WO2014045471A1 (ja) | 画像信号処理装置および画像信号処理方法 | |
JP6217486B2 (ja) | 立体画像生成装置、立体画像生成方法、及び立体画像生成プログラム | |
US20140055579A1 (en) | Parallax adjustment device, three-dimensional image generation device, and method of adjusting parallax amount | |
JP6217485B2 (ja) | 立体画像生成装置、立体画像生成方法、及び立体画像生成プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2014513852 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13858382 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14374379 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13858382 Country of ref document: EP Kind code of ref document: A1 |