WO2013118955A1 - Apparatus and method for depth map correction, and apparatus and method for stereoscopic image conversion using same

Info

Publication number
WO2013118955A1
Authority
WO
WIPO (PCT)
Prior art keywords
depth map
correction
region
image
depth
Prior art date
Application number
PCT/KR2012/008238
Other languages
French (fr)
Korean (ko)
Inventor
우대식
박재범
전병기
김종대
정원석
Original Assignee
에스케이플래닛 주식회사
시모스 미디어텍(주)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to KR1020120013708A priority Critical patent/KR101332638B1/en
Priority to KR10-2012-0013708 priority
Application filed by 에스케이플래닛 주식회사, 시모스 미디어텍(주)
Publication of WO2013118955A1 publication Critical patent/WO2013118955A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/001Image restoration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/128Adjusting depth or disparity

Abstract

The present invention relates to an apparatus and method for depth map correction, and to an apparatus and method for stereoscopic image conversion using the same. The depth map correction apparatus comprises: a filtering unit that performs noise filtering or sharp characteristic enhancement filtering on an input two-dimensional image; a region setting unit that selects the boundary with the largest change in pixel characteristics in the filtered image and, relative to that boundary, divides the filtered image into a correction region, a neighboring region, and an outer region according to the degree of correction; and a depth value correction unit that corrects depth values in the correction region by interpolating between the filtered image and a previously generated depth map, and generates a corrected depth map by also correcting the depth values of the neighboring and outer regions.

Description

Depth map correction device and method and stereoscopic image conversion device and method using same

The present invention relates to an apparatus and method for correcting a depth map, and to an apparatus and method for stereoscopic image conversion using the same. More particularly, noise filtering or sharp characteristic enhancement filtering is performed on a two-dimensional input image; the boundary with the largest change in pixel characteristics is selected in the filtered image; the image is divided, relative to that boundary, into a correction region, a neighboring region, and an outer region according to the degree of correction; depth values in the correction region are corrected by interpolating between the filtered image and the depth map; and a corrected depth map is generated by also correcting the depth values of the neighboring and outer regions.

Recently, as interest in 3D images has grown, research on them is being actively conducted.

In general, humans perceive stereoscopic depth mainly through the parallax between the two eyes, and 3D imaging exploits this characteristic. For example, a scene can be captured as a left-eye image, seen through the viewer's left eye, and a right-eye image, seen through the viewer's right eye; displaying both simultaneously lets the viewer perceive the scene as three-dimensional. A 3D image can therefore be implemented by producing and displaying such a binocular pair.

To convert a monocular 2D image without depth information into a 3D image, depth information must be added to the 2D image before rendering.

In general, stereoscopic conversion is divided into manual and automatic methods. In the manual method, a person creates a depth map for every image based on subjective judgment while watching the video. Because a person who can estimate depth produces the depth map for each image directly, the error of the depth map is very small; however, this direct intervention in every frame requires a great deal of time and effort.

Automatic stereoscopic conversion analyzes the characteristics of an image to extract an appropriate depth map and uses it to generate left and right stereoscopic images. Since the image itself carries no depth information, the depth map is generated from general image characteristics such as edges, color, brightness, and vanishing points. These features, however, often do not coincide with the true stereoscopic structure of the scene, so the resulting depth map contains many errors, particularly along boundaries: the object boundaries in the actual image and the boundaries of the generated depth map do not coincide, producing visual dizziness or an uncomfortable stereoscopic sensation from the incomplete representation of boundaries.

The present invention has been made to solve the above problems. An object of the present invention is to provide a depth map correction apparatus and method, and a stereoscopic image conversion apparatus and method using the same, that correct the boundary characteristics of a depth map that do not match the original image so that they match as closely as possible, thereby mitigating the dizziness caused by the mismatch.

Another object of the present invention is to provide a depth map correction apparatus and method that, from the standpoint of image conversion, correct depth map errors through image processing so that the depth map's boundary characteristics match the image as closely as possible, and a stereoscopic image conversion apparatus and method using the same.

Still another object of the present invention is to provide a depth map correction apparatus and method that correct the errors of a depth map and minimize image conversion errors by converting a 2D image into a 3D image using the corrected depth map, and a stereoscopic image conversion apparatus and method using the same.

According to an aspect of the present invention, there is provided a depth map correction apparatus comprising: a filtering unit that performs noise filtering or sharp characteristic enhancement filtering on a two-dimensional input image; a region setting unit that selects the boundary with the largest change in pixel characteristics in the filtered image and divides the image into a correction region, a neighboring region, and an outer region according to the degree of correction relative to that boundary; and a depth value correction unit that corrects depth values in the correction region by interpolating between the filtered image and a previously generated depth map, and generates a corrected depth map by also correcting the depth values of the neighboring and outer regions.

The filtering unit may be a noise filter that removes noise components of the input image, or a sharp characteristic enhancement filter that widens the deviation of pixel values across boundaries of the input image beyond a predetermined value.

The region setting unit includes: a correction region setting unit that sets, as the correction region, the area corresponding to a boundary in the filtered image where pixel characteristics change most with position; a neighboring region setting unit that sets, as the neighboring region, the adjacent area within a predetermined distance of the correction region; and an outer region setting unit that sets the remaining area, excluding the correction and neighboring regions, as the outer region.

The depth value correction unit corrects the depth value of the depth map corresponding to the correction region using the following equation.

[Equation]

Correction region: New Depth(i) = Σₙ (SI(n) × Depth(n))

Here, i is a pixel index to the left and right of the correction region; n ranges over the interpolation window, which is set slightly smaller than the correction region; SI(n) is a pixel value of the input image (original image); Depth(n) is a pixel value of the depth map; and New Depth(i) is the corrected depth value at pixel position i.

In addition, the depth value correction unit corrects the depth values of the outer region through Gaussian filtering or low-pass filtering.

In addition, the depth value correction unit corrects the depth values of the depth map in the neighboring region using the following equation.

[Equation]

Neighboring region: Neighborhood Depth(i) = A + delta × i

Here, i is the pixel index from the correction region side to the outer region side, i.e., the pixel position within the neighboring region; A is the pixel value of the correction region where it meets the neighboring region; delta is ((B − A) / (k − j)); B is the pixel value of the outer region where it meets the neighboring region; j is the position index of the pixel with value A; and k is the position index of the pixel with value B.

According to another aspect of the invention, there is provided a stereoscopic image conversion apparatus comprising: an image analysis unit that extracts at least one piece of characteristic information by analyzing a two-dimensional input image; a depth map generator that generates a depth map of the input image based on the characteristic information; a depth map corrector that filters the input image and corrects the generated depth map using the filtered image; and a stereoscopic image generator that converts the input image into a 3D stereoscopic image using the corrected depth map.

The image analyzer extracts characteristic information including at least one of edge information, color information, luminance information, motion information, and histogram information.

The depth map generator generates a depth map by dividing the pixels constituting the input image into at least one block and setting a depth value for each block.

The depth map correction unit performs noise filtering or sharp characteristic enhancement filtering on the input image, selects the boundary with the largest change in pixel characteristics in the filtered image, divides the image into a correction region, a neighboring region, and an outer region according to the degree of correction relative to that boundary, corrects depth values in the correction region by interpolating between the filtered image and the depth map, and generates a corrected depth map by also correcting the depth values of the neighboring and outer regions.

According to another aspect of the present invention, there is provided a depth map correction method performed by a depth map correction apparatus, comprising: (a) performing noise filtering or sharp characteristic enhancement filtering on a two-dimensional input image; (b) selecting the boundary with the largest change in pixel characteristics in the filtered image, and dividing the image into a correction region, a neighboring region, and an outer region according to the degree of correction relative to that boundary; and (c) correcting depth values in the correction region by interpolating between the filtered image and a previously generated depth map, and generating a corrected depth map by also correcting the depth values of the outer and neighboring regions.

Step (c) comprises: correcting the depth map of the correction region by interpolating between the filtered image and the depth map; correcting the depth values of the outer region through Gaussian filtering or low-pass filtering; and correcting the depth values of the neighboring region using the pixel value of the correction region where it meets the neighboring region and a gradient value corresponding to the rate of pixel change that connects the correction region to the outer region.

According to another aspect of the present invention, there is provided a method for converting a two-dimensional input image into a three-dimensional stereoscopic image in a stereoscopic image conversion apparatus, comprising: extracting at least one piece of characteristic information by analyzing the two-dimensional input image; generating a depth map of the input image based on the characteristic information; filtering the input image and correcting the generated depth map using the filtered image; and converting the input image into a three-dimensional stereoscopic image using the corrected depth map.

In the correcting of the depth map, noise filtering or sharp characteristic enhancement filtering is performed on the input image; the boundary with the largest change in pixel characteristics is selected in the filtered image; the image is divided into a correction region, a neighboring region, and an outer region according to the degree of correction relative to that boundary; depth values in the correction region are corrected by interpolating between the filtered image and the depth map; and a corrected depth map is generated by also correcting the depth values of the neighboring and outer regions.

Therefore, according to the present invention, the dizziness caused by mismatch can be mitigated by correcting the boundary characteristics of a depth map that do not coincide with the original image so that they match the original image as closely as possible.

In addition, from the standpoint of image conversion, errors in the depth map can be corrected through image processing so that its boundary characteristics match the image objects as closely as possible.

In addition, the errors of the depth map can be corrected, and image conversion errors can be minimized by converting the 2D image into a 3D image using the corrected depth map.

FIG. 1 is a block diagram showing the configuration of a stereoscopic image conversion apparatus according to the present invention.

FIG. 2 is a block diagram schematically showing the configuration of a depth map correction apparatus according to the present invention.

FIG. 3 is a flowchart illustrating a method of converting a two-dimensional input image into a three-dimensional stereoscopic image by the stereoscopic image conversion apparatus according to the present invention.

FIG. 4 is an exemplary diagram for explaining a depth map before and after correction according to the present invention.

FIG. 5 is a flowchart illustrating a method of correcting a depth map by a depth map correction apparatus according to the present invention.

FIG. 6 is a view for explaining the correction process of the depth map according to the present invention.

Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the description with reference to the accompanying drawings, the same or corresponding components will be given the same reference numerals and redundant description thereof will be omitted.

The depth map correction below rests on three preconditions relating the original image (that is, the input image) and the depth map:

① Overall, the depth map properly reflects the depth of the original image and contains only local errors in areas such as boundaries.

② The brightness or color boundaries of the original image match the boundaries of the depth map.

③ The original image is minimally affected by external influences such as noise.

A method of correcting a depth map based on the above preconditions will be described with reference to the drawings.

FIG. 1 is a block diagram showing the configuration of a stereoscopic image conversion apparatus according to the present invention.

Referring to FIG. 1, the stereoscopic image converting apparatus 100 includes an image analyzer 110, a depth map generator 120, a depth map corrector 130, and a stereoscopic image generator 140.

The image analyzer 110 extracts at least one piece of characteristic information by analyzing the two-dimensional input image. The characteristic information includes edge information, color information, luminance information, motion information, histogram information, and the like.

The image analyzer 110 extracts characteristic information from the image through various analysis methods, in units of pixels or blocks, in order to collect the information on which depth map generation is based.
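
As an illustration only, and not part of the claimed subject matter, the following Python sketch shows one plausible way such per-pixel characteristic information (luminance, edge strength, a histogram) could be computed; the function name, the BT.601 weighting, and the gradient-based edge measure are assumptions, not taken from the specification.

```python
import numpy as np

def extract_features(image):
    """Illustrative feature extraction for depth-map generation.

    `image` is an (H, W, 3) RGB array with values in [0, 255].
    The feature set (luminance, edges, histogram) follows the
    patent's list; the concrete formulas are assumptions.
    """
    img = image.astype(np.float64)
    # Luminance via the common BT.601 weighting.
    luma = 0.299 * img[..., 0] + 0.587 * img[..., 1] + 0.114 * img[..., 2]
    # Edge strength from row/column gradients.
    gy, gx = np.gradient(luma)
    edges = np.hypot(gx, gy)
    # Global luminance histogram, 256 bins.
    hist, _ = np.histogram(luma, bins=256, range=(0.0, 256.0))
    return {"luminance": luma, "edges": edges, "histogram": hist}
```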

The depth map generator 120 generates a depth map of the input image based on the characteristic information extracted by the image analyzer 110. That is, the depth map generator 120 generates a depth map by dividing the pixels constituting the input image into at least one block and setting a depth value for each block.

In addition, the depth map generator 120 generates a depth map for each frame of the 2D image based on the extracted characteristic information; that is, it sets a depth value for each pixel of each frame. Here, the depth map is a data structure that stores, for each frame of the 2D image, the depth value of every pixel.
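
A minimal sketch of such block-wise depth assignment follows; the block size and the brightness-as-depth heuristic are illustrative assumptions, since the patent does not fix how feature information maps to depth values.

```python
import numpy as np

def generate_block_depth_map(luma, block=8):
    """Toy block-based depth map: one depth value per block.

    Uses mean block luminance as a stand-in depth cue (an assumption);
    a real generator would combine edge, color, motion, vanishing-point,
    and histogram information.
    """
    h, w = luma.shape
    depth = np.zeros_like(luma)
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = luma[y:y + block, x:x + block]
            depth[y:y + block, x:x + block] = tile.mean()
    return depth
```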

The depth map corrector 130 filters the input image and corrects the depth map generated by the depth map generator 120 using the filtered image. That is, the depth map corrector 130 performs noise filtering or sharp characteristic enhancement filtering on the input image and selects the boundary with the largest change in pixel characteristics in the filtered image; here, the largest change in pixel characteristics means the largest change in pixel value. The depth map corrector 130 then divides the image into a correction region, a neighboring region, and an outer region according to the degree of correction relative to the selected boundary, corrects depth values in the correction region by interpolating between the filtered image and the depth map, and generates a corrected depth map by also correcting the depth values of the neighboring and outer regions.

The depth map corrector 130 is described in detail with reference to FIG. 2.

The stereoscopic image generator 140 converts the two-dimensional input image into a three-dimensional stereoscopic image using the depth map corrected by the depth map corrector 130. For example, the stereoscopic image generator 140 may derive parallax information from the corrected depth map and generate a 3D stereoscopic image from that parallax information. The more the per-pixel depth values vary within each frame, the more stereoscopic the generated image appears.

Here, the stereoscopic image generator 140 converts the 2D image into a 3D stereoscopic image using parallax information, but any of various conventional methods of converting an input image into a stereoscopic image using the corrected depth map may be followed.
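
One conventional method of this kind is simple depth-image-based rendering, sketched below under the assumption that disparity is proportional to normalized depth; hole filling for disocclusions, which a practical converter needs, is omitted.

```python
import numpy as np

def render_stereo_pair(image, depth, max_disparity=16):
    """Naive depth-image-based rendering sketch.

    Shifts each pixel left/right by a disparity proportional to its
    normalized depth. Real converters also fill the disocclusion
    holes that this forward warping leaves behind.
    """
    h, w = depth.shape
    disparity = (depth / max(depth.max(), 1e-6) * max_disparity).astype(int)
    left = np.zeros_like(image)
    right = np.zeros_like(image)
    cols = np.arange(w)
    for y in range(h):
        xl = np.clip(cols + disparity[y], 0, w - 1)
        xr = np.clip(cols - disparity[y], 0, w - 1)
        left[y, xl] = image[y, cols]
        right[y, xr] = image[y, cols]
    return left, right
```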

The stereoscopic image conversion apparatus 100 configured as described above can convert a 2D input image into a 3D image by setting depth values for the input image based on its characteristic information.

FIG. 2 is a block diagram schematically showing the configuration of a depth map correction apparatus according to the present invention.

FIG. 1 described the depth map correction unit; FIG. 2 describes the depth map correction apparatus 200 as a standalone device.

Referring to FIG. 2, the depth map correcting apparatus 200 includes a filtering unit 210, an area setting unit 220, and a depth value correcting unit 230.

The filtering unit 210 performs noise filtering or sharp characteristic enhancement filtering on the 2D input image. Because the boundaries of an image are strongly affected by external influences such as noise, the filtering unit 210 performs noise filtering to minimize the influence of noise before image processing, and sharp characteristic enhancement filtering to strengthen boundary characteristics. Here, noise filtering means minimizing the effect of noise with a filter that attenuates noise components and passes the required signal components, for example a low-pass filter. Sharp characteristic enhancement filtering means widening the deviation of pixel values across a boundary so that the boundary is visually clearer and more distinct; it may therefore be realized as filtering through a high-pass filter that passes frequencies above a given cutoff and attenuates the lower band.

Accordingly, the filtering unit 210 may include a noise filter that removes noise components of the input image, or a sharp characteristic enhancement filter that widens the deviation of pixel values across boundaries of the input image beyond a predetermined value.
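
A minimal sketch of such a filtering unit is shown below, assuming a Gaussian low-pass for the noise filter and an unsharp mask for the sharp characteristic enhancement filter; the patent does not prescribe these particular filters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def prefilter(luma, mode="sharpen", sigma=1.0, amount=1.5):
    """Filtering-unit sketch: Gaussian low-pass or unsharp mask.

    mode="denoise" attenuates noise; mode="sharpen" widens the
    pixel-value deviation across boundaries by amplifying the
    high-frequency residual (one way to realize high-pass-style
    boundary enhancement).
    """
    luma = np.asarray(luma, dtype=np.float64)
    smooth = gaussian_filter(luma, sigma=sigma)
    if mode == "denoise":
        return smooth
    return luma + amount * (luma - smooth)
```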

The region setting unit 220 selects the boundary with the largest change in pixel characteristics in the filtered image and divides the image into a correction region, a neighboring region, and an outer region according to the degree of correction.

That is, the region setting unit 220 sets, as the correction region, the area corresponding to a boundary of the filtered image where the characteristic or magnitude of the pixels changes most with position; sets, as the neighboring region, the adjacent area within a predetermined distance of the correction region; and sets the remaining area, excluding the correction and neighboring regions, as the outer region.

The correction region corresponds to a boundary of the filtered image and is selected as the area where pixel characteristics or magnitudes change most with position; it is the region in which depth map correction is performed around each boundary. The neighboring region is adjacent to the correction region and is the region whose depth values are set, to the left and right, after the correction region's depth values have been corrected. The outer region is everything except the correction and neighboring regions, and is the region outside each boundary subject to overall depth value correction. A one-dimensional sketch of this three-way partition is given below.
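
Reflecting the one-dimensional view used later with FIG. 6, the following sketch marks the three regions along a scanline; the gradient threshold and band widths are illustrative parameters that the patent leaves open.

```python
import numpy as np

def partition_regions(filtered_row, corr_halfwidth=4, neigh_width=6,
                      grad_thresh=30.0):
    """Label a 1-D scanline: 2 = correction, 1 = neighboring, 0 = outer.

    A correction band is placed around every pixel whose gradient
    exceeds grad_thresh, with a neighboring band on either side;
    everything else stays outer. All widths/thresholds are assumptions.
    """
    row = np.asarray(filtered_row, dtype=np.float64)
    grad = np.abs(np.diff(row, prepend=row[0]))
    labels = np.zeros(row.size, dtype=np.uint8)
    for i in np.flatnonzero(grad > grad_thresh):
        lo, hi = max(i - corr_halfwidth, 0), i + corr_halfwidth + 1
        nlo, nhi = max(lo - neigh_width, 0), hi + neigh_width
        labels[nlo:nhi] = np.maximum(labels[nlo:nhi], 1)  # neighboring band
        labels[lo:hi] = 2                                 # correction band
    return labels
```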

The depth value corrector 230 corrects depth values in the correction region by interpolating between the filtered image and a previously generated depth map, and generates a corrected depth map by also correcting the depth values of the neighboring and outer regions.

That is, the depth value corrector 230 corrects the depth value of the correction region, New Depth(i), using Equation 1.

Equation 1

New Depth(i) = Σₙ (SI(n) × Depth(n))

Here, i is a pixel index to the left/right of the correction region, and n ranges over the interpolation window, which is usually set slightly smaller than the correction region. SI(n) denotes a pixel value of the input (filtered) image, Depth(n) a pixel value of the depth map, and New Depth(i) the corrected depth value at pixel position i.

In other words, the depth value corrector 230 interpolates between the filtered image and the depth map over the correction region according to Equation 1, under the precondition that the boundary characteristics of the filtered image match those of the depth map. The interpolation yields a depth map in which the boundary characteristics of the original image are reflected and boundaries are more clearly distinguished.
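
A sketch of this interpolation on one scanline follows. Equation 1 as printed leaves the weighting unnormalized; the sketch assumes the filtered-image values SI(n) are normalized over the window so that the result stays in the depth map's value range — an added assumption, not the patent's wording.

```python
import numpy as np

def correct_band(si, depth, i, half=3):
    """Eq. 1 sketch: image-weighted depth average around pixel i.

    The window of 2*half + 1 samples plays the role of n, set slightly
    smaller than the correction band; normalizing SI over the window
    is an assumption made here for numerical sanity.
    """
    n = slice(max(i - half, 0), i + half + 1)
    w = si[n] / max(float(np.sum(si[n])), 1e-6)  # normalized image weights
    return float(np.sum(w * depth[n]))
```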

When the depth values of the correction region have been determined through Equation 1, the depth value corrector 230 corrects the depth values of the outer region.

Unlike the correction region along the boundary, the outer region is a stable region where the depth map changes very little. For the stability of the depth map, Gaussian filtering or similar low-pass filtering is performed; this filtering of the outer region is additionally carried out for the stability of the overall depth map.

When the depth values of the outer region have been corrected as described above, the depth value corrector 230 performs depth value correction on the neighboring region. The neighboring region is the intermediate area between the correction region and the outer region; since those two regions are corrected differently, it serves as a buffer that connects them.

Therefore, the depth value correcting unit 230 performs depth value correction on the neighboring area by using Equation 2.

Equation 2

Neighborhood Depth(i) = A + delta × i

Here, i is the pixel index from the correction region side to the outer region side, i.e., the pixel position within the neighboring region, and A is the pixel value of the correction region where it meets the neighboring region.

The delta is defined as in Equation 3.

Equation 3

delta = (B − A) / (k − j)

Here, A is the pixel value of the correction region where it meets the neighboring region, B is the pixel value of the outer region where it meets the neighboring region, j is the position index of the pixel with value A, and k is the position index of the pixel with value B.

As a result, delta is the slope corresponding to the rate of pixel change that connects the correction region to the outer region. The neighboring-region depth values of Equation 2 therefore guarantee the continuity of the depth map by linearly connecting the last pixel value of the correction region to the pixel value at the start of the outer region.
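
The sketch below applies Equations 2 and 3 to one neighboring band, given the indices j and k of the pixels carrying A and B; it is a direct transcription of the linear bridge, with the function name and in-place update assumed for illustration.

```python
import numpy as np

def ramp_neighboring(depth, j, k):
    """Eqs. 2-3 sketch: linearly bridge depth between indices j and k.

    A = depth[j] is the correction-region pixel touching the band,
    B = depth[k] is the outer-region pixel touching it; interior
    pixels get A + delta * step with delta = (B - A) / (k - j).
    """
    A, B = float(depth[j]), float(depth[k])
    delta = (B - A) / (k - j)
    for step, i in enumerate(range(j + 1, k), start=1):
        depth[i] = A + delta * step
    return depth
```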

Using the above method, the depth value correcting unit 230 generates a depth map in which depth values corresponding to each region are corrected.

FIG. 3 is a flowchart illustrating a method of converting a 2D input image into a 3D stereoscopic image by the stereoscopic image conversion apparatus according to the present invention, and FIG. 4 is an exemplary diagram for describing a depth map before and after correction according to the present invention.

Referring to FIG. 3, the stereoscopic image conversion apparatus analyzes the two-dimensional input image to extract at least one piece of characteristic information (S302). Here, the characteristic information includes edge information, color information, luminance information, motion information, histogram information, and the like.

After performing step S302, the stereoscopic image conversion apparatus generates a depth map of the input image based on the characteristic information (S304).

Then, the stereoscopic image conversion apparatus filters the input image and corrects the generated depth map using the filtered image (S306). The way the apparatus corrects the depth map is described with reference to FIG. 5.

Referring to FIG. 4, the depth map generated in S304 is compared with the depth map corrected in S306.

FIG. 4(a) is an example of an original image (the input image) for stereoscopic conversion. When a depth map is generated from the original image 400 of (a) based on its boundary characteristics, the depth map 410 of FIG. 4(b) results.

Comparing the generated depth map 410 with the original image 400, the overall depth map 410 represents the depth of the original image 400 properly, but at the boundaries between objects it is very rough. For example, comparing the original image 400 and the depth map 410 around the boundary between person A and person B, the boundary between the two in the depth map 410 is unclear, and there are local errors where it does not match the image naturally. That is, the depth map 410 is rendered with considerable waviness and has a very low resolution compared to the original image 400.

For the depth map 410 of (b), which has local boundary errors and low resolution compared to the original image 400, the boundary is corrected using the original image 400 to express a natural three-dimensional effect, and the depth map is simultaneously corrected to the resolution level of the original image 400, producing the boundary-corrected depth map 420 of FIG. 4(c).

That is, the original image 400 is filtered; the depth values along the boundary between person A and person B, the boundary with the largest change in pixel characteristics in the filtered image, are corrected using Equation 1; and when the depth values of the neighboring and outer regions are also corrected, the corrected depth map 420 of FIG. 4(c) results. The method of correcting the depth values of the neighboring and outer regions is described in detail with reference to FIG. 2.

Looking at the corrected depth map 420, the boundary between person A and person B is clear, so the two can be clearly distinguished, and the map has a resolution similar to that of the original image 400.

Referring again to FIG. 3, after performing S306, the stereoscopic image conversion apparatus converts the input image into a 3D stereoscopic image using the corrected depth map (S308).

As described above, the boundary characteristics of the depth map that do not match the original image are corrected as far as possible in consideration of the original image, and the 2D image is converted into a 3D stereoscopic image using the corrected depth map, which mitigates the dizziness caused by the mismatch.

FIG. 5 is a flowchart illustrating a method of correcting a depth map by a depth map correction apparatus according to the present invention, and FIG. 6 is a view for explaining the depth map correction process according to the present invention.

Referring to FIG. 5, the depth map correction apparatus performs noise filtering and sharp characteristic enhancement filtering on the two-dimensional input image (hereinafter, the original image) (S502). Because image boundaries are strongly affected by external influences such as noise, the cleaner the original image, the greater the correction effect. The apparatus therefore performs noise filtering to minimize such effects before image processing, and sharp characteristic enhancement filtering to strengthen the boundary characteristics. Referring to FIG. 4(a) for the sharp characteristic enhancement filtering: boundaries such as those between the people and the mountain, or between the mountain and the sky, are the distinct object boundaries that carry the three-dimensional structure, and the filtering makes the pixel-value deviations across such boundaries more distinct.

After performing S502, the depth map correction apparatus selects the boundary with the largest change in pixel characteristics in the filtered image (S504) and divides the image into a correction region, a neighboring region, and an outer region according to the degree of correction relative to that boundary (S506).

The correction region corresponds to a boundary of the filtered image and is selected as the area where pixel characteristics or magnitudes change most with position; it is the region in which depth map correction is performed around each boundary. The neighboring region is adjacent to the correction region and is the region whose depth values are set, to the left and right, after the correction region's depth values have been corrected. The outer region is everything except the correction and neighboring regions, and is the region outside each boundary subject to overall depth value correction.

In this division, the correction range of the correction region is set first according to the degree of correction; the neighboring region, whose depth values are more stable, is set just outside that range; and the remaining area is defined as the outer region.

After S506 is performed, the depth map correction apparatus corrects the depth values of the correction region by interpolating between the filtered image and the previously generated depth map (S508).

That is, the depth map correction apparatus corrects the depth value of the depth map corresponding to the correction region by using Equation 1.

In other words, the depth map correction apparatus interpolates between the filtered image and the depth map over the correction region according to Equation 1, under the precondition that the boundary characteristics of the filtered image match those of the depth map. The interpolation yields a depth map in which the boundary characteristics of the original image are reflected and boundaries are more clearly distinguished.

When the depth values of the depth map corresponding to the correction region have been corrected in S508, the depth map correction apparatus corrects the depth values of the outer region (S510).

Unlike the correction region along the boundary, the outer region is a stable region where the depth map changes very little. Since the outer region already lies inside the objects represented in the stereoscopic image, it is usually either left uncorrected or, for the stability of the depth map, Gaussian filtering or similar low-pass filtering is performed. This filtering of the outer region is additionally carried out for the stability of the overall depth map.

When S510 has been performed, the depth map correction apparatus performs depth value correction on the neighboring region (S512). The neighboring region is the intermediate area between the correction region and the outer region; since those two regions are corrected differently, it serves as a buffer connecting them. The apparatus therefore corrects the depth values of the neighboring region using Equation 2.

When S512 has been performed, the depth map correction apparatus generates a depth map in which the depth values of each region are corrected (S514).
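
Composing the sketches above gives a per-scanline version of S502–S514; this is an illustrative assembly under the stated assumptions, not the patent's implementation (the per-band endpoint search for S512 is omitted for brevity).

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def correct_depth_map_row(row, depth_row):
    """S502-S514 sketch on one scanline, using the helpers sketched above."""
    out = np.asarray(depth_row, dtype=np.float64).copy()
    f = prefilter(row, mode="sharpen")                 # S502: filtering
    labels = partition_regions(f)                      # S504-S506: regions
    for i in np.flatnonzero(labels == 2):              # S508: correction band
        out[i] = correct_band(f, out, i)
    outer = labels == 0                                # S510: outer smoothing
    out[outer] = gaussian_filter(out, sigma=1.0)[outer]
    # S512: call ramp_neighboring(out, j, k) for each band labelled 1.
    return out                                         # S514: corrected row
```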

A method of correcting the depth map using the original image by the depth map correcting apparatus will be described with reference to FIG. 6.

The image whose depth map the apparatus corrects is a two-dimensional image over x and y, but for convenience of description it is shown one-dimensionally along the horizontal axis.

Referring to FIG. 6, curve (1) shows the pixel characteristics (or magnitudes, values, etc.) of the original image as a function of position. When noise and sharp characteristic filtering is applied to the original image of (1), it is converted into an image with pixel characteristics like (2): the noise components are removed and the boundaries become clearly distinguishable.

If the boundaries with a large degree of change are selected in (2), V1 at the rising part and V2 at the falling part are selected. Relative to the selected boundaries V1 and V2, the image is divided according to the degree of correction into a correction region (a), a neighboring region (b), and an outer region (c). The correction range of the correction region (a) is set first according to the degree of correction; the neighboring region (b), with more stable depth values, is set just outside that range; and the remaining area is set as the outer region (c).

The correction region (a) is the area corresponding to a boundary of the filtered image, selected where pixel values change strongly with position; the neighboring region (b) is the adjacent area selected within a certain distance of the correction region (a); and the outer region (c) is everything except the correction region (a) and the neighboring region (b).

The depth map correction apparatus interpolates between the filtered image (2) and the depth map (3) over the correction region (a), under the precondition that the boundary characteristics of the filtered image match those of the depth map. Here, the depth map (3) is the depth map generated from the characteristic information extracted from the original image (1).

Then, the depth map corrector performs correction on the neighboring and outer regions to obtain the corrected depth map shown in (4).

The method of correcting the depth values of the correction, neighboring, and outer regions is described in detail with reference to FIG. 2.

Comparing the depth map (4) corrected through the above process with the previously generated depth map (3), it can be seen that in the corrected depth map (4) the depth values of the correction region (a) now coincide with the depth values of the corresponding area of the original image (1).

In addition, looking at the boundary after correction, the boundary has been moved as close as possible to the original image, and the variability of the depth map is expressed in its most suppressed form. Of course, the local fluctuations of the original depth map cannot be completely removed within the correction region, but since human vision is more sensitive to boundary agreement, the actual three-dimensional effect is greatly improved.

The stereoscopic image conversion device or depth map correction device according to the present invention can be implemented in the form of a personal computer, tablet PC, notebook computer, mobile phone, smartphone, and the like, and can be executed by a processor consisting of one or more cores included in the device.

According to another aspect of the present invention, there is provided a recording medium readable by an electronic device, on which is recorded a program implementing a depth map correction method comprising: performing noise filtering or sharp characteristic enhancement filtering on a two-dimensional input image; selecting the boundary with the largest change in pixel characteristics in the filtered image; dividing the image into a correction region, a neighboring region, and an outer region according to the degree of correction relative to that boundary; correcting depth values in the correction region by interpolating between the filtered image and the depth map; and generating a corrected depth map by also correcting the depth values of the outer and neighboring regions.

According to another aspect of the invention, there is provided a recording medium readable by an electronic device, on which is recorded a program implementing a method comprising: extracting at least one piece of characteristic information by analyzing a two-dimensional input image; generating a depth map of the input image based on the characteristic information; filtering the input image and correcting the generated depth map using the filtered image; and converting the input image into a three-dimensional stereoscopic image using the corrected depth map.

The depth map correction method and the stereoscopic image conversion method can be written as programs, and the codes and code segments constituting the programs can easily be inferred by programmers in the art.

The depth map correction device according to the present invention, and the stereoscopic image conversion device using the same, may include a processor, a memory, a storage device, and input/output devices as components, which may be interconnected using, for example, a system bus.

The processor may process instructions for execution within the device. In one implementation the processor may be single-threaded; in other implementations it may be multi-threaded. The processor can process instructions stored in the memory or on the storage device.

The memory, meanwhile, stores information within the apparatus. In one embodiment the memory is a computer-readable medium; in one implementation it may be a volatile memory unit, and in others a nonvolatile memory unit. The storage device can provide mass storage for the device. In one embodiment the storage device is a computer-readable medium and may include, for example, a hard disk device, an optical disk device, or another mass storage device.

The input/output device provides input/output operations for the device according to the present invention. In one implementation, the input/output device may include one or more network interface devices, for example an Ethernet card, a serial communication device such as an RS-232 port, and/or a wireless interface device such as an 802.11 card. In other implementations, it may include driver devices configured to send output data to and receive input data from other input/output devices, such as keyboards, printers, and display devices.

The apparatus according to the invention may be driven by instructions that cause one or more processors to perform the functions and processes described above, for example interpreted script instructions such as JavaScript or ECMAScript instructions, executable code, or other instructions stored on a computer-readable medium. The device according to the present invention may furthermore be implemented in a distributed manner over a network, such as a server farm, or in a single computer device.

Although the specification and drawings describe exemplary device configurations, the functional operations and subject-matter implementations described herein may be embodied in other types of digital electronic circuitry, or in computer software, firmware, or hardware including the structures disclosed herein and their structural equivalents, or in combinations of one or more of them. Implementations of the subject matter described herein may be realized as one or more computer program products, that is, one or more modules of computer program instructions encoded on a tangible program storage medium for controlling, or for execution by, an apparatus according to the invention. The computer-readable medium may be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of these.

The terms "processing system", "processing device", and "subsystem" encompass all instruments, devices, and machines for processing data, including, for example, a programmable processor, a computer, or multiple processors or computers. The processing system may include, in addition to hardware, code that creates an execution environment for a computer program, such as code constituting processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of these.

A computer program (also known as a program, software, software application, script, or code) mounted on an apparatus according to the invention and executing a method according to the invention can be written in any form of programming language, including compiled or interpreted languages and declarative or procedural languages, and can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system: a program may be stored in part of a file that holds other programs or data (for example, one or more scripts stored in a markup-language document), in a single file dedicated to the program in question, or in multiple coordinated files (for example, files that store one or more modules, subprograms, or parts of code). A computer program can be deployed to run on a single computer, or on multiple computers located at one site or distributed across multiple sites and interconnected by a communication network.

Computer-readable media suitable for storing computer program instructions and data include all forms of nonvolatile memory, media, and memory devices, including, for example, semiconductor memory devices such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks or external disks; magneto-optical disks; and CD-ROM and DVD-ROM discs. The processor and the memory can be supplemented by, or integrated in, special-purpose logic circuitry.

Implementations of the subject matter described herein may be realized in a computing system that includes a back-end component such as a data server, or a middleware component such as an application server, or a front-end component such as a client computer having a web browser or graphical user interface through which a user can interact with an implementation of the subject matter described herein, or any combination of one or more such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, such as a communication network.

Although this specification contains many specific implementation details, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments of a particular invention. Certain features described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or a variation of a subcombination.

Likewise, although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of the various system components in the embodiments described above should not be understood as requiring such separation in all embodiments; the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.

Specific embodiments of the subject matter described in this specification have been described. Other embodiments are within the scope of the following claims. For example, the actions recited in the claims can be performed in a different order and still achieve desirable results. As one example, the processes depicted in the accompanying drawings do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In certain implementations, multitasking and parallel processing may be advantageous.

The foregoing description presents the best mode of the invention and provides examples that illustrate the invention and enable those skilled in the art to make and use it. The specification is not intended to limit the invention to the specific terms presented. Thus, while the present invention has been described in detail with reference to the examples above, those skilled in the art can make modifications, changes, and variations to the examples without departing from the scope of the invention.

As such, those skilled in the art will appreciate that the present invention can be implemented in other specific forms without changing its technical spirit or essential features. The embodiments described above are therefore to be understood as illustrative in all respects and not restrictive. The scope of the present invention is defined by the following claims rather than by the detailed description, and all changes or modifications derived from the meaning and scope of the claims and their equivalents should be construed as falling within the scope of the present invention.

The present invention is applicable to a depth map correction apparatus and method that can correct errors in the overall depth map of an image through image processing during automatic stereoscopic conversion and minimize image conversion errors by converting the 2D image into a 3D image using the corrected depth map, and to a stereoscopic image conversion apparatus and method using the same.

Claims (14)

  1. A depth map correction apparatus comprising:
    a filtering unit configured to perform noise filtering or sharp characteristic enhancement filtering on a 2D input image;
    a region setting unit that selects the boundary with the largest change in pixel characteristics in the filtered image and divides the image into a correction region, a neighboring region, and an outer region according to the degree of correction relative to the selected boundary; and
    a depth value correction unit that corrects depth values in the correction region by interpolating between the filtered image and a previously generated depth map, and generates a corrected depth map by correcting the depth values of the neighboring region and the outer region.
  2. The apparatus of claim 1,
    wherein the filtering unit is a noise filter that removes noise components of the input image, or a sharp characteristic enhancement filter that widens the deviation of pixel values across boundaries of the input image beyond a predetermined value.
  3. The apparatus of claim 1, wherein the region setting unit comprises:
    a correction region setting unit that sets, as the correction region, the area corresponding to a boundary in the filtered image where pixel characteristics change most with position;
    a neighboring region setting unit that sets, as the neighboring region, the adjacent area within a predetermined distance of the correction region; and
    an outer region setting unit that sets the area excluding the correction and neighboring regions as the outer region.
  4. The apparatus of claim 1,
    wherein the depth value correction unit corrects the depth value of the depth map corresponding to the correction region using the following equation:
    [Equation]
    Correction region: New Depth(i) = Σₙ (SI(n) × Depth(n))
    where i is a pixel index to the left and right of the correction region; n ranges over the interpolation window, which is set slightly smaller than the correction region; SI(n) is a pixel value of the input image (original image); Depth(n) is a pixel value of the depth map; and New Depth(i) is the corrected depth value at pixel position i.
  5. The apparatus of claim 1,
    wherein the depth value correction unit corrects the depth values of the outer region through Gaussian filtering or low-pass filtering of the outer region.
  6. The apparatus of claim 1,
    wherein the depth value correction unit corrects the depth values of the neighboring region using the following equation:
    [Equation]
    Neighborhood Depth(i) = A + delta × i
    where i is the pixel index from the correction region side to the outer region side, i.e., the pixel position within the neighboring region; A is the pixel value of the correction region where it meets the neighboring region; delta is ((B − A) / (k − j)); B is the pixel value of the outer region where it meets the neighboring region; j is the position index of the pixel with value A; and k is the position index of the pixel with value B.
  7. A stereoscopic image conversion apparatus comprising:
    an image analyzer that extracts at least one piece of characteristic information by analyzing a two-dimensional input image;
    a depth map generator configured to generate a depth map of the input image based on the characteristic information;
    a depth map corrector that filters the input image and corrects the generated depth map using the filtered image; and
    a stereoscopic image generator that converts the input image into a three-dimensional stereoscopic image using the corrected depth map.
  8. The apparatus of claim 7,
    wherein the image analyzer extracts characteristic information including at least one of edge information, color information, luminance information, motion information, and histogram information.
  9. The apparatus of claim 7,
    wherein the depth map generator generates a depth map by dividing the pixels constituting the input image into at least one block and then setting a depth value for each block.
  10. The apparatus of claim 7,
    wherein the depth map corrector performs noise filtering or sharp characteristic enhancement filtering on the input image, selects the boundary with the largest change in pixel characteristics in the filtered image, divides the image into a correction region, a neighboring region, and an outer region according to the degree of correction relative to that boundary, corrects depth values in the correction region by interpolating between the filtered image and the generated depth map, and generates a corrected depth map by correcting the depth values of the neighboring region and the outer region.
  11. A method by which a depth map correction apparatus corrects a depth map, the method comprising:
    (a) performing noise filtering or sharp characteristic enhancement filtering on a two-dimensional input image;
    (b) selecting the boundary with the largest change in pixel characteristics in the filtered image, and dividing the image into a correction region, a neighboring region, and an outer region according to the degree of correction relative to that boundary; and
    (c) correcting depth values in the correction region by interpolating between the filtered image and a previously generated depth map, and generating a corrected depth map by correcting the depth values of the outer region and the neighboring region.
  12. The method of claim 11,
    wherein step (c) comprises:
    correcting the depth map of the correction region by performing interpolation using the filtered image and the depth map;
    correcting the depth values of the outer region through Gaussian filtering or low-pass filtering of the outer region; and
    correcting the depth values of the neighboring region using the pixel value of the correction region adjacent to the neighboring region and a gradient value reflecting the degree of pixel change needed to connect the correction region to the outer region.
  13. A method by which a stereoscopic image conversion apparatus converts a two-dimensional input image into a three-dimensional stereoscopic image, the method comprising:
    analyzing the two-dimensional input image to extract at least one item of characteristic information;
    generating a depth map of the input image based on the characteristic information;
    filtering the input image and correcting the generated depth map using the filtered image; and
    converting the input image into a three-dimensional stereoscopic image using the corrected depth map.
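    The claim does not prescribe a particular rendering method for the final conversion step; one common realization is DIBR-style horizontal pixel shifting driven by the corrected depth map, sketched below with an illustrative maximum disparity (depth is assumed normalized to [0, 1], and disocclusion holes are left unfilled):

        import numpy as np

        def render_stereo_pair(image, depth, max_disp=12):
            # Shift each pixel horizontally in proportion to its corrected
            # depth to form the left and right views of the stereo pair.
            h, w = depth.shape
            left = np.zeros_like(image)
            right = np.zeros_like(image)
            disp = (depth * max_disp).astype(int)
            for y in range(h):
                for x in range(w):
                    d = disp[y, x]
                    if 0 <= x - d < w:
                        left[y, x - d] = image[y, x]    # left-eye view
                    if x + d < w:
                        right[y, x + d] = image[y, x]   # right-eye view
            return left, right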
  14. The method of claim 13,
    wherein correcting the depth map comprises: performing noise filtering or sharp characteristic enhancement filtering on the input image; selecting the boundary surface exhibiting the greatest degree of characteristic change among the pixels of the filtered image; dividing the filtered image into a correction region, a neighboring region, and an outer region according to the degree of correction required relative to that boundary surface; correcting depth values in the correction region by performing interpolation using the filtered image and the depth map; and generating a corrected depth map by also correcting the depth values of the neighboring region and the outer region.
      According to the present invention, the boundary characteristics of a depth map that do not coincide with the original image are corrected to match the original image as closely as possible, thereby mitigating the viewer dizziness caused by such inconsistency.
PCT/KR2012/008238 2012-02-10 2012-10-11 Apparatus and method for depth map correction, and apparatus and method for stereoscopic image conversion using same WO2013118955A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020120013708A KR101332638B1 (en) 2012-02-10 2012-02-10 Apparatus and method for correcting depth map and apparatus and method for generating 3d conversion image using the same
KR10-2012-0013708 2012-02-10

Publications (1)

Publication Number Publication Date
WO2013118955A1 true WO2013118955A1 (en) 2013-08-15

Family

ID=48947693

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2012/008238 WO2013118955A1 (en) 2012-02-10 2012-10-11 Apparatus and method for depth map correction, and apparatus and method for stereoscopic image conversion using same

Country Status (2)

Country Link
KR (1) KR101332638B1 (en)
WO (1) WO2013118955A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2561525A (en) * 2016-12-22 2018-10-24 Canon Kk Method and corresponding device for digital 3D reconstruction

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101664868B1 (en) 2015-03-25 2016-10-12 (주)이더블유비엠 Compensation method and apparatus for depth image based on outline
WO2017007047A1 (en) * 2015-07-08 2017-01-12 재단법인 다차원 스마트 아이티 융합시스템 연구단 Spatial depth non-uniformity compensation method and device using jittered comparison
KR20190086320A (en) * 2018-01-12 2019-07-22 삼성전자주식회사 The apparatus for proccesing image and method therefor

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100046837A1 (en) * 2006-11-21 2010-02-25 Koninklijke Philips Electronics N.V. Generation of depth map for an image
KR20100064196A (en) * 2008-12-04 2010-06-14 삼성전자주식회사 Method and appratus for estimating depth, and method and apparatus for converting 2d video to 3d video
US20100315488A1 (en) * 2009-06-16 2010-12-16 Samsung Electronics Co., Ltd. Conversion device and method converting a two dimensional image to a three dimensional image
KR20110099526A (en) * 2010-03-02 2011-09-08 (주) 스튜디오라온 Method for converting two dimensional images into three dimensional images


Also Published As

Publication number Publication date
KR101332638B1 (en) 2013-11-25
KR20130092157A (en) 2013-08-20


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12868290

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12868290

Country of ref document: EP

Kind code of ref document: A1