KR20140067253A - Image processing apparatus and method thereof - Google Patents

Image processing apparatus and method thereof

Info

Publication number
KR20140067253A
Authority
KR
South Korea
Prior art keywords
pixel
depth
image
value
viewpoint
Prior art date
Application number
KR1020120134264A
Other languages
Korean (ko)
Inventor
최욱
이기창
김도균
김창용
Original Assignee
삼성전자주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자주식회사
Priority to KR1020120134264A
Publication of KR20140067253A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 2013/0074 Stereoscopic image analysis
    • H04N 2013/0077 Colour aspects
    • H04N 2013/0081 Depth or disparity estimation from stereoscopic image signals

Landscapes

  • Image Processing (AREA)

Abstract

The present invention relates to a method and an apparatus for correcting the difference between the capture time of a depth image and the capture time of a color image. In one aspect of the present invention, a corresponding pixel, that is, the pixel among those belonging to the depth image at a third viewpoint that corresponds to a reference pixel in the depth image at a first viewpoint, is estimated based on a predetermined corresponding condition, and the reference pixel and the corresponding pixel are used to generate the current pixels belonging to the depth image at a second viewpoint.


Description

IMAGE PROCESSING APPARATUS AND METHOD THEREOF

The following embodiments relate to a method and apparatus for correcting a difference between a shooting time of a depth image and a shooting time of a color image.

Using a color/depth camera with one lens and one sensor, color and depth images can be captured with a single device at the same frame rate. However, since the color image and the depth image are acquired alternately from one sensor, a time difference occurs between a frame of the color image and a frame of the depth image when a moving object is photographed with such a color/depth camera.

In one aspect of the present invention, the image processing apparatus includes an estimator that, based on a predetermined corresponding condition, uses the depth image at a first viewpoint and the depth image at a third viewpoint to estimate, among the pixels belonging to the depth image at the third viewpoint, a corresponding pixel corresponding to a reference pixel in the depth image at the first viewpoint, and a depth image generator that generates a current pixel belonging to the depth image at a second viewpoint using the reference pixel and the corresponding pixel.

In another aspect, the image processing apparatus may further include a compensation unit that generates a depth value for a hole pixel, that is, a current pixel for which no depth value was calculated, based on the depth values of the pixels surrounding the hole pixel.

The estimator may include an infrared brightness similarity calculation unit that extracts the infrared brightness value of the reference pixel from the amplitude image at the first viewpoint, extracts the infrared brightness value of at least one candidate pixel from the amplitude image at the third viewpoint, and calculates an infrared brightness similarity based on the difference between the infrared brightness values of the reference pixel and the candidate pixel, and a determination unit that estimates the candidate pixel as the corresponding pixel when the corresponding condition is satisfied, the corresponding condition being that the infrared brightness similarity is equal to or greater than a predetermined reference.

The estimator may include a depth value similarity calculation unit that extracts the depth value of the reference pixel from the depth image at the first viewpoint, extracts the depth value of at least one candidate pixel from the depth image at the third viewpoint, and calculates a depth value similarity based on the difference between the depth values of the reference pixel and the candidate pixel, and a determination unit that estimates the candidate pixel as the corresponding pixel when the corresponding condition is satisfied, the corresponding condition being that the depth value similarity is equal to or greater than a predetermined reference.

The estimator may include a topology similarity calculation unit that extracts the infrared brightness values and depth values of the pixels in a window set around the reference pixel from the depth image and the amplitude image at the first viewpoint, extracts the infrared brightness values and depth values of the candidate pixels from the depth image and the amplitude image at the third viewpoint, and calculates a topology similarity based on the differences in infrared brightness value and in depth value between the pixels in the window and the candidate pixels, and a determination unit that estimates a candidate pixel as the corresponding pixel when the corresponding condition is satisfied, the corresponding condition being that the topology similarity is equal to or greater than a predetermined reference.

The estimator may include a verification unit that verifies the symmetry of the estimation, that is, whether the reference pixel of the first-viewpoint depth image is estimated in turn from the corresponding pixel of the third-viewpoint depth image.

The estimator may include a downscaling unit that downscales the amplitude image and the depth image at the first viewpoint and the amplitude image and the depth image at the third viewpoint to generate image pyramids, and a change amount calculation unit that calculates an optical flow starting from the smallest image of the pyramid and thereby calculates the amount of change in coordinates from a reference pixel of the depth image at the first viewpoint to the corresponding pixel at the third viewpoint.

The estimator may estimate as the corresponding pixel at the third viewpoint a pixel whose infrared brightness value differs from that of the reference pixel in the amplitude image at the first viewpoint within a certain range, whose depth value differs from that of the reference pixel in the depth image at the first viewpoint within a certain range, and for which the topology of the pixels in a set window is similar.

The depth image generator may calculate the position of the current pixel belonging to the depth image at the second viewpoint and the depth value of the current pixel by linearly interpolating the coordinates of the reference pixel of the depth image at the first viewpoint, the coordinates of the corresponding pixel of the depth image at the third viewpoint, the amount of change in coordinates from the coordinates of the reference pixel to the coordinates of the corresponding pixel, the depth value of the reference pixel, and the depth value of the corresponding pixel.

If the difference between the depth value of the reference pixel and the depth value of the corresponding pixel is greater than a threshold value, the interpolator may generate a hole pixel without calculating the depth value of the corresponding current pixel belonging to the depth image at the second viewpoint.

When a plurality of depth values are mapped to a current pixel belonging to the depth image at the second viewpoint, the interpolation unit may calculate their average as the representative depth value of the current pixel.

The compensator may calculate the depth value of a hole pixel as a weighted sum of the depth values of the neighboring pixels, with the weight increasing as the coordinates of a neighboring pixel get closer to the coordinates of the hole pixel.

The compensator may calculate the depth value of a hole pixel as a weighted sum of the depth values of the neighboring pixels, with the weight increasing as the color value of a neighboring pixel, extracted from the color image at the second viewpoint, gets closer to that of the hole pixel.

The compensator may calculate the depth value of a hole pixel by weighting the depth values of the neighboring pixels by both factors, increasing the weight as the coordinates of a neighboring pixel get closer to the coordinates of the hole pixel and as its color value gets closer to that of the hole pixel.

According to an aspect of the present invention, an image processing method includes estimating, among the pixels belonging to the depth image at a third viewpoint, a corresponding pixel corresponding to a reference pixel in the depth image at a first viewpoint using the depth image at the first viewpoint and the depth image at the third viewpoint, and generating a current pixel belonging to the depth image at a second viewpoint using the reference pixel and the corresponding pixel.

In another aspect, the image processing method may further include generating a depth value for a hole pixel, that is, a current pixel for which no depth value was calculated, based on the depth values of the pixels surrounding the hole pixel.

The estimating may include extracting the infrared brightness value of the reference pixel from the amplitude image at the first viewpoint, extracting the infrared brightness value of one or more candidate pixels from the amplitude image at the third viewpoint, calculating an infrared brightness similarity based on the difference between the infrared brightness values of the reference pixel and the candidate pixel, and estimating the candidate pixel as the corresponding pixel when the infrared brightness similarity is equal to or greater than a predetermined reference.

The estimating may include extracting the depth value of the reference pixel from the depth image at the first viewpoint, extracting the depth value of at least one candidate pixel from the depth image at the third viewpoint, calculating a depth value similarity based on the difference between the depth values of the reference pixel and the candidate pixel, and estimating the candidate pixel as the corresponding pixel when the depth value similarity is equal to or greater than a predetermined reference.

The estimating may include extracting the infrared brightness values and depth values of the pixels in the window set around the reference pixel from the depth image and the amplitude image at the first viewpoint, extracting the infrared brightness values and depth values of the candidate pixels from the depth image and the amplitude image at the third viewpoint, calculating a topology similarity based on the differences in infrared brightness value and in depth value between the pixels in the window and the candidate pixels, and estimating a candidate pixel as the corresponding pixel when the topology similarity is equal to or greater than a predetermined reference.

The generating of the depth image may include calculating the position of the current pixel belonging to the depth image at the second viewpoint and the depth value of the current pixel by linearly interpolating the coordinates and depth value of the reference pixel of the depth image at the first viewpoint and the coordinates and depth value of the corresponding pixel of the depth image at the third viewpoint.

The compensating may include calculating the depth value of the hole pixel by weighting the depth values of the neighboring pixels, increasing the weight as the coordinates of a neighboring pixel get closer to the coordinates of the hole pixel and as its color value gets closer to that of the hole pixel.

FIG. 1 is a diagram illustrating the concept of generating a depth image in an image processing apparatus according to an exemplary embodiment.
FIG. 2 is a block diagram of an image processing apparatus according to an embodiment.
FIG. 3 is a block diagram of an image processing apparatus according to another embodiment.
FIG. 4 is a block diagram of an image processing apparatus according to still another embodiment.
FIG. 5 is a diagram illustrating the estimation of a pixel-by-pixel correspondence relationship in an image processing apparatus according to an embodiment.
FIG. 6 is a diagram illustrating a pyramid used in an image processing apparatus according to an exemplary embodiment.
FIG. 7 illustrates a depth image including hole pixels generated by the image processing apparatus according to an exemplary embodiment.
FIG. 8 is a diagram explaining the concept of generating a depth value for a hole pixel in an image processing apparatus according to an embodiment.
FIG. 9 shows a depth image before and after generation of the depth values of hole pixels in an image processing apparatus according to an exemplary embodiment.
FIG. 10 is a flowchart of an image processing method according to an embodiment.

Hereinafter, embodiments according to one aspect will be described in detail with reference to the accompanying drawings.

In one embodiment, the color/depth camera emits light and obtains an intensity image and a depth image according to the intensity of the received light. The intensity image enables identification of an object, since it measures the intensity of the light reflected and refracted from the object, and the depth image indicates how far the object is from the depth camera, that is, its perspective.

FIG. 1 is a diagram illustrating the concept of generating a depth image in an image processing apparatus according to an exemplary embodiment.

To generate a three-dimensional image or a stereoscopic image, images generated at various angles may be required. An image generated from these various angles is called a multi-view image. As one of various methods of generating the multi-view image, a color image and a depth image can be acquired and used. In this case, it is preferable that the color image and the depth image are generated at the same time rather than with a time difference.

However, when color images and depth images are acquired alternately from one sensor, the color images and depth images are obtained at different points in time. To generate a color image and a depth image at the same point in time from images having this time difference, either a depth image corresponding to the acquisition time of the color image or a color image corresponding to the acquisition time of the depth image must be generated.

In an indoor environment, the per-pixel change of the color values of the color image is greater than the change of the depth values of the depth image. Therefore, it may be easier to generate a depth image corresponding to the acquisition time of the color image.

Referring to FIG. 1, color images are acquired through a color/depth camera at times t1, t3, and t5... that is, at times t0, t2, and t4, and depth images are acquired through the color/depth camera at times t1 and t3. The image processing apparatus according to the embodiment can generate a depth image corresponding to the color image at time t2 based on the correspondence relation between pixels in the depth images at times t1 and t3. Further, the image processing apparatus of the embodiment can compensate the generated depth image based on the color image at time t2.

FIG. 2 is a block diagram of an image processing apparatus according to an embodiment.

Referring to FIG. 2, the image processing apparatus according to an exemplary embodiment may include an estimation unit 210, a depth image generation unit 220, and a compensation unit 230. The first time point, the second time point, and the third time point used in the following description correspond to the time points t1, t2, and t3 of FIG. 1.

The color / depth camera can sequentially take a depth image at a first point of time, a color image at a second point of time, and a depth image at a third point of time. The image processing apparatus according to one embodiment may be mounted on a color / depth camera, or may be a separate device separate from the color / depth camera.

Based on the depth image at the first viewpoint and the depth image at the third viewpoint, the estimator 210 estimates, among the pixels belonging to the third-viewpoint depth image, a corresponding pixel that satisfies a predetermined corresponding condition with respect to a reference pixel. The reference pixel means any one of the pixels constituting the depth image captured at the first viewpoint. The corresponding pixel means the pixel indicating at which point the object captured at the reference pixel at the first viewpoint appears at the third viewpoint.

The estimator 210 can estimate the corresponding pixel corresponding to the reference pixel based on a corresponding condition between the first-viewpoint depth image and the third-viewpoint depth image and a corresponding condition between the first-viewpoint amplitude image and the third-viewpoint amplitude image.

Corresponding conditions may include an infrared brightness similarity, a depth value similarity, a topology similarity, and the like between the reference pixel and one or more candidate pixels included in the third viewpoint. A pixel that satisfies the corresponding condition among the candidate pixels can be estimated as a corresponding pixel.

The infrared brightness similarity may be determined using the infrared brightness values of the reference pixel and the candidate pixels extracted from the amplitude images; the depth value similarity may be determined using the depth values of the reference pixel and the candidate pixels extracted from the depth images; and the topology similarity may be determined using both the infrared brightness values and the depth values of the reference pixel and the candidate pixels.

For example, the estimator 210 may estimate as the corresponding pixel a candidate pixel whose infrared brightness value differs from that of the reference pixel within a certain range, whose depth value differs within a certain range, and whose window topology is similar to that of the reference pixel. The estimator 210 may estimate as the corresponding pixel a candidate pixel that satisfies all of the above corresponding conditions. The predetermined range may be changed according to the setting: to increase accuracy, a narrow range can be set, and to increase processing speed at the expense of accuracy, a wider range can be set.
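As a concrete illustration of this screening, the sketch below checks the infrared brightness and depth conditions for candidate pixels around the reference position. The function name, search radius, and tolerance values are hypothetical assumptions for illustration, not values taken from the patent.

```python
import numpy as np

def find_corresponding_pixel(ref_xy, amp1, depth1, amp3, depth3,
                             search_radius=8, amp_tol=10.0, depth_tol=0.05):
    """Return the candidate in the t3 images best matching the reference pixel
    at ref_xy in the t1 images, or None if no candidate passes both checks.
    amp_tol and depth_tol are illustrative thresholds.
    """
    x0, y0 = ref_xy
    ref_amp = amp1[y0, x0]
    ref_depth = depth1[y0, x0]
    best, best_cost = None, np.inf
    h, w = amp3.shape
    for dy in range(-search_radius, search_radius + 1):
        for dx in range(-search_radius, search_radius + 1):
            x, y = x0 + dx, y0 + dy
            if not (0 <= x < w and 0 <= y < h):
                continue
            amp_diff = abs(amp3[y, x] - ref_amp)
            depth_diff = abs(depth3[y, x] - ref_depth)
            # corresponding condition: both similarities within range
            if amp_diff <= amp_tol and depth_diff <= depth_tol:
                cost = amp_diff / amp_tol + depth_diff / depth_tol
                if cost < best_cost:
                    best, best_cost = (x, y), cost
    return best
```

Narrower tolerances correspond to the accuracy-oriented setting described above; wider ones trade accuracy for speed.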

The similarity of the topology can be determined according to whether the pattern of variation of the infrared brightness value between the pixels included in the window is similar to the pattern of the variation of the depth value. Since the pixels included in the window are located adjacent to each other, there is a high possibility that the pattern of change of the infrared brightness value and the pattern of change of the depth value between the pixels are similar.

For example, the estimator 210 may first find a preliminary corresponding pixel whose infrared brightness value and depth value are similar to those of a pixel included in the window. The estimator 210 may determine that the topology is similar if the patterns of change of the infrared brightness values and of the depth values from the pixels included in the window to the preliminary corresponding pixels are similar, and may then estimate the preliminary corresponding pixel as the corresponding pixel. The change pattern can be determined by considering the coordinates of the pixels included in the window, the distance to the preliminary corresponding pixels, and the direction from the pixels included in the window to the preliminary corresponding pixels. For example, the change pattern in the window of FIG. 5 is indicated by arrows.
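The window-based topology comparison can be sketched as follows, under the assumption that the change pattern is represented by each window pixel's offset from the window center. The patent does not fix an exact formula; the function and its return value are illustrative.

```python
import numpy as np

def topology_similarity(amp1, depth1, amp3, depth3, ref_xy, cand_xy, half=1):
    """Compare the pattern of change of infrared brightness and depth inside a
    (2*half+1)^2 window around the reference pixel with the same window around
    the candidate pixel. Returns a dissimilarity (lower means more similar)."""
    def window(img, xy):
        x, y = xy
        return img[y - half:y + half + 1, x - half:x + half + 1].astype(float)

    w_amp1, w_amp3 = window(amp1, ref_xy), window(amp3, cand_xy)
    w_d1, w_d3 = window(depth1, ref_xy), window(depth3, cand_xy)
    # change pattern: offset of each window pixel from the window centre value
    pat_amp1 = w_amp1 - w_amp1[half, half]
    pat_amp3 = w_amp3 - w_amp3[half, half]
    pat_d1 = w_d1 - w_d1[half, half]
    pat_d3 = w_d3 - w_d3[half, half]
    return np.abs(pat_amp1 - pat_amp3).mean() + np.abs(pat_d1 - pat_d3).mean()
```

Subtracting the center value makes the comparison insensitive to a global brightness or depth offset, so only the pattern of change within the window matters.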

The estimator 210 may estimate a corresponding pixel corresponding to a reference pixel in the first view depth image among the pixels belonging to the depth image at the third view point. The estimator 210 may estimate corresponding pixels belonging to the depth image at the third time point corresponding to all the pixels of the first view depth image.

The depth image generating unit 220 may calculate the coordinates of the current pixel at the second point of time based on the amount of change of the coordinates from the coordinates of the reference pixel to the coordinates of the corresponding pixel. The depth image generating unit 220 may calculate the depth value of the current pixel at the second time point based on the amount of change from the depth value of the reference pixel to the depth value of the corresponding pixel. The current pixel means a pixel constituting the depth image at the second view point. The coordinates and depth values of the current pixel may be determined based on the coordinates and depth values of the reference pixels, the coordinates and depth values of the corresponding pixels, and the time difference between the first and third points of time.

The depth image generator 220 may estimate the coordinates of the current pixel by interpolating the coordinates of the reference pixel according to the amount of change of the coordinates. The depth image generator 220 may calculate the depth value of the current pixel of the second-viewpoint depth image using the depth value of the reference pixel of the first-viewpoint depth image and the depth value of the corresponding pixel of the third-viewpoint depth image.

The depth image generator 220 can calculate the depth value of the current pixel by applying the ratio of the time difference between the third time point and the second time point to the time difference between the first time point and the third time point to the depth value of the reference pixel, and applying the ratio of the time difference between the second time point and the first time point to the same total time difference to the depth value of the corresponding pixel.

The depth image generating unit 220 may not calculate the depth value of the current pixel if the difference between the depth value of the reference pixel and the depth value of the corresponding pixel is greater than the threshold value. Since the depth value is not calculated, the current pixel in which the depth value does not exist can be defined as a hole pixel. The threshold value may be set to be variable according to the resolution or accuracy of the depth image at the second time point.
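The linear interpolation by time ratio and the hole-generation rule described above can be sketched as follows; the threshold value, function names, and return conventions are illustrative assumptions.

```python
def interpolate_current_depth(d_ref, d_cor, t1, t2, t3, hole_threshold=0.3):
    """Depth of the current pixel at t2 from the reference depth at t1 and the
    corresponding depth at t3. Returns None (a hole pixel) when the two depths
    differ by more than the threshold; the threshold value is illustrative."""
    if abs(d_ref - d_cor) > hole_threshold:
        return None  # hole pixel: depth value left uncalculated
    w_ref = (t3 - t2) / (t3 - t1)   # weight applied to the reference depth
    w_cor = (t2 - t1) / (t3 - t1)   # weight applied to the corresponding depth
    return w_ref * d_ref + w_cor * d_cor

def interpolate_current_position(xy_ref, xy_cor, t1, t2, t3):
    """Position of the current pixel at t2, moving the reference coordinates
    along the coordinate change by the ratio (t2 - t1) / (t3 - t1)."""
    s = (t2 - t1) / (t3 - t1)
    return (xy_ref[0] + s * (xy_cor[0] - xy_ref[0]),
            xy_ref[1] + s * (xy_cor[1] - xy_ref[1]))
```

When t2 is midway between t1 and t3, both weights are 0.5 and the result is the simple average of the two depth values.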

The compensating unit 230 can compensate for the hole pixel in which the depth value does not exist based on the depth value of the surrounding pixels of the hole pixel. The neighboring pixels may be pixels adjacent to the hole pixel, for example, all pixels surrounding the hole pixel.

The compensator 230 may calculate the depth value of the hole pixel by applying different weights to the depth values of neighboring pixels according to the difference in distance between the hole pixel and the neighboring pixels. For example, a large weight may be applied to the depth value of a neighboring pixel close to the hole pixel.

The compensating unit 230 may calculate the depth value of the hole pixel by applying different weights to the depth values of neighboring pixels according to the difference of the color values between the hole pixel and the neighboring pixels. At this time, the compensation unit 230 may use the color image at the second time point. For example, large weights may be applied to neighboring pixels having color values similar to the color value of the hole pixel. The similarity of color values can be determined by comparing the difference between color values with a threshold value; if the difference is less than or equal to the set threshold, the color values may be determined to be similar. The threshold value can be adjusted according to calculation complexity, calculation accuracy, and the like.
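The distance- and color-weighted compensation of a hole pixel can be sketched as a joint weighting of neighboring depth values, in the spirit of a joint bilateral filter. The Gaussian weighting, sigma values, and NaN hole marker are assumptions; the patent only requires that closer coordinates and closer colors receive larger weights.

```python
import numpy as np

def fill_hole(depth, color, hx, hy, radius=2, sigma_d=1.5, sigma_c=10.0):
    """Fill the hole pixel at (hx, hy) with a weighted sum of neighbouring
    depth values: the weight grows as a neighbour's coordinates get closer to
    the hole pixel and as its colour (from the t2 colour image) gets closer.
    NaN marks pixels that have no depth value."""
    h, w = depth.shape
    hole_color = color[hy, hx].astype(float)
    num, den = 0.0, 0.0
    for y in range(max(0, hy - radius), min(h, hy + radius + 1)):
        for x in range(max(0, hx - radius), min(w, hx + radius + 1)):
            d = depth[y, x]
            if np.isnan(d) or (x == hx and y == hy):
                continue
            dist2 = (x - hx) ** 2 + (y - hy) ** 2
            cdiff2 = float(np.sum((color[y, x].astype(float) - hole_color) ** 2))
            wgt = (np.exp(-dist2 / (2 * sigma_d ** 2))
                   * np.exp(-cdiff2 / (2 * sigma_c ** 2)))
            num += wgt * d
            den += wgt
    return num / den if den > 0 else np.nan
```

The color term keeps depth values from bleeding across object boundaries, since neighbors with very different colors contribute almost nothing.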

FIG. 3 is a block diagram of an image processing apparatus according to another embodiment.

Referring to FIG. 3, an image processing apparatus according to an exemplary embodiment of the present invention may include an estimation unit 310, a depth image generation unit 320, and a compensation unit 330.

The estimation unit 310 may include at least one of an infrared brightness similarity calculation unit 311, a depth value similarity calculation unit 313, a topology similarity calculation unit 315, and a verification unit 317, together with a determination unit 319.

The infrared brightness similarity calculation unit 311 can extract the infrared brightness value of the reference pixel from the amplitude image at the first time point and extract the infrared brightness value of at least one candidate pixel from the amplitude image at the third time point. The infrared brightness similarity calculation unit 311 can then calculate the infrared brightness similarity based on the difference between the infrared brightness values of the reference pixel and the candidate pixel.

The determination unit 319 may estimate a candidate pixel that satisfies the corresponding condition, namely that the infrared brightness similarity is equal to or greater than a predetermined reference, as the corresponding pixel. The predetermined reference can be adjusted according to the performance of the infrared brightness similarity calculation unit 311, calculation complexity, calculation accuracy, processing time, and the like. For example, if the performance is relatively good, the reference may be set high.

The depth value similarity calculation unit 313 may extract the depth value of the reference pixel from the depth image at the first time point and extract the depth value of at least one candidate pixel from the depth image at the third time point. The depth value similarity calculation unit 313 can then calculate the depth value similarity based on the difference between the depth values of the reference pixel and the candidate pixel.

The determination unit 319 can estimate a candidate pixel that satisfies the corresponding condition, namely that the depth value similarity is equal to or greater than a predetermined reference, as the corresponding pixel.

The topology similarity calculation unit 315 can extract the infrared brightness values and depth values of the pixels in the window set around the reference pixel from the depth image and the amplitude image at the first time point, and extract the infrared brightness values and depth values of the candidate pixels from the depth image and the amplitude image at the third time point. The topology similarity calculation unit 315 can then calculate the topology similarity based on the differences in infrared brightness value and in depth value between the pixels in the window and the candidate pixels.

The determination unit 319 can estimate candidate pixels satisfying the corresponding condition, namely that the topology similarity is equal to or greater than a predetermined reference, as corresponding pixels.

The verification unit 317 may verify whether the reference pixel of the first time point depth image is estimated back from the corresponding pixel of the third time point depth image. That is, for a corresponding pixel q of the third time point depth image estimated from a reference pixel p of the first time point depth image, the verification unit 317 treats q as a reference pixel, estimates its corresponding pixel in the first time point depth image, and verifies whether the original pixel p is obtained. The infrared brightness similarity calculation unit 311, the depth value similarity calculation unit 313, and the topology similarity calculation unit 315 can estimate a pixel of the first time point depth image from the pixel q in the same manner as the pixel q was estimated from the pixel p.
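This forward-backward verification can be sketched generically; `match_1_to_3` and `match_3_to_1` are hypothetical placeholders standing in for the correspondence search described above, and may return None when no candidate passes the corresponding conditions.

```python
def passes_symmetry_check(p, match_1_to_3, match_3_to_1):
    """Keep the correspondence only if pixel p in the t1 depth image maps to
    some q in the t3 depth image AND q maps back to exactly p."""
    q = match_1_to_3(p)
    if q is None:
        return False  # no forward match found at all
    return match_3_to_1(q) == p
```

Rejecting asymmetric matches filters out correspondences that are plausible in one direction only, for example near occlusions where the object is visible at t1 but hidden at t3.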

The estimation unit 310 can estimate, as the final corresponding pixel, a pixel of the third time point depth image that satisfies all of the corresponding conditions based on the infrared brightness similarity calculated by the infrared brightness similarity calculation unit 311, the depth value similarity calculated by the depth value similarity calculation unit 313, and the topology similarity calculated by the topology similarity calculation unit 315.

When a plurality of corresponding pixels of the third time point depth image are estimated in consideration of the infrared brightness similarity, the estimator 310 selects from among them the corresponding pixels satisfying the condition on the depth value similarity, and if a plurality of pixels still remain, estimates the pixel satisfying the condition on the topology similarity as the final corresponding pixel of the third time point depth image.

The depth image generation unit 320 may include an interpolation unit 321.

The interpolation unit 321 can calculate the position of the current pixel belonging to the depth image at the second time point and the depth value of the current pixel by linearly interpolating the coordinates of the reference pixel of the depth image at the first time point, the coordinates of the corresponding pixel of the depth image at the third time point, the amount of change in coordinates from the coordinates of the reference pixel to the coordinates of the corresponding pixel, the depth value of the reference pixel, and the depth value of the corresponding pixel.

The interpolator 321 can calculate the position of the current pixel by applying the ratio of the time difference between the first time point and the second time point to the time difference between the first time point and the third time point to the amount of change in coordinates.

The interpolation unit 321 can calculate the depth value of the current pixel by applying the ratio of the time difference between the first time point and the second time point to the time difference between the first time point and the third time point to the depth value of the corresponding pixel, and applying the ratio of the time difference between the third time point and the second time point to the same total time difference to the depth value of the reference pixel.

If the difference between the depth value of the reference pixel and the depth value of the corresponding pixel is larger than the threshold value, the interpolator 321 generates a hole pixel without calculating the depth value of the corresponding current pixel belonging to the depth image at the second time point.

When a plurality of depth values are calculated at the position of the current pixel belonging to the depth image at the second view point, the interpolator 321 can calculate the average value of the plurality of depth values as the representative depth value of the current pixel.
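As a rough illustration of the interpolation described above, the following Python sketch computes the position and depth value of a current pixel from a reference pixel and a corresponding pixel by the stated time-difference ratios, and produces a hole pixel when the depth difference exceeds a threshold. The function name, argument layout, and the 0.3 m threshold are illustrative assumptions, not taken from the patent.

```python
def interpolate_pixel(t1, t2, t3, ref_xy, corr_xy, z1, z3, threshold=0.3):
    """Sketch of the linear interpolation between time points t1 and t3.

    ref_xy: (u_x, u_y) coordinates of the reference pixel at t1
    corr_xy: coordinates of the corresponding pixel at t3
    z1, z3: depth values at t1 and t3 (assumed here to be in meters)
    Returns (x, y, z) at t2, or None when a hole pixel is produced.
    """
    if abs(z1 - z3) > threshold:
        return None  # hole pixel: depth change too large, do not interpolate
    r = (t2 - t1) / (t3 - t1)       # ratio of the time differences
    gx = corr_xy[0] - ref_xy[0]     # amount of change of the x coordinate
    gy = corr_xy[1] - ref_xy[1]     # amount of change of the y coordinate
    x = ref_xy[0] + r * gx
    y = ref_xy[1] + r * gy
    z = r * z3 + (1.0 - r) * z1     # (t3-t2)/(t3-t1) weight on z1
    return x, y, z
```

For example, with t2 halfway between t1 and t3, the position and depth land halfway between the reference and corresponding values.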

The compensating unit 330 may calculate the depth value of a hole pixel by summing the depth values of neighboring pixels with a weight that increases as the distance between the coordinates of the hole pixel and the coordinates of the neighboring pixels becomes smaller. Since the compensating unit 330 calculates the depth values of hole pixels, all the current pixels can have depth values even when hole pixels exist in the estimated second viewpoint depth image.

The compensating unit 330 may calculate the depth value of a hole pixel by summing the weighted depth values of neighboring pixels, where the weight increases as the color values of the hole pixel and the neighboring pixel, extracted from the color image at the second viewpoint, become closer. The compensating unit 330 may additionally apply a weight according to the distance between the coordinates to the depth values of the neighboring pixels.

The compensating unit 330 may calculate the depth value of a hole pixel by summing the depth values of neighboring pixels with a weight that increases both as the distance between the coordinates of the hole pixel and the coordinates of the neighboring pixels becomes smaller and as the color value between the hole pixel and the neighboring pixels becomes closer.

FIG. 4 is a block diagram of an image processing apparatus according to another embodiment.

Referring to FIG. 4, the image processing apparatus according to an exemplary embodiment may include an estimation unit 410, a depth image generation unit 420, and a compensation unit 430.

The estimator 410 may include a downscaling unit 411 and a variation calculation unit 413.

The downscaling unit 411 may generate a pyramid by downscaling the luminance image and the depth image at the first time point, and the luminance image and the depth image at the third time point. Here, the pyramid refers to a structure including a plurality of images having different sizes. The downscaling unit 411 may reduce the size of an image to generate a plurality of images. The downscaling unit 411 may reduce the size of each of the luminance image and the depth image at the first time point to generate a plurality of images. The downscaling unit 411 may likewise reduce the size of each of the luminance image and the depth image at the third time point to generate a plurality of images.
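The pyramid construction can be sketched as follows. Here 2x2 block averaging is used as a simple downscaling step, chosen for brevity; the patent does not prescribe a particular downscaling filter, so this choice is an assumption.

```python
import numpy as np

def build_pyramid(img, levels=3):
    """Build a coarse-to-fine pyramid by repeated 2x downscaling.

    Each level halves the previous one via 2x2 block averaging
    (real systems often apply a Gaussian blur before subsampling).
    pyramid[0] is the largest image, pyramid[-1] the smallest.
    """
    pyramid = [np.asarray(img, dtype=np.float64)]
    for _ in range(levels - 1):
        prev = pyramid[-1]
        h, w = (prev.shape[0] // 2) * 2, (prev.shape[1] // 2) * 2
        half = prev[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        pyramid.append(half)
    return pyramid
```

The same routine would be applied to the luminance image and the depth image at each of the first and third time points.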

The change amount calculation unit 413 can calculate an optical flow for each image forming the pyramid, starting from the smallest image of the pyramid, and calculate the amount of change of the coordinates from the reference pixel of the first viewpoint depth image to the corresponding pixel of the third viewpoint depth image. Optical flow is a technique mainly used to estimate the motion of an object in an image; the most representative method is the Lucas-Kanade optical flow.

The change amount calculation unit 413 can calculate the correspondence relationship using a method that extends the Lucas-Kanade optical flow.

The change amount calculation unit 413 may calculate an optical flow between the brightness image at the first time point and the brightness image at the third time point, and calculate a change amount of coordinates between the corresponding pixels. The amount of change of the coordinates can be calculated by a motion vector.

The change amount calculation unit 413 can calculate a change amount of coordinates between corresponding pixels between the brightness images using Equation (1).

[Equation 1]

Figure pat00001

Here, (x, y) is the coordinates of a pixel, (u_x, u_y) is the coordinates of the pixel p in the first viewpoint image, (w_x, w_y) is the range of the pixels around the pixel p, that is, the window centered on the pixel p, (g_x, g_y) is the amount of change in the position of the pixel p, (u_x + g_x, u_y + g_y) is the coordinates of the pixel q in the third viewpoint image corresponding to the pixel p, and (x + g_x, y + g_y) is the corresponding pixel obtained by applying this correspondence relationship to the pixel (x, y). (g_x, g_y) is initially given as (0, 0). (d_x, d_y) represents the amount of change of the coordinates to a pixel having a similar infrared brightness value.

(d_x, d_y) satisfying Equation (1) can be calculated using the inverse of a 2x2 matrix. α(x, y) is a weight determined by how similar the depth value of the pixel (x, y) is to that of the pixel p, and has a larger value as the depth values become more similar.
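One such weighted 2x2 solve can be sketched as follows: given brightness gradients and a temporal brightness difference over the window, the flow update is obtained by inverting the weighted normal equations. Variable names and the exact weighting scheme below are illustrative; Equation (1)'s precise form is not reproduced here.

```python
import numpy as np

def lk_step(Ix, Iy, It, alpha):
    """One weighted Lucas-Kanade-style step over a window.

    Ix, Iy : spatial brightness gradients within the window
    It     : temporal brightness difference within the window
    alpha  : per-pixel weights (larger when depth values are similar)
    Solves the 2x2 normal equations for the flow update (d_x, d_y).
    """
    Ix, Iy, It, a = (np.ravel(v).astype(float) for v in (Ix, Iy, It, alpha))
    A = np.array([[np.sum(a * Ix * Ix), np.sum(a * Ix * Iy)],
                  [np.sum(a * Ix * Iy), np.sum(a * Iy * Iy)]])
    b = -np.array([np.sum(a * Ix * It), np.sum(a * Iy * It)])
    return np.linalg.solve(A, b)  # (d_x, d_y)
```

The 2x2 system is well conditioned only when the window contains gradient information in both directions; the depth-similarity weights α suppress pixels that likely belong to a different surface.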

The change amount calculation unit 413 may calculate the optical flow between the depth image at the first viewpoint and the depth image at the third viewpoint and calculate the amount of change in coordinates between the corresponding pixels. The amount of change of the coordinates can be calculated by a motion vector.

The change amount calculation unit 413 can calculate the amount of change of the coordinates between the corresponding pixels in the depth images using Equations 2 and 3.

[Equation 2]

Figure pat00002

[Equation 3]

Figure pat00003

Here, Z represents the depth value, and the function

Figure pat00004

outputs 0 if its argument is not greater than 0; otherwise, it outputs the difference between the depth values of the corresponding pixels at the first and third time points.

The amount of change (e_x, e_y) of the coordinates to a pixel having a similar depth value satisfying the above equation can also be calculated using the inverse of a matrix. Finally, the vector (g'_x, g'_y) indicating the correspondence relationship can be updated as follows.

The change amount calculation unit 413 can add the amount of change (d_x, d_y) of the coordinates to a pixel having a similar infrared brightness value and the amount of change (e_x, e_y) of the coordinates to a pixel having a similar depth value, and calculate (g'_x, g'_y) from the image of the current level as shown in Equation (4). The pyramid structure can consist of several levels from the smallest to the largest image. (g'_x, g'_y) calculated from the image of the smallest size can be used in the image of the next level.

[Equation 4]

Figure pat00005

Figure pat00006

For example, when the image size between adjacent levels of the pyramid structure differs by a factor of two, (g'_x, g'_y) estimated in the previous, smaller-level image is scaled to (2g'_x, 2g'_y) and used as the initial value for the larger image of the immediately higher level; that is, (2g'_x, 2g'_y) is substituted for (g_x, g_y) in Equation (4). By repeating the above updating and scaling process for all levels up to the largest image, the final correspondence relation, that is, the amount of change of the coordinates from the reference pixel of the depth image at the first viewpoint to the corresponding pixel of the depth image at the third viewpoint, can be obtained.
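The coarse-to-fine accumulation can be sketched as follows, assuming a factor-of-two size difference between levels. Representing the per-level (d + e) increments as a list `updates` is an illustrative simplification; in practice each increment would come from re-solving the flow equations at that level.

```python
def propagate_flow(updates):
    """Coarse-to-fine accumulation of the correspondence vector.

    `updates` lists, per pyramid level from smallest to largest, the
    (d_x + e_x, d_y + e_y) increment computed at that level.
    After each level except the last, the estimate is doubled so it
    can serve as the initial value at the next (2x larger) level.
    """
    gx, gy = 0.0, 0.0
    for i, (ux, uy) in enumerate(updates):
        gx, gy = gx + ux, gy + uy     # g' = g + (d + e), as in Equation (4)
        if i < len(updates) - 1:
            gx, gy = 2 * gx, 2 * gy   # (2g'_x, 2g'_y) initializes the next level
    return gx, gy
```

The value returned for the largest image is the final amount of change of the coordinates from the reference pixel to the corresponding pixel.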

The depth image generation unit 420 may include an interpolation unit 421.

The interpolation section 421 can calculate the position of the current pixel belonging to the second viewpoint depth image and the depth value of the current pixel by linearly interpolating the coordinates of the reference pixel of the first viewpoint depth image, the coordinates of the corresponding pixel of the third viewpoint depth image, the amount of change of the coordinates from the coordinates of the reference pixel to the coordinates of the corresponding pixel, the depth value of the reference pixel, and the depth value of the corresponding pixel.

The interpolation unit 421 can calculate the position of the current pixel belonging to the second viewpoint depth image and the depth value of the current pixel using Equation (5).

[Equation 5]

Figure pat00007

Here, t_1 is the first time point, t_2 is the second time point, t_3 is the third time point, x(t2) is the x coordinate of the current pixel in the second viewpoint depth image, y(t2) is the y coordinate of the current pixel in the second viewpoint depth image, Z(t2) is the depth value of the current pixel in the second viewpoint depth image, u_x is the x coordinate of the reference pixel of the first viewpoint depth image, u_y is the y coordinate of the reference pixel of the first viewpoint depth image, g_x is the amount of change of the x coordinate, g_y is the amount of change of the y coordinate, Z(t1) is the depth value of the reference pixel in the first viewpoint depth image, and Z(t3) is the depth value of the corresponding pixel in the third viewpoint depth image.

The interpolation unit 421 can calculate the position x(t2), y(t2) of the current pixel of the second viewpoint depth image by applying the ratio of the time difference between the first and second time points to the time difference between the first and third time points to the amounts of change g_x and g_y of the coordinates, and adding the result to the coordinates (u_x, u_y) of the reference pixel.

The interpolator 421 can calculate the depth value Z(t2) of the current pixel by applying the ratio of the time difference between the first and second time points to the time difference between the first and third time points to the depth value Z(t3) of the corresponding pixel of the third viewpoint depth image, and applying the ratio of the time difference between the third and second time points to the same overall time difference to the depth value Z(t1) of the reference pixel of the first viewpoint depth image.

When the difference between the depth value of the reference pixel of the first viewpoint depth image and the depth value of the corresponding pixel of the third viewpoint depth image is greater than the threshold value, the interpolating section 421 can generate a hole pixel at the current pixel belonging to the second viewpoint depth image without calculating a depth value. A hole pixel means a pixel having no depth value.

When a plurality of depth values are calculated for the current pixel belonging to the second viewpoint depth image, the interpolator 421 can calculate the average value of the plurality of depth values as a representative depth value of the current pixel. A plurality of depth values may be estimated for one current pixel of the second viewpoint depth image.

The compensating unit 430 may calculate the depth value of a hole pixel by summing the depth values of neighboring pixels with a weight that increases as the distance between the coordinates of the hole pixel and the coordinates of the neighboring pixels becomes smaller. By calculating the depth values of hole pixels, all the pixels can have depth values even when the estimated second viewpoint depth image includes hole pixels.

The compensating unit 430 may calculate the depth value of a hole pixel by summing the weighted depth values of neighboring pixels, where the weight increases as the color values of the hole pixel and the neighboring pixel, extracted from the color image at the second viewpoint, become closer. The compensating unit 430 may additionally apply a weight according to the distance between the coordinates to the depth values of the neighboring pixels.

The compensating unit 430 may calculate the depth value of a hole pixel by summing the depth values of neighboring pixels with a weight that increases both as the distance between the coordinates of the hole pixel and the coordinates of the neighboring pixels becomes smaller and as the color value between the hole pixel and the neighboring pixels becomes closer.

FIG. 5 is a diagram illustrating estimation of a pixel-by-pixel correspondence relationship in an image processing apparatus according to an embodiment.

Referring to FIG. 5, the image processing apparatus according to an exemplary embodiment can estimate, for each pixel, a correspondence relation between the luminance image and the depth image at the time point t_1 and the luminance image and the depth image at the time point t_3.

When the correspondence between pixels is estimated, the correspondence can be expressed as follows.

Figure pat00008

The following corresponding conditions may be used to estimate the pixel q of the depth image (t_3) corresponding to the pixel p of the depth image (t_1).

(1) The infrared brightness values at p and q should be similar to each other.

(2) The depth values at p and q should be similar within a certain range.

(3) The pixels around p in the depth image (t_1 ) should have correspondence relationships similar to that of p.

(4) When p and q correspond, q and p must correspond.

The condition (3), in other words, refers to the similarity of the topology: the pixel 521, the pixel 523, and the pixel 525 corresponding to the pixel 511, the pixel 513, and the pixel 515 in the window 510 should have similar infrared brightness values and similar depth values. Here, similarity means that the difference in the infrared brightness values is equal to or less than a set threshold value; it may also mean that the difference in the depth values is equal to or less than a set threshold value. Since the pixel 511, the pixel 513, and the pixel 515 located in the window 510 are close to each other, their infrared brightness values and depth values change similarly.

In the middle diagram, the arrows indicate the calculated optical flow: the pixel 511, the pixel 513, and the pixel 515 have optical flows to the corresponding pixel 521, pixel 523, and pixel 525, respectively. If the directions of the optical flows are similar, the topology can be defined as similar.
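The correspondence conditions above can be sketched as simple predicate checks. All function names and threshold values below are illustrative assumptions rather than the patent's formulation; condition (3) is approximated here by comparing flow directions.

```python
import math

def is_corresponding(p_ir, q_ir, p_z, q_z, ir_thresh=10.0, z_thresh=0.3):
    """Conditions (1) and (2): the infrared brightness values and the
    depth values of p and q must each be similar within a set threshold."""
    return abs(p_ir - q_ir) <= ir_thresh and abs(p_z - q_z) <= z_thresh

def topology_similar(window_flows, q_flow, angle_thresh=0.5):
    """Condition (3) sketch: the optical-flow vectors of the pixels
    around p should point in directions similar to q's flow
    (compared here via the angle of each 2-D vector, in radians)."""
    qa = math.atan2(q_flow[1], q_flow[0])
    return all(abs(math.atan2(fy, fx) - qa) <= angle_thresh
               for fx, fy in window_flows)
```

Condition (4), the mutual check, would simply run the estimation in the reverse direction and verify that q maps back to p.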

FIG. 6 is a diagram illustrating a pyramid used in an image processing apparatus according to an exemplary embodiment.

Referring to FIG. 6, an image processing apparatus according to an exemplary embodiment may generate a pyramid from the input luminance image and depth image at the first viewpoint and the luminance image and depth image at the third viewpoint. A pyramid can contain images of different sizes in order. The image processing apparatus according to an exemplary embodiment may generate a luminance image 620 by downscaling the luminance image 610 at the first time point. In addition, the image processing apparatus according to an exemplary embodiment may generate a luminance image 630 by downscaling the luminance image 620.

The image processing apparatus according to an exemplary embodiment may likewise generate images of smaller sizes through downscaling from the depth image at the first viewpoint, and from the brightness image and the depth image at the third viewpoint, respectively.

The image processing apparatus according to an exemplary embodiment can calculate an optical flow from the image having the smallest size using the conditions (1), (2), and (3) of FIG. 5.

FIG. 7 illustrates a depth image including hole pixels generated by the image processing apparatus according to an exemplary embodiment.

The image processing apparatus according to the embodiment can estimate the correspondence relation between the luminance image and the depth image at the time point t_1 and the luminance image and the depth image at the time point t_3.

When the correspondence between pixels is estimated, the correspondence can be expressed as follows.

Figure pat00009

The image processing apparatus according to an embodiment can linearly calculate the position and the depth value of the current pixel at the time point t_2 from the above correspondence relationship, using Equation (5) described in the section on FIG. 4.

The image processing apparatus according to an embodiment may not generate an interpolated value if the difference between the depth value Z(t_1) of the reference pixel of the depth image at the time t_1 and the depth value Z(t_3) of the corresponding pixel of the depth image at the time t_3 is larger than the threshold value. For example, the threshold may be 30 centimeters (cm).

In Equation (5) of FIG. 4, since g_x and g_y are usually calculated as decimal (non-integer) values, x(t_2) and y(t_2) can also be decimal values. The image processing apparatus according to an exemplary embodiment can obtain pixel positions having integer coordinates by rounding the decimal values, so that x(t_2) and y(t_2) rounded to integer values lie in the range 0 to N. For example, the pixel 710 in FIG. 7 may contain the pixel position 711 and the pixel position 713.

Referring to FIG. 7, a plurality of calculated pixel positions may be included in the pixel 710 of the depth image 700 at the time point t_2. A depth value can be calculated for each pixel position. The image processing apparatus according to an exemplary embodiment calculates the average of these depth values as the representative depth value of the pixel. For example, since the pixel 710 contains two pixel positions, the representative depth value can be calculated as follows.

Figure pat00010

Here, Z_1(t_2) is the depth value at the pixel position 711, and Z_2(t_2) is the depth value at the pixel position 713.
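The rounding and averaging steps above can be sketched as follows. The bucketing dictionary and the bounds check are illustrative details; only the round-then-average behavior comes from the text.

```python
from collections import defaultdict

def accumulate_depths(samples, width, height):
    """Round fractional target positions to integer pixel coordinates and
    average all depth values that land in the same pixel, yielding the
    representative depth value per pixel.

    samples: list of (x, y, z) with possibly fractional x, y at time t2.
    Returns {(xi, yi): representative depth}.
    """
    buckets = defaultdict(list)
    for x, y, z in samples:
        xi, yi = round(x), round(y)
        if 0 <= xi < width and 0 <= yi < height:  # keep coordinates in 0..N
            buckets[(xi, yi)].append(z)
    return {pix: sum(zs) / len(zs) for pix, zs in buckets.items()}
```

Pixels that receive no sample at all remain absent from the result; those are the hole pixels handled next.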

FIG. 8 is a diagram for explaining the concept of generating a depth value of a hole pixel in an image processing apparatus according to an embodiment.

Referring to FIG. 8, the depth image 800 at the time point t_2 contains a hole pixel 810 having no depth value. The image processing apparatus according to an exemplary embodiment may calculate the depth value of the hole pixel 810 from the depth values of the surrounding pixels 820. The range 830 of pixels treated as neighbors can be adjusted according to the required accuracy of the depth value.

The image processing apparatus according to an exemplary embodiment may assign a larger weight to a pixel adjacent to the hole pixel 810 as (1) its position is closer to the hole pixel and (2) its color is more similar to that of the hole pixel, and can calculate the depth value of the hole pixel 810 as the weighted sum of the depth values of those pixels. Reflecting these conditions in an equation yields Equation 6.

[Equation 6]

Figure pat00011

Here, j is the hole pixel 810 having no depth value, and i denotes one of the surrounding pixels having a depth value. w_i is the weight, ||C_i - C_j|| is the color difference between the pixels i and j, and ||x_i - x_j|| is the difference of their image coordinates. σ_C and σ_x are constants.
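A minimal sketch of this weighted sum, assuming Gaussian falloffs controlled by the constants σ_C and σ_x (the exact weight profile is left to these constants in the text, so the Gaussian form and the default values here are assumptions):

```python
import math

def fill_hole(hole_xy, hole_color, neighbors, sigma_c=10.0, sigma_x=3.0):
    """Weighted sum of neighboring depth values for a hole pixel.

    neighbors: list of (x, y, color, depth) for pixels with a known depth.
    The weight grows as the image-coordinate distance ||x_i - x_j|| and
    the color difference ||C_i - C_j|| to the hole pixel shrink.
    """
    num = den = 0.0
    for x, y, c, z in neighbors:
        dc = abs(c - hole_color)                         # ||C_i - C_j||
        dx = math.hypot(x - hole_xy[0], y - hole_xy[1])  # ||x_i - x_j||
        w = (math.exp(-dc ** 2 / (2 * sigma_c ** 2)) *
             math.exp(-dx ** 2 / (2 * sigma_x ** 2)))
        num += w * z
        den += w
    return num / den  # normalized weighted sum of the depth values
```

With equally distant, equally colored neighbors the result reduces to a plain average, while a neighbor across a color edge contributes almost nothing.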

FIG. 9 is a view showing a depth image before and after generation of the depth values of hole pixels in an image processing apparatus according to an exemplary embodiment.

Referring to FIG. 9, the image 910 shows the depth image before the depth values of the hole pixels are generated, and the image 920 shows it after the depth values are generated. It can be seen that the boundary of the object is represented more smoothly after the depth values of the hole pixels are generated.

The depth values of hole pixels can be generated using a weighted-sum-based method as shown in FIG. 8, or using a random forest, which is a machine learning technique.

FIG. 10 is a flowchart of an image processing method according to an embodiment.

In step 1010, the image processing apparatus according to an exemplary embodiment can estimate, using the depth image at the first viewpoint and the depth image at the third viewpoint, a corresponding pixel corresponding to the reference pixel of the depth image at the first viewpoint from among the pixels belonging to the depth image at the third viewpoint.

The image processing apparatus according to an exemplary embodiment extracts the infrared brightness value of the reference pixel from the amplitude image at the first viewpoint, extracts the infrared brightness values of one or more candidate pixels from the brightness image at the third viewpoint, and calculates the infrared brightness similarity based on the difference between the infrared brightness values of the reference pixel and the candidate pixel. The image processing apparatus according to an exemplary embodiment may estimate a candidate pixel as the corresponding pixel when it satisfies the corresponding condition that the infrared brightness similarity is equal to or greater than a predetermined reference.

The image processing apparatus extracts the depth value of the reference pixel from the depth image at the first viewpoint, extracts the depth values of at least one candidate pixel from the depth image at the third viewpoint, and calculates the depth value similarity based on the difference between the depth values of the reference pixel and the candidate pixel. The image processing apparatus according to an exemplary embodiment may estimate a candidate pixel as the corresponding pixel when it satisfies the corresponding condition that the depth value similarity is equal to or greater than a certain level.

The image processing apparatus according to an embodiment extracts the infrared brightness values and the depth values of the pixels in the window set around the reference pixel from the depth image and the brightness image at the first viewpoint, extracts the infrared brightness value and the depth value of the candidate pixel from the depth image and the brightness image at the third viewpoint, and calculates the topology similarity based on the differences in the infrared brightness values between the pixels in the window and the candidate pixels and the differences in the depth values between the pixels in the window and the candidate pixels. The image processing apparatus according to an embodiment may estimate a candidate pixel as the corresponding pixel when it satisfies the corresponding condition that the topology similarity is equal to or greater than a certain level.

In operation 1020, the image processing apparatus according to an exemplary embodiment may generate a current pixel belonging to a depth image at a second viewpoint using a reference pixel and a corresponding pixel.

The image processing apparatus according to an embodiment can calculate the position of the current pixel belonging to the depth image at the second viewpoint and the depth value of the current pixel by linearly interpolating the coordinates of the reference pixel of the first viewpoint depth image, the coordinates of the corresponding pixel of the third viewpoint depth image, the amount of change of the coordinates from the coordinates of the reference pixel to the coordinates of the corresponding pixel, the depth value of the reference pixel, and the depth value of the corresponding pixel.

The image processing apparatus according to an exemplary embodiment may generate a depth value of a hole pixel for which a depth value is not calculated among current pixels based on a depth value of neighboring pixels of the hole pixel.

The image processing apparatus according to an exemplary embodiment can calculate the depth value of a hole pixel as the sum of the weighted depth values of neighboring pixels, with a weight that increases as the distance between the coordinates of the hole pixel and the coordinates of the neighboring pixels becomes smaller and as the color value between the hole pixel and the neighboring pixels becomes closer.

The image processing apparatus according to an embodiment estimates the pixel-by-pixel correspondence relation between the depth images captured before and after the shooting time of the color image, which are obtained alternately with the color image, and can thereby generate a depth image at the same time point as the shooting time of the color image.

Also, the image processing apparatus according to an embodiment generates a hole pixel for a pixel that does not satisfy the corresponding condition, and estimates the depth value of the hole pixel based on the similarity between the color of the pixel at the position corresponding to the hole pixel in the color image and the colors of the neighboring pixels, so that the depth image at the same time point as the shooting time of the color image can be generated more accurately.

In addition, the image processing apparatus according to an embodiment can provide a source of a multi-view image that can more accurately represent a three-dimensional image by correcting a time difference between a color image and a depth image.

The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and configured for the embodiments or may be known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. For example, appropriate results may be achieved even if the described techniques are performed in an order different from the described methods, and/or if components of the described systems, structures, devices, or circuits are combined in a different form or replaced by other components or their equivalents.

Therefore, other implementations, other embodiments, and equivalents to the claims are also within the scope of the following claims.

Claims (21)

An image processing apparatus comprising:
an estimating unit that estimates, using a depth image at a first viewpoint and a depth image at a third viewpoint, a corresponding pixel corresponding to a reference pixel of the depth image at the first viewpoint from among the pixels belonging to the depth image at the third viewpoint, based on a predetermined corresponding condition; and
a depth image generating unit for generating a current pixel belonging to a depth image at a second viewpoint using the reference pixel and the corresponding pixel.
The image processing apparatus according to claim 1,
A compensating unit which generates a depth value of a hole pixel whose depth value is not calculated among the current pixels based on a depth value of peripheral pixels of the hole pixel,
Further comprising:
The image processing apparatus according to claim 1,
The estimation unit
Extracting an infrared brightness value of the reference pixel from an amplitude image at a first time point, extracting an infrared brightness value of at least one candidate pixel included at the third time point from a brightness image at a third time point, An infrared brightness similarity calculation unit for calculating an infrared brightness similarity based on a difference between infrared brightness values between the pixel and the candidate pixel; And
An estimating unit that estimates the candidate pixel as the corresponding pixel when the corresponding condition is satisfied, the corresponding condition being that the infrared brightness similarity is equal to or greater than a predetermined reference,
And the image processing apparatus.
The image processing apparatus according to claim 1,
The estimation unit
Extracting a depth value of the reference pixel from the depth image at the first viewpoint, extracting a depth value of at least one candidate pixel included in the third viewpoint from the depth image at the third viewpoint, A depth value similarity calculation unit for calculating a depth value similarity based on a depth value difference between pixels; And
An estimating unit that estimates the candidate pixel as the corresponding pixel when the corresponding condition is satisfied, the corresponding condition being that the depth value similarity is equal to or greater than a certain level,
And the image processing apparatus.
The image processing apparatus according to claim 1,
The estimation unit
A topology similarity calculation unit that extracts the infrared brightness values and the depth values of the pixels in the window set around the reference pixel from the depth image and the brightness image at the first viewpoint, extracts the infrared brightness value and the depth value of the candidate pixel from the depth image and the brightness image at the third viewpoint, and calculates the topology similarity based on the differences in the infrared brightness values between the pixels in the window and the candidate pixels and the differences in the depth values between the pixels in the window and the candidate pixels; And
An estimating unit that estimates the candidate pixel as the corresponding pixel when the corresponding condition is satisfied, the corresponding condition being that the topology similarity is equal to or greater than a certain level,
And the image processing apparatus.
The image processing apparatus according to claim 1,
The estimation unit
A verification unit for verifying whether a reference pixel of the first viewpoint depth image is estimated from a corresponding pixel of the third viewpoint depth image,
And the image processing apparatus.
The image processing apparatus according to claim 1,
The estimation unit
A downscaling unit for downscaling the luminance image and the depth image at the first time point, the luminance image and the depth image at the third time point to generate a pyramid; And
A change amount calculation unit that calculates an optical flow for each image forming the pyramid, starting from the smallest image of the pyramid, and calculates the amount of change of the coordinates from the reference pixel of the depth image at the first viewpoint to the corresponding pixel at the third viewpoint,
And the image processing apparatus.
The image processing apparatus according to claim 1,
The estimation unit
Estimates a corresponding pixel at the third viewpoint for which the difference from the infrared brightness value of the reference pixel in the brightness image at the first viewpoint is within a certain range, the difference from the depth value of the reference pixel in the depth image at the first viewpoint is within a certain range, and the topology of the surrounding pixels is similar
Image processing apparatus.
The image processing apparatus according to claim 1,
The depth image generation unit
An interpolation unit that linearly interpolates the coordinates of the reference pixel, the coordinates of the corresponding pixel, the amount of change of the coordinates from the coordinates of the reference pixel to the coordinates of the corresponding pixel, the depth value of the reference pixel, and the depth value of the corresponding pixel to calculate the position of the current pixel belonging to the depth image at the second viewpoint and the depth value of the current pixel,
And the image processing apparatus.
10. The image processing apparatus of claim 9,
The interpolator
If the difference between the depth value of the reference pixel and the depth value of the corresponding pixel is greater than the threshold value, generates a hole pixel at the current pixel belonging to the depth image at the second viewpoint without calculating a depth value
Image processing apparatus.
10. The image processing apparatus of claim 9,
The interpolator
Calculating an average value of a plurality of depth values included in a current pixel belonging to the depth image at the second time point as a representative depth value of the current pixel
Image processing apparatus.
3. The image processing apparatus of claim 2,
The compensation unit
Calculates the depth value of the hole pixel as the sum of the weighted depth values of the surrounding pixels, with a weight that increases as the distance between the coordinates of the hole pixel and the coordinates of the surrounding pixels becomes smaller
Image processing apparatus.
3. The image processing apparatus of claim 2,
The compensation unit
Calculates the depth value of the hole pixel as the sum of the weighted depth values of the surrounding pixels, with a weight that increases as the color values of the hole pixel and the surrounding pixel, extracted from the color image at the second viewpoint, become closer
Image processing apparatus.
3. The image processing apparatus of claim 2,
The compensation unit
Calculates the depth value of the hole pixel as a weighted sum of the depth values of the surrounding pixels, assigning a larger weight to a surrounding pixel as the distance between its coordinates and the coordinates of the hole pixel decreases and as its color value becomes closer to that of the hole pixel
Image processing apparatus.
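The combined weighting in the claim above resembles a cross/joint bilateral filter: a surrounding pixel counts more when it is both spatially close to the hole pixel and similar in color. A hedged Python sketch; the Gaussian forms and the `sigma_s`/`sigma_c` knobs are assumptions, not the patent's specification.

```python
import math

def fill_hole_joint(hole_xy, hole_color, neighbors, sigma_s=1.0, sigma_c=10.0):
    # neighbors: ((x, y), color, depth) triples; colors come from the
    # color image at the second time point. The spatial and color terms
    # are multiplied so both must agree for a neighbor to dominate.
    w_sum = d_sum = 0.0
    for (nx, ny), color, depth in neighbors:
        dist2 = (hole_xy[0] - nx) ** 2 + (hole_xy[1] - ny) ** 2
        cdiff2 = (hole_color - color) ** 2
        w = math.exp(-dist2 / (2 * sigma_s ** 2)) * math.exp(-cdiff2 / (2 * sigma_c ** 2))
        w_sum += w
        d_sum += w * depth
    return d_sum / w_sum

# At equal distance, the neighbor with a similar color dominates.
filled = fill_hole_joint((0, 0), 100.0, [((1, 0), 100.0, 2.0), ((0, 1), 200.0, 8.0)])
```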
Estimating, among the pixels belonging to the depth image at the third viewpoint, a corresponding pixel corresponding to a reference pixel in the depth image at the first viewpoint, based on a predetermined corresponding condition, using the depth image at the first viewpoint and the depth image at the third viewpoint; And
Generating a current pixel belonging to a depth image at a second viewpoint using the reference pixel and the corresponding pixel
And an image processing method.
16. The method of claim 15,
Generating a depth value of a hole pixel, which is a current pixel whose depth value has not been calculated, based on the depth values of the surrounding pixels of the hole pixel
Further comprising the steps of:
16. The method of claim 15,
The estimating step
Extracting an infrared brightness value of the reference pixel from a brightness image at the first viewpoint;
Extracting an infrared brightness value of at least one candidate pixel belonging to the brightness image at the third viewpoint;
Calculating an infrared brightness similarity based on a difference in infrared brightness value between the reference pixel and the candidate pixel; And
Wherein the corresponding condition is a case where the infrared brightness similarity is equal to or greater than a predetermined reference, and if the corresponding condition is satisfied, estimating the candidate pixel as the corresponding pixel
And an image processing method.
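The brightness-based matching steps above can be sketched as follows. The particular similarity form (inverse of one plus the absolute difference) is an illustrative assumption; the claim only requires that similarity fall as the brightness difference grows and that the match clear a predetermined threshold.

```python
def best_match_by_brightness(ref_brightness, candidates, min_similarity):
    # candidates: (pixel_id, ir_brightness) pairs from the brightness
    # image at the third viewpoint. Returns the best candidate that
    # satisfies the corresponding condition, or None if none does.
    best_id, best_sim = None, min_similarity
    for pid, b in candidates:
        sim = 1.0 / (1.0 + abs(ref_brightness - b))
        if sim >= best_sim:
            best_id, best_sim = pid, sim
    return best_id

match = best_match_by_brightness(10.0, [("a", 30.0), ("b", 11.0)], 0.2)
```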
16. The method of claim 15,
The estimating step
Extracting a depth value of the reference pixel from a depth image at the first viewpoint;
Extracting a depth value of at least one candidate pixel belonging to the depth image at the third viewpoint;
Calculating a depth value similarity based on a depth value difference between the reference pixel and the candidate pixel; And
Wherein the corresponding condition is that the depth value similarity is equal to or greater than a predetermined reference, and if the corresponding condition is satisfied, estimating the candidate pixel as the corresponding pixel
And an image processing method.
16. The method of claim 15,
The estimating step
Extracting an infrared brightness value and a depth value of the pixels in the window set in the reference pixel from the depth image and the brightness image at the first viewpoint;
Extracting an infrared brightness value and a depth value of candidate pixels included in the third viewpoint from the depth image and the brightness image at the third viewpoint;
Calculating a topology similarity based on a difference between infrared brightness values between the pixels in the window and the candidate pixels and a difference in depth value between the pixels in the window and the candidate pixels; And
Wherein the corresponding condition is a case where the topology similarity is equal to or greater than a predetermined reference, and if the corresponding condition is satisfied, estimating the candidate pixel as the corresponding pixel
And an image processing method.
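The window-based comparison above can be sketched as a block-matching step: the local "topology" around the reference pixel and around each candidate is compared via brightness and depth differences over equally sized windows. The weights and the similarity mapping below are assumptions for illustration.

```python
def topology_similarity(ref_window, cand_window, w_b=1.0, w_d=1.0):
    # Sum of absolute IR-brightness and depth differences over the two
    # windows, mapped to a similarity in (0, 1]; identical windows give 1.
    cost = 0.0
    for (rb, rd), (cb, cd) in zip(ref_window, cand_window):
        cost += w_b * abs(rb - cb) + w_d * abs(rd - cd)
    return 1.0 / (1.0 + cost)

def best_candidate(ref_window, cand_windows, min_similarity):
    # The corresponding pixel is the candidate whose surrounding window is
    # most similar, provided it meets the corresponding condition.
    sims = [topology_similarity(ref_window, w) for w in cand_windows]
    best = max(range(len(sims)), key=lambda i: sims[i])
    return best if sims[best] >= min_similarity else None

idx = best_candidate([(10.0, 1.0), (12.0, 2.0)],
                     [[(10.0, 1.0), (12.0, 2.0)], [(50.0, 9.0), (60.0, 9.0)]],
                     0.5)
```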
16. The method of claim 15,
The generating step
Calculating a position of a current pixel belonging to the depth image at the second viewpoint and a depth value of the current pixel by linearly interpolating the position and depth value of the reference pixel and the position and depth value of the corresponding pixel,
And an image processing method.
17. The method of claim 16,
The step of generating the depth value of the hole pixel
Calculating the depth value of the hole pixel as a weighted sum of the depth values of the surrounding pixels, assigning a larger weight to a surrounding pixel as the distance between its coordinates and the coordinates of the hole pixel decreases and as its color value becomes closer to that of the hole pixel
Image processing method.
KR1020120134264A 2012-11-26 2012-11-26 Image processing apparatus and method thereof KR20140067253A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120134264A KR20140067253A (en) 2012-11-26 2012-11-26 Image processing apparatus and method thereof

Publications (1)

Publication Number Publication Date
KR20140067253A true KR20140067253A (en) 2014-06-05

Family

ID=51123531

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120134264A KR20140067253A (en) 2012-11-26 2012-11-26 Image processing apparatus and method thereof

Country Status (1)

Country Link
KR (1) KR20140067253A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160047891A (en) * 2014-10-23 2016-05-03 삼성전자주식회사 Electronic device and method for processing image
US10970865B2 (en) 2014-10-23 2021-04-06 Samsung Electronics Co., Ltd. Electronic device and method for applying image effect to images obtained using image sensor
US11455738B2 (en) 2014-10-23 2022-09-27 Samsung Electronics Co., Ltd. Electronic device and method for applying image effect to images obtained using image sensor
KR20160123871A (en) * 2015-04-17 2016-10-26 삼성전자주식회사 Method and apparatus for estimating image optical flow

Similar Documents

Publication Publication Date Title
JP6244407B2 (en) Improved depth measurement quality
KR101419979B1 (en) Method and system for converting 2d image data to stereoscopic image data
KR101742120B1 (en) Apparatus and method for image processing
WO2020039166A1 (en) Method and system for reconstructing colour and depth information of a scene
CN105551020B (en) A kind of method and device detecting object size
JP2007000205A (en) Image processing apparatus, image processing method, and image processing program
JP2018151689A (en) Image processing apparatus, control method thereof, program and storage medium
KR20170091496A (en) Method and apparatus for processing binocular image
EP2291825A1 (en) System and method for depth extraction of images with forward and backward depth prediction
JP2011081605A (en) Image processing apparatus, method and program
JP2007053621A (en) Image generating apparatus
KR101125061B1 (en) A Method For Transforming 2D Video To 3D Video By Using LDI Method
KR101853215B1 (en) Coding Device and Method and Depth Information Compensation by Plane Modeling
KR101027003B1 (en) Stereo matching apparatus and its method
JP4985542B2 (en) Corresponding point search device
WO2015198592A1 (en) Information processing device, information processing method, and information processing program
KR20140000833A (en) Stereo matching apparatus and its method
JP6079076B2 (en) Object tracking device and object tracking method
JP6991700B2 (en) Information processing equipment, information processing method, program
KR101888969B1 (en) Stereo matching apparatus using image property
KR20140067253A (en) Image processing apparatus and method thereof
KR102240570B1 (en) Method and apparatus for generating spanning tree,method and apparatus for stereo matching,method and apparatus for up-sampling,and method and apparatus for generating reference pixel
KR101435611B1 (en) Occlusion removal method for three dimensional integral image
JP2015033047A (en) Depth estimation device employing plural cameras
JP4985863B2 (en) Corresponding point search device

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination