KR20140067253A - Image processing apparatus and method thereof - Google Patents
- Publication number
- KR20140067253A (Application KR1020120134264A)
- Authority
- KR
- South Korea
- Prior art keywords
- pixel
- depth
- image
- value
- viewpoint
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0077—Colour aspects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
Landscapes
- Image Processing (AREA)
Abstract
The present invention relates to a method and an apparatus for correcting the difference between the shooting time of a depth image and the shooting time of a color image. In one aspect of the present invention, a corresponding pixel which corresponds to a reference pixel in the depth image at a first viewpoint is estimated, among the pixels belonging to the depth image at a third viewpoint, based on a predetermined corresponding condition, and a current pixel belonging to the depth image at a second viewpoint can be generated using the reference pixel and the corresponding pixel.
Description
The following embodiments relate to a method and apparatus for correcting a difference between a shooting time of a depth image and a shooting time of a color image.
With a color/depth camera that uses one lens and one sensor, color and depth images can be captured at the same frame rate. Since the color image and the depth image are acquired alternately from the single sensor, a time difference occurs between a frame of the color image and a frame of the depth image when a moving object is photographed with such a camera.
In one aspect of the present invention, the image processing apparatus includes an estimation unit which, using the depth image at the first viewpoint and the depth image at the third viewpoint, estimates, among the pixels belonging to the depth image at the third viewpoint, a corresponding pixel that corresponds to a reference pixel in the depth image at the first viewpoint, based on a predetermined corresponding condition; and a depth image generation unit which generates a current pixel belonging to the depth image at the second viewpoint using the reference pixel and the corresponding pixel.
In another aspect, the image processing apparatus may further include a compensation unit that generates a depth value of a hole pixel in which the depth value is not calculated among the current pixels, based on a depth value of peripheral pixels of the hole pixel.
The estimation unit may include an infrared brightness similarity calculation unit which extracts the infrared brightness value of the reference pixel from the intensity (amplitude) image at the first time point, extracts the infrared brightness value of at least one candidate pixel belonging to the third time point from the intensity image at the third time point, and calculates an infrared brightness similarity based on the difference between the infrared brightness values of the reference pixel and the candidate pixel; and a determination unit which estimates the candidate pixel as the corresponding pixel when the corresponding condition, namely that the infrared brightness similarity is equal to or greater than a predetermined reference, is satisfied.
The estimation unit may include a depth value similarity calculation unit which extracts the depth value of the reference pixel from the depth image at the first viewpoint, extracts the depth value of at least one candidate pixel belonging to the third viewpoint from the depth image at the third viewpoint, and calculates a depth value similarity based on the difference between the depth values of the reference pixel and the candidate pixel; and a determination unit which estimates the candidate pixel as the corresponding pixel when the corresponding condition, namely that the depth value similarity is within a certain range or greater, is satisfied.
The estimation unit may include a topology similarity calculation unit which extracts the infrared brightness values and depth values of the pixels in a window set around the reference pixel from the depth image and the intensity image at the first viewpoint, extracts the infrared brightness values and depth values of the candidate pixels from the depth image and the intensity image at the third viewpoint, and calculates a topology similarity based on the differences in infrared brightness value and in depth value between the pixels in the window and the candidate pixels; and a determination unit which estimates the candidate pixels as corresponding pixels when the corresponding condition, namely that the topology similarity is within a certain range or greater, is satisfied.
The estimation unit may include a verification unit which verifies symmetry, that is, whether the reference pixel of the depth image at the first viewpoint is in turn estimated from the corresponding pixel of the depth image at the third viewpoint.
The estimation unit may include a downscaling unit which downscales the intensity image and the depth image at the first viewpoint and the intensity image and the depth image at the third viewpoint to generate a pyramid, and a change amount calculation unit which calculates an optical flow starting from the smallest image of the pyramid and calculates the amount of change in coordinates from the reference pixel of the depth image at the first viewpoint to the corresponding pixel at the third viewpoint.
The estimation unit may estimate, as the corresponding pixel at the third viewpoint, a pixel whose infrared brightness value differs from that of the reference pixel in the intensity image at the first viewpoint within a certain range, whose depth value differs from that of the reference pixel in the depth image at the first viewpoint within a certain range, and for which the topology of the pixels in the set window is similar.
The depth image generation unit may include an interpolation unit which calculates the position of the current pixel belonging to the depth image at the second viewpoint and the depth value of the current pixel by linear interpolation, using the coordinates of the reference pixel of the depth image at the first viewpoint, the coordinates of the corresponding pixel of the depth image at the third viewpoint, the amount of change in coordinates from the reference pixel to the corresponding pixel, the depth value of the reference pixel, and the depth value of the corresponding pixel.
If the difference between the depth value of the reference pixel and the depth value of the corresponding pixel is greater than a threshold value, the interpolation unit may leave a hole pixel without calculating the depth value of the current pixel belonging to the depth image at the second time point.
The interpolation unit may calculate the average of a plurality of depth values mapped to a current pixel belonging to the depth image at the second viewpoint as the representative depth value of the current pixel.
The compensation unit may calculate the depth value of the hole pixel as a weighted sum of the depth values of the neighboring pixels, increasing the weight as the coordinates of a neighboring pixel are closer to the coordinates of the hole pixel.
The compensation unit may calculate the depth value of the hole pixel as a weighted sum of the depth values of the neighboring pixels, increasing the weight as the color value of a neighboring pixel, extracted from the color image at the second time point, is closer to that of the hole pixel.
The compensation unit may calculate the depth value of the hole pixel as a weighted sum of the depth values of the neighboring pixels, increasing the weight both as the coordinates of a neighboring pixel are closer to those of the hole pixel and as its color value is closer to that of the hole pixel.
According to an aspect of the present invention, an image processing method includes: estimating, among the pixels belonging to the depth image at a third viewpoint, a corresponding pixel which corresponds to a reference pixel in the depth image at a first viewpoint, based on a predetermined corresponding condition, using the depth image at the first viewpoint and the depth image at the third viewpoint; and generating a current pixel belonging to the depth image at a second viewpoint using the reference pixel and the corresponding pixel.
In another aspect, the image processing method may further include generating a depth value of a hole pixel in which a depth value is not calculated among current pixels, based on a depth value of peripheral pixels of the hole pixel.
The estimating may include: extracting the infrared brightness value of the reference pixel from the intensity image at the first time point; extracting the infrared brightness value of one or more candidate pixels belonging to the third time point from the intensity image at the third time point; calculating an infrared brightness similarity based on the difference between the infrared brightness values of the reference pixel and the candidate pixel; and estimating the candidate pixel as the corresponding pixel when the infrared brightness similarity is equal to or greater than a predetermined reference.
The estimating may include: extracting the depth value of the reference pixel from the depth image at the first viewpoint; extracting the depth value of at least one candidate pixel belonging to the third viewpoint from the depth image at the third viewpoint; calculating a depth value similarity based on the difference between the depth values of the reference pixel and the candidate pixel; and estimating the candidate pixel as the corresponding pixel when the depth value similarity is within a certain range or greater.
The estimating may include: extracting the infrared brightness values and depth values of the pixels in a window set around the reference pixel from the depth image and the intensity image at the first viewpoint; extracting the infrared brightness values and depth values of the candidate pixels from the depth image and the intensity image at the third viewpoint; calculating a topology similarity based on the differences in infrared brightness value and in depth value between the pixels in the window and the candidate pixels; and estimating the candidate pixels as corresponding pixels when the topology similarity is within a certain range or greater.
The generating may include calculating the position of the current pixel belonging to the depth image at the second viewpoint and the depth value of the current pixel by linear interpolation, using the coordinates of the reference pixel of the depth image at the first viewpoint, the coordinates of the corresponding pixel of the depth image at the third viewpoint, the amount of change in coordinates from the reference pixel to the corresponding pixel, the depth value of the reference pixel, and the depth value of the corresponding pixel.
In the compensating, the depth value of the hole pixel may be calculated as a weighted sum of the depth values of the neighboring pixels, increasing the weight as the coordinates of a neighboring pixel are closer to those of the hole pixel and as its color value is closer to that of the hole pixel.
FIG. 1 is a diagram illustrating the concept of generating a depth image in an image processing apparatus according to an exemplary embodiment.
FIG. 2 is a block diagram of an image processing apparatus according to an embodiment.
FIG. 3 is a block diagram of an image processing apparatus according to another embodiment.
FIG. 4 is a block diagram of an image processing apparatus according to still another embodiment.
FIG. 5 is a diagram illustrating estimation of a pixel-by-pixel correspondence relationship in an image processing apparatus according to an embodiment.
FIG. 6 is a diagram illustrating a pyramid used in an image processing apparatus according to an exemplary embodiment.
FIG. 7 illustrates a depth image including hole pixels generated by an image processing apparatus according to an exemplary embodiment.
FIG. 8 is a diagram for explaining the concept of generating the depth value of a hole pixel in an image processing apparatus according to an embodiment.
FIG. 9 is a view showing a depth image before and after generation of depth values for hole pixels in an image processing apparatus according to an exemplary embodiment.
FIG. 10 is a flowchart of an image processing method according to an embodiment.
Hereinafter, embodiments according to one aspect will be described in detail with reference to the accompanying drawings.
In one embodiment, the color / depth camera emits light to obtain an intensity image and a depth image according to the intensity of the received light. The intensity image enables the identification of the object as it is measured through the intensity of the reflected and refracted rays from the object, and the depth image can indicate how far the object is from the depth camera, i.e. perspective.
FIG. 1 is a diagram illustrating the concept of generating a depth image in an image processing apparatus according to an exemplary embodiment.
In order to generate a three-dimensional or stereoscopic image, images captured from various angles may be required. Images generated from these various angles are called a multi-view image. As one of various methods of generating a multi-view image, a color image and a depth image can be acquired and used. In this case, it is preferable that the color image and the depth image are generated at the same time rather than with a time difference.
However, when color images and depth images are acquired alternately from one sensor, the color images and depth images are obtained at different points in time. In order to produce a color image and a depth image at the same point in time from such images, either a depth image corresponding to the acquisition time of the color image may be generated, or a color image corresponding to the acquisition time of the depth image may be generated.
In an indoor environment, the per-pixel change of color values in the color image is greater than the change of depth values in the depth image. Therefore, it may be easier to generate a depth image corresponding to the acquisition time of the color image.
Referring to FIG. 1, color images are acquired through the color/depth camera at t0, t2, and t4, and depth images are acquired at t1 and t3. The image processing apparatus according to the embodiment can generate a depth image corresponding to the color image at time t2 based on the correspondence between pixels in the depth images at times t1 and t3. Further, the image processing apparatus of the embodiment can compensate the generated depth image based on the color image at time t2.
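With the evenly spaced timing of FIG. 1, generating the depth frame at t2 amounts to a linear blend between the depth frames at t1 and t3. The following is a minimal sketch of that timing relation; the function and variable names are illustrative, not from the patent:

```python
def interpolation_weight(t1, t2, t3):
    """Fraction of the interval [t1, t3] at which the color frame t2
    was captured (t1 and t3 are the depth-frame capture times)."""
    return (t2 - t1) / (t3 - t1)

def interpolate_depth(z1, z3, w):
    """Linearly interpolate between two depth values with weight w."""
    return (1.0 - w) * z1 + w * z3

# Evenly spaced frames as in FIG. 1: the color frame at t2 lies
# halfway between the depth frames at t1 and t3.
w = interpolation_weight(1.0, 2.0, 3.0)
z2 = interpolate_depth(100.0, 120.0, w)
```

With equal spacing, w is 0.5 and the interpolated depth is simply the average of the two depth samples.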
FIG. 2 is a block diagram of an image processing apparatus according to an embodiment.
Referring to FIG. 2, the image processing apparatus according to an exemplary embodiment may include an estimation unit, a depth image generation unit, and a compensation unit.
The color / depth camera can sequentially take a depth image at a first point of time, a color image at a second point of time, and a depth image at a third point of time. The image processing apparatus according to one embodiment may be mounted on a color / depth camera, or may be a separate device separate from the color / depth camera.
Based on the depth image at the first viewpoint and the depth image at the third viewpoint, the estimation unit can estimate, among the pixels belonging to the depth image at the third viewpoint, a corresponding pixel which corresponds to a reference pixel in the depth image at the first viewpoint, according to a predetermined corresponding condition.
The
Corresponding conditions may include an infrared brightness similarity, a depth value similarity, a topology similarity, and the like between the reference pixel and one or more candidate pixels included in the third viewpoint. A pixel that satisfies the corresponding condition among the candidate pixels can be estimated as a corresponding pixel.
The infrared brightness similarity can be determined using the infrared brightness values of the reference pixel and the candidate pixels extracted from the intensity image; the depth value similarity can be determined using the depth values of the reference pixel and the candidate pixels extracted from the depth image; and the topology similarity can be determined using both the infrared brightness values and the depth values of the reference pixel and the candidate pixels.
For example, the
The similarity of the topology can be determined according to whether the pattern of variation of the infrared brightness value between the pixels included in the window is similar to the pattern of the variation of the depth value. Since the pixels included in the window are located adjacent to each other, there is a high possibility that the pattern of change of the infrared brightness value and the pattern of change of the depth value between the pixels are similar.
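The patent does not reproduce the exact topology-similarity formula in this text, so the following is only a hedged sketch of one way to compare the pattern of infrared brightness and depth variation between the reference window and a candidate window, using sums of absolute differences:

```python
import numpy as np

def topology_similarity(ir_ref, z_ref, ir_cand, z_cand):
    """Compare the pattern of infrared brightness and depth variation
    inside the reference window against a candidate window.  Smaller
    sums of absolute differences give a higher score in (0, 1]."""
    ir_diff = float(np.abs(ir_ref - ir_cand).sum())
    z_diff = float(np.abs(z_ref - z_cand).sum())
    return 1.0 / (1.0 + ir_diff + z_diff)

win = np.array([[10.0, 12.0], [11.0, 13.0]])    # infrared window
depth = np.array([[1.0, 1.1], [1.0, 1.2]])      # depth window
# Identical windows yield the maximal similarity of 1.0.
best = topology_similarity(win, depth, win, depth)
```

A candidate window with a different brightness pattern scores strictly lower, which is what lets the determination step threshold on the similarity.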
For example, the
The
The depth image generation unit can generate a current pixel belonging to the depth image at the second viewpoint using the reference pixel and the corresponding pixel.
The
The
The depth
The compensation unit can generate the depth value of a hole pixel, for which no depth value has been calculated among the current pixels, based on the depth values of neighboring pixels of the hole pixel.
The
The compensating
FIG. 3 is a block diagram of an image processing apparatus according to another embodiment.
Referring to FIG. 3, an image processing apparatus according to an exemplary embodiment of the present invention may include an estimation unit, a depth image generation unit, and a compensation unit.
The
The infrared brightness similarity calculation unit extracts the infrared brightness value of the reference pixel from the intensity image at the first viewpoint, extracts the infrared brightness value of at least one candidate pixel belonging to the third viewpoint from the intensity image at the third viewpoint, and calculates the infrared brightness similarity based on the difference between the infrared brightness values of the reference pixel and the candidate pixel.
The
The depth value similarity calculation unit extracts the depth value of the reference pixel from the depth image at the first viewpoint, extracts the depth value of at least one candidate pixel belonging to the third viewpoint from the depth image at the third viewpoint, and calculates the depth value similarity based on the difference between the depth values of the reference pixel and the candidate pixel.
The
The topology similarity calculation unit extracts the infrared brightness values and depth values of the pixels in a window set around the reference pixel from the depth image and the intensity image at the first viewpoint, extracts the infrared brightness values and depth values of the candidate pixels from the depth image and the intensity image at the third viewpoint, and calculates the topology similarity based on the differences in infrared brightness value and in depth value between the pixels in the window and the candidate pixels.
The
The
The estimation unit may include a verification unit which verifies symmetry, that is, whether the reference pixel of the depth image at the first viewpoint is in turn estimated from the corresponding pixel of the depth image at the third viewpoint.
When a plurality of corresponding pixels of the depth image at the third viewpoint are estimated in consideration of the infrared brightness similarity, the estimation unit can narrow them down by additionally applying the depth value similarity and the topology similarity.
The depth image generation unit can generate a current pixel belonging to the depth image at the second viewpoint using the reference pixel and the corresponding pixel.
The
The
The
If the difference between the depth value of the reference pixel and the depth value of the corresponding pixel is larger than the threshold value, the interpolation unit may leave the position as a hole pixel without calculating the depth value of the current pixel.
When a plurality of depth values are calculated at the position of the current pixel belonging to the depth image at the second viewpoint, the interpolation unit may take their average as the representative depth value of the current pixel.
The compensation unit may calculate the depth value of the hole pixel as a weighted sum of the depth values of the neighboring pixels, increasing the weight as the coordinates of a neighboring pixel are closer to those of the hole pixel.
The compensation unit may calculate the depth value of the hole pixel as a weighted sum of the depth values of the neighboring pixels, increasing the weight as the color value of a neighboring pixel, extracted from the color image at the second viewpoint, is closer to that of the hole pixel.
The compensation unit may calculate the depth value of the hole pixel as a weighted sum of the depth values of the neighboring pixels, increasing the weight both as the coordinates of a neighboring pixel are closer to those of the hole pixel and as its color value is closer to that of the hole pixel.
FIG. 4 is a block diagram of an image processing apparatus according to still another embodiment.
Referring to FIG. 4, the image processing apparatus according to an exemplary embodiment may include, in the estimation unit, a downscaling unit and a change amount calculation unit.
The
The downscaling unit downscales the intensity image and the depth image at the first viewpoint and the intensity image and the depth image at the third viewpoint to generate a pyramid of images of decreasing size.
The change amount calculation unit calculates an optical flow starting from the smallest image of the pyramid and computes the amount of change in coordinates from the reference pixel of the depth image at the first viewpoint to the corresponding pixel at the third viewpoint.
The change
The change
The change amount calculation unit can calculate the amount of change in coordinates using Equation 1 below.
[Equation 1]
Here, (x, y) are the coordinates of a pixel; (ux, uy) are the coordinates of the pixel p in the image at the first viewpoint; (wx, wy) define the range of neighboring pixels of p, that is, a window centered on p; (gx, gy) is the amount of change in the position of p; and (ux + gx, uy + gy) are the coordinates of the pixel q in the image at the third viewpoint corresponding to p. The coordinates (x + gx, y + gy) give the corresponding pixel when this correspondence is applied to the pixel (x, y). (gx, gy) is initialized to (0, 0), and (dx, dy) represents the amount of change in coordinates toward a pixel having a similar infrared brightness value.
(dx, dy) satisfying Equation (1) can be calculated using the inverse of a 2×2 matrix. α(x, y) is a weight determined by how similar the depth value of pixel (x, y) is to that of the pixel p, and takes a larger value as the depth values become more similar.
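The text states that (dx, dy) is obtained through a 2×2 matrix inverse with depth-similarity weights α, which is the structure of a weighted Lucas-Kanade step. Since Equation (1) itself is not reproduced in this text, the following is only a sketch of that structure under assumed gradient inputs:

```python
import numpy as np

def solve_flow_step(ix, iy, it, alpha):
    """Solve the 2x2 weighted normal equations A^T W A d = -A^T W b
    for a displacement (dx, dy), Lucas-Kanade style.
    ix, iy: spatial infrared gradients over the window,
    it: temporal brightness difference, alpha: depth-similarity weights."""
    ix, iy, it, w = (a.ravel().astype(float) for a in (ix, iy, it, alpha))
    A = np.stack([ix, iy], axis=1)
    lhs = A.T @ (w[:, None] * A)          # the 2x2 matrix to invert
    rhs = -(A.T @ (w * it))               # weighted right-hand side
    dx, dy = np.linalg.solve(lhs, rhs)
    return dx, dy

# Synthetic window whose gradients exactly encode a (0.5, 0.2) shift.
ix = np.ones((2, 2))
iy = np.array([[0.0, 1.0], [1.0, 0.0]])
it = -(ix * 0.5 + iy * 0.2)
dx, dy = solve_flow_step(ix, iy, it, np.ones((2, 2)))
```

Because the synthetic residuals are exactly linear in the shift, the solver recovers it exactly; with real image data the step is repeated iteratively.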
The change
The change amount calculation unit can additionally reflect the depth value similarity, for example through Equations 2 and 3 below.
[Equation 2]
[Equation 3]
Here, Z represents the depth value. If the difference is not greater than 0, the function outputs 0; otherwise, it outputs the difference between the depth values of the corresponding pixels at the first and third time points. The amount of change (ex, ey) of coordinates toward a pixel having a similar depth value, satisfying the above equation, can also be calculated using the inverse of the matrix. Finally, the vector (g'x, g'y) indicating the correspondence relationship can be updated as follows.
The change amount calculation unit can update the correspondence vector as in Equation 4 below.
[Equation 4]
For example, when the image size between levels in the pyramid structure differs by a factor of two, (g'x, g'y) estimated at the previous, smaller level is scaled to (2g'x, 2g'y) and used as the initial value for the larger image at the next level; that is, (2g'x, 2g'y) is substituted for (gx, gy) in Equation (4). By repeating this update and scaling process through all levels up to the largest image, the final correspondence relation, that is, the amount of change in coordinates from the reference pixel of the depth image at the first viewpoint to the corresponding pixel of the depth image at the third viewpoint, can be obtained.
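The coarse-to-fine loop described above can be sketched as follows; the `refine` callback stands in for the per-level flow update of Equations (1)-(4), which is not reproduced here:

```python
def upscale_flow(gx, gy, scale=2.0):
    """Moving to a pyramid level twice as large doubles all
    coordinates, so the flow vector doubles as well."""
    return scale * gx, scale * gy

def coarse_to_fine(levels, refine):
    """Run `refine(level, gx, gy)` from the smallest image to the
    largest, scaling the flow estimate between levels."""
    gx, gy = 0.0, 0.0                    # (gx, gy) starts at (0, 0)
    for i, level in enumerate(levels):
        gx, gy = refine(level, gx, gy)
        if i < len(levels) - 1:
            gx, gy = upscale_flow(gx, gy)
    return gx, gy

# A stand-in refiner that adds a constant correction at each level.
gx, gy = coarse_to_fine([0, 1, 2], lambda lvl, gx, gy: (gx + 1.0, gy + 1.0))
```

The scaling between levels is why an estimate made on a small image can seed the search on the next larger image without restarting from zero.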
The depth image generation unit can generate the current pixel belonging to the depth image at the second viewpoint using the reference pixel, the corresponding pixel, and the calculated amount of change.
The
The interpolation unit can calculate the position of the current pixel and its depth value using Equation 5 below.
[Equation 5]
Here, t1 is the first time point, t2 is the second time point, and t3 is the third time point; x(t2) and y(t2) are the x and y coordinates of the current pixel in the depth image at the second time point, and Z(t2) is the depth value of that current pixel; ux and uy are the x and y coordinates of the reference pixel in the depth image at the first time point; gx and gy are the amounts of change in the x and y coordinates; Z(t1) is the depth value of the reference pixel in the depth image at the first time point; and Z(t3) is the depth value of the corresponding pixel in the depth image at the third time point.
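Equation (5) itself is not reproduced in this text, so the following is one plausible reading of it from the variables defined above (the function name and exact form are assumptions): the reference pixel is moved a time-proportional fraction of the way along the flow, and the depth is linearly interpolated between the two depth frames.

```python
def project_to_t2(ux, uy, gx, gy, z1, z3, t1, t2, t3):
    """Assumed form of Equation (5): shift the reference pixel (ux, uy)
    by the time fraction of the flow (gx, gy) and linearly interpolate
    the depth between Z(t1) and Z(t3)."""
    w = (t2 - t1) / (t3 - t1)      # fraction of the t1->t3 interval
    x2 = ux + w * gx
    y2 = uy + w * gy
    z2 = (1.0 - w) * z1 + w * z3
    return x2, y2, z2

x2, y2, z2 = project_to_t2(10.0, 20.0, 4.0, -2.0, 100.0, 120.0, 1.0, 2.0, 3.0)
```

With evenly spaced frames the fraction is 0.5, so the pixel moves halfway along the flow and the depth is the average of the two samples.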
The
The
When the difference between the depth value of the reference pixel of the depth image at the first viewpoint and the depth value of the corresponding pixel of the depth image at the third viewpoint is greater than the threshold value, the interpolation unit may leave the position as a hole pixel without generating an interpolated value.
When a plurality of depth values are calculated for the current pixel belonging to the depth image at the second viewpoint, the interpolation unit may take the average of the depth values as the representative depth value of the current pixel.
The compensation unit may calculate the depth value of the hole pixel as a weighted sum of the depth values of the neighboring pixels, increasing the weight as the coordinates of a neighboring pixel are closer to those of the hole pixel.
The compensation unit may calculate the depth value of the hole pixel as a weighted sum of the depth values of the neighboring pixels, increasing the weight as the color value of a neighboring pixel, extracted from the color image at the second viewpoint, is closer to that of the hole pixel.
The compensation unit may calculate the depth value of the hole pixel as a weighted sum of the depth values of the neighboring pixels, increasing the weight both as the coordinates of a neighboring pixel are closer to those of the hole pixel and as its color value is closer to that of the hole pixel.
FIG. 5 is a diagram illustrating estimation of a pixel-by-pixel correspondence relationship in an image processing apparatus according to an embodiment.
Referring to FIG. 5, the image processing apparatus according to an exemplary embodiment can estimate, for each pixel, the correspondence between the intensity image and depth image at time t1 and the intensity image and depth image at time t3.
When the correspondence between pixels is estimated, the correspondence can be expressed as follows.
The following corresponding conditions may be used to estimate the pixel q of the depth image (t3) that corresponds to a pixel p of the depth image (t1).
(1) The infrared brightness values at p and q should be similar to each other.
(2) The depth values at p and q should be similar within a certain range.
(3) The pixels around p in the depth image (t1) should have a similar relationship to p.
(4) When p corresponds to q, q must also correspond to p.
Condition (3), in other words, refers to the similarity of the topology: the pattern of variation of the infrared brightness values and depth values among the pixels around p should also hold among the pixels around q.
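Conditions (1), (2), and (4) above can be sketched as simple threshold checks; the tolerance values here are illustrative assumptions, not figures from the patent, and condition (3) on window topology is omitted for brevity:

```python
def satisfies_conditions(ir_p, ir_q, z_p, z_q, flow_pq, flow_qp,
                         ir_tol=10.0, z_tol=0.3, sym_tol=1.0):
    """Check correspondence conditions (1), (2), and (4) between
    pixel p at t1 and candidate q at t3."""
    if abs(ir_p - ir_q) > ir_tol:          # (1) similar IR brightness
        return False
    if abs(z_p - z_q) > z_tol:             # (2) similar depth
        return False
    fx, fy = flow_pq                       # flow p -> q
    bx, by = flow_qp                       # flow q -> p
    # (4) symmetry: the backward flow should roughly cancel the forward flow
    return abs(fx + bx) <= sym_tol and abs(fy + by) <= sym_tol

ok = satisfies_conditions(100.0, 104.0, 1.00, 1.10, (3.0, 2.0), (-3.0, -2.0))
bad = satisfies_conditions(100.0, 150.0, 1.00, 1.10, (3.0, 2.0), (-3.0, -2.0))
```

The symmetry check mirrors the verification unit described earlier: a match is kept only if estimating from q back to p lands (approximately) on p again.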
In the middle diagram, the arrows indicate the calculated optical flow.
FIG. 6 is a diagram illustrating a pyramid used in an image processing apparatus according to an exemplary embodiment.
Referring to FIG. 6, an image processing apparatus according to an exemplary embodiment may generate a pyramid from the input intensity image and depth image at the first viewpoint and the intensity image and depth image at the third viewpoint. The pyramid contains images of different sizes, ordered from smallest to largest.
The image processing apparatus according to an exemplary embodiment may generate the smaller images through downscaling from the intensity image and depth image at the first viewpoint and the intensity image and depth image at the third viewpoint, respectively.
The image processing apparatus according to an exemplary embodiment can calculate an optical flow starting from the image having the smallest size, using conditions (1), (2), and (3) described with reference to FIG. 5.
FIG. 7 illustrates a depth image including hole pixels generated by an image processing apparatus according to an exemplary embodiment.
The image processing apparatus according to the embodiment can estimate the correspondence between the intensity image and depth image at time t1 and the intensity image and depth image at time t3.
When the correspondence between pixels is estimated, the correspondence can be expressed as follows.
The image processing apparatus according to an exemplary embodiment can linearly calculate the position and the depth value of the current pixel at time t2 from the correspondence relationship, using Equation (5) described in the section on FIG. 4.
The image processing apparatus according to an embodiment may not generate an interpolated value if the difference between the depth value Z(t1) of the reference pixel of the depth image at time t1 and the depth value Z(t3) of the corresponding pixel of the depth image at time t3 is larger than the threshold value. For example, the threshold may be 30 centimeters (cm).
In Equation (5) of FIG. 4, since gx and gy are usually calculated as decimal (non-integer) values, x(t2) and y(t2) are generally decimal values as well. The image processing apparatus according to an exemplary embodiment can obtain pixel positions with integer coordinates by rounding the decimal values, and the rounded integer values of x(t2) and y(t2) range from 0 to N.
Referring to FIG. 7, a plurality of calculated pixel positions may fall on the same pixel of the generated depth image, while some pixels receive no calculated position at all and remain hole pixels.
Here, Z1(t2) denotes the depth value of one of the plurality of pixel positions mapped to the same pixel at the second time point.
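When several interpolated positions round to the same integer pixel, the interpolation unit described earlier takes their average as the representative depth. A minimal sketch of that averaging (the names are illustrative):

```python
def representative_depth(depths):
    """Mean of the depth values Z1(t2), Z2(t2), ... that landed on the
    same integer pixel position after rounding."""
    return sum(depths) / len(depths)

rep = representative_depth([100.0, 110.0, 120.0])
```

Pixels that receive no value at all are the hole pixels handled by the compensation step below.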
FIG. 8 is a diagram for explaining the concept of generating the depth value of a hole pixel in an image processing apparatus according to an embodiment.
Referring to FIG. 8, the depth value of the hole pixel can be generated based on the depth values of its neighboring pixels.
The image processing apparatus according to an exemplary embodiment may give larger weights to neighboring pixels that are spatially closer to the hole pixel and whose color values are closer to that of the hole pixel, and may calculate the depth value of the hole pixel as the weighted sum of their depth values, as in Equation 6 below.
[Equation 6]
Here, j denotes a neighboring pixel of the hole pixel, and the weight of pixel j reflects its spatial distance from the hole pixel and the difference between its color value and that of the hole pixel.
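Since Equation (6) is not reproduced in this text, the following is a hedged sketch of such a joint spatial/color weighted sum, using Gaussian weights; the sigma values and tuple layout are assumptions for illustration:

```python
import math

def fill_hole_depth(hole_xy, hole_color, neighbors,
                    sigma_d=2.0, sigma_c=10.0):
    """Assumed form of Equation (6): weight each neighbor j by spatial
    closeness to the hole pixel and by color closeness in the color
    image at t2, then average the neighbor depths with those weights.
    neighbors: list of (x, y, color, depth) tuples."""
    hx, hy = hole_xy
    num = den = 0.0
    for x, y, c, z in neighbors:
        dist2 = (x - hx) ** 2 + (y - hy) ** 2
        w = (math.exp(-dist2 / (2.0 * sigma_d ** 2))
             * math.exp(-((c - hole_color) ** 2) / (2.0 * sigma_c ** 2)))
        num += w * z
        den += w
    return num / den

# Two neighbors at equal distance and equal color get equal weight,
# so the filled depth is their plain average.
z_hole = fill_hole_depth((0.0, 0.0), 50.0,
                         [(0.0, 1.0, 50.0, 100.0), (0.0, -1.0, 50.0, 200.0)])
```

This is the same structure as a joint bilateral filter: the color term keeps depth from bleeding across object boundaries visible in the color image.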
FIG. 9 is a view showing a depth image before and after generation of depth values for hole pixels in an image processing apparatus according to an exemplary embodiment.
Referring to FIG. 9, hole pixels appear in the depth image before compensation, whereas depth values have been generated for them in the depth image after compensation.
The depth value of the hole pixel can be generated using a weighted sum based method as shown in FIG. 8, or a random forest, which is a machine learning technique.
FIG. 10 is a flowchart of an image processing method according to an embodiment.
In the estimating operation, the image processing apparatus according to an embodiment estimates, among the pixels belonging to the depth image at the third viewpoint, a corresponding pixel which corresponds to a reference pixel in the depth image at the first viewpoint, based on a predetermined corresponding condition, using the depth image at the first viewpoint and the depth image at the third viewpoint.
The image processing apparatus according to an exemplary embodiment extracts the infrared brightness value of the reference pixel from the intensity image at the first time point, extracts the infrared brightness value of one or more candidate pixels belonging to the third time point from the intensity image at the third time point, and calculates the infrared brightness similarity based on the difference between the infrared brightness values of the reference pixel and the candidate pixel. When the corresponding condition is that the infrared brightness similarity is equal to or greater than a predetermined reference, the apparatus may estimate a candidate pixel satisfying the condition as the corresponding pixel.
The image processing apparatus extracts the depth value of the reference pixel from the depth image at the first viewpoint, extracts the depth value of at least one candidate pixel belonging to the third viewpoint from the depth image at the third viewpoint, and calculates the depth value similarity based on the difference between the depth values of the reference pixel and the candidate pixel. When the corresponding condition is that the depth value similarity is within a certain range or greater, the apparatus may estimate a candidate pixel satisfying the condition as the corresponding pixel.
The image processing apparatus according to an embodiment extracts the infrared brightness values and depth values of the pixels in the window set around the reference pixel from the depth image and the intensity image at the first viewpoint, extracts those of the candidate pixels from the depth image and the intensity image at the third viewpoint, and calculates the topology similarity based on the differences in infrared brightness value and in depth value between the pixels in the window and the candidate pixels. When the corresponding condition is that the topology similarity is within a certain range or greater, the apparatus may estimate candidate pixels satisfying the condition as corresponding pixels.
In the generating operation, the image processing apparatus generates a current pixel belonging to the depth image at the second viewpoint using the reference pixel and the corresponding pixel.
The image processing apparatus according to an embodiment calculates the position of the current pixel belonging to the depth image at the second viewpoint and the depth value of the current pixel by linear interpolation, using the coordinates of the reference pixel of the depth image at the first viewpoint, the coordinates of the corresponding pixel of the depth image at the third viewpoint, the amount of change in coordinates from the reference pixel to the corresponding pixel, the depth value of the reference pixel, and the depth value of the corresponding pixel.
The image processing apparatus according to an exemplary embodiment may generate a depth value of a hole pixel for which a depth value is not calculated among current pixels based on a depth value of neighboring pixels of the hole pixel.
The image processing apparatus according to an exemplary embodiment increases the weight of a neighboring pixel as its coordinates are closer to those of the hole pixel and as its color value is closer to that of the hole pixel, and calculates the depth value of the hole pixel as the weighted sum of the depth values of the neighboring pixels.
The image processing apparatus according to an embodiment estimates the pixel correspondence between the depth images captured before and after the shooting time of the color image, which are acquired alternately with the color images, and can thereby generate a depth image at the same time as the shooting time of the color image.
Also, the image processing apparatus according to an embodiment generates a hole pixel for a pixel that does not satisfy the corresponding conditions, and estimates the depth value of the hole pixel based on the color similarity with the pixel at the corresponding position in the color image, so that a depth image matching the shooting time of the color image can be generated more accurately.
In addition, the image processing apparatus according to an embodiment can provide a source of a multi-view image that can more accurately represent a three-dimensional image by correcting a time difference between a color image and a depth image.
The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and configured for the embodiments, or may be known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments. For example, appropriate results may be achieved even if the described techniques are performed in a different order than described, and/or if components of the described systems, structures, devices, and circuits are combined in a different form, or are replaced or supplemented by other components or their equivalents.
Therefore, other implementations, other embodiments, and equivalents to the claims are also within the scope of the following claims.
Claims (21)
An image processing apparatus comprising:
a depth image generating unit that generates a current pixel belonging to a depth image at a second viewpoint using the reference pixel and the corresponding pixel.
The image processing apparatus, further comprising:
a compensation unit that generates a depth value of a hole pixel, whose depth value is not calculated among the current pixels, based on depth values of peripheral pixels of the hole pixel.
The image processing apparatus, wherein the estimation unit comprises:
an infrared brightness similarity calculation unit that extracts an infrared brightness value of the reference pixel from an amplitude image at the first viewpoint, extracts an infrared brightness value of at least one candidate pixel from the brightness image at the third viewpoint, and calculates an infrared brightness similarity based on the difference between the infrared brightness values of the reference pixel and the candidate pixel; and
a corresponding pixel estimation unit that estimates the candidate pixel as the corresponding pixel when the corresponding condition is satisfied, the corresponding condition being that the infrared brightness similarity is equal to or greater than a predetermined reference.
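As an illustration of the infrared-brightness correspondence test described in the claim above, the following is a minimal sketch; the similarity mapping (reciprocal of the absolute difference) and the threshold value are assumptions for illustration, not values fixed by the patent.

```python
import numpy as np

def infrared_brightness_similarity(amp_t1, amp_t3, ref_yx, cand_yx):
    """Similarity from the absolute IR-amplitude difference between a
    reference pixel (first viewpoint) and a candidate pixel (third viewpoint)."""
    diff = abs(float(amp_t1[ref_yx]) - float(amp_t3[cand_yx]))
    return 1.0 / (1.0 + diff)  # larger when the brightness values are closer

def is_corresponding(amp_t1, amp_t3, ref_yx, cand_yx, threshold=0.3):
    # Corresponding condition: similarity at or above a predetermined reference.
    return infrared_brightness_similarity(amp_t1, amp_t3, ref_yx, cand_yx) >= threshold
```

In practice the candidate pixel with the highest similarity above the reference would be taken as the corresponding pixel.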
The image processing apparatus, wherein the estimation unit comprises:
a depth value similarity calculation unit that extracts a depth value of the reference pixel from the depth image at the first viewpoint, extracts a depth value of at least one candidate pixel from the depth image at the third viewpoint, and calculates a depth value similarity based on the depth value difference between the reference pixel and the candidate pixel; and
a corresponding pixel estimation unit that estimates the candidate pixel as the corresponding pixel when the corresponding condition is satisfied, the corresponding condition being that the depth value similarity is equal to or greater than a certain range.
The image processing apparatus, wherein the estimation unit comprises:
a topology similarity calculation unit that extracts infrared brightness values and depth values of the pixels in a window set around the reference pixel from the depth image and the brightness image at the first viewpoint, extracts infrared brightness values and depth values of candidate pixels from the depth image and the brightness image at the third viewpoint, and calculates a topology similarity based on the differences in infrared brightness value and in depth value between the pixels in the window and the candidate pixels; and
a corresponding pixel estimation unit that estimates the candidate pixels as the corresponding pixels when the corresponding condition is satisfied, the corresponding condition being that the topology similarity is equal to or greater than a certain range.
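The window-based topology comparison above might be sketched as follows; the window size, the equal weighting of brightness and depth terms, and the cost-to-similarity mapping are assumptions for illustration.

```python
import numpy as np

def topology_similarity(amp1, depth1, amp3, depth3, ref_yx, cand_yx,
                        half=1, w_amp=1.0, w_depth=1.0):
    """Compare the window around the reference pixel (first viewpoint) with
    the window around a candidate pixel (third viewpoint), using both the
    infrared brightness differences and the depth value differences."""
    (ry, rx), (cy, cx) = ref_yx, cand_yx
    win_a1 = amp1[ry - half:ry + half + 1, rx - half:rx + half + 1].astype(float)
    win_d1 = depth1[ry - half:ry + half + 1, rx - half:rx + half + 1].astype(float)
    win_a3 = amp3[cy - half:cy + half + 1, cx - half:cx + half + 1].astype(float)
    win_d3 = depth3[cy - half:cy + half + 1, cx - half:cx + half + 1].astype(float)
    # Weighted mean absolute difference over the window, in both modalities.
    cost = (w_amp * np.abs(win_a1 - win_a3).mean()
            + w_depth * np.abs(win_d1 - win_d3).mean())
    return 1.0 / (1.0 + cost)  # 1.0 for identical neighborhoods
```

The candidate whose neighborhood yields the highest similarity would then satisfy the corresponding condition when it exceeds the required range.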
The image processing apparatus, wherein the estimation unit further comprises:
a verification unit that verifies whether the reference pixel of the depth image at the first viewpoint is estimated back from the corresponding pixel of the depth image at the third viewpoint.
The image processing apparatus, wherein the estimation unit comprises:
a downscaling unit that downscales the brightness image and the depth image at the first viewpoint and the brightness image and the depth image at the third viewpoint to generate a pyramid; and
a change amount calculation unit that calculates an optical flow for each image forming the pyramid, starting from the smallest image of the pyramid, and thereby calculates the change in coordinates from the reference pixel of the depth image at the first viewpoint to the corresponding pixel at the third viewpoint.
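The pyramid construction above can be sketched as follows. This is a minimal illustration: the 2x2 block averaging, the brute-force integer matching step, and the search radius are assumptions standing in for the patent's actual optical-flow computation; in a coarse-to-fine scheme the flow found at the smallest level would be doubled and refined at each finer level.

```python
import numpy as np

def build_pyramid(img, levels=3):
    """Downscale by a factor of 2 per level via 2x2 block averaging;
    the smallest image is the last element of the list."""
    pyr = [np.asarray(img, dtype=float)]
    for _ in range(levels - 1):
        a = pyr[-1]
        h, w = (a.shape[0] // 2) * 2, (a.shape[1] // 2) * 2
        a = a[:h, :w]  # crop to even dimensions before pooling
        pyr.append((a[0::2, 0::2] + a[1::2, 0::2] +
                    a[0::2, 1::2] + a[1::2, 1::2]) / 4.0)
    return pyr

def block_flow(a, b, radius=1):
    """Brute-force integer displacement: for each pixel of `a` (first
    viewpoint), pick the shift within `radius` whose pixel in `b`
    (third viewpoint) matches best; a stand-in for a real flow step."""
    h, w = a.shape
    flow = np.zeros((h, w, 2), dtype=int)
    for y in range(h):
        for x in range(w):
            best, arg = float("inf"), (0, 0)
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        cost = abs(a[y, x] - b[yy, xx])
                        if cost < best:
                            best, arg = cost, (dy, dx)
            flow[y, x] = arg
    return flow
```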
The image processing apparatus, wherein the estimation unit estimates, as the corresponding pixel at the third viewpoint, a pixel whose infrared brightness difference from the reference pixel of the brightness image at the first viewpoint is within a certain range, whose depth value difference from the reference pixel of the depth image at the first viewpoint is within a certain range, and whose surrounding pixel topology is similar to that of the reference pixel.
The image processing apparatus, wherein the depth image generation unit comprises:
an interpolation unit that linearly interpolates the position and depth value of the reference pixel with the position and depth value of the corresponding pixel to calculate the position of the current pixel belonging to the depth image at the second viewpoint and the depth value of the current pixel.
The image processing apparatus, wherein the interpolation unit generates a hole pixel, without calculating a depth value, for the current pixel belonging to the depth image at the second viewpoint when the difference between the depth value of the reference pixel and the depth value of the corresponding pixel is greater than a threshold value.
The image processing apparatus, wherein the interpolation unit calculates the average of a plurality of depth values falling on a current pixel belonging to the depth image at the second viewpoint as the representative depth value of the current pixel.
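The interpolation behavior in the claims above (linear interpolation of position and depth, hole generation past a threshold, and averaging of multiple depths) might be sketched as follows; the midpoint time `t=0.5` and the `max_gap` threshold are illustrative assumptions, since the patent leaves these values open.

```python
def interpolate_current_pixel(ref_pos, ref_depth, cor_pos, cor_depth,
                              t=0.5, max_gap=200.0):
    """Linearly interpolate the reference pixel (first viewpoint) and its
    corresponding pixel (third viewpoint) to place a current pixel at the
    intermediate second viewpoint. Returns None for a hole pixel when the
    depth gap exceeds the threshold."""
    if abs(ref_depth - cor_depth) > max_gap:
        return None  # hole pixel: depth value intentionally left uncalculated
    y = (1 - t) * ref_pos[0] + t * cor_pos[0]
    x = (1 - t) * ref_pos[1] + t * cor_pos[1]
    d = (1 - t) * ref_depth + t * cor_depth
    return (y, x, d)

def representative_depth(depth_values):
    """When several interpolated depths land on one current pixel,
    use their average as the representative depth value."""
    return sum(depth_values) / len(depth_values)
```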
The image processing apparatus, wherein the compensation unit calculates the depth value of the hole pixel as a weighted sum of the depth values of the surrounding pixels, with the weight increasing as the distance between the coordinates of the hole pixel and the coordinates of a surrounding pixel decreases.
The image processing apparatus, wherein the compensation unit calculates the depth value of the hole pixel as a weighted sum of the depth values of the surrounding pixels, with the weight increasing as the color values of the hole pixel and a surrounding pixel, extracted from the color image at the second viewpoint, become closer.
The image processing apparatus, wherein the compensation unit calculates the depth value of the hole pixel as a weighted sum of the depth values of the surrounding pixels, with the weight increasing as the distance between the coordinates of the hole pixel and the coordinates of a surrounding pixel decreases and as their color values become closer.
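The combined spatial-and-color weighting in the claim above resembles a joint bilateral filter; the following is a minimal sketch under assumed Gaussian/exponential weight kernels and parameter values, with holes represented as NaN in a float depth map.

```python
import numpy as np

def fill_hole(depth, color, hole_yx, radius=2, sigma_s=2.0, sigma_c=10.0):
    """Fill a hole pixel with a weighted sum of surrounding depth values:
    the weight grows as the spatial distance to the hole shrinks and as the
    color (from the second-viewpoint color image) gets closer."""
    hy, hx = hole_yx
    h, w = depth.shape
    num = den = 0.0
    for y in range(max(0, hy - radius), min(h, hy + radius + 1)):
        for x in range(max(0, hx - radius), min(w, hx + radius + 1)):
            if np.isnan(depth[y, x]):  # skip the hole itself and other holes
                continue
            w_s = np.exp(-((y - hy) ** 2 + (x - hx) ** 2) / (2 * sigma_s ** 2))
            w_c = np.exp(-abs(float(color[y, x]) - float(color[hy, hx])) / sigma_c)
            num += w_s * w_c * depth[y, x]
            den += w_s * w_c
    return num / den if den > 0 else float("nan")
```

Because the color weight suppresses neighbors across object boundaries, the filled depth tends to follow the edges of the color image rather than blurring across them.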
An image processing method comprising:
generating a current pixel belonging to a depth image at a second viewpoint using the reference pixel and the corresponding pixel.
The image processing method, further comprising:
generating a depth value of a hole pixel, whose depth value is not calculated among the current pixels, based on depth values of the surrounding pixels of the hole pixel.
The image processing method, wherein the estimating step comprises:
extracting an infrared brightness value of the reference pixel from an amplitude image at the first viewpoint;
extracting an infrared brightness value of at least one candidate pixel from the brightness image at the third viewpoint;
calculating an infrared brightness similarity based on the difference in infrared brightness value between the reference pixel and the candidate pixel; and
estimating the candidate pixel as the corresponding pixel when the corresponding condition is satisfied, the corresponding condition being that the infrared brightness similarity is equal to or greater than a predetermined reference.
The image processing method, wherein the estimating step comprises:
extracting a depth value of the reference pixel from the depth image at the first viewpoint;
extracting a depth value of at least one candidate pixel from the depth image at the third viewpoint;
calculating a depth value similarity based on the depth value difference between the reference pixel and the candidate pixel; and
estimating the candidate pixel as the corresponding pixel when the corresponding condition is satisfied, the corresponding condition being that the depth value similarity is equal to or greater than a certain range.
The image processing method, wherein the estimating step comprises:
extracting infrared brightness values and depth values of the pixels in a window set around the reference pixel from the depth image and the brightness image at the first viewpoint;
extracting infrared brightness values and depth values of candidate pixels from the depth image and the brightness image at the third viewpoint;
calculating a topology similarity based on the differences in infrared brightness value and in depth value between the pixels in the window and the candidate pixels; and
estimating the candidate pixels as the corresponding pixels when the corresponding condition is satisfied, the corresponding condition being that the topology similarity is equal to or greater than a certain range.
The image processing method, wherein the generating step comprises:
calculating the position of the current pixel belonging to the depth image at the second viewpoint and the depth value of the current pixel by linearly interpolating the position and depth value of the reference pixel with the position and depth value of the corresponding pixel.
The image processing method, wherein the step of generating the depth value of the hole pixel calculates the depth value of the hole pixel as a weighted sum of the depth values of the neighboring pixels, with the weight increasing as the distance between the coordinates of the hole pixel and the coordinates of a neighboring pixel decreases and as their color values become closer.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020120134264A KR20140067253A (en) | 2012-11-26 | 2012-11-26 | Image processing apparatus and method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20140067253A true KR20140067253A (en) | 2014-06-05 |
Family
ID=51123531
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020120134264A KR20140067253A (en) | 2012-11-26 | 2012-11-26 | Image processing apparatus and method thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20140067253A (en) |
- 2012-11-26: Application KR1020120134264A filed in KR, published as KR20140067253A, status not active (Application Discontinuation)
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160047891A (en) * | 2014-10-23 | 2016-05-03 | 삼성전자주식회사 | Electronic device and method for processing image |
US10970865B2 (en) | 2014-10-23 | 2021-04-06 | Samsung Electronics Co., Ltd. | Electronic device and method for applying image effect to images obtained using image sensor |
US11455738B2 (en) | 2014-10-23 | 2022-09-27 | Samsung Electronics Co., Ltd. | Electronic device and method for applying image effect to images obtained using image sensor |
KR20160123871A (en) * | 2015-04-17 | 2016-10-26 | 삼성전자주식회사 | Method and apparatus for estimating image optical flow |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6244407B2 (en) | Improved depth measurement quality | |
KR101419979B1 (en) | Method and system for converting 2d image data to stereoscopic image data | |
KR101742120B1 (en) | Apparatus and method for image processing | |
WO2020039166A1 (en) | Method and system for reconstructing colour and depth information of a scene | |
CN105551020B (en) | A kind of method and device detecting object size | |
JP2007000205A (en) | Image processing apparatus, image processing method, and image processing program | |
JP2018151689A (en) | Image processing apparatus, control method thereof, program and storage medium | |
KR20170091496A (en) | Method and apparatus for processing binocular image | |
EP2291825A1 (en) | System and method for depth extraction of images with forward and backward depth prediction | |
JP2011081605A (en) | Image processing apparatus, method and program | |
JP2007053621A (en) | Image generating apparatus | |
KR101125061B1 (en) | A Method For Transforming 2D Video To 3D Video By Using LDI Method | |
KR101853215B1 (en) | Coding Device and Method and Depth Information Compensation by Plane Modeling | |
KR101027003B1 (en) | Stereo matching apparatus and its method | |
JP4985542B2 (en) | Corresponding point search device | |
WO2015198592A1 (en) | Information processing device, information processing method, and information processing program | |
KR20140000833A (en) | Stereo matching apparatus and its method | |
JP6079076B2 (en) | Object tracking device and object tracking method | |
JP6991700B2 (en) | Information processing equipment, information processing method, program | |
KR101888969B1 (en) | Stereo matching apparatus using image property | |
KR20140067253A (en) | Image processing apparatus and method thereof | |
KR102240570B1 (en) | Method and apparatus for generating spanning tree,method and apparatus for stereo matching,method and apparatus for up-sampling,and method and apparatus for generating reference pixel | |
KR101435611B1 (en) | Occlusion removal method for three dimensional integral image | |
JP2015033047A (en) | Depth estimation device employing plural cameras | |
JP4985863B2 (en) | Corresponding point search device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WITN | Withdrawal due to no request for examination |