KR20140010856A - Image processing apparatus and method - Google Patents
- Publication number
- KR20140010856A (Application KR1020120131505A)
- Authority
- KR
- South Korea
- Prior art keywords
- depth
- image
- contrast
- luminance value
- value
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Generation (AREA)
Abstract
Description
The following description relates to an image processing apparatus and method, and more particularly to an image processing apparatus and method for adjusting the depth of a three-dimensional (3D) image.
A 3D display can be highly realistic, but it may suffer from image-quality degradation or viewing fatigue. Therefore, there is a need for a technique that readjusts the depth so as to minimize viewing fatigue while minimizing image-quality degradation of the 3D display.
US Patent 7,557,824, "Method and apparatus for generating a stereoscopic image," one conventional method for readjusting depth, obtains a region of interest (ROI) using the depth information and viewing distance of the image and, based on the obtained ROI, proposes a depth mapping that readjusts the sense of depth in the near and far regions.
According to an aspect, an image processing apparatus includes: an object separator to separate an object region and a background region using at least one of a color image and a depth image associated with the color image; a contrast calculator to calculate a contrast between the object region and the background region; and a depth adjuster to adjust the depth of the depth image using the contrast.
According to an embodiment, the object separator may separate the object region from the background region to generate a mask and store the mask in a buffer.
According to an embodiment, the object separator may separate the object region from the background region using a visual attention map based on a human visual cognitive characteristic model.
According to another embodiment, the object separator may separate the object region from the background region by using a region of interest.
According to another embodiment, the object separator may segment depth values into levels using the depth image and remove a horizontal plane using a horizontal-plane equation, thereby separating the object region from the background region.
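The depth-based separation described above can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the depth range, object threshold, plane coefficients, and tolerance are all assumed values, and a real system would estimate the plane from the data.

```python
# Sketch of depth-based object/background separation (illustrative only).
# Assumptions: depth is a 2D list of values in [0, 255]; pixels whose depth
# lies near an assumed horizontal plane z = a*row + b are treated as floor
# and removed; remaining pixels at or above object_level form the object.

def separate_object(depth, object_level=128, a=2.0, b=10.0, plane_tol=5.0):
    rows, cols = len(depth), len(depth[0])
    mask = [[0] * cols for _ in range(rows)]  # 1 = object, 0 = background
    for r in range(rows):
        for c in range(cols):
            d = depth[r][c]
            on_plane = abs(d - (a * r + b)) < plane_tol  # horizontal-plane test
            if d >= object_level and not on_plane:
                mask[r][c] = 1
    return mask
```

In practice the plane parameters would come from fitting a plane equation to the depth data rather than from fixed constants.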
The contrast calculator may include a color value converter configured to convert color values of at least one pixel of the input color image into luminance values reflecting display characteristics associated with the image processing apparatus.
According to an embodiment, the color value converter may convert a color value of the at least one pixel into a luminance value reflecting the display characteristic by using a PLCC model (Piecewise Linear interpolation assuming Constant Chromaticity coordinates).
In this case, the contrast calculator may calculate a first luminance value representing the luminance of the object region and a second luminance value representing the luminance of the background region, and may calculate the contrast based on the difference between the first luminance value and the second luminance value.
According to an embodiment, the contrast calculator may calculate the contrast by applying the first luminance value and the second luminance value to a Michelson contrast calculation method.
The depth adjuster may include: a depth normalizer configured to normalize the depth image by using a maximum depth value that can be represented by a display associated with the image processing apparatus; A scale factor determiner that determines a scale factor using the contrast; And a depth rescaling unit to rescale the depth value of the depth image using the scale factor.
In this case, the scale factor may be determined to be smaller as the contrast becomes larger, using a database built by theoretically and/or experimentally measuring the just noticeable difference (JND) of depth values.
The scale factor determination unit may determine a predetermined scale factor corresponding to a contrast level including the contrast among a plurality of predetermined contrast levels.
The image processing apparatus may further include an image renderer configured to render a 3D image corresponding to at least one viewpoint based on the color image and the result of the depth adjuster adjusting the depth of the depth image.
In this case, the image renderer may render the 3D image by using a depth image based rendering (DIBR) technique.
According to another aspect, an image processing method includes: separating an object region and a background region using at least one of a color image and a depth image associated with the color image; calculating a contrast between the object region and the background region; and adjusting the depth of the depth image using the contrast.
The calculating of the contrast may include converting a color value of at least one pixel of the input color image into a luminance value reflecting a display characteristic associated with the image processing apparatus.
In this case, the converting of the color values may include converting the color values of the at least one pixel into luminance values reflecting the display characteristics by using a PLCC model (Piecewise Linear interpolation assuming Constant Chromaticity coordinates). The calculating may include calculating a first luminance value representing the luminance of the object region and a second luminance value representing the luminance of the background region, and calculating the contrast based on the difference between the first luminance value and the second luminance value.
The adjusting of the depth may include: normalizing the depth image using a maximum depth value that can be represented by a display associated with the image processing apparatus; Determining a scale factor using the contrast; And rescaling a depth value of the depth image by using the scale factor.
In this case, the scale factor may be determined to be smaller as the contrast becomes larger, using a database built by experimentally measuring the just noticeable difference (JND) of depth values.
1 is a block diagram of an image processing apparatus according to an embodiment.
2 is a conceptual diagram referred to for describing an image input to an image processing apparatus, according to an exemplary embodiment.
3 is a conceptual diagram illustrating a result of separation of an object region and a background region, according to an exemplary embodiment.
4 is a conceptual diagram illustrating a process of separating an object region and a background region and storing them in a buffer according to an embodiment.
5 is a block diagram illustrating a process of performing a color value conversion by a color value conversion unit according to an embodiment.
6 is a detailed block diagram of a depth adjuster according to an exemplary embodiment.
7 is a block diagram illustrating an image processing apparatus according to another exemplary embodiment.
8 is an exemplary flowchart illustrating an image processing method, according to an exemplary embodiment.
In the following, some embodiments will be described in detail with reference to the accompanying drawings. However, the present disclosure is not limited to or by these embodiments. Like reference symbols in the drawings denote like elements.
1 is a block diagram of an image processing apparatus according to an embodiment.
According to an exemplary embodiment, the image processing apparatus may include an object separator, a contrast calculator, and a depth adjuster.
Various embodiments may exist in the method of separating the object region and the background region by the object separator.
The contrast calculator calculates a contrast between the separated object region and the background region.
The depth adjuster 130 readjusts the depth of the input image by using the calculated contrast. This process may be referred to as re-scaling below.
According to an embodiment, the depth adjustment may reflect human visual cognitive characteristics.
Three-dimensional perception, as understood in terms of human visual cognitive characteristics, can be regarded as a mechanism in which reality is reproduced by combining the principle of binocular disparity with empirical perceptual factors.
In embodiments, one of these human cognitive traits may be exploited based on the understanding that human vision is sensitive to brightness. In such embodiments, contrast masking is used to calculate the contrast of the input image, and the minimum depth difference found theoretically and/or experimentally for each contrast is stored in a database. When the contrast between the object region and the background region of the input image is calculated, the depth is adjusted using this database.
Various embodiments of the depth adjustment process will be described later in more detail with reference to FIGS. 5 to 6.
According to the embodiments, viewing fatigue may thus be reduced while minimizing deterioration of image quality.
2 is a conceptual diagram referred to for describing an image input to an image processing apparatus, according to an exemplary embodiment.
The input image may include a color image and a depth image associated with the color image. The depth image may be matched to the color image, with each depth value corresponding to a pixel or region of the color image. Furthermore, according to another exemplary embodiment, another color image (not shown) different from the color image may be input instead of the depth image, in which case the depth image may be estimated from the two color images. Therefore, hereinafter, an embodiment in which the color image and the depth image are input together will be described, but embodiments are not limited thereto.
The depth of the input image, if displayed as-is, may cause viewing fatigue or image-quality degradation. Therefore, a process of readjusting the depth may be required. In the past, the depth was often adjusted without considering human visual perception.
According to an embodiment, the object separator separates the object region and the background region of the input image before the depth is adjusted. According to an exemplary embodiment, the separation result may be generated as a mask and stored in a buffer.
3 is a conceptual diagram illustrating a result of separating an object region and a background region, according to an exemplary embodiment.
The object separator separates the object region from the background region of the input image. There are various embodiments in which the object separator may perform this separation.
For example, a background and an object may be distinguished by using a visual attention map (not shown) generated based on a human visual perception characteristic.
According to another embodiment, region segmentation using a region of interest (ROI), which may be predetermined or determined in real time, may be used.
Furthermore, according to another exemplary embodiment, pixels having a depth estimated as the depth of the object may be determined using depth information of the depth image.
In this embodiment, the pixels whose depth is estimated as the depth of the object are separated from the background region, and a horizontal plane such as a floor may be removed using a horizontal-plane equation.
In addition, various pre-processing and/or post-processing steps may be added to separate the object region and the background region more accurately.
The result of separating the object region and the background region may be stored in a buffer in the form of a mask.
4 is a conceptual diagram illustrating a process in which an object region and a background region are separated and stored in the buffer, according to an embodiment.
For example, the separation result may be stored in the buffer as binary data. According to an embodiment, one bit of binary data stored in the buffer may correspond to one pixel of the input image, for example with a value of 1 for the object region and 0 for the background region. However, this is only an example; in order to speed up image processing, or for other purposes, one bit of binary data stored in the buffer may instead correspond to a plurality of pixels.
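A one-bit-per-pixel mask buffer of this kind can be sketched as follows. The buffer layout (MSB-first packing) is an assumption for illustration; the patent leaves the exact storage format open.

```python
# Pack a binary object/background mask into a byte buffer, one bit per
# pixel (illustrative layout; the text does not fix a specific format).

def pack_mask(mask_bits):
    """mask_bits: flat list of 0/1 values, one entry per pixel."""
    buf = bytearray((len(mask_bits) + 7) // 8)
    for i, bit in enumerate(mask_bits):
        if bit:
            buf[i // 8] |= 1 << (7 - i % 8)  # MSB-first within each byte
    return bytes(buf)

def unpack_mask(buf, n_pixels):
    """Recover the flat 0/1 mask from the packed buffer."""
    return [(buf[i // 8] >> (7 - i % 8)) & 1 for i in range(n_pixels)]
```

Such packing keeps the buffer small (one eighth of a byte-per-pixel mask), which matters when the mask must be stored for every frame.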
The result of separating the object region and the background region is then used to calculate the contrast between the two regions.
5 is a block diagram illustrating a process of performing a color value conversion by a color value conversion unit according to an embodiment.
The color value converter converts the color values (for example, R, G, and B values) of the pixels of the input color image into luminance values reflecting the display characteristics. According to an embodiment, a PLCC model (Piecewise Linear interpolation assuming Constant Chromaticity coordinates) may be used in the conversion. The PLCC model may convert input R, G, and B values into X, Y, and Z values using previously measured values reflecting the display characteristics associated with the image processing apparatus.
At least some of these values are obtained by measuring the display characteristics in advance. For example, X, Y, and Z values may be measured for R, G, and B levels sampled over the range 0 to 255, together with the display's black level.
The Y value calculated by this process may be used as the luminance value. When the luminance value representing the object region and the luminance value representing the background region are obtained, the contrast between the two regions can be calculated.
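A PLCC-style conversion can be sketched as below. The measurement table is invented purely for illustration; a real system would use XYZ values measured from the target display at sampled digital levels, and would combine the three channel contributions (minus the doubly counted black level, under the constant-chromaticity assumption).

```python
# Sketch of a PLCC-style digital-value-to-luminance conversion.
# MEASURED_Y is an assumed per-channel measurement table of
# (digital level, measured luminance Y) pairs for one channel.

MEASURED_Y = [(0, 0.0), (128, 5.0), (255, 21.0)]  # illustrative values

def plcc_channel_Y(level, table):
    """Piecewise-linearly interpolate the measured Y for one channel."""
    for (l0, y0), (l1, y1) in zip(table, table[1:]):
        if l0 <= level <= l1:
            t = (level - l0) / (l1 - l0)
            return y0 + t * (y1 - y0)
    raise ValueError("digital level out of measured range")

def plcc_pixel_Y(r, g, b, table_r, table_g, table_b, y_black=0.0):
    """Sum the channel luminances, subtracting the twice-counted black level."""
    return (plcc_channel_Y(r, table_r) + plcc_channel_Y(g, table_g)
            + plcc_channel_Y(b, table_b) - 2 * y_black)
```

The representative luminance of a region would then be, for example, the mean of these per-pixel Y values over the region's mask.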
6 is a detailed block diagram of the depth adjuster according to an exemplary embodiment.
According to an embodiment, the depth adjuster may include a depth normalizer, a scale factor determiner, and a depth rescaler. The depth normalizer normalizes the depth image using the maximum depth value that can be represented by a display associated with the image processing apparatus.
Depending on the hardware characteristics or the calibration level, since the maximum depth that each of the displays can represent may be different from each other, such a depth normalization process may be performed.
When the depth normalization is performed in this way, the depth values of the input depth image are expressed relative to the maximum depth that the display can represent.
According to an embodiment, the contrast is calculated from representative luminance values, as follows.
Representative luminance values of each of the object region and the background region are determined using 'Y' values corresponding to luminance or brightness among the X, Y, and Z values derived as described with reference to FIG. 5.
As described above, according to the exemplary embodiment, the representative luminance value may be obtained as an average of Y values of pixels belonging to each region, or may be calculated using a Y value distribution frequency of the pixels.
According to an embodiment, the contrast may be calculated using the Michelson contrast calculation method of Equation 2. Referring to Equation 2, the Michelson contrast C of the input image may be calculated as C = (Ymax − Ymin) / (Ymax + Ymin), where Ymax is the larger of the representative Y value of the object region and the representative Y value of the background region, and Ymin is the smaller.
However, this calculation method is only one embodiment, and other methods of calculating the contrast are not excluded.
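The Michelson contrast of Equation 2 is straightforward to compute from the two representative luminance values; a minimal sketch:

```python
# Michelson contrast between the representative luminance of the object
# region (y_obj) and of the background region (y_bg):
#   C = (Ymax - Ymin) / (Ymax + Ymin)

def michelson_contrast(y_obj, y_bg):
    y_max, y_min = max(y_obj, y_bg), min(y_obj, y_bg)
    if y_max + y_min == 0:
        return 0.0  # both regions black: define contrast as zero
    return (y_max - y_min) / (y_max + y_min)
```

The result lies in [0, 1]: 0 when the regions have equal luminance, approaching 1 when one region is much brighter than the other.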
The scale factor determiner determines a scale factor using the calculated contrast.
According to embodiments, human visual perception characteristics may be reflected in the scale factor determination process.
In terms of psychophysics, the just noticeable difference (JND) means the minimum difference in stimuli that can be distinguished by humans or animals; that is, the minimum perceptible difference.
JND was first studied by Weber, and numerous experiments were conducted to measure a brightness value L and delta L, the JND of that brightness value. Weber's law, Equation 3, states that the ratio of delta L to the brightness value L is a constant k for all L. Later researchers suggested that this constant k in fact varies with L, but Equation 3 can still be considered valid unless a very precise level of accuracy is required.
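Weber's law of Equation 3 can be written directly: ΔL / L = k. A minimal sketch, with the Weber fraction k an assumed illustrative value (the text does not give one):

```python
# Weber's law (Equation 3): delta_L / L = k, i.e. the just noticeable
# difference in brightness scales linearly with the brightness itself.

WEBER_K = 0.02  # assumed illustrative Weber fraction, not from the patent

def brightness_jnd(L):
    """Smallest perceptible brightness increment at brightness L."""
    return WEBER_K * L
```

At a brightness of 100, the JND under this assumed k is 2 units; at 50 it is 1 unit, keeping the ratio constant.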
As a result of theoretical and/or experimental studies on the correlation between brightness and depth, the larger the contrast (the difference between the brightness values of the object region and the background region), the smaller the JND of the depth value. In other words, the greater the contrast between the object region and the background region, the more sensitive a human is to a change in depth value.
According to an exemplary embodiment, based on this observation, the scale factor determiner determines the scale factor to be smaller as the calculated contrast becomes larger.
The mapping table and/or calculation data for determining the value of the scale factor according to the calculated contrast may be stored in a database in advance.
Meanwhile, according to one embodiment, the calculated contrast may be determined to be included in any one of several predetermined contrast levels.
For example, the calculated contrast may be classified into one of a plurality of predetermined contrast levels. In this case, the scale factor determiner may determine a predetermined scale factor corresponding to the contrast level that includes the calculated contrast. Then, the depth rescaler rescales the depth value of the depth image using the determined scale factor.
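The contrast-level lookup can be sketched as below. The level boundaries and scale factors are assumed values standing in for the JND-derived database the text describes; note that the factor decreases as contrast increases.

```python
# Map a calculated contrast (in [0, 1]) to a predetermined scale factor
# via contrast levels. Larger contrast -> smaller factor, per the JND
# database idea in the text. Bounds and factors are illustrative.

CONTRAST_LEVELS = [  # (exclusive upper bound of level, scale factor)
    (0.2, 1.0),
    (0.5, 0.8),
    (1.01, 0.6),
]

def scale_factor(contrast):
    for upper, factor in CONTRAST_LEVELS:
        if contrast < upper:
            return factor
    return CONTRAST_LEVELS[-1][1]  # clamp to the last level
```

A real system would populate this table from the theoretically and/or experimentally measured minimum perceptible depth differences.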
According to an embodiment, the depth rescaler performs the rescaling for each pixel of the depth image. Since the depth value of the input depth image has already been normalized, the depth adjustment may be performed by multiplying the normalized depth value by the scale factor.
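Normalization followed by rescaling can be sketched in a few lines. The function and parameter names are illustrative, not the patent's reference labels.

```python
# Normalize depth by the display's maximum representable depth, then
# rescale by the contrast-derived scale factor (illustrative sketch).

def adjust_depth(depth_values, display_max_depth, scale):
    normalized = [d / display_max_depth for d in depth_values]  # into [0, 1]
    return [n * scale for n in normalized]  # shrink range per the contrast
```

With a high-contrast scene (small scale factor), the adjusted depth range is compressed, trading a little perceived depth for reduced viewing fatigue.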
When the image is reproduced according to the adjusted depth values, the contrast of the input image and human visual perception characteristics are reflected, so that an image with less viewing fatigue can be provided while minimizing image-quality deterioration and maintaining the perceived depth.
The reproduction of the image will be described later in more detail with reference to FIG. 7.
7 is a block diagram illustrating an image processing apparatus according to another exemplary embodiment.
According to an exemplary embodiment, the image processing apparatus may further include an image renderer in addition to the object separator, the contrast calculator, and the depth adjuster. According to an embodiment, the image renderer renders a 3D image corresponding to at least one viewpoint using the color image and the result of adjusting the depth of the depth image. For example, the image renderer may use a depth image based rendering (DIBR) technique.
The process of rendering an image at any point of time using the DIBR method may be understood with reference to the following equation.
Equation 4 may be written as u_v = u_r + β·d, where u_v denotes the pixel position in an arbitrary virtual view to be obtained, u_r denotes the pixel position in the input reference view, d denotes the depth, and β denotes an arbitrary number that can be changed to set the desired view.
For example, after reprojecting an image into three-dimensional space using the depth information of one reference image and its depth image, the left image is generated by moving objects to the right relative to the original color image according to their depth, and the right image by moving objects to the left. The displacement of each object, i.e., its disparity, is determined in proportion to its depth value.
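A one-dimensional sketch of this DIBR warping, following u_v = u_r + β·d, is given below. It is deliberately minimal: real DIBR handles occlusion ordering carefully and fills the disocclusion holes (marked `None` here) by inpainting.

```python
# Minimal 1-D DIBR sketch: shift each reference pixel horizontally by a
# disparity proportional to its depth (u_v = u_r + beta * d, Equation 4).
# Holes left by disoccluded pixels keep None; real DIBR inpaints them.

def dibr_row(colors, depths, beta):
    out = [None] * len(colors)  # None marks a disocclusion hole
    for u_r, (c, d) in enumerate(zip(colors, depths)):
        u_v = u_r + round(beta * d)  # target position in the virtual view
        if 0 <= u_v < len(out):
            out[u_v] = c  # later (nearer) writes may overwrite earlier ones
    return out
```

Choosing β positive shifts pixels one way (e.g., toward a right-eye view) and negative the other, matching the left/right image generation described above.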
8 is an exemplary flowchart illustrating an image processing method, according to an exemplary embodiment.
In the object separation operation, the object region and the background region are separated using at least one of the color image and the depth image associated with the color image. In this operation, a mask distinguishing the two regions may be generated and stored in a buffer. As described above, various embodiments may exist in the method of separating the object region and the background region by the object separator, such as using a visual attention map, a region of interest, or depth-based segmentation.
Then, in the contrast calculation operation, the contrast between the object region and the background region is calculated. In this operation, the color values of the pixels of the input color image may first be converted into luminance values reflecting the display characteristics. A first luminance value representing the object region and a second luminance value representing the background region are then calculated, and the contrast is calculated based on the difference between them, for example using the Michelson contrast calculation method.
In the depth adjustment operation, the depth image is normalized using the maximum depth value that can be represented by the display, and a scale factor is determined using the calculated contrast. When the scale factor is determined and the depth values are normalized, the depth rescaler rescales the depth values of the depth image using the scale factor. According to an embodiment, a 3D image corresponding to at least one viewpoint may then be rendered using the color image and the adjusted depth image.
According to these embodiments, since the depth adjustment is made adaptively using the just noticeable difference (JND), the minimum detectable depth difference of the human visual system, viewing fatigue can be reduced without deteriorating image quality or unnecessarily flattening the sense of depth.
The apparatus described above may be implemented as hardware components, software components, and/or a combination of hardware and software components. For example, the apparatus and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions. The processing device may run an operating system (OS) and one or more software applications on the operating system. The processing device may also access, store, manipulate, process, and generate data in response to execution of the software. For ease of understanding, the processing device is sometimes described as being used singly, but those skilled in the art will recognize that the processing device may include a plurality of processing elements and/or a plurality of types of processing elements. For example, the processing device may include a plurality of processors, or one processor and one controller. Other processing configurations, such as parallel processors, are also possible.
The software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired, or may command the processing device independently or collectively. The software and/or data may be embodied permanently or temporarily in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to it. The software may be distributed over networked computer systems and stored or executed in a distributed manner. The software and data may be stored on one or more computer-readable recording media.
The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and configured for the embodiments, or may be those known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.
While the embodiments have been shown and described with reference to particular examples, the present disclosure is not limited to the disclosed embodiments. For example, the described techniques may be performed in a different order than described, and/or components of the described systems, structures, devices, or circuits may be combined in a different form than described, or may be replaced or substituted by other components or their equivalents, while still achieving appropriate results.
Therefore, other implementations, other embodiments, and equivalents to the claims are also within the scope of the following claims.
Claims (20)
An object separator for separating an object region and a background region using at least one of a color image and a depth image associated with the color image;
A contrast calculator for calculating a contrast between the object area and the background area; And
Depth adjustment unit for adjusting the depth of the depth image by using the contrast
And the image processing apparatus.
The object separator is configured to separate the object region and the background region to generate a mask to store in the buffer.
The object separator is further configured to separate the object region and the background region using a visual attention map based on a human visual cognitive characteristic model.
And the object separator is configured to separate the object region and the background region by using a region of interest.
The object separator may segment the level of the depth value using the depth image, and remove the horizontal plane by using a horizontal plane equation to separate the object area from the background area.
And the contrast calculator comprises a color value converter configured to convert color values of at least one pixel of the input color image into luminance values reflecting display characteristics associated with the image processing apparatus.
The color value converting unit converts a color value of the at least one pixel into a luminance value reflecting the display characteristic by using a PLCC model (Piecewise Linear interpolation assuming Constant Chromaticity coordinates).
The contrast calculator calculates a first luminance value representing a luminance value of the object region and a second luminance value representing a luminance value of the background region using the luminance value, and the first luminance value and the second luminance value. And calculating the contrast based on a difference in luminance value.
And the contrast calculator is configured to calculate the contrast by applying the first luminance value and the second luminance value to a Michelson contrast calculation method.
The depth adjustment unit,
A depth normalizer which normalizes the depth image by using a maximum depth value that can be represented by a display associated with the image processing apparatus;
A scale factor determiner that determines a scale factor using the contrast; And
Depth rescaling unit to rescale the depth value of the depth image using the scale factor
And the image processing apparatus.
And the scale factor is determined to be smaller as the contrast is larger, using a database built by experimentally measuring the just noticeable difference of depth values.
The scale factor determiner is configured to determine a predetermined scale factor corresponding to a contrast level including the contrast among a plurality of predetermined contrast levels.
An image renderer that renders a 3D image corresponding to at least one viewpoint by using a result of the color image and the depth adjuster adjusting the depth of the depth image.
Further comprising:
And the image renderer renders the 3D image by using a depth image based rendering (DIBR) technique.
Separating an object region and a background region using at least one of a color image and a depth image associated with the color image;
Calculating a contrast between the object area and the background area; And
Adjusting the depth of the depth image by using the contrast
And an image processing method.
The calculating of the contrast may include converting a color value of at least one pixel of the input color image into a luminance value reflecting display characteristics associated with the image processing apparatus.
The converting of the color values may include converting color values of the at least one pixel into luminance values reflecting the display characteristics by using a PLCC model (Piecewise Linear interpolation assuming Constant Chromaticity coordinates).
The calculating of the contrast may include calculating a first luminance value representing a luminance value of the object region and a second luminance value representing a luminance value of the background region using the luminance value, and calculating the first luminance value. And calculating the contrast based on a difference between the second luminance value and the second luminance value.
Adjusting the depth,
Normalizing the depth image by using a maximum depth value that can be represented by a display associated with the image processing apparatus;
Determining a scale factor using the contrast; And
Rescaling the depth value of the depth image using the scale factor
And an image processing method.
The scale factor is determined to be smaller as the contrast is larger, using a database built by experimentally measuring the just noticeable difference of depth values.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310292218.4A CN103546736B (en) | 2012-07-12 | 2013-07-12 | Image processing equipment and method |
US13/940,456 US9661296B2 (en) | 2012-07-12 | 2013-07-12 | Image processing apparatus and method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261670830P | 2012-07-12 | 2012-07-12 | |
US61/670,830 | 2012-07-12 |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20140010856A true KR20140010856A (en) | 2014-01-27 |
KR101978176B1 KR101978176B1 (en) | 2019-08-29 |
Family
ID=50143366
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020120131505A KR101978176B1 (en) | 2012-07-12 | 2012-11-20 | Image processing apparatus and method |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101978176B1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20180066479A (en) * | 2016-12-09 | 2018-06-19 | 한국전자통신연구원 | Automatic object separation method and apparatus using plenoptic refocus |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008033897A (en) * | 2006-06-29 | 2008-02-14 | Matsushita Electric Ind Co Ltd | Image processor, image processing method, program, storage medium, and integrated circuit |
KR20080015078A (en) * | 2005-06-17 | 2008-02-18 | 마이크로소프트 코포레이션 | Image segmentation |
KR20110051416A (en) * | 2009-11-10 | 2011-05-18 | 삼성전자주식회사 | Image processing apparatus and method |
KR20110059531A (en) * | 2009-11-27 | 2011-06-02 | 소니 주식회사 | Image processing apparatus, image processing method and program |
KR20110094957A (en) * | 2010-02-18 | 2011-08-24 | 중앙대학교 산학협력단 | Apparatus and method for object segmentation from range image |
JP2011223566A (en) * | 2010-04-12 | 2011-11-04 | Samsung Electronics Co Ltd | Image converting device and three-dimensional image display device including the same |
-
2012
- 2012-11-20 KR KR1020120131505A patent/KR101978176B1/en active IP Right Grant
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20080015078A (en) * | 2005-06-17 | 2008-02-18 | 마이크로소프트 코포레이션 | Image segmentation |
JP2008033897A (en) * | 2006-06-29 | 2008-02-14 | Matsushita Electric Ind Co Ltd | Image processor, image processing method, program, storage medium, and integrated circuit |
KR20110051416A (en) * | 2009-11-10 | 2011-05-18 | 삼성전자주식회사 | Image processing apparatus and method |
KR20110059531A (en) * | 2009-11-27 | 2011-06-02 | 소니 주식회사 | Image processing apparatus, image processing method and program |
KR20110094957A (en) * | 2010-02-18 | 2011-08-24 | 중앙대학교 산학협력단 | Apparatus and method for object segmentation from range image |
JP2011223566A (en) * | 2010-04-12 | 2011-11-04 | Samsung Electronics Co Ltd | Image converting device and three-dimensional image display device including the same |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20180066479A (en) * | 2016-12-09 | 2018-06-19 | 한국전자통신연구원 | Automatic object separation method and apparatus using plenoptic refocus |
Also Published As
Publication number | Publication date |
---|---|
KR101978176B1 (en) | 2019-08-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9661296B2 (en) | Image processing apparatus and method | |
US9445075B2 (en) | Image processing apparatus and method to adjust disparity information of an image using a visual attention map of the image | |
CA2727218C (en) | Methods and systems for reducing or eliminating perceived ghosting in displayed stereoscopic images | |
US9171372B2 (en) | Depth estimation based on global motion | |
US8638329B2 (en) | Auto-stereoscopic interpolation | |
US9123115B2 (en) | Depth estimation based on global motion and optical flow | |
US9137512B2 (en) | Method and apparatus for estimating depth, and method and apparatus for converting 2D video to 3D video | |
US8977039B2 (en) | Pulling keys from color segmented images | |
JP2015156607A (en) | Image processing method, image processing apparatus, and electronic device | |
Jung et al. | Depth sensation enhancement using the just noticeable depth difference | |
US10210654B2 (en) | Stereo 3D navigation apparatus and saliency-guided camera parameter control method thereof | |
US11908051B2 (en) | Image processing system and method for generating image content | |
TW201622418A (en) | Processing of disparity of a three dimensional image | |
KR20180064028A (en) | Method and apparatus of image processing | |
KR101978176B1 (en) | Image processing apparatus and method | |
KR101849696B1 (en) | Method and apparatus for obtaining informaiton of lighting and material in image modeling system | |
KR20060114708A (en) | Method and scaling unit for scaling a three-dimensional model | |
US8482574B2 (en) | System, method, and computer program product for calculating statistics associated with a surface to be rendered utilizing a graphics processor | |
KR100914312B1 (en) | Method and system of generating saliency map using graphics hardware and programmable shader and recording medium therewith | |
US20140198176A1 (en) | Systems and methods for generating a depth map and converting two-dimensional data to stereoscopic data | |
US9137519B1 (en) | Generation of a stereo video from a mono video |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |