WO2007105129A1 - Image enhancement - Google Patents

Image enhancement

Info

Publication number
WO2007105129A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
particular edge
image enhancement
neighborhood
enhancement
Prior art date
Application number
PCT/IB2007/050666
Other languages
French (fr)
Inventor
Rui F. C. Guerreiro
Jeroen A. P. Tegenbosch
Paul M. Hofman
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Publication of WO2007105129A1 publication Critical patent/WO2007105129A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/73Deblurring; Sharpening
    • G06T5/75Unsharp masking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/85Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
    • H04N19/86Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression involving reduction of coding artifacts, e.g. of blockiness

Definitions

  • the invention relates to a method of image enhancement, in particular contrast, edge or sharpness enhancement.
  • the invention further relates to an image enhancement unit.
  • the invention further relates to an image processing apparatus, comprising: means for receiving an input signal representing an input image; and such an image enhancement unit.
  • the invention further relates to a computer program product to be loaded by a computer arrangement, comprising instructions for image enhancement, the computer arrangement comprising processing means and a memory.
  • Another method, called statistical differencing, generates the enhanced image by dividing each pixel value by its standard deviation estimated inside a specified window centered at the pixel (see e.g. W. K. Pratt, Digital Image Processing, John Wiley and Sons, New York, 1991).
  • the amplitude of a pixel in the image is increased when it differs significantly from its neighbors, while it is decreased otherwise.
  • a generalization of the statistical differencing methods includes the contributions of pre-selected first-order and second-order moments.
  • An alternate approach to contrast enhancement is based on modifying the magnitude of the Fourier transform of an image while keeping the phase invariant.
  • the transform magnitude is normalized to range between 0 and 1 and raised to a power which is a number between zero and one (see e.g. H. C. Andrews, A. G. Teschler, and R. P. Kruger, "Image processing by digital computer", IEEE Spectrum, vol. 9, July 1972, pp. 20-32).
  • An inverse transform of the modified spectrum yields the enhanced image.
  • This conceptually simple approach, in some cases, results in unpleasant enhanced images with two types of artifacts: enhanced noise and replication of sharp edges. Moreover, this method is of high computational complexity.
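The magnitude-modification scheme described above (often called alpha-rooting) can be sketched as follows. This is a minimal numpy illustration, not the patented method; the function name and the default alpha = 0.8 are assumptions:

```python
import numpy as np

def alpha_root_enhance(image, alpha=0.8):
    """Enhance contrast by raising the normalized Fourier magnitude to a
    power alpha in (0, 1) while keeping the phase invariant."""
    spectrum = np.fft.fft2(image)
    magnitude = np.abs(spectrum)
    phase = np.angle(spectrum)
    peak = magnitude.max()
    if peak == 0:
        return image.copy()
    # Normalize the magnitude to [0, 1], apply the power law, restore scale.
    modified = (magnitude / peak) ** alpha * peak
    # Inverse transform of the modified spectrum yields the enhanced image.
    enhanced = np.fft.ifft2(modified * np.exp(1j * phase))
    return np.real(enhanced)
```

With alpha = 1 the image is returned unchanged; the smaller alpha is, the more the weaker frequency components are boosted relative to the dominant ones, which is exactly where the noise-enhancement artifact mentioned above comes from.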
  • Unsharp Masking (UM) is a simple linear operator that can be used to enhance blurred images.
  • the fundamental idea of UM is to subtract from the input signal a low-pass filtered version of the signal itself. The same effect can, however, be obtained by adding to the input signal a processed version of the signal in which high-frequency components are enhanced.
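The unsharp-masking idea can be illustrated in one dimension: subtract a low-pass filtered version of the signal and add the scaled difference back. The 3-tap moving-average low-pass filter and the gain parameter are illustrative assumptions:

```python
import numpy as np

def unsharp_mask_1d(signal, gain=1.0):
    """Unsharp masking on a 1-D signal: subtract a low-pass (moving average)
    version from the input, then add the scaled high-frequency residue back."""
    kernel = np.ones(3) / 3.0                 # simple low-pass filter
    low = np.convolve(signal, kernel, mode="same")
    detail = signal - low                     # high-frequency components
    return signal + gain * detail
```

Applied to a step edge this produces the familiar over- and undershoot on either side of the transition, which is what later motivates controlling the amount of enhancement per neighborhood.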
  • the invention provides a method, unit and computer program product for image enhancement.
  • the method according to the invention comprises: computing border probability values for edges representing respective probabilities for the edges to correspond to respective object borders in an input image; and computing an output image on basis of the input image and the border probability values, whereby a first amount of image enhancement in a first neighborhood of a particular edge is relatively high if the border probability value of the particular edge is relatively high, the first neighborhood being located on a first side of the particular edge.
  • edges which correspond to texture within objects
  • edges which correspond to borders of objects.
  • the respective border probability values are computed which represent the probability of belonging to an object border.
  • a relatively high border probability value means that the edge corresponds to a border of an object
  • a relatively low border probability value means that the edge does not correspond to a border of an object but to a texture.
  • the computed border probability values are used to control the amount of local image enhancement.
  • the amount of image enhancement is relatively high in the neighborhood of edges, which corresponds to borders.
  • the amount of image enhancement is limited in other regions of the image, including neighborhoods of edges, which are classified, on basis of the border probability, as not belonging to object borders.
  • with a first neighborhood of an edge is meant a region of pixels, which is located in the environment of the edge.
  • the pixels forming the edge itself may be included in the region.
  • the first dimension of such a region is 10 pixels wide.
  • with image enhancement is meant any type of pixel processing, i.e. modification of pixel values resulting in an increased difference in pixel values of neighboring pixels.
  • the pixel value may represent luminance and/or color.
  • the image enhancement may be substantially symmetrical, meaning that the amounts of image enhancement at both sides of the edge are mutually substantially equal. So, in an embodiment of the method according to the invention, a second amount of image enhancement in a second neighborhood of the particular edge is relatively high if the border probability value of the particular edge is relatively high, the second neighborhood being located on a second side of the particular edge, being opposite to the first side of the particular edge.
  • the image enhancement is preferably asymmetrical, meaning that the amounts of image enhancement at both sides of the edge are mutually different. So, in an embodiment of the method according to the invention, a second amount of image enhancement in a second neighborhood of the particular edge is lower than the first amount of image enhancement, the second neighborhood being located on a second side of the particular edge, being opposite to the first side of the particular edge. Examples for applying asymmetrical image enhancement are as follows:
  • a detected/classified border edge corresponds to a border of an object, which is positioned relatively close to the camera/viewer compared to its background. It is beneficial, to perform a relatively strong image enhancement at the first neighborhood of the edge which corresponds to the object while a limited image enhancement or even image de-enhancement/blur is performed at the second neighborhood being located at the opposite side of the edge.
  • a detected/classified border edge is located adjacent to a flat area, i.e. a region in the image having a measure of variation of pixel values, which is relatively low. In other words, an area, which is substantially homogeneous. In that case asymmetrical image enhancement may be beneficial. Enhancement of the area being located at the other side of the edge is advantageous.
  • Computing border probability values may be based on a number of parameters.
  • computing the border probability values of the particular edge is based on a measure of variation of pixel values of the input image in a third neighborhood of the particular edge.
  • the third neighborhood is a group of pixels adjacent to the particular edge.
  • the third neighborhood may correspond to the first neighborhood or the second neighborhood as mentioned above.
  • the measure of variation is computed by addition of differences between respective pixel values of pixels in the neighborhood.
  • a relatively low measure of variation corresponds to a relatively high border probability value, resulting in relatively strong image enhancement. That means that the amount of image enhancement is relatively high for edges near flat areas.
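The relation described above between the measure of variation and the amount of enhancement can be sketched as follows. The particular difference pattern (horizontal neighbor differences) and the mapping from variation to weight are assumptions; the text only states that differences between pixel values in the neighborhood are summed, and that low variation yields strong enhancement:

```python
import numpy as np

def variation_measure(window):
    """Measure of variation of a pixel neighborhood: sum of absolute
    differences between horizontally adjacent pixel values.  (The exact
    difference pattern is an assumption.)"""
    return float(np.abs(np.diff(window, axis=1)).sum())

def border_weight(variation, scale=10.0):
    """Map a low variation measure to a high enhancement weight: edges next
    to flat areas get strong enhancement, edges next to texture get little."""
    return 1.0 / (1.0 + variation / scale)
```

A flat window yields variation 0 and hence the maximum weight 1.0; a textured window yields a large variation and a correspondingly small weight.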
  • computing the border probability values is based on a depth map corresponding to the input image.
  • a depth map is a two-dimensional matrix of depth values. Each of the depth values corresponds to a respective pixel. The depth values represent distances from the respective points in the scene to the camera/viewer. A relatively large distance corresponds to a relatively high depth value. That means that foreground objects have lower depth values than the background.
  • a relatively high value of a derivative in the depth map, i.e. a relatively big depth step, corresponds to a relatively high border probability value.
  • Fig. 1 schematically shows a first embodiment of the image enhancement unit according to the invention connected to a display device
  • Fig. 2 schematically shows an image enhancement filter of the image enhancement unit according to the invention
  • Fig. 3 schematically shows a second embodiment of the image enhancement unit according to the invention connected to a display device;
  • Fig. 4 schematically shows an input image;
  • Fig. 5A schematically shows an input image and a corresponding depth map
  • Fig. 5B shows an input image and a corresponding depth map
  • Fig. 6A shows an input image
  • Fig. 6B shows the detected edges in the input image of Fig. 6A
  • Fig. 6C shows a border probability value map of the input image of Fig. 6A
  • Fig. 6D shows a filtered border probability value map of Fig. 6C
  • Fig. 6E shows an output image based on the input image of Fig. 6A
  • Fig. 7 schematically shows an image processing apparatus according to the invention.
  • Fig. 8A schematically shows a one-dimensional signal representing luminance values of a line of an input image;
  • Fig. 8B to Fig. 8H schematically show one-dimensional signals which are based on the one-dimensional signal of Fig. 8A and which are achieved by enhancement according to the invention. Same reference numerals are used to denote similar parts throughout the Figures.
  • Fig. 1 schematically shows a first embodiment of the image enhancement unit 100 according to the invention, connected to a display device 120.
  • the image enhancement unit 100 is provided with input images at its input connector 110 and is arranged to compute a sequence of output images on basis of the input images.
  • the image enhancement unit 100 may provide the sequence of output images to a further (not depicted) processing unit or directly to a display device 120.
  • the image enhancement unit 100 is arranged: to compute border probability values for edges representing respective probabilities for the edges to correspond to respective object borders in an input image; and to compute an output image on basis of the input image and the border probability values, whereby a first amount of image enhancement in a first neighborhood of a particular edge is relatively high if the border probability value of the particular edge is relatively high, the first neighborhood being located on a first side of the particular edge.
  • the first embodiment of the image enhancement unit 100 comprises: an edge detection unit 102 for detecting edges in the input image. Edge detection is based on computing derivatives of pixel values; a border probability value computation unit 104, which is arranged to compute border probability values for pixels in the input image. Typically the border probability values are only computed for the set of pixels related to edges; and an image enhancement filter 106 which is arranged to compute an output image on basis of the input image under the control of the border probability values being computed by the border probability value computation unit 104.
  • the edge detection unit 102, the border probability value computation unit 104 and the image enhancement filter 106 may be implemented using one processor. Normally, these functions are performed under control of a software program product. During execution, normally the software program product is loaded into a memory, like a RAM, and executed from there. The program may be loaded from a background memory, like a ROM, hard disk, or magnetic and/or optical storage, or may be loaded via a network like the Internet. Optionally an application-specific integrated circuit provides the disclosed functionality.
  • any type of high-pass filter may be applied as the edge detection unit 102.
  • the operation of the edge detection unit 102 will not be described in detail herein as it is well known in the art. See e.g. A. K. Jain, "Fundamentals of digital image processing", Prentice-Hall International, 1989.
  • the edge detection is as disclosed in the patent application IB2005/050400 filed by the applicant on January 31, 2005 (Attorney Docket NL040109)
  • the border probability value computation unit 104 is arranged to compute measure of variation values.
  • a relatively low measure of variation value corresponds to a relatively high border probability value. In other words, it is assumed that if an edge is adjacent to a substantially flat area, i.e. a substantially homogeneous area, the edge actually corresponds to a border of an object.
  • a relatively high amount of enhancement should be applied then. It is assumed that if the edge is not adjacent to a substantially flat area, the edge actually corresponds to texture. According to the invention the amount of enhancement, if any, should be limited then.
  • the measure of variation value r_R(x,y) is assigned to an element of a two-dimensional matrix of variation values 116, whereby the element corresponds to a particular pixel of the input image with coordinates (x,y).
  • a region of pixels in the neighborhood of the particular pixel is used.
  • the shape of the region may be irregular. However preferably the region is rectangular or diamond shaped.
  • the measure of variation value r_R(x,y) is computed as specified in Equation 1:
  • L(i,j) is the luminance value of the pixel of the input image with coordinates (i,j) .
  • the measure of variation value r_R(x,y) is computed by means of a group of pixels fetched from the neighborhood, forming a rectangular kernel/window with size n×m.
  • a typical size of the kernel is 5x5 pixels.
  • the position of the kernel is based on the coordinates of the particular pixel of the input image for which the measure of variation value r_R(x,y) has to be computed.
  • the kernel does not include the particular pixel.
  • the kernel has an offset o_R relative to the particular pixel.
  • the index R is used to indicate which region is used for the computation of the measure of variation r_R(x,y) and hence depends on the offset o_R.
  • the actual position of the kernel is preferably based on the orientation of the edge. For instance, if the edge is horizontal, the kernel will be located left or right of the edge. If the edge is vertical, the kernel will be located above or below the edge.
  • The relation between (x,y) and (i,j) is given in Equation 2:
  • the amount of enhancement may be symmetrical relative to the detected edge. However, preferably the amounts of enhancement at the opposite sides of the edge are mutually different.
  • two different control values are required for each detected edge pixel. So, preferably, for each of the pixels of the image which has been detected as an edge pixel, two measures of variation are computed, i.e. based on two different kernels/windows. One of the kernels is located at a first side of the edge and the other one of the kernels is located at a second side of the edge, being opposite to the first side. So the kernels have a first offset o_1 and a second offset o_2, respectively.
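The two-kernel computation can be sketched for a vertical edge as follows. The offset values, the kernel size and the difference pattern inside the kernel are illustrative assumptions, not values given in the text:

```python
import numpy as np

def two_sided_variation(image, x, y, offset=3, size=2):
    """For a pixel at (y, x) on a vertical edge, compute the measure of
    variation in two square kernels: one with offset -offset (first side of
    the edge) and one with offset +offset (opposite side)."""
    def variation(cx):
        # (2*size + 1) x (2*size + 1) window centered at column cx.
        window = image[y - size:y + size + 1, cx - size:cx + size + 1]
        return float(np.abs(np.diff(window, axis=1)).sum())
    return variation(x - offset), variation(x + offset)
```

For an edge between a flat region and a textured region this yields one low and one high control value, which is what allows the enhancement to be made asymmetrical.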
  • the first embodiment of the image enhancement unit 100 is arranged to perform the following operations: edge detection for detecting edges in an input image; computing measure of variation values on basis of pixels in the neighborhood of the detected edges; classifying the detected edges on basis of the respective measure of variation values, whereby edges are classified as relatively important if the measure of variation value in the neighborhood of the edges is relatively low; and computing an output image on basis of the input image by selectively enhancing at least one of the regions located at one of the sides of the edges which are classified as being important.
  • the edge itself and the opposite side of the edge are enhanced too.
  • Fig. 2 schematically shows an image enhancement filter 106 of the image enhancement unit 100 according to the invention, comprising: a high-pass filter 202 for computing a high-pass filtered version of the input image which is provided at the input connector 110; a multiplying unit 206 for multiplying the pixel values of the high-pass filtered version of the input image with respective weight coefficients, resulting in a weighted high-pass filtered version of the input image; and an adding unit 204 for adding the pixel values of the input image to the respective pixel values of the weighted high-pass filtered version of the input image as provided by the multiplying unit 206.
  • the output of the adding unit 204 is the output image, which is based on the input image.
  • the high-pass filter may be a one-dimensional filter or a two-dimensional filter. Typically a vertical one-dimensional filter is combined with a horizontal one-dimensional filter. As an example, the following set of coefficients of the high-pass filter is provided: [-1 0 0 0 2 0 0 0 -1]. If the image enhancement filter 106 is applied in the first embodiment of the image enhancement unit 100 as described in connection with Fig. 1, the weight coefficients 116 are based on the measure of variation values r_R(x,y). As said above, relatively low measure of variation values correspond to relatively strong enhancement, meaning that the corresponding weight factors are relatively high.
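A one-line version of this filter structure (high-pass filter, multiply by per-pixel weights, add back to the input), using the nine-tap high-pass kernel suggested above, might look as follows. Applying the kernel unscaled and per image line is an assumption:

```python
import numpy as np

def enhancement_filter(row, weights):
    """One image line through the filter of Fig. 2: high-pass filter the
    input (202), multiply by per-pixel weight coefficients (206), and add
    the result back to the input (204)."""
    kernel = np.array([-1, 0, 0, 0, 2, 0, 0, 0, -1], dtype=float)
    high = np.convolve(row, kernel, mode="same")
    return row + weights * high
```

Setting a weight to zero leaves the corresponding pixel untouched, so feeding in border-probability-derived weights enhances only the neighborhoods of classified border edges.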
  • Fig. 3 schematically shows a second embodiment of the image enhancement unit 300 according to the invention, connected to a display device 120.
  • the image enhancement unit 300 is provided with input images at its input connector 310 and is arranged to compute a sequence of output images on basis of the input images.
  • the image enhancement unit 300 may provide the sequence of output images to a further (not depicted) processing unit or directly to a display device 120.
  • the image enhancement unit 300 is arranged: to compute border probability values for edges representing respective probabilities for the edges to correspond to respective object borders in an input image; and to compute an output image on basis of the input image and the border probability values, whereby a first amount of image enhancement in a first neighborhood of a particular edge is relatively high if the border probability value of the particular edge is relatively high, the first neighborhood being located on a first side of the particular edge.
  • the second embodiment of the image enhancement unit 300 comprises: an edge detection unit 302 for detecting edges in a depth map corresponding to an input image, whereby the depth map is provided at the second input connector 310; a border probability value calculating unit 304 for calculating border probability values for edges representing respective probabilities for the edges to correspond to respective object borders in the input image; and an image enhancement filter 106 which is arranged to compute an output image on basis of the input image under the control of the border probability values being computed by the border probability value calculating unit 304.
  • the edge detection unit 302, the border probability value computation unit 304 and the image enhancement filter 106 may be implemented using one processor. Normally, these functions are performed under control of a software program product. During execution, normally the software program product is loaded into a memory, like a RAM, and executed from there. The program may be loaded from a background memory, like a ROM, hard disk, or magnetic and/or optical storage, or may be loaded via a network like the Internet. Optionally an application-specific integrated circuit provides the disclosed functionality. Images taken of real 3-D scenes typically contain many cues that help the human observer in perceiving 3-D.
  • the second embodiment of the image enhancement unit 300 according to the invention is arranged to enhance the contrast of an input image around the edges of its corresponding depth map. By doing so, light cues that are available in the input image, especially on the borders of objects, are enhanced and, therefore, a greater perception of depth is achieved. Since only the borders of objects are processed, typically a small percentage of the image, the amount of enhancement may be relatively strong without significantly changing the total input image.
  • the second embodiment of the image enhancement unit 300 is arranged to perform the following operations: determining the edges in the depth map, resulting in a two-dimensional matrix representing depth edges D_HP, preferably using a standard high-pass filter on the depth map; dilation of D_HP to increase the region, resulting in D; computing the border probability values e.g. as specified in Equation 3:
  • P is the border probability value
  • ⊙ represents point-wise multiplication (also known as the Hadamard product)
  • T represents a threshold after which differences in depth are considered to be relevant, to prevent being affected by noise in the depth map
  • represents a noise reduction coefficient
  • blurring the values of the two-dimensional matrix representing the border probability values to soften the transition between neighborhoods to be enhanced and neighborhoods which do not have to be enhanced
  • applying local image enhancement in the image proportional to the blurred values of the two-dimensional matrix representing the border probability values. It was observed that applying local image enhancement could, in some cases, create visible unwanted overshoots, also away from edges. A way to tackle this and still perform the desired enhancement is by applying the enhancement only inside the objects. That means that the enhancement is asymmetrical.
  • a relatively strong enhancement in a first region at a first side of the edge and a relatively low enhancement in a second region at a second side of the edge, being opposite to the first side, is implemented by reduction of the computed border probability values for the elements corresponding to a relatively high depth value.
  • a relatively high depth value corresponds to background and a relatively low depth value corresponds to foreground, i.e. the objects of interest.
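The sequence of operations of this second embodiment (high-pass on the depth map, dilation, threshold T against depth noise, blurring of the border probability map, proportional local enhancement) can be sketched end-to-end. This is a numpy-only illustration; the filter sizes, the one-pixel dilation, the box blur and the Laplacian-style detail filter are all assumptions, and Equation 3 itself is not reproduced here:

```python
import numpy as np

def depth_guided_enhancement(image, depth, threshold=1.0, blur_passes=2):
    """Depth-guided local enhancement: enhance the input image only near
    edges of the corresponding depth map, proportionally to a blurred
    border-probability map.  Horizontal processing only, for brevity."""
    d = depth.astype(float)
    # 1. High-pass on the depth map (horizontal gradient magnitude) -> D_HP.
    d_hp = np.zeros_like(d)
    d_hp[:, 1:-1] = np.abs(d[:, 2:] - d[:, :-2])
    # 2. Dilation of D_HP by one pixel on each side -> D.
    dil = d_hp.copy()
    dil[:, 1:] = np.maximum(dil[:, 1:], d_hp[:, :-1])
    dil[:, :-1] = np.maximum(dil[:, :-1], d_hp[:, 1:])
    # 3. Threshold T: ignore small depth differences caused by noise,
    #    then normalize to get border probability values in [0, 1].
    p = np.where(dil > threshold, dil, 0.0)
    if p.max() > 0:
        p /= p.max()
    # 4. Blur to soften the transition between enhanced and untouched regions.
    for _ in range(blur_passes):
        p[:, 1:-1] = (p[:, :-2] + p[:, 1:-1] + p[:, 2:]) / 3.0
    # 5. Enhancement proportional to the blurred border probabilities.
    img = image.astype(float)
    high = np.zeros_like(img)
    high[:, 1:-1] = 2 * img[:, 1:-1] - img[:, :-2] - img[:, 2:]
    return img + p * high
```

With a flat depth map the border probability is zero everywhere and the image passes through unchanged; texture edges inside an object are therefore left alone, while luminance edges coinciding with a depth step are sharpened.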
  • Fig. 4 schematically shows an input image 400.
  • the input image represents an object 402 in front of the background.
  • a number of edges 404, 410-414 are depicted.
  • edge 404 corresponds to the border of the object 402.
  • the other edges 410-414 do not belong to one of the borders of the object 402.
  • These edges 410-414 actually correspond to texture within the object.
  • Relatively strong enhancement should be applied in the neighborhood 406 and/or 408 of the border edge 404.
  • Enhancement in the neighborhood of the non-border edges 410-414, if any, should be limited.
  • the enhancement to be applied is preferably asymmetrical.
  • the amount of enhancement for a first region 406 located in the neighborhood of the border edge at a first side of the edge 404 is substantially higher than the amount of enhancement for a second region 408 also located in the neighborhood of the border edge but at a second side of the edge 404 which is opposite to the first side of the edge.
  • the amount of enhancement for the first region 406 is substantially lower than the amount of enhancement for the second region 408.
  • Fig. 5A schematically shows an input image 400 and a corresponding depth map 502.
  • the input image 400 is described in connection with Fig. 4.
  • the depth map 502 comprises a two-dimensional matrix of depth values.
  • a first group 508 of depth values corresponds to background.
  • a second group 506 of depth values corresponds to the object 402 of the input image 400.
  • Fig. 5B shows an input image 514 and a corresponding depth map 512.
  • Fig. 6A shows an input image
  • Fig. 6B shows the detected edges in the input image of Fig. 6A
  • Fig. 6C shows a border probability value map of the input image of Fig. 6A.
  • border probability values are computed. These border probability values have been put in a two- dimensional matrix and visualized in Fig. 6C;
  • Fig. 6D shows a filtered border probability value map, which is based on thresholding the border probability value map as visualized in Fig. 6C.
  • by thresholding is meant that only the border probability values are maintained which satisfy a predetermined threshold condition.
  • the other border probability values have been assigned a value equal to zero;
  • Fig. 6E shows an output image based on the input image of Fig. 6A and the filtered border probability value map as shown in Fig. 6D.
  • Fig. 7 schematically shows an image processing apparatus 700 according to the invention, comprising: receiving means 702 for receiving a signal representing input images; the image enhancement unit 704 as described in connection with any of the Figs. 1 to 6; and a display device 120 for displaying the output images of the image enhancement unit 704.
  • the signal may be a broadcast signal received via an antenna or cable but may also be a signal from a storage device like a VCR (Video Cassette Recorder) or Digital Versatile Disk (DVD).
  • the signal is provided at the input connector 710.
  • the image processing apparatus 700 might e.g. be a TV.
  • the image processing apparatus 700 does not comprise the optional display device but provides the output images to an apparatus that does comprise a display device 120.
  • the image processing apparatus 700 might be e.g. a set top box, a satellite-tuner, a VCR player, a DVD player or recorder.
  • the image processing apparatus 700 comprises storage means, like a hard disk or means for storage on removable media, e.g. optical disks.
  • the image processing apparatus 700 might also be a system being applied by a film-studio or broadcaster.
  • the display device 120 may be a monoscopic or a stereoscopic display device.
  • Fig. 8A schematically shows a one-dimensional signal 800 representing luminance values of a line of an input image.
  • Three different portions of the one-dimensional signal 800 can be distinguished: a first portion 810 belonging to a first region of the image at a first side of the edge; a second portion 812 corresponding to the edge, showing a transition of the luminance signal 800; and a third portion 814 belonging to a second region of the image at a second side of the edge, being opposite to the first side.
  • Fig. 8B to Fig. 8H schematically show one-dimensional signals which are based on the one-dimensional signal of Fig. 8A and which are achieved by enhancement according to the invention.
  • In Table 1 the different types of enhancement which may be applied according to the invention are summarized. Table 1:

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

A method of image enhancement is disclosed. The method comprises: computing border probability values for edges representing respective probabilities for the edges to correspond to respective object borders in an input image; and computing an output image on basis of the input image and the border probability values, whereby a first amount of image enhancement in a first neighborhood of a particular edge is relatively high if the border probability value of the particular edge is relatively high, the first neighborhood being located on a first side of the particular edge.

Description

Image enhancement
The invention relates to a method of image enhancement, in particular contrast, edge or sharpness enhancement.
The invention further relates to an image enhancement unit.
The invention further relates to an image processing apparatus, comprising: means for receiving an input signal representing an input image; and such an image enhancement unit.
The invention further relates to a computer program product to be loaded by a computer arrangement, comprising instructions for image enhancement, the computer arrangement comprising processing means and a memory.
A large number of approaches have been devised to improve the perceived quality of an image (see e.g. A. K. Jain, Fundamentals of Digital Image Processing, Prentice Hall, Inc., Englewood Cliffs, N.J., 1989). Histogram equalization, a commonly used method, is based on the mapping of the input gray levels to achieve a nearly uniform output gray level distribution (see e.g. W. K. Pratt, Digital Image Processing, John Wiley and Sons, New York, 1991). However, histogram equalization applied to the entire image has the disadvantage of attenuating low contrast in the sparsely populated histogram regions. Employing local histogram equalization, which is of high computational complexity, can alleviate this problem.
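For reference, global histogram equalization as described here can be sketched as follows: map every input gray level through the normalized cumulative histogram. The 8-bit assumption and the exact normalization of the cumulative histogram are illustrative choices:

```python
import numpy as np

def equalize_histogram(gray):
    """Global histogram equalization for an 8-bit gray-level image: build a
    lookup table from the normalized cumulative histogram so that the output
    gray levels are nearly uniformly distributed."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())   # normalize to [0, 1]
    lut = np.round(cdf * 255).astype(np.uint8)
    return lut[gray]
```

An image whose gray levels are already uniformly distributed is a fixed point of this mapping, which illustrates why equalization mainly stretches the densely populated histogram regions at the cost of the sparse ones.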
Another method, called statistical differencing, generates the enhanced image by dividing each pixel value by its standard deviation estimated inside a specified window centered at the pixel (see e.g. W. K. Pratt, Digital Image Processing, John Wiley and Sons, New York, 1991). Thus, the amplitude of a pixel in the image is increased when it differs significantly from its neighbors, while it is decreased otherwise. A generalization of the statistical differencing methods includes the contributions of pre-selected first-order and second-order moments.
An alternate approach to contrast enhancement is based on modifying the magnitude of the Fourier transform of an image while keeping the phase invariant. The transform magnitude is normalized to range between 0 and 1 and raised to a power which is a number between zero and one (see e.g. H. C. Andrews, A. G. Teschler, and R. P. Kruger, "Image processing by digital computer", IEEE Spectrum, vol. 9, July 1972, pp. 20-32). An inverse transform of the modified spectrum yields the enhanced image. This conceptually simple approach, in some cases, results in unpleasant enhanced images with two types of artifacts: enhanced noise and replication of sharp edges. Moreover, this method is of high computational complexity.
A simple linear operator that can be used to enhance blurred images is Unsharp Masking (UM) (See e.g. A. K. Jain, Fundamentals of Digital Image Processing, Prentice Hall, Inc., Englewood Cliffs, N. J., 1989). The fundamental idea of UM is to subtract from the input signal a low-pass filtered version of the signal itself. The same effect can, however, be obtained by adding to the input signal a processed version of the signal in which high-frequency components are enhanced.
It is, inter alia, an object of the invention to provide an improved image enhancement. To this end, the invention provides a method, unit and computer program product for image enhancement.
This object of the invention is achieved in that the method according to the invention comprises: computing border probability values for edges representing respective probabilities for the edges to correspond to respective object borders in an input image; and computing an output image on basis of the input image and the border probability values, whereby a first amount of image enhancement in a first neighborhood of a particular edge is relatively high if the border probability value of the particular edge is relatively high, the first neighborhood being located on a first side of the particular edge.
An essential aspect of the invention is that the image enhancement to be applied depends on the type of edges. Basically, two types of edges may be distinguished in images:
- edges which correspond to texture within objects; and
- edges which correspond to borders of objects.
For edges in an input image the respective border probability values are computed, which represent the probability of belonging to an object border. A relatively high border probability value means that the edge corresponds to a border of an object, while a relatively low border probability value means that the edge does not correspond to a border of an object but to texture. The computed border probability values are used to control the amount of local image enhancement. As a result, the amount of image enhancement is relatively high in the neighborhood of edges which correspond to borders. The amount of image enhancement is limited in other regions of the image, including neighborhoods of edges which are classified, on basis of the border probability, as not belonging to object borders.
With a first neighborhood of an edge is meant a region of pixels located in the environment of the edge. The pixels forming the edge itself may be included in the region. Typically, for a standard-definition image, such a region is 10 pixels wide.
With image enhancement is meant any type of pixel processing, i.e. modification of pixel values resulting in an increased difference in pixel values of neighboring pixels. The pixel value may represent luminance and/or color. The image enhancement may be substantially symmetrical, meaning that the amounts of image enhancement at both sides of the edge are mutually substantially equal. So, in an embodiment of the method according to the invention, a second amount of image enhancement in a second neighborhood of the particular edge is relatively high if the border probability value of the particular edge is relatively high, the second neighborhood being located on a second side of the particular edge, being opposite to the first side of the particular edge.
The image enhancement is preferably asymmetrical, meaning that the amounts of image enhancement at both sides of the edge are mutually different. So, in an embodiment of the method according to the invention, a second amount of image enhancement in a second neighborhood of the particular edge is lower than the first amount of image enhancement, the second neighborhood being located on a second side of the particular edge, being opposite to the first side of the particular edge. Examples for applying asymmetrical image enhancement are as follows:
Suppose that a detected/classified border edge corresponds to the border of an object which is positioned relatively close to the camera/viewer compared to its background. It is beneficial to perform a relatively strong image enhancement in the first neighborhood of the edge, which corresponds to the object, while a limited image enhancement or even image de-enhancement/blur is performed in the second neighborhood, located at the opposite side of the edge. Suppose instead that a detected/classified border edge is located adjacent to a flat area, i.e. a region in the image having a relatively low measure of variation of pixel values; in other words, an area which is substantially homogeneous. In that case asymmetrical image enhancement may also be beneficial: enhancement of the area located at the other side of the edge is advantageous.
Computing border probability values may be based on a number of parameters. Preferably, computing the border probability values of the particular edge is based on a measure of variation of pixel values of the input image in a third neighborhood of the particular edge. The third neighborhood is a group of pixels adjacent to the particular edge. The third neighborhood may correspond to the first neighborhood or the second neighborhood as mentioned above. The measure of variation is computed by addition of differences between respective pixel values of pixels in the neighborhood. Preferably, a relatively low measure of variation corresponds to a relatively high border probability value, resulting in relatively strong image enhancement. That means that the amount of image enhancement is relatively high for edges near flat areas.
Alternatively, computing the border probability values is based on a depth map corresponding to the input image. A depth map is a two-dimensional matrix of depth values. Each of the depth values corresponds to a respective pixel. The depth values represent distances from the respective points in the scene to the camera/viewer. A relatively large distance corresponds to a relatively high depth value. That means that foreground objects have lower depth values than the background. Preferably, a relatively high value of a derivative in the depth map, i.e. a relatively big depth step corresponds to a relatively high border probability value.
Modifications of the image enhancement unit and variations thereof may correspond to modifications and variations of the image processing apparatus, the method and the computer program product being described.
These and other aspects of the image enhancement unit, according to the invention will become apparent from and will be elucidated with respect to the implementations and embodiments described hereinafter and with reference to the accompanying drawings, wherein:
Fig. 1 schematically shows a first embodiment of the image enhancement unit according to the invention connected to a display device; Fig. 2 schematically shows an image enhancement filter of the image enhancement unit according to the invention;
Fig. 3 schematically shows a second embodiment of the image enhancement unit according to the invention connected to a display device; Fig. 4 schematically shows an input image;
Fig. 5A schematically shows an input image and a corresponding depth map;
Fig. 5B shows an input image and a corresponding depth map;
Fig. 6A shows an input image;
Fig. 6B shows the detected edges in the input image of Fig. 6A; Fig. 6C shows a border probability value map of input image of Fig. 6A;
Fig. 6D shows a filtered border probability value map of Fig. 6C;
Fig. 6E shows an output image based on the input image of Fig. 6A;
Fig. 7 schematically shows an image processing apparatus according to the invention. Fig. 8 A schematically shows a one-dimensional signal representing luminance values of a line of an input image; and
Fig. 8B to Fig. 8H schematically show one-dimensional signals which are based on the one-dimensional signal of Fig. 8A and which are achieved by enhancement according to the invention. Same reference numerals are used to denote similar parts throughout the
Figures.
Fig. 1 schematically shows a first embodiment of the image enhancement unit 100 according to the invention, connected to a display device 120. The image enhancement unit 100 is provided with input images at its input connector 110 and is arranged to compute a sequence of output images on basis of the input images. The image enhancement unit 100 may provide the sequence of output images to a further (not depicted) processing unit or directly to a display device 120. The image enhancement unit 100 is arranged: - to compute border probability values for edges representing respective probabilities for the edges to correspond to respective object borders in an input image; and to compute an output image on basis of the input image and the border probability values, whereby a first amount of image enhancement in a first neighborhood of a particular edge is relatively high if the border probability value of the particular edge is relatively high, the first neighborhood being located on a first side of the particular edge.
The first embodiment of the image enhancement unit 100 comprises:
- an edge detection unit 102 for detecting edges in the input image; edge detection is based on computing derivatives of pixel values;
- a border probability value computation unit 104, which is arranged to compute border probability values for pixels in the input image; typically the border probability values are only computed for the set of pixels related to edges; and
- an image enhancement filter 106 which is arranged to compute an output image on basis of the input image under the control of the border probability values being computed by the border probability value computation unit 104.
The edge detection unit 102, the border probability value computation unit 104 and the image enhancement filter 106 may be implemented using one processor. Normally, these functions are performed under control of a software program product. During execution, normally the software program product is loaded into a memory, like a RAM, and executed from there. The program may be loaded from a background memory, like a ROM, hard disk, or magnetic and/or optical storage, or may be loaded via a network like the Internet. Optionally, an application specific integrated circuit provides the disclosed functionality.
Any type of high pass filter may be applied as edge detection unit 102. The operation of the edge detection unit 102 will not be described in detail herein as it is well known in the art. See e.g. A. K. Jain, "Fundamentals of digital image processing", Prentice-Hall International, 1989. Preferably, the edge detection is as disclosed in the patent application IB2005/050400 filed by the applicant on January 31, 2005 (Attorney Docket NL040109). The border probability value computation unit 104 is arranged to compute measure of variation values. A relatively low measure of variation value corresponds to a relatively high border probability value. In other words, it is assumed that if an edge is adjacent to a substantially flat area, i.e. a substantially homogeneous region, the edge actually corresponds to a border of an object. According to the invention a relatively high amount of enhancement should be applied then. It is assumed that if the edge is not adjacent to a substantially flat area, the edge actually corresponds to texture. According to the invention the amount of enhancement, if any, should be limited then.
The measure of variation value η_R(x,y) is assigned to an element of a two-dimensional matrix of variation values 116, whereby the element corresponds to a particular pixel of the input image with coordinates (x,y). For the computation of the measure of variation value η_R(x,y), a region of pixels in the neighborhood of the particular pixel is used. The shape of the region may be irregular. However, preferably the region is rectangular or diamond shaped.
Preferably, the measure of variation value η_R(x,y) is computed as the sum of absolute differences between neighboring pixels in the kernel, as specified in Equation 1:

η_R(x,y) = Σ_{(i,j) ∈ kernel} ( |L(i,j) − L(i−1,j)| + |L(i,j) − L(i,j−1)| )    (1)
whereby L(i,j) is the luminance value of the pixel of the input image with coordinates (i,j). In this case, the measure of variation value η_R(x,y) is computed by means of a group of pixels fetched from the neighborhood, forming a rectangular kernel/window with size n×m. A typical size of the kernel is 5×5 pixels. The position of the kernel is based on the coordinates of the particular pixel of the input image for which the measure of variation value η_R(x,y) has to be computed. Typically, the kernel does not include the particular pixel. Typically the kernel has an offset o_R relative to the particular pixel. The index R is used to indicate which region is used for the computation of the measure of variation η_R(x,y) and hence depends on the offset o_R.
The actual position of the kernel is preferably based on the orientation of the edge. For instance, if the edge is horizontal, the kernel will be located left or right of the edge. If the edge is vertical, the kernel will be located above or below the edge.
The relation between (x,y) and (i,j) is given in Equation 2:
(i,j) = (x,y) + δ_R = (x,y) + (o_R, o_R)    (2)
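As an illustration, the two measures of variation for kernels on opposite sides of an edge pixel may be computed as follows. The code assumes that the measure of variation is a sum of absolute differences between neighboring pixels inside the offset n×m kernel (the exact form of Equation 1 is not reproduced in this text), and that the offset is applied to both coordinates as in Equation 2; the test image and kernel positions are arbitrary:

```python
import numpy as np

def variation_measure(L, x, y, offset, n=5, m=5):
    """Sum of absolute differences between neighboring pixels inside an
    n x m kernel placed at (x, y) + (offset, offset), cf. Equation 2.
    A low value indicates that the kernel covers a flat region."""
    i, j = x + offset, y + offset
    K = L[i:i + n, j:j + m].astype(np.float64)
    return (np.abs(np.diff(K, axis=0)).sum()
            + np.abs(np.diff(K, axis=1)).sum())

# A vertical edge at column 8: flat on the left, textured on the right.
rng = np.random.default_rng(1)
L = np.zeros((16, 16))
L[:, 8:] = 100.0 + rng.uniform(0.0, 20.0, (16, 8))

eta1 = variation_measure(L, 6, 8, offset=-6)  # kernel on the flat side
eta2 = variation_measure(L, 6, 8, offset=+1)  # kernel on the textured side
```

Here the kernel on the flat side yields a low measure of variation (hence a high border probability and strong enhancement on that side), while the kernel on the textured side yields a high measure of variation.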
The amount of enhancement may be symmetrical relative to the detected edge. However, preferably the amounts of enhancement at the opposite sides of the edge are mutually different. In order to apply different amounts of enhancement, two different control values are required for each detected edge pixel. So, preferably, for each pixel of the image which has been detected as an edge pixel, two measures of variation are computed, i.e. based on two different kernels/windows: one kernel located at a first side of the edge and the other kernel located at a second side of the edge, opposite to the first side, i.e. kernels with a first offset o_1 and a second offset o_2, respectively. The amount of image enhancement at the different sides of the edge, as performed by the image enhancement filter 106, is controlled on basis of the two different measures of variation: η_{R=1}(x,y) and η_{R=2}(x,y).
The first embodiment of the image enhancement unit 100 according to the invention is arranged to perform the following operations:
- edge detection for detecting edges in an input image;
- computing measure of variation values on basis of pixels in the neighborhood of the detected edges;
- classifying the detected edges on basis of the respective measure of variation values, whereby edges are classified as relatively important if the measure of variation value in the neighborhood of the edges is relatively low; and
- computing an output image on basis of the input image by selectively enhancing at least one of the regions located at one of the sides of the edges which are classified as being important. Optionally, the edge itself and the opposite side of the edge are enhanced too.
Fig. 2 schematically shows an image enhancement filter 106 of the image enhancement unit 100 according to the invention, comprising:
- a high pass filter 202 for computing a high-pass filtered version of the input image which is provided at the input connector 110;
- a multiplying unit 206 for multiplying the pixel values of the high-pass filtered version of the input image with respective weight coefficients, resulting in a weighted high-pass filtered version of the input image; and
- an adding unit 204 for adding the pixel values of the input image to the respective pixel values of the weighted high-pass filtered version of the input image as provided by the multiplying unit 206.
The output of the adding unit 204 is the output image, which is based on the input image. The high pass filter may be a one-dimensional filter or a two-dimensional filter. Typically a vertical one-dimensional filter is combined with a horizontal one-dimensional filter. As an example, the following set of coefficients of the high pass filter is provided: [-1 0 0 0 2 0 0 0 -1]. If the image enhancement filter 106 is applied in the first embodiment of the image enhancement unit 100 as described in connection with Fig. 1, the weight coefficients 116 are based on the measure of variation values η_R(x,y). As said above, relatively low measure of variation values correspond to relatively strong enhancement, meaning that the corresponding weight factors are relatively high.
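The filter of Fig. 2 can be sketched as follows, using the example taps above applied horizontally; the weight map and test image are illustrative:

```python
import numpy as np

def enhance(img, weights, taps=(-1, 0, 0, 0, 2, 0, 0, 0, -1)):
    """Image enhancement filter of Fig. 2: high-pass filter the input
    horizontally with the given taps, multiply the result by a
    per-pixel weight map (e.g. derived from the border probability
    values), and add it back onto the input image."""
    img = img.astype(np.float64)
    pad = len(taps) // 2
    padded = np.pad(img, ((0, 0), (pad, pad)), mode='edge')
    high = np.zeros_like(img)
    for k, t in enumerate(taps):
        if t:
            high += t * padded[:, k:k + img.shape[1]]
    return img + weights * high

rng = np.random.default_rng(2)
img = rng.uniform(0.0, 255.0, (8, 20))
same = enhance(img, np.zeros_like(img))    # zero weights: image unchanged
boosted = enhance(img, np.ones_like(img))  # full weights: sharpened
```

Where the weight map is zero (low border probability) the input passes through unchanged; where it is high, the high-pass component is added and the local contrast increases.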
Fig. 3 schematically shows a second embodiment of the image enhancement unit 300 according to the invention, connected to a display device 120. The image enhancement unit 300 is provided with input images at its input connector 310 and is arranged to compute a sequence of output images on basis of the input images. The image enhancement unit 300 may provide the sequence of output images to a further (not depicted) processing unit or directly to a display device 120. The image enhancement unit 300 is arranged: to compute border probability values for edges representing respective probabilities for the edges to correspond to respective object borders in an input image; and - to compute an output image on basis of the input image and the border probability values, whereby a first amount of image enhancement in a first neighborhood of a particular edge is relatively high if the border probability value of the particular edge is relatively high, the first neighborhood being located on a first side of the particular edge.
The second embodiment of the image enhancement unit 300 comprises:
- an edge detection unit 302 for detecting edges in a depth map corresponding to an input image, whereby the depth map is provided at the second input connector 310;
- a border probability value calculating unit 304 for calculating border probability values for edges, representing respective probabilities for the edges to correspond to respective object borders in the input image; and
- an image enhancement filter 106 which is arranged to compute an output image on basis of the input image under the control of the border probability values being computed by the border probability value computation unit 304.
The edge detection unit 302, the border probability value computation unit 304 and the image enhancement filter 106 may be implemented using one processor. Normally, these functions are performed under control of a software program product. During execution, normally the software program product is loaded into a memory, like a RAM, and executed from there. The program may be loaded from a background memory, like a ROM, hard disk, or magnetic and/or optical storage, or may be loaded via a network like the Internet. Optionally, an application specific integrated circuit provides the disclosed functionality.
Images taken of real 3-D scenes typically contain many cues that help the human observer in perceiving 3-D. There are light cues, where planes turned towards a light source are brighter than others; perspectives, where parallel lines converge towards infinity; relative distances, where closer objects seem bigger than those farther away; etc. For instance, it is well known that increasing the sharpness of an image increases the perception of depth, simply because it increases the visibility of such cues.
The second embodiment of the image enhancement unit 300 according to the invention is arranged to enhance the contrast of an input image around the edges of its corresponding depth map. By doing so, light cues that are available in the input image, especially on the borders of objects, are enhanced and, therefore, a greater perception of depth is achieved. Since only the borders of objects are processed, typically a small percentage of the image, the amount of enhancement may be relatively strong without significantly changing the total input image.
Preferably, the second embodiment of the image enhancement unit 300 according to the invention is arranged to perform the following operations:
- determining the edges in the depth map, resulting in a two-dimensional matrix representing depth edges DHP, preferably using a standard high-pass filter on the depth map;
- dilation of DHP to increase the region, resulting in D;
- computing the border probability values, e.g. as specified in Equation 3:
P(x,y) = min{ max(D(x,y) − T, 0) ⊗ γ, 1 }    (3)
where P is the border probability value; ⊗ represents point-wise multiplication (also known as the Hadamard product); T represents a threshold after which differences in depth are considered to be relevant, to prevent being affected by noise in the depth map; and γ represents a noise reduction coefficient;
- blurring the values of the two-dimensional matrix representing the border probability values, to soften the transition between neighborhoods to be enhanced and neighborhoods which do not have to be enhanced; and
- applying local image enhancement in the image, proportional to the blurred values of the two-dimensional matrix representing the border probability values.
It was observed that applying local image enhancement could, in some cases, create visible unwanted overshoots away from edges. A way to tackle this and still perform the desired enhancement is to apply the enhancement only inside the objects. That means that the enhancement is asymmetrical, i.e. a relatively strong enhancement in a first region at a first side of the edge and a relatively low enhancement in a second region at a second side of the edge, opposite to the first side. Preferably, this is implemented by reducing the computed border probability values for the elements corresponding to a relatively high depth value. A relatively high depth value corresponds to background and a relatively low depth value corresponds to foreground, i.e. the objects of interest. If the image enhancement filter 106 is applied in the second embodiment of the image enhancement unit 300 as described in connection with Fig. 3, the weight coefficients 316 are based on the blurred values of the two-dimensional matrix representing the border probability values.
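The depth-driven pipeline above may be sketched as follows. The particular high-pass filter, the 3x3 dilation element, the box blur, and the constants T and γ are illustrative choices, and the threshold/gain/clip step is only one plausible reading of Equation 3:

```python
import numpy as np

def border_probability_from_depth(depth, T=2.0, gamma=0.1):
    """Sketch of the depth-driven pipeline: high-pass the depth map,
    dilate the depth edges, suppress differences below the threshold T,
    scale by gamma and clip to [0, 1], then blur to soften transitions.
    T and gamma are illustrative values."""
    d = depth.astype(np.float64)
    # Simple high-pass: absolute horizontal and vertical depth derivatives.
    dhp = np.zeros_like(d)
    dhp[:, 1:] += np.abs(np.diff(d, axis=1))
    dhp[1:, :] += np.abs(np.diff(d, axis=0))
    # Grey-level dilation over a 3x3 neighborhood widens the edge region.
    p = np.pad(dhp, 1, mode='edge')
    D = np.max([p[dy:dy + d.shape[0], dx:dx + d.shape[1]]
                for dy in range(3) for dx in range(3)], axis=0)
    # Threshold, gain, and clip: the border probability values.
    P = np.clip(gamma * np.maximum(D - T, 0.0), 0.0, 1.0)
    # 3x3 box blur softens the transition between enhanced regions.
    p = np.pad(P, 1, mode='edge')
    return sum(p[dy:dy + d.shape[0], dx:dx + d.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0

depth = np.zeros((12, 12))
depth[:, 6:] = 100.0                  # a large depth step at an object border
P = border_probability_from_depth(depth)
```

The resulting map P is high only in a narrow band around the depth step and can serve directly as the weight map of the enhancement filter.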
Fig. 4 schematically shows an input image 400. The input image represents an object 402 in front of the background. A number of edges 404, 410-414 are depicted.
However, only one of the depicted edges 404 corresponds to the border of the object 402. The other edges 410-414 do not belong to one of the borders of the object 402. These edges 410-414 actually correspond to texture within the object. According to the invention a distinction should be made between the two types of edges, i.e. border edges 404 and non-border edges 410-414. Relatively strong enhancement should be applied in the neighborhood 406 and/or 408 of the border edge 404. Enhancement in the neighborhood of the non-border edges 410-414, if any, should be limited.
Preferably, a further distinction is made, which means that the enhancement to be applied is preferably asymmetrical. For instance, the amount of enhancement for a first region 406, located in the neighborhood of the border edge at a first side of the edge 404, is substantially higher than the amount of enhancement for a second region 408, also located in the neighborhood of the border edge but at a second side of the edge 404, opposite to the first side. Alternatively, the amount of enhancement for the first region 406 is substantially lower than the amount of enhancement for the second region 408.
Fig. 5A schematically shows an input image 400 and a corresponding depth map 502. The input image 400 is described in connection with Fig. 4. The depth map 502 comprises a two-dimensional matrix of depth values. A first group 508 of depth values corresponds to the background. A second group 506 of depth values corresponds to the object 402 of the input image 400. By computing a derivative of the depth values in the x-direction, edges 504 in the depth map 502 can be detected. It is clearly noticeable that the detected edge 504 in the depth map 502 corresponds to the border edge 404 in the input image 400.
Fig. 5B shows an input image 514 and a corresponding depth map 512.
Next, the working of the first embodiment of the image enhancement unit 100 according to the invention will be explained in connection with the Figs. 6A to 6E.
Fig. 6A shows an input image;
Fig. 6B shows the detected edges in the input image of Fig. 6A;
Fig. 6C shows a border probability value map of input image of Fig. 6A. For the regions in the neighborhood of the detected edges, as shown in Fig. 6B, border probability values are computed. These border probability values have been put in a two- dimensional matrix and visualized in Fig. 6C;
Fig. 6D shows a filtered border probability value map, which is, based on thresholding the border probability value map as visualized in Fig. 6C. By thresholding is meant that only the border probability values are maintained which satisfy a predetermined threshold condition. The other border probability values have been assigned a value equal to zero; and
Fig. 6E shows an output image based on the input image of Fig. 6A and the filtered border probability value map as shown in Fig. 6D.
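The thresholding step from Fig. 6C to Fig. 6D can be sketched as follows; the threshold value and the example map are illustrative:

```python
import numpy as np

def threshold_map(P, t=0.5):
    """Keep only the border probability values that satisfy the
    threshold condition; all others are set to zero (the filtering
    step from Fig. 6C to Fig. 6D). The threshold t is illustrative."""
    return np.where(P >= t, P, 0.0)

P = np.array([[0.9, 0.2],
              [0.6, 0.4]])
filtered = threshold_map(P)
```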
Fig. 7 schematically shows an image processing apparatus 700 according to the invention, comprising: receiving means 702 for receiving a signal representing input images; the image enhancement unit 704 as described in connection with any of the Figs. 1 to 6; and a display device 120 for displaying the output images of the image enhancement unit 704.
The signal may be a broadcast signal received via an antenna or cable but may also be a signal from a storage device like a VCR (Video Cassette Recorder) or Digital Versatile Disk (DVD). The signal is provided at the input connector 710. The image processing apparatus 700 might e.g. be a TV. Alternatively the image processing apparatus 700 does not comprise the optional display device but provides the output images to an apparatus that does comprise a display device 120. Then the image processing apparatus 700 might be e.g. a set top box, a satellite-tuner, a VCR player, a DVD player or recorder. Optionally the image processing apparatus 700 comprises storage means, like a hard disk or means for storage on removable media, e.g. optical disks. The image processing apparatus 700 might also be a system being applied by a film-studio or broadcaster. The display device 120 may be a monoscopic or a stereoscopic display device.
Fig. 8A schematically shows a one-dimensional signal 800 representing luminance values of a line of an input image. Three different portions of the one-dimensional signal 800 can be distinguished: a first portion 810 belonging to a first region of the image at a first side of the edge; a second portion 812 corresponding to the edge, showing a transition of the luminance signal 800; and a third portion 814 belonging to a second region of the image at a second side of the edge, being opposite to the first side.
Fig. 8B to Fig. 8H schematically show one-dimensional signals which are based on the one-dimensional signal of Fig. 8A and which are achieved by enhancement according to the invention. In Table 1 the different types of enhancement which may be applied according to the invention are summarized.
[Table 1: not reproduced in this text.]
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word 'comprising' does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention can be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means can be embodied by one and the same item of hardware or software. The usage of the words first, second and third, etcetera does not indicate any ordering. These words are to be interpreted as names. No specific sequence of acts is intended to be required unless specifically indicated.

CLAIMS:
1. A method of image enhancement, the method comprising: computing border probability values for edges representing respective probabilities for the edges to correspond to respective object borders in an input image; and computing an output image on basis of the input image and the border probability values, whereby a first amount of image enhancement in a first neighborhood of a particular edge is relatively high if the border probability value of the particular edge is relatively high, the first neighborhood being located on a first side of the particular edge.
2. A method of image enhancement as claimed in claim 1, whereby a second amount of image enhancement in a second neighborhood of the particular edge is relatively high if the border probability value of the particular edge is relatively high, the second neighborhood being located on a second side of the particular edge, being opposite to the first side of the particular edge.
3. A method of image enhancement as claimed in claim 1, whereby a second amount of image enhancement in a second neighborhood of the particular edge is lower than the first amount of image enhancement, the second neighborhood being located on a second side of the particular edge, being opposite to the first side of the particular edge.
4. A method of image enhancement as claimed in any of the claims 1 to 3, whereby computing the border probability value of the particular edge is based on a measure of variation of pixel values of the input image in a third neighborhood of the particular edge.
5. A method of image enhancement as claimed in claim 4, whereby the probability value of the particular edge is relatively high if the measure of variation is relatively low.
6. A method of image enhancement as claimed in any of the claims 1 to 3, whereby computing the border probability value of the particular edge is based on a depth map corresponding to the input image.
7. A method of image enhancement as claimed in claim 6, whereby the border probability value of the particular edge is relatively high if the derivative of corresponding depth values in a predetermined direction is relatively high.
8. An image enhancement unit comprising: - first computing means for computing border probability values for edges representing respective probabilities for the edges to correspond to respective object borders in an input image; and second computing means for computing an output image on basis of the input image and the border probability values, whereby a first amount of image enhancement in a first neighborhood of a particular edge is relatively high if the border probability value of the particular edge is relatively high, the first neighborhood being located on a first side of the particular edge.
9. An image processing apparatus comprising: - receiving means for receiving a signal corresponding to an input image; and an image enhancement unit for computing an output image, as claimed in claim 8.
10. A computer program product to be loaded by a computer arrangement, comprising instructions for image enhancement, the computer arrangement comprising processing means and a memory, the computer program product, after being loaded, providing said processing means with the capability to carry out: computing border probability values for edges representing respective probabilities for the edges to correspond to respective object borders in an input image; and - computing an output image on basis of the input image and the border probability values, whereby a first amount of image enhancement in a first neighborhood of a particular edge is relatively high if the border probability value of the particular edge is relatively high, the first neighborhood being located on a first side of the particular edge.
PCT/IB2007/050666 2006-03-13 2007-03-01 Image enhancement WO2007105129A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP06111006.0 2006-03-13
EP06111006 2006-03-13

Publications (1)

Publication Number Publication Date
WO2007105129A1 true WO2007105129A1 (en) 2007-09-20

Family

ID=37967128

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2007/050666 WO2007105129A1 (en) 2006-03-13 2007-03-01 Image enhancement

Country Status (1)

Country Link
WO (1) WO2007105129A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI420413B (en) * 2010-07-15 2013-12-21 Chunghwa Picture Tubes Ltd Depth map enhancing method and computer-readable medium therefor
US9686528B2 (en) 2012-06-28 2017-06-20 Thomson Licensing Dealiasing method and device for 3D view synthesis

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020140864A1 (en) * 2001-03-28 2002-10-03 Koninklijke Philips Electronics N.V. System and method for performing segmentation-based enhancements of a video image
US20040120597A1 (en) * 2001-06-12 2004-06-24 Le Dinh Chon Tam Apparatus and method for adaptive spatial segmentation-based noise reducing for encoded image signal
US20050249430A1 (en) * 2004-05-07 2005-11-10 Samsung Electronics Co., Ltd. Image quality improving apparatus and method


Similar Documents

Publication Publication Date Title
Chen et al. Robust image and video dehazing with visual artifact suppression via gradient residual minimization
CN108702496B (en) System and method for real-time tone mapping
Tsiotsios et al. On the choice of the parameters for anisotropic diffusion in image processing
Ancuti et al. Single-scale fusion: An effective approach to merging images
Yu et al. Fast single image fog removal using edge-preserving smoothing
Talebi et al. Fast multilayer Laplacian enhancement
US7751641B2 (en) Method and system for digital image enhancement
US8965141B2 (en) Image filtering based on structural information
US9202267B1 (en) System and method to enhance and process a digital image
JP2001229390A (en) Method and device for changing pixel image into segment
EP1231778A2 (en) Method and system for motion image digital processing
Trentacoste et al. Unsharp masking, countershading and halos: enhancements or artifacts?
Singh et al. Weighted least squares based detail enhanced exposure fusion
KR20140109801A (en) Method and apparatus for enhancing quality of 3D image
Rabie Adaptive hybrid mean and median filtering of high-ISO long-exposure sensor noise for digital photography
CN115393216A (en) Image defogging method and device based on polarization characteristics and atmospheric transmission model
Han et al. Automatic illumination and color compensation using mean shift and sigma filter
Choudhury et al. Perceptually motivated automatic color contrast enhancement
WO2007105129A1 (en) Image enhancement
Choudhury et al. Hierarchy of nonlocal means for preferred automatic sharpness enhancement and tone mapping
Wang et al. A bilateral filtering based ringing elimination approach for motion-blurred restoration image
Chamaret et al. Video retargeting for stereoscopic content under 3D viewing constraints
Wang et al. Video enhancement using adaptive spatio-temporal connective filter and piecewise mapping
US20090226108A1 (en) Multi-pass algorithm to reduce ringing artifacts in noise removal and deblur effects
Song et al. Contrast enhancement algorithm considering surrounding information by illumination image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07705981

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07705981

Country of ref document: EP

Kind code of ref document: A1