WO2014056766A1 - Image enhancement apparatus and method - Google Patents

Image enhancement apparatus and method

Info

Publication number
WO2014056766A1
WO2014056766A1 (PCT/EP2013/070500, EP2013070500W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
level
input image
unit
enhancement
Prior art date
Application number
PCT/EP2013/070500
Other languages
French (fr)
Inventor
Paul Springer
Toru Nishi
Matthias BRÜGGEMANN
Original Assignee
Sony Corporation
Sony Deutschland Gmbh
Priority date
Filing date
Publication date
Application filed by Sony Corporation, Sony Deutschland Gmbh filed Critical Sony Corporation
Publication of WO2014056766A1 publication Critical patent/WO2014056766A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 - Image enhancement or restoration
    • G06T 5/73 - Deblurring; Sharpening
    • G06T 5/75 - Unsharp masking
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20004 - Adaptive image processing
    • G06T 2207/20008 - Globally adaptive

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Picture Signal Circuits (AREA)

Abstract

A proposed image enhancement apparatus comprises an image enhancement unit that enhances an input image based on a global feature level to obtain an enhanced image, a global feature level analysis unit that analyzes said input image to obtain said global feature level, and a subtraction unit that subtracts said input image from said enhanced image to obtain a difference image. An enhancement level estimation unit compares, at least partially, said enhanced input image and said input image and determines an enhancement level. A global gain factor computation unit determines a global gain factor based on said global feature level and said enhancement level. A multiplication unit multiplies said difference image with said global gain factor to obtain a weighted difference image and an addition unit adds said weighted difference image to said input image to obtain an output image.

Description

IMAGE ENHANCEMENT APPARATUS AND METHOD
BACKGROUND
Field of the Disclosure
[0001] The present disclosure relates to an image enhancement apparatus and method as well as a computer program for implementing said method on a computer.
Description of Related Art
[0002] Image enhancement apparatus and methods for video signal processing used in TV systems have to deal with many kinds of input quality. In many cases the input quality is not known. With a predefined enhancement gain, either an over-enhancement results for good input video quality, or input signals with a low input quality are not sufficiently enhanced. Hence, there is a need for an improved image enhancement apparatus and method in such situations.
[0003] The "background" description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventor(s), to the extent it is described in this background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly or impliedly admitted as prior art against the present invention.
SUMMARY
[0004] It is an object to provide an improved image enhancement apparatus and method that avoid both too strong and too weak image enhancement even if the quality of the input images is not known. It is a further object to provide a corresponding computer program for implementing said image enhancement method and a non-transitory computer-readable recording medium.
[0005] According to an aspect there is provided an image enhancement apparatus comprising:
an image enhancement unit that enhances an input image based on a global feature level to obtain an enhanced image,
a global feature level analysis unit that analyzes said input image to obtain said global feature level,
a subtraction unit that subtracts said input image from said enhanced image to obtain a difference image, an enhancement level estimation unit that compares, at least partially, said enhanced input image and said input image and determines an enhancement level,
a global gain factor computation unit that determines a global gain factor based on said global feature level and said enhancement level,
a multiplication unit that multiplies said difference image with said global gain factor to obtain a weighted difference image, and
an addition unit that adds said weighted difference image to said input image to obtain an output image.
[0006] According to a further aspect there is provided an image enhancement method comprising:
enhancing an input image based on a global feature level to obtain an enhanced image,
analyzing said input image to obtain said global feature level,
subtracting said input image from said enhanced image to obtain a difference image,
comparing, at least partially, said enhanced input image and said input image and determining an enhancement level,
determining a global gain factor based on said global feature level and said enhancement level,
multiplying said difference image with said global gain factor to obtain a weighted difference image, and
adding said weighted difference image to said input image to obtain an output image.
[0007] According to still further aspects a computer program comprising program means for causing a computer to carry out the steps of the method disclosed herein, when said computer program is carried out on a computer, as well as a non-transitory computer-readable recording medium that stores therein a computer program product, which, when executed by a processor, causes the method disclosed herein to be performed are provided.
[0008] Preferred embodiments are defined in the dependent claims. It shall be understood that the claimed image enhancement method, the claimed computer program and the claimed computer-readable recording medium have similar and/or identical preferred embodiments as the claimed image enhancement apparatus and as defined in the dependent claims.
[0009] One of the aspects of the disclosure is to provide for an automated control of image enhancement gain. One or more global feature levels of one or more global features of an input image, e.g. one or more of a sharpness level, contrast level, noise level or artifact level, are estimated. Based on this estimation the gain factor of the desired image enhancement, e.g. one or more of a sharpness enhancement, contrast enhancement, noise reduction and artifact reduction, is automatically controlled. Furthermore, the enhancement level after the image enhancement is estimated so that the gain of the image enhancement can be well adapted to achieve a desired enhancement level for each level of the global feature of said input image.
[0010] The proposed image enhancement apparatus and method are thus, for instance, able to estimate the sharpness level of an input sequence of input images for computing a gain value which is controlling the enhancement level of the image enhancement. Furthermore, the enhancement level can be adjusted to a predefined level, depending on which the gain level is automatically adjusted for each level of input sharpness.
[0011] In contrast to known apparatus and methods, the proposed apparatus and method not only compute a gain factor depending on the global feature level of the input image, but also adapt the gain factor to the current enhancement level of the input image.
[0012] It is to be understood that both the foregoing general description of the invention and the following detailed description are exemplary, but are not restrictive of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
Fig. 1 shows a block diagram of a first embodiment of a proposed image enhancement apparatus,
Fig. 2 shows a block diagram of a second embodiment of a proposed image enhancement apparatus,
Fig. 3 shows a block diagram of a third embodiment of a proposed image enhancement apparatus,
Fig. 4 shows a block diagram of a first embodiment of an image enhancement unit,
Fig. 5 shows a block diagram of a second embodiment of an image enhancement unit,
Fig. 6 shows a block diagram of a first embodiment of a sharpness level estimation unit,
Fig. 7 shows diagrams of local gradient and contrast for images with high and low sharpness,
Fig. 8 shows a block diagram of a second embodiment of a sharpness level estimation unit,
Fig. 9 shows a block diagram of an embodiment of an optimal local filter standard deviation computation unit,
Fig. 10 shows a block diagram of an embodiment of an enhancement level estimation unit, and
Fig. 11 shows a block diagram of an embodiment of a global gain factor computation unit.
DESCRIPTION OF THE EMBODIMENTS
[0014] Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views, Fig. 1 shows a block diagram of a first embodiment of a proposed image enhancement apparatus 10a according to the present disclosure. It comprises an image enhancement unit 12 that enhances an input image X based on a global feature level to obtain an enhanced image Y. A global feature level analysis unit 14 analyzes said input image X to obtain said global feature level 15. A subtraction unit 16 subtracts said input image X from said enhanced image Y to obtain a difference image 17. An enhancement level estimation unit 18 compares, at least partially (e.g. in image regions), said enhanced input image Y and said input image X and determines an enhancement level 19. A global gain factor computation unit 20 determines a global gain factor 21 based on said global feature level 15 and said enhancement level 19. A multiplication unit 22 multiplies said difference image 17 with said global gain factor 21 to obtain a weighted difference image 23. Finally, an addition unit 24 adds said weighted difference image 23 to said input image X to obtain an output image Z.
[0015] Examples for image enhancement methods used in the image enhancement unit 12 are sharpness or contrast enhancement and noise or artifact reduction. The parameters of the image enhancement can be controlled depending on the global feature level of X, computed by the global feature level analysis unit 14. Examples for global feature levels are sharpness level, contrast level, noise level or artifact level.
[0016] The enhancement level of the enhanced image Y is estimated by comparing image regions of X and Y with defined characteristics using the enhancement level estimation unit 18. The difference between X and Y can be described as a signal containing the enhancement information. It is multiplied with a gain factor to control the final enhancement level and added to the input image X, generating the enhanced output image Z. The gain factor is adapted to the global feature level of the input sequence.
[0017] Optionally, the global gain factor computation unit 20 computes the global gain factor 21 (also called global weighting factor) using the global feature level value 15 and the enhancement level value 19 compared to a predefined target enhancement level 25, e.g. set by the user or the system designer.
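For orientation, the signal flow of Fig. 1 can be summarized in a short Python sketch. It is a minimal illustration of the described data path, not the disclosed implementation; the callables passed in stand for the units 12, 14, 18 and 20, and their names and signatures are assumptions made only for this example.

```python
def enhance_frame(x, target_level, enhance, estimate_feature_level,
                  estimate_enhancement_level, compute_gain):
    """Gain-controlled image enhancement following the structure of Fig. 1.

    x -- input image X as a float array
    enhance -- image enhancement unit 12 (e.g. unsharp masking)
    estimate_feature_level -- global feature level analysis unit 14
    estimate_enhancement_level -- enhancement level estimation unit 18
    compute_gain -- global gain factor computation unit 20
    """
    feature_level = estimate_feature_level(x)                     # global feature level 15
    y = enhance(x, feature_level)                                  # enhanced image Y
    diff = y - x                                                   # subtraction unit 16 -> difference image 17
    enh_level = estimate_enhancement_level(x, y)                   # enhancement level 19
    gain = compute_gain(feature_level, enh_level, target_level)    # global gain factor 21
    z = x + gain * diff                                            # units 22 and 24 -> output image Z
    return z
```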
[0018] Fig. 2 depicts a block diagram of a second embodiment of a proposed image enhancement apparatus 10b according to the present disclosure for temporally recursive processing. In this embodiment the result Z of the apparatus 10b is stored to a frame buffer 26 and motion compensated in a motion compensation unit 28 to obtain a motion compensated output image Z', so that it can be used as further input of the processing.
[0019] In a mixing unit 30 the motion compensated previous result Z' is mixed with the input image X, depending on the reliability of the motion compensation. The mixed input signal W is then forwarded to the image enhancement unit 12, which enhances the input image X using information from the compensated previous result Z' and the current input W. Examples for such temporally recursive image enhancement methods are super-resolution or temporal noise and artifact reduction. The gain control works similarly to the non-recursive processing explained above with reference to Fig. 1 and can contain an additional stabilizing factor that avoids changing enhancement levels from frame to frame.
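A minimal sketch of the mixing step, assuming a per-pixel reliability map with values in [0, 1]; how this reliability is derived from the motion compensation is not specified here, so the argument is an assumption.

```python
import numpy as np

def mix_recursive_input(x, z_prev_mc, reliability):
    """Mixing unit 30 (sketch): blend the motion compensated previous output Z'
    with the current input X, weighted by the motion compensation reliability.

    reliability -- per-pixel weights in [0, 1]; 1 trusts Z' fully, 0 falls back to X.
    """
    reliability = np.clip(reliability, 0.0, 1.0)
    w = reliability * z_prev_mc + (1.0 - reliability) * x   # mixed input signal W
    return w
```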
[0020] Fig. 3 depicts a block diagram of a third embodiment of a proposed image enhancement apparatus 10c according to the present disclosure for automated control of sharpness enhancement as an example for the image enhancement. For realizing an automated control of sharpness enhancement, a well-known unsharp masking method is used as an exemplary embodiment for the image enhancement unit 12'. The low-pass filter kernel used inside the unsharp masking is controlled depending on the global sharpness level 15' of the input image X. As the global feature level analysis unit, a sharpness level estimation unit 14' is realized.
[0021] Two different embodiments for the sharpness level estimation are preferred. The first one estimates the sharpness level of the input image X by comparing the maximum gradient and the local contrast in a local block area. The second one estimates the sharpness by computing the mean optimal standard deviation for Gaussian filtering inside detected edge areas. Furthermore the gain value shall be adapted to the current enhancement level. The enhancement level estimation unit 18 compares the local contrasts of the input image X and the enhanced input image Y and computes the mean of the ratios between the local contrasts in a discriminated area. This value is defined as the enhancement level 19 and is forwarded to the global gain factor computation unit 20. This unit 20 computes a global gain factor using the sharpness level value 15' (generally the global feature level) and the enhancement level value 19, preferably compared to a predefined target enhancement level 25. [0022] In case of an input signal X with a high sharpness level only a small gain factor 21 is computed to add only a small amount of sharpness, while in case of an input signal X with a low sharpness level a high gain factor 21 is chosen. The difference signal 17 between X and Y is multiplied with the global gain factor 21 to control the gain of the image enhancement.
[0023] Generally, any kind of image processing that changes the input image and that should be adapted to the sharpness level of the input image can be used as an embodiment of the sharpness enhancement unit 12'. By way of example, an unsharp masking unit 12a as depicted in Fig. 4 is used as an embodiment for the image enhancement unit 12, particularly for the sharpness enhancement unit 12'. The current input image X is separately filtered in the x- and y-direction using a Gaussian filter kernel 123 by a first filter 120 and a second (orthogonal) filter 122. The filter kernel 123 can be computed in a Gaussian filter computation unit 124 depending on the input global feature level 15 (e.g. the input sharpness 15'). The filter coefficients for a 7-tap Gaussian filter kernel with standard deviation σ are computed using the following formula:
Filter(i) = exp(-i² / (2σ²)), i = -3...3
[0024] The standard deviation for an image with a high sharpness level should be selected to be lower than for an image with a low sharpness level. The low-pass filtered input image 125 is subtracted from the input signal X in a subtraction unit 126. The resulting difference signal 127 is finally added to the input X in an addition unit 128, generating an output signal Y with a higher sharpness than the input signal X.
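For illustration, the separable unsharp masking of Fig. 4 can be sketched in Python/NumPy as follows. This is a minimal sketch rather than the disclosed implementation: the unit-sum normalization of the kernel and the function names are assumptions, and the mapping from the sharpness level to σ is only indicated in a comment.

```python
import numpy as np

def gaussian_kernel_7tap(sigma):
    """7-tap kernel Filter(i) = exp(-i^2 / (2*sigma^2)), i = -3..3.
    Normalization to unit sum is an added assumption so the low-pass keeps the mean level."""
    i = np.arange(-3, 4, dtype=np.float64)
    k = np.exp(-(i ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def separable_lowpass(img, sigma):
    """Filter the image separately in x- and y-direction (filters 120 and 122)."""
    k = gaussian_kernel_7tap(sigma)
    lp = np.apply_along_axis(lambda row: np.convolve(row, k, mode='same'), 1, img)
    return np.apply_along_axis(lambda col: np.convolve(col, k, mode='same'), 0, lp)

def unsharp_masking(x, sigma):
    """Unsharp masking unit 12a: subtract the low-pass 125 from X (subtraction unit 126)
    and add the resulting difference signal 127 back to X (addition unit 128)."""
    detail = x - separable_lowpass(x, sigma)
    return x + detail

# Illustrative choice only (Gaussian filter computation unit 124): a high estimated
# sharpness level would map to a small sigma, a low sharpness level to a larger one.
```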
[0025] In case of temporally recursive processing, the image (e.g. sharpness) enhancement can be realized by combining details from the compensated previous result and from the current input signal, as provided in the embodiment of the image enhancement unit 12b depicted in Fig. 5. The difference signal 127 between the low-pass filtered current input 125 and the mixed input signal W (see Fig. 2), which contains the details from previous frames, is added to the current input X. A signal Y containing details from the current and the previous frames is thus generated according to this embodiment.
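A corresponding sketch for this recursive variant, reusing separable_lowpass from the previous example; the sign of the difference (W minus the low-pass of X) is an assumption chosen so that the detail signal adds to the current input.

```python
def recursive_unsharp_masking(x, w, sigma):
    """Image enhancement unit 12b (Fig. 5, sketch): form the difference signal 127 from
    the mixed input W (carrying details of previous frames) and the low-pass filtered
    current input 125, then add it to the current input X."""
    detail = w - separable_lowpass(x, sigma)   # difference signal 127
    return x + detail                          # signal Y with details from current and previous frames
```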
[0026] Another example for an embodiment of the image enhancement unit 12 is a temporally recursive image enhancement as described in European patent application 12167633.2 filed by the applicant on May 11, 2012, which is herein incorporated by reference in its entirety. This document describes an image enhancement apparatus for enhancing an input image of a sequence of input images of at least a first view and obtaining an enhanced output image of at least said first view, said apparatus comprising an unsharp masking unit configured to enhance the sharpness of the input image, a motion compensation unit configured to generate at least one preceding motion compensated image by compensating motion in a preceding output image, a weighted selection unit configured to generate a weighted selection image from said sharpness enhanced input image and said preceding motion compensated image based on a selection weighting factor, a detail signal generation unit configured to generate a detail signal from said input image and said weighted selection image, and a combination unit configured to generate said enhanced output image from said detail signal and from said input image and/or said weighted selection image.
[0027] Fig. 6 depicts a block diagram of a first embodiment of the sharpness level estimation unit 14'a. The idea for estimating the sharpness level is depicted in the diagrams shown in Fig. 7. In general, images with a low sharpness level show a smoother edge transition (as shown in Fig. 7A) than images with a high sharpness level (as shown in Fig. 7B). So the maximum gradient of an edge with the same height is lower for a lower sharpness level, while the local contrast, describing the height of the edge, is similar. The idea is to compute a mean ratio of maximum gradient and local contrast for detected edge positions. This value is assumed to describe the sharpness level in the input image.
[0028] In a first step, the absolute gradient is computed for the whole input image X in a gradient computation unit 140. The gradients in x- and y-directions are computed by simple difference operators:
gradX(x, y) = X(x, y) - X(x - 1, y)
gradY(x, y) = X(x, y) - X(x, y - 1)
[0029] Then the absolute gradient is computed by the following operation:
[Equation image imgf000012_0001 not reproduced: the absolute gradient grad(x, y) is formed from gradX(x, y) and gradY(x, y).]
[0030] Based on this gradient value the maximum local gradient is detected inside a local block area (e.g. 5x5 pixels) in a maximum local gradient computation unit 142. Further, based on a threshold decision edge positions are detected in an edge feature detection unit 144. If the gradient at the position (x, y) exceeds a defined threshold, the position is assumed to be located on an edge. Furthermore, by a local contrast computation unit 146 the local contrast is computed for the input image X in the same block area that is used for the maximum gradient detection (and additionally provided as separate output if needed in an embodiment). Then for each pixel the ratio of maximum gradient and local contrast is computed in a ratio computation unit 148. Finally the mean of this ratio is computed inside the discriminated edge area in a sharpness level computation unit 150. The resulting value is defined as sharpness level 15'.
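A sketch of this first estimation variant in Python/NumPy; the edge threshold, the handling of the 5x5 block via scipy.ndimage and the use of the gradient magnitude sqrt(gradX² + gradY²) as the absolute gradient are assumptions made for illustration.

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def estimate_sharpness_level(x, edge_threshold=8.0, block=5):
    """Sharpness level estimation unit 14'a (Fig. 6, sketch)."""
    # gradient computation unit 140: simple difference operators
    grad_x = np.zeros_like(x); grad_x[:, 1:] = x[:, 1:] - x[:, :-1]
    grad_y = np.zeros_like(x); grad_y[1:, :] = x[1:, :] - x[:-1, :]
    grad = np.hypot(grad_x, grad_y)        # one possible absolute-gradient definition (assumption)

    max_grad = maximum_filter(grad, size=block)                                     # unit 142
    local_contrast = maximum_filter(x, size=block) - minimum_filter(x, size=block)  # unit 146
    edges = grad > edge_threshold                                                   # unit 144

    ratio = max_grad / np.maximum(local_contrast, 1e-6)        # ratio computation unit 148
    return float(ratio[edges].mean()) if edges.any() else 0.0  # unit 150 -> sharpness level 15'
```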
[0031] Fig. 8 shows a second embodiment for the sharpness level estimation unit 14'b. According to this embodiment the mean optimal standard deviation for Gaussian filtering is computed inside a detected edge area. This measure also indicates the sharpness of an edge, as for steep edges a smaller optimal standard deviation is detected than for blurred edges.
[0032] As in the first embodiment, in a first step the absolute gradient is computed for the whole input image X in the gradient computation unit 140. The gradients in x- and y-directions are computed by simple difference operators:
gradX(x, y) = X(x, y) - X(x - 1, y)
gradY(x, y) = X(x, y) - X(x, y - 1)
[0033] Then the absolute gradient is computed by the following operation:
[Equation image imgf000013_0001 not reproduced: the absolute gradient grad(x, y) is formed from gradX(x, y) and gradY(x, y).]
[0034] Based on a threshold decision edge positions are detected in an edge feature detection unit 144. If the gradient at the position (x, y) exceeds a defined threshold the position is assumed to be located on an edge.
[0035] Furthermore, the optimal local standard deviation for Gaussian filtering is computed using a set of e.g. three pre-defined filter standard deviations StdDev1, StdDev2, StdDev3 in an optimal local filter standard deviation computation unit 152. Finally, the mean optimal standard deviation is computed inside the detected edge area in an optimal standard deviation computation unit 154. This value is defined as sharpness level.
[0036] An embodiment of the optimal local filter standard deviation computation unit 152 is depicted in Fig. 9. The optimal standard deviation is computed depending on the minimum description length criterion. To realize this, the input signal X is separately filtered in a filter unit 1524 with three different 7-tap filter kernels, which are computed in a filter computation unit 1522 using three different standard deviations σ_k (k = 1, 2, 3):
Filter_k(i) = exp(-i² / (2σ_k²)), i = -3...3

[0037] For filtering, the input image X is separately convolved with the filter coefficients in the horizontal and vertical direction:

I_Filter,hor,k(x, y) = [ Σ_{i=-3...3} Filter_k(i) · X(x + i, y) ] / [ Σ_{i=-3...3} Filter_k(i) ]

I_Filter,vert,k(x, y) = [ Σ_{i=-3...3} Filter_k(i) · I_Filter,hor,k(x, y + i) ] / [ Σ_{i=-3...3} Filter_k(i) ]
[0038] Then the difference images diff_k between the low-pass filtered results and the input X are computed in a difference image computation unit 1526. For each filtered image the local description length is then computed inside a block area (e.g. 5x5 pixels) by a description length computation unit 1528 using the following equation:

dl_k = λ + σ_k · Σ_{5x5 block} diff_k² / 25,  with λ = 48
[0039] The local description length values are used to detect, in a selection unit 1530, the standard deviation of the low-pass filter that induces the local minimum description length.
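A sketch of this selection in Python/NumPy; the concrete standard deviations, the constant λ = 48 (as reconstructed above) and the helper names are assumptions made for illustration.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def gauss7(sigma):
    """7-tap Gaussian kernel, normalized to unit sum (the normalization is an assumption)."""
    i = np.arange(-3, 4, dtype=np.float64)
    k = np.exp(-(i ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def optimal_local_std_dev(x, std_devs=(0.8, 1.6, 3.2), lam=48.0, block=5):
    """Optimal local filter standard deviation computation unit 152 (Fig. 9, sketch)."""
    dls = []
    for sigma in std_devs:
        k = gauss7(sigma)                                                            # unit 1522
        lp = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, x)
        lp = np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, lp)    # filter unit 1524
        diff = x - lp                                                                # unit 1526
        dl = lam + sigma * uniform_filter(diff ** 2, size=block)   # unit 1528: local description length
        dls.append(dl)
    best = np.argmin(np.stack(dls), axis=0)                        # selection unit 1530
    return np.asarray(std_devs, dtype=np.float64)[best]            # per-pixel optimal sigma
```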
[0040] Fig. 10 depicts a block diagram of an embodiment of the enhancement level estimation unit 18. For estimating the enhancement level, the local contrasts of the enhanced image Y and the input image X are compared and a mean ratio of the contrasts is computed inside an area with a defined local contrast range of the input X. The local contrast of the input image X and the enhanced input image Y is computed by local contrast computation units 182, 184 inside a local block area (e.g. 5x5 pixels) by computing the difference between the maximum and minimum luminance value inside this block area. Then the ratio of the local contrast of X and the local contrast of Y is computed for each pixel in a contrast ratio computation unit 186. Furthermore, an area is discriminated in which the local contrast of X has a value inside a defined contrast range in an area discrimination unit 188. Inside this area the mean of the contrast ratios is computed in a mean contrast ratio computation unit 190 and defined as enhancement level.
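A sketch of this estimation; the contrast range, the block size and the orientation of the ratio (Y over X, so that stronger enhancement gives a larger value) are assumptions, since the text leaves the order of the ratio open.

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def estimate_enhancement_level(x, y, contrast_range=(16.0, 128.0), block=5):
    """Enhancement level estimation unit 18 (Fig. 10, sketch)."""
    def local_contrast(img):
        # units 182 / 184: maximum minus minimum luminance inside a block area
        return maximum_filter(img, size=block) - minimum_filter(img, size=block)

    cx, cy = local_contrast(x), local_contrast(y)
    ratio = cy / np.maximum(cx, 1e-6)                                # contrast ratio computation unit 186
    mask = (cx >= contrast_range[0]) & (cx <= contrast_range[1])     # area discrimination unit 188
    return float(ratio[mask].mean()) if mask.any() else 1.0          # unit 190 -> enhancement level 19
```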
[0041] Based on the estimated enhancement and global feature level (e.g. the sharpness level) a global gain factor is computed by the global gain factor computation unit 20. An embodiment of the global gain factor computation unit 20, using the sharpness level as an example for the global feature level, is depicted in Fig. 11. At first the computed enhancement level and sharpness level are mapped to a predefined range by an enhancement level mapping unit 202 and a sharpness level mapping unit 204. An example for said mapping is a linear mapping between 0 and 1. The mapped sharpness level value 205 is used as the basic gain value, as the gain of the image enhancement shall be controlled depending on the sharpness level. Preferably, the gain should be high for input signals with low sharpness levels and low for input signals with high sharpness levels. To be able to achieve a predefined target enhancement level 201, the mapped estimated enhancement level 203 is subtracted from the predefined target enhancement level 201 in a subtraction unit 206 and the difference 207 is added to the basic gain value 205 in an addition unit 208. The output of the addition unit 208 may be used as global gain factor 21 according to this embodiment.
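A sketch of this gain computation; the mapping ranges, the explicit inversion of the sharpness mapping and the handling of stored differences (the stabilization described in the next paragraph) are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def compute_global_gain(sharpness_level, enhancement_level, target_level,
                        prev_diffs=None, s_range=(0.0, 1.0), e_range=(1.0, 2.0)):
    """Global gain factor computation unit 20 (Fig. 11, sketch).

    prev_diffs -- optional list of previously stored enhancement level differences,
                  mutated in place for the recursive stabilization.
    """
    def to_unit(v, lo, hi):
        # linear mapping to [0, 1] (mapping units 202 / 204)
        return float(np.clip((v - lo) / (hi - lo), 0.0, 1.0))

    basic_gain = 1.0 - to_unit(sharpness_level, *s_range)   # mapped sharpness 205: low sharpness -> high gain
    mapped_enh = to_unit(enhancement_level, *e_range)       # mapped enhancement level 203
    diff = target_level - mapped_enh                        # subtraction unit 206 -> difference 207
    gain = basic_gain + diff                                # addition unit 208

    # Optional stabilization for recursive processing (see paragraph [0042] below):
    # add the sum 211 of the stored enhancement level differences 210.
    if prev_diffs is not None:
        gain += sum(prev_diffs)
        prev_diffs.append(diff)
        del prev_diffs[:-2]          # keep the last two time instances
    return gain
```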
[0042] In case the estimated enhancement level is too high, the gain value is reduced, and if it is too low the gain value is raised. Especially in case a temporally recursive image enhancement is used, it is possible that the enhancement level strongly varies from frame to frame. Therefore, an additional stabilizing factor can be integrated into the gain factor computation in case of recursive processing. For this purpose, the computed enhancement level differences are stored for one or more (here two) time instances 210 (generally in a storage unit) and a sum 211 of the last three values is finally added to the gain value 209 to obtain the global gain factor 21 in this variation of the embodiment.
[0043] In summary, the present disclosure provides an apparatus and a method for an automated control of image enhancement gain. The global feature level, e.g. the sharpness, of the input signal is estimated and based on this estimation the gain factor of the image enhancement can be automatically controlled. Furthermore, the enhancement level after the image enhancement is estimated so that the gain of the image enhancement can be well adapted to achieve a desired enhancement level for each level of input sharpness. This avoids both too strong and too weak image enhancement even if the quality of the input images is not known.
[0044] The various elements of the different embodiments of the provided apparatus may be implemented as software and/or hardware, e.g. as separate or combined circuits. A circuit is a structural assemblage of electronic components including conventional circuit elements, integrated circuits including application specific integrated circuits, standard integrated circuits, application specific standard products, and field programmable gate arrays. Further a circuit includes central processing units, graphics processing units, and microprocessors which are programmed or configured according to software code. A circuit does not include pure software, although a circuit does include the above-described hardware executing software.
[0045] Obviously, numerous modifications and variations of the present disclosure are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the invention may be practiced otherwise than as specifically described herein.
[0046] In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single element or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. [0047] In so far as embodiments of the invention have been described as being implemented, at least in part, by software-controlled data processing apparatus, it will be appreciated that a non-transitory machine-readable medium carrying such software, such as an optical disk, a magnetic disk, semiconductor memory or the like, is also considered to represent an embodiment of the present invention. Further, such software may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.
[0048] Any reference signs in the claims should not be construed as limiting the scope.

Claims

1. An image enhancement apparatus comprising:
an image enhancement unit that enhances an input image based on a global feature level to obtain an enhanced image,
a global feature level analysis unit that analyzes said input image to obtain said global feature level,
a subtraction unit that subtracts said input image from said enhanced image to obtain a difference image,
an enhancement level estimation unit that compares, at least partially, said enhanced input image and said input image and determines an enhancement level,
a global gain factor computation unit that determines a global gain factor based on said global feature level and said enhancement level,
a multiplication unit that multiplies said difference image with said global gain factor to obtain a weighted difference image, and
an addition unit that adds said weighted difference image to said input image to obtain an output image.
2. The image enhancement apparatus as claimed in claim 1,
wherein said image enhancement unit is configured to carry out one or more of a sharpness enhancement, contrast enhancement, noise reduction and artifact reduction as image enhancement, and
wherein said global feature level analysis unit is configured to obtain one or more of a sharpness level, contrast level, noise level or artifact level as global feature level.
3. The image enhancement apparatus as claimed in claim 1,
further comprising
a storage unit that stores said output image, a motion compensation unit that compensates motion in said stored output image to obtain a motion compensated output image and
a mixing unit that mixes said input image with said motion compensated output image, in particular depending on the reliability of the motion compensation, to obtain a mixed input image
wherein said image enhancement unit is configured to enhance said input image using information from said mixed input image.
4. The image enhancement apparatus as claimed in claim 1,
wherein said image enhancement unit comprises an unsharp masking unit,
wherein said global feature level analysis unit is configured to estimate the global sharpness level of said input image, and
wherein said gain factor computation unit is configured to determine said global gain factor based on said global sharpness level, wherein a small global gain is determined in case of a high sharpness level and a large global gain is determined in case of a small sharpness level.
5. The image enhancement apparatus as claimed in claim 4,
wherein said unsharp masking unit is configured to separately filter said input image in two orthogonal directions depending on a filter kernel determined depending on said global sharpness level to obtain a filtered input image, to subtract said filtered input image from said input image to obtain a subtraction image and to add said subtraction image to said input image to obtain said enhanced image.
6. The image enhancement apparatus as claimed in claims 3 and 4,
wherein said unsharp masking unit is configured to separately filter said mixed input image in two orthogonal directions depending on a filter kernel determined depending on said global sharpness level to obtain a filtered mixed input image, to subtract said filtered mixed input image from said input image to obtain a subtraction image and to add said subtraction image to said input image to obtain said enhanced image.
7. The image enhancement apparatus as claimed in claim 1,
wherein said global feature level analysis unit comprises a sharpness level estimation unit that determines a global sharpness level of said input image, wherein said sharpness level estimation unit is configured to determine a mean ratio of maximum gradient and local contrast for detected edge positions within said input image as global sharpness level of said input image.
8. The image enhancement apparatus as claimed in claim 7,
wherein said sharpness level estimation unit comprises
a gradient computation unit that computes a global gradient of said input image, a maximum local gradient computation unit that computes a maximum local gradient inside one or more local block areas,
an edge detection unit that detects edge areas within said input image by use of said global gradient,
a local contrast computation unit that computes a local contrast inside said one or more local block areas,
a ratio computation unit that computes, for one or more pixels of said one or more local block areas, the ratio of maximum gradient and local contrast,
a sharpness level computation unit that computes a mean of said ratios of maximum gradient and local contrast for said detected edge areas.
9. The image enhancement apparatus as claimed in claim 1,
wherein said global feature level analysis unit comprises a sharpness level estimation unit that determines a global sharpness level of said input image, wherein said sharpness level estimation unit is configured to determine a mean optimal standard deviation for Gaussian filtering inside detected edge areas as global sharpness level.
10. The image enhancement apparatus as claimed in claim 9,
wherein said sharpness level estimation unit comprises a gradient computation unit that computes a global gradient of said input image, an edge detection unit that detects edge areas within said input image by use of said global gradient,
an optimal local filter standard deviation computation unit that computes an optimal local standard deviation for filtering said input image, and
a mean optimal standard deviation computation unit that computes a mean optimal filter standard deviation inside detected edge areas as global sharpness level.
11. The image enhancement apparatus as claimed in claim 10,
wherein said optimal local filter standard deviation computation unit comprises
a filter unit comprising a number of filters that separately filter said input image with a number of different filter kernels, which are computed using a number of different standard deviations, wherein said input image is separately convoluted with filter coefficients in horizontal and vertical direction,
a difference image computation unit that computes difference images between the respective filtered input images and the input image,
a description length computation unit that computes local description lengths inside one or more local block areas for the respective difference images, and
a selection unit that selects the standard deviation of the filter that induces the local minimum description length as said optimal standard deviation.
12. The image enhancement apparatus as claimed in claim 1,
wherein said enhancement level estimation unit comprises
local contrast computation units that separately compute local contrasts within said enhanced input image and said input image,
a contrast ratio computation unit that computes, for a number or all pixels, a contrast ratio from said local contrasts of said enhanced input image and said input image, an area discrimination unit that discriminates an area in which the local contrast of the input image has a value inside a predetermined contrast range, and a mean contrast ratio computation unit that computes the mean of said contrast ratios as enhancement level.
13. The image enhancement apparatus as claimed in claim 1,
wherein said global gain factor computation unit comprises
a mapping unit that maps said global feature level and said enhancement level to a predetermined range to obtain a mapped global feature level and a mapped enhancement level,
a subtraction unit that subtracts the mapped enhancement level from a target enhancement level, and
a first addition unit that adds the result of said subtraction to said mapped global feature level to obtain said global gain factor.
14. The image enhancement apparatus as claimed in claim 13,
wherein said global gain factor computation unit further comprises
a storage unit that stores one or more previous mapped enhancement levels, a second addition unit that adds one or more of said one or more previous mapped enhancement levels, the current mapped enhancement levels and the output of said first addition unit to obtain said global gain factor.
15. An image enhancement method comprising:
enhancing an input image based on a global feature level to obtain an enhanced image,
analyzing said input image to obtain said global feature level,
subtracting said input image from said enhanced image to obtain a difference image,
comparing, at least partially, said enhanced input image and said input image and determining an enhancement level,
determining a global gain factor based on said global feature level and said enhancement level, multiplying said difference image with said global gain factor to obtain a weighted difference image, and
adding said weighted difference image to said input image to obtain an output image.
16. A computer program comprising program code means for causing a computer to perform the steps of said method as claimed in claim 15 when said computer program is carried out on a computer.
17. A non-transitory computer-readable recording medium that stores therein a computer program product, which, when executed by a processor, causes the method according to claim 15 to be performed.
PCT/EP2013/070500 2012-10-12 2013-10-02 Image enhancement apparatus and method WO2014056766A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP12188310 2012-10-12
EP12188310.2 2012-10-12

Publications (1)

Publication Number Publication Date
WO2014056766A1 (en) 2014-04-17

Family

ID=47080313

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2013/070500 WO2014056766A1 (en) 2012-10-12 2013-10-02 Image enhancement apparatus and method

Country Status (1)

Country Link
WO (1) WO2014056766A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111028182A (en) * 2019-12-24 2020-04-17 北京金山云网络技术有限公司 Image sharpening method and device, electronic equipment and computer-readable storage medium
CN112465720A (en) * 2020-11-27 2021-03-09 南京邮电大学 Image defogging method and device based on image sky segmentation and storage medium
CN117575974A (en) * 2024-01-15 2024-02-20 浙江芯劢微电子股份有限公司 Image quality enhancement method, system, electronic equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4794531A (en) * 1984-11-07 1988-12-27 Hitachi, Ltd Unsharp masking for image enhancement
EP2226761A2 (en) * 2009-03-05 2010-09-08 Tektronix, Inc. Methods and systems for filtering a digital signal

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4794531A (en) * 1984-11-07 1988-12-27 Hitachi, Ltd Unsharp masking for image enhancement
EP2226761A2 (en) * 2009-03-05 2010-09-08 Tektronix, Inc. Methods and systems for filtering a digital signal

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ANDREA POLESEL ET AL: "Image Enhancement via Adaptive Unsharp Masking", IEEE TRANSACTIONS ON IMAGE PROCESSING, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 9, no. 3, 1 March 2000 (2000-03-01), XP011025550, ISSN: 1057-7149 *
XIAOLING XIAO ET AL: "An improved unsharp masking method for borehole image enhancement", INDUSTRIAL MECHATRONICS AND AUTOMATION (ICIMA), 2010 2ND INTERNATIONAL CONFERENCE ON, IEEE, PISCATAWAY, NJ, USA, 30 May 2010 (2010-05-30), pages 349 - 352, XP031725509, ISBN: 978-1-4244-7653-4 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111028182A (en) * 2019-12-24 2020-04-17 北京金山云网络技术有限公司 Image sharpening method and device, electronic equipment and computer-readable storage medium
CN111028182B (en) * 2019-12-24 2024-04-26 北京金山云网络技术有限公司 Image sharpening method, device, electronic equipment and computer readable storage medium
CN112465720A (en) * 2020-11-27 2021-03-09 南京邮电大学 Image defogging method and device based on image sky segmentation and storage medium
CN112465720B (en) * 2020-11-27 2024-02-23 南京邮电大学 Image defogging method and device based on image sky segmentation and storage medium
CN117575974A (en) * 2024-01-15 2024-02-20 浙江芯劢微电子股份有限公司 Image quality enhancement method, system, electronic equipment and storage medium
CN117575974B (en) * 2024-01-15 2024-04-09 浙江芯劢微电子股份有限公司 Image quality enhancement method, system, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
US7406208B2 (en) Edge enhancement process and system
CN100456318C (en) Method for simutaneously suppressing undershoot and over shoot for increasing digital image
US7570309B2 (en) Methods for adaptive noise reduction based on global motion estimation
US8768069B2 (en) Image enhancement apparatus and method
US7792384B2 (en) Image processing apparatus, image processing method, program, and recording medium therefor
EP2439927B1 (en) Image processing device, image processing method, program, recording medium, and integrated circuit
US9135683B2 (en) System and method for temporal video image enhancement
US20130301949A1 (en) Image enhancement apparatus and method
US20060125816A1 (en) Methods of suppressing ringing artifact of decompressed images
EP2836963B1 (en) Noise reduction for image sequences
JP6587317B2 (en) Guided filter-based detail enhancement
KR100677133B1 (en) Method and apparatus for detecting and processing noisy edges in image detail enhancement
EP2816526B1 (en) Filtering method and apparatus for recovering an anti-aliasing edge
EP2107521A2 (en) Detecting a border region in an image
US7932955B2 (en) Method and system for content adaptive analog video noise detection
WO2014056766A1 (en) Image enhancement apparatus and method
US20120019677A1 (en) Image stabilization in a digital camera
US20130301928A1 (en) Shift vector reliability determining apparatus and method
KR101024731B1 (en) Method and system for reducing mosquito noise in a digital image
US9258461B2 (en) Image processing device and method, and image processing program
JP5420049B1 (en) Magnification rate estimation apparatus or method
Brüggemann et al. Automated Gain Control for Temporally Recursive Detail Reconstruction with Variable Video Input Sharpness
US20150055874A1 (en) Image analyzing apparatus and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13776986

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13776986

Country of ref document: EP

Kind code of ref document: A1