WO2017036908A1 - Method and apparatus for inverse tone mapping - Google Patents


Info

Publication number
WO2017036908A1
Authority
WO
WIPO (PCT)
Application number
PCT/EP2016/070059
Other languages
French (fr)
Inventor
Tania POULI
Jonathan Kervec
Cédric Thebault
Original Assignee
Thomson Licensing
Application filed by Thomson Licensing filed Critical Thomson Licensing
Priority to KR1020187005157A priority Critical patent/KR102523505B1/en
Priority to EP16757023.3A priority patent/EP3345155B1/en
Priority to CN201680050153.4A priority patent/CN108027961B/en
Priority to JP2018510794A priority patent/JP6803378B2/en
Priority to US15/755,505 priority patent/US10572983B2/en
Publication of WO2017036908A1 publication Critical patent/WO2017036908A1/en

Classifications

    • G06T5/90
    • G06T5/94
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20172 Image enhancement details
    • G06T2207/20208 High dynamic range [HDR] image processing


Abstract

A method for inverse tone mapping is provided. The method comprises obtaining a digital image in a color space wherein the luminance is separate from the chrominance, determining a base luminance of pixels in the digital image, determining a detail enhancement map, determining a pixel expansion exponent map, determining an edge map of the image, inverse tone mapping the luminance of the image based on the edge map, the pixel expansion exponent map and the base luminance, and providing an expanded dynamic range image based on the inverse tone mapped luminance.

Description

METHOD AND APPARATUS FOR INVERSE TONE MAPPING
Technical Field
This disclosure pertains to the field of tone management and specifically the problem of expanding the luminance of images to match the dynamic range of high dynamic range (HDR) displays. The disclosure specifically proposes a method for reducing noise in the image as part of the expansion process.
Background Art
Recent advancements in display technology are beginning to allow an extended range of color, luminance and contrast to be displayed. Technologies allowing for extensions in the luminance or brightness range of image content are known as high dynamic range imaging, often shortened to HDR. HDR technologies focus on capturing, processing and displaying content of a wider dynamic range. Although a number of HDR display devices have appeared, and cameras capable of capturing images with an increased dynamic range are being developed, there is still very limited HDR content available. While recent developments promise native capture of HDR content in the near future, they do not address existing content. To prepare conventional (hereafter referred to as LDR, for low dynamic range) content for HDR display devices, reverse or inverse tone mapping operators (ITMOs) can be employed. Such algorithms process the luminance information of colors in the image content with the aim of recovering or recreating the appearance of the original scene. Typically, ITMOs take a conventional (i.e. LDR) image as input, expand the luminance range of the colors of this image in a global manner, and subsequently process highlights or bright regions locally to enhance the HDR appearance of colors in the image.
Although several ITMO solutions exist, they focus on perceptually reproducing the appearance of the original scene and rely on strict assumptions about the content. Additionally, most expansion methods proposed in the literature are optimized towards extreme increases in dynamic range.
Typically, HDR imaging is defined by an extension in dynamic range between dark and bright values of luminance of colors combined with an increase in the number of quantization steps. To achieve more extreme increases in dynamic range, many methods combine a global expansion with local processing steps that enhance the appearance of highlights and other bright regions of images. Known global expansion steps proposed in the literature vary from inverse sigmoid, to linear or piecewise linear. To enhance bright local features in an image, it is known to create a luminance expansion map, wherein each pixel of the image is associated with an expansion value to apply to the luminance of this pixel using some function.
The luminance expansion step increases the contrast in the image so that it is better suited for HDR displays. However, at the same time, this step often increases the contrast of artifacts or noise in the image, making such artifacts more visible and therefore more disturbing to viewers. It might therefore be desirable to denoise the image. If an existing denoising process is applied before or after the ITMO as a separate process, many additional computations are required. This disclosure is closely related to published application WO2015096955, filed 02 December 2014, entitled "METHOD FOR INVERSE TONE MAPPING AN IMAGE", which is incorporated by reference. In comparison with that document, this invention mainly adapts the expansion process in order to reduce noise in the image in a computationally efficient manner, minimizing additional processing.
Summary of invention
The present disclosure is directed to techniques to expand or reshape the luminance information using components that are already present in the core algorithm described in WO2015096955 but in a way that reduces noise in the image, without requiring too many new processing steps.
A subject of the invention is a method for providing an expanded dynamic range image from a digital image defined by pixels associated with colors represented in a color space separating luminance from chrominance, comprising inverse tone mapping luminance values of pixels of said image, and providing an expanded dynamic range image based on the inverse tone mapped luminance values,
wherein the inverse tone mapping of the luminance value of at least one pixel is based on information representative of edges and/or gradients around said at least one pixel in the digital image.
This means that the luminance of this at least one pixel is inverse tone mapped depending on the neighboring content around this at least one pixel in the image.
Preferably, said information M(p) representative of edges and/or gradients around said at least one pixel is obtained by the application of an edge detector algorithm to pixels surrounding said at least one pixel.
Preferably, these surrounding pixels are defined as belonging to a block centered on said at least one pixel.
Preferably, said edge detector algorithm is applied to a low pass filtered luminance value of said surrounding pixels.
Preferably, said inverse tone mapping of pixels comprises applying an expansion exponent value E(p) of said pixels to the luminance value Y(p) of said pixels. Preferably, the inverse tone mapping of the at least one pixel which is based on information representative of edges and/or gradients around this at least one pixel comprises applying an expansion exponent value of said at least one pixel to a weighted combination of a low-pass filtered luminance and of the luminance of said at least one pixel, in which the weight assigned to said low-pass filtered luminance is proportional to a value representative of edges and/or gradients around said at least one pixel, and in which the weight assigned to said luminance is inversely proportional to said value representative of edges and/or gradients around said at least one pixel.
Preferably, said weighted combination is a weighted sum.
Preferably, the value representative of edges and/or gradients around said at least one pixel is above or equal to a threshold τ.
Preferably, the method comprises combining inverse tone mapped luminance values of said pixels with scaled chrominance values of said pixels.
A subject of the invention is also an apparatus for providing an expanded dynamic range image from a digital image defined by pixels associated with colors represented in a color space separating luminance from chrominance, comprising a processing unit configured to implement the above method. This processing unit comprises one or several processors.
A subject of the invention is also a non-transitory computer-readable medium storing computer-executable instructions executable to perform the above method.
In one embodiment, a method for inverse tone mapping is provided. The method comprises obtaining a digital image in a color space wherein the luminance is separate from the chrominance, determining a base luminance of pixels in the digital image, determining a detail enhancement map, determining a pixel expansion exponent map, determining an edge map of the image, inverse tone mapping the luminance of the image based on the edge map, the pixel expansion exponent map and the base luminance, and providing an expanded dynamic range image based on the inverse tone mapped luminance.
Preferably, the base luminance is determined by low-pass filtering the luminance of pixels in the digital image. Preferably, the inverse tone mapping is based on a threshold value for the edge map.
Preferably, providing an expanded dynamic range image further comprises combining the inverse tone mapped luminance with scaled chrominance.
In another embodiment, an apparatus for inverse tone mapping is provided. The apparatus comprises a storage, a memory and a processor. The storage is for storing video content. The memory is for storing data for processing. The processor is in communication with the storage and memory. The processor is configured to obtain a digital image in a color space wherein the luminance is separate from the chrominance, determine a base luminance of pixels in the digital image, determine a detail enhancement map, determine a pixel expansion exponent map, determine an edge map of the image, inverse tone map the luminance of the digital image based on the edge map, the pixel expansion exponent map and the base luminance, and provide an expanded dynamic range image based on the inverse tone mapped luminance.
In another embodiment, a non-transitory computer-readable medium storing computer executable instructions is provided. The instructions are executable to perform a method comprising obtaining a digital image in a color space wherein the luminance is separate from the chrominance, determining a base luminance of pixels in the digital image, determining a detail enhancement map, determining a pixel expansion exponent map, determining an edge map of the image, inverse tone mapping the luminance of the image based on the edge map, the pixel expansion exponent map and the base luminance, and providing an expanded dynamic range image based on the inverse tone mapped luminance.
Brief description of drawings
The invention will be more clearly understood on reading the description which follows, given by way of non-limiting example and with reference to the appended figures in which:
Figure 1 depicts a block schematic diagram of a system in which inverse tone mapping with noise reduction can be implemented according to an embodiment.
Figure 2 depicts a block schematic diagram of an electronic device for implementing the methodology of inverse tone mapping with noise reduction according to an embodiment.
Figure 3 depicts an exemplary flowchart of a methodology for inverse tone mapping with noise reduction according to an embodiment.
Figure 4 depicts a block schematic diagram of modules for implementing the methodology of inverse tone mapping with noise reduction according to an embodiment.
Figures 5A and 5B are screen shots showing the effect of using different block sizes for the noise filtering.
Figures 6A and 6B are screen shots showing the effect of different thresholds for noise filtering.
Description of embodiments
The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software.
Turning now to Figure 1, a block diagram of an embodiment of a system 100 for implementing noise reduction in inverse tone mapping in view of this disclosure is depicted. The system 100 includes an image source 110, image processing 120, and a display 130. Each of these will be discussed in more detail below. The image source 110 may be a broadcast source, camera, server, or other storage device such as a hard drive, flash storage, magnetic tape, optical disc, or the like. The image source 110 provides low dynamic range (LDR) content, such as a digital image 112, to the image processing 120. The digital images 112 may be in any number of formats and resolutions.
The image processing 120 is where the digital images are converted from low dynamic range (LDR) to high dynamic range (HDR). This involves the inverse tone mapping operators (ITMOs) including noise reduction as set forth herein. The image processing 120 outputs an HDR image 122 to the display 130.
The display 130 can be a television, personal electronic device, or the like that is capable of displaying high dynamic range (HDR) images provided by the image processing 120.
Figure 2 depicts an exemplary electronic device 200 that can be used to implement the methodology and system for inverse tone mapping with noise reduction. The electronic device 200 includes one or more processors 210, memory 220, storage 230, and a network interface 240. Each of these elements will be discussed in more detail below.
The processor 210 controls the operation of the electronic device 200. The processor 210 runs the software that operates the electronic device as well as provides the functionality for LDR to HDR conversion set forth in the present disclosure. The processor 210 is connected to the memory 220, storage 230, and network interface 240, and handles the transfer and processing of information between these elements. The processor 210 can be a general processor or a processor dedicated to a specific functionality. In certain embodiments there can be multiple processors. The memory 220 is where the instructions and data to be executed by the processor are stored. The memory 220 can include volatile memory (RAM), nonvolatile memory (EEPROM), or other suitable media. The storage 230 is where the data used and produced by the processor in executing the content analysis is stored. The storage may be magnetic media (hard drive), optical media (CD/DVD-ROM), or flash based storage. Other types of suitable storage will be apparent to one skilled in the art given the benefit of this disclosure.
The network interface 240 handles the communication of the electronic device 200 with other devices over a network. Examples of suitable networks include Ethernet networks, Wi-Fi enabled networks, cellular networks, and the like. Other types of suitable networks will be apparent to one skilled in the art given the benefit of the present disclosure.
It should be understood that the elements set forth in Figure 2 are illustrative. The electronic device 200 can include any number of elements, and certain elements can provide part or all of the functionality of other elements. Other possible implementations will be apparent to one skilled in the art given the benefit of the present disclosure.
It is to be understood that the invention may be implemented in various forms of hardware, software, firmware, special purpose processors, or combinations thereof. The term "processor" should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor ("DSP") hardware, read-only memory ("ROM") for storing software, random access memory ("RAM"), and non-volatile storage. The disclosed concepts may notably be implemented as a combination of hardware and software. Moreover, the software may be implemented as an application program tangibly embodied on a program storage unit. Such software can take the form of a plug-in to be integrated into another software. The application program may be uploaded to, and executed by, an image processing device comprising any suitable architecture. Preferably, the image processing device is implemented on a computer platform having hardware such as one or more central processing units ("CPU"), a random access memory ("RAM"), and input/output ("I/O") interfaces. The computer platform may also include an operating system and microinstruction code. The various processes and functions described herein may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU. In addition, various other peripheral units may be connected to the computer platform, such as an additional data storage unit, a display device, or a printing unit. The image processing device implementing the embodiment of the method according to the invention may be part of any electronic device able to receive images, for instance a TV set, a set-top box, a gateway, a cell phone, or a tablet.
Figure 3 depicts a flow diagram 300 of a process for noise reduction for inverse tone mapping in accordance with one embodiment. The flow diagram 300 includes the stages of: obtaining a digital image 310, determining a base luminance 320, determining a detail enhancement map 330, determining a pixel expansion map 340, determining an edge map 350, inverse tone mapping 360, and providing an expanded dynamic range image 370. Each of these steps will be discussed in more detail below in combination with Figure 4. Figure 4 depicts a block diagram 400 of the modules used to perform the low dynamic range to high dynamic range conversion using the noise reduction expansion for inverse tone mapping operators of the present disclosure. These modules include an RGB->YUV conversion module 410, a low pass filtering module 420, a high pass filtering module 430, an expansion mapping module 440, an edge mapping module 452, an inverse tone mapping module 450, a detail enhancement module 460, a color saturation module 470, an image recombination module 480, and an RGB conversion module 490.
Referring back to Figure 3, a digital image 112 is obtained (310) from an image source 110 as discussed above. The techniques of the present disclosure require that the digital image 112 be in a color space where the luminance is separate from the chrominance. If the digital image 112 is not in such a color space, then it can be converted to such a color space. For example, the digital image 112 could be provided in an RGB color space of a display device. This RGB color space can be standardized, and the corresponding display device can be a virtual one. In such a case, the received RGB color coordinates representing colors of the digital image 112 are converted to a color space separating luminance from chrominance, for instance the YUV color space. This conversion of colors from the RGB color space into the YUV color space is known per se and therefore not described in detail. Any other color space separating luminance from chrominance can be used instead, such as XYZ, Yxy, or CIE Lab. Therefore, a luminance value Y(p) and two chrominance values U(p), V(p) are associated with the color of any pixel p of the digital image 112. This functionality is provided by the RGB->YUV conversion module 410 of Figure 4.
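As a non-limiting illustration, the luminance/chrominance separation performed by the conversion module can be sketched as follows. The BT.709 luma weights and the chroma scale factors used here are one common convention and are assumptions of this sketch, not mandated by the disclosure:

```python
import numpy as np

def rgb_to_yuv(rgb):
    """Separate luminance from chrominance: one common linear RGB -> YUV
    conversion using BT.709 luma weights. rgb is an array whose last axis
    holds (R, G, B); returns an array whose last axis holds (Y, U, V)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b   # luminance (BT.709 weights)
    u = 0.5389 * (b - y)                       # chroma scale factors: one
    v = 0.6350 * (r - y)                       # common convention (assumed)
    return np.stack([y, u, v], axis=-1)
```

For a pure white pixel (R = G = B = 1), Y is 1 and both chrominance components vanish, which is the expected behavior of any such separation.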
Using the low pass filtering module 420, a low-pass filtered version of the luminance Y(p) is determined, referred to herein as the base luminance Ybase(p) (stage 320 of Figure 3). For example, low-pass filtering can be applied to the luminance Y(p) of each pixel p of the original LDR digital image 112 based on one or more Gaussian functions in a spatial neighborhood of the pixel p and in a neighborhood of the luminance value Y(p). For example, Ybase(p) can be determined through the following equation:

Ybase(p) = (1 / k(p)) × Σ_{pi ∈ Ω'} f's(‖pi − p‖) × f'r(|Y(pi) − Y(p)|) × Y(pi)    (1)

where k(p) = Σ_{pi ∈ Ω'} f's(‖pi − p‖) × f'r(|Y(pi) − Y(p)|) is a normalization factor, f's is a first Gaussian function applied on the spatial domain of the image, f'r is a second Gaussian function applied on the luminance range domain, Ω' is the size of a window of the image centered at the pixel p, and pi is a pixel in this window.
In various embodiments, the window size can be for instance 5 or 7. Smaller window sizes may allow for better computational efficiency. In this example, the low-pass filtering is bilateral, which refers to the fact that the filtering is performed both in the spatial and in the luminance range domains.
In various embodiments, the value of the standard deviation σ's of the first Gaussian function f's can be greater than or equal to 2. The value of the standard deviation σ'r of the second Gaussian function f'r can be high enough to smooth texture and noise in the original LDR image 112, but low enough to avoid crossing edges between objects of the image. In various embodiments, the value of σ'r can be between 0.1 × max(Y) and 0.5 × max(Y), where max(Y) is the maximum luminance value over all pixels of the original image. For instance, the standard deviation for the spatial Gaussian function f's can be set at σ's = 3, and the standard deviation for the luminance range Gaussian function f'r can be set at σ'r = 0.3 × max(Y).
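The bilateral filtering of stage 320 can be sketched as follows, following the structure of equation (1); the function name, the square window, and the edge-replication padding are assumptions of this illustration:

```python
import numpy as np

def base_luminance(Y, window=5, sigma_s=3.0, sigma_r_frac=0.3):
    """Bilateral low-pass filter of a 2-D luminance array Y: a spatial
    Gaussian f's times a luminance-range Gaussian f'r, normalized over a
    window centered on each pixel (the k(p) factor of equation (1))."""
    half = window // 2
    sigma_r = sigma_r_frac * Y.max()        # e.g. 0.3 x max(Y), as suggested
    pad = np.pad(Y, half, mode="edge")      # replicate borders (assumption)
    ax = np.arange(-half, half + 1)
    dx, dy = np.meshgrid(ax, ax)
    w_s = np.exp(-(dx**2 + dy**2) / (2.0 * sigma_s**2))   # spatial weights
    out = np.zeros_like(Y, dtype=np.float64)
    for i in range(Y.shape[0]):
        for j in range(Y.shape[1]):
            block = pad[i:i + window, j:j + window]
            w_r = np.exp(-((block - Y[i, j]) ** 2) / (2.0 * sigma_r**2))
            w = w_s * w_r
            out[i, j] = (w * block).sum() / w.sum()       # normalized sum
    return out
```

Because the output is a weighted average of luminances in the window, it always stays within the range of the input luminance, and a constant image is left unchanged.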
The determined base luminance (stage 320 of Figure 3), Ybase(p), can be provided to high pass filtering module 430. Ybase(p) can also be provided to expansion mapping module 440. Both modules are described in more detail below.
The high pass filtering module 430 is configured to extract high frequency details in the digital image 112 from the base luminance Ybase(p) and/or the luminance Y(p), so as to determine a luminance detail enhancement map Ydetail(p) based on Ybase(p) and Y(p) (stage 330 of Figure 3). For example, in various embodiments Ydetail(p) can be determined by first determining:

Y'base(p) = (1 / k'(p)) × Σ_{pi ∈ Ω''} f''s(‖pi − p‖) × f''r(|Y(pi) − Y(p)|) × Y(pi)

where f''s and f''r are Gaussian functions as above (f's, f'r), but with a larger standard deviation in the luminance range in order to remove more luminance details in the image, Ω'' is the size of a window of the image centered at the pixel p, which can have the same size as above, and pi is a pixel in this window.
Y'base(p) can thus be determined by filtering Y(p) in a similar way as Ybase(p), but with parameters that remove more luminance details. For example, in various embodiments the standard deviation for the spatial Gaussian function f''s can be set such that σ''s = σ's, and the standard deviation for the luminance range Gaussian function f''r can be set so as to remove more luminance details in the image, i.e. such that σ''r > σ'r. For instance σ''r can be set equal to 0.3 × max(Y). The high pass filtering module 430 can then determine the luminance detail enhancement Ydetail(p), for instance as the ratio of Ybase(p) and Y'base(p):

Ydetail(p) = Ybase(p) / Y'base(p)
The different values of Ydetail(p) over the digital image 112 then form a luminance detail enhancement map. Because σ''r > σ'r, this ratio corresponds to an extraction of high spatial frequencies of luminance values in the digital image 112, i.e. to a high pass filtering of this image. Other ways of extracting high spatial frequencies of luminance values in the image and, more generally, of forming a luminance detail enhancement map Ydetail(p) can be used without departing from the disclosure.
The expansion mapping module 440 can determine an expansion exponent map E(p) based on Ybase(p) (stage 340 of Figure 3). The expansion exponent map E can be an image-sized floating point map, in which each value represents the exponent to be applied to Y(p) of each pixel in the original LDR image 112. The expansion exponent map can for instance be based on a quadratic function, such as:

E(p) = a × Ybase(p)² + b × Ybase(p) + c    (2)

where the coefficients a, b and c of the quadratic function can be parameters determined as follows based on a maximum display luminance, Lmax, of the HDR display 130:

a = pa1 × exp(pa2 × Lmax) + pa3 × exp(pa4 × Lmax)    (3)
b = pb1 × Lmax^pb2 + pb3    (4)
c = 1.3    (5)

The parameters used in the above equations can be set, for example, as follows: pa1 = −8.192e−7, pa2 = 0.000312, pa3 = 1.657e−5, pa4 = −0.0006967, pb1 = 0.05941, pb2 = 0.03135, and pb3 = −0.07579.
In this way, for example, the shape of the expansion exponent map E can change depending on the maximum display luminance, allowing for a global way of controlling the image appearance without requiring input from a user.
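The computation of equations (2) to (5) can be sketched as follows; the function name and the example value of Lmax are illustrative, while the parameter values are those listed in the description:

```python
import numpy as np

# Parameter values listed in the description for equations (3) and (4).
PA1, PA2, PA3, PA4 = -8.192e-7, 0.000312, 1.657e-5, -0.0006967
PB1, PB2, PB3 = 0.05941, 0.03135, -0.07579

def expansion_exponent_map(Ybase, Lmax=1000.0):
    """Expansion exponent map E as a quadratic in the base luminance,
    with coefficients a, b, c driven by the maximum display luminance
    Lmax (equations (2)-(5)). Ybase can be a scalar or an array."""
    a = PA1 * np.exp(PA2 * Lmax) + PA3 * np.exp(PA4 * Lmax)   # eq. (3)
    b = PB1 * Lmax**PB2 + PB3                                  # eq. (4)
    c = 1.3                                                    # eq. (5)
    return a * Ybase**2 + b * Ybase + c                        # eq. (2)
```

Note that for a base luminance of zero the exponent reduces to the constant c = 1.3, and the shape of the map changes only through Lmax, matching the text's point that the image appearance is controlled globally without user input.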
Other ways of determining an expansion exponent map E(p) based on Ybase(p) and/or Y(p) can be used without departing from the invention. Such a map can for instance be determined separately for different luminance zones, such as a shadow zone, a midtone zone and a highlight zone. After having determined this map separately in these different luminance zones, any interpolation means can be used to obtain the whole expansion exponent map from the maps specific to each luminance zone.
The expansion exponent map E(p) can be provided to the inverse tone mapper module 450, where the luminance Y(p) of each pixel p of the original LDR image 112 can be inverse tone mapped based on E(p) into an expanded luminance Yexp(p). In various embodiments, Yexp(p) can be the luminance Y(p) of the pixel raised to the value of the expansion exponent E(p).
As can be seen in Figure 4B, which provides a detailed view of the modules comprised in the inverse tone mapper module 450 of Figure 4A, the inverse tone mapper module is where the denoising of the present disclosure is performed.
Since the expansion is done through an exponential function, noise and other artifacts in Y will inevitably be amplified. Although this issue is managed to some extent by using the low-pass filtered luminance Ybase to compute the expansion exponents E as shown above, in some cases - particularly in the presence of film grain - the amplification is sufficiently strong that it impacts the visual quality of the HDR image provided by the image processing 120. To avoid this, the luminance expansion step is adapted to perform a denoising effect based on an edge map, which can be computed separately. It should be noted that this process should be optional and controllable, as it is often likely to smooth areas where the artist would prefer to keep more detail.
The denoising approach described here aims to reduce the visibility of noise and grain without overly smoothing image details and edges. The key idea of this approach is that for each pixel p of the digital image 112, we can choose between expanding the luminance Y or the low-pass filtered luminance Ybase of this pixel depending on the neighboring content around that pixel. By analyzing a block of pixels around p, we can determine whether this pixel belongs to a smooth area or whether it is near an edge in the image 112. If the area is smooth, we can use the low-pass filtered luminance Ybase as the input to the luminance expansion with no loss of detail. On the other hand, if there is an edge nearby, we expand the unfiltered luminance Y to avoid smoothing the edge.
The edge mapping module 452 provides the first step of the denoising, the computation of the edge map M (stage 350 of Figure 3), which is obtained using a block-based analysis of the low pass filtered luminance Ybase. This can effectively be seen as a simple edge detector, and similar results could be achieved using a convolution with a Sobel filter or equivalent. Note however that after filtering, we take the magnitude of the response and ignore the sign.
For instance, for each pixel p in the low-pass version of the image Ybase, we consider a block Bp of pixels around the pixel p, having for instance a size b × b. Within this block of pixels, we can compute an edge map of the digital image 112, for example as a sum of absolute differences:

M(p) = Σ_{pi ∈ Bp} |Ybase(pi) − Ybase(p)|
Note that, according to this formula, the analysis is preferably performed on the low pass filtered luminance layer Ybase instead of on the unfiltered luminance layer Y. The reason is that the low pass filtered luminance layer Ybase is already filtered, so that noise and grain should be mostly removed. As such, this edge detector will only respond to image details and edges that were strong enough to survive the Gaussian low pass filtering step described above (stage 320). The value of M(p) will be higher if there is an edge within the block Bp and lower if the block Bp corresponds to a smooth area, which allows us to decide how to apply the expansion in the next step, for instance as shown below. It means that the value M(p) related to a pixel p in the edge map is representative of edges and/or gradients around this pixel in the digital image 112, notably in a block centered on this pixel.
The size b of the blocks used to compute the edge map M(p) can serve as a parameter. Larger blocks will make the denoising more conservative, as edges and detail further away will be counted; however, they will also use more computational resources, as more pixels are assessed for each block. Figures 5A and 5B show the effect of changing the block size. In Figure 5A, the block size is b = 5. In Figure 5B, the block size is b = 15. Any other known edge detection algorithm can be used instead of the above equation to compute the edge map. Such an edge map is considered as representative of edges and/or gradients around each pixel of the digital image 112.
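The block-based edge measure can be sketched as follows. The sum-of-absolute-differences detector used here is an assumption, one simple detector consistent with the description (which notes that a Sobel magnitude would behave similarly):

```python
import numpy as np

def edge_map(Ybase, b=5):
    """Block-based edge measure M on the low-pass filtered luminance:
    for each pixel, the sum of absolute differences between the centre
    pixel and its b x b neighbourhood. High values indicate an edge in
    the block; values near zero indicate a smooth area."""
    half = b // 2
    pad = np.pad(Ybase, half, mode="edge")   # replicate borders (assumption)
    M = np.zeros_like(Ybase, dtype=np.float64)
    for i in range(Ybase.shape[0]):
        for j in range(Ybase.shape[1]):
            block = pad[i:i + b, j:j + b]
            # Magnitude only: the sign of the differences is ignored.
            M[i, j] = np.abs(block - Ybase[i, j]).sum()
    return M
```

On a constant image M is zero everywhere, while pixels whose block overlaps a luminance step receive a strictly larger response than pixels in the flat interior, which is the behavior the next step relies on.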
After obtaining an edge map M(p) from the previous step, we can use this edge map to guide the inverse tone mapping of the luminance (stage 360 of Figure 3) so as to remove some noise. Instead of applying the expansion exponent E(p) directly to the luminance Y(p) for any pixel p of the image 112, here, to get an expanded luminance Yexp(p) for any pixel p having an edge level M(p) above a threshold τ, we apply the expansion exponent E(p) to a weighted combination of the image luminance Y(p) and of the low-pass filtered luminance Ybase(p). In this weighted combination, the weight a of the low-pass filtered luminance Ybase(p) for the pixel p is proportional to the value M(p) representative of edges and/or gradients around this pixel, and the weight (1 − a) of the unfiltered luminance Y(p) for the same pixel p is inversely proportional to the same value M(p). For instance, this combination is the following weighted sum:
Yexp(p) = [a × Ybase(p) + (1 − a) × Y(p)]^E(p)

where the weight a is preferably derived from the edge map value M(p) as described above, the threshold τ being for instance equal to 10.
A lower threshold will make the denoising more conservative, in that it will only be applied to really smooth areas, while a higher threshold will allow the denoising to be applied more generally, but it will also smooth image details more. See, for example, the images in Figures 6A and 6B.
The expanded luminance Yexp(p) can be provided to the detail enhancement module 460. The detail enhancement module 460 can enhance the expanded luminance Yexp(p) of a pixel p by applying the luminance detail enhancement Ydetail(p) of this pixel. For example, an enhanced expanded luminance Y'exp(p) can be determined based on the product of the expanded luminance and the luminance detail enhancement provided by the high pass filtering module 430, for example:
Y'exp(p) = Y(p)^E(p) × [Ydetail(p)]^d
where d is a detail enhancement parameter. This detail enhancement parameter d can control the amount of detail enhancement applied by the luminance detail map Ydetail(p). For example, increasing the value of d increases the contrast of image edges. In various embodiments, a value of d = 1.5 can be used.
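As a minimal sketch of this detail-enhancement step, assuming Ydetail(p) is the high-pass detail map from module 430 with values near 1 in flat regions:

```python
def enhance_details(y, e_map, y_detail, d=1.5):
    """Enhanced expanded luminance Y'exp(p) = Y(p)^E(p) * Ydetail(p)^d.
    The parameter d controls how strongly the detail map modulates the
    contrast of image edges."""
    return (y ** e_map) * (y_detail ** d)
```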
When expanding the luminance of the original LDR image as described above, luminance and contrast changes can affect the appearance of color and saturation in the image. In this regard, color information of the image may be adjusted by color saturation module 470. For example, color saturation module 470 can be used to preserve the artistic intent with respect to the color of the image. In various embodiments, saturations of colors can be enhanced using the expansion exponent values as a guide. More specifically, the saturation of the color of each pixel can be enhanced based on the zone expansion exponent map, E(p), of the pixel.
For example, saturation of the color of a pixel p can be enhanced by adjusting a Chroma value C(p) of the pixel. The Chroma value can be determined as follows in a cylindrical version of the YUV space:
C(p) = √[U(p)² + V(p)²]
An adjusted Chroma value C'(p) can be determined as the product of the zone expansion exponent map E(p) of the pixel p and the Chroma value C(p) of the pixel:
C'(p) = E(p) × C(p)
In various embodiments, the Chroma scaling, which transforms C(p) into C'(p), can be limited to a factor of 1.5 to avoid over-saturating highlights, e.g., to avoid light explosions and bright lights. With these new values of C'(p), enhanced values of chrominance, U'(p) and V'(p) 241, can be calculated by converting from a cylindrical color space, such as LCH here, to a YUV space:
U'(p) = cos[H(p)] × C'(p)
V'(p) = sin[H(p)] × C'(p)
where H(p) is the original hue computed from original U(p) and V(p) as follows:
H(p) = arctan[V(p), U(p)]

An image recombiner module 480 can combine the enhanced values of chrominance U'(p) and V'(p) 241 with the enhanced expanded luminance Y'exp(p) 239 to obtain and provide an expanded HDR image 122, i.e. a high dynamic range version of the digital image 112. The expanded HDR image 122 can be displayed on the HDR display device 130.
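The chroma processing of module 470 can be sketched as below; the cap of 1.5 on the scaling factor follows the text, and NumPy's two-argument `arctan2` plays the role of the arctan[V(p), U(p)] used for the hue.

```python
import numpy as np

def enhance_color(u, v, e_map, max_scale=1.5):
    """Scale the Chroma C(p) by the expansion exponent E(p), limiting the
    scaling factor to max_scale to avoid over-saturating highlights, and
    return the enhanced chrominance U'(p), V'(p)."""
    c = np.hypot(u, v)                        # C(p) = sqrt(U^2 + V^2)
    h = np.arctan2(v, u)                      # H(p) = arctan[V(p), U(p)]
    c_prime = np.minimum(e_map, max_scale) * c
    return np.cos(h) * c_prime, np.sin(h) * c_prime
```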
Of course, one skilled in the art would understand that the expanded HDR image may require conversion into a color space of the display device, such as RGB, before being displayed. In the present embodiment, this is handled by the RGB conversion module 490.
The combination of modules 460, 470, 480, and 490 provides an image with expanded dynamic range (stage 370 of Figure 3). As shown in Figure 1, the expanded image 122 based on these expanded colors is now ready to be sent to the display device 130 having its peak luminance Lmax, in order to be reproduced within a high dynamic range. Examples of such output using different thresholds can be seen in Figure 6A and Figure 6B.
In Figure 6A, a threshold τ = 50 was used to determine Yexp, leading to reduced noise but also smoothing the details (e.g. the faces at the bottom of the image). In contrast, if a small threshold is used to determine Yexp, as in Figure 6B (τ = 5), then both details and noise are more visible. In both cases, however, the sharp edges between the sculpture and the background remain sharp.
While the present invention is described with respect to particular examples and preferred embodiments, it is understood that the present invention is not limited to these examples and embodiments. The present invention as claimed therefore includes variations from the particular examples and preferred embodiments described herein, as will be apparent to one of skill in the art. While some of the specific embodiments may be described and claimed separately, it is understood that the various features of embodiments described and claimed herein may be used in combination. Reference numerals appearing in the claims are by way of illustration only and shall have no limiting effect on the scope of the claims.

Claims

1. A method for providing an expanded dynamic range image from a digital image (112) defined by pixels associated with colors represented in a color space separating luminance (Y) from chrominance, comprising:
inverse tone mapping (360) luminance value Y(p) of pixels of said image, and,
providing (106) an expanded dynamic range image (122) based on the inverse tone mapped luminance values,
wherein the inverse tone mapping (360) of the luminance value Y(p) of at least one pixel is based on an information M(p) representative of edges and/or gradients around said at least one pixel in the digital image.
2. A method for providing an expanded dynamic range image according to claim 1 , wherein said information M(p) representative of edges and/or gradients around said at least one pixel is obtained by the application of an edge detector algorithm to pixels surrounding said at least one pixel.
3. A method for providing an expanded dynamic range image according to claim 2, wherein said edge detector algorithm is applied to a low pass filtered luminance value Ybase(p) of said surrounding pixels.
4. A method for providing an expanded dynamic range image according to any one of claims 1 to 3, wherein said inverse tone mapping of pixels comprises applying an expansion exponent value E(p) of said pixels to the luminance value Y(p) of said pixels.
5. A method for providing an expanded dynamic range image according to claim 4, wherein the inverse tone mapping (360) of the at least one pixel which is based on an information M(p) representative of edges and/or gradients around this at least one pixel comprises applying an expansion exponent value E(p) of said at least one pixel to a weighted combination of a low-pass filtered luminance Ybase(p) and of the luminance Y(p) of said at least one pixel, in which the weight assigned to said low-pass filtered luminance Ybase(p) is proportional to a value M(p) representative of edges and/or gradients around said at least one pixel, and in which the weight assigned to said luminance Y(p) is inversely proportional to said value M(p) representative of edges and/or gradients around said at least one pixel.
6. A method for providing an expanded dynamic range image according to claim 5, wherein said combination is a sum.
7. A method for providing an expanded dynamic range image according to claim 6, wherein the value M(p) representative of edges and/or gradients around said at least one pixel is above or equal to a threshold τ.
8. An apparatus for providing an expanded dynamic range image from a digital image defined by pixels associated with colors represented in a color space separating luminance (Y) from chrominance, comprising a processing unit configured to implement the method according to any one of claims 1 to 7.
9. A non-transitory computer-readable medium storing computer-executable instructions executable to perform a method according to any one of claims 1 to 7.
PCT/EP2016/070059 2015-08-31 2016-08-25 Method and apparatus for inverse tone mapping WO2017036908A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
KR1020187005157A KR102523505B1 (en) 2015-08-31 2016-08-25 Method and Apparatus for Inverse Tone Mapping
EP16757023.3A EP3345155B1 (en) 2015-08-31 2016-08-25 Method and apparatus for inverse tone mapping
CN201680050153.4A CN108027961B (en) 2015-08-31 2016-08-25 Method and apparatus for providing extended dynamic range images from digital images
JP2018510794A JP6803378B2 (en) 2015-08-31 2016-08-25 Reverse tone mapping method and equipment
US15/755,505 US10572983B2 (en) 2015-08-31 2016-08-25 Method and apparatus for inverse tone mapping

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP15306337.5 2015-08-31
EP15306337 2015-08-31

Publications (1)

Publication Number Publication Date
WO2017036908A1 true WO2017036908A1 (en) 2017-03-09

Family

ID=54148447

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/070059 WO2017036908A1 (en) 2015-08-31 2016-08-25 Method and apparatus for inverse tone mapping

Country Status (6)

Country Link
US (1) US10572983B2 (en)
EP (1) EP3345155B1 (en)
JP (1) JP6803378B2 (en)
KR (1) KR102523505B1 (en)
CN (1) CN108027961B (en)
WO (1) WO2017036908A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3418972A1 (en) 2017-06-23 2018-12-26 Thomson Licensing Method for tone adapting an image to a target peak luminance lt of a target display device
WO2019147028A1 (en) * 2018-01-24 2019-08-01 삼성전자주식회사 Image processing apparatus, image processing method, and computer-readable recording medium
US11238572B2 (en) 2017-09-27 2022-02-01 Interdigital Vc Holdings, Inc. Device and method for dynamic range expansion in a virtual reality scene

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6237797B2 (en) * 2016-01-05 2017-11-29 ソニー株式会社 Video system, video processing method, program, and video converter
EP3319013A1 (en) * 2016-11-03 2018-05-09 Thomson Licensing Method and device for estimating cast shadow regions and/or highlight regions in images
US10096089B2 (en) * 2017-01-04 2018-10-09 Facebook, Inc. Accelerated skin smoothing effect
CN109544463A (en) * 2018-10-17 2019-03-29 天津大学 The inverse tone mapping (ITM) method of image content-based
CN109785263B (en) * 2019-01-14 2022-09-16 北京大学深圳研究生院 Retinex-based inverse tone mapping image conversion method
CN115052137B (en) * 2019-10-18 2023-09-26 华为技术有限公司 Saturation adjustment method and device
CN112200753B (en) * 2020-10-30 2023-02-10 青岛海泰新光科技股份有限公司 Processing method for wide dynamic range of image
CN112866507B (en) * 2021-01-13 2023-01-10 中国传媒大学 Intelligent panoramic video synthesis method and system, electronic device and medium
WO2023094871A1 (en) * 2021-11-29 2023-06-01 Weta Digital Limited Increasing dynamic range of a virtual production display
KR102564447B1 (en) * 2021-11-30 2023-08-08 엘지전자 주식회사 Display device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060290950A1 (en) * 2005-06-23 2006-12-28 Microsoft Corporation Image superresolution through edge extraction and contrast enhancement

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB8721565D0 (en) * 1987-09-14 1987-10-21 Rca Corp Video signal processing system
US4967263A (en) * 1988-09-07 1990-10-30 General Electric Company Widescreen television signal processor system with interpolator for reducing artifacts
US5343254A (en) * 1991-04-25 1994-08-30 Olympus Optical Co., Ltd. Image signal processing device capable of suppressing nonuniformity of illumination
GB2357649A (en) * 1999-12-22 2001-06-27 Nokia Mobile Phones Ltd Image enhancement using inverse histogram based pixel mapping
US7002408B2 (en) 2003-10-15 2006-02-21 Varian Medical Systems Technologies, Inc. Data signal amplifier and processor with multiple signal gains for increased dynamic signal range
US7525583B2 (en) * 2005-02-11 2009-04-28 Hewlett-Packard Development Company, L.P. Decreasing aliasing in electronic images
US8253752B2 (en) * 2006-07-20 2012-08-28 Qualcomm Incorporated Method and apparatus for encoder assisted pre-processing
US8155454B2 (en) * 2006-07-20 2012-04-10 Qualcomm Incorporated Method and apparatus for encoder assisted post-processing
US7933445B2 (en) * 2007-01-09 2011-04-26 Sharp Laboratories Of America, Inc. Color gamut mapping/enhancement technique using skin color detection
US8050496B2 (en) * 2007-01-09 2011-11-01 Sharp Laboratories Of America, Inc. Color gamut mapping/enhancement technique using skin color detection
US8131110B2 (en) * 2008-07-03 2012-03-06 Seiko Epson Corporation Reducing signal overshoots and undershoots in demosaicking
US8238654B2 (en) * 2009-01-30 2012-08-07 Sharp Laboratories Of America, Inc. Skin color cognizant GMA with luminance equalization
GB2486348B (en) * 2009-10-08 2014-11-12 Ibm Method and system for transforming a digital image from a low dynamic range (LDR) image to a high dynamic range (HDR) image
US8314847B2 (en) 2010-05-25 2012-11-20 Apple Inc. Automatic tone mapping curve generation based on dynamically stretched image histogram distribution
TWI559779B (en) * 2010-08-25 2016-11-21 杜比實驗室特許公司 Extending image dynamic range
JP6407717B2 (en) 2011-09-27 2018-10-17 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Apparatus and method for dynamic range conversion of images
WO2013106190A1 (en) 2012-01-09 2013-07-18 Dolby Laboratories Licensing Corporation Hybrid reference picture reconstruction method for single and multiple layered video coding systems
CN102722868B (en) * 2012-05-23 2014-08-20 西安理工大学 Tone mapping method for high dynamic range image
US9911181B2 (en) 2013-12-27 2018-03-06 Thomson Licensing Method for inverse tone mapping of an image
US10123019B2 (en) 2014-02-13 2018-11-06 Dolby Laboratories Licensing Corporation Piecewise inter-layer prediction for signals with enhanced dynamic range
CN104463820A (en) * 2014-10-29 2015-03-25 广东工业大学 Reverse tone mapping algorithm based on frequency domain

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060290950A1 (en) * 2005-06-23 2006-12-28 Microsoft Corporation Image superresolution through edge extraction and contrast enhancement

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
FRANCESCO BANTERLE ET AL: "High Dynamic Range Imaging and Low Dynamic Range Expansion for Generating HDR Content", COMPUTER GRAPHICS FORUM, vol. 28, no. 8, 1 December 2009 (2009-12-01), pages 2343 - 2367, XP055031838, ISSN: 0167-7055, DOI: 10.1111/j.1467-8659.2009.01541.x *
LAURENCE MEYLAN ET AL: "Tone mapping for high dynamic range displays", OPTOMECHATRONIC MICRO/NANO DEVICES AND COMPONENTS III : 8 - 10 OCTOBER 2007, LAUSANNE, SWITZERLAND; [PROCEEDINGS OF SPIE , ISSN 0277-786X], SPIE, BELLINGHAM, WASH, vol. 6492, 29 January 2007 (2007-01-29), pages 649210 - 1, XP002581500, ISBN: 978-1-62841-730-2 *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3418972A1 (en) 2017-06-23 2018-12-26 Thomson Licensing Method for tone adapting an image to a target peak luminance lt of a target display device
EP3418973A1 (en) * 2017-06-23 2018-12-26 Thomson Licensing Method for tone adapting an image to a target peak luminance lt of a target display device
KR20190000811A (en) * 2017-06-23 2019-01-03 톰슨 라이센싱 Method for tone adapting an image to a target peak luminance lt of a target display device
US10510140B2 (en) 2017-06-23 2019-12-17 Interdigital Vc Holdings, Inc. Method for tone adapting an image to a target peak luminance LT of a target display device
KR102475139B1 (en) 2017-06-23 2022-12-07 인터디지털 브이씨 홀딩스 인코포레이티드 Method for tone adapting an image to a target peak luminance lt of a target display device
US11238572B2 (en) 2017-09-27 2022-02-01 Interdigital Vc Holdings, Inc. Device and method for dynamic range expansion in a virtual reality scene
WO2019147028A1 (en) * 2018-01-24 2019-08-01 삼성전자주식회사 Image processing apparatus, image processing method, and computer-readable recording medium
KR20190090262A (en) * 2018-01-24 2019-08-01 삼성전자주식회사 Image processing apparatus, method for processing image and computer-readable recording medium
CN111492400A (en) * 2018-01-24 2020-08-04 三星电子株式会社 Image processing apparatus, image processing method, and computer-readable recording medium
US11315223B2 (en) 2018-01-24 2022-04-26 Samsung Electronics Co., Ltd. Image processing apparatus, image processing method, and computer-readable recording medium
KR102460390B1 (en) * 2018-01-24 2022-10-28 삼성전자주식회사 Image processing apparatus, method for processing image and computer-readable recording medium

Also Published As

Publication number Publication date
KR102523505B1 (en) 2023-04-18
JP2018527675A (en) 2018-09-20
CN108027961A (en) 2018-05-11
EP3345155B1 (en) 2019-06-26
US10572983B2 (en) 2020-02-25
CN108027961B (en) 2021-11-23
JP6803378B2 (en) 2020-12-23
EP3345155A1 (en) 2018-07-11
US20180253834A1 (en) 2018-09-06
KR20180048627A (en) 2018-05-10

Similar Documents

Publication Publication Date Title
US10572983B2 (en) Method and apparatus for inverse tone mapping
EP2852152B1 (en) Image processing method, apparatus and shooting terminal
US9911181B2 (en) Method for inverse tone mapping of an image
EP3341913B1 (en) Inverse tone mapping based on luminance zones
US9641820B2 (en) Advanced multi-band noise reduction
US20120093433A1 (en) Dynamic Adjustment of Noise Filter Strengths for use with Dynamic Range Enhancement of Images
KR102567860B1 (en) Improved inverse tone mapping method and corresponding device
WO2023273868A1 (en) Image denoising method and apparatus, terminal, and storage medium
KR102315471B1 (en) Image processing method and device
US8363932B2 (en) Apparatus and method of removing false color in image
US9432646B2 (en) Image processing apparatus, image processing method, program and electronic apparatus
EP2421239B1 (en) Image processing apparatus and method for applying film grain effects
JP6335614B2 (en) Image processing apparatus, control method thereof, and program
KR101468433B1 (en) Apparatus and method for extending dynamic range using combined color-channels transmission map
Deever et al. Digital camera image formation: Processing and storage
CN110706162A (en) Image processing method and device and computer storage medium
JP6786273B2 (en) Image processing equipment, image processing methods, and programs
JP2008147714A (en) Image processor and image processing method
Suhaila et al. HE_OWW Filter for Enhancement of Wildlife Observation Digital Images in Limited Light Condition
Zhang et al. A layered tone-mapping operator based on contrast enhanced adaptive histogram equalization
WO2023205548A1 (en) Generating hdr image from corresponding camera raw and sdr images
Hong et al. The histogram processing algorithm for vehicle camera image pixel contrast improving

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 16757023; Country of ref document: EP; Kind code of ref document: A1

ENP Entry into the national phase
Ref document number: 20187005157; Country of ref document: KR; Kind code of ref document: A

ENP Entry into the national phase
Ref document number: 2018510794; Country of ref document: JP; Kind code of ref document: A

WWE Wipo information: entry into national phase
Ref document number: 15755505; Country of ref document: US

NENP Non-entry into the national phase
Ref country code: DE