WO2015144566A1 - An image processing method and a corresponding device - Google Patents


Info

Publication number
WO2015144566A1
Authority
WO
WIPO (PCT)
Prior art keywords
color
image data
function
mapping function
color mapping
Prior art date
Application number
PCT/EP2015/055834
Other languages
French (fr)
Inventor
Philippe Bordes
Sébastien Lasserre
Pierre Andrivon
Original Assignee
Thomson Licensing
Application filed by Thomson Licensing filed Critical Thomson Licensing
Publication of WO2015144566A1 publication Critical patent/WO2015144566A1/en

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 — Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/46 — Colour picture communication systems
    • H04N 1/56 — Processing of colour picture signals
    • H04N 1/60 — Colour correction or control
    • H04N 1/603 — Colour correction or control controlled by characteristics of the picture signal generator or the picture reproducer
    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 — Details of colour television systems
    • H04N 9/64 — Circuits for processing colour signals
    • H04N 9/67 — Circuits for processing colour signals for matrixing

Definitions

  • the iterative principle comprises determining each function step by step, with possible iteration on all or part of the process, as depicted for example on Figure 7.
  • it comprises estimating a dual model composed of the inverse functions (fY2⁻¹, fU2⁻¹, fV2⁻¹), determined at iteration k using for example the same method (LSM) as for estimating (fY2, fU2, fV2).
  • a dual model composed of the inverse of M is also estimated; in the case where M is invertible, an algorithm for inverting matrices can be directly used.
  • the steps are iterated until the number of iterations is equal to K.
  • in a variant, the steps are iterated until the absolute variation of the function parameters (e.g. the values L(X) or the matrix coefficients) between two (or more than two) consecutive iterations is below a threshold value.
  • the invention, disclosed with 3 functions, can be applied with 2 functions (e.g. a 1D LUT and a 3x3 matrix) and also with 3 or more functions (e.g. a 1D LUT followed by a 3x3 matrix followed by another 1D LUT, or a matrix followed by a 1D LUT followed by a matrix followed by another 1D LUT).
  • LSM: Least Square Minimization.
  • the 1D LUT comprises (N+1) elements: L_0 to L_N.
  • for the samples s_k in the last segment [X_{N-1}; X_N], with interpolation weight α_k = (s_k − X_{N-1}) / (X_N − X_{N-1}), the least-squares normal equation attached to L_N reads:
    Σ_{s_k ∈ [X_{N-1}; X_N]} α_k (1 − α_k) · L_{N-1} + Σ_{s_k ∈ [X_{N-1}; X_N]} α_k² · L_N = Σ_{s_k ∈ [X_{N-1}; X_N]} α_k · y_k    (3)
  • FIG. 8 represents an exemplary architecture of an image processing device 100 configured to estimate a color transform function from first image data and second image data according to a specific and non-limitative embodiment of the invention.
  • the processing device 100 comprises one or more processor(s) 110, which is (are), for example, a CPU, a GPU and/or a DSP (Digital Signal Processor), along with internal memory 120 (e.g. RAM, ROM, EPROM).
  • the processing device 100 comprises one or several Input/Output interface(s) 130 adapted to display output information and/or allow a user to enter commands and/or data (e.g. a keyboard, a mouse, a touchpad, a webcam), and a power source 140 which may be external to the processing device 100.
  • the device 100 may also comprise network interface(s) (not shown).
  • the first image data and second image data may be obtained from a source.
  • the source belongs to a set comprising:
  • a local memory, e.g. a video memory, a RAM, a flash memory, a hard disk;
  • a storage interface, e.g. an interface with a mass storage, a ROM, an optical disc or a magnetic support;
  • a communication interface, e.g. a wireline interface (for example a bus interface, a wide area network interface, a local area network interface) or a wireless interface (such as an IEEE 802.11 interface or a Bluetooth interface); and
  • an image capturing circuit, e.g. a sensor such as a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal-Oxide-Semiconductor) sensor.
  • the color transform may be sent to a destination.
  • the color transform is stored in a remote or in a local memory, e.g. a video memory or a RAM, a hard disk.
  • the color transform is sent to a storage interface, e.g. an interface with a mass storage, a ROM, a flash memory, an optical disc or a magnetic support and/or transmitted over a communication interface, e.g. an interface to a point to point link, a communication bus, a point to multipoint link or a broadcast network.
  • the processing device 100 further comprises a computer program stored in the memory 120.
  • the computer program comprises instructions which, when executed by the processing device 100, in particular by the processor 1 10, make the processing device 100 carry out the method described with reference to figure 4 or 7.
  • the computer program is stored externally to the processing device 100 on a non-transitory digital data support, e.g. on an external storage medium such as an HDD, CD-ROM, DVD, a read-only and/or DVD drive and/or a DVD Read/Write drive, all known in the art.
  • the processing device 100 thus comprises an interface to read the computer program. Further, the processing device 100 could access one or more Universal Serial Bus (USB)-type storage devices (e.g., "memory sticks") through corresponding USB ports (not shown).
  • the processing device 100 is a device which belongs to a set comprising, for example:
  • a video server, e.g. a broadcast server, a video-on-demand server or a web server.
  • the implementations described herein may be implemented in, for example, a method or a process, an apparatus, a software program, a data stream, or a signal. Even if only discussed in the context of a single form of implementation (for example, discussed only as a method or a device), the implementation of features discussed may also be implemented in other forms (for example a program).
  • An apparatus may be implemented in, for example, appropriate hardware, software, and firmware.
  • the methods may be implemented in, for example, an apparatus such as, for example, a processor, which refers to processing devices in general, including, for example, a computer, a microprocessor, an integrated circuit, or a programmable logic device.
  • processors also include communication devices, such as, for example, computers, cell phones, portable/personal digital assistants ("PDAs"), and other devices that facilitate communication of information between end-users.
  • implementations of the various processes and features described herein may be embodied in a variety of different equipment or applications.
  • equipment examples include an encoder, a decoder, a post-processor processing output from a decoder, a pre-processor providing input to an encoder, a video coder, a video decoder, a video codec, a web server, a set-top box, a laptop, a personal computer, a cell phone, a PDA, and other communication devices.
  • the equipment may be mobile and even installed in a mobile vehicle.
  • the methods may be implemented by instructions being performed by a processor, and such instructions (and/or data values produced by an implementation) may be stored on a processor-readable medium such as, for example, an integrated circuit, a software carrier or other storage device such as, for example, a hard disk, a compact diskette (“CD"), an optical disc (such as, for example, a DVD, often referred to as a digital versatile disc or a digital video disc), a random access memory (“RAM”), or a read-only memory (“ROM”).
  • the instructions may form an application program tangibly embodied on a processor-readable medium. Instructions may be, for example, in hardware, firmware, software, or a combination.
  • a processor may be characterized, therefore, as, for example, both a device configured to carry out a process and a device that includes a processor-readable medium (such as a storage device) having instructions for carrying out a process. Further, a processor-readable medium may store, in addition to or in lieu of instructions, data values produced by an implementation.
  • implementations may produce a variety of signals formatted to carry information that may be, for example, stored or transmitted.
  • the information may include, for example, instructions for performing a method, or data produced by one of the described implementations.
  • a signal may be formatted to carry as data the rules for writing or reading the syntax of a described embodiment, or to carry as data the actual syntax-values written by a described embodiment.
  • Such a signal may be formatted, for example, as an electromagnetic wave (for example, using a radio frequency portion of spectrum) or as a baseband signal.
  • the formatting may include, for example, encoding a data stream and modulating a carrier with the encoded data stream.
  • the information that the signal carries may be, for example, analog or digital information.
  • the signal may be transmitted over a variety of different wired or wireless links, as is known.
  • the signal may be stored on a processor- readable medium.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

An image processing method for estimating a color transform function between first image data and second image data is disclosed. The color transform function is composed of at least a first color mapping function and a second color mapping function. Estimating the color transform function comprises, for at least an iteration k, k being an integer: a) estimating (S10) the second color mapping function and an inverse of the second color mapping function from the second image data and from the first image data transformed by the first color mapping function estimated at iteration k-1; b) estimating (S12) the first color mapping function from the first image data and from the second image data transformed by the inverse of the second color mapping function estimated at step a).

Description

AN IMAGE PROCESSING METHOD AND A CORRESPONDING DEVICE
1. FIELD OF THE INVENTION
The invention relates to an image processing method for estimating a color transform function also known as color mapping function between first image data and second image data, the color transform function being composed of at least a first color mapping function and a second color mapping function.
2. BACKGROUND OF THE INVENTION
Color images and videos are usually captured using tri-chromatic cameras into RGB raw data composed of 3 images (Red, Green, Blue). The RGB signal values depend on the tri-chromatic characteristics (color primaries) of the sensor. Given the particular Human Visual System properties, these data are transformed into a YUV signal to facilitate the encoding, where Y is the main component and UV (chromaticity) are secondary components the human eye is less sensitive to. Several YUV formats are used in the industry. For example, ITU-R Rec.601 defines Studio encoding parameters of Standard Digital Television for standard 4:3 and wide-screen 16:9 aspect ratios. ITU-R Rec.709 defines parameters for High Definition Television (HDTV) and ITU-R BT.2020 defines parameter values for Ultra-High Definition Television systems (UHDTV).
All YUV formats are characterized by Gamma and Color primaries parameters that define the RGB-to-YUV and YUV-to-RGB conversions. This transformation can be applied in two ways: Constant Luminance (CL) or Non-Constant Luminance (NCL), as depicted with thin lines on Figure 1. A display then transforms the RsGsBs signal (standardized color primaries) into an RDisplayGDisplayBDisplay signal corresponding to the color primaries of the display, as depicted with bold lines on Figure 1.
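As a concrete illustration of such a conversion, the following sketch implements the Non-Constant Luminance R'G'B'-to-Y'CbCr mapping with the published BT.709 luma coefficients (0.2126, 0.7152, 0.0722). Gamma handling and quantization are omitted for brevity, and the function names are illustrative, not from the patent:

```python
import numpy as np

# NCL conversion: Cb = (B' - Y') / 1.8556, Cr = (R' - Y') / 1.5748 (BT.709)
M_709 = np.array([
    [0.2126, 0.7152, 0.0722],                                    # Y' (luma)
    [-0.2126 / 1.8556, -0.7152 / 1.8556, (1 - 0.0722) / 1.8556],  # Cb
    [(1 - 0.2126) / 1.5748, -0.7152 / 1.5748, -0.0722 / 1.5748],  # Cr
])

def rgb_to_ycbcr(rgb):
    """Map a gamma-corrected R'G'B' triplet in [0, 1] to (Y', Cb, Cr)."""
    return M_709 @ np.asarray(rgb, dtype=float)

def ycbcr_to_rgb(ycbcr):
    """Inverse conversion, using the inverse of the 3x3 matrix."""
    return np.linalg.inv(M_709) @ np.asarray(ycbcr, dtype=float)
```

Since the NCL transform is a single 3x3 matrix on gamma-corrected components, its inverse is simply the inverse matrix, which is why the YUV-to-RGB path of Figure 1 is equally cheap.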
In order to support non-standardized YUV signal representations, or to support conversions between two (standardized) YUV formats (YUV1-to-YUV2) that are color graded differently, and to preserve artistic intent, it is known to explicitly signal the YUV1-to-RGB2 or the YUV1-to-YUV2 (color mapping) transform to the display so that the display is able to apply the appropriate signal conversion.
There is thus a need for a method for estimating a color transform (also known as color mapping function, CMF) given the input and output video instances (i.e. Y1U1V1 and Y2U2V2 as depicted on figure 2, or Y1U1V1 and R2G2B2).
1. BRIEF SUMMARY OF THE INVENTION
An image processing method for estimating a color transform function between first image data and second image data is disclosed. The color transform function is composed of at least a first color function and a second color function. Estimating the color transform function comprises, for at least an iteration k, k being an integer:
a) estimating the second color function and an inverse of the second color function from the second image data and from the first image data transformed by the first color function estimated at iteration k-1;
b) estimating the first color function from the first image data and from the second image data transformed by the inverse of the second color function estimated at step a).
According to a specific characteristic of the invention, the color transform function is composed of a 1D Look-Up Table, a matrix and a 1D LUT.
The image processing method allows for determining the parameters of a complex color transform function composed of a combination of several multidimensional functions. It allows achieving a robust estimation of the model parameters.
A computer program product is also disclosed that comprises program code instructions to execute the steps of the image processing method according to claim 1 or 2 when this program is executed on a computer.
A processor readable medium is disclosed that has stored therein instructions for causing a processor to perform at least the steps of the image processing method.
An image processing device comprising at least one processor configured to estimate a color transform function between first image data and second image data is disclosed. The color transform function is composed of at least a first color function and a second color function. Estimating the color transform function comprises, for at least an iteration k, k being an integer:
a) estimating the second color function and an inverse of the second color function from the second image data and from the first image data transformed by the first color function estimated at iteration k-1;
b) estimating the first color function from the first image data and from the second image data transformed by the inverse of the second color function estimated at step a).
2. BRIEF DESCRIPTION OF THE DRAWINGS
In the drawings, an embodiment of the present invention is illustrated. It shows:
- Figure 1 depicts YUV to RGB conversion using Constant Luminance or Non-Constant Luminance according to the state of the art;
- Figures 2 and 3 depict color mapping transformation models according to the state of the art;
- Figure 4 represents a piece-wise linear function;
- Figure 5 represents a flowchart of an image processing method for estimating a color transform between first image data and second image data according to a specific and non-limitative embodiment of the invention; and
- Figure 6 represents an exemplary architecture of an image processing device according to a specific and non-limitative embodiment of the invention.
3. DETAILED DESCRIPTION OF THE INVENTION
In the following, the expressions "color transform", "color mapping model" and "color mapping function" are used interchangeably. In order to avoid developing new (hardware) capability and in order to re-use existing display systems, the Color Mapping Function (CMF) is often modeled as a combination of one-dimensional non-linear mapping functions (possibly implemented via a 1-dimensional piecewise linear function with a 1D LUT) and a 3x3 matrix M, as depicted on figure 2. One typical use case is a video content that has been color graded twice: once (I1) with a standardized format std1 and a second time (I2) with a standardized format std2. In the following, I1 and I2 represent either two videos or two images. As depicted on figure 3, the version color graded with std1 is distributed (broadcasting, broadband, DVD, Blu-Ray...) to an end display device 30 that only supports the std2 format. Consequently, on the transmitter side, a CMF is estimated by a module 10 between the first color graded version I1 and the second color graded version I2. Color mapping metadata representative of the estimated CMF and the first color graded version I1 are made available to the end display device 30 on the receiver side. The first color graded version may be encoded in an HEVC or an AVC bitstream, while the color mapping metadata are for example encoded as metadata in a header of the bitstream or out-of-band. From the first color graded version of the video content I1, or from an estimation of it (e.g. its decoded version), and from the color mapping metadata, the end display device 30 is able to generate another version Î2 that approximates the second color graded version I2. Specifically, the end display device 30 applies the CMF on the first color graded version I1, or on its decoded version, to obtain the approximated second color graded version Î2. In such a case, there is no need to transmit both the first and second color graded versions to the end user display.
This solution makes it possible to save bandwidth. There is thus a need to determine on the transmitter side a CMF from the first and second color graded versions. More generally, there is a need to determine on the transmitter side the CMF from first and second image data, namely I1 and I2.
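On the receiver side, applying a CMF of the 1D-LUT / 3x3-matrix / 1D-LUT form described above is three cheap per-sample operations. The following is a minimal sketch under simplifying assumptions: the function and parameter names are illustrative, and a single shared LUT per stage stands in for the three per-component LUTs the patent describes:

```python
import numpy as np

def apply_cmf(yuv, lut1_x, lut1_y, M, lut2_x, lut2_y):
    """Apply a CMF of the form F2(M * F1(yuv)) to an (N, 3) array of samples.

    F1 and F2 are piecewise-linear functions given as 1D LUTs (knot
    positions and values); M is a 3x3 matrix mixing the components.
    """
    yuv = np.asarray(yuv, dtype=float)
    # F1: first piecewise-linear LUT, applied component-wise (E1 -> E2)
    e2 = np.interp(yuv, lut1_x, lut1_y)
    # M: 3x3 linear mixing of the components (E2 -> E3)
    e3 = e2 @ np.asarray(M, dtype=float).T
    # F2: second piecewise-linear LUT (E3 -> E4)
    return np.interp(e3, lut2_x, lut2_y)

# identity LUTs and the identity matrix leave the signal unchanged
samples = np.array([[0.2, 0.5, 0.8]])
out = apply_cmf(samples, [0.0, 1.0], [0.0, 1.0], np.eye(3),
                [0.0, 1.0], [0.0, 1.0])
```

This is why signaling only the CMF parameters (two small LUTs and nine matrix coefficients) as metadata is enough for the display to regenerate the second grade.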
Figure 4 represents a flowchart of an image processing method for estimating a color mapping function CMF between first image data I1 and second image data I2 according to a specific and non-limitative embodiment of the invention. The color mapping function CMF is composed of at least a first color mapping function F1 and of a second color mapping function F2: CMF = F1 ∘ F2, where ∘ is the composition operator. The functions F1 and F2 are initialized at an iteration k=0, for example by an identity function, i.e. F1,0(x) = x and F2,0(x) = x.
In step S10, at an iteration k, the second color mapping function F2,k and an inverse of the second color mapping function F2,k⁻¹ are estimated, from the second image data I2 and from the first image data transformed by the first color mapping function estimated at iteration k-1, i.e. F1,k-1(I1).
In step S12, the first color mapping function F1,k is estimated from the first image data I1 and from the second image data transformed by the inverse of the second color mapping function estimated at step S10, i.e. from F2,k⁻¹(I2). At each step, only one function is determined. The whole process can itself be iterated several times until a stop criterion is reached. The stop criterion may be a number of iterations. In this case, the steps are iterated until the number of iterations is equal to K. K is an integer, e.g. K=10. In a variant, the steps are iterated until the absolute variation of the color mapping function parameters (e.g. between two (or more than two) consecutive iterations) is below a threshold value. The color mapping functions F1 and F2 may be 3x3 matrices or 1D Look-Up Tables. Figure 5 represents a color mapping function according to a specific and non-limiting embodiment of the invention. The color mapping function is composed of 3 color mapping functions F1, M and F2. Specifically, F1 and F2 are sets of three mapping functions fZ, where Z ∈ {Y1, U1, V1, Y2, U2, V2}. Said otherwise, the color mapping function is composed of 7 color mapping functions (fY1, fU1, fV1, M, fY2, fU2, fV2). The functions fZ are 1-dimensional piecewise linear (e.g. 1D LUT), and the function M is a 3x3 linear matrix. This specific color transform is based on the combination of existing functions implemented in many screens, displays and TVs. They could be used to implement any kind of color transform, e.g. in the case where the color grading is color space dependent.
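The alternation between steps S10 and S12 can be sketched for the simplest case where both F1 and F2 are 3x3 matrices (the patent also covers 1D-LUT components; here F1 is applied first and F2 to its output, and each matrix is estimated in closed form by least squares). Function names and the stopping details are illustrative assumptions:

```python
import numpy as np

def estimate_cmf(I1, I2, K=10, tol=1e-9):
    """Alternating estimation of a CMF with F1 and F2 taken as 3x3 matrices.

    I1, I2: (N, 3) arrays of corresponding color samples (rows).
    Returns (F1, F2) such that F2 @ F1 @ x approximates the mapping I1 -> I2.
    """
    F1 = np.eye(3)  # initialization at iteration k = 0 (identity function)
    F2 = np.eye(3)
    for _ in range(K):
        # S10: estimate F2 (and its inverse) from I2 and F1_{k-1}(I1)
        X = I1 @ F1.T
        F2_new = np.linalg.lstsq(X, I2, rcond=None)[0].T
        F2_inv = np.linalg.inv(F2_new)
        # S12: estimate F1 from I1 and F2^{-1}(I2)
        Y = I2 @ F2_inv.T
        F1_new = np.linalg.lstsq(I1, Y, rcond=None)[0].T
        # stop when the parameters no longer vary (variant stop criterion)
        converged = (np.abs(F1_new - F1).max() +
                     np.abs(F2_new - F2).max()) < tol
        F1, F2 = F1_new, F2_new
        if converged:
            break
    return F1, F2
```

Note that for two matrices the decomposition is not unique (only the product F2·F1 is constrained); the iteration nevertheless converges to a consistent pair, which is the point of the alternating scheme when the components are of different types.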
The functions fZ, from the set E1 to the set E2 and from the set E3 to the set E4, are described and implemented using 1D LUTs, corresponding to piece-wise linear functions as depicted on Figure 6. The function M, from the set E2 to the set E3, is described and implemented using a 3x3 matrix:

    ( Y0 )   ( g00  g01  g02 ) ( X0 )
    ( Y1 ) = ( g10  g11  g12 ) ( X1 )
    ( Y2 )   ( g20  g21  g22 ) ( X2 )
Figure 7 represents a flowchart of an image processing method for estimating a color transform between first image data and second image data according to a specific and non-limitative embodiment of the invention. In this embodiment, the color mapping function CMF is composed of the color mapping functions depicted on Figure 5. Determining the global color transform (fY1, fU1, fV1, M, fY2, fU2, fV2) is not straightforward because the values of the samples (Y, U, V) in the sets E2 and E3 are not available. Consequently, the method solves each problem separately with an iterative approach. More precisely, at each step only one function is estimated. On this figure, M is estimated first. However, the invention is independent of the order in which the functions are estimated. Exemplarily, fY2, fU2, fV2 can be estimated first, followed by M and fY1, fU1, fV1.
In an initialization step S20, the color mapping function (fY1, fU1, fV1, M, fY2, fU2, fV2) is initialized with a priori values. Exemplarily, the functions fZi are initialized with linear monotonous functions and the matrix M is initialized with the identity matrix.
In step S22, at an iteration k, for each sample (Y1,U1,V1) from the set E1, one can determine its image in E2 using current values of the color mapping function F1,k-1. In addition, for each sample (Y2,U2,V2) from the set E4, one can determine its image in E3 using current values of the inverse color mapping function F⁻¹2,k-1. The determination of Mk can be done using the LSM method, solving 3 systems of 3 equations.
Yi = mi(X0, X1, X2) = gi,0·X0 + gi,1·X1 + gi,2·X2 (4)
For a set of samples ((X0,X1,X2), Yi), the quadratic error is Erri = (Yi - mi(X0,X1,X2))². The LSM method consists in solving a system of 9 equations built from the partial derivatives of mi() with respect to gi,j with i=0,1,2 and j=0,1,2. The inverse matrix M⁻¹k at iteration k can be determined directly from Mk in the case where Mk is invertible, or M⁻¹k can be determined in the same way as Mk using the LSM method.
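A minimal sketch of this matrix estimation, assuming NumPy and sample arrays of shape (n, 3); the function name `estimate_matrix` is illustrative. Each row of M is obtained by solving the 3x3 normal equations (XᵀX)·gi = Xᵀ·yi that result from setting the partial derivatives of the quadratic error to zero, i.e. 3 systems of 3 equations as stated above:

```python
import numpy as np

def estimate_matrix(X, Y):
    """Least-squares estimate of the 3x3 matrix M of eq. (4):
    for each output component i, Y[:, i] ~ gi,0*X0 + gi,1*X1 + gi,2*X2.
    X, Y: arrays of shape (n_samples, 3)."""
    A = X.T @ X  # common 3x3 normal matrix shared by the three systems
    rows = [np.linalg.solve(A, X.T @ Y[:, i]) for i in range(3)]
    return np.vstack(rows)  # row i holds (gi,0, gi,1, gi,2)
```

When Mk is invertible, M⁻¹k can then be obtained directly (e.g. `np.linalg.inv`); otherwise it can be estimated the same way with the roles of X and Y swapped, as the text describes.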
In step S24, at an iteration k, for each sample (Y2,U2,V2) from the set E4, one can determine its image in E2 using current values of the color mapping function, i.e. F⁻¹2,k-1 and M⁻¹k determined at step S22. Next, (fY1, fU1, fV1) at iteration k, i.e. F1,k, can be determined for example using the method disclosed in the paper from Cantoni entitled "Optimal Curve Fitting With Piecewise Linear Functions," IEEE Transactions on Computers, Vol. C-20, No. 1, January 1971, which is hereby incorporated by reference.
In step S26, at an iteration k, for each sample (Y1,U1,V1) from the set E1, one can determine its image in E3 using current values of the color mapping function, i.e. F1,k determined at step S24 and Mk determined at step S22. Next, (fY2, fU2, fV2) at iteration k, i.e. F2,k, can be determined for example using the method of Cantoni. The inverse functions (fY2⁻¹, fU2⁻¹, fV2⁻¹), i.e. F⁻¹2,k, are determined at iteration k using for example the same method (LSM) as for estimating (fY2, fU2, fV2). In the case where (fY2, fU2, fV2) are invertible, (fY2⁻¹, fU2⁻¹, fV2⁻¹), i.e. F⁻¹2,k, are determined directly from (fY2, fU2, fV2) estimated at iteration k. However, these functions may not be invertible if they are not strictly monotonous. Even when they are strictly monotonous, one can face numerical precision issues while implementing such an inverse function from a piecewise-linear representation.
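For the direct-inversion case, a strictly increasing piecewise-linear function has a piecewise-linear inverse obtained by swapping the roles of abscissa and ordinate of its nodes. A minimal sketch (the function name `invert_pwl` is illustrative):

```python
import numpy as np

def invert_pwl(y, X, L):
    """Inverse of a strictly increasing piecewise-linear 1D LUT with
    nodes (X[i], L[i]): the inverse is itself piecewise linear with
    nodes (L[i], X[i])."""
    if not np.all(np.diff(L) > 0):
        raise ValueError("LUT is not strictly monotonous: not invertible")
    return np.interp(y, L, X)
```

As the text notes, round-tripping through the forward LUT and this inverse is only exact up to floating-point interpolation error.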
The iterative principle comprises determining each function step by step, with possible iteration on all or part of the process, as depicted for example on Figure 7. In addition, it comprises estimating a dual model composed of the inverse functions (fY2⁻¹, fU2⁻¹, fV2⁻¹) determined at iteration k using for example the same method (LSM) as for estimating (fY2, fU2, fV2). In the same way, a dual model composed of the inverse of M is also estimated. Alternatively, if M is invertible, an algorithm for inverting matrices can be used directly. These dual models are used to determine the other functions ((fY1, fU1, fV1) or M). In that way, one can determine each function iteratively. At each step, one determines one function only. The whole process can itself be iterated several times until a stop criterion is reached. The stop criterion is a number of iterations. In this case, the steps are iterated until the number of iterations is equal to K. K is an integer, e.g. K=10. In a variant, the steps are iterated until the absolute variation of the function parameters (e.g. the values L(X) or the matrix coefficients) between two (or more than two) consecutive iterations is below a threshold value.
The invention disclosed with 3 functions can be applied with 2 functions, e.g. a 1D LUT and a 3x3 matrix, and also with 3 or more than 3 functions, e.g. a 1D LUT followed by a 3x3 matrix followed by another 1D LUT, or a matrix followed by a 1D LUT followed by a matrix followed by another 1D LUT. Functions other than 1D LUTs and matrices can be used, such as for example polynomial transforms: (X,Y,Z) = f(x, y, z, x², y², z²).
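Applying the composed color mapping function of Figure 5 (1D LUTs, then a 3x3 matrix, then 1D LUTs) can be sketched as follows; this is a non-limitative illustration, with each LUT represented as a pair of node arrays (X, L) and the function name `apply_cmf` chosen for the example:

```python
import numpy as np

def apply_cmf(yuv, luts1, M, luts2):
    """Apply the composed color mapping function:
    per-component 1D LUTs (fY1, fU1, fV1) from E1 to E2, then the
    3x3 matrix M from E2 to E3, then per-component 1D LUTs
    (fY2, fU2, fV2) from E3 to E4.
    yuv: array of shape (n, 3); luts1/luts2: lists of 3 (X, L) pairs."""
    e2 = np.column_stack([np.interp(yuv[:, c], *luts1[c]) for c in range(3)])
    e3 = e2 @ M.T
    return np.column_stack([np.interp(e3[:, c], *luts2[c]) for c in range(3)])
```

This mirrors the chained E1 → E2 → E3 → E4 structure; a 2-function or 4-function variant simply drops or appends stages.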
The estimation of the functions fZi: S → Y is detailed below with respect to Figure 6. For a given point with abscissa S ∈ [Xi;Xi+1], the image of S under fZi is Y such that:
Y = fZi(S) = Li + (Li+1 - Li)·(S - Xi)/(Xi+1 - Xi)
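This interpolation formula can be evaluated directly; the sketch below is a literal transcription of it (the function name `f_Zi` and the interval search via `bisect` are implementation choices, not part of the disclosure):

```python
import bisect

def f_Zi(S, X, L):
    """Piecewise-linear evaluation: find the interval [X[i], X[i+1]]
    containing S, then Y = L[i] + (L[i+1]-L[i])*(S-X[i])/(X[i+1]-X[i])."""
    i = min(max(bisect.bisect_right(X, S) - 1, 0), len(X) - 2)
    return L[i] + (L[i + 1] - L[i]) * (S - X[i]) / (X[i + 1] - X[i])
```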
One has to find the optimal values for the L(Xi) that minimize the quadratic error over all pixels k: L = argminL (Σi Erri) with Erri = Σ_{Sk∈[Xi;Xi+1]} (Yk - fZi(Sk))² for the set of (Sk, Yk) sample values, with Sk ∈ [Xi;Xi+1] of YUV1 and Yk ∈ YUV2, for each interval [Xi;Xi+1], i=0,…,N-1. The Least Square Minimization (LSM) method comprises solving the set of equations obtained by setting the partial derivatives of Err with respect to each Li to zero, i.e. (N+1 equations), with sk = (Sk - Xi)/(Xi+1 - Xi) the local abscissa of sample Sk in its interval and Zi = Σ_{Sk∈[Xi;Xi+1]} (1 - sk)·Yk + Σ_{Sk∈[Xi-1;Xi]} sk·Yk:

Z0 = Σ_{Sk∈[X0;X1]} (sk² - 2sk + 1)·L0 + Σ_{Sk∈[X0;X1]} (-sk² + sk)·L1 (1)

Zi = Σ_{Sk∈[Xi-1;Xi]} (-sk² + sk)·Li-1 + [Σ_{Sk∈[Xi;Xi+1]} (sk² - 2sk + 1) + Σ_{Sk∈[Xi-1;Xi]} sk²]·Li + Σ_{Sk∈[Xi;Xi+1]} (-sk² + sk)·Li+1 (2)

ZN = Σ_{Sk∈[XN-1;XN]} (-sk² + sk)·LN-1 + Σ_{Sk∈[XN-1;XN]} sk²·LN (3)

Equivalent to (N+1 equations):

Z0 = a0,0·L0 + a0,1·L1 (1)
Zi = ai,i-1·Li-1 + ai,i·Li + ai,i+1·Li+1 (2)
ZN = aN,N-1·LN-1 + aN,N·LN (3)

Equivalent to (vectors of size (N+1)x1, matrix of size (N+1)x(N+1)):

Z = A·L, hence L = A⁻¹·Z, where A is the tridiagonal matrix of the coefficients ai,j.
N is an integer. The 1D LUT comprises (N+1) elements: L0 to LN.
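The least-squares fit of the LUT values L0..LN at fixed nodes X described above can be sketched as follows. Each sample contributes to the two hat-function coefficients of its interval, which yields exactly the tridiagonal (N+1)x(N+1) normal equations A·L = Z; the function name `fit_pwl_lut` is illustrative:

```python
import numpy as np

def fit_pwl_lut(S, Y, X):
    """Least-squares fit of L0..LN at fixed nodes X, minimizing
    sum_k (Yk - f(Sk))^2 for the piecewise-linear f of the text."""
    N = len(X) - 1
    # interval index i and local abscissa sk in [0, 1] for each sample
    i = np.clip(np.searchsorted(X, S, side="right") - 1, 0, N - 1)
    s = (S - X[i]) / (X[i + 1] - X[i])
    # design matrix of hat functions: row k is (1-sk) on node i, sk on i+1
    B = np.zeros((len(S), N + 1))
    B[np.arange(len(S)), i] = 1.0 - s
    B[np.arange(len(S)), i + 1] = s
    A = B.T @ B   # the tridiagonal matrix A of coefficients ai,j
    Z = B.T @ Y   # the right-hand side Z
    return np.linalg.solve(A, Z)
```

Solving requires every node to be covered by samples (otherwise A is singular); a dense solve is used here for clarity, though A's tridiagonal structure admits a faster banded solver.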
Figure 8 represents an exemplary architecture of an image processing device 100 configured to estimate a color transform function from first image data and second image data according to a specific and non-limitative embodiment of the invention. The processing device 100 comprises one or more processor(s) 110, which is(are), for example, a CPU, a GPU and/or a DSP (English acronym of Digital Signal Processor), along with internal memory 120 (e.g. RAM, ROM, EPROM). The processing device 100 comprises one or several Input/Output interface(s) 130 adapted to display output information and/or allow a user to enter commands and/or data (e.g. a keyboard, a mouse, a touchpad, a webcam); and a power source 140 which may be external to the processing device 100. The device 100 may also comprise network interface(s) (not shown). The first image data and second image data may be obtained from a source. According to different embodiments of the invention, the source belongs to a set comprising:
- a local memory, e.g. a video memory, a RAM, a flash memory, a hard disk;
- a storage interface, e.g. an interface with a mass storage, a ROM, an optical disc or a magnetic support;
- a communication interface, e.g. a wireline interface (for example a bus interface, a wide area network interface, a local area network interface) or a wireless interface (such as an IEEE 802.11 interface or a Bluetooth interface); and
- an image capturing circuit (e.g. a sensor such as, for example, a CCD (or Charge-Coupled Device) or CMOS (or Complementary Metal-Oxide-Semiconductor)).
According to different embodiments of the invention, the color transform may be sent to a destination. As an example, the color transform is stored in a remote or in a local memory, e.g. a video memory or a RAM, a hard disk. In a variant, the color transform is sent to a storage interface, e.g. an interface with a mass storage, a ROM, a flash memory, an optical disc or a magnetic support and/or transmitted over a communication interface, e.g. an interface to a point to point link, a communication bus, a point to multipoint link or a broadcast network.
According to an exemplary and non-limitative embodiment of the invention, the processing device 100 further comprises a computer program stored in the memory 120. The computer program comprises instructions which, when executed by the processing device 100, in particular by the processor 110, make the processing device 100 carry out the method described with reference to figure 4 or 7. According to a variant, the computer program is stored externally to the processing device 100 on a non-transitory digital data support, e.g. on an external storage medium such as an HDD, CD-ROM, DVD, a read-only and/or DVD drive and/or a DVD Read/Write drive, all known in the art. The processing device 100 thus comprises an interface to read the computer program. Further, the processing device 100 could access one or more Universal Serial Bus (USB)-type storage devices (e.g., "memory sticks") through corresponding USB ports (not shown).
According to exemplary and non-limitative embodiments, the processing device 100 is a device, which belongs to a set comprising:
- a mobile device ;
- a communication device ;
- a game device ;
- a tablet (or tablet computer) ;
- a laptop ;
- a still image camera;
- a video camera ;
- an encoding chip;
- a still image server ; and
- a video server (e.g. a broadcast server, a video-on-demand server or a web server).

The implementations described herein may be implemented in, for example, a method or a process, an apparatus, a software program, a data stream, or a signal. Even if only discussed in the context of a single form of implementation (for example, discussed only as a method or a device), the implementation of features discussed may also be implemented in other forms (for example a program). An apparatus may be implemented in, for example, appropriate hardware, software, and firmware. The methods may be implemented in, for example, an apparatus such as, for example, a processor, which refers to processing devices in general, including, for example, a computer, a microprocessor, an integrated circuit, or a programmable logic device. Processors also include communication devices, such as, for example, computers, cell phones, portable/personal digital assistants ("PDAs"), and other devices that facilitate communication of information between end-users.
Implementations of the various processes and features described herein may be embodied in a variety of different equipment or applications. Examples of such equipment include an encoder, a decoder, a post-processor processing output from a decoder, a pre-processor providing input to an encoder, a video coder, a video decoder, a video codec, a web server, a set-top box, a laptop, a personal computer, a cell phone, a PDA, and other communication devices. As should be clear, the equipment may be mobile and even installed in a mobile vehicle.
Additionally, the methods may be implemented by instructions being performed by a processor, and such instructions (and/or data values produced by an implementation) may be stored on a processor-readable medium such as, for example, an integrated circuit, a software carrier or other storage device such as, for example, a hard disk, a compact diskette ("CD"), an optical disc (such as, for example, a DVD, often referred to as a digital versatile disc or a digital video disc), a random access memory ("RAM"), or a read-only memory ("ROM"). The instructions may form an application program tangibly embodied on a processor-readable medium. Instructions may be, for example, in hardware, firmware, software, or a combination. Instructions may be found in, for example, an operating system, a separate application, or a combination of the two. A processor may be characterized, therefore, as, for example, both a device configured to carry out a process and a device that includes a processor-readable medium (such as a storage device) having instructions for carrying out a process. Further, a processor-readable medium may store, in addition to or in lieu of instructions, data values produced by an implementation.
As will be evident to one of skill in the art, implementations may produce a variety of signals formatted to carry information that may be, for example, stored or transmitted. The information may include, for example, instructions for performing a method, or data produced by one of the described implementations. For example, a signal may be formatted to carry as data the rules for writing or reading the syntax of a described embodiment, or to carry as data the actual syntax-values written by a described embodiment. Such a signal may be formatted, for example, as an electromagnetic wave (for example, using a radio frequency portion of spectrum) or as a baseband signal. The formatting may include, for example, encoding a data stream and modulating a carrier with the encoded data stream. The information that the signal carries may be, for example, analog or digital information. The signal may be transmitted over a variety of different wired or wireless links, as is known. The signal may be stored on a processor- readable medium.
A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made. For example, elements of different implementations may be combined, supplemented, modified, or removed to produce other implementations. Additionally, one of ordinary skill will understand that other structures and processes may be substituted for those disclosed and the resulting implementations will perform at least substantially the same function(s), in at least substantially the same way(s), to achieve at least substantially the same result(s) as the implementations disclosed. Accordingly, these and other implementations are contemplated by this application.

Claims

1. An image processing method for estimating a color transform function between first image data and second image data, said color transform function being composed of at least a first color mapping function and a second color mapping function, wherein estimating said color transform function comprises, for at least an iteration k, k being an integer:
a) estimating (S10) said second color mapping function and an inverse of said second color mapping function from said second image data and from said first image data transformed by said first color mapping function estimated at iteration k-1;
b) estimating (S12) said first color mapping function from said first image data and from said second image data transformed by said inverse of said second color mapping function estimated at step a).
2. The method of claim 1, wherein the color transform function is composed of a 1D Look-Up Table, a matrix and a 1D LUT.
3. A computer program product comprising program code instructions to execute the steps of the image processing method according to claim 1 or 2 when this program is executed on a computer.
4. A processor readable medium having stored therein instructions for causing a processor to perform at least the steps of the image processing method according to claim 1 or 2.
5. An image processing device comprising at least one processor configured to estimate a color transform function between first image data and second image data, said color transform function being composed of at least a first color mapping function and a second color mapping function, estimating said color transform function comprising, for at least an iteration k, k being an integer:
a) estimating said second color mapping function and an inverse of said second color mapping function from said second image data and from said first image data transformed by said first color mapping function estimated at iteration k-1;
b) estimating said first color mapping function from said first image data and from said second image data transformed by said inverse of said second color mapping function estimated at step a).
6. The device of claim 5, wherein the color transform function is composed of a 1D Look-Up Table, a matrix and a 1D LUT.
PCT/EP2015/055834 2014-03-25 2015-03-19 An image processing method and a corresponding device WO2015144566A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP14305424 2014-03-25
EP14305424.5 2014-03-25

Publications (1)

Publication Number Publication Date
WO2015144566A1 true WO2015144566A1 (en) 2015-10-01

Family

ID=50489037

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2015/055834 WO2015144566A1 (en) 2014-03-25 2015-03-19 An image processing method and a corresponding device

Country Status (1)

Country Link
WO (1) WO2015144566A1 (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060082843A1 (en) * 2004-10-14 2006-04-20 International Business Machines Corporation Method and system for calibration and characterization of joint nonlinear and linear transformations for a color input or output device
EP1729257A2 (en) * 2005-06-02 2006-12-06 Xerox Corporation Inter-separation decorrelator


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10375410B2 (en) 2014-10-17 2019-08-06 Interdigital Vc Holdings, Inc. Method for color mapping a video signal based on color mapping data and method of encoding a video signal and color mapping data and corresponding devices
US10771802B2 (en) 2014-10-17 2020-09-08 Interdigital Vc Holdings, Inc. Method for color mapping a video signal based on color mapping data and method of encoding a video signal and color mapping data and corresponding devices
EP3213510A1 (en) * 2014-10-29 2017-09-06 Thomson Licensing A method and device for estimating a color mapping between two different color-graded versions of a picture
US11025875B2 (en) 2015-05-22 2021-06-01 Interdigital Vc Holdings, Inc. Method for color mapping a video signal and method of encoding a video signal and corresponding devices
EP3122032A1 (en) * 2015-07-22 2017-01-25 Thomson Licensing A method and device for estimating a color mapping between two different color-graded versions of a picture


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15710537

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15710537

Country of ref document: EP

Kind code of ref document: A1