GB2432472A - Automated colour matching between images - Google Patents

Automated colour matching between images

Info

Publication number
GB2432472A
Authority
GB
United Kingdom
Prior art keywords
colour
image
images
mapping
histograms
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB0517990A
Other versions
GB2432472B (en)
GB0517990D0 (en)
Inventor
Francois Pitie
Anil Kokaram
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Greenparrotpictures Ltd
Original Assignee
Greenparrotpictures Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Greenparrotpictures Ltd filed Critical Greenparrotpictures Ltd
Priority to GB0517990A priority Critical patent/GB2432472B/en
Publication of GB0517990D0 publication Critical patent/GB0517990D0/en
Publication of GB2432472A publication Critical patent/GB2432472A/en
Application granted granted Critical
Publication of GB2432472B publication Critical patent/GB2432472B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/40Image enhancement or restoration by the use of histogram techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T5/92
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/97Determining parameters from multiple pictures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46Colour picture communication systems
    • H04N1/56Processing of colour picture signals
    • H04N1/60Colour correction or control
    • H04N1/6077Colour balance, e.g. colour cast correction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/646Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/73Colour balance circuits, e.g. white balance circuits or colour temperature control
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image

Abstract

The invention is concerned with automating colour matching between images, particularly in post production environments. The colour distributions in respective images may be matched using a linear process of colour distribution transformation involving Principal Component Analysis. In a second step, which may be performed after the first colour distribution transformation step, a series of one dimensional histograms may be applied along multiple directions in the colour space, the one dimensional histograms being applied to the colour space distributions of both images and describing a change of coordinate system rather than a transform between colour distributions. These one dimensional histograms may be applied in an iterative process and a smoothing post process may also be applied after the iterative process is complete.

Description

<p>Method for matching colour in images</p>
<p>Background of the Invention</p>
<p>A major problem in the post production industry is matching the colour between different shots possibly taken at different times in the day. This process is part of the large activity of film grading in which the film material is digitally manipulated to have consistent grain and colour. The term colour grading will be used specifically to refer to the matching of colour. Colour grading is important because shots taken at different times under natural light can have a substantially different feel due to even slight changes in lighting.</p>
<p>Currently in the industry, colour balancing (as it is called) is achieved by experienced artists who use edit hardware and software to manually match the colour between frames by tuning parameters.</p>
<p>For instance, in an effort to balance the red colour, the digital samples in the red channel in one frame may be multiplied by some factor and the output image viewed and compared to the colour of some other (target) frame. The factor is then adjusted if the match in colour is not quite right. The amount of adjustment, and whether it is an increase or a decrease, depends crucially on the experience of the artist. This is a delicate task because the change in lighting conditions induces a very complex change of illumination. It would be beneficial to automate this task in some way. There has been no prior art targeted specifically to this problem in the film and digital video industry.</p>
<p>However, there are a few articles which disclose ideas that could be used.</p>
<p>Transfer of Colour Statistics: One popular method proposed by Reinhard [15] matches the mean and variance of the target image to the source image. The transfer of statistics is performed separately on each channel. Since the RGB colour space is highly correlated, the transfer is done in another colourspace, lαβ. This colourspace has been proposed in an effort to account for human perception of colour [16]. But the method is limited to linear transformations. In fact, in the motion picture industry colour grading routinely employs non-linear transformations. Hence, in a practical situation, some example-based recolouring scenarios actually require a non-linear colour mapping. Figure 1 shows exactly this problem, and the method fails to transfer any useful statistics.</p>
<p>The problem of finding a non-linear colour mapping is addressed in particular in [13] for colour equalisation (cf. grayscale histogram equalisation). That work proposes to deform tessellation meshes in the colour space to fit the 3D histogram of a uniform distribution. This method can be seen as related to warping theory, which is explicitly used in [10], where the transfer of the 2D chromatic space is performed directly by using a 2D biquadratic warping. Without having to invoke image warping, a natural extension of the 1D case is to treat the mapping via linear programming and the popular Earth Mover's distance [11]. The major disadvantages of that method are that 1) the mapping is not continuous and 2) pixels of the same colour may be mapped to pixels of different colours, which requires a random selection. Furthermore the computational cost becomes intractable if a very fine clustering of the colour space is desired.</p>
<p>Dealing with Content Variations: One important aspect of the colour transfer problem is the change of content between the two pictures. Consider a pair of landscape images in which the sky covers a larger area in one picture than in the other. When transferring the colour from one picture to the other, the sky colour may therefore also be applied to parts of the scenery on the ground. Because all colour transfer algorithms are sensitive to variations in the areas of the image occupied by the same colour, they risk overstretching the colour mappings and thus producing unbelievable renderings. To deal with this issue a simple solution (presented in [15]) is to manually select swatches in both pictures and thus associate colour clusters corresponding to the same content. This is tantamount to performing manual image segmentation, and is simply impractical for a large variety of images, and certainly for sequences of images.</p>
<p>One automated solution is to invoke the spatial information of the images to constrain the colour mapping [18, 7, 8]. In an extreme situation, colour from a coloured image may be required to be transferred to a grayscale image; similarity between spatial neighbourhoods of the two pictures is then the only way to create a colour transfer operation automatically. This is a computationally demanding solution. Another automated solution is to restrict the variability of the colour mapping. For example in [2], the pixels of both images are classified into a restricted set of basic colour categories derived from psycho-physiological studies (red, blue, pink, ...). The colour transfer then ensures, for instance, that blue-ish pixels remain blue-ish pixels. This gives a more natural transformation. The disadvantage is that it limits the range of possible colour transfers.</p>
<p>Summary of the Invention</p>
<p>The goal of the invention is to estimate a colour mapping that will transform a first (source) colour distribution into a second (target) distribution via matching operations. The following are considered to be novel aspects of the invention.</p>
<p>* The invention uses multiple one-dimensional histogram matching operations. These one-dimensional histograms are derived from projections along multiple directions in the colour space.</p>
<p>* Each one-dimensional histogram matching operation is preferably non-linear. The net effect is therefore a non-linear transformation of colour space.</p>
<p>* The invention is preferably iterative, using successive one-dimensional histogram matching operations, preferably until the one-dimensional histograms of both the source and target images match within some predefined error.</p>
<p>* The invention uses only one-dimensional histogram manipulation instead of multidimensional as in [13, 10, 11], and is iterative.</p>
<p>* There is no need for spatial information in the manner of [18, 7, 8].</p>
<p>* The process is known to converge, i.e. it will yield a final colour distribution without further change in finite time.</p>
<p>* The method is preferably completely non-parametric and is very effective at matching arbitrary colour distributions.</p>
<p>* The method preferably uses the raw image data to create the histograms required. Hence measurement directly from the image is sufficient and there is no need to create mathematical models.</p>
<p>* The colour distribution can be gained directly from the image without introducing additional parameters or making use of explicit models or assumptions.</p>
<p>* The invention can preferably be combined with a noise removal stage. This reduces the grain artefact caused by many histogram matching tools and creates a much better quality image.</p>
<p>* The noise removal step includes the novel element of constraining the gradients in the output image to be close to those in the input image. It is clear to the person skilled in this area that this step can be employed independently of the matching aspect of the invention as described above.</p>
<p>* The invention has the advantage of being completely automatic and does not require user interaction at any stage.</p>
<p>Short Description of the Drawings</p>
<p>Figure 1: Example of Colour transfer using Reinhard [15] Colour Transfer. The transfer fails to resynthesise the colour scheme of the target image. To be successful the method would require human interaction.</p>
<p>Figure 2: Example of 1D pdf transfer on grayscale pictures.</p>
<p>Figure 3: Distribution Transfer Diagram. The first part of the process is optional but can handle linear transformations very effectively. The second step of the process can handle any non-linear transformation. At the end of the iterations, when no more improvement is visible, the two sets of samples share the exact same statistics. Whilst using this process for matching colours in images, a grain artefact can appear and the noise smoothing module is used to address this problem.</p>
<p>Figure 4: Colour Transfer Process in action. The final result (f) presents the same colour distribution as the target image (a).</p>
<p>Figure 5: Result of noise smoothing on grayscale images. The two consecutive archive frames (a) and (b) suffer from extreme brightness variation. The corresponding mapping transformation (e) is overstretched, which results in an increased level of noise on the mapped original frame (c).</p>
<p>The proposed grain artefact reducer is able to reproduce the noise level of the original picture.</p>
<p>The top of the original picture is saturated and cannot be retrieved but the process succeeds in preserving the soft gradient.</p>
<p>Figure 6: Result of noise smoothing for colour picture. The details of the picture are preserved, while the spurious graininess in sky has been washed out.</p>
<p>Figure 7: Example of employing the proposed Colour Grading process for matching lighting conditions.</p>
<p>The colour properties of the sunset are used to synthesise the 'evening' scene depicted at sunset.</p>
<p>Detailed description of the invention</p>
<p>This invention is a process for automating the colour grading or colour balancing task even when the lighting conditions have dramatically changed as shown in figure 1.4. It resolves many of the problems with previous work and is computationally extremely efficient. Its implementation in hardware is straightforward but an implementation exploiting the Graphics Subsystem of general purpose processors is described.</p>
<p>Consider transferring the colour from image (a) to image (b) in figure 4. One simple approach is to separately match the red, green and blue channels using the standard 1D histogram matching technique. After one iteration of this, there is no further change in the mapping since the 1D histograms would not change. Typically this also yields a poor result. Figure 4-(c) shows this poor result. The crucial point in this invention is to notice that if a colourspace transformation (a rotation of the colourspace) is applied to the colour data between each iteration of 1D matching, this causes the entire colour surface distribution of image (b) to eventually warp into the shape of the colour distribution in image (a). Figure 4-(f) shows the final colour transfer appearance, in which image (b) has been coloured with the colours in image (a). The amazing observation is that even if a random sequence of rotations is used, the process will converge eventually to a valid picture with some successful colour transfer properties. In addition the transformation need not be a rotation, although other transformations may be less efficient. In a sense what this invention proposes is that if 1D slices of the multidimensional colour space are matched independently, eventually the entire multidimensional surface will be mapped effectively. To understand the nature of the invention, it is necessary to expose some background on image histograms.</p>
<p>An image histogram is simply the frequency distribution of colour in the image. The histogram of a gray scale image is 1D because pixels assume only scalar values of gray, typically from 0 to 255 for instance. Figure 2 shows this. The histogram of a colour image is 3D because each pixel has associated with it 3 values for red, green and blue (in RGB space), or luminance and two colour components (in YUV space). Matching 1D histograms is a well understood process, used in histogram equalisation for instance. Consider the two gray scale images shown in figure 2.</p>
<p>Assume that it is required to transfer the brightness distribution of the image on the left to the one on the right. To do so requires mapping each of the 256 gray scale values in the image on the right to some other grayscale value such that the resulting brightness of image 2 is the same as image 1.</p>
<p>A technique for creating this mapping is well known [4]. The process is as follows.</p>
<p>1. Set the histogram of image 1 to be stored in some memory array H1 such that the nth element of that array is H1[n] for n = 0...255. Similarly set the histogram of image 2 to be H2[n].</p>
<p>2. Calculate the cumulative density functions C1, C2 by summing the elements of each memory array. Hence calculate $C1[n] = \sum_{m=0}^{n} H1[m]$ and $C2[n] = \sum_{m=0}^{n} H2[m]$.</p>
<p>3. Create the required grayscale mapping by scanning the elements of C1 and mapping the grayscale of each entry in C1 to the grayscale value that indexes the value having the least difference in C2.</p>
<p>* For each m = 0...255, find the value of n which makes |C1[m] - C2[n]| smallest. Define this value as T[m].</p>
<p>* The mapping for m is then T[m].</p>
<p>The result is then an array T[m] which contains, for each element m = 0...255, a corresponding number that is the grayscale to be assigned. To apply the transformation to image 1, each pixel in image 1 is visited and its grayscale, I say, is used to look up in T[m] to yield the new grayscale. Hence for a pixel of grayscale I the new grayscale to be assigned is T[I].</p>
<p>There are variants on this process of course, but this is just one mechanism for achieving 1-D histogram matching. The resulting image should then have an intensity distributed similarly to image 2.</p>
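<p>As an illustration only (not the patent's reference implementation), the three steps above can be sketched in a few lines of NumPy; the function name, the 256-level assumption and the use of searchsorted (which picks the smallest n with C2[n] ≥ C1[m], a common stand-in for the least-difference rule of step 3) are choices made here.</p>

```python
import numpy as np

def match_histogram_1d(source, target, levels=256):
    """Remap the integer grey levels of `source` so that its histogram
    approximates that of `target` (both arrays with values in [0, levels-1])."""
    # Step 1: histograms H1 (source) and H2 (target).
    h1, _ = np.histogram(source, bins=levels, range=(0, levels))
    h2, _ = np.histogram(target, bins=levels, range=(0, levels))

    # Step 2: cumulative distributions C1 and C2, normalised to [0, 1].
    c1 = np.cumsum(h1).astype(float); c1 /= c1[-1]
    c2 = np.cumsum(h2).astype(float); c2 /= c2[-1]

    # Step 3: for each grey level m, take the smallest n with C2[n] >= C1[m];
    # the resulting array T is the look-up table described in the text.
    T = np.clip(np.searchsorted(c2, c1), 0, levels - 1)

    # Apply the mapping: a pixel of grey level i becomes T[i].
    return T[source]
```

<p>For colour images the same routine is applied independently to each projected colour axis inside the iterative process described below.</p>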
<p>Unfortunately, this cannot work for colour because there are more dimensions involved. A naive approach would be to separate each colour component (red, green and blue) and match the histograms of each colour plane individually. An example of doing this is shown in figure 4-(c). It is clearly poor. However, the crucial aspect of the current invention is that this idea can be made to work by transforming the colour space before performing the histogram matching, and then repeating this process iteratively. Of course, to return to the original colourspace, the rotation must be undone after each mapping.</p>
<p>Consider the situation at iteration k of this process. The image pixels involved are the current image $I^{(k)}$ (which has been transformed up to iteration k) and the image data from G, the target image. G is the image containing the colour distribution that is required. The first step of the iteration is to change the coordinate system by rotating both the colour samples of $I^{(k)}$ and the samples of G. Define this rotation operation as R:</p>
<p>$$ R = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\alpha & \sin\alpha \\ 0 & -\sin\alpha & \cos\alpha \end{pmatrix} \begin{pmatrix} \cos\beta & 0 & -\sin\beta \\ 0 & 1 & 0 \\ \sin\beta & 0 & \cos\beta \end{pmatrix} \begin{pmatrix} \cos\gamma & \sin\gamma & 0 \\ -\sin\gamma & \cos\gamma & 0 \\ 0 & 0 & 1 \end{pmatrix} \qquad (1) $$</p>
<p>with α, β, γ some angles. Each pixel colour vector I = (r, g, b) is transformed into I' = (r', g', b') as follows:</p>
<p>$$ \begin{pmatrix} r' \\ g' \\ b' \end{pmatrix} = R \begin{pmatrix} r \\ g \\ b \end{pmatrix} \qquad (2) $$</p>
<p>This operation is equivalent to a colour space conversion. In a second step the samples of both distributions are projected onto the new axes by summing in the relevant direction. Hence summing along the transformed red and green colour axes yields a projection along the transformed blue axis.</p>
<p>This yields three histograms for the three colour planes in each sequence. Define these histograms as $H^I_1, H^I_2, H^I_3$ and $H^G_1, H^G_2, H^G_3$. Then it is possible, using the 1D process above, to find for each axis i = 1, 2, 3 the mapping $t_i$ that transforms the histogram $H^I_i$ into $H^G_i$. The resulting transformation t maps a pixel of colour $(I_1, I_2, I_3)$ onto $t(I_1, I_2, I_3) = (t_1(I_1), t_2(I_2), t_3(I_3))$. The iteration is completed with a rotation of the samples by $R^{-1}$ to return to the original coordinate system.</p>
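<p>For illustration (this snippet is not taken from the patent text), the rotation of equation (1) and its application to an N x 3 array of pixel colours, equation (2), might be written as follows; the angles are arbitrary parameters here.</p>

```python
import numpy as np

def rotation_matrix(alpha, beta, gamma):
    """Compose the three elementary rotations of equation (1)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    rx = np.array([[1.0, 0.0, 0.0], [0.0, ca, sa], [0.0, -sa, ca]])
    ry = np.array([[cb, 0.0, -sb], [0.0, 1.0, 0.0], [sb, 0.0, cb]])
    rz = np.array([[cg, sg, 0.0], [-sg, cg, 0.0], [0.0, 0.0, 1.0]])
    return rx @ ry @ rz

def rotate_pixels(pixels, R):
    """Apply equation (2) to every row of an N x 3 array of (r, g, b) colours."""
    return pixels @ R.T
```

<p>Because R is orthogonal its inverse is its transpose, so rotating back at the end of an iteration is simply rotated_pixels @ R.</p>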
<p>When no new iteration can modify the iterated samples of $I^{(k)}$, it means that the process has converged to the final solution and the colour statistics of $I^{(k)}$ and G are identical. Note that this process is not restricted to 3 dimensions. It can be applied to vector spaces of any dimensionality. The ideas remain the same; the only difference is that the rotation matrices fit the relevant dimension, i.e. for a vector space of 4 dimensions the rotation matrix is 4 x 4, etc.</p>
<p>1.1 The steps</p>
<p>The full process is presented in a separate figure on this page and is simple to implement as it requires no extra parameters.</p>
<p>Process 1: Colour Transfer Process</p>
<p>1: Initialisation. Retrieve the pixels of the original image I and the target image G. G is the image containing the colour distribution that is required. For example, the image pixel $I = (I_1, I_2, I_3)$ encapsulates the red, green and blue components. Set $k \leftarrow 0$ and $I^{(0)} \leftarrow I$.</p>
<p>2: repeat</p>
<p>3: take one rotation matrix R</p>
<p>4: rotate the pixel samples: $I_r \leftarrow R\,I^{(k)}$ and $G_r \leftarrow R\,G$</p>
<p>5: project the samples on all 3 axes to get the histograms $H^I_1, H^I_2, H^I_3$ and $H^G_1, H^G_2, H^G_3$</p>
<p>6: for each axis i, find the 1D transformation $t_i$ that matches the histogram $H^I_i$ into $H^G_i$</p>
<p>7: remap the pixel colours $I_r$ according to the 1D transformations. For example, a sample $(I_1, I_2, I_3)$ is remapped into $(t_1(I_1), t_2(I_2), t_3(I_3))$</p>
<p>8: rotate back the pixel colours to the original colour space: $I^{(k+1)} \leftarrow R^{-1} I_r$</p>
<p>9: $k \leftarrow k + 1$</p>
<p>10: until convergence on all histograms for every possible rotation</p>
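<p>A compact NumPy sketch of Process 1 follows. It is an illustration under assumptions made here, not the patent's reference implementation: random orthogonal matrices are drawn by QR decomposition instead of the explicit Euler-angle form of equation (1), a fixed iteration count stands in for the convergence test of step 10, and the 1D matching of step 6 is done by interpolating between cumulative histograms.</p>

```python
import numpy as np

def match_1d(src, tgt, bins=300):
    """1D pdf transfer of the samples `src` towards the distribution of `tgt`."""
    lo = min(src.min(), tgt.min())
    hi = max(src.max(), tgt.max())
    edges = np.linspace(lo, hi, bins + 1)
    centres = 0.5 * (edges[:-1] + edges[1:])
    cs = np.cumsum(np.histogram(src, bins=edges)[0]).astype(float); cs /= cs[-1]
    ct = np.cumsum(np.histogram(tgt, bins=edges)[0]).astype(float); ct /= ct[-1]
    # Send each sample to the value with the same cumulative probability in the
    # target histogram (linear interpolation; adequate for this sketch).
    mapped_centres = np.interp(cs, ct, centres)
    return np.interp(src, centres, mapped_centres)

def colour_transfer(I, G, n_iter=20, rng=None):
    """Iteratively reshape the colour distribution of I (N x 3) into that of G (M x 3)."""
    rng = np.random.default_rng(0) if rng is None else rng
    Ik = I.astype(float).copy()
    G = G.astype(float)
    for _ in range(n_iter):
        # Steps 3-4: one change of coordinate system, applied to both sample sets.
        R, _ = np.linalg.qr(rng.standard_normal((3, 3)))
        Ir, Gr = Ik @ R.T, G @ R.T
        # Steps 5-7: match the three 1D marginal histograms independently.
        for axis in range(3):
            Ir[:, axis] = match_1d(Ir[:, axis], Gr[:, axis])
        # Step 8: rotate back to the original colour space (R is orthogonal).
        Ik = Ir @ R
    return Ik
```

<p>In practice the images are reshaped to (height × width, 3) arrays of floats before calling colour_transfer, and the result is reshaped back and clipped to the valid range.</p>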
<p>1.2 Linear pre-Process</p>
<p>Although the method can cope with any kind of mapping, it might be desirable as a first step to use a different technique for registering global transformations between the two distributions. In particular, the proposed method is not naturally designed to find a rotation between the original and target datasets. This is not to be confused with the rotations used in the pdf transfer process. The rotation considered here is the rotation between the two distributions, whereas the rotations mentioned in the pdf transfer process are applied to both distributions and only describe a change of coordinate system. The linear mapping that we are looking for is of the form G = AI + b. A simple solution to this problem is to use the Principal Component Analysis of each of the images:</p>
<p>$$ (I - \bar{I})^t (I - \bar{I}) = U_I D_I U_I^t \qquad (3) $$</p>
<p>$$ (G - \bar{G})^t (G - \bar{G}) = U_G D_G U_G^t \qquad (4) $$</p>
<p>where $U_I = [u^I_1, u^I_2, u^I_3]$ and $U_G = [u^G_1, u^G_2, u^G_3]$ are the 3 x 3 orthogonal matrices containing the eigenvectors of the covariance matrices of the pixel colour vectors of I and G. The diagonal matrices $D_I$ and $D_G$ contain the eigenvalues corresponding to the eigenvectors in $U_I$ and $U_G$.</p>
<p>Registering the pixel colour vectors of I to the target distribution G is then possible by employing the following linear transformation:</p>
<p>$$ I^{(0)} = (U_G D_G^{1/2})\,(U_I D_I^{1/2})^{-1}\,(I - \bar{I}) + \bar{G} \qquad (5) $$</p>
<p>The colour distribution of the image $I^{(0)}$ shares the same average value, variance and global colour directions as the target image G. This operation cannot handle non-linear transformations but offers an interesting speed up if it is used as a first step, before the actual pdf transfer process.</p>
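<p>A sketch of this linear registration, equations (3) to (5), together with the eigenvalue ordering and sign alignment of equation (6) described next, could look as follows; it is an illustration written for this description (colour samples are assumed to be stored as N x 3 arrays, and the sample covariance is used for the scatter matrices).</p>

```python
import numpy as np

def pca_register(I, G, eps=1e-8):
    """Linearly map the colour samples I (N x 3) so that their mean, variance
    and principal colour directions match those of G (M x 3)."""
    I = I.astype(float)
    G = G.astype(float)
    mu_I, mu_G = I.mean(axis=0), G.mean(axis=0)

    # Eigen-decompositions of the colour covariance matrices, equations (3)-(4).
    dI, UI = np.linalg.eigh(np.cov(I.T))
    dG, UG = np.linalg.eigh(np.cov(G.T))

    # Order by decreasing eigenvalue, then flip signs so that paired
    # eigenvectors do not point in opposite directions, equation (6).
    oI, oG = np.argsort(dI)[::-1], np.argsort(dG)[::-1]
    dI, UI = dI[oI], UI[:, oI]
    dG, UG = dG[oG], UG[:, oG]
    flip = np.sign(np.sum(UI * UG, axis=0))
    flip[flip == 0] = 1.0
    UG = UG * flip

    # Equation (5): whiten with I's principal axes, re-colour with G's.
    A = (UG @ np.diag(np.sqrt(np.maximum(dG, 0.0)))) @ \
        np.linalg.inv(UI @ np.diag(np.sqrt(np.maximum(dI, eps))))
    return (I - mu_I) @ A.T + mu_G
```

<p>Running this registration before Process 1 removes the global linear part of the colour change, so fewer iterations of the pdf transfer are needed.</p>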
<p>To obtain a correct transformation, the eigenvectors in $U_I$ and $U_G$ have to correspond. This is achieved by ordering them with respect to the magnitude of the corresponding eigenvalues and making sure that they do not point in opposite directions, i.e.</p>
<p>$$ \forall i = 1 \ldots 3, \quad u^I_i \cdot u^G_i > 0 \qquad (6) $$</p>
<p>1.3 Noise smoothing</p>
<p>The colour mapping transfers the target colour palette correctly to the original picture, but it might also produce some grain artefacts as shown in figures 5 and 6. When the content differs or the dynamic ranges of the two pictures are too different, the resulting mapping function can be stretched in some parts (see figure 5-d), which enhances the noise level. This can be understood by taking the simple example of a linear transformation t of the original picture X: t(X) = aX + b. The overall variance of the resulting picture is changed to var{t(X)} = a² var{X}. This means that a greater stretching (a > 1) produces greater noise.</p>
<p>The solution proposed here to reduce the grain is to run a post-process that forces the level of noise to remain the same. The idea is to adjust the gradient field of the result picture so that it matches the original picture. If the gradients of both pictures are similar, the level of noise will be the same. Matching the gradient of a picture has been addressed in different applications like image stitching [12] or high dynamic range compression [3], and it can be efficiently solved using a variational approach.</p>
<p>Denote I(x, y) as the original picture. To simplify the following discussion, coordinates are omitted in the expressions and I, J, φ, ψ, etc. actually refer to I(x, y), J(x, y), φ(x, y) and ψ(x, y).</p>
<p>Let t: I → t(I) be the colour transformation. The problem is to find a modified image J of the mapped picture t(I) that minimises over the whole picture range</p>
<p>$$ \min_J \iint \psi \,\| \nabla J - \nabla I \|^2 + \phi \,\| J - t(I) \|^2 \; dx\, dy \qquad (7) $$</p>
<p>with Neumann boundary condition $\nabla J|_{\partial\Omega} = \nabla I|_{\partial\Omega}$ to match the gradient of J with the gradient of I at the picture border. The term $\| \nabla J - \nabla I \|^2$ forces the image gradient to be preserved.</p>
<p>The term $\| J - t(I) \|^2$ ensures that the colours remain close to the target picture. Without $\| J - t(I) \|^2$, a solution of equation (7) would actually be the original picture I. The weight fields φ and ψ affect the importance of both terms. Of course numerous expressions for φ and ψ are possible. For example, the weight field φ can emphasise that only flat areas have to remain flat while the gradient can change at object borders:</p>
<p>$$ \phi = 1 + o\,\| \nabla I \|, \quad \text{with } o \text{ constant} \qquad (8) $$</p>
<p>A possible choice for the weight field ψ is to account for the possible stretching of the transformation t. Where ∇t is big, the grain becomes more visible:</p>
<p>$$ \psi = 1 + o\,\| (\nabla t)(I) \|, \quad \text{with } o \text{ constant} \qquad (9) $$</p>
<p>(∇t)(I) refers to the gradient of t for the colour I.</p>
<p>Numerical Solution. The minimisation problem in equation (7) can be solved using the variational principle, which states that the integrand F must satisfy the Euler-Lagrange equation:</p>
<p>$$ \frac{\partial F}{\partial J} - \frac{d}{dx}\frac{\partial F}{\partial J_x} - \frac{d}{dy}\frac{\partial F}{\partial J_y} = 0 \qquad (10) $$</p>
<p>where</p>
<p>$$ F = \psi \,\| \nabla J - \nabla I \|^2 + \phi \,\| J - t(I) \|^2 \qquad (11) $$</p>
<p>from which the following can be derived:</p>
<p>$$ \phi\, J - \mathrm{div}(\psi\, \nabla J) = \phi\, t(I) - \mathrm{div}(\psi\, \nabla I) \qquad (12) $$</p>
<p>This is an elliptic partial differential equation. The expression div(ψ ∇I) can be approximated using standard finite differences [17] by:</p>
<p>$$ \mathrm{div}(\psi\, \nabla I)_i \approx \sum_{j \in \mathcal{N}_i} (\psi_i + \psi_j)\,(I_j - I_i) \qquad (13) $$</p>
<p>where $\mathcal{N}_i$ corresponds to the four neighbouring pixels of i. This yields the following linear system:</p>
<p>$$ a(x,y)J(x,y-1) + b(x,y)J(x,y+1) + c(x,y)J(x-1,y) + d(x,y)J(x+1,y) + e(x,y)J(x,y) = f(x,y) \qquad (14) $$</p>
<p>with</p>
<p>$$ \begin{aligned} a(x,y) &= \psi(x,y-1) + \psi(x,y) \\ b(x,y) &= \psi(x,y+1) + \psi(x,y) \\ c(x,y) &= \psi(x-1,y) + \psi(x,y) \\ d(x,y) &= \psi(x+1,y) + \psi(x,y) \\ e(x,y) &= -\big(\phi(x,y) + a(x,y) + b(x,y) + c(x,y) + d(x,y)\big) \\ f(x,y) &= \big(\psi(x,y)+\psi(x,y-1)\big)\big(I(x,y-1)-I(x,y)\big) + \big(\psi(x,y)+\psi(x,y+1)\big)\big(I(x,y+1)-I(x,y)\big) \\ &\quad + \big(\psi(x,y)+\psi(x-1,y)\big)\big(I(x-1,y)-I(x,y)\big) + \big(\psi(x,y)+\psi(x+1,y)\big)\big(I(x+1,y)-I(x,y)\big) - \phi(x,y)\,t(I(x,y)) \end{aligned} $$</p>
<p>The final process is identical to a 2D IIR filter in which the coefficients a, b, c, d, e and f only depend on the original picture I and the mapped picture t(I) and thus can be computed beforehand.</p>
<p>The unknown result picture J can be obtained by standard iterative methods like SOR or Gauss-Seidel with a multigrid approach. Implementations of these numerical solvers are widely available and one can refer for instance to the Numerical Recipes [14]. The main step of these methods is to solve iteratively for J(x, y):</p>
<p>$$ J^{(k+1)}(x,y) = \frac{1}{e(x,y)} \Big( f(x,y) - a(x,y)J^{(k)}(x,y-1) - b(x,y)J^{(k)}(x,y+1) - c(x,y)J^{(k)}(x-1,y) - d(x,y)J^{(k)}(x+1,y) \Big) \qquad (15) $$</p>
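<p>The sketch below assembles the coefficient images of equation (14) and runs the fixed-point iteration of equation (15) on a single channel with NumPy. It is an illustration based on the reconstruction above, not the patent's reference implementation: the o constants of equations (8) and (9), the border handling (edge replication as a crude Neumann condition) and the approximation of (∇t)(I) by the ratio of image gradient magnitudes are all choices made here.</p>

```python
import numpy as np

def nb(img, dy, dx):
    """Neighbour image: value at [y, x] is img[y+dy, x+dx], border replicated."""
    p = np.pad(img, 1, mode='edge')
    h, w = img.shape
    return p[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]

def regrain(I, tI, o_phi=1.0, o_psi=1.0, n_iter=200):
    """Solve equations (14)/(15) for J given the original I and the mapped t(I)."""
    I, tI = I.astype(float), tI.astype(float)

    # Gradient magnitudes of the original and of the mapped picture.
    gIy, gIx = np.gradient(I)
    gTy, gTx = np.gradient(tI)
    grad_I = np.hypot(gIy, gIx)
    grad_tI = np.hypot(gTy, gTx)

    # Weight fields, following equations (8) and (9) as reconstructed above.
    phi = 1.0 + o_phi * grad_I                      # colour fidelity, stronger at edges
    psi = 1.0 + o_psi * grad_tI / (grad_I + 1e-6)   # gradient preservation, stronger where t stretches

    # Coefficient images of equation (14); nb(psi, -1, 0) is psi at (x, y-1), etc.
    a = nb(psi, -1, 0) + psi
    b = nb(psi,  1, 0) + psi
    c = nb(psi,  0, -1) + psi
    d = nb(psi,  0,  1) + psi
    e = -(a + b + c + d + phi)
    f = (a * (nb(I, -1, 0) - I) + b * (nb(I, 1, 0) - I)
         + c * (nb(I, 0, -1) - I) + d * (nb(I, 0, 1) - I)) - phi * tI

    # Fixed-point iteration of equation (15) (plain Jacobi; SOR or Gauss-Seidel
    # with multigrid, as suggested in the text, converge faster).
    J = tI.copy()
    for _ in range(n_iter):
        J = (f - a * nb(J, -1, 0) - b * nb(J, 1, 0)
               - c * nb(J, 0, -1) - d * nb(J, 0, 1)) / e
    return J
```

<p>For a colour picture the routine can be applied to each channel separately, or to the luminance only.</p>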
<p>1.4 GPU embodiment</p>
<p>The main computational demands in the first step of this process are the colourspace conversion and the look-up table for mapping pixels. In the second, noise smoothing, step the solution of the system requires a substantial amount of spatial processing. It is possible to propose the use of Graphics Hardware from companies such as NVIDIA and ATI to provide a hardware embodiment.</p>
<p>An efficient system can be created by using a general purpose CPU for control and data management while employing the graphics subsystem for data processing. The main subsystems are proposed below.</p>
<p>1. It is well known that Graphics Processing Units (GPUs) can perform multiply/add operations on images [6]. Colourspace conversion in particular is built in as a function on chips from NVIDIA and ATI. The calculation of R·I in equation (2) can be performed on such chips directly.</p>
<p>2. The creation of the 1D mapping look up table can be performed using LUT operations on the GPU. The entire image can be mapped in a single operation using the GPU functionality.</p>
<p>3. It is well known that all the fundamental operations required for Gauss-Seidel multigrid optimisation can be performed on the GPU [9, 1]. A process description has been previously presented in [5]. That module can be used with minimal change for this application.</p>
<p>We do not present here the details of the GPU programming needed to achieve this. This programming manifestation will change with generations of GPUs and generations of tool development kits. The point to be made here is that the GPU represents a readily available, cheap computational resource that can be taken advantage of for this invention. With the processes as outlined above, frame rates at near real time are achieved with current hardware.</p>
<p>References [1] Jeff Bolz, Ian Farmer, Eitan Grinspun, and Peter Schroder. Sparse matrix solvers on the gpu: Conjugate gradients and multigrid. ACM Transactions on Graphics, 22(3):917-924, July 2003.</p>
<p>[2] Y. Chang, K. Uchikawa, and S. Saito. Example-based color stylization based on categorical perception. In Proceedings of the 1st Symposium on Applied Perception in Graphics and Visualization (APGV), pages 91-98. ACM Press, 2004.</p>
<p>[3] R. Fattal, D. Lischinski, and M. Werman. Gradient domain high dynamic range compression. In Proceedings of the 29th Annual Conference on Computer Graphics and Interactive Techniques (SIGGRAPH 02), pages 249-256, New York, NY, USA, 2002. ACM Press.</p>
<p>[4] R. C. Gonzalez and R. E. Woods. Digital Image Processing. Addison Wesley, 1992.</p>
<p>[5] Nolan Goodnight, Cliff Woolley, Gregory Lewin, David Luebke, and Greg Humphreys. A multigrid solver for boundary value problems using programmable graphics hardware. In Graphics Hardware 2003, pages 102-111, July 2003.</p>
<p>[6] nVIDIA SDK 9.1. Video filter. http://download.developer.nvidia.com/developer/SDK.</p>
<p>[7] Y. Ji, H-B. Liu, X-K Wang, and Y-Y. Tang. Color Transfer to Greyscale Images using Texture Spectrum. In Proceedings of the Third International Conference on Machine Learning and Cybernetics, Shanghai, 2004.</p>
<p>[8] J. Jia, J. Sun, C-K. Tang, and H-Y. Shum. Bayesian correction of image intensity with spatial consideration. In 8th European Conference on Computer Vision (ECCV), 2004.</p>
<p>[9] Jens Kruger and Rudiger Westermann. Linear algebra operators for gpu implementation of numerical algorithms. ACM Transactions on Graphics, 22(3):908-916, July 2003.</p>
<p>[10] L. Lucchese and S. K. Mitra. A New Method for Color Image Equalization. In IEEE International Conference on Image Processing (ICIP 01), 2001.</p>
<p>[11] J. Morovic and P-L. Sun. Accurate 3d image colour histogram transformation. Pattern Recognition Letters, 24(11):1725-1735, 2003.</p>
<p>[12] P. Perez, M. Gangnet, and A. Blake. Poisson image editing. ACM Trans. Graph., 22(3):313-318, 2003.</p>
<p>[13] E. Pichon, M. Niethammer, and G. Sapiro. Color histogram equalization through mesh deformation. In IEEE International Conference on Image Processing (ICIP 03), 2003.</p>
<p>[14] W. Press, S. Teukolsky, W. Vetterling, and B. Flannery. Numerical Recipes in C: The Art of Scientific Computing. Cambridge University Press, New York, NY, USA, 1992.</p>
<p>[15] E. Reinhard, M. Ashikhmin, B. Gooch, and P. Shirley. Color transfer between images. IEEE Computer Graphics and Applications, 21(5):34-41, 2001.</p>
<p>[16] D.L. Ruderman, T.W. Cronin, and C.C. Chiao. Statistics of Cone Responses to Natural Images: Implications for Visual Coding. Journal of the Optical Society of America A, 15(8):2036-2045, 1998.</p>
<p>[17] J. Weickert, B. ter Haar Romeny, and M. Viergever. Efficient and Reliable Schemes for Nonlinear Diffusion Filtering. IEEE Transactions on Image Processing, 7(3):398-410, March 1998.</p>
<p>[18] T. Welsh, M. Ashikhmin, and K. Mueller. Transferring Color to Greyscale Images. In Proceedings of ACM SIGGRAPH, pages 277-280, San Antonio, 2002.</p>

Claims (2)

    <p>Claims</p>
    <p>1. A method for estimating the linear mapping required to transform one multidimensional distribution into another using Principal Component Analysis for the purpose of matching (or transferring) colour between two images.</p>
    <p>2. A method for estimating the mapping required to transform one colour distribution into another using one-dimensional histograms along multiple directions in the colour space.</p>
    <p>3. A method as in claim 2 wherein the operations are applied iteratively.</p>
    <p>4. A method as in claim 3 which employs a transformation of the data space between each iteration.</p>
    <p>5. A method as in claims 2 and 3 and 4 in which the transformation is a rotation of the colour space.</p>
    <p>6. A method as in claim 3 in which the iterations require that 1D histograms be matched.</p>
    <p>7. A method as in claim 3 in which the iterations require that 1D histograms be matched using a cumulative histogram.</p>
    <p>8. A method as in any of the preceding claims applied for matching the colour between multiple images.</p>
    <p>9. A method as in claims 1, 8 that also involves a post process for image smoothing after colour mapping.</p>
    <p>10. A method as in claim 9 in which the post process consists of constraining the gradients in the output image to be close to those in the input image.</p>
    <p>11. A method as in claim 10 in which the post process consists of minimising the weighted sum of the squared difference of image gradients between a desired noise reduced image and the target image, as well as the weighted sum of squared differences between the desired noise reduced image and the transformed image.</p>
    <p>12. A method which employs a process as in claim 1 as a preprocess before using a method as in claim 2.</p>
    <p>13. A process for implementing colour transfer between images using a method as in claims 1, 2, 12 where the main elements are implemented using a Graphics Processing Unit.</p>
    <p>14. A process for implementing colour transfer between images using a method as in claims 1, 2, 12 where any of the main elements below are implemented using a Graphics Processing Unit.</p>
    <p>(a) The image data transformations (i.e. colourspace conversion).</p>
    <p>(b) The mapping of input pixels into output pixels using a Look-up-table process embedded in the GPU.</p>
    <p>(c) The noise smoothing process.</p>
    <p>15. An image data processing unit performing in operation a method or process according to any of the preceding claims.</p>
    <p>16. A unit as in claim 15 including a Graphics Processing Unit (GPU).</p>
GB0517990A 2005-09-05 2005-09-05 Method for matching colour in images Active GB2432472B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0517990A GB2432472B (en) 2005-09-05 2005-09-05 Method for matching colour in images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0517990A GB2432472B (en) 2005-09-05 2005-09-05 Method for matching colour in images

Publications (3)

Publication Number Publication Date
GB0517990D0 GB0517990D0 (en) 2005-10-12
GB2432472A (en) 2007-05-23
GB2432472B GB2432472B (en) 2011-01-12

Family

ID=35220832

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0517990A Active GB2432472B (en) 2005-09-05 2005-09-05 Method for matching colour in images

Country Status (1)

Country Link
GB (1) GB2432472B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2009355842B2 (en) * 2009-11-27 2016-04-21 Cadens Medical Imaging Inc. Method and system for filtering image data and use thereof in virtual endoscopy
EP3148173A1 (en) * 2015-09-28 2017-03-29 Thomson Licensing Method for color grading of a digital visual content, and corresponding electronic device, computer readable program product and computer readable storage medium
US9661299B2 (en) 2011-06-30 2017-05-23 Thomson Licensing Outlier detection for colour mapping

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102360506B (en) * 2011-09-30 2013-07-31 北京航空航天大学 Local linear preserver-based scene color style uniformizing method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5543940A (en) * 1994-02-02 1996-08-06 Electronics For Imaging Method and apparatus for converting color scanner signals into colorimetric values
US5771311A (en) * 1995-05-17 1998-06-23 Toyo Ink Manufacturing Co., Ltd. Method and apparatus for correction of color shifts due to illuminant changes
US20020154325A1 (en) * 1996-02-26 2002-10-24 Holub Richard A. System for distributing and controlling color reproduction at multiple sites
EP1538826A2 (en) * 2003-12-05 2005-06-08 Samsung Electronics Co., Ltd. Color transformation method and apparatus

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3826623B2 (en) * 1999-06-25 2006-09-27 コニカミノルタビジネステクノロジーズ株式会社 Image processing device
JP4345649B2 (en) * 2004-11-24 2009-10-14 ノーリツ鋼機株式会社 Photographic image processing method and apparatus

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5543940A (en) * 1994-02-02 1996-08-06 Electronics For Imaging Method and apparatus for converting color scanner signals into colorimetric values
US5771311A (en) * 1995-05-17 1998-06-23 Toyo Ink Manufacturing Co., Ltd. Method and apparatus for correction of color shifts due to illuminant changes
US20020154325A1 (en) * 1996-02-26 2002-10-24 Holub Richard A. System for distributing and controlling color reproduction at multiple sites
EP1538826A2 (en) * 2003-12-05 2005-06-08 Samsung Electronics Co., Ltd. Color transformation method and apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Vrhel, M, Trussell, H, "Colour Correction using Principal Components", Colour Research and Application, Volume 17, no 5, pp 328-338, October 1992 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2009355842B2 (en) * 2009-11-27 2016-04-21 Cadens Medical Imaging Inc. Method and system for filtering image data and use thereof in virtual endoscopy
US9661299B2 (en) 2011-06-30 2017-05-23 Thomson Licensing Outlier detection for colour mapping
EP3148173A1 (en) * 2015-09-28 2017-03-29 Thomson Licensing Method for color grading of a digital visual content, and corresponding electronic device, computer readable program product and computer readable storage medium
EP3148172A1 (en) * 2015-09-28 2017-03-29 Thomson Licensing Method for color grading of a digital visual content, and corresponding electronic device, computer readable program product and computer readable storage medium

Also Published As

Publication number Publication date
GB2432472B (en) 2011-01-12
GB0517990D0 (en) 2005-10-12

Similar Documents

Publication Publication Date Title
US7796812B2 (en) Method for matching color in images
Fattal et al. Gradient domain high dynamic range compression
Tao et al. Error-tolerant image compositing
Liu et al. Detail-preserving underexposed image enhancement via optimal weighted multi-exposure fusion
Paris et al. A fast approximation of the bilateral filter using a signal processing approach
US7343040B2 (en) Method and system for modifying a digital image taking into account it&#39;s noise
Pouli et al. Progressive histogram reshaping for creative color transfer and tone reproduction
Paris et al. Local Laplacian filters: edge-aware image processing with a Laplacian pyramid
Pitie et al. N-dimensional probability density function transfer and its application to color transfer
US7356198B2 (en) Method and system for calculating a transformed image from a digital image
Bonneel et al. Example-based video color grading.
Neumann et al. Color style transfer techniques using hue, lightness and saturation histogram matching
US10922860B2 (en) Line drawing generation
US6990252B2 (en) System for manipulating noise in digital images
Di Martino et al. Poisson image editing
US7577313B1 (en) Generating synthesized texture in differential space
Singh et al. Anisotropic diffusion for details enhancement in multiexposure image fusion
Liu et al. Survey of natural image enhancement techniques: Classification, evaluation, challenges, and perspectives
Singh et al. Weighted least squares based detail enhanced exposure fusion
GB2432472A (en) Automated colour matching between images
Aubry et al. Fast and robust pyramid-based image processing
Drago et al. Design of a tone mapping operator for high-dynamic range images based upon psychophysical evaluation and preference mapping
Pitié et al. Towards automated colour grading
CN115358943A (en) Low-light image enhancement method, system, terminal and storage medium
Shen et al. High dynamic range image tone mapping and retexturing using fast trilateral filtering

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20120614 AND 20120620