WO2009091152A2 - Color recovery method and system - Google Patents

Color recovery method and system

Info

Publication number
WO2009091152A2
Authority
WO
WIPO (PCT)
Prior art keywords
color
channel
images
sub
criterion
Prior art date
Application number
PCT/KR2008/007883
Other languages
French (fr)
Other versions
WO2009091152A3 (en)
WO2009091152A9 (en)
Inventor
Chan Sup Chung
Original Assignee
Industry-Academic Cooperation Foundation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industry-Academic Cooperation Foundation filed Critical Industry-Academic Cooperation Foundation
Publication of WO2009091152A2 publication Critical patent/WO2009091152A2/en
Publication of WO2009091152A3 publication Critical patent/WO2009091152A3/en
Publication of WO2009091152A9 publication Critical patent/WO2009091152A9/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 - Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46 - Colour picture communication systems
    • H04N1/56 - Processing of colour picture signals
    • H04N1/60 - Colour correction or control
    • H04N1/6027 - Correction or control of colour gradation or colour contrast
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 - Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46 - Colour picture communication systems
    • H04N1/56 - Processing of colour picture signals
    • H04N1/60 - Colour correction or control
    • H04N1/6083 - Colour correction or control controlled by factors external to the apparatus
    • H04N1/6086 - Colour correction or control controlled by factors external to the apparatus by scene illuminant, i.e. conditions at the time of picture capture, e.g. flash, optical filter used, evening, cloud, daylight, artificial lighting, white point measurement, colour temperature
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 - Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46 - Colour picture communication systems
    • H04N1/56 - Processing of colour picture signals
    • H04N1/60 - Colour correction or control
    • H04N1/62 - Retouching, i.e. modification of isolated colours only or in isolated picture areas only

Abstract

The present invention relates to a color recovery method and system, and more specifically, to a color recovery method and system that recovers colors of digital images distorted due to lighting by using a principle of implementing color constancy based on the human vision system. The present invention comprises: a) determining a criterion region in consideration of variance values for each color channel of images to be recovered; b) computing relative surface reflectance for the determined criterion regions and computing adjustment coefficients for excluding distortion effects of a light source using the computed relative surface reflectance; and c) recovering the colors for the images by using the adjustment coefficients. With the present invention, the colors of digital images distorted by the spectral bias of a light source can be easily recovered in a very fast single-step operation free from the unique color problem, one of the worst problems in the field of color correction technology, so far regarded as impossible to solve, provided that there is a sub-region in the digital image containing a sufficient number of surface elements of different reflectance.

Description

Description
COLOR RECOVERY METHOD AND SYSTEM
Technical Field
[1] The present invention relates to a color recovery method that recovers colors of digital images distorted due to lighting by applying the principle by which the human vision system implements color constancy. Background Art
[2] When an image distortion occurs due to lighting, it is theoretically impossible for any system to recover the exact distortion-free color values of the image (i.e., the color values obtainable under a balanced standard light condition, or the color values defined by spectral surface reflectance). This is because the image contains only the light information reflected from object surfaces and not the light source per se. However, through the well-known color constancy function, the human vision system can closely approximate the intrinsic color values of an image by eliminating or minimizing the distortion effect of the lighting.
[3] Although applying such human principles of color constancy to image-processing technology appears desirable, there has not been a satisfactory attempt yet. Most existing digital image processing devices and software adopt very complicated color correction techniques far removed from human principles: techniques utilizing diverse cues derived mainly from an optical or engineering point of view, sometimes even with trial-and-error-based manual operations. As a result, all of these approaches suffer from a limited application range; within certain ranges of their operational parameters the existing systems seem to work well, but their performance drops abruptly once they fall outside those ranges. In some cases manual operations are used to ease these problems, but these often turn out to be effort-costly and time-consuming.
[4] In order to solve these problems of the existing technology, KR Patent Registration No. 0442320 disclosed a color recovery method for digital images using intra-channel-ratio scales (adaptation-like, or Land's Retinex-like, scaling functions whose principles are borrowed from human color constancy) together with the equi-max algorithm to address the unique color problem. However, that invention has a problem in that the equi-max algorithm does not work on a steady criterion base and thus needs readjustment of its correction parameter depending upon the seriousness of the unique color problem. A unique color problem occurs when a distorted digital image contains a dominantly extended large region of relatively uniform color, such as a blue-dominant seaside scene. The same problem also occurs when a zoomed-in photograph is taken of an object having a relatively uniform unique color. Disclosure of Invention Technical Problem
[5] The present invention is proposed to solve the above problems. It is an object of the present invention to provide a color recovery method and system that recovers the colors of digital images distorted by a lighting effect into the colors obtainable under standard lighting, and that overcomes the unique color problem occurring in such digital images. Technical Solution
[6] In order to achieve this goal, the color recovery method adopted by the present invention comprises: a) determining a criterion region in consideration of variance values for each color channel of images to be recovered; b) computing relative surface reflectance for the determined criterion regions and computing adjustment coefficients for excluding distortion effects of a light source using the computed relative surface reflectance; and c) recovering the colors for the images by using the adjustment coefficients.
[7] To achieve such a goal, the system proposed by the present invention comprises the following computational units: a criterion region determination unit that determines regions where a sum of variance values for each color channel of images to be recovered is at the maximum as criterion regions; an adjustment coefficient computation unit that computes adjustment coefficients for excluding distortion effects of light sources within the criterion regions; and an image recovery unit that recovers the color for the images by using the adjustment coefficients. Advantageous Effects
[8] With the present invention, the colors of digital images distorted by the spectral bias of a light source can be easily recovered in a very fast single-step operation free from the unique color problem, one of the worst problems in the field of color correction technology, so far regarded as impossible to solve, provided that there is a sub-region in the digital image containing a sufficient number of surface elements of different reflectance. Brief Description of Drawings
[9] FIG. 1 is a block diagram of a color recovery system according to an exemplary embodiment of the present invention;
[10] FIG. 2 is a flow chart of a color recovery method according to an exemplary embodiment of the present invention;
[11] FIG. 3 is a reference diagram of a process of recovering unique colors for each channel of digital images according to an exemplary embodiment of the present invention; [12] FIG. 4 is a reference diagram of images biased to a red color system according to an exemplary embodiment of the present invention;
[13] FIG. 5 is a reference diagram of images biased to a green color system according to an exemplary embodiment of the present invention; and
[14] FIG. 6 is a reference diagram of images biased to a blue color according to an exemplary embodiment of the present invention. Best Mode for Carrying out the Invention
[15] Preferred embodiments of the invention are described hereafter in detail with reference to the accompanying drawings. For simplicity of description, the same components are denoted by the same reference numerals throughout the figures. In describing the invention, detailed descriptions of related known configurations or functions are omitted when they would make the aspects of the invention unclear. Further, preferred embodiments of the invention are described hereafter, but it will be apparent to those skilled in the art that various modifications and changes may be made thereto without departing from the scope and spirit of the invention. Mode for the Invention
[16] FIG. 1 is a block diagram of a color recovery system according to an exemplary embodiment of the present invention.
[17] A color recovery system 10 recovers colors of digital images distorted by light by using a principle of implementing color constancy based on a human vision system and performs the operations under the following three basic assumptions.
[18] First, since the problem of color constancy cannot be solved by an inverse function, color constancy is by nature a problem that requires not an exact solution but an approximation. The human vision system adopts a strategy of light adaptation for such approximation, and a similar strategy is applied in the present invention.
[19] Second, the effect of a light source is uniformly applied to all the portions of a single image. This means that, if we find a color correction approximation for one region, we can apply that to the entire image region.
[20] Third, natural scenes are most likely to include object surfaces having different light-reflecting spectra. If the entire scene or image consists of a uniform color or a very limited number of differently colored surfaces, no system can distinguish whether the apparent colors come from the intrinsic surface properties of objects (i.e., surface reflectance) or from distortion due to the light source. The present invention assumes that a photographed image always contains enough object surfaces of different colors to overcome such a problem.
[21] As shown in FIG. 1, the color recovery system 10 includes a criterion region determination unit 20, a ratio scale computation unit 30, a surface reflectance computation unit 40, a color recovery unit 50, an adjustment coefficient computing unit 60, an image recovery unit 70, a first correction unit 73, and a second correction unit 75.
[22] The criterion region determination unit 20 searches for and selects a criterion region for color correction by finding the input image region over which the sum of the three intra-channel dispersion values is maximum (i.e., the sum of the light intensity variance within each color channel is maximum). This part of the operation is implemented on the basis of the third aforementioned assumption.
[23] The ratio scale computation unit 30 computes an intra-channel ratio scale (IR), which is the ratio of the color value for each channel of each pixel forming the criterion region to the color value for each channel of the entire criterion region. This part of the computation is made by implementing the first and second aforementioned assumptions.
[24] The surface reflectance computation unit 40 receives the intra-channel ratio scale to compute relative surface reflectance, which is an approximate value of the surface reflectance for each pixel of the criterion region.
[25] The color recovery unit 50 computes an estimated unbiased surface color for each pixel of the criterion region by multiplying the relative surface reflectance by the luminance of each pixel of the criterion region.
[26] The adjustment coefficient computing unit 60 computes the adjustment coefficient in order to exclude an effect of a light source by using inherent surface colors for each color channel of the recovered criterion region and the color values for each color channel of the criterion region.
[27] The image recovery unit 70 recovers the unbiased colors by using the adjustment coefficients and the color values for each pixel of an input.
[28] The first correction unit 73 performs an overflow control to correct the color values of the pixels of the recovered images when the color value of a pixel of the recovered images exceeds the threshold of the range set to 0 to 255.
[29] The second correction unit 75 re-corrects the corrected color values for each pixel in consideration of an output gain factor.
[30] FIG. 2 is a flow chart of a color recovery method according to an exemplary embodiment of the present invention. As shown in FIG. 2, the color recovery method includes the following steps that are time-serially performed in the color recovery system 10.
[31] At step S10, the criterion region determination unit 20 determines the criterion region of the input image whose colors are to be recovered. The criterion region is the sub-region of the input image where the sum of the dispersion values for each color channel is at the maximum. This variance-maximum region, by its statistical characteristics, is more likely to contain differently colored surfaces than any other sub-region. As mentioned above, this makes it possible to handle the unique color problem effectively by minimizing the effect of a dominant color extended over a relatively large region in the process of computing the color adjustment coefficients.
[32] The computation for the criterion region is performed based on the following Equation 1.
[33] [Equation 1]
[34] σ² = σ²R + σ²G + σ²B
[35] where σ² denotes the sum of the dispersion values of the three channels, σ²R denotes the dispersion value of the R channel, σ²G denotes the dispersion value of the G channel, and σ²B denotes the dispersion value of the B channel.
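As a minimal illustration of Equation 1 (not part of the patent text), the statistic can be computed directly from the RGB values of a candidate sub-region; the function name and the (H, W, 3) array layout below are assumptions made for the example.

```python
import numpy as np

def channel_variance_sum(region: np.ndarray) -> float:
    """Equation 1: sigma^2 = sigma_R^2 + sigma_G^2 + sigma_B^2 for one sub-region.

    `region` is an (H, W, 3) array holding the R, G and B values of a
    candidate sub-region of the image to be recovered.
    """
    region = region.astype(np.float64)
    # Variance of the pixel intensities within each colour channel, summed over R, G, B.
    return float(sum(region[..., k].var() for k in range(3)))
```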
[36] The method of determining the criterion region is as follows.
[37] First, the image to be recovered is segmented into a plurality of sub-images. Then, the sum of the dispersion values of each color channel is computed for each of the segmented sub-images, and the sub-image where the sum of the dispersion values is at the maximum is determined as the first sub-image.
[38] Next, the first sub-image is re-segmented into a plurality of sub-images. Then, the sum of the dispersion values of each color channel is computed for each of the re-segmented sub-images, and the sub-image where the sum of the dispersion values is at the maximum is determined as the second sub-image.
[39] Finally, the first sub-image and the second sub-image are compared; when the sum of the dispersion values of the first sub-image is larger than that of the second sub-image, the first sub-image is determined as the criterion region.
[40] If the sum of the dispersion values of the second sub-image is larger than that of the first sub-image, the second sub-image is in turn re-segmented into a plurality of sub-images, and the re-segmentation is repeated until the maximum sum of the dispersion values of each color channel among the re-segmented sub-images is less than or equal to the sum of the dispersion values of each color channel of the sub-image before the re-segmentation; the sub-image before that final re-segmentation is then determined as the criterion region.
[41] Also, another method of determining the criterion region extracts sub-images by applying windows of different sizes to the image to be recovered, computes the sum of the dispersion values for each color channel for each of the extracted sub-images, and then determines, as the criterion region, the sub-image where the sum of the dispersion values for each color channel is at the maximum.
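A sketch of the window-based variant just described is given below; the window sizes, stride, and function name are illustrative assumptions, not values prescribed by the patent.

```python
import numpy as np

def find_criterion_region(image: np.ndarray,
                          window_sizes=(32, 64, 128),
                          stride: int = 16):
    """Return (row, col, size) of the window whose Equation-1 score is largest."""
    def variance_sum(region: np.ndarray) -> float:
        # Equation 1: sum of the per-channel variances of the window.
        return float(sum(region[..., k].var() for k in range(3)))

    img = image.astype(np.float64)
    h, w = img.shape[:2]
    best, best_score = None, -1.0
    for size in window_sizes:                          # try several window sizes
        for r in range(0, max(h - size, 0) + 1, stride):
            for c in range(0, max(w - size, 0) + 1, stride):
                score = variance_sum(img[r:r + size, c:c + size])
                if score > best_score:                 # keep the variance-maximum window
                    best, best_score = (r, c, size), score
    return best
```

The window returned by this search serves as the criterion region for the subsequent steps.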
[42] At step S20, the ratio scale computation unit 30 computes the intra-channel ratio scale (IR); the three-channel color values of each image pixel are transformed into three different sets of ratio scales, each of which indicates the relative strength of a pixel in one color channel. Thus, the ratio values are equivalent to the normalized three-color-channel values of each pixel.
[43] The computation of the intra-channel ratio scale is performed based on the following Equation 2.
[44] [Equation 2]
[45] IRK = VK / ΣVK
[46] where VK denotes the color value of the K color channel of a single pixel of the criterion region, ΣVK denotes the color value of the K color channel of the entire criterion region (the sum of VK over the criterion region), and IRK denotes the intra-channel ratio scale of the K color channel for each pixel of the criterion region.
[47] Equation 2 computes the intra-channel ratio scale, which is the core computational factor in the color recovery method according to the present invention, and converts the color values of each pixel forming the criterion region into ratio values.
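A sketch of Equation 2 under the same assumed array layout: each pixel's channel value is divided by the total of that channel over the criterion region.

```python
import numpy as np

def intra_channel_ratio_scale(criterion_region: np.ndarray) -> np.ndarray:
    """Equation 2: IR_K = V_K / sum(V_K), computed per pixel and per colour channel.

    `criterion_region` is an (H, W, 3) array of the criterion-region pixels;
    the result has the same shape and gives each pixel's relative strength
    within its own colour channel.
    """
    region = criterion_region.astype(np.float64)
    channel_totals = region.reshape(-1, 3).sum(axis=0)   # sum of V_K, one total per channel
    return region / (channel_totals + 1e-12)             # epsilon guards an all-zero channel
```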
[48] At step S30, the surface reflectance computation unit 40 computes the relative surface reflectance, which substitutes for the surface reflectance that determines the unique color of the object surface.
[49] The computation of the relative surface reflectance is performed based on the following equation 3.
[50] [Equation 3]
[51] PK = SK / ΣSK ≈ VK / ΣVK = IRK
[52] where PK denotes the relative surface reflectance in the K color channel of each pixel of the criterion region, SK denotes the surface reflectance that determines the unique color of the object surface in the K color channel of each pixel of the criterion region, SK/ΣSK denotes the intra-channel ratio scale in the K color channel, and K denotes the R, G, and B color channels.
[53] The reason for computing the relative surface reflectance is that, since it is impossible to compute the surface reflectance (i.e., SK), which is an intrinsic property of a surface, Equation 3 estimates it with SK/ΣSK, which can be obtained from Equation 2, and substitutes this approximated value for the surface reflectance. [54] At step S40, the color recovery unit 50 recovers the unique surface color for each pixel of the criterion region by multiplying the luminance of each pixel of the criterion region by the relative surface reflectance of each color channel of that pixel.
[55] The luminance of each pixel in the criterion region is simply the sum of the three channel color values, as shown in the following Equation 4.
[56] [Equation 4]
[57] L = VR + VG + VB
[58] Where VR, VG, and VB denote the color values of the RGB channels.
[59] The recovered surface color of each pixel in the criterion region is simply the product of the luminance obtained by Equation 4 and the relative surface reflectance obtained by Equation 3, as given by the following Equation 5.
[60] [Equation 5]
[61] CK = L × PK
[62] where, for each pixel in the criterion region, CK denotes the recovered surface color of the K color channel, L denotes the luminance, and PK denotes the relative surface reflectance.
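The per-pixel recovery inside the criterion region (Equations 3 to 5) can be sketched as follows; the relative surface reflectance is taken directly from the intra-channel ratio scale, as the description states, and the function name is an assumption.

```python
import numpy as np

def recover_criterion_surface_colors(criterion_region: np.ndarray) -> np.ndarray:
    """Equations 3-5 applied to every pixel of the criterion region.

    P_K : relative surface reflectance, approximated by IR_K = V_K / sum(V_K)  (Eqs. 2-3)
    L   : luminance of a pixel, L = V_R + V_G + V_B                            (Eq. 4)
    C_K : recovered (unique) surface colour, C_K = L * P_K                     (Eq. 5)
    """
    v = criterion_region.astype(np.float64)
    p = v / (v.reshape(-1, 3).sum(axis=0) + 1e-12)   # P_K, one ratio per pixel and channel
    luminance = v.sum(axis=2, keepdims=True)         # L, summed over the three channels
    return luminance * p                             # C_K
```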
[63] At step S50, the adjustment coefficient computation unit 60 computes the adjustment coefficient for each color channel to eliminate the distortion effect of the light source over the entire input image by taking the ratio of the sum of CK to the sum of VK for the pixels in the criterion region. The coefficient is computed by the following Equation 6.
[64] [Equation 6]
[65] mK = ΣCK / ΣVK
[66] where mK denotes the adjustment coefficient of the K color channel, ΣCK denotes the sum of the recovered K-channel color values, and ΣVK denotes the sum of the original K-channel color values.
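Under the same assumptions, Equation 6 reduces to a per-channel ratio of sums over the criterion region:

```python
import numpy as np

def adjustment_coefficients(criterion_region: np.ndarray,
                            recovered_colors: np.ndarray) -> np.ndarray:
    """Equation 6: m_K = sum(C_K) / sum(V_K), one coefficient per colour channel.

    `criterion_region` holds the original V_K values and `recovered_colors`
    the C_K values from Equation 5; both are (H, W, 3) arrays.
    """
    sum_c = recovered_colors.reshape(-1, 3).sum(axis=0)                      # sum of C_K
    sum_v = criterion_region.astype(np.float64).reshape(-1, 3).sum(axis=0)   # sum of V_K
    return sum_c / (sum_v + 1e-12)
```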
[67] At step S60, the image recovery unit 70 recovers the colors of the input image by multiplying the original color values by the coefficient mK obtained from Equation 6, as in the following Equation 7.
[68] [Equation 7]
[69] RCK = VK × mK
[70] where, for every pixel in the entire input image, RCK denotes the recovered K-channel color value, VK denotes the original K-channel color value, and mK denotes the color adjustment coefficient of the K channel obtained by Equation 6.
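Equation 7 then scales every pixel of the full input image by the per-channel coefficients; a one-line sketch (names assumed):

```python
import numpy as np

def recover_image(image: np.ndarray, m: np.ndarray) -> np.ndarray:
    """Equation 7: RC_K = V_K * m_K for every pixel of the input image."""
    # `m` holds the three adjustment coefficients and broadcasts along the channel axis.
    return image.astype(np.float64) * m
```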
[71] At step S70, the first correction unit 73 determines whether or not the recovered color value exceeds the threshold of the range set to 0 to 255, and the process ends if the recovered color value does not exceed the threshold value. [72] If the recovered color value exceeds the threshold value, at step S73 the first correction unit 73 performs an overflow control to correct the recovered color value to the threshold value or less.
[73] The overflow control is performed by the following equation 8.
[74] [Equation 8]
[75] RC'K = RCK × (maxoriginal / maxrecovered)
[76] where RC'K denotes the corrected color value in the K color channel of each pixel forming the recovered image, maxoriginal denotes the maximum color value before recovering the input image, and maxrecovered denotes the maximum color value after recovering the input image.
[77] Since the output values displayable on a monitor for each color channel span 256 steps from 0 to 255, if a recovered color value exceeds the threshold value of 255, the output of the color value recovered in the color recovery system 10 is adjusted using the above Equation 8, making it possible to correct the recovered color value to the threshold value or less.
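A sketch of the overflow control of steps S70/S73 (Equation 8), assuming the recovered image is rescaled only when it exceeds the 0-to-255 display range:

```python
import numpy as np

def overflow_control(original: np.ndarray, recovered: np.ndarray) -> np.ndarray:
    """Equation 8: RC'_K = RC_K * (max_original / max_recovered)."""
    recovered = recovered.astype(np.float64)
    if recovered.max() <= 255.0:
        return recovered                  # step S70: no overflow, nothing to correct
    # Step S73: rescale so the recovered maximum matches the original maximum (<= 255).
    return recovered * (float(original.max()) / recovered.max())
```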
[78] At step S75, the second correction unit 75 re-corrects the color value corrected at step S73 by multiplying it by an output gain factor.
[79] The re-correction for the corrected color value is performed by the following equation 9.
[80] [Equation 9]
[81] RC''K = GK × RC'K   (if GK × RC'K ≤ MaxRC,K)
     RC''K = MaxRC,K     (if GK × RC'K > MaxRC,K)
     where GK = ΣRCK / ΣRC'K
[82] where RC''K denotes the re-corrected color value in the K color channel of each pixel, GK denotes the output gain factor in the K color channel, ΣRCK denotes the sum of the color values for each pixel in the K color channel of the recovered image, ΣRC'K denotes the sum of the color values for each pixel in the K color channel of the corrected image, and MaxRC,K denotes the maximum threshold value in the K color channel used as the thresholding value.
[83] As in Equation 9, if the product of the corrected color value and the output gain factor does not exceed the thresholding value, that product is taken as the re-corrected color value; if the product exceeds the thresholding value, the thresholding value is taken as the re-corrected color value.
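Equation 9 can be sketched as below; the per-channel gain G_K = ΣRC_K / ΣRC'_K and the ceiling follow the definitions above, while the function and argument names are assumptions.

```python
import numpy as np

def gain_correction(recovered: np.ndarray, corrected: np.ndarray,
                    max_rc: float = 255.0) -> np.ndarray:
    """Equation 9: RC''_K = min(G_K * RC'_K, Max_RC), with G_K = sum(RC_K) / sum(RC'_K).

    `recovered` holds the RC_K values from Equation 7 and `corrected` the
    RC'_K values produced by the overflow control of Equation 8.
    """
    g = (recovered.reshape(-1, 3).sum(axis=0) /
         (corrected.reshape(-1, 3).sum(axis=0) + 1e-12))   # output gain factor per channel
    return np.minimum(g * corrected, max_rc)               # clip at the channel maximum
```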
[84] FIG. 3 is a reference diagram showing in a concise way how the color recovery process works according to the system proposed by the present invention. As shown in FIG. 3, the color recovery process for the input image can be regarded as a process that redistributes, or splits, the luminance (i.e., the total light energy) of each pixel into three distortion-free color components by multiplying it by the adjustment coefficients obtained from an approximation process mainly based on intra-channel-ratio scales, a process very similar to that of human color constancy.
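For readers who want the whole flow of FIG. 2 and FIG. 3 in one place, the steps above condense into the following self-contained sketch; it assumes the criterion region has already been selected by the variance-maximum search of step S10, and all names are illustrative rather than the patent's own.

```python
import numpy as np

def recover_colors(image: np.ndarray, criterion_region: np.ndarray) -> np.ndarray:
    """Compact sketch of steps S20-S75 for an (H, W, 3) image and its criterion region."""
    img = image.astype(np.float64)
    v = criterion_region.astype(np.float64)
    sum_v = v.reshape(-1, 3).sum(axis=0) + 1e-12
    p = v / sum_v                                            # Eqs. 2-3: P_K from the ratio scale
    c = v.sum(axis=2, keepdims=True) * p                     # Eqs. 4-5: C_K = L * P_K
    m = c.reshape(-1, 3).sum(axis=0) / sum_v                 # Eq. 6: adjustment coefficients
    rc = img * m                                             # Eq. 7: redistribute luminance
    rc_corr = rc if rc.max() <= 255.0 else rc * (img.max() / rc.max())   # Eq. 8: overflow control
    g = rc.reshape(-1, 3).sum(axis=0) / (rc_corr.reshape(-1, 3).sum(axis=0) + 1e-12)
    return np.minimum(g * rc_corr, 255.0)                    # Eq. 9: output gain and ceiling
```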
[85] FIG. 4 is a reference diagram showing the recovery of an image extremely distorted by red lighting, as an exemplary embodiment of the present invention. The image (a) in FIG. 4 was photographed under standard lighting around noon and was changed into the severely distorted image (b) by simulated red lighting for a performance test of the present color recovery system 10. The image (c) shows the result of recovering the image (b) using the color recovery system 10, and the image (d) shows, as a benchmark, the result of recovering the same image (b) with an existing color recovery program (Adobe Photoshop) widely regarded as the most popular and capable. The comparison of the images (c) and (d) shows that the color correction system proposed by the present invention performs far better under this extremely distorted lighting condition.
[86] FIG. 5 and FIG. 6 are reference diagrams showing the results of recovering images distorted by a green light and a blue light, respectively. As already shown with FIG. 4, the color recovery system 10 proposed by the present invention also showed satisfactory performance for the images (b), as shown in the images (c) of FIGS. 5 and 6. The images (d) of FIGS. 5 and 6 show, as a benchmark, the results of recovering the same images (b) with the existing color recovery program (Adobe Photoshop). Again, for the distortions caused by the green and blue lights, the color recovery system 10 proposed by the present invention performed much better than the existing system.
[87] Although not shown, the color recovery system proposed by the present invention may also be provided as a computer-readable recording medium on which information coded in an instruction language executable by a computer is recorded.
[88] Examples of the computer-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical storage device, etc. Programmers in the technical field to which the present invention belongs can easily derive the programs, code, and code segments that realize each function so as to implement the recording medium.
[89] Since the present invention can handle the unique color problem, a problem caused by a unique color uniformly extended over a relatively large area in the input image like a bluish sea-side photo or a zoomed-in image of an object with a uniform color, the color recovery system proposed by the present invention can be applied in an automatic and single-step way to most digital images processed by a digital camera, a photographing apparatus, and an image processing program.
[90] As described above, although the present invention has been described with reference to the limited embodiments and accompanying drawings, the present invention is not limited to the embodiments and various changes and modification may be made by those skilled in the art. Therefore, the scope of the present invention should not be limited to the above-described embodiments and should be defined by the appended claims and their equivalents. Industrial Applicability
[91] Since the present invention can recover an image whose colors are distorted by various light sources so as to restore its unique colors, and can overcome the unique color problem in which the unique colors cannot be recovered when the bias toward a specific color exceeds a threshold, it can be applied to a digital camera, a photographing apparatus, and an image processing program.

Claims

Claims
[1] A color recovery method comprising: a) determining a criterion region in consideration of variance values for each color channel of images to be recovered; b) computing relative surface reflectance for the determined criterion regions and computing adjustment coefficients for excluding distortion effects of a light source using the computed relative surface reflectance; and c) recovering the colors for the images by using the adjustment coefficients.
[2] The color recovery method according to claim 1, wherein the step b) comprises: b1) computing an intra-channel ratio scale (IR) for each pixel of the criterion region; b2) computing relative surface reflectance (P) of the criterion region by using the intra-channel ratio scale (IR); b3) recovering a unique surface color (C) by using luminance (L) and the relative surface reflectance of the criterion region; and
b4) computing an adjustment coefficient to exclude distortion effects of a light source by using the unique surface color (C) of the criterion region and the color values for each channel in the criterion region.
[3] The color recovery method according to claim 1, wherein at step a), the criterion region is a region where a sum of variance values for each color channel of the image to be recovered is at the maximum.
[4] The color recovery method according to claim 1, wherein the step a) comprises: a1) segmenting the image to be recovered into a plurality of sub-images; a2) computing a sum of the variance values in each color channel for each segmented sub-image and determining, as a first sub-image, the sub-image where the sum of the variance values is at the maximum; a3) re-segmenting the first sub-image into a plurality of sub-images; a4) computing a sum of the variance values in each color channel for each re-segmented sub-image and determining, as a second sub-image, the sub-image where the sum of the variance values is at the maximum; a5) comparing the sum of the variance values of the first sub-image and the sum of the variance values of the second sub-image; and a6) determining the first sub-image as the criterion region according to the comparison result, or repeatedly further performing the steps corresponding to the steps a3) to a5) on the second sub-image.
[5] The color recovery method according to claim 1, wherein at step a6), by repeatedly performing the steps corresponding to steps a3) to a5) until the maximum value of the sum of the variance values for each color channel among the re-segmented sub-images is smaller than or equal to the sum of the variance values for each color channel of the sub-image before the re-segmentation,
the sub-image before the re-segmentation is determined as the criterion region.
[6] The color recovery method according to claim 1, wherein the step a) comprises: a11) extracting sub-images by applying windows of different sizes to the image to be recovered; a22) computing the sum of the variance values for each color channel for each of the extracted sub-images; and a33) determining, as the criterion region, the sub-image where the sum of the variance values for each color channel is at the maximum.
[7] The color recovery method according to claim 2, wherein the intra-channel ratio scale (IR) is a ratio of the color values for each color channel of each pixel forming the criterion region to the color values for each color channel of the criterion region.
[8] The color recovery method according to claim 2, wherein at step b2), the relative surface reflectance is represented by the following equation,
[Equation]
PK = SK / ΣSK
where PK denotes the relative surface reflectance in the K color channel of each pixel of the criterion region, SK/ΣSK denotes the intra-channel ratio scale in the K color channel, and K denotes the R, G, and B color channels.
[9] The color recovery method according to claim 2, wherein at step b3), the luminance is obtained according to the following equation [Equation] L = VR + VG + VB where L denotes the luminance of each pixel, VR denotes a color value in the R color channel of each pixel forming the criterion region, VG denotes a color value in the G color channel of each pixel forming the criterion region, and VB denotes a color value in the B color channel of each pixel forming the criterion region.
[10] The color recovery method according to claim 2, wherein at step b4), the adjustment coefficient is represented by the following equation,
mK = ΣCK / ΣVK
where mK denotes the adjustment coefficient of the K color channel of the criterion region, ΣCK denotes the sum of the unique surface colors of the K color channel of the pixels that belong to the criterion region, and ΣVK denotes the sum of the color values VK in the K color channel of the pixels that belong to the criterion region.
[11] The color recovery method according to claim 2, further comprising, subsequent to step c): d1) when the color value for a color channel of a pixel of the recovered images exceeds the threshold value, correcting the color value to the threshold value or less by performing an overflow control; and d2) re-correcting the color values for each color channel of each corrected pixel by using an output gain factor.
[12] The color recovery method according to claim 11, wherein at step d1), the overflow control uses the ratio of the maximum color value before recovering the input image to the maximum color value after recovering the input image.
[13] The color recovery method according to claim 11, wherein at step d2), the re-corrected color value is a product of the output gain factor and the unique surface colors for each corrected color channel.
[14] A computer-readable recording medium on which information coded in an instruction language executable in the computer for performing the color recovery method according to any one of claims 1 to 13 is recorded.
[15] A color recovery system comprising: a criterion region determination unit that determines regions where a sum of variance values for each color channel of images to be recovered is at the maximum as criterion regions; an adjustment coefficient computation unit that computes adjustment coefficients for excluding distortion effects of light sources within the criterion regions; and an image recovery unit that recovers the color for the images by using the adjustment coefficients.
[16] The color recovery system according to claim 15, further comprising: a ratio scale computation unit that computes an intra-channel ratio scale (IR) of the criterion region; a surface reflectance computation unit that computes relative surface reflectance (P) of the criterion region by using the intra-channel ratio scale (IR); and a color recovery unit that recovers a unique surface color (C) for each pixel by using luminance (L) and the relative surface reflectance of the criterion region; wherein the adjustment coefficient computation unit computes the adjustment coefficient to exclude distortion effects of a light source by using the unique surface colors (C) of the criterion region and the color values for each channel in the criterion region.
[17] The color recovery system according to claim 15, further comprising a first correction unit that, when the color values for each color channel of the image recovered through the image recovery unit exceed a predetermined threshold value, corrects the color values for each channel to the predetermined threshold value or less through an overflow control.
[18] The color recovery system according to claim 17, further comprising a second correction unit that re-corrects the color values for each corrected color channel by multiplying the color values for each color channel corrected in the first correction unit by the output gain factor.
PCT/KR2008/007883 2008-01-16 2008-12-31 Color recovery method and system WO2009091152A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020080005017A KR100933282B1 (en) 2008-01-16 2008-01-16 Color restoration method and system
KR10-2008-0005017 2008-01-16

Publications (3)

Publication Number Publication Date
WO2009091152A2 true WO2009091152A2 (en) 2009-07-23
WO2009091152A3 WO2009091152A3 (en) 2009-10-01
WO2009091152A9 WO2009091152A9 (en) 2009-11-26

Family

ID=40885777

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2008/007883 WO2009091152A2 (en) 2008-01-16 2008-12-31 Color recovery method and system

Country Status (4)

Country Link
JP (1) JP2009171585A (en)
KR (1) KR100933282B1 (en)
CN (1) CN101489022B (en)
WO (1) WO2009091152A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2521866A (en) * 2014-01-07 2015-07-08 Nokia Technologies Oy An apparatus and/or method and/or computer program for creating images adapted for transflective displays

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101133572B1 (en) * 2005-06-21 2012-04-05 삼성전자주식회사 Color gamut reproducing apparatus having wide color gamut and color signal processing method the same
CN102867295B (en) * 2012-08-06 2015-10-21 电子科技大学 A kind of color correction method for color image
JP6120515B2 (en) * 2012-09-26 2017-04-26 キヤノン株式会社 Image processing apparatus, image processing method, and program
KR101708689B1 (en) * 2015-08-24 2017-02-21 엘지전자 주식회사 Image Quality Improvement Method of Head Mounted Display including Optical system having Holographic Optical Elements
CN106204481B (en) * 2016-07-10 2019-06-25 上海大学 A kind of illumination estimation method of image color shape constancy
WO2018035691A1 (en) * 2016-08-22 2018-03-01 华为技术有限公司 Image processing method and apparatus
CN110033521B (en) * 2019-04-01 2020-01-14 重庆固成未来教育科技有限公司 Three-dimensional visualization system based on VR and AR technologies
CN112637568B (en) * 2020-12-24 2021-11-23 中标慧安信息技术股份有限公司 Distributed security monitoring method and system based on multi-node edge computing equipment
CN113303905B (en) * 2021-05-26 2022-07-01 中南大学湘雅二医院 Interventional operation simulation method based on video image feedback
CN113792710A (en) * 2021-11-15 2021-12-14 滨州学院 Spectrum reconstruction method and device and electronic equipment
CN115861450A (en) * 2022-10-28 2023-03-28 河南科技学院 Piecewise linear stretching underwater image color correction method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0766986A (en) * 1993-08-25 1995-03-10 Toyo Ink Mfg Co Ltd Device and method for adjusting color balance
JP2006262404A (en) * 2005-03-18 2006-09-28 Sharp Corp Device and method for estimating light source, image processor, and method for image processing
JP2006339897A (en) * 2005-05-31 2006-12-14 Univ Of Tokyo System, method, and program for color conversion

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3008878B2 (en) 1997-02-14 2000-02-14 日本電気株式会社 Color conversion method and apparatus, and machine-readable recording medium recording program
US7144133B2 (en) * 2002-05-17 2006-12-05 Infocus Corporation Transflective color recovery
CN100407288C (en) * 2005-03-22 2008-07-30 广达电脑股份有限公司 Method and apparatus for adjusting input image in accordance with display system characteristics


Also Published As

Publication number Publication date
KR20090079069A (en) 2009-07-21
JP2009171585A (en) 2009-07-30
CN101489022A (en) 2009-07-22
WO2009091152A3 (en) 2009-10-01
CN101489022B (en) 2011-09-14
WO2009091152A9 (en) 2009-11-26
KR100933282B1 (en) 2009-12-22

Similar Documents

Publication Publication Date Title
WO2009091152A2 (en) Color recovery method and system
KR101328741B1 (en) Apparatus and method for image enhancement using color channel
EP2987134B1 (en) Generation of ghost-free high dynamic range images
US20070279500A1 (en) Method for correcting a digital image
JP2002197457A (en) Method for detecting skin color in digital image
US20070253693A1 (en) Exposure compensation method for digital image
CN115223004A (en) Method for generating confrontation network image enhancement based on improved multi-scale fusion
CN111192205A (en) Image defogging method and system and computer readable storage medium
CN110175967B (en) Image defogging processing method, system, computer device and storage medium
JP4851505B2 (en) Image processing apparatus and image processing method
KR101349968B1 (en) Image processing apparatus and method for automatically adjustment of image
CN110852971B (en) Video defogging method based on dark channel prior and Retinex and storage medium
JP5327766B2 (en) Memory color correction in digital images
CN117274085A (en) Low-illumination image enhancement method and device
KR102439145B1 (en) Method and system for Auto image dehazing using single scale image fusion
CN110086997B (en) Face image exposure brightness compensation method and device
KR102111317B1 (en) System and method of transmission rate calculation for suppression of degradation of image quality
CN110148188B (en) Method for estimating low-illumination image illumination distribution based on maximum difference image
CN111192210B (en) Self-adaptive enhanced video defogging method
CN112907474B (en) Underwater image enhancement method based on background light optimization and gamma transformation
JP4398595B2 (en) Image processing apparatus and program
CN114025144B (en) White balance gain adjustment method, electronic device, and computer-readable storage medium
CN117408906B (en) Low-light level image enhancement method and system
CN116612050B (en) Priori defogging method based on dark channel
KR102301924B1 (en) Shadow reconstruction method using multi-scale gamma correction

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08870574

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08870574

Country of ref document: EP

Kind code of ref document: A2