WO2001015409A2 - Color density exposure control - Google Patents


Info

Publication number
WO2001015409A2
Authority
WO
WIPO (PCT)
Application number
PCT/US2000/022620
Other languages
French (fr)
Other versions
WO2001015409A3 (en)
Inventor
William G. Reed
Virginia Lee Aldrich
Original Assignee
Digital Now, Inc.
Application filed by Digital Now, Inc. filed Critical Digital Now, Inc.
Priority to AU66458/00A priority Critical patent/AU6645800A/en
Publication of WO2001015409A2 publication Critical patent/WO2001015409A2/en
Publication of WO2001015409A3 publication Critical patent/WO2001015409A3/en

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B27/00Photographic printing apparatus
    • G03B27/72Controlling or varying light intensity, spectral composition, or exposure time in photographic printing apparatus
    • G03B27/73Controlling exposure by variation of spectral composition, e.g. multicolor printers
    • G03B27/735Controlling exposure by variation of spectral composition, e.g. multicolor printers in dependence upon automatic analysis of the original
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46Colour picture communication systems
    • H04N1/56Processing of colour picture signals
    • H04N1/60Colour correction or control

Definitions

  • FIG. 1 illustrates a flow diagram of a process of the present invention.
  • Process 100 begins with step 110 wherein color density data is taken from an image being processed by the invention.
  • The color densities of the image are sampled or measured at various points on the image.
  • Densities are sampled for the colors red, green, and blue.
  • The sampling process can be performed with a single sensor or with a plurality of sensors.
  • The image sampled may be a stored image, as in the case of processing photographic film, or a live image, as in the case of a digital camera preparing to capture the image.
  • A stored image may also be in a digital format contained in a digital image file.
  • Conventional sampling devices that may be used include the Pre-scanner p/n66079627, available from Digital Now, Inc. of Vienna, VA, and the Kodak KLI-2113 tri-linear CCD sensor.
  • The Pre-scanner p/n66079627 can sample the color densities of a photographic image to a pixel depth of 36 log bits.
  • Conventional video cameras, such as the 3-chip color video camera model X0003 manufactured by Sony, may also be used to perform the sampling function as is known in the art.
  • Many conventional color correction systems process the sampled color density data to calculate various statistical measures such as means, medians, and standard deviation. These conventional systems then use the statistical color density profiles to determine the adjustments to be made to an image based on a database of other known color density profiles.
  • The drawback of this process is the limited information contained in color density data. For example, the color density profiles of an image that is half blue and half green will resemble the color density profile of a blue-green image.
  • Step 120 translates the color density data into hue data. It would be readily apparent to one skilled in the art how to convert color density data to hue data. An explanation of the process of converting color density data to hue data can be found on page 50 of the book entitled Video Demystified, written by Keith Jack and published by HighText Publications, which is incorporated herein by reference.
  • The hue data is measured in units of degrees of rotation around a hue circle, which represents the various wavelengths along the color spectrum.
  • Each sampled pixel of an image produces color density data that is translated into a hue data point.
  • The translation process converts the density data of a selected number of colors to hue data that represents all the different colors contained in the sampled image. Hue data is obtained in this manner because it is impractical to sample an image for all the colors in the visible spectrum, whereas the color density data for a small number of colors contains the same hue information.
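The conversion of step 120 can be sketched with Python's standard `colorsys` module, which implements the RGB-to-HSV transform described in references such as Video Demystified; the normalized 0.0-1.0 input range is an assumption, since the text does not fix a scale:

```python
import colorsys

def rgb_to_hue_degrees(r, g, b):
    """Convert one pixel's normalized RGB density sample (each 0.0-1.0)
    to a hue angle in degrees around the hue circle (0-360)."""
    h, _s, _v = colorsys.rgb_to_hsv(r, g, b)  # h is in [0.0, 1.0)
    return h * 360.0

# A pure blue-green (cyan) sample falls at 180 degrees on the hue circle:
print(rgb_to_hue_degrees(0.0, 1.0, 1.0))  # 180.0
```

A single hue angle per pixel is what the binning step that follows consumes.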
  • The hue data is grouped into discrete ranges or "bins" of degrees of rotation around a hue circle.
  • The number of bins is a function of the number of points sampled on the image being processed, and thus of the number of hue data points. In one embodiment of the invention, the number of bins is equal to the square root of the number of hue data points.
  • The quantity of hue data points in each bin is then aggregated. In this manner, step 130 constructs a histogram of the hue data points. Entire images are thus represented by hue histograms.
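The binning and aggregation of step 130 can be sketched as follows; the square-root rule for the bin count is the one embodiment the text describes, and `math.ceil` reflects the "rounded up" convention used in the 924-pixel example later in this section:

```python
import math

def hue_histogram(hues):
    """Group hue angles (degrees, 0-360) into sqrt(N) equal-width bins
    around the hue circle and count the data points falling in each."""
    n_bins = math.ceil(math.sqrt(len(hues)))
    bin_width = 360.0 / n_bins
    counts = [0] * n_bins
    for h in hues:
        counts[min(int(h // bin_width), n_bins - 1)] += 1
    return counts

# 924 sampled pixels yield ceil(sqrt(924)) = 31 bins:
print(len(hue_histogram([0.0] * 924)))  # 31
```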
  • A hue histogram is compared to a database of hue histograms.
  • The hue histograms in the database are constructed from the raw color density data sampled from known images.
  • The hue histogram database is also linked to a corresponding set of adjusted color densities for each image represented by a hue histogram.
  • The set of adjusted color densities is constructed by an operator who manually adjusts the raw color density measurements of each image in the database for aesthetic quality.
  • The color density adjustments can also be made using a fixed algorithm or other methods known in the art.
  • The color density adjustments for each image are then stored in the database and are linked to the hue histograms of their respective images.
  • The database thus serves as a look-up table of adjustment values.
  • A hue histogram of a sampled image is matched to a hue histogram in the database, and the corresponding color density adjustments are retrieved from the database and applied to the sampled image.
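In the look-up-table variant, matching a sampled histogram against the database can be as simple as a nearest-neighbor search; the squared-distance metric here is an illustrative assumption, not something the text specifies:

```python
def look_up_adjustments(sample_hist, database):
    """database: list of (hue_histogram, (red_adj, green_adj, blue_adj))
    pairs built from known images. Return the color density adjustments
    linked to the stored histogram closest to the sampled one."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, adjustments = min(database, key=lambda entry: sq_dist(entry[0], sample_hist))
    return adjustments
```

For example, with a two-entry database `[([10, 0], (0.1, -0.2, 0.0)), ([0, 10], (0.0, 0.3, -0.1))]`, a sampled histogram of `[9, 1]` retrieves the first entry's adjustments.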
  • An artificial neural network performs step 135.
  • An operator is initially needed to train the neural network by building the hue histogram database.
  • The operator analyzes the color quality of a sufficient number of images to cover a cross-section of all possible images and determines the proper color density adjustments for each image.
  • The operator determines the proper color density adjustments for the colors red, green, and blue.
  • The color density adjustments are parameters for capturing the image in a separate medium, comprising a combination of illumination intensity and duration, and exposure time. These adjustments, along with the hue histogram of the corresponding image, are stored in the database. This process creates a training set for the supervised training of the neural network.
  • The training set contains at least 100 sample histograms and corresponding color density adjustments for each histogram bin created by step 130. For example, if a color density sampling device samples 924 pixels on an image, then according to one preferred embodiment of the invention, there would be approximately 31 (the square root of 924, rounded up) hue histogram bins to categorize the resulting hue data. Thus, for the preferred embodiment, a representative population of 31 x 100, or 3100, real-world images would be required to comprise the training set for the neural network.
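The sizing arithmetic in this example is easy to reproduce:

```python
import math

pixels_sampled = 924
bins = math.ceil(math.sqrt(pixels_sampled))  # square root of 924, rounded up
training_images = bins * 100                 # at least 100 samples per bin
print(bins, training_images)  # 31 3100
```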
  • This database or training set defines the input and output variables to be applied to a commercially available neural network software package.
  • The input variables comprise the array of numbers representing the population of data points in each of the 31 histogram bins generated by step 130.
  • The output variables comprise color density adjustment values for the colors red, green, and blue.
  • The neural network software then analyzes the training set to derive an algorithm for processing the hue histogram of a sampled image and producing appropriate color density adjustments.
  • Methods for training neural networks are known in the art and include the back-propagation of errors method, the fast-prop method, multi-layer perceptron, and radial basis functions. Any type of learning system can be utilized.
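As a toy illustration of this supervised-learning step (the text relies on a commercial package, and a plain linear map is far simpler than the networks named above), the following fits histogram-to-adjustment weights by gradient descent; every name and parameter here is hypothetical:

```python
def train_linear_map(training_set, epochs=200, lr=0.1):
    """training_set: list of (hue_histogram, (r_adj, g_adj, b_adj)) pairs.
    Fit a linear map from a histogram of B bins to 3 color density
    adjustments by stochastic gradient descent on squared error."""
    n_bins = len(training_set[0][0])
    weights = [[0.0] * n_bins for _ in range(3)]  # one weight row per output color
    for _ in range(epochs):
        for hist, target in training_set:
            for c in range(3):
                pred = sum(w * x for w, x in zip(weights[c], hist))
                err = pred - target[c]
                for j in range(n_bins):
                    weights[c][j] -= lr * err * hist[j]
    return weights
```

The input/output shape, a histogram array in and three adjustment values out, is the part that mirrors the training set described above; a real deployment would substitute an actual neural network.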
  • Trajan™ neural network software, available from Trajan Software, Ltd., London, UK, is used.
  • The neural network training sets would be altered for different applications because the output variables may be different.
  • The process of digitizing film images would require outputs that set the parameters for illuminating the film and exposing sensors to the illuminated image.
  • The process of capturing an image with a digital camera would require outputs in the form of aperture and shutter speed settings. It would be apparent to one skilled in the art what output variables are needed for different applications.
  • The neural network described operates with a fixed training set and a fixed algorithm for computing color density adjustments.
  • A neural network with a continuously expanding training set that learns with each hue histogram processed can be used.
  • Step 140 applies the color density adjustments to the sampled image to create an enhanced image.
  • The application of color density adjustments may be in conjunction with capturing the sampled image in a separate medium, as with the digitization of film.
  • The adjustments may also be applied to the original image.
  • Step 150 is an optional step for evaluating the quality of the color density adjustments. If the adjustments produce a poor-quality image, process 100 would be repeated until a successive iteration produces an acceptable image. The number of iterations would be limited to a practical number. When process 100 produces an acceptable image, the process is complete. Step 150 may be omitted when cost and efficiency considerations supersede.
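The optional evaluation loop of step 150 amounts to bounded iteration; `adjust` and `evaluate` below are hypothetical stand-ins for the adjustment pass (steps 110-140) and the quality check:

```python
def process_until_acceptable(image, adjust, evaluate, max_iterations=3):
    """Repeat the color-adjustment pass until the optional quality check
    accepts the result or a practical iteration limit is reached, then
    return the last image produced."""
    for _ in range(max_iterations):
        image = adjust(image)
        if evaluate(image):
            break
    return image
```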
  • FIG. 2 illustrates sample blue, green, and red histogram plots of the color densities of an image of a blue-green painting exposed to fluorescent light.
  • The Y-axis measures the frequency of occurrence.
  • The X-axis specifies the density ranges or bins. Histograms are often shown as jagged bar graphs instead of a continuous curve because of the limited and finite number of bins used for aggregating the sampled data points.
  • The densities of the image are enhanced for the colors green and blue.
  • The plots show that, for this situation, it may not be possible to determine from color density information whether the image merely contains a blue-green painting or contains blue-green fluorescent lighting. The inability to make such a determination in this or other situations limits the usefulness of using color density data to perform color density adjustments.
  • FIG. 3 illustrates a sample hue histogram of the same image used in FIG. 2.
  • The Y-axis measures the frequency of occurrence.
  • The X-axis specifies the bins, measured in degrees of the hue circle, of the hue colors measured.
  • A hue histogram can identify the specific color wavelength characteristic of blue-green fluorescent light and thus can distinguish between an image that should be adjusted to compensate for fluorescent lighting and an image containing a lot of blues and greens, like a photograph of an outdoor park taken in daylight, for example.
  • FIG. 4 illustrates one embodiment of a system of the present invention.
  • Devices 410 and 440 digitize film images.
  • Film 405 is advanced from reel 415 to reel 435. While film 405 is being advanced, it passes sensor 420, which samples the color densities of each film frame.
  • Sensor 420 is the pre-scanner device disclosed and described in co-pending U.S. patent application No. ,
  • Device 410 is the ExpressScan-4B™, available from Digital Now, Inc. of Vienna, VA.
  • Sensor 420 scans film 405 with RGB LEDs to obtain RGB color density data.
  • The sampled color density information is transferred to device 440 via data link 439.
  • Device 440 is a computer processing system, such as a commercially available computer device, with monitor 481, mouse 482, and keyboard 483.
  • The color density data is stored in memory 445, which can be random access memory.
  • The color density information is then converted to hue data by converter 450.
  • The hue data is then used to construct a hue histogram by generator 455.
  • The generated hue histogram is then analyzed by neural network 460.
  • Converter 450, generator 455, and neural network 460 are each components of a single software program.
  • Those devices are components of a computer hardware processor.
  • Neural network 460 uses the Trajan™ neural network software, available from Trajan Software, Ltd., London, UK, or another functionally similar program.
  • The color density adjustments determined by neural network 460 are sent back to device 410 via data link 461.
  • The color density adjustments are parameters for operating sensor 425 and light source 430 in order to illuminate film 405 and capture a film image with the desired color densities based on the color density adjustments.
  • The parameters may include illumination duration and intensity for light source 430 and aperture and shutter speed settings for sensor 425.
  • Light source 430 comprises RGB LEDs.
  • Sensor 425 is a Thomson™ digital camera, available from Thomson CFS, Orsey, Cedex, France.
  • The digitized image is then sent from sensor 425 to frame grabber 470 via data link 426.
  • Frame grabber 470 is a Thomson™ frame grabber.
  • The digital image can then be stored as a digital file in memory 480 or external memory 490, or sent to computer network 495.

Abstract

Methods and devices for using hue histograms to determine the appropriate color density adjustments for an image. The method includes the steps of sampling color density values of an image (110), converting the color density values to hue values (120), categorizing the hue values into a number of discrete value ranges (130), aggregating the hue values within each discrete value range, and determining revised color density values based on the discrete value ranges and the aggregated hue values (135).

Description

COLOR DENSITY EXPOSURE CONTROL
Field of Invention
The present invention relates to the field of photography and videography. More particularly, the present invention relates to improving the aesthetic appearance of analogue and digital photographic and video images. More particularly, the present invention relates to a method and system for improving the color appearance of photographic and video images.
Related Art
Color images captured in photographs or movies or videos often require various color adjustments in order to make them more aesthetically pleasing. Color adjustment may be required to compensate for poor lighting or inappropriate camera settings. However, color adjustments may also be desired even when an image is accurately captured. For example, fluorescent light has a blue-green color, and a photograph of objects in a room lit with fluorescent light will appear with a blue-green tint. Even though the photograph accurately captures the image of a room with fluorescent lighting, the image appears "incorrect." This is because in a live setting, the brain automatically "corrects" the colors of the images seen in a room with fluorescent lighting. If an object is supposed to be white in color, and it is well known that it should be white in color, the brain will make it appear white by compensating for the blue-green fluorescent lighting. Therefore, a photograph of a room with fluorescent lighting must also be adjusted to compensate for the blue-green color of the light in order to capture the "correct" image. In the past, an operator would have to manually adjust the color of each image until the images appeared pleasing to the eye. This required the operator to monitor each image being processed and, using the proper equipment, manually change the color densities of each image until the image appeared aesthetically pleasing to the operator. This process is no longer used because of its numerous drawbacks. First, it is very costly and time consuming to have an operator stop an image processor, for printing photographs, digitizing film, or some other processing step, in order to view an image and make changes to its appearance. Furthermore, it takes time and resources to train an operator, who must gain experience before being able to properly make adjustments to the image. 
Finally, subjective judgments of image quality necessarily vary with different operators, so that the value of having an operator manually adjust an image may be offset by inconsistencies in the final product.
There are various photographic printers that attempt to automate the color adjustment process by using a plurality of color filters to detect various color density patterns in an image being processed and compare them to a database of color density patterns associated with specifically identified circumstances. With a plurality of filters, these automated printers analyze the spectral distribution of a captured image and adjust the exposure level of the printed image to compensate for the situation identified. These printers, however, require the use of a large number of filters to analyze the many different colors in a spectrum. Further, these devices have a finite database of color density patterns and are limited in their ability to identify and adjust to new or unanticipated situations, such as newly released film emulsions. There are also commercially available cameras that attempt to determine the type of lighting contained in an image before capturing the image. This process may be automatic or may have a manual switch for control by an operator.
Conventional color and color density balancing in photographic printers and digital film scanners is done by using statistical values to set exposure levels or manipulate digitized data. For example, some devices collect information concerning the color density of the colors red, green, and blue for a given number of pixels in an image and compute a histogram of densities for each color. These devices then match the constructed histograms with a database of known histograms and their associated color adjustments and apply the appropriate color adjustments to the image being processed.
The drawback of the density histogram process is its limited ability to distinguish images with the same colors but different hues. This process can only determine how much red, green, and blue is contained in each sampled image pixel, but cannot recognize the actual colors contained in the image. For example, the density histogram process cannot distinguish between an image of a room lit with blue-green fluorescent light and an image of a blue-green object, or even a scenic image of a blue sky and a green park. All of these images may have the same color densities, i.e., contain the same concentrations of red, green, and blue.
As a second example, a color density histogram of an image that is half green and half blue can look the same as a histogram of a blue-green image because a measurement of color density is a measurement for specific colors, e.g., red, green, and blue. Determining the color densities for the colors blue and green will not determine the presence of the color blue-green. Therefore, attempting to determine the contents of an image from a histogram of color densities for a limited number of colors in that image may lead to an incorrect result. The process may interpret an image of blue sky and a green park to be a room using fluorescent light and compensate by reducing the blue and green colors of the resulting image.
For the foregoing reasons, there is a need for an efficient and accurate method for determining the contents of a captured image and adjusting the colors of that image for a more aesthetically pleasing result.
Summary of the Invention
The present invention is directed to a method and device for using color density values of an image to set the parameters for capturing the image. In one aspect of the invention, a method for determining color density values, comprising the steps of sampling color density values of an image, converting the color density values to hue values, categorizing the hue values into a number of discrete value ranges, aggregating the hue values within each discrete value range, and determining revised color density values based on the discrete value ranges and the aggregated hue values, is provided. In another aspect of the invention, color density values of a plurality of colors are used to determine new color density values.
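The claimed sequence of steps can be sketched end to end; the final `look_up` callable stands in for the database or neural-network stage, which is assumed to be supplied, and the normalized RGB input range is likewise an assumption:

```python
import colorsys
import math

def revised_density_values(pixels, look_up):
    """Sample -> hue -> discrete ranges -> aggregate -> revised values.
    pixels: list of (r, g, b) color density samples, each in 0.0-1.0.
    look_up: callable mapping a hue histogram to revised density values."""
    # Convert each sampled pixel's color densities to a hue angle in degrees.
    hues = [colorsys.rgb_to_hsv(r, g, b)[0] * 360.0 for r, g, b in pixels]
    # Categorize the hue values into sqrt(N) discrete ranges and aggregate.
    n_bins = math.ceil(math.sqrt(len(hues)))
    bin_width = 360.0 / n_bins
    counts = [0] * n_bins
    for h in hues:
        counts[min(int(h // bin_width), n_bins - 1)] += 1
    # Determine revised color density values from the aggregated histogram.
    return look_up(counts)
```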
In still another aspect of the invention, histograms of the hue information are constructed from the color density data obtained from an image.
In a further aspect of the invention, a neural network uses a database of hue histograms to determine the appropriate color density values for an image. In yet another aspect of the invention, computed color density values are used to process a photographic or digital image .
In another aspect of the invention, a color density determination system, comprising a means for converting color density values to hue values, a means for categorizing the hue values into a number of discrete value ranges, a means for aggregating the hue values within each discrete value range, and a means for determining revised color density values based on the discrete value ranges and the aggregated hue values, is provided.
In still another aspect of the invention, a color density determination system, comprising a scanner that samples color densities from an image and a computer system, with a memory that stores color density data, a converter that translates color density data into hue data, a histogram generator that categorizes the hue data, and a neural network that uses the categorized hue data to determine exposure time, is provided. In yet another aspect of the invention, a computer program product, with sampling means for enabling a computer processor to sample color density values of an image, converting means for enabling the processor to convert the color density values to hue values, categorizing means for enabling the processor to categorize the hue values into a number of discrete value ranges, aggregating means for enabling the processor to aggregate the hue values within each value range, and determining means for enabling the processor to determine color density values based on the discrete value ranges and the aggregated hue values, is provided. The present invention advantageously provides a novel method and system for creating improved photographic and video images .
Another advantage of the present invention is its ability to produce photographic and video images that have aesthetically pleasing colors.
Yet another advantage of the present invention is an improved color density correction method.
A further advantage of the present invention is its ability to adjust the colors of images illuminated by artificial light for a more aesthetically pleasing appearance.
Still another advantage of the present invention is its computational efficiency and reduced data set.
Another advantage of the present invention is a reduction in customer complaints regarding the appearance of printed film or digitally scanned images.
These and other features and advantages of the invention will be more fully understood from the following detailed description of a preferred embodiment that should be read in light of the accompanying drawings.
Brief Description of the Drawings
The accompanying drawings, which are incorporated in and form a part of the specification, illustrate a preferred embodiment of the present invention and, together with the description, serve to explain the principles of the invention.
FIG. 1 illustrates a flow diagram of a process of the present invention;
FIG. 2 illustrates sample color density plots of an image;
FIG. 3 illustrates a sample hue histogram of an image; and
FIG. 4 illustrates an embodiment of the present invention.
Detailed Description

In describing a preferred embodiment of the invention, specific terminology will be used for the sake of clarity. However, the invention is not intended to be limited to the specific terms so selected, and it is to be understood that each specific term includes all equivalents.

The present invention is a novel method and system for optimizing the color density of photographic or video images in order to produce aesthetically pleasing images. In the field of photographic printing, the invention provides a method for determining exposure and illumination times. In the field of film scanning and digitization, the invention determines the optimal parameters for exposing film to RGB (red-green-blue) light in order to capture digital film images. These parameters may include the duration and intensity of operation of the RGB lights and the shutter speed of the sensors detecting the film image. The invention can also be used for adjusting the color and density of images previously stored in a computer. It is a process for making images look more aesthetically pleasing.

Color density is a term known to those skilled in the art and, for an image, refers to a range of values. Although color densities can be measured for any color, in a preferred embodiment of the invention, color densities are measured for the colors red, green, and blue. Color and density, as known in the art, refer to different characteristics. A black and white photograph has only one color with varying densities. Likewise, other images may contain the same color with different densities. For example, an image of an object in sunlight will have the same color as an image of the same object in the shade, but their color densities will be different. Thus, discussions of color densities of an image refer to ranges of densities for different colors. The present invention helps to determine the ranges of RGB color densities that will make an image look better.
With reference to the drawings, in general, and FIGS. 1 through 4 in particular, the present invention is described. FIG. 1 illustrates a flow diagram of a process of the present invention. Process 100 begins with step 110 wherein color density data is taken from an image being processed by the invention. The color densities of the image are sampled or measured at various points on the image. In a preferred embodiment of the invention, densities are sampled for the colors red, green, and blue.
The sampling process can be performed with a single sensor or with a plurality of sensors. The image sampled may be a stored image, as in the case of processing photographic film, or a live image, as in the case of a digital camera preparing to capture the image. A stored image may also be in a digital format contained in a digital image file.
Conventional sampling devices that may be used include the Pre-scanner p/n66079627, available from Digital Now, Inc. of Vienna, VA, and the Kodak KLI-2113 tri-linear CCD sensor. The Pre-scanner p/n66079627 can sample the color densities of a photographic image to a pixel depth of 36 log bits. Conventional video cameras, such as the 3-chip color video camera model X0003 manufactured by Sony, may also be used to perform the sampling function as is known in the art.

Many conventional color correction systems process the sampled color density data to calculate various statistical measures such as means, medians, and standard deviations. These conventional systems then use the statistical color density profiles to determine the adjustments to be made to an image based on a database of other known color density profiles. The drawback of this process is the limited information contained in color density data. For example, the color density profiles of an image that is half blue and half green will resemble the color density profile of a blue-green image.
In the present invention, step 120 translates the color density data into hue data. It would be readily apparent to one skilled in the art how to convert color density data to hue data. An explanation of the process of converting color density data to hue data can be found on page 50 of the book entitled Video Demystified, written by Keith Jack and published by HighText Publications, which is incorporated herein by reference. In a preferred embodiment of the invention, the hue data is measured in units of degrees of rotation around a hue circle, which represents the various wavelengths along the color spectrum. Each sampled pixel of an image produces color density data that is translated into a hue data point. The translation process converts the density data of a selected number of colors to hue data that represents all the different colors contained in the sampled image. Hue data is obtained in this manner because it is impractical to sample an image for every color in the visible spectrum, whereas the color density data for a small number of colors contains the same hue information.
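By way of illustration, the translation of step 120 can be sketched in Python using the standard HSV-style hue formula. This is a common conversion, though the cited Video Demystified text may derive hue differently; the function name and the normalized 0.0–1.0 inputs are assumptions for demonstration, not part of the disclosure.

```python
def rgb_to_hue_degrees(r, g, b):
    """Convert normalized RGB values (each 0.0-1.0) to a hue angle in degrees.

    Uses the standard HSV-style hue computation. Achromatic pixels
    (r == g == b) have no defined hue and return None here.
    """
    mx, mn = max(r, g, b), min(r, g, b)
    delta = mx - mn
    if delta == 0:
        return None  # gray pixel: hue is undefined
    if mx == r:
        hue = 60.0 * (((g - b) / delta) % 6)   # between yellow and magenta
    elif mx == g:
        hue = 60.0 * (((b - r) / delta) + 2)   # between cyan and yellow
    else:
        hue = 60.0 * (((r - g) / delta) + 4)   # between magenta and cyan
    return hue
```

With this convention, pure red maps to 0°, pure green to 120°, and pure blue to 240°, so each sampled RGB density triple yields one point on the hue circle.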
In step 130, the hue data is grouped into discrete ranges or "bins" of degrees of rotation around a hue circle. The number of bins is a function of the number of points sampled on the image being processed and thus the number of hue data points. In one embodiment of the invention, the number of bins is equal to the square root of the number of hue data points. The quantity of hue data points in each bin is then aggregated. In this manner, step 130 constructs a histogram of the hue data points. Entire images are thus represented by hue histograms.
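The binning of step 130 can be sketched as follows. The square-root bin count follows the embodiment described above; the ceiling rounding and the handling of bin edges are illustrative assumptions, since the text does not specify them.

```python
import math

def hue_histogram(hues):
    """Group hue angles (0-360 degrees) into ~sqrt(N) equal-width bins
    and count the number of hue data points falling in each bin."""
    n_bins = math.ceil(math.sqrt(len(hues)))  # e.g. 924 samples -> 31 bins
    width = 360.0 / n_bins
    counts = [0] * n_bins
    for h in hues:
        # Clamp so that a value of exactly 360.0 lands in the last bin.
        idx = min(int(h // width), n_bins - 1)
        counts[idx] += 1
    return counts
```

The resulting list of counts is the hue histogram that represents the entire image in later steps.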
In step 135, a hue histogram is compared to a database of hue histograms. The hue histograms in the database are constructed from the raw color density data sampled from known images. The hue histogram database is also linked to a corresponding set of adjusted color densities for each image represented by a hue histogram. The set of adjusted color densities is constructed by an operator who manually adjusts the raw color density measurements of each image in the database for aesthetic quality. The color density adjustments can also be made using a fixed algorithm or other methods known in the art. The color density adjustments for each image are then stored in the database and are linked to the hue histograms of their respective images. The database thus serves as a look-up table of adjustment values. A hue histogram of a sampled image is matched to a hue histogram in the database, and the corresponding color density adjustments are retrieved from the database and applied to the sampled image.
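The look-up of step 135 can be sketched as a nearest-neighbor match over the stored histograms. The Euclidean distance metric and the data layout are assumptions for illustration; the text does not specify how a "match" is computed, and the preferred embodiment delegates this to a neural network.

```python
def nearest_adjustment(sample_hist, database):
    """Return the color density adjustment whose stored hue histogram is
    closest to the sampled histogram.

    `database` is a list of (histogram, (r_adj, g_adj, b_adj)) pairs,
    serving as the look-up table of adjustment values described above.
    """
    def dist(a, b):
        # Squared Euclidean distance between two equal-length histograms.
        return sum((x - y) ** 2 for x, y in zip(a, b))

    best_hist, best_adj = min(database, key=lambda entry: dist(entry[0], sample_hist))
    return best_adj
```

A sampled histogram is thereby matched to its closest stored histogram, and the linked red, green, and blue adjustments are retrieved.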
In a preferred embodiment of the present invention, an artificial neural network performs step 135. In this embodiment of the invention, an operator is initially needed to train the neural network by building the hue histogram database. The operator analyzes the color quality of a sufficient number of images to cover a cross-section of all possible images and determines the proper color density adjustments for each image. In a preferred embodiment of the invention, the operator determines the proper color density adjustments for the colors red, green, and blue. In a preferred embodiment of the invention, the color density adjustments are parameters for capturing the image in a separate medium comprising a combination of illumination intensity and duration, and exposure time. These adjustments, along with the hue histogram of the corresponding image, are stored in the database. This process creates a training set for the supervised training of the neural network.
In a preferred embodiment of the invention, the training set contains at least 100 sample histograms and corresponding color density adjustments for each histogram bin created by step 130. For example, if a color density sampling device samples 924 pixels on an image, then according to one preferred embodiment of the invention, there would be approximately 31, the square root of 924 rounded up, hue histogram bins to categorize the resulting hue data. Thus for the preferred embodiment, a representative population of 31 x 100, or 3100, real world images would be required to comprise the training set for the neural network.
In a preferred embodiment of the invention, this database or training set defines the input and output variables to be applied to a commercially available neural network software package. In a preferred embodiment of the invention, the input variables comprise the array of numbers representing the population of data points in each of the 31 histogram bins generated by step 130, and the output variables comprise color density adjustment values for the colors red, green, and blue. The neural network software then analyzes the training set to derive an algorithm for processing the hue histogram of a sampled image and producing appropriate color density adjustments. Methods for training neural networks are known in the art and include the back-propagation of errors method, the fast-prop method, multi-layer perceptron, and radial basis functions. Any type of learning system can be utilized. In one embodiment of the invention, Trajan™ neural network software, available from Trajan Software, LTD., London, UK, is used.
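The supervised training described above can be illustrated with a minimal numpy sketch rather than the Trajan™ package named in the text. The network size, learning rate, iteration count, and synthetic training data are all assumptions for demonstration; a commercial package would handle these details internally.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: each "image" is a normalized 31-bin hue histogram
# (input), paired with red/green/blue density adjustments (output).
# Real training data would come from operator-adjusted images.
X = rng.random((100, 31))
X /= X.sum(axis=1, keepdims=True)
W_true = rng.normal(size=(31, 3))
Y = X @ W_true  # synthetic target adjustments, for illustration only

# One-hidden-layer network trained by back-propagation of errors.
W1 = rng.normal(scale=0.1, size=(31, 16))
W2 = rng.normal(scale=0.1, size=(16, 3))
lr = 0.5

def forward(inputs):
    hidden = np.tanh(inputs @ W1)
    return hidden, hidden @ W2

_, pred = forward(X)
loss_before = np.mean((pred - Y) ** 2)

for _ in range(500):
    H, pred = forward(X)
    err = (pred - Y) / len(X)                   # gradient of MSE w.r.t. output
    grad_W2 = H.T @ err
    grad_W1 = X.T @ ((err @ W2.T) * (1 - H ** 2))  # tanh derivative
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

_, pred = forward(X)
loss_after = np.mean((pred - Y) ** 2)
# After training, the network's predicted adjustments fit the training
# set more closely (loss_after < loss_before).
```

Once trained, the same forward pass maps the histogram of a newly sampled image to its red, green, and blue adjustment values.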
The neural network training sets would be altered for different applications because the output variables may be different. For example, the process of digitizing film images would require outputs that set the parameters for illuminating the film and exposing sensors to the illuminated image. On the other hand, the process of capturing an image with a digital camera would require outputs in the form of aperture and shutter speed settings. It would be apparent to one skilled in the art what output variables are needed for different applications.
The neural network described operates with a fixed training set and a fixed algorithm for computing color density adjustments. However, as known to those skilled in the art, a neural network with a continuously expanding training set that learns with each hue histogram processed can be used.
Step 140 applies the color density adjustments to the sampled image to create an enhanced image. The application of color density adjustments may be in conjunction with capturing the sampled image in a separate medium, as with the digitization of film. The adjustments may also be applied to the original image. Step 150 is an optional step for evaluating the quality of the color density adjustments. If the adjustments produce a poor quality image, process 100 would be repeated until a successive iteration produces an acceptable image. The number of iterations would be limited to a practical number. When process 100 produces an acceptable image, the process is complete. Step 150 may be omitted when cost and efficiency considerations supersede.
FIG. 2 illustrates sample blue, green, and red histogram plots of the color densities of an image of a blue-green painting exposed to fluorescent light. In FIG. 2, the Y-axis measures the frequency of occurrence, and the X-axis specifies the density ranges or bins. Histograms are often shown as jagged bar graphs instead of a continuous curve because of the limited and finite number of bins used for aggregating the sampled data points.
Because the fluorescent lighting is also blue-green in color, the densities of the image are enhanced for the colors green and blue. The plots show that, for this situation, it may not be possible to determine from color density information whether the image merely contains a blue-green painting or contains blue-green fluorescent lighting. The inability to make such a determination in this or other situations limits the usefulness of using color density data to perform color density adjustments.
FIG. 3 illustrates a sample hue histogram of the same image used in FIG. 2. As in FIG. 2, the Y-axis measures the frequency of occurrence, and the X-axis specifies the bins, measured in degrees of the hue circle, of the hue colors measured. A hue histogram can identify the specific color wavelength characteristic of blue-green fluorescent light and thus can distinguish between an image that should be adjusted to compensate for fluorescent lighting and an image containing a lot of blues and greens, like a photograph of an outdoor park taken in daylight, for example.
FIG. 4 illustrates one embodiment of a system of the present invention. In this embodiment, devices 410 and 440 digitize film images. Using device 410, film 405 is advanced from reel 415 to reel 435. While film 405 is being advanced, it passes sensor 420, which samples the color densities of each film frame. In a preferred embodiment, sensor 420 is the pre-scanner device disclosed and described in co-pending U.S. patent application No. ,
(attorney docket number 26183.1004-US01), filed August 17, 2000, entitled "Film Density Scanner," of which William G. Reed and John O. Renn are the inventors, the entirety of which is incorporated herein by reference. In a preferred embodiment, device 410 is the ExpressScan-4B™, available from Digital Now, Inc. of Vienna, VA. In a preferred embodiment, sensor 420 scans film 405 with RGB LEDs to obtain RGB color density data.
The sampled color density information is transferred to device 440, via data link 439. Device 440 is a computer processing system such as a commercially available computer device, with monitor 481, mouse 482, and keyboard 483. The color density data is stored in memory 445, which can be random access memory. The color density information is then converted to hue data by converter 450. The hue data is then used to construct a hue histogram by generator 455. The generated hue histogram is then analyzed by neural network 460. In a preferred embodiment of the invention, converter 450, generator 455, and neural network 460 are each components of a single software program. In a different preferred embodiment, those devices are components of a computer hardware processor. In a preferred embodiment, neural network 460 uses the Trajan™ neural network software, available from Trajan Software, LTD. London, UK, or another functionally similar program.
The color density adjustments determined by neural network 460 are sent back to device 410 via data link 461. In a preferred embodiment the color density adjustments are parameters for operating sensor 425 and light source 430 in order to illuminate film 405 and capture a film image with the desired color densities based on the color density adjustments. The parameters may include illumination duration and intensity for light source 430 and aperture and shutter speed settings for sensor 425. In a preferred embodiment, light source 430 comprises RGB LEDs. In a preferred embodiment, sensor 425 is a Thomson™ digital camera, available from Thomson-CSF, Orsay Cedex, France. The digitized image is then sent from sensor 425 to frame grabber 470 via data link 426. In a preferred embodiment, frame grabber 470 is a Thomson™ frame grabber. The digital image can then be stored as a digital file in memory 480 or external memory 490, or sent to computer network 495.
While there have been shown and described specific embodiments of the present invention, it should be apparent to those skilled in the art that various changes and modifications may be made without departing from the scope of the invention or its equivalents. The invention is intended to be broadly protected consistent with the spirit and scope of the appended claims.

Claims

What is claimed is:
1. A method of determining color density values comprising the steps of:
sampling color density values of an image;
converting the color density values to hue values;
categorizing the hue values into a number of discrete value ranges;
aggregating the hue values within each discrete value range; and
determining revised color density values based on the discrete value ranges and the aggregated hue values.
2. The method of claim 1 wherein the sampling step comprises sampling the color density values of at least three colors.
3. The method of claim 2 wherein the three colors are red, green, and blue.
4. The method of claim 1 wherein the sampling step is carried out using a video or still camera.
5. The method of claim 1 wherein the image is an image stored in a medium.
6. The method of claim 1 wherein the determining step is carried out manually.
7. The method of claim 1 wherein the determining step is carried out using a predetermined algorithm.
8. The method of claim 1 wherein the determining step is carried out using a neural network.
9. The method of claim 1 wherein the determining step comprises comparing the aggregated hue values with a set of aggregated hue values stored in a memory area.
10. The method of claim 1 wherein the determining step is carried out for each color density value sampled in the sampling step.
11. The method of claim 1 wherein the number of discrete value ranges is a function of a number of color density values sampled in the sampling step.
12. The method of claim 11 wherein the number of discrete value ranges is less than or equal to the number of color density values sampled.
13. The method of claim 11 wherein the number of discrete value ranges is a square root of the number of color density values sampled.
14. The method of claim 1 further comprising the step of creating a histogram based on the hue values and the number of discrete value ranges.
15. The method of claim 1 further comprising the step of adjusting the image based on the revised color density values .
16. The method of claim 1 further comprising the step of recording the image based on the revised color density values.
17. The method of claim 16 wherein the recording step comprises computing a shutter exposure time.
18. The method of claim 16 wherein the recording step comprises computing times for illuminating the image.
19. The method of claim 18 further comprising the step of illuminating the image with a red, green, or blue light for the computed times.
20. The method of claim 19 wherein the red, green, and blue lights are LED lights.
21. A color density determination system comprising:
a means for converting color density values to hue values;
a means for categorizing the hue values into a number of discrete value ranges;
a means for aggregating the hue values within each discrete value range; and
a means for determining revised color density values based on the discrete value ranges and the aggregated hue values.
22. The system of claim 21 wherein the converting means comprises a color space converter.
23. The system of claim 21 further comprising a means for sampling color density values of images.
24. An exposure determination system comprising:
a scanner that samples color densities of images; and
a computer system configured to determine exposure times based on sampled color densities.
25. The system of claim 24 wherein the computer system comprises:
memory that stores color density data;
a converter that translates color density data into hue data;
a histogram generator that categorizes the hue data; and
a neural network that uses the categorized hue data to determine exposure times.
26. The system of claim 25 wherein the computer system further comprises a frame grabber that captures the image.
27. The system of claim 24 further comprising an illumination source to illuminate the image based on the exposure times.
28. An exposure determination system comprising:
a memory device having embodied therein color density data; and
a processor in communication with said memory device, said processor configured for converting color density data into hue data, categorizing the hue data, and determining exposure times from the categorized hue data.
29. A computer program product comprising a computer usable medium having computer program logic recorded thereon for enabling a processor in a computer system to facilitate determining color density values, said computer program logic comprising:
sampling means for enabling the processor to sample color density values of an image;
converting means for enabling the processor to convert the color density values to hue values;
categorizing means for enabling the processor to categorize the hue values into a number of discrete value ranges;
aggregating means for enabling the processor to aggregate the hue values within each value range; and
determining means for enabling the processor to determine color density values based on the discrete value ranges and the aggregated hue values.
PCT/US2000/022620 1999-08-20 2000-08-18 Color density exposure control WO2001015409A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU66458/00A AU6645800A (en) 1999-08-20 2000-08-18 Color density exposure control

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15007699P 1999-08-20 1999-08-20
US60/150,076 1999-08-20

Publications (2)

Publication Number Publication Date
WO2001015409A2 true WO2001015409A2 (en) 2001-03-01
WO2001015409A3 WO2001015409A3 (en) 2001-09-13

Family

ID=22533033

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/022620 WO2001015409A2 (en) 1999-08-20 2000-08-18 Color density exposure control

Country Status (2)

Country Link
AU (1) AU6645800A (en)
WO (1) WO2001015409A2 (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4929978A (en) * 1987-10-23 1990-05-29 Matsushita Electric Industrial Co., Ltd. Color correction method for color copier utilizing correction table derived from printed color samples
US5109275A (en) * 1989-12-29 1992-04-28 Matsushita Electric Industrial Co., Ltd. Printing signal correction and printer operation control apparatus utilizing neural network
US5309228A (en) * 1991-05-23 1994-05-03 Fuji Photo Film Co., Ltd. Method of extracting feature image data and method of extracting person's face data
US5408343A (en) * 1991-07-19 1995-04-18 Canon Kabushiki Kaisha Image processor in which colors in an original color image are identified as predetermined patterns on a monochromatic copy of the original
US5416848A (en) * 1992-06-08 1995-05-16 Chroma Graphics Method and apparatus for manipulating colors or patterns using fractal or geometric methods
US5822453A (en) * 1996-12-10 1998-10-13 Eastman Kodak Company Method for estimating and adjusting digital image contrast


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1639807A1 (en) * 2004-05-17 2006-03-29 Seiko Epson Corporation Image processing method, image processing apparatus and program
EP1639807A4 (en) * 2004-05-17 2008-12-03 Seiko Epson Corp Image processing method, image processing apparatus and program
US7680325B2 (en) 2004-05-17 2010-03-16 Seiko Epson Corporation Image processing method of detecting a correspondence between colors, image processing apparatus and program for detecting a correspondence between colors
GB2418316B (en) * 2004-09-21 2007-01-31 Hitachi Ltd Image display apparatus
GB2429598A (en) * 2004-09-21 2007-02-28 Hitachi Ltd Colour correction of an image signal using histograms of hue and saturation
GB2429598B (en) * 2004-09-21 2007-08-29 Hitachi Ltd Image display apparatus

Also Published As

Publication number Publication date
AU6645800A (en) 2001-03-19
WO2001015409A3 (en) 2001-09-13


Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase in:

Ref country code: JP