WO2011028626A2 - Method for producing a color image and imaging device employing same


Info

Publication number
WO2011028626A2
Authority
WO
WIPO (PCT)
Prior art keywords
color
image
colors
rendering unit
dimensional look-up table
Application number
PCT/US2010/046871
Other languages
French (fr)
Other versions
WO2011028626A3 (en)
Inventor
James R. Sullivan
Rodney Heckaman
Original Assignee
Entertainment Experience Llc
Application filed by Entertainment Experience Llc filed Critical Entertainment Experience Llc
Priority to KR1020127008557A priority Critical patent/KR101354400B1/en
Priority to EP10814312.4A priority patent/EP2474166A4/en
Priority to CN201080049185.5A priority patent/CN102598114B/en
Priority to JP2012527000A priority patent/JP2013504080A/en
Priority to KR1020137028025A priority patent/KR101786161B1/en
Publication of WO2011028626A2 publication Critical patent/WO2011028626A2/en
Publication of WO2011028626A3 publication Critical patent/WO2011028626A3/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/001 Texturing; Colouring; Generation of texture or colour
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • G09G5/06 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed using colour palettes, e.g. look-up tables
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/64 Circuits for processing colour signals
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 Aspects of display data processing
    • G09G2340/06 Colour space transformation
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00 Aspects of the architecture of display systems
    • G09G2360/14 Detecting light within display terminals, e.g. using a single or a plurality of photosensors
    • G09G2360/144 Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light

Definitions

  • the projection and/or display of color images is an active area of commercial research and development. New image display, television, games, computers and projection products and viewing experiences are being launched in the marketplace on a regular basis.
  • digital cinema or video projector technology that utilizes colored light emitting diodes (LEDs) as the source of the primary colors for imaging offers the promise of an extremely wide color gamut along with very long-life, low-heat illumination. LED brightness is currently limited, however, requiring three optical systems and three image modulators, i.e., one for each of the red, green, and blue (RGB) color channels, for the brightest images.
  • Current projector lamp technology is of higher brightness and can take advantage of single optical systems and single image modulators using complex color filter wheels to provide full color display.
  • projectors, televisions, game displays and computer displays are being introduced with more than the typical three (RGB) colors to improve brightness and expand the color gamut.
  • the multiple RGB channels may be combined for some portion of time during image frames. Adding these multiple RGB channels during an image frame duty cycle will increase the brightness, but will also reduce the colorfulness by desaturating the pure RGB colors.
  • color rendering is accomplished by processing each of the RGB channels independently with matrix operators or with one-dimensional color look-up tables.
  • the RGB colors and the combinations of two and three colors may be independently controlled.
  • control does not provide full three-dimensional color processing.
  • Achieving optimal visual processing that provides the brightest, most colorful images, while preserving perceived color accuracy requires three-dimensional color processing.
  • Adaptation plays a powerful role in the instance depicted in Johnson's narrative.
  • the colors of the windows appear exceedingly brilliant, invoking a perception, in the words of Vincent Scully, Architecture, The Natural and Manmade, St. Martin's Press, 1991, that, "... transcend[s] the statics of the building masses, the realities of this world ... [creating] a world of illusion, shaped by and for the heavenly light of the enormous stained glass windows."
  • the HVS is capable of adapting to an enormous range of luminance.
  • the HVS may adapt its light sensitivity over a range of about eight orders of magnitude, e.g., from a starlit, moonlit night having a luminance of about 0.0001 candela per square meter (cd/m²) to a brightly lit summer day of about 600 to 10,000 cd/m².
  • Equally remarkable is that the HVS may accommodate over five orders of magnitude of luminance at any given instant for the perception of complex visual fields that are routinely experienced.
  • This adaptation occurs relative to diffuse white, i.e., an area in the scene that appears white.
  • the perceptions of lightness and chroma are then relative to this white. The higher the brightness of the perceived white, the lower the brightness and chroma of similarly illuminated objects in the scene will appear to the observer; conversely, the lower its brightness, the brighter and more colorful such objects appear.
  • the R, G, and B primary colors of the LEDs often exceed the current video standards, such as e.g., ITU Radiocommunication Sector (ITU-R) Recommendation BT.709, which is the United States standard for the format of high-definition television and consumer digital media.
  • Optimal use of these extended colors requires full three-dimensional color processing and can be further optimized using knowledge of the HVS.
  • Prior attempts to process the current video standards, such as with one-dimensional color processing and color matrices, or without use of HVS models have resulted in unsatisfactory and unrealistic displayed images and high rates of product return by consumers.
  • FIGS. 1A - 1D are two-dimensional schematic diagrams of various prior art ways for processing input color data to produce output color data for rendering a color image.
  • FIG. 1A shows a color hue/saturation/contrast/brightness method, depicting the global controls that rotate hue, stretch saturation and contrast and raise brightness. All colors are changed with these controls with no way to isolate a given color or color region like flesh tones.
  • Rin/Gin/Win are input HD709 standard colors, and Rout/Gout/Wout are more pure output LED colors.
  • FIG. 1B shows a color matrix method depicting a linear matrix global control that rotates and scales the color axes. All colors are changed globally with no way to isolate local colors like flesh tones.
  • Rin/Gin/Win are input HD709 standard colors, and Rout/Gout/Wout are more pure output LED colors. If a 3 x 3 matrix is used, there are nine global choices.
  • FIG. 1C shows a color gamma tables method depicting gamma global controls that independently map each input color non-linearly to do things such as increase contrast. It can be seen that, e.g., red changes are the same for all green values. The same relationships occur with other combinations of primary colors. Thus gamma controls are global, with no way to locally isolate colors, such as flesh tones. Rin/Gin/Win are input HD709 standard colors, and Rout/Gout/Wout are more pure output LED colors. With three primary colors having 4096 settings, there are 12,288 global choices.
  • FIG. 1 D shows a 2D example of an RGBCYMW seven color mapping method.
  • the RGB/RGW triangles are independently processed using linear interpolation of input/output control values at each vertex. This is a global control, with no way to isolate local colors or regions like flesh tones.
  • Rin/Gin/Win are input HD709 standard colors, and Rout/Gout/Wout are more pure output LED colors. With 14 In/Out colors, there are 14 global choices.
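  • To make the global nature of these prior-art controls concrete, the following is a minimal sketch (not taken from the patent) of the per-channel gamma-table processing of FIG. 1C; the gamma value is an illustrative assumption.

```python
import numpy as np

# Prior-art style processing (FIG. 1C): one 1D table per primary, applied
# independently. With 12-bit primaries there are 3 x 4096 = 12,288 entries,
# all of them global -- the red output depends only on the red input.
levels = 4096
gamma = 2.2  # illustrative contrast-shaping exponent

ramp = np.arange(levels) / (levels - 1)
gamma_table = np.round((ramp ** (1.0 / gamma)) * (levels - 1)).astype(np.uint16)

def apply_gamma(r, g, b):
    """Map each channel through its own 1D table (a purely global control)."""
    return gamma_table[r], gamma_table[g], gamma_table[b]

# The red output is identical regardless of green and blue, so a change aimed
# at flesh tones necessarily alters every color sharing those red values.
print(apply_gamma(2048, 0, 0))
print(apply_gamma(2048, 4095, 4095))
```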
  • Digital Cinema Initiatives, LLC (DCI) is a joint venture of major motion picture studios, which was formed in 2002 to create standards for digital cinema systems, including image capture and projection.
  • the digital color standard adopted by the studios for professional movie releases in the DCI format is 12 bits per primary color, nonlinear CIE XYZ Tristimulus values. This is the first time that a digital standard has been established that is encoded in visual color space and therefore independent of any imaging device. For example, using this standard, the same digital file can be displayed to produce the specified color on a television or a printer.
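  • As an illustration of encoding image data in a device-independent visual color space, the sketch below converts linear CIE XYZ values to 12-bit nonlinear code values; the 52.37 normalization and 2.6 exponent follow the published DCI specification rather than anything recited here, so treat them as assumptions.

```python
def dci_encode(X, Y, Z):
    """Encode linear CIE XYZ tristimulus values as 12-bit nonlinear code values.

    Sketch only: the 52.37 cd/m^2 normalization and the 1/2.6 exponent are
    taken from the published DCI specification, not from this text.
    """
    def code_value(t):
        t = min(max(t / 52.37, 0.0), 1.0)
        return int(round(4095 * t ** (1.0 / 2.6)))
    return code_value(X), code_value(Y), code_value(Z)

# Roughly the DCI reference white at 48 cd/m^2
print(dci_encode(42.94, 48.0, 45.82))
```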
  • the color gamut of this digital color standard is larger than any possible display.
  • FIG. 3 is a diagram of color gamuts, including color gamuts of the DCI and HD709 standards, and color gamuts of various media and/or imaging devices. It can be seen that in diagram 400, the color gamuts 406, 408, 410, and 412 of the various imaging devices are substantially larger than the HD709 standard 404. Accordingly, to take full advantage of the color capabilities of these imaging devices 406 - 412, the color gamut of the HD709 standard must be mapped upwardly, to render the full colors of the larger color gamut, while simultaneously preserving flesh tones and other memory colors, and optimizing the particular device for viewing in a particular environment.
  • the large triangular boundary 402 that represents the DCI standard encompasses all of the color gamuts of the media and/or imaging devices, as well as the color gamut of the HD709 standard 404. Accordingly, the digital color standard input color gamut 402 must be contracted or reduced to fit within the color gamut of a physical display such as a television or projector. Truncating or clipping those input digital color values of the DCI standard that lie outside of the color gamut boundary of the display device will cause loss of color saturation and detail and create a visually sub-optimal displayed image. Conventional video processing using one-dimensional color tables and linear matrices will also produce sub-optimal displayed images. Optimal display of these contracted colors requires full three-dimensional color processing and can be further optimized using knowledge of the HVS and the state of visual adaptation in particular viewing environments.
  • image and video media display products are now being reduced in size. Examples of such products are the new miniature pico-projectors and portable, handheld displays such as iPods® or iPads®. Because of power, heat, and size limitations, these displays generally have reduced color gamuts due to reduced contrast or reduced color saturation. They also are often used in widely differing viewing environments both indoors and outdoors. Improvement of the overall quality of these smaller gamut displays with conventional image and video input is critical to product value. Conventional video processing using one-dimensional color tables and linear matrices will also produce sub-optimal displayed images. Optimal display of these contracted colors requires full three-dimensional color processing and can be further optimized using knowledge of the HVS and the state of visual adaptation in particular viewing environments.
  • HVS adaptation is affected by the viewing environment.
  • in a dark room, higher contrast is needed in a projected or displayed image for an equally perceived viewing experience as compared to a room with normal room lighting or viewing the same image in bright outdoor lighting.
  • the HVS adaptation to the dark room and the lower overall image brightness combine to reduce the perceived image contrast.
  • less contrast is needed due to brightness adaptation and more contrast is needed due to viewing flare from room lights illuminating the dark areas of the displayed image.
  • a memory color may be characterized as a localized volume in a color space, as will be described subsequently herein.
  • the algorithms used in current image displays, televisions and projectors cannot uniquely preserve a volume within a three-dimensional color space while changing a different volume within the same three-dimensional color space using one dimensional tables, or matrices, or enhancements which are applied to all colors in the 3D space. For example, in some image projectors, color enhancement is attempted using output color definitions of the seven input colors RGBCMYW (red-green-blue-cyan-magenta-yellow-white).
  • As a result, when current image displays, televisions and projectors provide enhanced colors, they do so across the entire color gamut, "enhancing" certain memory colors such as flesh tones such that a typical human observer finds them unsatisfactory and not perceptually optimal.
  • the color enhancement is somewhat arbitrary; it does not preserve memory colors, nor produce a perceived display image that is realistic for a better viewing environment.
  • 3D color tables have been implemented for color calibration, but in such circumstances the tables are small (e.g., 7 x 7 x 7). These 3D look-up-tables are used instead of one-dimensional tables and 3 x 3 matrices because the small 3D look-up-tables are generally faster, albeit at the expense of some loss of precision. In any case, such small tables cannot deliver significant color improvement or enhancement, color "looks," gamut mapping, or mapping to displays with secondary colors or more than three primary colors.
  • the blue OLED typically has had a considerably shorter lifespan than the red and green OLEDs.
  • One measure of OLED life is the decrease of luminance to half the value of original brightness.
  • the luminance of currently available blue OLEDs decreases to half brightness in a much shorter time than the red or green OLEDs.
  • this differential color change between the blue OLED and the red and green OLEDs changes the color balance of the display.
  • a color-enhanced image display, television, or projection that maintains certain known colors and optimizes colorfulness and contrast will have the highest visual perceptual quality if and only if the rendering is accomplished such that the input RGB colors are processed inter-dependently.
  • the color enhancement may entail increased brightness and/or a larger or smaller color gamut, depending upon the particular image display or projector.
  • a brighter display is not possible without affecting hue.
  • a first method of producing a color image comprising providing input image data from an image source such as a camera; generating an at least three-dimensional look-up table of values of input colors and output colors, wherein the values in the lookup table convert the input image color data to output image color data in an image rendering unit; loading the at least three-dimensional look-up table into an image color rendering controller; loading the input image data into the imaging color rendering controller; processing the input image data through the at least three-dimensional look-up table to produce output color values stored at the addresses in the at least three-dimensional look-up table; and outputting the output color values to the image rendering unit to produce an output image that is perceived to have at least one of enhanced brightness, enhanced contrast, and enhanced colorfulness compared to the input image.
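  • A minimal sketch of the look-up step described above: the input code values address a pre-computed at-least-three-dimensional table and the stored output color values are sent to the image rendering unit. The identity table and 8-bit depth are placeholders; a real table would be built from a model of the human visual system as described below.

```python
import numpy as np

# Placeholder identity 3D LUT: lut[r, g, b] -> (R_out, G_out, B_out).
# A full-resolution 8-bit table holds 256**3 entries (about 48 MiB here).
n = 256
grid = np.arange(n, dtype=np.uint8)
lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)

def render(frame_rgb):
    """frame_rgb: (H, W, 3) uint8 input image data -> output color values."""
    r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
    return lut[r, g, b]   # pure table addressing, no per-pixel arithmetic

frame = np.random.randint(0, n, size=(4, 4, 3), dtype=np.uint8)
output = render(frame)
assert output.shape == frame.shape
```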
  • the values in the lookup table may be calculated based upon a visual model of the human visual system and they may include modeling to improve the perceived brightness or contrast or colorfulness for different viewing environments.
  • the at least one of enhanced brightness, enhanced contrast, or enhanced colorfulness introduced by the at least three dimensional look-up-table may produce a chosen artistic perception in the output image.
  • the image rendering unit may have an expanded color gamut greater than the color gamut of the input image data, wherein the output colors to the image rendering unit utilize the expanded color gamut, or the image rendering unit may have a reduced color gamut smaller than the color gamut of the input image data, wherein the output colors to the image rendering unit utilize the smaller color gamut.
  • the input image data may contain memory colors and non-memory colors
  • the method may include identifying the memory colors in the input image data to be substantially maintained, characterizing the memory colors and non-memory colors with respect to their chromaticities, and producing an image with substantially maintained memory colors using the image rendering unit.
  • the perceived colorfulness, brightness, and contrast of the non-memory colors are changed differently than perceived colorfulness, brightness, and contrast of the memory colors. They may be increased more than perceived colorfulness, brightness, and contrast of the memory colors.
  • the perceived colorfulness, brightness, and contrast of the non-memory colors are increased more than perceived colorfulness, brightness, and contrast of the memory colors.
  • Generating the at least three-dimensional look-up table may include computing enhanced lightness, chroma, and hue for the memory colors using a non-linear enhancement function.
  • the enhancement function may be a sigmoidal function. More than one at least three-dimensional look-up table for the color transformation of the non-memory colors and the memory colors may be generated and used. Each of the at least three dimensional look-up tables may be optimized for a different viewing environment of the image rendering unit.
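  • A hedged sketch of one possible sigmoidal enhancement: chroma near the neutral axis (where many memory colors sit) is left nearly unchanged while higher-chroma colors are boosted. The midpoint, slope, and gain values are illustrative assumptions, not parameters from the patent.

```python
import numpy as np

def sigmoidal_chroma_boost(chroma, midpoint=40.0, slope=0.08, gain=1.3):
    """S-shaped chroma enhancement: ~no change at low chroma, ~gain at high chroma."""
    weight = 1.0 / (1.0 + np.exp(-slope * (chroma - midpoint)))  # rises from 0 to 1
    return chroma * (1.0 + (gain - 1.0) * weight)

chroma_in = np.array([5.0, 20.0, 40.0, 60.0, 80.0])
print(np.round(sigmoidal_chroma_boost(chroma_in), 1))  # low-chroma values barely move
```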
  • the method may further include providing a sensor for measuring the ambient light in the viewing environment.
  • the input image data may be of a first color standard, and the method may further include converting the input image data of the first input color standard into an input color specification for inputting into the three-dimensional look-up table.
  • the at least three-dimensional look-up table may have at least three input colors and/or at least three output colors.
  • the at least three output colors may be any combination of primary colors as independent light sources or secondary colors defined as combinations of primary colors.
  • the at least three dimensional look-up table may be losslessly compressed to reduce storage use in memory of the image color rendering controller.
  • the method may further include calibrating the image rendering unit by measuring the color response of the image rendering unit, and then modifying the output image data either by additional processing after the at least three-dimensional look-up-table or by including the required calibration in the at least three-dimensional look-up-table.
  • the image color rendering controller may be contained within the image rendering unit, or it may be external to the image rendering unit.
  • An auxiliary imaging device controller may be in communication with the image color rendering controller and the image rendering unit.
  • the image rendering unit may be selected from, but not limited to a projector, a television, a computer display, and a game display, and may use DMD, plasma, liquid crystal, liquid crystal-on-silicon modulation, or direct modulation of the light source.
  • the light source may be an LED, OLED, laser, or lamp light sources.
  • the image color rendering controller may be in communication with at least one of a cable TV set-top box, a video game console, a personal computer, a computer graphics card, a DVD player, a Blu-ray player, a broadcast station, an antenna, a satellite, a broadcast receiver and processor, and a digital cinema.
  • the image rendering unit may include an algorithm for color modification, wherein the at least three-dimensional look-up table further comprises processing the input image data to compensate for the color modification performed by the image rendering unit.
  • the image rendering unit may include an algorithm for creating secondary colors from primary colors, and the at least three-dimensional look-up table further comprises compensating for the color modification performed by the addition of the secondary colors in the image rendering unit.
  • the at least three-dimensional look-up table may further include processing the input image data to increase perceived color, brightness, and contrast to compensate for the reduction in perceived color, brightness, and contrast caused by the algorithm for color modification in the image rendering unit.
  • the at least three-dimensional lookup table may contain a transformation from a suboptimal viewing environment to an improved viewing environment including the visual adaptation of the human visual system.
  • the at least three-dimensional look-up table may include the definition of secondary colors, and may further contain enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of the secondary colors by the image rendering unit.
  • the at least three-dimensional look-up table may further include processing the input image data to include chromatic adaptation of the human visual system to a specified white point that increases the brightness of the image rendering unit.
  • the instant method may be used in the display or projection of two dimensional (2D) or "three dimensional” (3D) images.
  • the 3D images are typically produced by providing 2D stereo images simultaneously or in rapid sequence taken from two perspectives, so as to provide the observer with the illusion of depth perception.
  • the image rendering unit may be a "3D" unit.
  • the unit may be e.g., an autostereoscopic display, or it may include a polarizing filter to separate the 2D stereo images being projected and directed to the eyes of an observer using polarization glasses, or it may include a shuttering mechanism to separate the 2D stereo images being projected and directed to the eyes of an observer using time synced shutter glasses.
  • both sets of 2D images may be processed according to the instant method to deliver 3D images that are perceived by an observer to have enhanced brightness, and/or enhanced contrast, and/or enhanced colorfulness.
  • an additional method of producing a color image comprising providing input image data of a first color gamut and an image rendering unit of a second, expanded or reduced color gamut; generating an at least three-dimensional look-up table of values of input colors and output colors, wherein the values in the lookup table expand or reduce the input image data to encompass the second color gamut of the image rendering unit; loading the at least three-dimensional look-up table into an image color rendering controller; loading the input image data into the imaging color rendering controller; processing the input image data through the at least three-dimensional look-up table using the input image data as addresses into the at least three-dimensional look-up table to produce output image data from the output color values stored at the addresses in the at least three-dimensional look-up table; and outputting the output image data to the image rendering unit to produce an output image that is perceived to have at least one of enhanced brightness, enhanced contrast, and enhanced colorfulness compared to the input image.
  • the models may include visual models of HVS perceptual adaptation to produce a projected or displayed image that appears as it would in a more optimal, well lit viewing environment.
  • the image processing may include correcting for low level lighting of the surrounding environment and/or indoor or outdoor ambient light added to the displayed image.
  • a method of producing a color image by an image rendering unit in a sub-optimal viewing environment comprising generating an at least three-dimensional look-up table of values of input colors and output colors, the table containing a transformation from a suboptimal viewing environment to an improved viewing environment; loading the at least three-dimensional look-up table into an image color rendering controller; loading the input image data into the image color rendering controller; processing the input image data through the at least three-dimensional look-up table using the input image data as addresses into the at least three-dimensional look-up table to produce output image data from the output color values stored at the addresses in the at least three-dimensional look-up table; and outputting the output image data to the image rendering unit.
  • This method may further include the various aspects and/or steps described above for the first method.
  • the improved viewing environment may be such that an observer may perceive the color image to have more color, contrast, or brightness.
  • a method of producing a color image by an image rendering unit comprising generating an at least three-dimensional look-up table of values of input colors and output colors, the three-dimensional look-up table containing the definition of secondary colors or more than three primary colors; loading the at least three-dimensional look-up table into an image color rendering controller; loading the input image data into the image color rendering controller; processing the input image data through the at least three-dimensional look-up table using the input image data as addresses into the at least three-dimensional look-up table to produce output image data from the output color values stored at the addresses in the at least three-dimensional look-up table; and outputting the output image data to the image rendering unit to produce an output image that is perceived to have at least one of enhanced brightness, enhanced contrast, and enhanced colorfulness compared to the input image.
  • This method may also include the various aspects and/or steps described above for the first method.
  • the secondary colors or more than three primary colors may be explicitly defined, or the secondary colors or more than three primary colors may be implied in the design of a three-in by three-out look-up table for two conditions.
  • measured responses of the image rendering unit may be used to define the three-dimensional look-up table, or mathematics provided by a manufacturer of the image rendering unit may be used to define the three-dimensional look-up table.
  • an open definition of how the secondary colors or more than three primary colors are used may be provided. This method may also include the various aspects and/or steps described above for the first method.
  • the problem of displaying or projecting an image that is optimal in human visual perceptual terms regardless of the ambient light and background environment of the image is solved by using visual models to enhance the perceived colorfulness, contrast, or brightness of the image, thereby improving the perceived quality of the image.
  • the visual models of human visual perception may be used to create look-up tables of at least three dimensions to process the image to be displayed. Memory colors of the image may be preserved.
  • the method may further include performing empirical visual studies to determine the dependence of the preference of colorfulness, contrast, or brightness on the ethnicities of the human observers, and defining the perceived quality of the image for each nationality of human observers.
  • the method may further include adjusting the colorfulness, contrast, or brightness of the image based upon one of the ethnicities of the human observers.
  • the method may further include generating an at least three-dimensional look-up table of values of input colors and output colors, the three-dimensional look-up table adjusting the colorfulness, contrast, or brightness of the image to match the enhanced appearance of analog film systems or digital systems designed for cinemas.
  • the method may further include adjusting the colorfulness, contrast, or brightness of the image to produce a chosen artistic perception in the image.
  • the OLED display manages the overall image quality and the lifespans and relative luminances of the red, green, and blue OLEDs in the display.
  • the method comprises providing input image data and providing the OLED display having at least three OLEDs, each OLED being of a different primary color; generating an at least three-dimensional look-up table of values of input colors and output colors, wherein the values in the lookup table convert the input image data to output image color data of the OLED display in a manner that optimally manages the quality of the image and the lifetime of the at least three OLEDs; loading the at least three-dimensional look-up table into an image color rendering controller; loading the input image data into the imaging color rendering controller; processing the input image data through the at least three-dimensional look-up table to produce output color values stored at the addresses in the at least three-dimensional look-up table; and outputting the output image data to produce the image by the OLED display.
  • the values in the look-up table may be calculated based upon a visual model of the human visual system. This method may further include the various aspects and/or steps described above for the first method.
  • the at least three OLEDs may be a red OLED, a green OLED, and a blue OLED.
  • managing the quality of the image and the lifetime of the OLEDs may further include adding a white primary and mapping predetermined amounts of the grey component of RGB pixel values to the white primary to reduce the usage of RGB and extend the life of the red, green, and blue OLEDs.
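  • A minimal sketch of the grey-component mapping just described; the 75% fraction is an illustrative assumption, and the sketch assumes the white primary matches the display white produced by equal R, G, and B drive in linear light.

```python
def rgb_to_rgbw(r, g, b, fraction=0.75):
    """Move part of the grey (achromatic) component of a linear RGB pixel to a
    white primary, reducing R, G, and B OLED usage. `fraction` is illustrative."""
    grey = min(r, g, b)            # common achromatic component of the pixel
    w = fraction * grey            # portion carried by the white OLED instead
    return r - w, g - w, b - w, w

print(rgb_to_rgbw(0.8, 0.6, 0.5))  # -> (0.425, 0.225, 0.125, 0.375)
```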
  • managing the quality of the image and the lifetime of the OLEDs may comprise adding other primary colors and mapping predetermined amounts of the RGB pixel values to the other primary colors to reduce the usage of RGB and extend the life of the red, green, and blue OLEDs.
  • the method may further comprise operating the at least three OLEDs such that a first OLED does not reach end of life sooner than the other OLEDs, and the image quality of each of the OLEDs is reduced about equally over time without perceived artifacts or appearances predominantly of one of the OLED colors.
  • the method may further comprise a controlled degradation of image quality due to changes in the outputs of at least one of the OLEDs, wherein the change in quality at any given point in time produces the least loss in perceived quality.
  • the controlled degradation may be tracked by accumulating and using usage data for all of the OLEDs.
  • the controlled degradation may be performed on the entire image over time, or on at least one portion of the image over time.
  • the controlled degradation may be performed by substantially maintaining the brightness of the image while gradually reducing color saturation of the image over time, or by reducing color saturation of the image to a greater extent in image pixels of low color saturation than in image pixels of high color saturation, or by substantially maintaining the brightness of the image while reducing color saturation gradually using adaptive one dimensional tables on each of the primary colors.
  • the one dimensional tables on each primary color may be calculated using a quality degradation model.
  • the quality degradation model may average among one dimensional tables that are pre-designed to provide the targeted image quality at specific OLED lifetimes.
  • the one dimensional tables may be produced by interpolation between a one dimensional table for when the OLEDs are initially operated and a one dimensional table for when the OLEDs are at the ends of their useful lifetimes.
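  • A sketch of that interpolation, assuming accumulated usage is tracked as a fraction of useful lifetime; the table shapes below are placeholders, not values from the patent.

```python
import numpy as np

def aged_table(table_new, table_end_of_life, life_used):
    """Blend the start-of-life and end-of-life 1D tables for one primary
    according to accumulated usage in [0, 1] (placeholder degradation model)."""
    life_used = float(np.clip(life_used, 0.0, 1.0))
    return (1.0 - life_used) * table_new + life_used * table_end_of_life

codes = np.arange(256) / 255.0
table_new = codes                         # identity when the OLEDs are new
table_end_of_life = 0.85 * codes ** 1.1   # illustrative reduced-drive target

blue_table_now = aged_table(table_new, table_end_of_life, life_used=0.4)
```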
  • the problem of achieving an expanded or maximum color gamut by temporally combining R, G, and B during an image frame duty cycle to increase brightness while maintaining saturated pure R, G, and B colors is solved by calculating the combinations of R, G, and B that maintain a physical or perceived input color in a given viewing environment thereby maintaining physical or perceived color saturation and achieving increased brightness.
  • the calculated combinations are implemented in a 3D look-up table.
  • the color image to be produced may contain "memory colors" as defined herein, and non-memory colors.
  • the methods may include identifying the memory colors in the input image data to be substantially maintained, characterizing the memory colors and non-memory colors with respect to their chromaticities in the image rendering unit, and producing an image comprising human visual system perceptually accurate memory colors using the image rendering unit.
  • the perceived colorfulness, brightness and contrast of the non-memory colors are increased more than perceived brightness and contrast of the memory colors.
  • generating the at least three-dimensional look-up table may include computing enhanced lightness, chroma, and hue for the memory colors using a sigmoidal enhancement function.
  • More than one at least three-dimensional look-up table may be generated for the color transformation of the non-memory colors and the memory colors. Some or all of the at least three dimensional look-up tables may be optimized for a different viewing environment of the image rendering unit. In such an instance, the method may further include selecting one of the at least three-dimensional look-up tables for loading into the image color rendering controller based upon the viewing environment of the image rendering unit.
  • a sensor may be provided for measuring the ambient light in the viewing environment.
  • the problem of displaying an image that simultaneously has high brightness and high colorfulness of a majority of colors (and particularly high saturation colors), while maintaining realistic "memory colors" is solved by adding white light or any combination of multiple R, G, B colors by combining R, G, and B for some portion of the duty cycle of the image projection time, according to a 3D look-up table, which replaces the lost colorfulness of adding color combinations and at the same time preserves flesh tones and other known memory colors.
  • the image data is processed with a 3D look-up table in a manner that increases the perceived colorfulness, brightness, and contrast while preserving flesh tones and other known memory colors.
  • the 3D look-up table is created to produce the improved image quality. Visual models may be used to perform the image processing.
  • the methods may further comprise converting the input image data of a first input color standard into an input color specification for inputting into the three-dimensional look-up table.
  • the solutions to the above problems may entail multi-dimensional look-up tables, with three dimensional look-up tables being one example.
  • the at least three dimensional lookup table may have three or more input colors and three or more output colors.
  • the output dimension may be different from the input dimension, such as having RGBCYMW (red-green-blue-cyan-yellow-magenta-white) output values in an RGB table, i.e. three values of input and seven values of output.
  • the number of outputs may also be greater than three due to the display having more than three physical colors, i.e., more than three primary colors such as R, G, and B. In such an instance, the output colors could therefore be the primary colors or combinations of the four or more colors.
  • the three or more than three output colors are any combination of primary colors as independent light sources or secondary colors defined as combinations of primary colors.
  • the at least three dimensional look-up table(s) may be losslessly compressed to reduce storage use in a memory of the image color rendering controller.
  • a method of displaying an image containing memory colors and saturated colors comprising identifying the memory colors in input image data to be substantially maintained, characterizing the memory colors with respect to their chromaticities, and generating a three-dimensional look-up table for a color transformation of saturated and memory colors.
  • the three-dimensional look-up table is loaded into an imaging device controller, and input image data is loaded into the imaging device controller.
  • the input image data is processed with an algorithm using the three-dimensional look-up table to produce output image data.
  • the output image data is output to an image rendering device, and a high brightness, high contrast image comprising human visual system perceptually accurate memory colors is displayed or projected.
  • the method includes preprocessing, wherein one dimensional tables and matrices are provided for converting the variety of possible input color standards into a preferred color input to the 3D or higher dimensional color look-up-table. This is done for the purpose of making a single or reduced number of 3D or higher dimensional color look-up-tables adaptable to different video standards.
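  • A sketch of such a preprocessing stage: per-channel 1D tables linearize the incoming standard and a 3 x 3 matrix rotates its primaries into the single color input expected by the higher-dimensional table. The transfer-function exponent and matrix values are placeholders, not taken from any particular standard.

```python
import numpy as np

decode_table = (np.arange(256) / 255.0) ** 2.4          # illustrative 1D de-gamma table
standard_to_lut_matrix = np.array([[0.90, 0.10, 0.00],  # placeholder primary
                                   [0.05, 0.90, 0.05],  # conversion matrix
                                   [0.00, 0.10, 0.90]])

def preprocess(rgb_codes):
    """Convert one input-standard RGB triple into the preferred LUT input space."""
    linear = decode_table[np.asarray(rgb_codes)]         # 1D tables, one per channel
    return standard_to_lut_matrix @ linear               # 3 x 3 matrix

print(preprocess([200, 128, 64]))
```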
  • the algorithm containing the 3D or higher dimensional mathematics is executed in real time by the central processing unit of a computer in the image display or projection device so that the need for a 3D color table is obviated. This may be done if the device computer is provided with adequate computational processing capability and memory.
  • the method includes incorporating the variety of possible input color standards directly into the creation of the 3D or higher dimensional color look-up-tables to adapt to different video standards.
  • the image rendering unit (such as, e.g., a display or projection device) is provided with some color modification capability that is "built in.”
  • the device may be provided with an algorithm to add white or secondary colors, resulting in a loss of colorfulness, and a distortion in the appearance of memory colors.
  • the output values in the at least three-dimensional lookup table are determined such that the input image data is processed to compensate for the color modification performed by the image rendering unit.
  • the method may thus include providing at least 3D color tables to adjust the color data in a manner that shifts it in a direction within the color space that compensates for the built in color modification that is performed by the image rendering unit.
  • the at least three-dimensional look-up table further comprises processing the input image data to increase perceived color, brightness, and contrast to compensate for the reduction in perceived color, brightness, and contrast caused by the algorithm for color modification in the image rendering unit.
  • the at least three-dimensional look-up table may further comprise compensating for the color modification performed by the addition of the secondary colors in the image rendering unit.
  • the values in the at least three dimensional lookup table may also be determined such that the at least three-dimensional look-up table further comprises processing the input image data to include chromatic adaptation of the human visual system to a specified white point that increases the brightness of the image rendering unit.
  • the at least three-dimensional look-up table may also adjust the colorfulness, contrast, or brightness of the image to be produced to match the enhanced appearance of analog film systems or digital systems designed for cinemas.
  • a device for producing a color image is comprised of a computer including a central processing unit and a memory in communication through a system bus.
  • the memory may be a random access memory, or a computer readable storage medium.
  • the memory contains an at least three dimensional lookup table.
  • the at least three dimensional lookup table contains values of input colors and output colors, wherein the values in the lookup table convert an input image color data set to output image color data in an image rendering unit that is connectable to the device.
  • the at least three dimensional lookup table may be produced by an algorithm for transforming input image data comprising memory colors and non-memory colors to a visual color space, and computing enhanced lightness, chroma, and hue for the memory colors and non-memory colors in the visual color space.
  • the algorithm to produce the three dimensional lookup table may be contained in the memory.
  • the at least three dimensional lookup table includes values of input colors and output colors, wherein the values in the lookup table convert a first color gamut of an input image data set to encompass a second expanded or reduced color gamut of an image rendering unit that is connectable to the device.
  • the at least three dimensional lookup table contains a transformation from a suboptimal viewing environment to an improved viewing environment including the visual and chromatic adaptation of the human visual system.
  • the at least three dimensional lookup table contains the definition of secondary colors, and enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of secondary colors by an image rendering unit that is connectable to the device.
  • the memory may contain a visual model to enhance the perceived colorfulness, contrast, or brightness of the image.
  • the device may further include the image rendering unit in communication with the computer.
  • the image rendering unit may be selected from a projector, a television, a computer display, and a game display, and may use DMD, plasma, liquid crystal, liquid crystal-on-silicon modulation (LCOS), or direct modulation of the light source and LED, organic light emitting diode (OLED), laser, or lamp light sources.
  • the device may further comprise an auxiliary imaging device including at least one of a cable TV set-top box, a video game console, a personal computer, a computer graphics card, a DVD player, a Blu-ray player, a broadcast station, an antenna, a satellite, a broadcast receiver and processor, and a digital cinema.
  • One of a liquid crystal display, a plasma display, and a DMD projector may be in communication with the auxiliary device.
  • the device may further comprise a communication link to a source of input image data.
  • the at least three-dimensional look-up table includes the definition of secondary colors, and contains enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of the secondary colors by the image rendering unit.
  • the at least three-dimensional lookup table may contain a transformation from a suboptimal viewing environment to an improved viewing environment including the visual and chromatic adaptation of the human visual system.
  • the memory of the device may contain a set of at least three dimensional lookup tables; each table of the set may be optimized for a different viewing environment of the image rendering unit.
  • the device may be provided with a sensor for measuring the ambient light in the viewing environment.
  • FIG. 2 is a schematic diagram of aspects of the instant method for processing input color data to produce output color data for rendering a color image;
  • FIG. 3 is a chromaticity diagram that depicts color gamuts of the DCI and HD709 standards, and color gamuts of various media and/or imaging devices;
  • FIG. 5 is a schematic diagram of a device for producing a color image;
  • FIG. 6 is a flowchart depicting the steps of one algorithm for generating a three-dimensional lookup table for the purposes of this invention;
  • FIG. 7 is a flowchart depicting one method for producing a color image in accordance with the present disclosure; and
  • FIG. 8 is a schematic diagram of one mathematical flowchart for producing a color image in accordance with the present invention, which includes color output calibration.
  • BT.709 - abbreviated reference to ITU Radiocommunication Sector (ITU-R) Recommendation BT.709, a standard for the format of high-definition television.
  • ITU-R - ITU Radiocommunication Sector.
  • Chromaticity - normalized CIE Tristimulus values often used to visualize the color gamuts of devices in a Chromaticity diagram, such as that shown in FIG 3.
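  • For reference (standard colorimetry, not specific to this patent), the chromaticity coordinates plotted in such a diagram are computed from the Tristimulus values X, Y, Z as x = X / (X + Y + Z) and y = Y / (X + Y + Z).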
  • CIECAM02 - the most recent color model adopted by the International Commission on Illumination, or Commission Internationale de l'Eclairage (CIE), published in 2002.
  • Color Space - A three-dimensional space in which each point corresponds to a color.
  • Colorfulness - Attribute of a visual perception according to which the perceived color of an area appears to be more or less chromatic.
  • Contrast - In the perceptual sense, assessment of the difference in appearance of two or more parts of a field seen simultaneously or successively.
  • DCI Standard - a color standard for digital cinema systems created by Digital Cinema Initiatives, LLC, a joint venture of major motion picture studios formed in 2002. The standard is included in the publication, "Digital Cinema System Specification," Version 1.2, approved by Digital Cinema Initiatives, LLC on March 7, 2008.
  • Color Gamut - The range of colors producible with a set of inks, lights, or other colorants.
  • a color gamut may be described in terms of a particular region of a color space.
  • Memory color - a color of an object in an image that an observer may consciously or unconsciously observe and judge as to whether the color of the object is accurate, based upon the observer's memory of previous experiences observing the object.
  • Examples of memory colors are flesh (human skin) tones, the green of grass, the blue of the sky, the yellow of a banana, the red of an apple, and grey scale.
  • the accurate rendering of colors associated with commercial products and registered trademarks, such as "Kodak yellow”, "IBM blue,” and "John Deere green” may be important to some viewers/users of images, and are also examples of memory colors. It is further noted that the perceived appearance of memory colors may be influenced by the context in which they are seen by an observer.
  • Primary colors - The colors of the individual light sources, including all color filters, that are used to create a color image in an image rendering unit.
  • Projector - An imaging device which forms an image by delivering and in some instances focusing light on a distant, separate surface such as a wall or screen.
  • RGBCYMW - when any of these capital letters are used in combination herein, they stand for red, green, blue, cyan, yellow, magenta, and white, respectively.
  • Rendering an image - providing an image for observation either via an image display that forms an image from discrete lighted elements at a surface thereof, or via an image projector that forms an image by delivering and in some instances focusing light on a distant, separate surface such as a wall or screen.
  • Tristimulus values - Amounts of the three reference color stimuli, in a given trichromatic system, required to match the color of a stimulus being considered.
  • White - a set of three values of primary colors, typically red, green, and blue, that may be added to a color in a portion of an image, thereby in effect adding white to the color to brighten the color.
  • a reference to a three dimensional lookup table or a 3DLUT is meant to indicate a table of at least three dimensions, unless otherwise indicated.
  • a lookup table may be multidimensional, i.e., it may have three or more input colors and three or more output colors.
  • FIG. 2 is an illustrative, two-dimensional schematic diagram depicting the full multi-dimensional capability of an at least three dimensional color table 54 used in processing input color data to produce output color data for rendering a color image.
  • the diagram 420 of FIG. 2 depicts only a 2D rendition of an at least 3D color table 54 of the present invention. Any point, and/or any region in the full color space can be changed independently.
  • the small squares 422 represent locations in the color space in which no change in color is made. These locations may be memory color locations, such as flesh tones.
  • regions 424 selective increases in contrast, colorfulness, and brightness may be made.
  • the larger squares 426 in these regions 424 represent locations where colorfulness, contrast, and brightness are increased.
  • Any local color or color region such as a flesh tone region, can be chosen for unique color processing.
  • a 3D color table may contain output values for every input RGB color, which for 12 bits per color would be 4096 x 4096 x 4096 independent colors, thereby providing 68.7 billion local color choices.
  • a 3D color table size can be reduced by using the most-significant bits of the input colors to define the 3D color table locations and performing multi-linear or other multidimensional interpolation using the least-significant bits of the input colors.
  • squares 422 and 426 are meant to indicate various color regions; the borders of the squares are not meant to indicate sharply defined boundaries of such regions. As described previously, these regions may be modeled using a probability distribution that provides a smooth transition from regions in the color space that are outside of the regions defined by the squares.
  • the various regions may be defined by Gaussian boundaries that are smoothly connected by probability functions.
  • volume derivatives may be used that displace the color (R,G,B) vectors in different amounts.
  • in memory color regions, the color vectors have a lesser displacement, or possibly none at all, while other color regions have larger displacements to increase their contrast, colorfulness, and brightness.
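  • A hedged sketch of such smoothly varying displacement: a Gaussian weight protects an illustrative flesh-tone region (the center, width, and gain below are assumptions, not values from the patent) while other colors are pushed away from neutral to gain contrast and colorfulness.

```python
import numpy as np

def displace(rgb, memory_center=(0.78, 0.57, 0.44), sigma=0.12, max_gain=1.25):
    """Displace a linear RGB color vector away from neutral, except inside a
    Gaussian-weighted memory-color region that is left essentially unchanged."""
    rgb = np.asarray(rgb, dtype=float)
    center = np.asarray(memory_center, dtype=float)
    protect = np.exp(-np.sum((rgb - center) ** 2) / (2.0 * sigma ** 2))  # ~1 near center
    gain = 1.0 + (max_gain - 1.0) * (1.0 - protect)
    grey = rgb.mean()
    return grey + gain * (rgb - grey)

print(displace((0.78, 0.57, 0.44)))   # memory color: essentially unchanged
print(displace((0.10, 0.60, 0.20)))   # other colors: contrast and chroma increased
```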
  • the full table may be very large. For example, a large table results if the input color is 24-bit (i.e. 8 bits each for R, G, and B), and the output includes white and is 32 bit (i.e. 8 bits each for R, G, B, and W).
  • this large 3D LUT 54 may be used if the memory 36 of the image color rendering controller is sufficiently large, and results in the fastest color processing.
  • multi-dimensional interpolation may be used to reduce the size of the 3D LUT 54.
  • bits 3 through 8 may be used to define and address the 3D LUT 54.
  • Multi-dimensional interpolation may then be used with bits 1 and 2 to define the output colors that occur between the output colors associated with the 8 vertices of the cube in the 3D LUT 54 defined by bits 3 through 8.
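  • A sketch of that reduced table using an identity mapping: the six most-significant bits select a coarse lattice cell and the two least-significant bits drive trilinear interpolation between its eight corner outputs. The table contents and sizes are placeholders.

```python
import numpy as np

cells = 64                                   # 6 MSBs -> 64 cells per axis
points = cells + 1                           # one extra lattice point to cover code 255
axis = np.linspace(0.0, 255.0, points)
lut = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)  # identity table

def lookup(r, g, b):
    """Trilinear interpolation inside the coarse 3D LUT."""
    pos = np.array([r, g, b], dtype=float) / 255.0 * cells
    base = np.minimum(np.floor(pos).astype(int), cells - 1)
    frac = pos - base                        # the LSB-driven fractional part
    out = np.zeros(3)
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((frac[0] if dr else 1.0 - frac[0]) *
                     (frac[1] if dg else 1.0 - frac[1]) *
                     (frac[2] if db else 1.0 - frac[2]))
                out += w * lut[base[0] + dr, base[1] + dg, base[2] + db]
    return out

print(np.round(lookup(200, 17, 250), 1))     # ~ (200, 17, 250) for the identity table
```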
  • the color gamut of an image rendering unit is defined by the maximum colors that can be produced by that image rendering unit with combinations of its primary colors.
  • FIG. 3 shows the color gamuts of various image rendering technologies compared to the CCIR709 color standard 404 and the DCI color standard 402.
  • FIG. 3 shows that displays such as LED projectors (gamut 406), OLED displays (gamut 408), Digital Cinema projectors (gamut 410) and televisions with more than 3 primary colors (gamut 412) have larger color gamuts than the CCIR709 color standard (gamut 404) for digital media distribution, thus illustrating the need to map the smaller CCIR709 color standard to the larger color gamut of these display types.
  • FIG. 3 also shows that the DCI "Hollywood" color standard is significantly larger than the color gamut 414 of an infinite set of lasers, and therefore larger than any possible display or image rendering unit, thus illustrating the need to map the larger input to the smaller color gamut of any display type including a professional digital cinema projector.
  • FIG. 4 is a perspective view of a three-dimensional CIECAM02 J L* a* b* opponent color space 10 depicting a series of color gamuts of an image display, projector, or television in which the gamuts have been sequentially reduced by the addition of white to the R, G, and B primary colors thereof.
  • the outer (coarsest squares) color gamut 12 is the color gamut of one exemplary image projector having its primary colors produced by red, green, and blue LEDs.
  • the wire frame color gamut 11 represents the CCIR709 video color standard.
  • the solids with successively finer squares, 14, 16, 18, and 20, represent the color gamuts resulting from the addition of 6.25%, 12.5%, 25%, and 50% white, respectively.
  • 2D projections of the color gamuts 11 - 20 are provided on the a*b* plane as respective closed curves 11A - 20A.
  • the color gamut 12/12A of the LED primaries has no added white. It can be seen in general from the 3D perspective renditions and the 2D projections that the addition of white always reduces the color gamut of the image device.
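  • The desaturating effect of added white can be seen numerically in the sketch below, which uses illustrative (not measured) XYZ values for red, green, and blue LED primaries and adds the same white fractions shown in FIG. 4 to the red primary.

```python
import numpy as np

def chromaticity(XYZ):
    """Normalized (x, y) chromaticity coordinates of a tristimulus vector."""
    X, Y, Z = XYZ
    total = X + Y + Z
    return round(X / total, 3), round(Y / total, 3)

# Illustrative XYZ tristimulus values of LED primaries at full drive (assumed).
R = np.array([30.0, 15.0, 1.0])
G = np.array([20.0, 60.0, 8.0])
B = np.array([10.0, 5.0, 55.0])
white = R + G + B                       # "white" formed by combining all three

for fraction in (0.0, 0.0625, 0.125, 0.25, 0.5):
    # Adding white pulls the red primary's chromaticity toward the white point,
    # shrinking the gamut just as the nested solids in FIG. 4 illustrate.
    print(fraction, chromaticity(R + fraction * white))
```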
  • visual models of visual perception by the human visual system are used in determining the optimum amount of white to add to the colors of the image.
  • the perceived colorfulness, contrast, and/or brightness of the image are enhanced, thereby improving the perceived quality of the image.
  • the visual models of human visual perception may be used to create look-up tables of at least three dimensions to process the image to be displayed.
  • the methods of the present invention may include performing empirical visual studies to determine the dependence of preference of colorfulness, contrast, or brightness on the ethnicities of the human observers, and defining the perceived quality of the image for each nationality of human observers.
  • the colorfulness, contrast, or brightness of the image may be adjusted based upon the preferences of one of the ethnicities of the human observers.
  • FIG. 5 is a schematic diagram of a device for producing a color image, which may be observed by a human observer.
  • the imaging device may include an image rendering unit such as e.g., a television, a display, a projector, or another unit.
  • the imaging device 30 may include an image color rendering controller 32 or computer 32 or other processor comprising a central processing unit 34 and a memory 36.
  • the controller 32 may include a computer readable storage medium 38 such as a hard disk. These components are in communication through a system bus 39.
  • the imaging device 30 may process input image data that is stored on the storage medium 38, or the imaging device 30 may receive input image data from an external device or source 50.
  • the external source 50 may comprise an Internet connection or other network or telecommunications connection, such that the input image data is transmitted through such connection.
  • the imaging device 30 may be adapted to a system for displaying or projecting an image in a variety of ways, depending upon the particular application.
  • the imaging device 30 may be provided as an integrated system comprising the controller 32 and the image rendering unit (display or projector) 40, which only needs to be connected to a source 50 of image input data.
  • the imaging device 30 may be separate from the image rendering unit 40, and in communication with the image rendering unit 40 through a network or telecommunications connection as described above.
  • the imaging device 30 may be provided comprising the image color rendering controller 32, a first port (not shown) for connection to a source 50 of image input data, and a second port (not shown) for connection to the image rendering unit 40.
  • This configuration is particularly useful for retrofitting to projection or flat screen televisions that receive signals via a cable that is connected to a broadcast source of image input data (e.g., "cable TV programming").
  • the cable carrying input image data 50 could be disconnected from the image rendering unit 40, and the imaging device 30 could be placed in line between them to perform the image processing of the present invention.
  • the imaging device 30 may be in communication with, or integrated into an auxiliary device 60 or auxiliary imaging device controller 60, which is in communication with the image rendering unit 40.
  • the imaging device controller 60 may be, without limitation, an audio/video processor, a cable TV set-top box, a video game console, a personal computer (PC), a computer graphics card of a PC, or a DVD or Blu-ray player.
  • the imaging device 30 may be integrated into the electronics and processing components of a broadcast station, a broadcast antenna, receiver or processor, or a digital cinema theatre.
  • the device 30 may be integrated into the hardware and software of media creation, preparation, and production equipment, such as equipment used in the production of DVDs of movies and television programs, or the production of digital cinema for distribution to theaters. Broadcast stations, digital cinema theaters, and media production equipment may all comprise an auxiliary imaging device controller 60.
  • the memory 36 of the device 30 may contain a set of at least three dimensional lookup tables 54; each table of the set may be optimized for a different viewing environment of the image rendering unit 40.
  • the device 30 may be provided with a sensor 70 for measuring the ambient light in the viewing environment of the image rendering unit 40, or in the case of a projector 46 or 48, in the viewing environment of the projected image.
  • the memory 36 may contain a visual model of the perception of the human visual system that may be used to enhance the perceived colorfulness, contrast, or brightness of the produced image.
  • FIG. 6 is a flowchart depicting an algorithm for generating a three-dimensional lookup table to improve the perceived colorfulness, contrast or brightness in non-memory colors, while preserving to a higher degree the color accuracy of memory colors.
  • the RGB input values of the input image data are "reverse gamma" corrected to compensate for the non-linearity of this data, thereby producing linearized scalar RGB values.
  • the original input data is supplied with the expectation that it will be used in a display or projector that may have a gamma value of about 2.2, for example.
  • the matrix product of the scalar RGB values and the projector matrix is taken to express the input image data as CIE XYZ tristimulus values.
  • the tristimulus values are converted to a visual color space.
  • the transformation to a visual color space enables perceptual modeling to be performed, which characterizes the interdependencies of color, contrast, and brightness, and allows the perception of memory colors to be preserved.
  • the visual color space may be an opponent color space that accurately models constant perceived hue, and has the dimensions of lightness, yellow-blue, and red-green.
  • in operation 140, the appearance attributes of lightness, chroma, and hue predicted in the visual color space are computed.
  • in operation 150, the enhanced lightness, chroma, and hue for the colors to be rendered are computed. Operation 150 may include steps 152, 154, and 156 for maintaining memory colors in the rendering of the image.
  • operation 150 of the method 100 may include steps 152, 154, and 156. More specifically, the method 100 may include the step 152 of identifying the memory colors in the input image data 50 to be substantially maintained. This may be done based on intuition and experience and/or market research data. It is known that observers of an image depicting human subject matter (such as a movie or television program) will find it objectionable if the colors of the skin, and faces in particular, of the humans in the image do not match those colors that they have in their respective memories of how the humans should look. They will perceive the humans as "not looking right" if they are too pink, orange, dark, light, etc.
  • the memory colors are characterized with respect to their chromaticities in step 154 from both empirical data and the perceptual context in which they are seen. For instance, it is well understood that humans remember green grass and blue sky as more saturated than the actual stimuli. And, within reason, no matter the color of an illuminant, humans will remember a banana to appear to be a certain yellow (which may also be a memory color). Furthermore, these memory colors are not distributed across the extent of perceptual color in any systematic way. Hence, their representations must necessarily be made in a multivariant, three dimensional, statistical sense and their rendering accomplished in a purely appearance or vision based color space. Algorithms may be employed using visual mathematics which ensure that the memory colors are specified in terms of perceived colors.
  • the enhanced lightness, chroma, and hue for non-memory colors and memory colors are also computed.
  • a given memory color is not a single point within the space.
  • memory colors are regions within the color space that are to be left at least perceptually unchanged, or much less changed during the color transformations of the instant methods to produce enhanced images.
  • the memory color "flesh tone" is a range of colors corresponding to the colors of very dark-skinned peoples of African ethnicity to very light skinned Caucasians or Asians. Accordingly, the memory colors are identified and characterized such that the colors within this region will be left unchanged or minimally changed in the color transformations.
  • these memory colors may be characterized as not having rigid, discrete boundaries; this may be done so that in the color transformations to be performed, there is not a discontinuity in the degree of color change at a boundary of a memory color, as explained previously with reference to FIG. 2.
  • the memory color may be modeled using a probability distribution that provides a smooth transition from regions in the color space that are non-memory colors to the region defined as the particular memory color. Any smoothing function that changes the local multi-dimensional derivatives smoothly will be satisfactory.
  • the probability distribution may use non-linear enhancement functions.
  • An exemplary overall nonlinear function that may be used is a sigmoidal enhancement function, as described further in the disclosure herein.
  • the enhanced lightness, chroma, and hue of the visual color space are converted to enhanced CIE XYZ tristimulus values.
  • the enhanced CIE XYZ tristimulus values are converted to enhanced RGB scalar values with "white channel.”
  • gamma correction of the enhanced RGB scalar values is performed to produce a 3DLUT containing enhanced RGB values with white channel. The 3DLUT may then be used in the method 200 of FIG. 7.
  • FIG. 6 concludes with a simple statement of the net effect of the operations 110 - 180.
  • the 3DLUT, which is of at least three dimensions, is created as a discrete sampling of the visual model and contrast/color/brightness HVS perceptual improvement mathematics, and may include preservation of memory colors.
  • the at least 3DLUT 54 may be generated by the CPU 34 of the imaging system 30 according to an algorithm 52 stored in memory 36 or on the readable storage medium 38. Alternatively, the at least 3DLUT 54 may be generated by another computing system and uploaded to the system computer 32.
  • the algorithm 52 of FIG. 5 for generating the at least 3DLUT 54 may be algorithm 100 of FIG. 6.
  • FIG. 7 is a flowchart depicting one method for rendering a color image in accordance with the present disclosure. The method may be performed using the imaging system 30 depicted in FIG. 5.
  • the 3DLUT 54 which may be produced according to the algorithm 100 of FIG. 6, is loaded into the memory 36 or the readable storage medium 38 of the imaging device 30.
  • the input image data from the source 50 is communicated to the CPU 34.
  • the input image data may be of a first input color standard, and may be converted into an input color specification for inputting into the at least three-dimensional look-up table.
  • the input image data is processed with an algorithm 56 that may be stored in memory 36, using the at least three-dimensional look-up table 54 to produce rendered image data.
  • the rendered image data is output to the image display/projection device 40, and a high brightness, high contrast, and high colorfulness image is displayed or projected in step 250.
  • the image may include human visual system perceptually accurate memory colors.
  • the method 200 may be repeatedly performed at a high rate on sequences of image input data, such as at the rate of 24 or 48 "frames per second" used in digital cinema, or such as at the rate of 30, 60, 120 or 240 frames per second used in consumer displays.
  • the 3DLUT 54 of input colors and output colors may contain, or the values therein may be determined from, the definition of secondary colors, and enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of secondary colors by the image rendering unit 40.
  • the 3DLUT 54 of input colors and output colors may contain, or be determined from, a transformation from a suboptimal viewing environment to an improved viewing environment including the visual adaptation of the human visual system.
  • the method may include providing input image data 50 of a first color gamut, and an image rendering unit 40 having a second, expanded or reduced color gamut.
  • the 3DLUT 54 of values of input colors and output colors is generated, wherein the values in the 3DLUT 54 are calculated based upon a visual model of the human visual system, thereby expanding the input image data 50 to encompass the second color gamut of the image rendering unit 40.
  • the image rendering unit 40 may be provided with some color modification capability that is built in or embedded in hardware or software.
  • the device may be provided with an algorithm to add white or secondary colors, resulting in a loss of colorfulness, and a distortion in the appearance of memory colors.
  • the output values in the 3DLUT 54 are determined such that the input image data 50 is processed to compensate for the color modification performed by the image rendering unit 40.
  • the method may thus include providing the 3DLUT 54 to adjust the color data in a manner that shifts it in a direction within the color space that compensates for the embedded color modification that is performed by the image rendering unit 40.
  • the 3DLUT 54 further comprises processing the input image data to increase perceived color, brightness, and contrast to compensate for the reduction in perceived color, brightness, and contrast caused by the algorithm for color modification in the image rendering unit 40.
  • the 3DLUT 54 may further comprise compensating for the color modification performed by the addition of the secondary colors in the image rendering unit 40.
  • the values in the 3DLUT 54 may also be determined such that the 3DLUT 54 further comprises processing the input image data 50 to include chromatic adaptation of the human visual system to a specified white point that increases the brightness of the image rendering unit 40.
  • the image rendering unit 40 may unintentionally contain some color modification capability resulting from variation in one or more parameters of the unit 40.
  • when the image rendering unit 40 is an OLED display, color modification may occur due to the differing life spans between the blue OLED and the red and green OLEDs of the display, as described previously herein.
  • the differential color change between the blue OLED and the red and green OLEDs will change the color balance of the display if no countermeasures are instituted.
  • the output values in the 3DLUT 54 may be determined such that the input image data 50 is processed to compensate for the predicted decrease in luminance of the blue OLED.
  • the method may thus include providing the 3DLUT 54 to adjust the color data in a manner that shifts it in a direction within the color space that compensates for decreasing blue OLED luminance.
  • the 3DLUT 54 further comprises processing the input image data to increase perceived color, brightness, and contrast to compensate for the reduction in perceived color, brightness, and contrast caused by the continual loss of blue OLED luminance.
  • the 3DLUT 54 may also adjust the colorfulness, contrast, or brightness of the image to be produced to appear as it would in an image from an analog film system or digital system used in cinemas. It is known that film is generally not designed to reproduce color as the eye sees it at the filming site. (A color gamut 416 for film is shown in FIG. 3.) Instead, the colors in film images have increased contrast and increased colorfulness in anticipation of the viewing environment in which the film images will be observed. It is also known that digital systems aim to match the look of film images. Accordingly, the 3DLUT may be designed to provide the same effect in a cinema.
  • the production of the 3D LUT 54 is not limited only to the algorithm 100 of FIG. 6.
  • Bit depth modification and interpolation as described herein may also be applied to all of the applications herein which include the use of 3DLUTs.
  • the 3DLUT may vary in bit depth, depending upon the capacity of the memory 36 and the processing power of the CPU 34.
  • the 3DLUT may be a twelve bit table with 4096 x 4096 x 4096 discrete addresses containing three or more color values of predetermined bit precision.
  • some bits of the table may be used for interpolation between adjacent values. For example, the final two bits of respective adjacent table values may be used in interpolating colors between them.
  • Other methods of multi-dimensional interpolation are known, and are included in embodiments of implementing the 3DLUT.
  • the input data may contain more than the three primary colors R, G, and B.
  • the 3DLUT could have outputs of RGBCMYW.
  • the algorithm 100 may be used to produce more than one 3DLUT.
  • One factor that may be used to determine the values in the 3DLUT is the set of characteristics of the display or projection device.
  • different 3DLUTs 54 may be produced for different image output devices, for example, an LCD display 42, a lamp-and-color-wheel DMD projector 44, and an LED DMD projector 46.
  • the characteristics of the display or projection device 40 include the "color engine" of the device, and whether it includes only RGB as the primary colors, or has more than three colors.
  • the 3DLUTs 54 may be losslessly compressed to reduce storage use in the memory 36 of the image color rendering controller 32.
  • the visual adaptation transformation implemented in the 3DLUT 54 uses visual adaptation models to produce the effect of improved viewing environment.
  • Other factors in generating the 3DLUT 54 may include a knowledge of the different sensitivities to colorfulness in different worldwide regions, or the intended use of the displayed/projected images; for example, whether the images are viewed in a video game that is being played, or viewed as a movie or television program.
  • the image device 30 may include a keyboard (not shown) or other input device to access a user interface (not shown) that may be displayed on the display or projector 40 (or other user interface screen).
  • the user interface may offer the capability of inputting data on the viewing environment factors 58, and/or other factors such that the optimum 3DLUT is selected from the stored 3DLUTs 54 for the particular display or projector 40 and viewing environment. In that manner, the most perceptually optimal images are provided to the user by the system 30.
  • the 3DLUTs 54 are effective for the enhancement of a variety of images, including but not limited to games, movies, or personal photos. Additionally, some improvement of grey scale images is attained by the resulting contrast and brightness enhancement thereof.
  • the 3DLUT 54 may be produced according to variants of the method 200 such that it has additional or alternative characteristics.
  • the values in the 3DLUT 54 may be provided to convert a first color gamut of an input image data set 50 to encompass a second expanded or reduced color gamut of an image rendering unit 40 that is connectable to the device 30.
  • the 3DLUT 54 may contain a transformation from a suboptimal viewing environment to an improved viewing environment in which the color image is to be observed, including the visual and chromatic adaptation of the human visual system.
  • the 3DLUT 54 may contain the definition of secondary colors, and enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of secondary colors by an image rendering unit 40 that is connectable to the device 30.
  • the methods of producing a color image may include input color standard transformation and color output calibration of the image rendering device that is in use.
  • FIG. 8 is a schematic diagram of an alternative method 300 for producing a color image, which includes such color output calibration.
  • the diagram includes color output calibration operations 350, 360, and 370; however, for the sake of clarity, the entire method depicted in FIG. 8 will be described, with reference also to FIGS. 6 and 7 (a compact sketch of this processing chain is provided at the end of this list).
  • the input values of R, G, and B are reverse gamma corrected to compensate for the non-linearity of this input data standard, thereby producing linearized scalar values Ri, Gi, and Bi.
  • This correction may be done using the respective one-dimensional lookup tables 311, 312, and 313.
  • the input values of R, G, and B may be between 8 and 12 bits (314 in FIG. 8) inclusive.
  • the output values of Ri, Gi, and Bi may have 16-bit resolution (315 in FIG. 8), depending upon the architecture of the image color rendering controller 32, and also upon the need for the greater bit depth of the imaging standards being used.
  • the input R, G, and B values may be provided from various devices, such as a video camera having an output in accordance with standard BT.709. In such circumstances, the value of gamma used in the correction may be 2.2.
  • the input R, G, and B values may be provided in accordance with other imaging standards, and other values of gamma and other 1D lookup tables 311, 312, and 313 may consequently be used in the reverse gamma correction as needed.
  • every color value in the image data stream 319 represented by a unique Ri, Gi, and Bi combination is then operated on by a 3 x 3 matrix determined by the particular imaging standard being used, to perform a color transformation to Rii, Gii, and Bii values that are linearized scalar values referenced to the standard BT.709.
  • the Rii, Gii, and Bii values may be provided with a bit resolution of up to 16 bits as indicated in FIG. 8.
  • the values of Rii, Gii, and Bii are gamma encoded to re-introduce a non-linearity into the processed data, thereby producing gamma encoded values Riii, Giii, and Biii for input to the 3D color tables.
  • This encoding may be done using the respective one-dimensional lookup tables 331, 332, and 333, using a gamma encoding factor of 1/2.2, in one embodiment. Other factors may be suitable, depending upon the particular imaging standards being used.
  • the resulting values of Riii, Giii, and Biii may be reduced to 10-bit resolution as shown in FIG. 8, to enable sufficiently fast subsequent processing using the 3D color tables 54.
  • the gamma encoding enables a reduction in the number of bits from 16 for linear data to much less for gamma encoded data, such as 10 bits, without artifacts. This makes the at least 3D table much smaller. It is effective to use fewer gamma encoded bits because the eye sees image data in a manner analogous to a gamma encoder.
  • the three-dimensional color tables 54 are used to process the RiiiGiiiBiii data to produce output image RivGivBivWiv data for display or projection.
  • the table 54 is 3D in (RGB) and 4D out (RGBW).
  • Other table structures of at least three dimensions may be used, depending upon the particular application. Additionally, for the sake of simplicity of illustration, there is only one table 54 shown in FIG. 8.
  • the white could be for an OLED display, or the signal that drives the combination of RGB to make the image rendering device brighter.
  • the white could be replaced with cyan, or some other color in a four-color image rendering device, such as a four-color TV.
  • the RivGivBivWiv data may be provided at a 12-bit resolution as indicated in FIG. 8.
  • the RivGivBivWiv data, including the addition of white for increased brightness or color management of OLED devices, may represent a generic display with typical color primaries and linearity. Additionally, however, further operations may be performed to further optimize the RivGivBivWiv data by calibration for the particular image rendering unit (display or projector) 40 that is in use. The measurement or specification of this particular image rendering unit 40 can be done in manufacturing or done on-site by a technician with conventional linearity and primary color measuring tools.
  • the RivGivBivWiv data is first reverse gamma-corrected to produce RvGvBvWv data.
  • This correction may be done using the respective one-dimensional lookup tables 351, 352, 353, and 354.
  • the output values of Rv, Gv, Bv, and Wv may have 16 bits.
  • the value of gamma used in the correction may be 2.2, or another value in accordance with the gamma encoder 310.
  • every color value in the image data stream 359 represented by a unique Rv, Gv, Bv and, in many cases, Wv combination is then operated on by a 4 x 4 matrix.
  • This 4 x 4 matrix is produced for and is unique to the particular image rendering unit 40 of FIG. 5 that is in service.
  • the matrix is calculated from measured or specified values that define the color primaries of the particular image rendering unit 40.
  • the purpose of the operation is to convert from the assumed or generic color primaries in the at least 3D color table to the actual ones in the image rendering unit 40.
  • the visual effect is to adjust for white and the rest of the colors so they are not "tinted" (e.g., a little yellow or blue), because the image rendering unit may have slightly different color primaries than were assumed in creating the at least 3D table.
  • those assumptions are in accordance with the aforementioned BT.709 standard, because most TVs, displays, and projectors adhere to this standard.
  • a given image rendering device may, however, be tinted, e.g., more yellow, so the calibration matrix compensates for that variation.
  • the Rvi, Gvi, Bvi, and Wvi values may be provided with a bit resolution of up to 16 bits.
  • the Rvi, Gvi, Bvi, and Wvi values are gamma encoded to introduce the correct non-linearity into the processed data for the image rendering unit 40, thereby producing the Rvii, Gvii, Bvii, and Wvii values that, when used by the particular image rendering unit 40 to project or display the image, produce the chosen non-linearity defined by the 3D table.
  • This encoding may be done using the respective one-dimensional lookup tables 371, 372, 373, and 374. In one embodiment, a gamma encoding factor of 1/2.2 may be used. Other factors may be suitable, depending upon the particular image rendering unit 40.
  • the resulting values of Rvii, Gvii, Bvii, and Wvii may be output having between 8- and 12-bit resolution as indicated in FIG. 8.
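By way of illustration only, the processing chain of FIG. 8 described in the items above can be summarized in the following Python sketch: reverse gamma through the one-dimensional tables 311 - 313, a 3 x 3 standards matrix, gamma encoding through the tables 331 - 333, the at least three-dimensional table 54 with RGB in and RGBW out, and device calibration through the tables 351 - 354, a 4 x 4 matrix, and the final gamma encoding tables 371 - 374. The gamma of 2.2, the identity matrices, the 17-point grid, and the toy rule deriving W from min(R, G, B) are assumptions made so that the sketch runs; they are not the calibrated values taught by the specification.

    import numpy as np

    # Illustrative placeholders only; not the calibrated data of the specification.
    GAMMA = 2.2
    N = 17                                   # grid points per axis in the 3D table

    def reverse_gamma(values):               # stands in for the 1D tables 311-313 / 351-354
        return np.power(np.clip(values, 0.0, 1.0), GAMMA)

    def gamma_encode(values):                # stands in for the 1D tables 331-333 / 371-374
        return np.power(np.clip(values, 0.0, 1.0), 1.0 / GAMMA)

    M_STD = np.eye(3)                        # placeholder 3 x 3 standards matrix
    M_CAL = np.eye(4)                        # placeholder 4 x 4 device calibration matrix

    # Table 54: RGB in, RGBW out; W = min(R, G, B) is a toy rule for the sketch.
    grid = np.linspace(0.0, 1.0, N)
    R, G, B = np.meshgrid(grid, grid, grid, indexing="ij")
    TABLE_54 = np.stack([R, G, B, np.minimum(np.minimum(R, G), B)], axis=-1)

    def table_lookup(rgb):                   # nearest-grid-point lookup for brevity
        idx = np.rint(rgb * (N - 1)).astype(int)
        return TABLE_54[idx[0], idx[1], idx[2]]

    def process_pixel(rgb_in):
        linear = reverse_gamma(np.asarray(rgb_in, dtype=float))
        referenced = linear @ M_STD.T        # color transformation to the reference standard
        addressed = gamma_encode(referenced) # compact addressing of the 3D table
        rgbw = table_lookup(addressed)
        calibrated = reverse_gamma(rgbw) @ M_CAL.T
        return gamma_encode(calibrated)      # drive values for the image rendering unit 40

    print(process_pixel([0.8, 0.5, 0.3]))    # one input color through the full chain

In a real implementation, the table 54 would be populated from an algorithm such as the algorithm 100 of FIG. 6, and the 4 x 4 matrix would be calculated from measured or specified primaries of the particular image rendering unit 40.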

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Facsimile Image Signal Circuits (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Color Image Communication Systems (AREA)
  • Processing Of Color Television Signals (AREA)
  • Control Of El Displays (AREA)
  • Electroluminescent Light Sources (AREA)
  • Liquid Crystal Display Device Control (AREA)

Abstract

A method of producing a color image comprising providing input image data from an image source such as a camera; generating an at least three-dimensional look-up table of values of input colors and output colors, wherein the values in the lookup table convert the input image color data to output image color data in an image rendering unit; loading the at least three-dimensional look-up table into an image color rendering controller; loading the input image data into the imaging color rendering controller; processing the input image data through the at least three-dimensional look-up table to produce output color values stored at the addresses in the at least three-dimensional look-up table; and outputting the output color values to the image rendering unit to produce an output image that is perceived to have at least one of enhanced brightness, enhanced contrast, or enhanced colorfulness compared to the input image.

Description

METHOD FOR PRODUCING A COLOR IMAGE
AND IMAGING DEVICE EMPLOYING SAME
TECHNICAL FIELD
Processing and projection or display of color images on surfaces, on televisions, on game displays, on computers or by other electronic display media.
BACKGROUND ART
The projection and/or display of color images is an active area of commercial research and development. New image display, television, games, computers and projection products and viewing experiences are being launched in the marketplace on a regular basis. In one aspect of the marketplace, digital cinema or video projector technology that utilizes colored light emitting diodes (LEDs) as the source of the primary colors for imaging, offers the promise of extreme, wide color gamut along with very long life, low heat illumination. LED brightness is currently limited, however, requiring three optical systems and three image modulators, i.e., one for each of the red, green, and blue (RGB) color channels, for the brightest images. Current projector lamp technology is of higher brightness and can take advantage of single optical systems and single image modulators using complex color filter wheels to provide full color display. In a second aspect of the marketplace, televisions, game displays and computer displays such as liquid crystal displays (LCDs) are now being introduced with LEDs as the backlit light source to again take advantage of the extreme, wide color gamut, long life and low heat output of LEDs. In a third aspect of the marketplace, projectors, televisions, game displays and computer displays are being introduced with more than the typical three (RGB) colors to improve brightness and expand the color gamut. Such products offer the promise and technical challenge of how to best use the wide color gamut.
In a color image projector, in order to gain the advantage of the available wide color gamut, longer life, and lower heat of LED illumination, and to achieve maximum brightness with a single optical system and single image modulator, the multiple RGB channels may be combined for some portion of time during image frames. Adding these multiple RGB channels during an image frame duty cycle will increase the brightness, but will also reduce the colorfulness by desaturating the pure RGB colors.
Furthermore, in prior art projectors, color rendering is accomplished by processing each of the RGB channels independently with matrix operators or with one- dimensional color look-up tables. In some projectors, the RGB colors and the combinations of two and three colors may be independently controlled. However, such control does not provide full three-dimensional color processing. With these limited processing options, it is not possible to display images optimally in human visual system (HVS) perceptual terms. For example, it is not possible to render visual lightness contrast without affecting either or both of hue and chroma. Achieving optimal visual processing that provides the brightest, most colorful images, while preserving perceived color accuracy requires three-dimensional color processing.
In providing any color image for viewing by a human observer, whether it is an image printed on a substrate, an electronic display, television, or a projection onto a viewing surface, the perception of color stimuli by the human observer is dependent upon a number of factors. In the International Lighting Vocabulary published in 1987 by the Commission Internationale de I'eclairage (CIE), it is noted as follows: "Perceived color depends upon the spectral distribution of the color stimulus, on the size, shape, structure, and surround of the stimulus area, on the state of adaptation of the observer's visual system, and on the observer's experience of the prevailing and similar situations of observations."
Moreover, in a treatise on the stained glass windows at the cathedral at Chartres, The Radiance of Chartres: Studies in the Early Stained Glass of the Cathedral, (Columbia University Studies in Art History and Archaeology, No. 4), Random House, 1st Ed., 1965, author James Rosser Johnson wrote that, "... the experience of seeing these windows ... is a very complicated experience ... that spans many aspects of perception." Yet fundamentally, "... when the spectator enters the Cathedral from the bright sunlight, ... the visitor must step with caution until his eyes have made a partial dark adaptation ... then the details of the interior will seem lighter and clearer while, at the same time, the [stained-glass] windows become richer and more intense."
Adaptation plays a powerful role in the instance depicted in Johnson's narrative. By adapting to the darkness or lower, perceived diffuse white of the cathedral's interior, the colors of the windows appear exceedingly brilliant, invoking a perception, in the words of Vincent Scully, Architecture, The Natural and Manmade, St. Martin's Press, 1991 , that, "... transcend[s] the statics of the building masses, the realities of this world ... [creating] a world of illusion, shaped by and for the heavenly light of the enormous stained glass windows." While such a perceptual experience is certainly complex and affected by the many characteristics of the human visual system (HVS), the richness of it is largely and simply made possible by the broad extent of sensitivity of the HVS and its innate ability to adapt to its surround.
The HVS is capable of adapting to an incredible range of luminance. For example, the HVS may adapt its light sensitivity over a range of about eight orders of magnitude, e.g., from a starlit, moonlit night having a luminance of about 0.0001 candela per square meter (cd/m2) to a brightly lit summer day of about 600 to 10,000 cd/m2. Equally remarkable is that the HVS may accommodate over five orders of magnitude of luminance at any given instant for the perception of complex visual fields that are routinely experienced. This adaptation occurs relative to diffuse white, i.e., an area in the scene that appears white. The perceptions of lightness and chroma are then relative to this white. The higher the brightness of the perceived white, the lower the brightness and chroma of similarly illuminated objects in the scene will appear to the observer; conversely, the lower its brightness, the brighter and more colorful such objects appear.
This means that changing the stimulus that appears white affects the appearance of all other stimuli in the scene. For a display or projection of an image, these powers of adaptation can be harnessed to expand the gamut of the medium in the perceptual sense. For any image display, and particularly single modulation LED displays such as those employing a digital micromirror device (DMD), the projected image can be made to appear brighter by the addition of light from combining RGB colors for some portion of the image frame time. In so doing, the powers of HVS adaptation are exploited to increase the apparent brightness and lightness contrast of the displayed images. For displays illuminated by red, green, and blue LEDs, although the added light reduces the actual display color gamut provided by the "LED primaries," the R, G, and B primary colors of the LEDs often exceed the current video standards, such as e.g., ITU Radiocommunication Sector (ITU-R) Recommendation BT.709, which is the United States standard for the format of high-definition television and consumer digital media. Thus some colors which are possible to output by the R, G, and B LEDs, or displays with more than three colors and extended color gamut are not available to be encoded in the input color data for display in accordance with such standards. Optimal use of these extended colors requires full three-dimensional color processing and can be further optimized using knowledge of the HVS. Prior attempts to process the current video standards, such as with one-dimensional color processing and color matrices, or without use of HVS models have resulted in unsatisfactory and unrealistic displayed images and high rates of product return by consumers.
Illustrative of some of these attempts, FIGS. 1A - 1D are two-dimensional schematic diagrams of various prior art ways for processing input color data to produce output color data for rendering a color image. FIG. 1A shows a color hue/saturation/contrast/brightness method, depicting the global controls that rotate hue, stretch saturation and contrast, and raise brightness. All colors are changed with these controls, with no way to isolate a given color or color region like flesh tones. Rin/Gin/Win are input HD709 standard colors, and Rout/Gout/Wout are more pure output LED colors. There are four controls, and if each control is provided with 20 settings, for example, there are 80 global choices.
FIG. 1B shows a color matrix method depicting a linear matrix global control that rotates and scales the color axes. All colors are changed globally with no way to isolate local colors like flesh tones. Rin/Gin/Win are input HD709 standard colors, and Rout/Gout/Wout are more pure output LED colors. If a 3 X 3 matrix is used, there are nine global choices.
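By way of illustration only, the following Python sketch applies an arbitrary 3 x 3 matrix to two colors; a saturated green and a flesh-tone-like color pass through exactly the same operator, so neither can be adjusted independently of the other. The matrix values and the two colors are invented for the sketch.

    import numpy as np

    # Arbitrary global color matrix; values are illustrative only.
    M = np.array([[ 1.10, -0.05, -0.05],
                  [-0.05,  1.10, -0.05],
                  [-0.05, -0.05,  1.10]])

    saturated_green = np.array([0.10, 0.90, 0.10])
    flesh_tone      = np.array([0.85, 0.65, 0.55])

    # Both colors are transformed by the same operator; there is no way to
    # boost the green while leaving the flesh tone untouched.
    print(M @ saturated_green)
    print(M @ flesh_tone)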
FIG. 1C shows a color gamma tables method depicting gamma global controls that independently map each input color non-linearly to do things such as increase contrast. It can be seen that, e.g., red changes are the same for all green values. The same relationships occur with other combinations of primary colors. Thus gamma controls are global, with no way to locally isolate colors, such as flesh tones. Rin/Gin/Win are input HD709 standard colors, and Rout/Gout/Wout are more pure output LED colors. With three primary colors having 4096 settings, there are 12288 global choices.
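In the same spirit, the sketch below applies an illustrative per-channel gamma table; because each channel is mapped through its own one-dimensional table, the red channel of a pixel is changed identically regardless of its green and blue values, which is the global behavior described above. The exponent and table size are assumptions made for the sketch.

    import numpy as np

    gamma = 0.8                                            # arbitrary contrast-raising exponent
    table = np.power(np.linspace(0.0, 1.0, 4096), gamma)   # one 1D table applied per channel

    def apply_gamma(rgb):
        idx = np.rint(np.asarray(rgb) * 4095).astype(int)
        return table[idx]

    # The red channel of both pixels maps to the same output even though their
    # green values differ, so a local color region cannot be isolated.
    print(apply_gamma([0.5, 0.1, 0.2]))
    print(apply_gamma([0.5, 0.9, 0.2]))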
FIG. 1D shows a 2D example of an RGBCYMW seven color mapping method.
In this simple example of 7-color tetrahedral processing, the RGB/RGW triangles are independently processed using linear interpolation of input/output control values at each vertex. This is a global control, with no way to isolate local colors or regions like flesh tones. Rin/Gin/Win are input HD709 standard colors, and Rout/Gout/Wout are more pure output LED colors. With 14 In/Out colors, there are 14 global choices.
Digital Cinema Initiatives, LLC (DCI) is a joint venture of major motion picture studios, which was formed in 2002 to create standards for digital cinema systems, including image capture and projection. The digital color standard adopted by the studios for professional movie releases in the DCI format is 12 bits per primary color, nonlinear CIE XYZ Tristimulus values. This is the first time that a digital standard has been established that is encoded in visual color space and therefore independent of any imaging device. For example, using this standard, the same digital file can be displayed to produce the specified color on a television or a printer. The color gamut of this digital color standard is larger than any possible display.
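For orientation, the 12-bit non-linear encoding used for the DCI distribution master is commonly given as CV = 4095 x (L / 52.37)^(1/2.6) per color component; the short Python sketch below applies that form to a peak screen luminance of 48 cd/m2. These constants come from the published DCI specification rather than from this document and should be checked against the current version of that standard.

    # Hedged sketch of the 12-bit non-linear DCI encoding; the constants
    # (4095, 52.37, 2.6) are quoted from the public DCI specification.
    def dci_encode(luminance, normalizer=52.37, exponent=2.6, max_code=4095):
        return round(max_code * (luminance / normalizer) ** (1.0 / exponent))

    print(dci_encode(48.0))   # maps just below the top code value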
FIG. 3 is a diagram of color gamuts, including color gamuts of the DCI and HD709 standards, and color gamuts of various media and/or imaging devices. It can be seen that in diagram 400, the color gamuts 406, 408, 410, and 412 of the various imaging devices are substantially larger than the HD709 standard 404. Accordingly, to take full advantage of the color capabilities of these imaging devices 406 - 412, the color gamut of the HD709 standard must be mapped upwardly, to render the full colors of the larger color gamut, while simultaneously preserving flesh tones and other memory colors, and optimizing the particular device for viewing in a particular environment.
It can also be seen that the large triangular boundary 402 that represents the DCI standard encompasses all of the color gamuts of the media and/or imaging devices, as well as the color gamut of the HD709 standard 404. Accordingly, the digital color standard input color gamut 402 must be contracted or reduced to fit within the color gamut of a physical display such as a television or projector. Truncating or clipping those input digital color values of the DCI standard that lie outside of the color gamut boundary of the display device will cause loss of color saturation and detail and create a visually sub-optimal displayed image. Conventional video processing using one-dimensional color tables and linear matrices will also produce sub-optimal displayed images. Optimal display of these contracted colors requires full three- dimensional color processing and can be further optimized using knowledge of the HVS and the state of visual adaptation in particular viewing environments.
Also, image and video media display products are now being reduced in size. Examples of such products are the new miniature pico-projectors and portable, handheld displays such as iPods® or iPads®. Because of power, heat, and size limitations, these displays generally have reduced color gamuts due to reduced contrast or reduced color saturation. They also are often used in widely differing viewing environments both indoors and outdoors. Improvement of the overall quality of these smaller gamut displays with conventional image and video input is critical to product value. Conventional video processing using one-dimensional color tables and linear matrices will also produce sub-optimal displayed images. Optimal display of these contracted colors requires full three-dimensional color processing and can be further optimized using knowledge of the HVS and the state of visual adaptation in particular viewing environments.
Additionally, the capabilities of HVS adaptation are affected by the viewing environment. In a dark room, higher contrast is needed in a projected or displayed image for an equally perceived viewing experience as compared to a room with normal room lighting or viewing the same image in bright outdoor lighting. Relative to bright outdoor lighting, the HVS adaptation to the dark room and the lower overall image brightness combine to reduce the perceived image contrast. In a brightly lit room, less contrast is needed due to brightness adaptation and more contrast is needed due to viewing flare from room lights illuminating the dark areas of the displayed image.
In image displays, televisions, and/or projectors using high brightness light sources or expanded or reduced color gamuts, there is therefore a need in displaying and/or projecting images to optimize the increase in perceived brightness, contrast, and colorfulness while preserving expected memory colors of the displayed image such as flesh tones. Such an optimization should take into account that not all colors should be adjusted in the same manner and to the same extent. To do so would result in images containing certain details that appear unsatisfactory to a human observer. For example, if a flesh tone of a face in an image is modified in the same manner as a relatively saturated color of another object in the image, the face will be perceived as "pink," "orange," or "burnt" by an observer and thus will be perceived as unsatisfactory. There is therefore a need to achieve this optimization while also preserving certain known colors, such as flesh tones, grey tones, named colors (such as commercial "brand" colors), and other "memory" colors in the image. Prior attempts to process the video inputs with one-dimensional color processing and color matrices for such extended brightness, contrast or color gamut displays, have resulted in unsatisfactory and unrealistic displayed images and high rates of product return by consumers
Current projectors, televisions or displays that attempt to enhance or improve perceived color quality with processing that is in any way different than exact colorimetric color reproduction, do not preserve memory colors in the background. A memory color may be characterized as a localized volume in a color space, as will be described subsequently herein. The algorithms used in current image displays, televisions and projectors cannot uniquely preserve a volume within a three- dimensional color space while changing a different volume within the same three- dimensional color space using one dimensional tables, or matrices, or enhancements which are applied to all colors in the 3D space. For example, in some image projectors, color enhancement is attempted using output color definitions of the seven input colors RGBCMYW (red-green-blue-cyan-magenta-yellow-white). This may allow one to provide a bright white in an image without changing red, for example, but it does not allow one to specify any point or localized volume of a memory color in a 3D color space, which is required to preserve that memory color. As a result, when current image displays, televisions and projectors provide enhanced colors, they do so across the entire color gamut, "enhancing" certain memory colors such as flesh tones such that a typical human observer finds them unsatisfactory and not perceptually optimal. In such image devices, the color enhancement is somewhat arbitrary; it does not preserve memory colors, nor produce a perceived display image that is realistic for a better viewing environment.
More generally, to the best of the applicants' knowledge, no one has implemented the use of three dimensional color tables in 3D color processing to improve image quality for video images, or in 3D color processing for gamut mapping to larger color gamut displays than a particular image standard, or in gamut mapping to smaller color gamut displays than a particular image standard, or in 3D color mapping to displays with secondary color capability and more than three colors that are primary or secondary, using visual models of the human visual system or otherwise. Currently, standard color processing for displays uses one dimensional tables, 3 x 3 matrices or matrix mathematics that allows output definition of a small number of colors like RGBCYMW.
3D color tables have been implemented for color calibration, but in such circumstances, the tables are small (e.g., 7 x 7 x 7). These 3D look-up-tables are used instead of one dimensional tables and 3 x 3 matrices because the small 3D look-up-tables are generally faster, albeit at the expense of some loss of precision. In any case, significant color improvement or enhancements to deliver color "looks," or gamut mapping or mapping to displays with secondary or more than three primary colors with such small tables is not possible.
Another problem in certain types of image rendering devices is that the outputs of the primary color light sources are not stable. This is particularly true for image rendering devices that use organic light emitting diodes (OLEDs) as the sources of the primary colors red, green, and blue. A known problem with OLED displays is that the blue OLED typically has had a considerably shorter lifespan than the red and green OLEDs. One measure of OLED life is the decrease of luminance to half the value of original brightness. The luminance of currently available blue OLEDs decreases to half brightness in a much shorter time than the red or green OLEDs. During the operation of an OLED display, this differential color change between the blue OLED and the red and green OLEDs changes the color balance of the display. This change is much more objectionable to a viewer than a decrease in overall brightness of the display. To the best of the applicants' knowledge, the problem of managing the overall lifespan of OLED displays has not been solved adequately, which has led to significant delays in product introduction in the marketplace. There is therefore a need to provide a solution that manages the overall quality and lifespan of the relative luminances of the red, green and blue OLEDs in a display device.
DISCLOSURE OF THE INVENTION
A color-enhanced image display, television, or projection that maintains certain known colors and optimizes colorfulness and contrast will have the highest visual perceptual quality if and only if the rendering is accomplished wherein the input RGB colors are processed inter-dependently. This requires the use of a three-dimensional color look-up table, also referred to herein as a 3D LUT. The color enhancement may entail increased brightness and/or a larger or smaller color gamut, depending upon the particular image display or projector. In prior art image displays and projectors in which traditional matrices and one dimensional color tables operate independently on the RGB input colors, a brighter display is not possible without affecting hue. For example, blue skies will be shifted towards purple, flesh tones will be altered in unpredictable ways, and many other color artifacts may be present, depending upon the content of the particular displayed/projected image. The use of 3D color look-up tables enables brighter, higher contrast, and more colorful image displays and projections without color artifacts. Using methods of the present invention, this can be accomplished for image displays or projectors which have color gamuts about the same as that of a given color standard, or larger than the standard, or smaller than the standard. The color rendering of such image displays or projectors can be enhanced using three dimensional tables with differing methods in each volume and with visual models.
In one aspect of the invention, a first method of producing a color image is provided comprising providing input image data from an image source such as a camera; generating an at least three-dimensional look-up table of values of input colors and output colors, wherein the values in the lookup table convert the input image color data to output image color data in an image rendering unit; loading the at least three-dimensional look-up table into an image color rendering controller; loading the input image data into the imaging color rendering controller; processing the input image data through the at least three-dimensional look-up table to produce output color values stored at the addresses in the at least three-dimensional look-up table; and outputting the output color values to the image rendering unit to produce an output image that is perceived to have at least one of enhanced brightness, enhanced contrast, and enhanced colorfulness compared to the input image.
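By way of illustration only, a minimal Python sketch of the central processing step is given below, assuming a modest 17-point grid per axis and trilinear interpolation between grid points; the table is filled with an identity mapping purely so that the example runs, whereas in practice its values would be generated from a visual model as described herein.

    import numpy as np

    N = 17                                   # illustrative grid size; real tables may be finer
    grid = np.linspace(0.0, 1.0, N)
    R, G, B = np.meshgrid(grid, grid, grid, indexing="ij")
    lut = np.stack([R, G, B], axis=-1)       # identity table standing in for enhanced colors

    def apply_3d_lut(rgb, lut):
        """Trilinear interpolation of one RGB triple through a 3D look-up table."""
        n = lut.shape[0]
        pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
        i0 = np.floor(pos).astype(int)
        i1 = np.minimum(i0 + 1, n - 1)
        f = pos - i0                         # fractional position inside the enclosing cube
        out = np.zeros(lut.shape[-1])
        for dr in (0, 1):
            for dg in (0, 1):
                for db in (0, 1):
                    w = ((f[0] if dr else 1 - f[0]) *
                         (f[1] if dg else 1 - f[1]) *
                         (f[2] if db else 1 - f[2]))
                    out += w * lut[i1[0] if dr else i0[0],
                                   i1[1] if dg else i0[1],
                                   i1[2] if db else i0[2]]
        return out

    print(apply_3d_lut([0.25, 0.60, 0.80], lut))   # returns the interpolated output color

Because all three input channels jointly select the enclosing cube, the output for any one channel can depend on the values of the other two, which is the inter-dependent processing described above.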
The values in the lookup table may be calculated based upon a visual model of the human visual system and they may include modeling to improve the perceived brightness or contrast or colorfulness for different viewing environments. The at least one of enhanced brightness, enhanced contrast, or enhanced colorfulness introduced by the at least three dimensional look-up-table may produce a chosen artistic perception in the output image. The image rendering unit may have an expanded color gamut greater than the color gamut of the input image data, wherein the output colors to the image rendering unit utilize the expanded color gamut, or the image rendering unit may have a reduced color gamut smaller than the color gamut of the input image data, wherein the output colors to the image rendering unit utilize the smaller color gamut. The input image data may contain memory colors and non-memory colors, and the method may include identifying the memory colors in the input image data to be substantially maintained, characterizing the memory colors and non-memory colors with respect to their chromaticities, and producing an image with substantially maintained memory colors using the image rendering unit. In such circumstances, the perceived colorfulness, brightness, and contrast of the non-memory colors are changed differently than perceived colorfulness, brightness, and contrast of the memory colors. They may be increased more than perceived colorfulness, brightness, and contrast of the memory colors. In one embodiment, the perceived colorfulness, brightness, and contrast of the non-memory colors are increased more than perceived colorfulness, brightness, and contrast of the memory colors. Generating the at least three-dimensional look-up table may include computing enhanced lightness, chroma, and hue for the memory colors using a non-linear enhancement function. The enhancement function may be a sigmoidal function. More than one at least three- dimensional look-up table for the color transformation of the non-memory colors and the memory colors may be generated and used. Each of the at least three dimensional look-up tables may be optimized for a different viewing environment of the image rendering unit. The method may further include providing a sensor for measuring the ambient light in the viewing environment.
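By way of illustration only, the following Python sketch shows one way a sigmoidal enhancement could be blended toward the identity wherever a smooth memory-color probability is high, so that non-memory colors are boosted more than memory colors. The sigmoid parameters, the Gaussian-like memory-color weight, and the hue values are invented for the sketch; they are not the functions or statistics taught by the specification.

    import math

    def sigmoid_boost(chroma, gain=6.0, midpoint=0.5):
        # Illustrative sigmoidal enhancement of a normalized chroma value in [0, 1],
        # rescaled so that 0 maps to 0 and 1 maps to 1.
        raw = 1.0 / (1.0 + math.exp(-gain * (chroma - midpoint)))
        lo = 1.0 / (1.0 + math.exp(gain * midpoint))
        hi = 1.0 / (1.0 + math.exp(-gain * (1.0 - midpoint)))
        return (raw - lo) / (hi - lo)

    def memory_weight(hue_deg, center=30.0, width=20.0):
        # Smooth, assumed probability that a hue belongs to a memory-color region
        # such as flesh tones; the center and width are made-up values.
        return math.exp(-((hue_deg - center) / width) ** 2)

    def enhance_chroma(chroma, hue_deg):
        w = memory_weight(hue_deg)           # near 1 inside the memory-color region
        return w * chroma + (1.0 - w) * sigmoid_boost(chroma)

    print(enhance_chroma(0.60, 30.0))    # flesh-tone hue: left essentially unchanged
    print(enhance_chroma(0.60, 140.0))   # green hue: pushed toward higher chroma

Because the memory-color weight falls off smoothly, there is no discontinuity in the degree of enhancement at the boundary of the memory-color region, consistent with the smooth-transition behavior described above for memory colors.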
The input image data may be of a first color standard, and the method may further include converting the input image data of the first input color standard into an input color specification for inputting into the three-dimensional look-up table. The at least three-dimensional look-up table may have at least three input colors and/or at least three output colors. The at least three output colors may be any combination of primary colors as independent light sources or secondary colors defined as combinations of primary colors. The at least three dimensional look-up table may be losslessly compressed to reduce storage use in memory of the image color rendering controller. The method may further include calibrating the image rendering unit by measuring the color response of the image rendering unit, and then modifying the output image data either by additional processing after the at least three-dimensional look-up-table or by including the required calibration in the at least three-dimensional look-up-table.
The image color rendering controller may be contained within the image rendering unit, or it may be external to the image rendering unit. An auxiliary imaging device controller may be in communication with the image color rendering controller and the image rendering unit. The image rendering unit may be selected from, but not limited to a projector, a television, a computer display, and a game display, and may use DMD, plasma, liquid crystal, liquid crystal-on-silicon modulation, or direct modulation of the light source. The light source may be an LED, OLED, laser, or lamp light sources. Without limitation, the image color rendering controller may be in communication with at least one of a cable TV set-top box, a video game console, a personal computer, a computer graphics card, a DVD player, a Blu-ray player, a broadcast station, an antenna, a satellite, a broadcast receiver and processor, and a digital cinema. The image rendering unit may include an algorithm for color modification, wherein the at least three-dimensional look-up table further comprises processing the input image data to compensate for the color modification performed by the image rendering unit. The image rendering unit may include an algorithm for creating secondary colors from primary colors, and the at least three-dimensional look-up table further comprises compensating for the color modification performed by the addition of the secondary colors in the image rendering unit.
The at least three-dimensional look-up table may further include processing the input image data to increase perceived color, brightness, and contrast to compensate for the reduction in perceived color, brightness, and contrast caused by the algorithm for color modification in the image rendering unit. The at least three-dimensional lookup table may contain a transformation from a suboptimal viewing environment to an improved viewing environment including the visual adaptation of the human visual system. The at least three-dimensional look-up table may include the definition of secondary colors, and may further contain enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of the secondary colors by the image rendering unit. The at least three-dimensional look-up table may further include processing the input image data to include chromatic adaptation of the human visual system to a specified white point that increases the brightness of the image rendering unit.
The instant method may be used in the display or projection of two dimensional (2D) or "three dimensional" (3D) images. The 3D images are typically produced by providing 2D stereo images simultaneously or in rapid sequence taken from two perspectives, so as to provide the observer with the illusion of depth perception. The image rendering unit may be a "3D" unit. By way of illustration, and not limitation, the unit may be e.g., an autostereoscopic display, or it may include a polarizing filter to separate the 2D stereo images being projected and directed to the eyes of an observer using polarization glasses, or it may include a shuttering mechanism to separate the 2D stereo images being projected and directed to the eyes of an observer using time synced shutter glasses. In any case, both sets of 2D images may be processed according to the instant method to deliver 3D images that are perceived by an observer to have enhanced brightness, and/or enhanced contrast, and/or enhanced colorfulness. In another aspect of the invention, an additional method of producing a color image is provided, the method comprising providing input image data of a first color gamut and an image rendering unit of a second, expanded or reduced color gamut; generating an at least three-dimensional look-up table of values of input colors and output colors, wherein the values in the lookup table expand or reduce the input image data to encompass the second color gamut of the image rendering unit; loading the at least three-dimensional look-up table into an image color rendering controller; loading the input image data into the imaging color rendering controller; processing the input image data through the at least three-dimensional look-up table using the input image data as addresses into the at least three-dimensional look-up table to produce output image data from the output color values stored at the addresses in the at least three- dimensional look-up table; and outputting the output image data to the image rendering unit to produce an output image that is perceived to have at least one of enhanced brightness, enhanced contrast, and enhanced colorfulness compared to the input image. This method may also include the various aspects and/or steps described above for the first method.
In another aspect of the invention, the models may include visual models of HVS perceptual adaptation to produce a projected or displayed image that appears as it would in a more optimal, well lit viewing environment. The image processing may include correcting for low level lighting of the surrounding environment and/or indoor or outdoor ambient light added to the displayed image. More specifically, a method of producing a color image by an image rendering unit in a sub-optimal viewing environment is provided, the method comprising generating an at least three-dimensional look-up table of values of input colors and output colors, the table containing a transformation from a suboptimal viewing environment to an improved viewing environment; loading the at least three-dimensional look-up table into an image color rendering controller; loading the input image data into the image color rendering controller; processing the input image data through the at least three-dimensional look-up table using the input image data as addresses into the at least three-dimensional look-up table to produce output image data from the output color values stored at the addresses in the at least three-dimensional look-up table; and outputting the output image data to the image rendering unit. This method may further include the various aspects and/or steps described above for the first method. The improved viewing environment may be such that an observer may perceive the color image to have more color, contrast, or brightness.
In yet another aspect of the invention, a method of producing a color image by an image rendering unit is provided, the method comprising generating an at least three-dimensional look-up table of values of input colors and output colors, the three-dimensional look-up table containing the definition of secondary colors or more than three primary colors; loading the at least three-dimensional look-up table into an image color rendering controller; loading the input image data into the image color rendering controller; processing the input image data through the at least three-dimensional look-up table using the input image data as addresses into the at least three-dimensional look-up table to produce output image data from the output color values stored at the addresses in the at least three-dimensional look-up table; and outputting the output image data to the image rendering unit to produce an output image that is perceived to have at least one of enhanced brightness, enhanced contrast, and enhanced colorfulness compared to the input image. This method may also include the various aspects and/or steps described above for the first method.
The secondary colors or more than three primary colors may be explicitly defined, or the secondary colors or more than three primary colors may be implied in the design of a three-in by three-out look-up table for two conditions. In either instance, measured responses of the image rendering unit may be used to define the three-dimensional look-up table, or mathematics provided by a manufacturer of the image rendering unit may be used to define the three-dimensional look-up table. Alternatively, an open definition of how the secondary colors or more than three primary colors are used may be provided. This method may also include the various aspects and/or steps described above for the first method.
In another aspect of the invention, the problem of displaying or projecting an image that is optimal in human visual perceptual terms regardless of the ambient light and background environment of the image is solved by using visual models to enhance the perceived colorfulness, contrast, or brightness of the image, thereby improving the perceived quality of the image. The visual models of human visual perception may be used to create look-up tables of at least three dimensions to process the image to be displayed. Memory colors of the image may be preserved. The method may further include performing empirical visual studies to determine the dependence of the preference of colorfulness, contrast, or brightness on the ethnicities of the human observers, and defining the perceived quality of the image for each nationality of human observers. The method may further include adjusting the colorfulness, contrast, or brightness of the image based upon one of the ethnicities of the human observers. The method may further include generating an at least three-dimensional look-up table of values of input colors and output colors, the three-dimensional look-up table adjusting the colorfulness, contrast, or brightness of the image to match the enhanced appearance of analog film systems or digital systems designed for cinemas. The method may further include adjusting the colorfulness, contrast, or brightness of the image to produce a chosen artistic perception in the image.
In another aspect of the invention, a method of producing a color image by an OLED display is provided that manages the overall quality and lifespan of the relative luminances of the red, green and blue OLEDs in the display. The method comprises providing input image data and providing the OLED display having at least three OLEDs, each OLED being of a different primary color; generating an at least three-dimensional look-up table of values of input colors and output colors, wherein the values in the lookup table convert the input image data to output image color data of the OLED display in a manner that optimally manages the quality of the image and the lifetime of the at least three OLEDs; loading the at least three-dimensional look-up table into an image color rendering controller; loading the input image data into the image color rendering controller; processing the input image data through the at least three-dimensional look-up table to produce output color values stored at the addresses in the at least three-dimensional look-up table; and outputting the output image data to produce the image by the OLED display. The values in the look-up table may be calculated based upon a visual model of the human visual system. This method may further include the various aspects and/or steps described above for the first method.
The at least three OLEDs may be a red OLED, a green OLED, and a blue OLED. In such an instance, managing the quality of the image and the lifetime of the OLEDs may further include adding a white primary and mapping predetermined amounts of the grey component of RGB pixel values to the white primary to reduce the usage of RGB and extend the life of the red, green, and blue OLEDs. Alternatively, managing the quality of the image and the lifetime of the OLEDs may comprise adding other primary colors and mapping predetermined amounts of the RGB pixel values to the other primary colors to reduce the usage of RGB and extend the life of the red, green, and blue OLEDs. The method may further comprise operating the at least three OLEDs such that a first OLED does not reach end of life sooner than the other OLEDs, and the image quality of each of the OLEDs is reduced about equally over time without perceived artifacts or appearances predominantly of one of the OLED colors.
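As a minimal sketch of the grey-to-white mapping described above, and under the assumption that the "grey component" is the common minimum of R, G and B, the following Python fragment routes a chosen fraction of that component to a white primary so the red, green and blue OLEDs are driven less. The parameter "white_fraction" is illustrative only.

import numpy as np

def rgb_to_rgbw(rgb: np.ndarray, white_fraction: float = 0.5) -> np.ndarray:
    """rgb: float values in [0, 1], shape (..., 3). Returns RGBW, shape (..., 4)."""
    grey = rgb.min(axis=-1, keepdims=True)        # grey (achromatic) component
    w = white_fraction * grey                     # portion routed to the white primary
    rgbw = np.concatenate([rgb - w, w], axis=-1)  # subtract it from R, G, B; append W
    return np.clip(rgbw, 0.0, 1.0)

pixels = np.array([[0.8, 0.7, 0.6], [0.2, 0.9, 0.4]])
print(rgb_to_rgbw(pixels))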
The method may be further comprised of having a controlled degradation of image quality due to changes in the outputs of at least one of the OLEDs, wherein the change of quality at any given point in time has the least loss in perceived quality. The controlled degradation may be tracked by accumulating and using usage data for all of the OLEDs. The controlled degradation may be performed on the entire image over time, or on at least one portion of the image over time. The controlled degradation may be performed by substantially maintaining the brightness of the image while gradually reducing color saturation of the image over time, or by reducing color saturation of the image to a greater extent in image pixels of low color saturation than in image pixels of high color saturation, or by substantially maintaining the brightness of the image while reducing color saturation gradually using adaptive one dimensional tables on each of the primary colors.
The one dimensional tables on each primary color may be calculated using a quality degradation model. The quality degradation model may average among one dimensional tables that are pre-designed to provide the targeted image quality at specific OLED lifetimes. The one dimensional tables may be produced by interpolation between a one dimensional table for when the OLEDs are initially operated and a one dimensional table for when the OLEDs are at the ends of their useful lifetimes.
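The interpolation described above can be illustrated with the short Python sketch below: an adaptive one-dimensional table for a primary is obtained by blending a table designed for new OLEDs with one designed for end-of-life OLEDs, indexed by an accumulated-usage fraction. The specific curves and the "usage_fraction" model are placeholders, not measured data or the patented tables.

import numpy as np

levels = np.linspace(0.0, 1.0, 256)

# Hypothetical per-primary tone curves: unity at start of life, boosted drive
# (to offset lost luminance) near the end of the useful lifetime.
table_new = levels
table_end_of_life = np.clip(levels ** 0.85, 0.0, 1.0)

def adaptive_table(usage_fraction: float) -> np.ndarray:
    """usage_fraction: 0.0 = new OLED, 1.0 = end of useful lifetime."""
    t = np.clip(usage_fraction, 0.0, 1.0)
    return (1.0 - t) * table_new + t * table_end_of_life

# Apply the table for a display at 40% of its useful life to one channel.
blue_channel = np.random.rand(8, 8)
codes = np.rint(blue_channel * 255).astype(int)
corrected_blue = adaptive_table(0.4)[codes]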
In another aspect of the invention, in an image display, television, or projector, the problem of achieving an expanded or maximum color gamut by temporally combining R, G, and B during an image frame duty cycle to increase brightness while maintaining saturated pure R, G, and B colors is solved by calculating the combinations of R, G, and B that maintain a physical or perceived input color in a given viewing environment thereby maintaining physical or perceived color saturation and achieving increased brightness. The calculated combinations are implemented in a 3D look-up table.
In any of the above aspects of the invention, the color image to be produced may contain "memory colors" as defined herein, and non-memory colors. In general, the memory colors of the image that is produced are preserved. The methods may include identifying the memory colors in the input image data to be substantially maintained, characterizing the memory colors and non-memory colors with respect to their chromaticities in the image rendering unit, and producing an image comprising human visual system perceptually accurate memory colors using the image rendering unit. The perceived colorfulness, brightness and contrast of the non-memory colors are increased more than the perceived brightness and contrast of the memory colors. In one embodiment, generating the at least three-dimensional look-up table may include computing enhanced lightness, chroma, and hue for the memory colors using a sigmoidal enhancement function. More than one at least three-dimensional look-up table may be generated for the color transformation of the non-memory colors and the memory colors. Some or all of the at least three dimensional look-up tables may be optimized for a different viewing environment of the image rendering unit. In such an instance, the method may further include selecting one of the at least three-dimensional look-up tables for loading into the image color rendering controller based upon the viewing environment of the image rendering unit. A sensor may be provided for measuring the ambient light in the viewing environment.
In a related aspect of the invention, the problem of displaying an image that simultaneously has high brightness and high colorfulness of a majority of colors (and particularly high saturation colors), while maintaining realistic "memory colors" is solved by adding white light or any combination of multiple R, G, B colors by combining R, G, and B for some portion of the duty cycle of the image projection time, according to a 3D look-up table, which replaces the lost colorfulness of adding color combinations and at the same time preserves flesh tones and other known memory colors. The image data is processed with a 3D look-up table in a manner that increases the perceived colorfulness, brightness, and contrast while preserving flesh tones and other known memory colors. The 3D look-up table is created to produce the improved image quality. Visual models may be used to perform the image processing.
In any of the above aspects of the invention, the methods may further comprise converting the input image data of a first input color standard into an input color specification for inputting into the three-dimensional look-up table.
The solutions to the above problems may entail multi-dimensional look-up tables, with three dimensional look-up tables being one example. The at least three dimensional lookup table may have three or more input colors and three or more output colors. The output dimension may be different from the input dimension, such as having RGBCYMW (red-green-blue-cyan-yellow-magenta-white) output values in an RGB table, i.e., three values of input and seven values of output. The number of outputs may also be greater than three due to the display having more than three physical colors, i.e., more than three primary colors such as R, G, and B. In such an instance, the output colors could therefore be the primary colors or combinations of the four or more colors. In general, the three or more than three output colors are any combination of primary colors as independent light sources or secondary colors defined as combinations of primary colors. The at least three dimensional look-up table(s) may be losslessly compressed to reduce storage use in a memory of the image color rendering controller.
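The mismatch between input and output dimensions can be sketched as below: a table addressed by three input colors that stores seven output values (RGBCYMW) at each node. How the seven outputs are derived here, from pairwise minima of the primaries, is an assumption chosen only to make the example self-contained; it is not the disclosed mapping.

import numpy as np

n = 9                                       # coarse grid for illustration
grid = np.linspace(0.0, 1.0, n)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")

c = np.minimum(g, b)                        # illustrative secondary colors as
y = np.minimum(r, g)                        # combinations of pairs of primaries
m = np.minimum(r, b)
w = np.minimum(np.minimum(r, g), b)         # white as the common component

# Shape (n, n, n, 7): three RGB address dimensions in, seven RGBCYMW drive values out.
lut_rgb_to_rgbcymw = np.stack([r, g, b, c, y, m, w], axis=-1)

idx = np.rint(np.array([0.5, 0.25, 1.0]) * (n - 1)).astype(int)
print(lut_rgb_to_rgbcymw[idx[0], idx[1], idx[2]])   # seven output values for one input color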
More specifically, according to the present disclosure, a method of displaying an image containing memory colors and saturated colors is provided comprising identifying the memory colors in input image data to be substantially maintained, characterizing the memory colors with respect to their chromaticities, and generating a three-dimensional look-up table for a color transformation of saturated and memory colors. The three-dimensional look-up table is loaded into an imaging device controller, and input image data is loaded into the imaging device controller. The input image data is processed with an algorithm using the three-dimensional look-up table to produce output image data. The output image data is output to an image rendering device, and a high brightness, high contrast image comprising human visual system perceptually accurate memory colors is displayed or projected.
In one embodiment, the method includes preprocessing, wherein one dimensional tables and matrices are provided for converting the variety of possible input color standards into a preferred color input to the 3D or higher dimensional color look-up-table. This is done for the purpose of making a single or reduced number of 3D or higher dimensional color look-up-tables adaptable to different video standards. In another embodiment, the algorithm containing the 3D or higher dimensional mathematics is executed in real time by the central processing unit of a computer in the image display or projection device so that the need for a 3D color table is obviated. This may be done if the device computer is provided with adequate computational processing capability and memory.
In another embodiment, the method includes incorporating the variety of possible input color standards directly into the creation of the 3D or higher dimensional color look-up-tables to adapt to different video standards. In some circumstances, the image rendering unit (such as, e.g., a display or projection device) is provided with some color modification capability that is "built in." For example, the device may be provided with an algorithm to add white or secondary colors, resulting in a loss of colorfulness, and a distortion in the appearance of memory colors. In such circumstances, the output values in the at least three-dimensional lookup table are determined such that the input image data is processed to compensate for the color modification performed by the image rendering unit. The method may thus include providing at least 3D color tables to adjust the color data in a manner that shifts it in a direction within the color space that compensates for the built-in color modification that is performed by the image rendering unit. The at least three-dimensional look-up table further comprises processing the input image data to increase perceived color, brightness, and contrast to compensate for the reduction in perceived color, brightness, and contrast caused by the algorithm for color modification in the image rendering unit. In a more specific instance in which the image rendering unit includes an algorithm for creating secondary colors from primary colors, the at least three-dimensional look-up table may further comprise compensating for the color modification performed by the addition of the secondary colors in the image rendering unit. The values in the at least three dimensional lookup table may also be determined such that the at least three-dimensional look-up table further comprises processing the input image data to include chromatic adaptation of the human visual system to a specified white point that increases the brightness of the image rendering unit. The at least three-dimensional look-up table may also adjust the colorfulness, contrast, or brightness of the image to be produced to match the enhanced appearance of analog film systems or digital systems designed for cinemas.
According to the present disclosure, there is further provided a device for producing a color image. The device is comprised of a computer including a central processing unit and a memory in communication through a system bus. The memory may be a random access memory, or a computer readable storage medium. The memory contains an at least three dimensional lookup table.
In one aspect of the invention, the at least three dimensional lookup table contains values of input colors and output colors, wherein the values in the lookup table convert an input image color data set to output image color data in an image rendering unit that is connectable to the device. In another aspect of the invention, the at least three dimensional lookup table may be produced by an algorithm for transforming input image data comprising memory colors and non-memory colors to a visual color space, and computing enhanced lightness, chroma, and hue for the memory colors and non-memory colors in the visual color space. The algorithm to produce the three dimensional lookup table may be contained in the memory.
In another aspect of the invention, the at least three dimensional lookup table includes values of input colors and output colors, wherein the values in the lookup table convert a first color gamut of an input image data set to encompass a second expanded or reduced color gamut of an image rendering unit that is connectable to the device.
In another aspect of the invention, the at least three dimensional lookup table contains a transformation from a suboptimal viewing environment to an improved viewing environment including the visual and chromatic adaptation of the human visual system.
In another aspect of the invention, the at least three dimensional lookup table contains the definition of secondary colors, and enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of secondary colors by an image rendering unit that is connectable to the device.
In another aspect of the invention wherein the image is perceived by a human observer, the memory may contain a visual model to enhance the perceived colorfulness, contrast, or brightness of the image.
In any of the above aspects of the invention, the device may further include the image rendering unit in communication with the computer. The image rendering unit may be selected from a projector, a television, a computer display, and a game display, and may use DMD, plasma, liquid crystal, liquid crystal-on-silicon modulation (LCOS), or direct modulation of the light source and LED, organic light emitting diode (OLED), laser, or lamp light sources. The device may further comprise an auxiliary imaging device including at least one of a cable TV set-top box, a video game console, a personal computer, a computer graphics card, a DVD player, a Blu-ray player, a broadcast station, an antenna, a satellite, a broadcast receiver and processor, and a digital cinema. One of a liquid crystal display, a plasma display, and a DMD projector may be in communication with the auxiliary device. The device may further comprise a communication link to a source of input image data.
The at least three-dimensional look-up table includes the definition of secondary colors, and contains enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of the secondary colors by the image rendering unit. Alternatively or additionally, the at least three-dimensional lookup table may contain a transformation from a suboptimal viewing environment to an improved viewing environment including the visual and chromatic adaptation of the human visual system.
The memory of the device may contain a set of at least three dimensional lookup tables; each table of the set may be optimized for a different viewing environment of the image rendering unit. The device may be provided with a sensor for measuring the ambient light in the viewing environment.
BRIEF DESCRIPTION OF THE DRAWINGS
The present disclosure will be provided with reference to the following drawings, in which like numerals refer to like elements, and in which:
FIGS. 1A - 1D are illustrative, two-dimensional schematic diagrams of various prior art ways for processing input color data to produce output color data for rendering a color image;
FIG. 2 is a schematic diagram of aspects of the instant method for processing input color data to produce output color data for rendering a color image;
FIG. 3 is a chromaticity diagram that depicts color gamuts of the DCI and HD709 standards, and color gamuts of various media and/or imaging devices;
FIG. 4 is a perspective view of a three-dimensional color space depicting a series of color gamuts of an image display, projector, or television in which the gamuts have been sequentially reduced by the addition of white to the R, G, and B primary colors thereof;
FIG. 5 is a schematic diagram of a device for producing a color image;
FIG. 6 is a flowchart depicting the steps of one algorithm for generating a three-dimensional lookup table for the purposes of this invention;
FIG. 7 is a flowchart depicting one method for producing a color image in accordance with the present disclosure; and
FIG. 8 is a schematic diagram of one mathematical flowchart for producing a color image in accordance with the present invention, which includes color output calibration.
The present invention will be described in connection with a preferred embodiment, however, it will be understood that there is no intent to limit the invention to the embodiment described. On the contrary, the intent is to cover all alternatives, modifications, and equivalents as may be included within the spirit and scope of the invention as defined by the appended claims.
BEST MODE FOR CARRYING OUT THE INVENTION
For a general understanding of the present invention, reference is made to the drawings. In the drawings, like reference numerals have been used throughout to designate identical elements. In describing the present invention, a variety of terms are used in the description. Standard terminology is widely used in image processing, display, and projection arts. For example, one may refer to the International Lighting Vocabulary, Commission Internationale de l'éclairage (CIE), 1987 for definitions of standard terms in the fields of color science and imaging. One may also refer to Billmeyer and Saltzman's PRINCIPLES OF COLOR TECHNOLOGY, 3rd Ed., Roy S. Berns, John Wiley & Sons, Inc., 2000; and Color Appearance Models, Mark D. Fairchild, Wiley-IS&T, Chichester, UK (2005).
In order to fully describe the invention, as used in the present disclosure, certain terms are defined as follows:
Brightness - attribute of a visual perception according to which an area appears to emit, or reflect, more or less light.
BT.709 - abbreviated reference to ITU Radiocommunication Sector (ITU-R) Recommendation BT.709, a standard for the format of high-definition television.
Chromaticity - normalized CIE Tristimulus values often used to visualize the color gamuts of devices in a Chromaticity diagram, such as that shown in FIG. 3.
CIECAM02 - the most recent color model adopted by the International Commission on Illumination, or Commission Internationale de l'éclairage (CIE), published in 2002.
Color - A specification of a color stimulus in terms of operationally defined values, such as three tristimulus values.
Color Space - A three-dimensional space in which each point therein corresponds to a color.
Colorfulness - Attribute of a visual perception according to which the perceived color of an area appears to be more or less chromatic.
Contrast - In the perceptual sense, assessment of the difference in appearance of two or more parts of a field seen simultaneously or successively.
DCI Standard - a color standard for digital cinema systems created by Digital Cinema Initiatives, LLC, a joint venture of major motion picture studios formed in 2002. The standard is included in the publication, "Digital Cinema System Specification," Version 1.2, approved by Digital Cinema Initiatives, LLC on March 7, 2008.
Display - An imaging device which forms an image from discrete lighted elements at a surface thereof.
Color Gamut - The range of colors producible with a set of inks, lights, or other colorants. A color gamut may be described in terms of a particular region of a color space.
Hue - Attribute of a visual perception according to which an area appears to be similar to one of the colors, red, yellow, green, and blue, or to a combination of adjacent pairs of these colors considered in a closed ring.
Memory color - a color of an object in an image for which an observer may consciously or unconsciously observe and make a judgment as to whether the color of the object is accurate, based upon the observer's memory of previous experiences observing the object. Examples of memory colors are flesh (human skin) tones, the green of grass, the blue of the sky, the yellow of a banana, the red of an apple, and grey scale. The accurate rendering of colors associated with commercial products and registered trademarks, such as "Kodak yellow", "IBM blue," and "John Deere green" may be important to some viewers/users of images, and are also examples of memory colors. It is further noted that the perceived appearance of memory colors may be influenced by the context in which they are seen by an observer.
Primary colors - The colors of the individual light sources, including all color filters, that are used to create a color image in an image rendering unit.
Projector - An imaging device which forms an image by delivering and in some instances focusing light on a distant, separate surface such as a wall or screen.
RGBCYMW - in the use of any of these capital letters in combination herein, they stand for red, green, blue, cyan, yellow, magenta, and white, respectively.
Rendering an image - providing an image for observation, either via an image display that forms an image from discrete lighted elements at a surface thereof, or via an image projector that forms an image by delivering and in some instances focusing light on a distant, separate surface such as a wall or screen.
Saturation - Colorfulness of an area judged in proportion to its brightness.
Secondary colors - Linear or non-linear combinations of the primary colors of an image rendering unit that can be controlled independently from the primary colors.
Tristimulus values - Amounts of the three reference color stimuli, in a given trichromatic system, required to match the color of a stimulus being considered.
White - a set of three values of primary colors, typically red, green, and blue, that may be added to a color in a portion of an image, thereby in effect adding white to the color to brighten the color.
It is further noted that as used herein, a reference to a three dimensional lookup table or a 3DLUT is meant to indicate a table of at least three dimensions, unless otherwise indicated. A lookup table may be multidimensional, i.e., it may have three or more input colors and three or more output colors.
FIG. 2 is an illustrative, two-dimensional schematic diagram depicting the full multi-dimensional capability of an at least three dimensional color table 54 used in processing input color data to produce output color data for rendering a color image. For the sake of simplicity of illustration, the diagram 420 of FIG. 2 depicts only a 2D rendition of an at least 3D color table 54 of the present invention. Any point, and/or any region in the full color space can be changed independently. The small squares 422 represent locations in the color space in which no change in color is made. These locations may be memory color locations, such as flesh tones.
In other regions 424, selective increases in contrast, colorfulness, and brightness may be made. The larger squares 426 in these regions 424 represent locations where colorfulness, contrast, and brightness are increased. Any local color or color region, such as a flesh tone region, can be chosen for unique color processing. In one embodiment, a 3D color table may contain output values for every input RGB color, which for 12 bits per color would be 4096 x 4096 x 4096 independent colors, thereby providing 68.7 billion local color choices. In another embodiment, a 3D color table size can be reduced by using the most-significant bits of the input colors to define the 3D color table locations and performing multi-linear or other multidimensional interpolation using the least-significant bits of the input colors.
It is to be understood that while the squares 422 and 426 are meant to indicate various color regions, the borders of the squares are not meant to indicate sharply defined boundaries of such regions. As described previously, these regions may be modeled using a probability distribution that provides a smooth transition from regions in the color space that are outside of the regions defined by the squares to the regions within them.
For example, the various regions may be defined by Gaussian boundaries that are smoothly connected by probability functions. In defining the color output values in the at least 3D LUT 54, volume derivatives may be used that displace the color (R,G,B) vectors in different amounts. Within memory color regions, the color vectors have a lesser displacement, or possibly none at all, while other color regions have larger displacements to increase their contrast, colorfulness, and brightness.
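A minimal Python sketch of this idea follows: color vectors near a memory-color region are displaced little or not at all, while colors far from it receive the full enhancement, with a Gaussian falloff so there is no hard boundary. The flesh-tone centre, width, and gain values are invented here purely for illustration and are not values from the disclosure.

import numpy as np

memory_center = np.array([0.75, 0.55, 0.45])   # hypothetical flesh-tone centre (linear RGB)
sigma = 0.12                                   # width of the protected region
max_gain = 1.25                                # full enhancement far from the region

def enhance_with_memory_protection(rgb: np.ndarray) -> np.ndarray:
    """rgb: float values in [0, 1], shape (..., 3)."""
    d2 = np.sum((rgb - memory_center) ** 2, axis=-1, keepdims=True)
    protection = np.exp(-d2 / (2.0 * sigma ** 2))        # ~1 inside the region, -> 0 outside
    gain = 1.0 + (max_gain - 1.0) * (1.0 - protection)   # little or no displacement for memory colors
    grey = rgb.mean(axis=-1, keepdims=True)
    return np.clip(grey + gain * (rgb - grey), 0.0, 1.0)  # scale the chromatic part about grey

print(enhance_with_memory_protection(np.array([0.75, 0.55, 0.45])))  # nearly unchanged
print(enhance_with_memory_protection(np.array([0.10, 0.80, 0.20])))  # displaced to be more colorful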
The full table may be very large. For example, a large table results if the input color is 24-bit (i.e. 8 bits each for R, G, and B), and the output includes white and is 32 bit (i.e. 8 bits each for R, G, B, and W). Referring to FIG. 5, this large 3D LUT 54 may be used if the memory 36 of the image color rendering controller is sufficiently large, and results in the fastest color processing. However, if the memory 36 is limited in size, but sufficient computational capacity is available in the CPU 34, multi-dimensional interpolation may be used to reduce the size of the 3D LUT 54. In this particular example, for each respective primary input color, bits 3 through 8 may be used to define and address the 3D LUT 54. Multi-dimensional interpolation may then be used with bits 1 and 2 to define the output colors that occur between the output colors associated with the 8 vertices of the cube in the 3D LUT 54 defined by bits 3 through 8.
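The size reduction described in this paragraph can be sketched as follows in Python: the most-significant bits of each 8-bit channel address a coarse table, and the least-significant bits drive a trilinear interpolation among the 8 surrounding vertices, in line with the 6-MSB / 2-LSB split given above. The identity table is only a stand-in for a real enhancement table.

import numpy as np

MSB_BITS, LSB_BITS = 6, 2
N = (1 << MSB_BITS) + 1                     # 65 nodes per axis so the top code still interpolates

grid = np.arange(N) * (1 << LSB_BITS)       # node positions in 8-bit code values: 0, 4, ..., 256
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
lut = np.stack([r, g, b], axis=-1).astype(float)   # identity LUT in 8-bit units

def lookup_trilinear(rgb8: np.ndarray) -> np.ndarray:
    """rgb8: integer 8-bit code values, shape (..., 3)."""
    hi = rgb8 >> LSB_BITS                                      # MSBs select the lower cube vertex
    frac = (rgb8 & ((1 << LSB_BITS) - 1)) / (1 << LSB_BITS)    # LSBs give the fractional position
    out = 0.0
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = (np.where(dr, frac[..., 0], 1 - frac[..., 0])
                     * np.where(dg, frac[..., 1], 1 - frac[..., 1])
                     * np.where(db, frac[..., 2], 1 - frac[..., 2]))
                out = out + w[..., None] * lut[hi[..., 0] + dr, hi[..., 1] + dg, hi[..., 2] + db]
    return out

codes = np.array([[13, 200, 77]], dtype=np.int64)
print(lookup_trilinear(codes))     # ~[13, 200, 77] for the identity table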
The color gamut of an image rendering unit, such as a display, television, and/or projector is defined by the maximum colors that can be produced by that image rendering unit with combinations of its primary colors. FIG. 3 shows the color gamuts of various image rendering technologies compared to the CCIR709 color standard 404 and the DCI color standard 402. FIG. 3 shows that displays such as LED projectors (gamut 406), OLED displays (gamut 408), Digital Cinema projectors (gamut 410) and televisions with more than 3 primary colors (gamut 412) have larger color gamuts than the CCIR709 color standard (gamut 404) for digital media distribution, thus illustrating the need to map the smaller CCIR709 color standard to the larger color gamut of these display types. All other international color standards for consumer digital color media are similar to CCIR709 and therefore exhibit the same need to map these standards to the larger color gamut of the display types in FIG. 3. In the methods of the present invention, this is done while simultaneously preserving memory colors, and optimizing the particular device for viewing in a particular environment, and taking into account adaptation of the human visual system. FIG. 3 also shows that the DCI "Hollywood" color standard is significantly larger than the color gamut 414 of an infinite set of lasers, and therefore larger than any possible display or image rendering unit, thus illustrating the need to map the larger input to the smaller color gamut of any display type including a professional digital cinema projector.
In a color image rendering unit, such as a display, television, and/or projector, in order to achieve maximum brightness with a single optical system and single image modulator, the multiple RGB channels may be combined for some portion of time during image frames. Adding these multiple RGB channels during an image frame duty cycle will increase the brightness of the image, but will also reduce the colorfulness by desaturating the pure RGB colors. FIG. 4 is a perspective view of a three-dimensional CIECAM02J L*a*b* opponent color space 10 depicting a series of color gamuts of an image display, projector, or television in which the gamuts have been sequentially reduced by the addition of white to the R, G, and B primary colors thereof. The outer (coarsest squares) color gamut 12 is the color gamut of one exemplary image projector having its primary colors produced by red, green, and blue LEDs. The wire frame color gamut 11 represents the CCIR709 video color standard. The solids drawn with successively finer squares, 14, 16, 18, and 20, represent the color gamuts resulting from the addition of 6.25%, 12.5%, 25%, and 50% white, respectively. For the sake of simplicity of illustration, 2D projections of the color gamuts 11 - 20 are provided on the a*b* plane as respective closed curves 11A - 20A. The color gamut 12/12A of the LED primaries has no added white. It can be seen in general from the 3D perspective renditions and the 2D projections that the addition of white always reduces the color gamut of the image device.
However, this does not mean that the addition of white to the images of the device cannot be beneficial. It can also be seen that the addition of white at a 6.25% level, as indicated by solid 14 and closed curve 14A, results in a color gamut that is approximately equal to the CCIR709 color video standard, while at the same time making the image perceived to be brighter. In an image rendering unit, and particularly in single modulation LED displays such as those employing a digital micromirror device (DMD), the image is made to appear brighter by the addition of white from combining RGB colors. In digital cinema, this may be done for some portion of the image frame time. The capabilities of human visual system adaptation are thereby exploited to increase the apparent brightness and lightness contrast of the displayed images. In one aspect of the present invention, visual models of visual perception by the human visual system are used in determining the optimum amount of white to add to the colors of the image. The perceived colorfulness, contrast, and/or brightness of the image are enhanced, thereby improving the perceived quality of the image. The visual models of human visual perception may be used to create look-up tables of at least three dimensions to process the image to be displayed. The methods of the present invention may include performing empirical visual studies to determine the dependence of preference of colorfulness, contrast, or brightness on the ethnicities of the human observers, and defining the perceived quality of the image for each nationality of human observers. The colorfulness, contrast, or brightness of the image may be adjusted based upon the preferences of one of the ethnicities of the human observers.
FIG. 5 is a schematic diagram of a device for producing a color image, which may be observed by a human observer. The imaging device may include an image rendering unit such as, e.g., a television, a display, a projector, or another unit. Referring to FIG. 5, the imaging device 30 may include an image color rendering controller 32 or computer 32 or other processor comprising a central processing unit 34 and a memory 36. As an alternative memory, or in addition to the memory 36, the controller 32 may include a computer readable storage medium 38 such as a hard disk. These components are in communication through a system bus 39. The device 30 may be further comprised of an image rendering unit 40, which may be an image display or projector, such as a liquid crystal display 42; a plasma display 44; a digital mirror device (DMD) 46 including a DMD 80, a lamp 82, and a color wheel 84; or a digital mirror device 48 including a DMD 80, and red, green, and blue LEDs, OLEDs, or lasers 86, 87, and 88.
The imaging device 30 may process input image data that is stored on the storage medium 38, or the imaging device 30 may receive input image data from an external device or source 50. The external source 50 may comprise an Internet connection or other network or telecommunications connection, such that the input image data is transmitted through such connection.
The imaging device 30 may be adapted to a system for displaying or projecting an image in a variety of ways, depending upon the particular application. In some embodiments, the imaging device 30 may be provided as an integrated system comprising the controller 32 and the image rendering unit (display or projector) 40, which only needs to be connected to a source 50 of image input data. In another embodiment, the imaging device 30 may be separate from the image rendering unit 40, and in communication with the image rendering unit 40 through a network or telecommunications connection as described above. The imaging device 30 may be provided comprising the image color rendering controller 32, a first port (not shown) for connection to a source 50 of image input data, and a second port (not shown) for connection to the image rendering unit 40. This configuration is particularly useful for retrofitting to projection or flat screen televisions that receive signals via a cable that is connected to a broadcast source of image input data (e.g., "cable TV programming"). In such circumstances, the cable carrying input image data 50 could be disconnected from the image rendering unit 40, and the imaging device 30 could be placed in line between them to perform the image processing of the present invention.
In other embodiments, the imaging device 30 may be in communication with, or integrated into an auxiliary device 60 or auxiliary imaging device controller 60, which is in communication with the image rendering unit 40. The imaging device controller 60 may be, without limitation, an audio/video processor, a cable TV set-top box, a video game console, a personal computer (PC), a computer graphics card of a PC, or a DVD or Blu-ray player. In another embodiment, the imaging device 30 may be integrated into the electronics and processing components of a broadcast station, a broadcast antenna, receiver or processor, or a digital cinema theatre. In another embodiment, the device 30 may be integrated into the hardware and software of media creation, preparation, and production equipment, such as equipment used in the production of DVDs of movies and television programs, or the production of digital cinema for distribution to theaters. Broadcast stations, digital cinema theaters, and media production equipment may all be comprised of an auxiliary imaging device controller 60.
The memory 36 of the device 30 may contain a set of at least three dimensional lookup tables 54; each table of the set may be optimized for a different viewing environment of the image rendering unit 40. The device 30 may be provided with a sensor 70 for measuring the ambient light in the viewing environment of the image rendering unit 40, or in the case of a projector 46 or 48, in the viewing environment of the projected image. The memory 36 may contain a visual model of the perception of the human visual system that may be used to enhance the perceived colorfulness, contrast, or brightness of the produced image.
FIG. 6 is a flowchart depicting an algorithm for generating a three-dimensional lookup table to improve the perceived colorfulness, contrast, or brightness in non-memory colors, while preserving to a higher degree the color accuracy of memory colors. The algorithm 100 of FIG. 6 may be used to perform step 210 of the method 200 of FIG. 7. Additionally, the algorithm 100 is applicable to other image rendering devices that use DMD, plasma, liquid crystal, liquid crystal-on-silicon modulation, or direct modulation of the light source, and that use LED, OLED, laser, or lamp light sources.
Referring to FIG. 6, in operation 110, the RGB input values of the input image data are "reverse gamma" corrected to compensate for the non-linearity of this data, thereby producing linearized scalar RGB values. (The original input data is supplied with the expectation that it will be used in a display or projector that may have a gamma value of about 2.2, for example.) In operation 120, the outer product of the scalar RGB values and the projector matrix is taken to express the input image data as CIE XYZ tristimulus values. In operation 130, the tristimulus values are converted to a visual color space. The transformation to a visual color space enables perceptual modeling to be performed, which characterizes the interdependencies of color, contrast, and brightness, and allows the perception of memory colors to be preserved. The visual color space may be an opponent color space that accurately models constant perceived hue, and has the dimensions of lightness, yellow-blue, and red-green.
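The first stages of the algorithm can be sketched in Python as below: undo the encoding gamma (operation 110), then apply a primary matrix to obtain CIE XYZ tristimulus values (operation 120), which would then be handed to a visual color space (operation 130). The gamma value and the sRGB/D65-like primary matrix are stand-ins; the disclosure does not fix specific numbers at this point.

import numpy as np

GAMMA = 2.2
# Rows give the X, Y, Z contributions of the R, G, B primaries (sRGB/D65 values as an example).
RGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                       [0.2126, 0.7152, 0.0722],
                       [0.0193, 0.1192, 0.9505]])

def encoded_rgb_to_xyz(rgb_encoded: np.ndarray) -> np.ndarray:
    """rgb_encoded: gamma-encoded values in [0, 1], shape (..., 3)."""
    linear = np.clip(rgb_encoded, 0.0, 1.0) ** GAMMA   # operation 110: reverse gamma
    return linear @ RGB_TO_XYZ.T                       # operation 120: primaries -> XYZ

xyz = encoded_rgb_to_xyz(np.array([[0.5, 0.5, 0.5], [1.0, 0.0, 0.0]]))
print(xyz)   # operation 130 would convert these tristimulus values to a visual color space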
In operation 140, the predicted appearance attributes of lightness, chroma, and hue in the visual color space are computed. In operation 150, the enhanced lightness, chroma, and hue for colors to be rendered are computed. Operation 150 may include steps 152, 154, and 156 for maintaining memory colors in the rendering of the image.
In applications in which there are specific memory colors to be preserved, operation 150 of the method 100 may include steps 152, 154, and 156. More specifically, the method 100 may include the step 152 of identifying the memory colors in the input image data 50 to be substantially maintained. This may be done based on intuition and experience and/or market research data. It is known that observers of an image depicting human subject matter (such as a movie or television program) will find it objectionable if the colors of the skin, and faces in particular, of the humans in the image do not match those colors that they have in their respective memories of how the humans should look. They will perceive the humans as "not looking right," if they are too pink, orange, dark, light, etc. In like manner, certain other memory colors, such as "grass green" and "sky blue" must be rendered so as to appear as the observers remember them from experience. Regardless of how satisfactory the other colors in the image appear, the observers will find a product that does not render memory colors accurately to not be perceptually optimal, and will likely not buy the product, whether the product is an imaging device such as a television, or a movie to be viewed in a theater.
Once the memory colors are chosen, they are characterized with respect to their chromaticities in step 154 from both empirical data and the perceptual context in which they are seen. For instance, it is well understood that humans remember green grass and blue sky as more saturated than the actual stimuli. And, within reason, no matter the color of an illuminant, humans will remember a banana to appear to be a certain yellow (which may also be a memory color). Furthermore, these memory colors are not distributed across the extent of perceptual color in any systematic way. Hence, their representations must necessarily be made in a multivariate, three dimensional, statistical sense and their rendering accomplished in a purely appearance or vision based color space. Algorithms may be employed using visual mathematics which ensure that the memory colors are specified in terms of perceived colors.
In step 156, the enhanced lightness, chroma, and hue for non memory colors and memory colors are also computed. It is noted that in the color space of the input image data, a given memory color is not a single point within the space. To the contrary, memory colors are regions within the color space that are to be left at least perceptually unchanged, or much less changed during the color transformations of the instant methods to produce enhanced images. By way of example, the memory color "flesh tone" is a range of colors corresponding to the colors of very dark-skinned peoples of African ethnicity to very light skinned Caucasians or Asians. Accordingly, the memory colors are identified and characterized such that the colors within this region will be left unchanged or minimally changed in the color transformations.
Additionally, these memory colors may be characterized as not having rigid, discrete boundaries; this may be done so that in the color transformations to be performed, there is not a discontinuity in the degree of color change at a boundary of a memory color, as explained previously with reference to FIG. 2. In one embodiment, the memory color may be modeled using a probability distribution that provides a smooth transition from regions in the color space that are non-memory colors to the region defined as the particular memory color. Any smoothing function that changes the local multi-dimensional derivatives smoothly will be satisfactory. The probability distribution may use non-linear enhancement functions. An exemplary overall nonlinear function that may be used is
Output = 0.0001 + (1.5 × Input^EXP) / (0.5 + Input^EXP)
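A short Python rendering of this nonlinear function is given below, with the exponent EXP left as a parameter since the disclosure does not state a single value for it here.

import numpy as np

def sigmoidal_enhance(x: np.ndarray, exp: float = 2.0) -> np.ndarray:
    """x: normalized input values in [0, 1]."""
    xe = np.asarray(x, dtype=float) ** exp
    return 0.0001 + (1.5 * xe) / (0.5 + xe)

x = np.linspace(0.0, 1.0, 5)
print(sigmoidal_enhance(x))   # S-shaped response: low values are pulled down, high values
                              # pushed up, increasing contrast while the end points stay near 0 and 1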
In operation 160, the enhanced lightness, chroma, and hue of the visual color space are converted to enhanced CIE XYZ tristimulus values. In operation 170, the enhanced CIE XYZ tristimulus values are converted to enhanced RGB scalar values with "white channel." In operation 180, gamma correction of the enhanced RGB scalar values is performed to produce a 3DLUT containing enhanced RGB values with white channel. The 3DLUT may then be used in the method 200 of FIG. 7.
FIG. 6 concludes with a simple statement of the net effect of the operations 110 - 180. The 3DLUT, which is of at least three dimensions, is created as a discrete sampling of the visual model and contrast/color/brightness HVS perceptual improvement mathematics, and may include preservation of memory colors. Referring also to FIG. 5, the at least 3DLUT 54 may be generated by the CPU 34 of the imaging system 30 according to an algorithm 52 stored in memory 36 or on the readable storage medium 38. Alternatively, the at least 3DLUT 54 may be generated by another computing system and uploaded to the system computer 32. The algorithm 52 of FIG. 5 for generating the at least 3DLUT 54 may be algorithm 100 of FIG. 6.
FIG. 7 is a flowchart depicting one method for rendering a color image in accordance with the present disclosure. The method may be performed using the imaging system 30 depicted in FIG. 5. Referring again to FIGS. 5 and 7, in step 210, the 3DLUT 54, which may be produced according to the algorithm 100 of FIG. 6, is loaded into the memory 36 or the readable storage medium 38 of the imaging device 30. In step 220, the input image data from the source 50 is communicated to the CPU 34. The input image data may be of a first input color standard, and may be converted into an input color specification for inputting into the at least three-dimensional look-up table. In step 230, the input image data is processed with an algorithm 56 that may be stored in memory 36, using the at least three-dimensional look-up table 54 to produce rendered image data. In step 240, the rendered image data is output to the image display/projection device 40, and a high brightness, high contrast, and high colorfulness image is displayed or projected in step 250. The image may include human visual system perceptually accurate memory colors. The method 200 may be repeatedly performed at a high rate on sequences of image input data, such as at the rate of 24 or 48 "frames per second" used in digital cinema, or such as at the rate of 30, 60, 120 or 240 frames per second used in consumer displays.
Referring again to FIG. 5, and in one embodiment, the 3DLUT 54 of input colors and output colors may contain, or the values therein may be determined from, the definition of secondary colors, and enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of secondary colors by the image rendering unit 40. In another embodiment, the 3DLUT 54 of input colors and output colors may contain, or be determined from, a transformation from a suboptimal viewing environment to an improved viewing environment including the visual adaptation of the human visual system.
In another embodiment, the method may include providing input image data 50 of a first color gamut, and an image rendering unit 40 having a second, expanded or reduced color gamut. The 3DLUT 54 of values of input colors and output colors is generated, wherein the values in the 3DLUT 54 are calculated based upon a visual model of the human visual system, thereby expanding the input image data 50 to encompass the second color gamut of the image rendering unit 40.
In another aspect of the invention, the image rendering unit 40 may be provided with some color modification capability that is built in or embedded in hardware or software. For example, the device may be provided with an algorithm to add white or secondary colors, resulting in a loss of colorfulness, and a distortion in the appearance of memory colors. In such circumstances, the output values in the 3DLUT 54 are determined such that the input image data 50 is processed to compensate for the color modification performed by the image rendering unit 40. The method may thus include providing the 3DLUT 54 to adjust the color data in a manner that shifts it in a direction within the color space that compensates for the embedded color modification that is performed by the image rendering unit 40. The 3DLUT 54 further comprises processing the input image data to increase perceived color, brightness, and contrast to compensate for the reduction in perceived color, brightness, and contrast caused by the algorithm for color modification in the image rendering unit 40.
In a more specific instance in which the image rendering unit 40 includes an algorithm for creating secondary colors from primary colors, the 3DLUT 54 may further comprise compensating for the color modification performed by the addition of the secondary colors in the image rendering unit 40. The values in the 3DLUT 54 may also be determined such that the 3DLUT 54 further comprises processing the input image data 50 to include chromatic adaptation of the human visual system to a specified white point that increases the brightness of the image rendering unit 40.
In another aspect of the invention, the image rendering unit 40 may unintentionally contain some color modification capability resulting from variation in one or more parameters of the unit 40. For example, if the image rendering unit 40 is an OLED display, then over the life of the display, color modification may occur due to the differing life spans between blue OLED and red and green OLEDs of the display, as described previously herein. During the operation of the OLED display, the differential color change between the blue OLED and the red and green OLEDs will change the color balance of the display if no countermeasures are instituted.
In such circumstances, the output values in the 3DLUT 54 may be determined such that the input image data 50 is processed to compensate for the predicted decrease in luminance of the blue OLED. The method may thus include providing the 3DLUT 54 to adjust the color data in a manner that shifts it in a direction within the color space that compensates for decreasing blue OLED luminance. The 3DLUT 54 further comprises processing the input image data to increase perceived color, brightness, and contrast to compensate for the reduction in perceived color, brightness, and contrast caused by the continual loss of blue OLED luminance.
The 3DLUT 54 may also adjust the colorfulness, contrast, or brightness of the image to be produced to appear as it would in an image from an analog film system or digital system used in cinemas. It is known that film is generally not designed to reproduce color as the eye sees it at the filming site. (A color gamut 416 for film is shown in FIG. 3.) Instead, the colors in film images have increased contrast and increased colorfulness in anticipation of the viewing environment in which the film images will be observed. It is also known that digital systems aim to match the look of film images. Accordingly, the 3DLUT may be designed to provide the same effect in a cinema.
The production of the 3D LUT 54 is not limited only to the algorithm 100 of FIG. 6. Bit depth modification and interpolation as described herein may also be applied to all of the applications herein which include the use of 3DLUTs. The 3DLUT may vary in bit depth, depending upon the capacity of the memory 36 and the processing power of the CPU 34. In one embodiment, the 3DLUT may be a twelve bit table with 4096 x 4096 x 4096 discrete addresses containing three or more color values of predetermined bit precision. In another embodiment, some bits of the table may be used for interpolation between adjacent values. For example, the final two bits of respective adjacent table values may be used in interpolating colors between them. Other methods of multi-dimensional interpolation are known, and are included in embodiments of implementing the 3DLUT. Additionally, the input data may contain more than three primary colors such as RGB. For example, the input data may contain RGBCMY (wherein C = cyan, M = magenta, and Y = yellow), or some lesser combination such as RGBCM. In such an instance, the 3DLUT could have outputs of RGBCMYW.
Depending upon the particular application, the algorithm 100, or other algorithms that may further include bit depth modification and interpolation, may be used to produce more than one 3DLUT. One factor that may be used to determine the values in the 3DLUT is the set of characteristics of the display or projection device. Referring again to FIG. 5, different 3DLUTs 54 may be produced for different image output devices, for example, an LCD display 42, a lamp-and-color-wheel DMD projector 46, and an LED DMD projector 48. The characteristics of the display or projection device 40 include the "color engine" of the device, and whether it includes only RGB as the primary colors, or has more than three colors. The 3DLUTs 54 may be losslessly compressed to reduce storage use in the memory 36 of the image color rendering controller 32.
Other factors pertain to the "surround," i.e., the viewing environment of the display or projection device 40, such as the ambient lighting of the room in which the display or projection occurs, and the lighting and/or surface immediately surrounding the display/projection screen. In general, the 3DLUT values provide a displayed/projected image having more contrast, brightness, and colorfulness for any "surround", i.e., viewing environment; for example, a particular room lighting and any conversion from that room lighting to an improved room lighting. If the room lighting is darker or brighter than a desired level, the generation of the 3DLUT 54 may include a visual adaptation transformation to produce a perception of improved viewing environment. The visual adaptation transformation is based upon visual models that may include models of the adaptation of human vision to viewing environments. For example, in a dark room there is essentially no ambient lighting (other than minimal safety and exit lighting), but a visual adaptation transformation may be used to increase contrast and colorfulness in a manner analogous to that used in motion picture print film, providing the perception of an improved viewing environment to an observer. As the room lighting increases and the image brightness increases to about the same level, the adaptation transformation is still needed because the room lighting is still not as bright as daytime outdoor lighting, while the ambient lighting must be compensated for. In summary, the visual adaptation transformation implemented in the 3DLUT 54 uses visual adaptation models to produce the effect of improved viewing environment.
Other factors in generating the 3DLUT 54 may include knowledge of the different sensitivities to colorfulness in different worldwide regions, or the intended use of the displayed/projected images; for example, whether the images are viewed in a video game that is being played, or viewed as a movie or television program.
These multiple 3DLUTs 54, or a subset of them, may be stored in the memory 36 of the computer 32 of the device 30. Additionally, data on the viewing environment factors 58 may be stored in memory. The image device 30 may include a keyboard (not shown) or other input device to access a user interface (not shown) that may be displayed on the display or projector 40 (or other user interface screen). The user interface may offer the capability of inputting data on the viewing environment factors 58, and/or other factors, such that the optimum 3DLUT is selected from the stored 3DLUTs 54 for the particular display or projector 40 and viewing environment. In that manner, the most perceptually optimal images are provided to the user by the system 30. The 3DLUTs 54 are effective for the enhancement of a variety of images, including but not limited to games, movies, or personal photos. Additionally, some improvement of grey scale images is attained by the resulting contrast and brightness enhancement thereof.
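As an illustrative sketch of this selection step only, the following assumes the stored tables are keyed by device type and viewing environment; the key names, placeholder tables, and `select_3dlut` helper are invented for the example.

```python
import numpy as np

# Hypothetical catalogue of pre-built 3DLUTs keyed by device type and viewing
# environment; the keys and the placeholder tables are illustrative only.
N = 17  # nodes per axis of each coarse table
STORED_LUTS = {
    ("lcd_display", "dark"): np.zeros((N, N, N, 3)),
    ("lcd_display", "bright"): np.zeros((N, N, N, 3)),
    ("led_dmd_projector", "dark"): np.zeros((N, N, N, 3)),
}
GENERIC_LUT = np.zeros((N, N, N, 3))

def select_3dlut(device: str, environment: str) -> np.ndarray:
    """Return the stored 3DLUT that best matches the connected rendering unit
    and the viewing-environment data entered through the user interface,
    falling back to a generic table when no exact match exists."""
    return STORED_LUTS.get((device, environment), GENERIC_LUT)
```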
The 3DLUT 54 may be produced according to variants of the method 200 such that it has additional or alternative characteristics. The values in the 3DLUT 54 may be provided to convert a first color gamut of an input image data set 50 to encompass a second expanded or reduced color gamut of an image rendering unit 40 that is connectable to the device 30. The 3DLUT 54 may contain a transformation from a suboptimal viewing environment to an improved viewing environment in which the color image is to be observed, including the visual and chromatic adaptation of the human visual system. The 3DLUT 54 may contain the definition of secondary colors, and enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of secondary colors by an image rendering unit 40 that is connectable to the device 30.
In another aspect of the invention, the methods of producing a color image may include input color standard transformation and color output calibration of the image rendering device that is in use. This is best understood with reference to FIG. 8, which is a schematic diagram of an alternative method 300 for producing a color image that includes such color output calibration. The diagram includes color output calibration operations 350, 360, and 370; however, for the sake of clarity, the entire method depicted in FIG. 8 will be described, with reference also to FIGS. 6 and 7.
In operation 310 ("Gamma1"), the input values of R, G, and B are reverse gamma corrected to compensate for the non-linearity of this input data standard, thereby producing linearized scalar values Ri, Gi, and Bi. This correction may be done using the respective one dimensional lookup tables 311, 312, and 313. The input values of R, G, and B may be between 8 and 12 bits (314 in FIG. 8) inclusive. The output values of Ri, Gi, and Bi may have 16 bit resolution (315 in FIG. 8), depending upon the architecture of the image color rendering controller 32, and also upon the need for the greater bit depth of the imaging standards being used. The input R, G, and B values may be provided from various devices, such as a video camera having an output in accordance with standard BT.709. In such circumstances, the value of gamma used in the correction may be 2.2. The input R, G, and B values may be provided in accordance with other imaging standards, and other values of gamma and other 1D lookup tables 311, 312, and 313 may consequently be used in the reverse gamma correction as needed.
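A minimal sketch of such a reverse gamma correction, assuming 8 bit inputs, 16 bit outputs, and a simple power-law gamma of 2.2, is given below; the helper name and table construction are illustrative, not taken from the disclosure.

```python
import numpy as np

def build_reverse_gamma_lut(in_bits: int = 8, out_bits: int = 16, gamma: float = 2.2) -> np.ndarray:
    """Build a 1D table ("Gamma1") mapping gamma-encoded input codes to
    linearized scalar values at higher bit precision."""
    codes = np.arange(1 << in_bits) / ((1 << in_bits) - 1)    # normalize codes to 0..1
    linear = codes ** gamma                                   # undo the encoding non-linearity
    return np.round(linear * ((1 << out_bits) - 1)).astype(np.uint16)

# One table per channel (cf. 311, 312, 313 in FIG. 8), applied by indexing:
lut_r = lut_g = lut_b = build_reverse_gamma_lut()
# r_i, g_i, b_i = lut_r[r], lut_g[g], lut_b[b]
```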
In operation 320 ("Color Transform"), every color value in the image data stream 319 represented by a unique Ri, Gi, and Bi combination is then operated on by a 3 x 3 matrix, determined by the particular imaging standard being used, to perform a color transformation to Rii, Gii, and Bii values that are linearized scalar values referenced to the standard BT.709. The Rii, Gii, and Bii values may be provided with a bit resolution of up to 16 bits as indicated in FIG. 8.
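The per-pixel matrix operation might be sketched as follows; the identity matrix shown is only a placeholder for the standard-specific coefficients that map the input primaries to BT.709-referenced values.

```python
import numpy as np

# Placeholder matrix: the actual 3 x 3 coefficients depend on the imaging
# standard of the input data and convert its primaries to linear values
# referenced to BT.709.  The identity merely stands in for those coefficients.
STANDARD_TO_BT709 = np.eye(3)

def color_transform(rgb_i: np.ndarray) -> np.ndarray:
    """Apply the per-pixel 3 x 3 transform of operation 320 to linearized
    R_i, G_i, B_i values, producing R_ii, G_ii, B_ii referenced to BT.709.
    rgb_i: array of shape (..., 3)."""
    return rgb_i @ STANDARD_TO_BT709.T
```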
In operation 330 ("Gamma2"), the values of Rii, Gii, and Bii are gamma encoded to re-introduce a non-linearity into the processed data, thereby producing gamma encoded values Riii, Giii, and Biii for input to the 3D color tables. This encoding may be done using the respective one dimensional lookup tables 331, 332, and 333, using a gamma encoding factor of 1/2.2, in one embodiment. Other factors may be suitable, depending upon the particular imaging standards being used. The resulting values of Riii, Giii, and Biii may be reduced to 10 bit resolution as shown in FIG. 8, to enable sufficiently fast subsequent processing using the 3D color tables 54. The gamma encoding enables a reduction in the number of bits from 16 for linear data to far fewer for gamma encoded data, such as 10 bits, without artifacts. This makes the at least 3D table much smaller. It is effective to use fewer gamma encoded bits because the eye responds to image data in a manner analogous to a gamma encoder.
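For illustration, the gamma encoding and bit-depth reduction of operation 330 could be sketched as below, assuming a simple power-law encoding of 1/2.2 applied to 16 bit linear values and a 10 bit output; the function name is hypothetical.

```python
import numpy as np

def gamma_encode_to_10bit(linear16: np.ndarray, gamma: float = 1 / 2.2) -> np.ndarray:
    """Operation 330 ("Gamma2") as a sketch: re-introduce a non-linearity and
    reduce 16 bit linear values to 10 bit codes for addressing the 3D tables.
    Because the encoding roughly matches the eye's response, far fewer bits
    are needed than for linear data."""
    normalized = linear16.astype(np.float64) / 65535.0
    encoded = normalized ** gamma                        # power-law encoding, 1/2.2 here
    return np.round(encoded * 1023).astype(np.uint16)    # 10 bit output codes
```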
In operation 340, the three dimensional color tables 54 are used to process the RiiiGiiiBiii data to produce output image RivGivBivWiv data for display or projection. In this embodiment, the table 54 is 3D in (RGB) and 4D out (RGBW). Other table structures of at least three dimensions may be used, depending upon the particular application. Additionally, for the sake of simplicity of illustration, only one table 54 is shown in FIG. 8; however, it is to be understood that there is a first 3D LUT for determining Riv, a second 3D LUT for determining Giv, a third 3D LUT for determining Biv, and a fourth 3D LUT for determining Wiv, where a white channel is implemented. In this embodiment, the white channel could drive a white emitter of an OLED display, or could be the signal that drives the combination of RGB to make the image rendering device brighter. Alternatively, the white could be replaced with cyan, or some other color, in a four-color image rendering device such as a four-color TV. The RivGivBivWiv data may be provided at a 12 bit resolution as indicated in FIG. 8.
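A sketch of the 3D-in/4D-out lookup is given below for illustration; the node count, the placeholder table contents, and the nearest-node addressing are assumptions of the example, and a practical controller would interpolate between nodes as discussed earlier.

```python
import numpy as np

# Hypothetical 3D-in / 4D-out table indexed by 10 bit R_iii, G_iii, B_iii codes
# and returning 12 bit R_iv, G_iv, B_iv, W_iv values.  Conceptually this is four
# 3D LUTs (one per output channel) stored in a single array for brevity.
N = 33                                        # illustrative node count per axis
lut_rgbw = np.zeros((N, N, N, 4), dtype=np.uint16)

def apply_rgb_to_rgbw(r_iii: int, g_iii: int, b_iii: int) -> np.ndarray:
    """Nearest-node lookup shown for brevity; a real controller would
    interpolate between nodes as described earlier."""
    scale = (N - 1) / 1023.0
    i, j, k = round(r_iii * scale), round(g_iii * scale), round(b_iii * scale)
    return lut_rgbw[i, j, k]                  # -> [R_iv, G_iv, B_iv, W_iv]
```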
At this point, the RivGivBivWiv data, including the addition of white for increased brightness or for color management of OLED devices, may represent a generic display with typical color primaries and linearity. Additionally, however, further operations may be performed to further optimize the RivGivBivWiv data by calibration for the particular image rendering unit (display or projector) 40 that is in use. The measurement or specification of this particular image rendering unit 40 can be done in manufacturing, or done on-site by a technician with conventional linearity and primary color measuring tools.
Referring again to FIG. 8, in operation 350 ("Gamma3"), the RivGivBivWiv data is first reverse gamma-corrected to produce RvGvBvWv data. This correction may be done using the respective one dimensional lookup tables 351, 352, 353, and 354. The output values of Rv, Gv, Bv, and Wv may have 16 bits. The value of gamma used in the correction may be 2.2, or another value in accordance with the gamma encoder 310.
In operation 360 ("Color Calibration"), every color value in the image data stream 359 represented by a unique Rv, Gv, Bv, and, in many cases, Wv combination is then operated on by a 4 x 4 matrix. This 4 x 4 matrix is produced for, and is unique to, the particular image rendering unit 40 of FIG. 5 that is in service. The matrix is calculated from measured or specified values that define the color primaries of the particular image rendering unit 40. The purpose of the operation is to convert from the assumed or generic color primaries in the at least 3D color table to the actual primaries of the image rendering unit 40. The visual effect is to adjust the white point and the rest of the colors so they are not "tinted" (e.g., a little yellow or blue), because the image rendering unit may have slightly different color primaries than were assumed in creating the at least 3D table. For standard televisions or projectors, those assumptions are in accordance with the aforementioned BT.709 standard, because most TVs, displays, and projectors adhere to this standard. A given image rendering device may nevertheless be tinted, e.g., more yellow, so the calibration matrix compensates for that variation. The Rvi, Gvi, Bvi, and Wvi values may be provided with a bit resolution of up to 16 bits.
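For illustration, the calibration step might be sketched as follows; the identity matrix is only a placeholder for the unit-specific coefficients derived from the measured or specified primaries.

```python
import numpy as np

# Placeholder calibration matrix: the real 4 x 4 coefficients are computed from
# the measured or specified primaries of the particular image rendering unit so
# that the generic RGBW values drive the actual primaries without a tint.
CALIBRATION_4X4 = np.eye(4)

def calibrate_rgbw(rgbw_v: np.ndarray) -> np.ndarray:
    """Operation 360 as a sketch: map linearized R_v, G_v, B_v, W_v values to
    device-referred R_vi, G_vi, B_vi, W_vi values using the unit-specific
    4 x 4 matrix.  rgbw_v: array of shape (..., 4)."""
    return rgbw_v @ CALIBRATION_4X4.T
```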
In operation 370 ("Calibration"), the Rvi, Gvi, Bvi, and Wvi values are gamma encoded to introduce the correct non-linearity into the processed data for the image rendering unit 40, thereby producing the Rvii, Gvii, Bvii, and Wvii values that, when used by the particular image rendering unit 40 to project or display the image, produce the chosen non-linearity defined by the 3D table. This encoding may be done using the respective one dimensional lookup tables 371, 372, 373, and 374. In one embodiment, a gamma encoding factor of 1/2.2 may be used. Other factors may be suitable, depending upon the particular image rendering unit 40. The resulting values of Rvii, Gvii, Bvii, and Wvii may be output having between 8 and 12 bit resolution as indicated in FIG. 8.
It is, therefore, apparent that there has been provided, in accordance with the present invention, methods and devices for producing a color image. Having thus described the basic concept of the invention, it will be rather apparent to those skilled in the art that the foregoing detailed disclosure is intended to be presented by way of example only, and is not limiting. Various alterations, improvements, and modifications will occur to those skilled in the art, though not expressly stated herein. These alterations, improvements, and modifications are intended to be suggested hereby, and are within the spirit and scope of the invention. Additionally, the recited order of processing elements or sequences, or the use of numbers, letters, or other designations therefor, is not intended to limit the claimed processes to any order except as may be specified in the claims. Accordingly, the invention is limited only by the following claims and equivalents thereto.

Claims

We claim:
1. A method of producing a color image, the method comprising:
a. providing input image data;
b. generating an at least three-dimensional look-up table of values of input colors and output colors, wherein the values in the lookup table convert the input image color data to output image color data in an image rendering unit;
c. loading the at least three-dimensional look-up table into an image color rendering controller;
d. loading the input image data into the imaging color rendering controller;
e. processing the input image data through the at least three-dimensional lookup table to produce output color values stored at the addresses in the at least three-dimensional look-up table; and
f. outputting the output color values to the image rendering unit to produce an output image that is perceived to have at least one of enhanced brightness, enhanced contrast, and enhanced colorfulness compared to the input image.
2. The method of claim 1 , wherein the values in the look-up table are calculated based upon a visual model of the human visual system.
3. The method of claim 1 , wherein the at least one of enhanced brightness, enhanced contrast, and enhanced colorfulness introduced by the at least three dimensional look- up-table produce a chosen artistic perception in the output image.
4. The method of claim 1 , wherein the image rendering unit is of an expanded color gamut greater than the color gamut of the input image data, and wherein the output colors to the image rendering unit utilize the expanded color gamut.
5. The method of claim 1 , wherein the image rendering unit is of a smaller color gamut than the color gamut of the input image data, and wherein the output colors to the image rendering unit utilize the smaller color gamut.
6. The method of claim 1 , wherein the input image data contains memory colors and non-memory colors, and the method comprises identifying the memory colors in the input image data to be substantially maintained, characterizing the memory colors and non-memory colors with respect to their chromaticities, and producing an image with substantially maintained memory colors using the image rendering unit.
7. The method of claim 6, wherein perceived colorfulness, brightness, and contrast of the non-memory colors are changed differently than perceived colorfulness, brightness, and contrast of the memory colors.
8. The method of claim 7, wherein perceived colorfulness, brightness and contrast of the non-memory colors are increased more than perceived colorfulness, brightness, and contrast of the memory colors.
9. The method of claim 6, wherein generating the at least three-dimensional look-up table includes computing enhanced lightness, chroma, and hue for the memory colors using a non-linear enhancement function.
10. The method of claim 6, further comprising generating more than one at least three- dimensional look-up table for the color transformation of the non-memory colors and the memory colors.
11. The method of claim 6, further comprising generating more than one at least three-dimensional look-up table for the color transformation of the non-memory colors and the memory colors, wherein each of the at least three dimensional look-up tables is optimized for a different viewing environment of the image rendering unit.
12. The method of claim 11, further comprising selecting one of the at least three-dimensional look-up tables for loading into the image color rendering controller based upon the viewing environment of the image rendering unit.
13. The method of claim 12, further comprising providing a sensor for measuring the ambient light in the viewing environment.
14. The method of claim 1 , wherein the input image data is of a first color standard, and the method further comprises converting the input image data of the first input color standard into an input color specification for inputting into the three-dimensional look-up table.
15. The method of claim 1 , wherein the at least three-dimensional look-up table has at least three input colors.
16. The method of claim 1 , wherein the at least three-dimensional look-up table has at least three output colors.
17. The method of claim 16, wherein the at least three output colors are any combination of primary colors as independent light sources or secondary colors defined as combinations of primary colors.
18. The method of claim 1 , wherein the at least three dimensional look-up table is losslessly compressed to reduce storage use in a memory of the image color rendering controller.
19. The method of claim 1 , further comprising calibrating the image rendering unit by measuring the color response of the image rendering unit, and then modifying the output image data by one of additional processing after the at least three-dimensional look-up-table or including the required calibration in the at least three-dimensional lookup-table.
20. The method of claim 1 , wherein the image color rendering controller is contained within the image rendering unit.
21. The method of claim 1, wherein the imaging color rendering controller is external to the image rendering unit.
22. The method of claim 1 , wherein an auxiliary imaging device controller is in communication with the image color rendering controller and the image rendering unit.
23. The method of claim 1 , wherein the image color rendering controller is in communication with the image rendering unit selected from a projector, a television, a computer display, and a game display, the image rendering unit using DMD, plasma, liquid crystal, liquid crystal-on-silicon modulation, or direct modulation of the light source, and using LED, OLED, laser, or lamp light sources.
24. The method of claim 1 , wherein the image color rendering controller is in communication with at least one of a cable TV set-top box, a video game console, a personal computer, a computer graphics card, a DVD player, a Blu-ray player, a broadcast station, an antenna, a satellite, a broadcast receiver and processor, and a digital cinema.
25. The method of claim 1 , wherein the image rendering unit includes an algorithm for color modification, wherein the at least three-dimensional look-up table further comprises processing the input image data to compensate for the color modification performed by the image rendering unit.
26. The method of claim 25, wherein the image rendering unit includes an algorithm for creating secondary colors from primary colors, and the at least three-dimensional look-up table further comprises compensating for the color modification performed by the addition of the secondary colors in the image rendering unit.
27. The method of claim 25, wherein the at least three-dimensional look-up table further comprises processing the input image data to increase perceived color, brightness, and contrast to compensate for the reduction in perceived color, brightness, and contrast caused by the algorithm for color modification in the image rendering unit.
28. The method of claim 1 , wherein the at least three-dimensional look-up table contains a transformation from a suboptimal viewing environment to an improved viewing environment including the visual adaptation of the human visual system.
29. The method of claim 1 , wherein the at least three-dimensional look-up table includes the definition of secondary colors, and contains enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of the secondary colors by the image rendering unit.
30. The method of claim 1 , wherein the at least three-dimensional look-up table further comprises processing the input image data to include chromatic adaptation of the human visual system to a specified white point that increases the brightness of the image rendering unit.
31. The method of claim 1, wherein the color image is a 3D color image.
32. A method of producing a color image, the method comprising:
a. providing input image data of a first color gamut and an image rendering unit of a second, different color gamut;
b. generating an at least three-dimensional look-up table of values of input colors and output colors, wherein the values in the lookup table change the input image data to encompass the second color gamut of the image rendering unit; c. loading the at least three-dimensional look-up table into an image color rendering controller;
d. loading the input image data into the imaging color rendering controller;
e. processing the input image data through the at least three-dimensional look- up table using the input image data as addresses into the at least three- dimensional look-up table to produce output image data from the output color values stored at the addresses in the at least three-dimensional look-up table; and
f. outputting the output image data to the image rendering unit to produce an output image that is perceived to have at least one of enhanced brightness, enhanced contrast, and enhanced colorfulness compared to the input image.
33. The method of claim 32, wherein the values in the lookup table are calculated based upon a visual model of the human visual system.
34. The method of claim 32, wherein the at least one of enhanced brightness, enhanced contrast, and enhanced colorfulness introduced by the at least three dimensional look-up-table produce a chosen artistic perception in the output image.
35. The method of claim 32, wherein the image rendering unit is of an expanded color gamut greater than the color gamut of the input image data, and wherein the output colors to the image rendering unit utilize the expanded color gamut.
36. The method of claim 32, wherein the image rendering unit is of a smaller color gamut than the color gamut of the input image data, and wherein the output colors to the image rendering unit utilize the smaller color gamut.
37. The method of claim 32, wherein the color image contains memory colors and non-memory colors, and the method comprises identifying the memory colors in the input image data to be substantially maintained, characterizing the memory colors and non-memory colors with respect to their chromaticities, and producing an image with substantially maintained memory colors using the image rendering unit.
38. The method of claim 37, wherein perceived colorfulness, brightness, and contrast of the non-memory colors are changed differently than perceived colorfulness, brightness, and contrast of the memory colors.
39. The method of claim 37, wherein perceived colorfulness, brightness, and contrast of the non-memory colors are increased more than perceived colorfulness, brightness, and contrast of the memory colors.
40. The method of claim 37, wherein generating the at least three-dimensional look-up table includes computing enhanced lightness, chroma, and hue for the memory colors using a non-linear enhancement function.
41. The method of claim 37, further comprising generating more than one at least three-dimensional look-up table for the color transformation of the non-memory colors and the memory colors.
42. The method of claim 37, further comprising generating more than one at least three-dimensional look-up table for the color transformation of the non-memory colors and the memory colors, wherein each of the at least three dimensional look-up tables is optimized for a different viewing environment of the image rendering unit.
43. The method of claim 42, further comprising selecting one of the at least three- dimensional look-up tables for loading into the image color rendering controller based upon the viewing environment of the image rendering unit.
44. The method of claim 43, further comprising providing a sensor for measuring the ambient light in the viewing environment.
45. The method of claim 32, further comprising converting the input image data of a first input color standard into an input color specification for inputting into the three- dimensional look-up table.
46. The method of claim 32, wherein the at least three-dimensional look-up table has at least three input colors.
47. The method of claim 32, wherein the at least three-dimensional look-up table has at least three output colors.
48. The method of claim 47, wherein the at least three output colors are any combination of primary colors as independent light sources or secondary colors defined as combinations of primary colors.
49. The method of claim 32, wherein the at least three dimensional look-up table is losslessly compressed to reduce storage use in a memory of the image color rendering controller.
50. The method of claim 32, further comprising calibrating the image rendering unit by measuring the color response of the image rendering unit, and then modifying the output image data by one of additional processing after the at least three-dimensional look-up-table or including the required calibration in the at least three-dimensional lookup-table.
51. The method of claim 32, wherein the image color rendering controller is contained within the image rendering unit.
52. The method of claim 32, wherein the imaging color rendering controller is external to the image rendering unit.
53. The method of claim 32, wherein an auxiliary imaging device controller is in communication with the image color rendering controller and the image rendering unit.
54. The method of claim 32, wherein the image color rendering controller is in communication with the image rendering unit selected from a projector, a television, a computer display, and a game display, the image rendering unit using DMD, plasma, liquid crystal, liquid crystal-on-silicon modulation, or direct modulation of the light source, and using LED, OLED, laser, or lamp light sources.
55. The method of claim 32, wherein the image color rendering controller is in communication with at least one of a cable TV set-top box, a video game console, a personal computer, a computer graphics card, a DVD player, a Blu-ray player, a broadcast station, an antenna, a satellite, a broadcast receiver and processor, and a digital cinema.
56. The method of claim 32, wherein the image rendering unit includes an algorithm for color modification, and wherein the at least three-dimensional look-up table further comprises processing the input image data to compensate for the color modification performed by the image rendering unit.
57. The method of claim 56, wherein the image rendering unit includes an algorithm for creating secondary colors from primary colors, and the at least three-dimensional look-up table further comprises compensating for the color modification performed by the addition of the secondary colors in the image rendering unit.
58. The method of claim 56, wherein the at least three-dimensional look-up table further comprises processing the input image data to increase perceived color, brightness, and contrast to compensate for the reduction in perceived color, brightness, and contrast caused by the algorithm for color modification in the image rendering unit.
59. The method of claim 32, wherein the at least three-dimensional look-up table contains a transformation from a suboptimal viewing environment to an improved viewing environment including the visual adaptation of the human visual system.
60. The method of claim 32, wherein the at least three-dimensional look-up table includes the definition of secondary colors, and contains enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of the secondary colors by the image rendering unit.
61. The method of claim 32, wherein the at least three-dimensional look-up table further comprises processing the input image data to include chromatic adaptation of the human visual system to a specified white point that increases the brightness of the image rendering unit.
62. The method of claim 32, wherein the color image is a 3D color image.
63. A method of producing a color image by an image rendering unit in a sub-optimal viewing environment, the method comprising:
a. generating an at least three-dimensional look-up table of values of input colors and output colors, the table containing a transformation from a suboptimal viewing environment to an improved viewing environment;
b. loading the at least three-dimensional look-up table into an image color rendering controller;
c. loading input image data into the image color rendering controller; d. processing the input image data through the at least three-dimensional lookup table using the input image data as addresses into the at least three- dimensional look-up table to produce output image data from the output color values stored at the addresses in the at least three-dimensional look-up table; and
e. outputting the output image data to the image rendering unit.
64. The method of claim 63, wherein the values in the look-up table are calculated based upon a visual model of the human visual system.
65. The method of claim 63, wherein the output image data outputted to the image rendering unit produces a chosen artistic perception in the output image.
66. The method of claim 63, wherein the image rendering unit is of an expanded color gamut greater than the color gamut of the input image data, and wherein the output colors to the image rendering unit utilize the expanded color gamut.
67. The method of claim 63, wherein the image rendering unit is of a smaller color gamut than the color gamut of the input image data, and wherein the output colors to the image rendering unit utilize the smaller color gamut.
68. The method of claim 63 wherein the color image contains memory colors and non- memory colors, and the method comprises identifying the memory colors in the input image data to be substantially maintained, characterizing the memory colors and non- memory colors with respect to their chromaticities, and producing an image with substantially maintained memory colors using the image rendering unit.
69. The method of claim 68, wherein perceived colorfulness, brightness, and contrast of the non-memory colors are changed differently than perceived colorfulness, brightness, and contrast of the memory colors.
70. The method of claim 68, wherein perceived colorfulness, brightness and contrast of the non-memory colors are increased more than perceived colorfulness, brightness, and contrast of the memory colors.
71. The method of claim 68, wherein generating the at least three-dimensional look-up table includes computing enhanced lightness, chroma, and hue for the memory colors using a non-linear enhancement function.
72. The method of claim 68, further comprising generating more than one at least three-dimensional look-up table for the color transformation of the non-memory colors and the memory colors.
73. The method of claim 68, further comprising generating more than one at least three-dimensional look-up table for the color transformation of the non-memory colors and the memory colors, wherein each of the at least three dimensional look-up tables is optimized for a different viewing environment of the image rendering unit.
74. The method of claim 73, further comprising selecting one of the at least three- dimensional look-up tables for loading into the imaging device controller based upon the viewing environment of the image rendering unit.
75. The method of claim 74, further comprising providing a sensor for measuring the ambient light in the viewing environment.
76. The method of claim 63, further comprising converting the input image data of a first input color standard into an input color specification for inputting into the three-dimensional look-up table.
77. The method of claim 63, wherein the at least three-dimensional look-up table has three or more input colors.
78. The method of claim 63, wherein the at least three-dimensional look-up table has three or more output colors.
79. The method of claim 78, wherein the three or more output colors are any combination of primary colors as independent light sources or secondary colors defined as combinations of primary colors.
80. The method of claim 79, wherein the at least three dimensional look-up table is losslessly compressed to reduce storage use in a memory of the image color rendering controller.
81. The method of claim 63, further comprising calibrating the image rendering unit by measuring the color response of the image rendering unit, and then modifying the output image data by one of additional processing after the at least three-dimensional look-up-table or including the required calibration in the at least three-dimensional look-up-table.
82. The method of claim 63, wherein the image color rendering controller is contained within the image rendering unit.
83. The method of claim 63, wherein the image color rendering controller is external to the image rendering unit.
84. The method of claim 63, wherein an auxiliary imaging device controller is in communication with the image color rendering controller and the image rendering unit.
85. The method of claim 63, wherein the image color rendering controller is in communication with the image rendering unit selected from a projector, a television, a computer display, and a game display, the image rendering unit using DMD, plasma, liquid crystal, liquid crystal-on-silicon modulation, or direct modulation of the light source, and using LED, OLED, laser, or lamp light sources.
86. The method of claim 63, wherein the image color rendering controller is in communication with at least one of a cable TV set-top box, a video game console, a personal computer, a computer graphics card, a DVD player, a Blu-ray player, a broadcast station, an antenna, a satellite, a broadcast receiver and processor, and a digital cinema.
87. The method of claim 63, wherein the image rendering unit includes an algorithm for color modification, and wherein the at least three-dimensional look-up table further comprises processing the input image data to compensate for the color modification performed by the image rendering unit.
88. The method of claim 87, wherein the image rendering unit includes an algorithm for creating secondary colors from primary colors, and the at least three-dimensional look-up table further comprises compensating for the color modification performed by the addition of the secondary colors in the image rendering unit.
89. The method of claim 87, wherein the at least three-dimensional look-up table further comprises processing the input image data to increase perceived color, brightness, and contrast to compensate for the reduction in perceived color, brightness, and contrast caused by the algorithm for color modification in the image rendering unit.
90. The method of claim 63, wherein the at least three-dimensional look-up table further comprises processing the input image data to include chromatic adaptation of the human visual system to a specified white point that increases the brightness of the image rendering unit.
91 . The method of claim 63, wherein the input image data is of a first color gamut and the image rendering unit is of a second, expanded color gamut; and wherein the values in the lookup table expand the input image data to encompass the second color gamut of the image rendering unit.
92. The method of claim 63, wherein the input image data is of a first color gamut and the image rendering unit is of a second, reduced color gamut; and wherein the values in the lookup table reduce the input image data to encompass the second color gamut of the image rendering unit.
93. The method of claim 63, wherein the at least three-dimensional look-up table includes the definition of secondary colors, and contains enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of the secondary colors by the image rendering unit.
94. The method of claim 63, wherein the color image is a 3D color image.
95. A method of producing a color image, the method comprising:
a. generating an at least three-dimensional look-up table of values of input colors and output colors, the three-dimensional look-up table containing the definition of secondary colors or more than three primary colors;
b. loading the at least three-dimensional look-up table into an image color rendering controller;
c. loading input image data into the image color rendering controller; d. processing the input image data through the at least three-dimensional lookup table using the input image data as addresses into the at least three- dimensional look-up table to produce output image data from the output color values stored at the addresses in the at least three-dimensional look-up table; and
e. outputting the output image data to an image rendering unit to produce an output image that is perceived to have at least one of enhanced brightness, enhanced contrast, and enhanced colorfulness compared to the input image.
96. The method of claim 95, wherein the values in the look-up table are calculated based upon a visual model of the human visual system.
97. The method of claim 95, wherein the at least one of enhanced brightness, enhanced contrast, and enhanced colorfulness introduced by the at least three dimensional look-up-table produce a chosen artistic perception in the output image.
98. The method of claim 95, wherein the image rendering unit is of an expanded color gamut greater than the color gamut of the input image data, and wherein the output colors to the image rendering unit utilize the expanded color gamut.
99. The method of claim 95, wherein the image rendering unit is of a smaller color gamut than the color gamut of the input image data, and wherein the output colors to the image rendering unit utilize the smaller color gamut.
100. The method of claim 95 wherein the color image contains memory colors and non-memory colors, and the method comprises identifying the memory colors in the input image data to be substantially maintained, characterizing the memory colors and non-memory colors with respect to their chromaticities, and producing an image with substantially maintained memory colors using the image rendering unit.
101. The method of claim 100, wherein perceived colorfulness, brightness, and contrast of the non-memory colors are changed differently than perceived colorfulness, brightness, and contrast of the memory colors.
102. The method of claim 100, wherein perceived colorfulness, brightness, and contrast of the non-memory colors are increased more than perceived colorfulness, brightness, and contrast of the memory colors.
103. The method of claim 100, wherein generating the at least three-dimensional lookup table includes computing the enhanced lightness, chroma, and hue for the memory colors using a non-linear enhancement function.
104. The method of claim 100, further comprising generating more than one at least three-dimensional look-up table for the color transformation of the non-memory colors and the memory colors.
105. The method of claim 100, further comprising generating more than one at least three-dimensional look-up table for the color transformation of the non-memory colors and the memory colors, wherein each of the at least three dimensional look-up tables is optimized for a different viewing environment of the image rendering unit.
106. The method of claim 105, further comprising selecting one of the at least three- dimensional look-up tables for loading into the image color rendering controller based upon the viewing environment of the image rendering unit.
107. The method of claim 105, further comprising providing a sensor for measuring the ambient light in the viewing environment.
108. The method of claim 95, further comprising converting the input image data of a first input color standard into an input color specification for inputting into the three- dimensional look-up table.
109. The method of claim 95, wherein the at least three-dimensional look-up table has three or more input colors.
110. The method of claim 95, wherein the at least three-dimensional look-up table has three or more output colors.
111. The method of claim 110, wherein the three or more output colors are any combination of primary colors as independent light sources or secondary colors defined as combinations of primary colors.
112. The method of claim 95, wherein the at least three dimensional look-up table is losslessly compressed to reduce storage use in a memory of the image color rendering controller.
113. The method of claim 95, further comprising calibrating the image rendering unit by measuring the color response of the image rendering unit, and then modifying the output image data by one of additional processing after the at least three-dimensional look-up-table or including the required calibration in the at least three-dimensional look-up-table.
114. The method of claim 95, wherein the image color rendering controller is contained within the image rendering unit.
115. The method of claim 95, wherein the image color rendering controller is external to the image rendering unit.
116. The method of claim 95, wherein an auxiliary imaging device controller is in communication with the image color rendering controller and the image rendering unit.
117. The method of claim 95, wherein the image color rendering controller is in communication with the image rendering unit selected from a projector, a television, a computer display, and a game display, the image rendering unit using DMD, plasma, liquid crystal, liquid crystal-on-silicon modulation, or direct modulation of the light source, and using LED, OLED, laser, or lamp light sources.
118. The method of claim 95, wherein the image color rendering controller is in communication with at least one of a cable TV set-top box, a video game console, a personal computer, a computer graphics card, a DVD player, a Blu-ray player, a broadcast station, an antenna, a satellite, a broadcast receiver and processor, and a digital cinema.
119. The method of claim 95, wherein the image rendering unit includes an algorithm for color modification, and wherein the at least three-dimensional look-up table further comprises processing the input image data to compensate for the color modification performed by the image rendering unit.
120. The method of claim 119, wherein the image rendering unit includes an algorithm for creating secondary colors from primary colors, and the at least three-dimensional look-up table further comprises compensating for the color modification performed by the addition of the secondary colors in the image rendering unit.
121. The method of claim 119, wherein the at least three-dimensional look-up table further comprises processing the input image data to increase perceived color, brightness, and contrast to compensate for the reduction in perceived color, brightness, and contrast caused by the algorithm for color modification in the image rendering unit.
122. The method of claim 95, wherein the input image data is of a first color gamut and the image rendering unit is of a second, expanded color gamut; and wherein the values in the lookup table expand the input image data to encompass the second color gamut of the image rendering unit.
123. The method of claim 95, wherein the input image data is of a first color gamut and the image rendering unit is of a second, reduced color gamut; and wherein the values in the lookup table reduce the input image data to encompass the second color gamut of the image rendering unit.
124. The method of claim 95, wherein the at least three-dimensional look-up table contains a transformation from a suboptimal viewing environment to an improved viewing environment including the visual adaptation of the human visual system.
125. The method of claim 95, wherein the at least three-dimensional look-up table further comprises processing the input image data to include chromatic adaptation of the human visual system to a specified white point that increases the brightness of the image rendering unit.
126. The method of claim 95, wherein the three-dimensional look-up table has the secondary colors or more than three primary colors explicitly defined and wherein measured responses of the image rendering unit are used to define the three- dimensional look-up table.
127. The method of claim 95, wherein the three-dimensional look-up table has the secondary colors or more than three primary colors explicitly defined and wherein mathematics provided by a manufacturer of the image rendering unit are used to define the three-dimensional look-up table.
128. The method of claim 95, wherein the three-dimensional look-up table has the secondary colors or more than three primary colors explicitly defined and wherein there is provided an open definition of how the secondary colors or more than three primary colors are used.
129. The method of claim 95, wherein the three-dimensional look-up table has the secondary colors or more than three primary colors implied in the design of a three in by three out look-up table for two conditions, and wherein measured responses of the image rendering unit are used to define the three-dimensional look-up table.
130. The method of claim 95, wherein the three-dimensional look-up table has the secondary colors or more than three primary colors implied in the design of a three in by three out look-up table for two conditions, and wherein mathematics provided by a manufacturer of the image rendering unit are used to define the three-dimensional look-up table.
131. The method of claim 95, wherein the three-dimensional look-up table has the secondary colors or more than three primary colors implied in the design of a three in by three out look-up table for two conditions, and wherein there is provided an open definition of how the secondary colors or more than three primary colors are used.
132. The method of claim 95, wherein the color image is a 3D color image.
133. A method of producing a color image perceived by human observers observing the image on an image rendering unit, the method comprising using visual models to enhance the perceived colorfulness, contrast, or brightness of the image, thereby improving the perceived quality of the image.
134. The method of claim 133, further comprising preserving memory colors of the image.
135. The method of claim 133, further comprising performing empirical visual studies to determine the preference of colorfulness, contrast, or brightness on the ethnicities of the human observers, and defining the perceived quality of the image for each nationality of human observers.
136. The method of claim 135, further comprising adjusting the colorfulness, contrast, or brightness of the image based upon one of the ethnicities of the human observers.
137. The method of claim 133, further comprising generating an at least three- dimensional look-up table of values of input colors and output colors, the three- dimensional look-up table adjusting the colorfulness, contrast, or brightness of the image to match the enhanced appearance of analog film systems or digital systems designed for cinemas.
138. The method of claim 133, further comprising adjusting the colorfulness, contrast, or brightness of the image to produce a chosen artistic perception in the image.
139. The method of claim 133, wherein the visual models are based upon the human visual system.
140. The method of claim 133, wherein the image rendering unit is of an expanded color gamut greater than the color gamut of input image data used to produce the image, and wherein the output colors to the image rendering unit utilize the expanded color gamut.
141 . The method of claim 133, wherein the image rendering unit is of a reduced color gamut smaller than the color gamut of input image data used to produce the image and wherein the output colors to the image rendering unit utilize the reduced color gamut.
142. The method of claim 133, wherein input image data used to produce the image contains memory colors and non-memory colors, and the method comprises identifying the memory colors in the input image data to be substantially maintained, characterizing the memory colors and non-memory colors with respect to their chromaticities, and producing an image with substantially maintained memory colors using the image rendering unit.
143. The method of claim 142, wherein perceived colorfulness, brightness, and contrast of the non-memory colors are changed differently than perceived colorfulness, brightness, and contrast of the memory colors.
144. The method of claim 142, wherein perceived colorfulness, brightness, and contrast of the non-memory colors are increased more than perceived colorfulness, brightness, and contrast of the memory colors.
145. The method of claim 142, wherein the method includes generating an at least three-dimensional look-up table including computing enhanced lightness, chroma, and hue for the memory colors using a non-linear enhancement function.
146. The method of claim 142, further comprising generating more than one at least three-dimensional look-up table for the color transformation of the non-memory colors and the memory colors.
147. The method of claim 142, further comprising generating more than one at least three-dimensional look-up table for the color transformation of the non-memory colors and the memory colors, wherein each of the at least three dimensional look-up tables is optimized for a different viewing environment of the image rendering unit.
148. The method of claim 147, further comprising selecting one of the at least three- dimensional look-up tables for loading into an image color rendering controller based upon the viewing environment of the image rendering unit.
149. The method of claim 148, further comprising providing a sensor for measuring the ambient light in the viewing environment.
150. The method of claim 133, wherein input image data used to produce the image is of a first color standard, and the method further comprises converting the input image data of the first input color standard into an input color specification for inputting into a three-dimensional look-up table.
151 . The method of claim 133, further comprising generating an at least three- dimensional look-up table of values of at least three input colors and values of output colors, wherein the values in the lookup table convert the input image color data to output image color data in an image rendering unit.
152. The method of claim 133, further comprising generating an at least three- dimensional look-up table of values of input colors and values of at least three output colors, wherein the values in the lookup table convert the input image color data to output image color data in an image rendering unit.
153. The method of claim 152, wherein the at least three output colors are any combination of primary colors as independent light sources or secondary colors defined as combinations of primary colors.
154. The method of claim 133, further comprising generating an at least three- dimensional look-up table of values of input colors and output colors, wherein the values in the lookup table convert the input image color data to output image color data in an image rendering unit, and wherein the at least three dimensional look-up table is losslessly compressed to reduce storage use in a memory of the image color rendering controller.
155. The method of claim 133, further comprising calibrating the image rendering unit by measuring the color response of the image rendering unit, and then modifying output image data by one of additional processing after an at least three-dimensional look-up-table or including the required calibration in the at least three-dimensional lookup-table.
156. The method of claim 133, wherein an image color rendering controller is provided within the image rendering unit.
157. The method of claim 133, wherein an imaging color rendering controller is provided external to the image rendering unit.
158. The method of claim 133, wherein an auxiliary imaging device controller is in communication with an image color rendering controller and the image rendering unit.
159. The method of claim 133, wherein an image color rendering controller is in communication with the image rendering unit selected from a projector, a television, a computer display, and a game display, the image rendering unit using DMD, plasma, liquid crystal, liquid crystal-on-silicon modulation, or direct modulation of the light source, and using LED, OLED, laser, or lamp light sources.
160. The method of claim 133, wherein an image color rendering controller is in communication with at least one of a cable TV set-top box, a video game console, a personal computer, a computer graphics card, a DVD player, a Blu-ray player, a broadcast station, an antenna, a satellite, a broadcast receiver and processor, and a digital cinema.
161 . The method of claim 133, wherein the image rendering unit includes an algorithm for color modification, and wherein an at least three-dimensional look-up table further comprises processing the input image data to compensate for the color modification performed by the image rendering unit.
162. The method of claim 161, wherein the image rendering unit includes an algorithm for creating secondary colors from primary colors, and an at least three-dimensional look-up table further comprises compensating for the color modification performed by the addition of the secondary colors in the image rendering unit.
163. The method of claim 161, wherein an at least three-dimensional look-up table is provided comprising processing the input image data to increase perceived color, brightness, and contrast to compensate for the reduction in perceived color, brightness, and contrast caused by the algorithm for color modification in the image rendering unit.
164. The method of claim 133, wherein an at least three-dimensional look-up table contains a transformation from a suboptimal viewing environment to an improved viewing environment including the visual adaptation of the human visual system.
165. The method of claim 133, wherein an at least three-dimensional look-up table includes the definition of secondary colors, and contains enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of the secondary colors by the image rendering unit.
166. The method of claim 133, wherein an at least three-dimensional look-up table further comprises processing the input image data to include chromatic adaptation of the human visual system to a specified white point that increases the brightness of the image rendering unit.
167. The method of claim 133, wherein the color image is a 3D color image.
168. A method of producing a color image by an OLED display, the method comprising:
a. providing input image data and the OLED display having at least three OLEDs, each OLED being of a different primary color; b. generating an at least three-dimensional look-up table of values of input colors and output colors, wherein the values in the lookup table convert the input image data to output image color data of the OLED display in a manner that optimally manages the quality of the image and the lifetime of the at least three OLEDs;
c. loading the at least three-dimensional look-up table into an image color rendering controller;
d. loading the input image data into the imaging color rendering controller;
e. processing the input image data through the at least three-dimensional look- up table to produce output color values stored at the addresses in the at least three-dimensional look-up table; and
f. outputting the output image color data to produce the image by the OLED display.
169. The method of claim 168, wherein the at least three OLEDs are a red OLED, a green OLED, and a blue OLED.
170. The method of claim 169, wherein managing the quality of the image and the lifetime of the OLEDs comprises adding a white primary and mapping predetermined amounts of the grey component of RGB pixel values to the white primary to reduce the usage of RGB and extend the life of the red, green, and blue OLEDs.
171. The method of claim 168, wherein the at least three OLEDs are a red OLED, a green OLED, and a blue OLED, and wherein managing the quality of the image and the lifetime of the OLEDs comprises adding other primary colors and mapping predetermined amounts of the RGB pixel values to the other primary colors to reduce the usage of RGB and extend the life of the red, green, and blue OLEDs.
172. The method of claim 168, further comprising operating the at least three OLEDs such that a first OLED does not reach end of life sooner than the other OLEDs, and the image quality of each of the OLEDs is reduced about equally over time without perceived artifacts or appearances predominantly of one of the OLED colors.
173. The method of claim 168, further comprising having a controlled degradation of image quality due to changes in the outputs of at least one of the OLEDs, wherein the change of quality at any given point in time has the least loss in perceived quality.
174. The method of claim 173, further comprising tracking the controlled degradation by accumulating and using usage data for all of the OLEDs.
175. The method of claim 174, further comprising performing the controlled degradation on the entire image over time.
176. The method of claim 174, further comprising performing the controlled degradation on at least one portion of the image over time.
177. The method of claim 173, further comprising performing the controlled degradation by substantially maintaining the brightness of the image while gradually reducing color saturation of the image over time.
178. The method of claim 173, further comprising performing the controlled degradation by substantially maintaining the brightness of the image while reducing color saturation of the image to a greater extent in image pixels of low color saturation than in image pixels of high color saturation.
179. The method of claim 173, further comprising performing the controlled degradation by substantially maintaining the brightness of the image while reducing color saturation gradually using adaptive one dimensional tables on each of the primary colors.
180. The method of claim 179, wherein the one dimensional tables on each primary color are calculated using a quality degradation model.
181. The method of claim 180, wherein the quality degradation model averages among one dimensional tables that are pre-designed to provide the targeted image quality at specific OLED lifetimes.
182. The method of claim 181, wherein the one dimensional tables are produced by interpolation between a one dimensional table for when the OLEDs are initially operated and a one dimensional table for when the OLEDs are at the ends of their useful lifetimes.
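Claims 179 through 182 describe per-primary one-dimensional tables that track a quality degradation model. A minimal way to realize the interpolation of claim 182, assuming a normalized usage estimate and made-up begin-of-life and end-of-life response tables, is sketched below.

```python
import numpy as np

def blended_1d_table(table_new, table_eol, usage_fraction):
    """Interpolate between the begin-of-life and end-of-life one-dimensional
    tables for a primary, given accumulated usage normalized to [0, 1]."""
    t = float(np.clip(usage_fraction, 0.0, 1.0))
    return (1.0 - t) * table_new + t * table_eol

codes = np.linspace(0.0, 1.0, 256)
table_new = codes                      # assumed response when the OLEDs are new
table_eol = codes ** 1.2               # assumed gently desaturating end-of-life response
half_worn = blended_1d_table(table_new, table_eol, usage_fraction=0.5)
print(half_worn[128])                  # drive level for mid-grey halfway through the lifetime
```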
183. The method of claim 168, wherein the values in the look-up table are calculated based upon a visual model of the human visual system.
184. The method of claim 168, wherein enhanced brightness, enhanced contrast, or enhanced colorfulness introduced by the at least three-dimensional look-up table produces a chosen artistic perception in the output image.
185. The method of claim 168, wherein the OLED display is of an expanded color gamut greater than the color gamut of the input image data, and wherein the output colors to the OLED display utilize the expanded color gamut.
186. The method of claim 168, wherein the OLED display is of a reduced color gamut smaller than the color gamut of the input image data, and wherein the output colors to the OLED display utilize the reduced color gamut.
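Claims 185 and 186 cover rendering into an expanded or a reduced display gamut. A deliberately crude stand-in for that mapping, assuming normalized RGB and a simple gain on each pixel's departure from its own grey level, is shown below; a real table would be built from a visual model rather than a single gain.

```python
import numpy as np

def scale_chroma(rgb, gain):
    """Stretch (gain > 1) or compress (gain < 1) each pixel's colorfulness
    about its own grey level, standing in for gamut expansion or reduction."""
    grey = rgb.mean(axis=-1, keepdims=True)
    return np.clip(grey + gain * (rgb - grey), 0.0, 1.0)

pixel = np.array([[0.7, 0.5, 0.3]])
print(scale_chroma(pixel, gain=1.3))   # toward an expanded display gamut
print(scale_chroma(pixel, gain=0.7))   # toward a reduced display gamut
```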
187. The method of claim 168, wherein the input image data contains memory colors and non-memory colors, and the method comprises identifying the memory colors in the input image data to be substantially maintained, characterizing the memory colors and non-memory colors with respect to their chromaticities, and producing an image with substantially maintained memory colors using the OLED display.
188. The method of claim 187, wherein perceived colorfulness, brightness, and contrast of the non-memory colors are changed differently than perceived colorfulness, brightness, and contrast of the memory colors.
189. The method of claim 187, wherein perceived colorfulness, brightness, and contrast of the non-memory colors are increased more than perceived colorfulness, brightness, and contrast of the memory colors.
190. The method of claim 187, wherein generating the at least three-dimensional lookup table includes computing enhanced lightness, chroma, and hue for the memory colors using a non-linear enhancement function.
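For claims 187 through 190, one possible shape of the table-building step is to boost non-memory colors while attenuating the boost near listed memory colors. The sketch below uses a single assumed skin-tone memory color, a distance-based protection weight, and a plain chroma gain in place of the claimed non-linear enhancement of lightness, chroma, and hue; all of those choices are illustrative.

```python
import numpy as np

def enhance_with_memory_protection(rgb, memory_colors, boost=1.4, radius=0.25):
    """Boost colorfulness of non-memory colors while substantially maintaining
    pixels close to any of the listed memory colors (illustrative stand-in)."""
    grey = rgb.mean(axis=-1, keepdims=True)
    chroma = rgb - grey
    # Protection weight: 0 at a memory color, rising to 1 beyond `radius`.
    dist = np.min(np.linalg.norm(rgb[:, None, :] - memory_colors, axis=-1), axis=-1)
    weight = np.clip(dist / radius, 0.0, 1.0)[:, None]
    gain = 1.0 + (boost - 1.0) * weight
    return np.clip(grey + gain * chroma, 0.0, 1.0)

memory_colors = np.array([[0.8, 0.6, 0.5]])          # assumed skin-tone memory color
pixels = np.array([[0.8, 0.6, 0.5],                  # memory color: left untouched
                   [0.2, 0.6, 0.9]])                 # non-memory color: boosted
print(enhance_with_memory_protection(pixels, memory_colors))
```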
191. The method of claim 187, further comprising generating more than one at least three-dimensional look-up table for the color transformation of the non-memory colors and the memory colors.
192. The method of claim 187, further comprising generating more than one at least three-dimensional look-up table for the color transformation of the non-memory colors and the memory colors, wherein each of the at least three dimensional look-up tables is optimized for a different viewing environment of the OLED display.
193. The method of claim 192, further comprising selecting one of the at least three- dimensional look-up tables for loading into the image color rendering controller based upon the viewing environment of the OLED display.
194. The method of claim 193, further comprising providing a sensor for measuring the ambient light in the viewing environment.
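Claims 192 through 194 describe a bank of pre-generated tables and an ambient-light sensor used to pick one. A minimal selection rule is sketched below; the lux thresholds and the environment names are assumptions, and the bank entries here are placeholders for real three-dimensional tables.

```python
def select_lut(ambient_lux, lut_bank):
    """Pick the pre-generated look-up table whose assumed viewing environment
    is closest to the sensor's ambient-light reading (thresholds illustrative)."""
    if ambient_lux < 50:
        return lut_bank["dark_room"]
    if ambient_lux < 500:
        return lut_bank["dim_living_room"]
    return lut_bank["bright_office"]

lut_bank = {"dark_room": "LUT_A", "dim_living_room": "LUT_B", "bright_office": "LUT_C"}
print(select_lut(ambient_lux=120, lut_bank=lut_bank))   # -> LUT_B
```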
195. The method of claim 168, wherein the input image data is of a first color standard, and the method further comprises converting the input image data of the first color standard into an input color specification for inputting into the at least three-dimensional look-up table.
196. The method of claim 168, wherein the at least three-dimensional look-up table has at least three input colors.
197. The method of claim 168, wherein the at least three-dimensional look-up table has at least three output colors.
198. The method of claim 197, wherein the at least three output colors are any combination of primary colors as independent light sources or secondary colors defined as combinations of primary colors.
199. The method of claim 168, wherein the at least three dimensional look-up table is losslessly compressed to reduce storage use in a memory of the image color rendering controller.
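Because a well-behaved table varies smoothly across its lattice, claim 199's lossless compression can be illustrated with a general-purpose codec; the 33-point, 10-bit table below and the use of zlib are assumptions for the example, and any lossless coder would serve.

```python
import zlib
import numpy as np

# Assumed 33-point, 10-bit identity-style table with smoothly varying entries.
axis = np.arange(33, dtype=np.uint16) * 31            # 0 .. 992
r, g, b = np.meshgrid(axis, axis, axis, indexing="ij")
lut = np.stack([r, g, b], axis=-1)

raw = lut.tobytes()
packed = zlib.compress(raw, 9)                        # lossless general-purpose codec
restored = np.frombuffer(zlib.decompress(packed), dtype=np.uint16).reshape(lut.shape)

assert np.array_equal(lut, restored)                  # exact round trip, no color error
print(len(raw), "bytes ->", len(packed), "bytes in controller memory")
```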
200. The method of claim 168, further comprising calibrating the OLED display by measuring the color response of the OLED display, and then modifying the output image data by one of additional processing after the at least three-dimensional look-up table or including the required calibration in the at least three-dimensional look-up table.
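Claim 200 allows the measured calibration to run either as extra processing after the table or folded into the table itself. The sketch below takes the first route with made-up 3 x 3 forward models (linear RGB drive to a device-independent response) for the measured and target behavior; the numbers are purely illustrative.

```python
import numpy as np

# Assumed forward models: device-independent response = model @ linear RGB drive.
measured = np.array([[0.62, 0.30, 0.18],
                     [0.33, 0.61, 0.08],
                     [0.03, 0.09, 0.82]])
target = np.array([[0.64, 0.30, 0.15],
                   [0.33, 0.60, 0.06],
                   [0.03, 0.10, 0.79]])

# Drive correction so the measured display reproduces the target response.
correction = np.linalg.inv(measured) @ target

def calibrate(linear_rgb):
    """Additional processing after the 3-D look-up table; alternatively the
    `correction` matrix could be folded into the table entries themselves."""
    return np.clip(linear_rgb @ correction.T, 0.0, 1.0)

print(calibrate(np.array([[1.0, 0.0, 0.0]])))         # corrected drive for full red
```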
201. The method of claim 168, wherein the image color rendering controller is contained within the OLED display.
202. The method of claim 168, wherein the image color rendering controller is external to the OLED display.
203. The method of claim 168, wherein an auxiliary imaging device controller is in communication with the image color rendering controller and the OLED display.
204. The method of claim 168, wherein the image color rendering controller is in communication with at least one of a cable TV set-top box, a video game console, a personal computer, a computer graphics card, a DVD player, a Blu-ray player, a broadcast station, an antenna, a satellite, a broadcast receiver and processor, and a digital cinema.
205. The method of claim 168, wherein the OLED display includes an algorithm for color modification, wherein the at least three-dimensional look-up table further comprises processing the input image data to compensate for the color modification performed by the OLED display.
206. The method of claim 205, wherein the OLED display includes an algorithm for creating secondary colors from primary colors, and the at least three-dimensional lookup table further comprises compensating for the color modification performed by the addition of the secondary colors in the OLED display.
207. The method of claim 205, wherein the at least three-dimensional look-up table further comprises processing the input image data to increase perceived color, brightness, and contrast to compensate for the reduction in perceived color, brightness, and contrast caused by the algorithm for color modification in the OLED display.
208. The method of claim 168, wherein the at least three-dimensional look-up table contains a transformation from a suboptimal viewing environment to an improved viewing environment including the visual adaptation of the human visual system.
209. The method of claim 168, wherein the at least three-dimensional look-up table includes the definition of secondary colors, and contains enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of the secondary colors by the OLED display.
210. The method of claim 168, wherein the at least three-dimensional look-up table further comprises processing the input image data to include chromatic adaptation of the human visual system to a specified white point that increases the brightness of the OLED display.
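Claim 210's chromatic adaptation to a specified white point can be illustrated with a von Kries-style transform in the Bradford cone space; the source and destination whites below (D50 and D65) and the XYZ test color are assumptions chosen only to show the mechanics.

```python
import numpy as np

# Bradford matrix mapping XYZ to a cone-like response space.
M_BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                       [-0.7502,  1.7135,  0.0367],
                       [ 0.0389, -0.0685,  1.0296]])

def adapt_white_point(xyz, src_white, dst_white):
    """Von Kries-style chromatic adaptation: scale cone responses by the ratio
    of the destination and source white points, then return to XYZ."""
    src_lms = M_BRADFORD @ src_white
    dst_lms = M_BRADFORD @ dst_white
    transform = np.linalg.inv(M_BRADFORD) @ np.diag(dst_lms / src_lms) @ M_BRADFORD
    return xyz @ transform.T

d50 = np.array([0.96422, 1.00000, 0.82521])
d65 = np.array([0.95047, 1.00000, 1.08883])
print(adapt_white_point(np.array([[0.3, 0.4, 0.5]]), src_white=d50, dst_white=d65))
```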
211. The method of claim 168, wherein the color image is a 3D color image.
212. A device for producing a color image, the device comprising a computer comprising a central processing unit and a memory in communication through a system bus, wherein the memory contains an at least three dimensional lookup table of values of input colors and output colors, wherein the values in the lookup table convert input image color data to output image color data in an image rendering unit that is connectable to the device.
213. The device of claim 212, wherein the values in the lookup table are based upon a visual model of the human visual system.
214. The device of claim 212, further comprising the image rendering unit in communication with the computer, the image rendering unit selected from a projector, a television, a computer display, and a game display, the image rendering unit using DMD, plasma, liquid crystal, liquid crystal-on-silicon modulation, or direct modulation of the light source, and using LED, OLED, laser, or lamp light sources.
215. The device of claim 214, further comprising an auxiliary imaging device including at least one of a cable TV set-top box, a video game console, a personal computer, a computer graphics card, a DVD player, a Blu-ray player, a broadcast station, an antenna, a satellite, a broadcast receiver and processor, and a digital cinema.
216. The device of claim 215, further comprising one of a liquid crystal display, a plasma display, and a DMD projector in communication with the auxiliary device.
217. The device of claim 212, further comprising a communication link to a source of input image data.
218. The device of claim 212, wherein the algorithm to produce the at least three dimensional lookup table is contained in the memory.
219. The device of claim 212, wherein the at least three-dimensional look-up table includes the definition of secondary colors, and contains enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of the secondary colors by the image rendering unit.
220. The device of claim 212, wherein the at least three-dimensional look-up table contains a transformation from a suboptimal viewing environment to an optimal viewing environment including the visual and chromatic adaptation of the human visual system.
221. The device of claim 212, wherein the memory contains a set of at least three dimensional lookup tables, each one of the set optimized for a different viewing environment of the image rendering unit.
222. The device of claim 221, further comprising a sensor for measuring the ambient light in the viewing environment.
223. The device of claim 212, wherein the color image is a 3D color image.
224. A device for producing a color image, the device comprising a computer comprising a central processing unit and a memory in communication through a system bus, wherein the memory contains an at least three dimensional lookup table of values of input colors and output colors, wherein the values in the lookup table convert a first color gamut of an input image data set to encompass a second different color gamut of an image rendering unit that is connectable to the device.
225. The device of claim 224, wherein the values in the lookup table are based upon a visual model of the human visual system.
226. The device of claim 224, wherein the color gamut of the image rendering unit is larger than the first color gamut.
227. The device of claim 224, wherein the color gamut of the image rendering unit is smaller than the first color gamut.
228. The device of claim 224, further comprising the image rendering unit in communication with the computer, the image rendering unit selected from a projector, a television, a computer display, and a game display, the image rendering unit using DMD, plasma, liquid crystal, liquid crystal-on-silicon modulation, or direct modulation of the light source, and using LED, OLED, laser, or lamp light sources.
229. The device of claim 228, further comprising an auxiliary imaging device including at least one of a cable TV set-top box, a video game console, a personal computer, a computer graphics card, a DVD player, a Blu-ray player, a broadcast station, an antenna, a satellite, a broadcast receiver and processor, and a digital cinema.
230. The device of claim 229, further comprising one of a liquid crystal display, a plasma display, and a DMD projector in communication with the auxiliary device.
231 . The device of claim 224, further comprising a communication link to a source of input image data.
232. The device of claim 224, wherein the algorithm to produce the at least three dimensional lookup table is contained in the memory.
233. The device of claim 224, wherein the at least three-dimensional look-up table includes the definition of secondary colors, and contains enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of the secondary colors by the image rendering unit.
234. The device of claim 224, wherein the at least three-dimensional look-up table contains a transformation from a suboptimal viewing environment to an improved viewing environment including the visual and chromatic adaptation of the human visual system.
235. The device of claim 224, wherein the memory contains a set of at least three dimensional lookup tables, each one of the set optimized for a different viewing environment of the image rendering unit.
236. The device of claim 235, further comprising a sensor for measuring the ambient light in the viewing environment.
237. The device of claim 224, wherein the color image is a 3D color image.
238. A device for producing a color image, the device comprising a computer comprising a central processing unit and a memory in communication through a system bus, wherein the memory contains an at least three dimensional lookup table containing a transformation from a suboptimal viewing environment to an improved viewing environment including the visual and chromatic adaptation of the human visual system.
239. The device of claim 238, wherein the values in the lookup table are based upon a visual model of the human visual system.
240. The device of claim 238, further comprising an image rendering unit in communication with the computer, the image rendering unit selected from a projector, a television, a computer display, and a game display, the image rendering unit using DMD, plasma, liquid crystal, liquid crystal-on-silicon modulation, or direct modulation of the light source, and using LED, OLED, laser, or lamp light sources.
241. The device of claim 240, further comprising an auxiliary imaging device including at least one of a cable TV set-top box, a video game console, a personal computer, a computer graphics card, a DVD player, a Blu-ray player, a broadcast station, an antenna, a satellite, a broadcast receiver and processor, and a digital cinema.
242. The device of claim 241, further comprising one of a liquid crystal display, a plasma display, and a DMD projector in communication with the auxiliary device.
243. The device of claim 238, further comprising a communication link to a source of input image data.
244. The device of claim 238, wherein the algorithm to produce the at least three dimensional lookup table is contained in the memory.
245. The device of claim 238, wherein the at least three-dimensional look-up table includes the definition of secondary colors, and contains enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of the secondary colors by the image rendering unit.
246. The device of claim 238, wherein the memory contains a set of at least three dimensional lookup tables, each one of the set optimized for a different viewing environment of the image rendering unit.
247. The device of claim 246, further comprising a sensor for measuring the ambient light in the viewing environment.
248. The device of claim 238, wherein the color image is a 3D color image.
249. A device for producing a color image, the device comprising a computer comprising a central processing unit and a memory in communication through a system bus, wherein the memory contains an at least three dimensional lookup table containing the definition of secondary colors, and enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to addition of secondary colors by an image rendering unit that is connectable to the device.
250. The device of claim 249, wherein the values in the lookup table are based upon a visual model of the human visual system.
251. The device of claim 249, further comprising the image rendering unit in communication with the computer, the image rendering unit selected from a projector, a television, a computer display, and a game display, the image rendering unit using DMD, plasma, liquid crystal, liquid crystal-on-silicon modulation, or direct modulation of the light source, and using LED, OLED, laser, or lamp light sources.
252. The device of claim 251, further comprising an auxiliary imaging device including at least one of a cable TV set-top box, a video game console, a personal computer, a computer graphics card, a DVD player, a Blu-ray player, a broadcast station, an antenna, a satellite, a broadcast receiver and processor, and a digital cinema.
253. The device of claim 252, further comprising one of a liquid crystal display, a plasma display, and a DMD projector in communication with the auxiliary device.
254. The device of claim 249, further comprising a communication link to a source of input image data.
255. The device of claim 249, wherein the algorithm to produce the at least three dimensional lookup table is contained in the memory.
256. The device of claim 249, wherein the at least three-dimensional look-up table contains a transformation from a suboptimal viewing environment to an improved viewing environment including the visual and chromatic adaptation of the human visual system.
257. The device of claim 249, wherein the memory contains a set of at least three dimensional lookup tables, each one of the set optimized for a different viewing environment of the image rendering unit.
258. The device of claim 257, further comprising a sensor for measuring the ambient light in the viewing environment.
259. The device of claim 249, wherein the color image is a 3D color image.
260. A device for producing a color image perceived by human observers observing the image on an image rendering unit, the device comprising a computer comprising a central processing unit and a memory in communication through a system bus, wherein the memory contains a visual model to enhance the perceived colorfulness, contrast, or brightness of the image.
261. The device of claim 260, further comprising the image rendering unit in communication with the computer, the image rendering unit selected from a projector, a television, a computer display, and a game display, the image rendering unit using DMD, plasma, liquid crystal, liquid crystal-on-silicon modulation, or direct modulation of the light source, and using LED, OLED, laser, or lamp light sources.
262. The device of claim 261, further comprising an auxiliary imaging device including at least one of a cable TV set-top box, a video game console, a personal computer, a computer graphics card, a DVD player, a Blu-ray player, a broadcast station, an antenna, a satellite, a broadcast receiver and processor, and a digital cinema.
263. The device of claim 262, further comprising one of a liquid crystal display, a plasma display, and a DMD projector in communication with the auxiliary device.
264. The device of claim 260, further comprising a communication link to a source of input image data.
265. The device of claim 260, further comprising an at least three dimensional lookup table stored in the memory and containing values of input colors and output colors.
266. The device of claim 265, wherein the algorithm to produce the at least three dimensional lookup table is contained in the memory.
267. The device of claim 265, wherein the at least three-dimensional look-up table contains a transformation from a suboptimal viewing environment to an improved viewing environment including the visual and chromatic adaptation of the human visual system.
268. The device of claim 265, wherein the memory contains a set of at least three dimensional lookup tables, each one of the set optimized for a different viewing environment of the image rendering unit.
269. The device of claim 268, further comprising a sensor for measuring the ambient light in the viewing environment.
270. The device of claim 265, further comprising the image rendering unit in communication with the computer, wherein the values in the lookup table convert a first color gamut of an input image data set to encompass a second expanded color gamut of the image rendering unit.
271. The device of claim 265, further comprising the image rendering unit in communication with the computer, wherein the values in the lookup table convert a first color gamut of an input image data set to encompass a second reduced color gamut of the image rendering unit.
272. The device of claim 265, further comprising the image rendering unit in communication with the computer, wherein the at least three dimensional lookup table contains the definition of secondary colors, and enhanced lightness, chroma, and hues to increase perceived colorfulness, contrast, or brightness to compensate for the loss in perceived colorfulness, contrast, or brightness due to the addition of secondary colors by the image rendering unit.
273. The device of claim 260, wherein the color image is a 3D color image.
PCT/US2010/046871 2009-09-01 2010-08-26 Method for producing a color image and imaging device employing same WO2011028626A2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
KR1020127008557A KR101354400B1 (en) 2009-09-01 2010-08-26 Method for producing a color image and imaging device employing same
EP10814312.4A EP2474166A4 (en) 2009-09-01 2010-08-26 Method for producing a color image and imaging device employing same
CN201080049185.5A CN102598114B (en) 2009-09-01 2010-08-26 For generation of the method for coloured image and the imaging device of use the method
JP2012527000A JP2013504080A (en) 2009-09-01 2010-08-26 Method for generating color image and imaging apparatus using the method
KR1020137028025A KR101786161B1 (en) 2009-09-01 2010-08-26 Method for producing a color image and imaging device employing same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US23870609P 2009-09-01 2009-09-01
US61/238,706 2009-09-01

Publications (2)

Publication Number Publication Date
WO2011028626A2 true WO2011028626A2 (en) 2011-03-10
WO2011028626A3 WO2011028626A3 (en) 2011-06-16

Family

ID=43624182

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2010/046871 WO2011028626A2 (en) 2009-09-01 2010-08-26 Method for producing a color image and imaging device employing same

Country Status (6)

Country Link
US (3) US8520023B2 (en)
EP (1) EP2474166A4 (en)
JP (2) JP2013504080A (en)
KR (2) KR101354400B1 (en)
CN (1) CN102598114B (en)
WO (1) WO2011028626A2 (en)

Families Citing this family (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010083493A1 (en) * 2009-01-19 2010-07-22 Dolby Laboratories Licensing Corporation Image processing and displaying methods for devices that implement color appearance models
WO2011028626A2 (en) * 2009-09-01 2011-03-10 Entertainment Experience Llc Method for producing a color image and imaging device employing same
US8860751B2 (en) * 2009-09-01 2014-10-14 Entertainment Experience Llc Method for producing a color image and imaging device employing same
US20110205397A1 (en) * 2010-02-24 2011-08-25 John Christopher Hahn Portable imaging device having display with improved visibility under adverse conditions
WO2012049845A1 (en) * 2010-10-12 2012-04-19 パナソニック株式会社 Color signal processing device
US9013502B2 (en) * 2011-12-29 2015-04-21 Tektronix, Inc. Method of viewing virtual display outputs
US9076252B2 (en) * 2012-01-05 2015-07-07 Qualcomm Incorporated Image perceptual attribute adjustment
US9472163B2 (en) * 2012-02-17 2016-10-18 Monotype Imaging Inc. Adjusting content rendering for environmental conditions
US9307204B1 (en) * 2012-11-13 2016-04-05 Amazon Technologies, Inc. Enhancement of media sink compatibility
US20140225910A1 (en) * 2013-02-13 2014-08-14 Qualcomm Incorporated Methods and apparatus to render colors to a binary high-dimensional output device
JP5840159B2 (en) * 2013-02-26 2016-01-06 京セラドキュメントソリューションズ株式会社 Image forming apparatus
JP6415022B2 (en) * 2013-05-08 2018-10-31 キヤノン株式会社 Image processing apparatus, image processing method, and program
US20140368530A1 (en) * 2013-06-14 2014-12-18 Portrait Displays, Inc. Illumination Synchronizer
EP3043339A4 (en) * 2013-09-06 2017-08-02 Mitsubishi Electric Corporation Image display device
TWI532384B (en) * 2013-12-02 2016-05-01 矽創電子股份有限公司 Color adjustment device and method of color adjustment
US9992384B2 (en) * 2013-12-13 2018-06-05 Nec Display Solutions, Ltd. Correction coefficient calculation unit, image conversion unit, color correction device, display device, correction coefficient calculation method, and program
KR102158844B1 (en) * 2014-01-03 2020-09-22 삼성전자주식회사 Apparatus and method for processing image, and computer-readable recording medium
JP2015154459A (en) * 2014-02-19 2015-08-24 三星ディスプレイ株式會社Samsung Display Co.,Ltd. Image processing apparatus and image processing method
TWI514369B (en) 2014-05-29 2015-12-21 Au Optronics Corp Signal conversion method for display image
US9931248B2 (en) * 2014-06-16 2018-04-03 International Business Machines Corporation Non-invasive vision enhancement
CN104168475B (en) * 2014-08-15 2016-02-03 浙江大学 A kind of imaging type of digital camera changeable parameters obtains the method for color tristimulus values
US9779691B2 (en) 2015-01-23 2017-10-03 Dell Products, Lp Display front of screen performance architecture
US9558562B2 (en) 2015-02-11 2017-01-31 Dell Products, Lp System and method for reflection mitigation using blue noise pattern
CN105427265B (en) * 2015-12-25 2018-05-29 武汉鸿瑞达信息技术有限公司 A kind of method for enhancing color image contrast ratio and system
CN106997751B (en) * 2016-01-25 2019-04-02 曲阜师范大学 A kind of inversion spectrum characterization method showing equipment
US10675955B2 (en) * 2016-11-14 2020-06-09 Google Llc Adaptive glare removal and/or color correction
US10885676B2 (en) * 2016-12-27 2021-01-05 Samsung Electronics Co., Ltd. Method and apparatus for modifying display settings in virtual/augmented reality
US11301972B2 (en) 2017-03-03 2022-04-12 Dolby Laboratories Licensing Corporation Color saturation adjustment in non-uniform color space
US10446114B2 (en) * 2017-06-01 2019-10-15 Qualcomm Incorporated Adjusting color palettes used for displaying images on a display device based on ambient light levels
US10354613B2 (en) * 2017-06-03 2019-07-16 Apple Inc. Scalable chromatic adaptation
TWI650731B (en) * 2017-07-03 2019-02-11 國立高雄科技大學 Adaptive self-repair and verification method for digital images, computer program products
US10523947B2 (en) 2017-09-29 2019-12-31 Ati Technologies Ulc Server-based encoding of adjustable frame rate content
CN109697698B (en) * 2017-10-20 2023-03-21 腾讯科技(深圳)有限公司 Low illuminance enhancement processing method, apparatus and computer readable storage medium
CN107680556B (en) * 2017-11-03 2019-08-02 深圳市华星光电半导体显示技术有限公司 A kind of display power-economizing method, device and display
US10594901B2 (en) * 2017-11-17 2020-03-17 Ati Technologies Ulc Game engine application direct to video encoder rendering
US11290515B2 (en) 2017-12-07 2022-03-29 Advanced Micro Devices, Inc. Real-time and low latency packetization protocol for live compressed video data
CN108172198A (en) * 2018-01-02 2018-06-15 京东方科技集团股份有限公司 Image processing apparatus, storage medium, display equipment and image processing method
US10880531B2 (en) 2018-01-31 2020-12-29 Nvidia Corporation Transfer of video signals using variable segmented lookup tables
US10715774B2 (en) 2018-07-23 2020-07-14 Microsoft Technology Licensing, Llc Color conversion for ambient-adaptive digital content
KR102496558B1 (en) * 2018-08-02 2023-02-08 삼성디스플레이 주식회사 Device and method for controlling color gamut, display device including the device for controlling color gamut
US11158286B2 (en) * 2018-10-05 2021-10-26 Disney Enterprises, Inc. Machine learning color science conversion
CN109286802A (en) 2018-10-22 2019-01-29 深圳Tcl新技术有限公司 Color gamut matching method, device, display terminal and readable storage medium storing program for executing
WO2020100200A1 (en) * 2018-11-12 2020-05-22 Eizo株式会社 Image processing system, image processing device, and computer program
CN109859702A (en) * 2018-12-28 2019-06-07 南京奥视威电子科技股份有限公司 A kind of 3D lookup table generating method, display color calibrating method, display color correction system
US11100604B2 (en) 2019-01-31 2021-08-24 Advanced Micro Devices, Inc. Multiple application cooperative frame-based GPU scheduling
US10992902B2 (en) 2019-03-21 2021-04-27 Disney Enterprises, Inc. Aspect ratio conversion with machine learning
US11418797B2 (en) 2019-03-28 2022-08-16 Advanced Micro Devices, Inc. Multi-plane transmission
CN110473282B (en) * 2019-08-22 2021-04-20 腾讯科技(深圳)有限公司 Dyeing processing method and device for object model, computer equipment and storage medium
CN112449168B (en) * 2019-09-03 2021-11-23 深圳Tcl新技术有限公司 Color gamut mapping method and system
US11170690B2 (en) 2019-09-26 2021-11-09 Apple Inc. Pixel leakage and internal resistance compensation systems and methods
CN112672129B (en) * 2019-10-16 2022-11-15 深圳Tcl新技术有限公司 Color adjusting method and device, computer equipment and readable storage medium
US11488328B2 (en) 2020-09-25 2022-11-01 Advanced Micro Devices, Inc. Automatic data format detection
CN118762661A (en) * 2020-11-02 2024-10-11 伊英克公司 Method and apparatus for rendering color images
CN112689139B (en) * 2021-03-11 2021-05-28 北京小鸟科技股份有限公司 Video image color depth transformation method, system and equipment
US11457187B1 (en) 2021-08-04 2022-09-27 Warner Bros. Entertainment Inc. System and method for generating video content with hue-preservation in virtual production
CN114430483A (en) * 2021-12-16 2022-05-03 泛太通信导航(珠海)有限公司 3D vision unmanned vehicle monitoring system and corresponding unmanned vehicle
CN115002441B (en) * 2022-08-02 2022-12-09 深圳市前海手绘科技文化有限公司 Three-dimensional video production method and device, electronic equipment and computer storage medium

Family Cites Families (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4500919A (en) * 1982-05-04 1985-02-19 Massachusetts Institute Of Technology Color reproduction system
US5272468A (en) * 1991-04-30 1993-12-21 Texas Instruments Incorporated Image processing for computer color conversion
US5592188A (en) * 1995-01-04 1997-01-07 Texas Instruments Incorporated Method and system for accentuating intense white display areas in sequential DMD video systems
BE1010288A3 (en) * 1996-05-07 1998-05-05 Barco Nv "wide gamut" - display control.
US6611297B1 (en) * 1998-04-13 2003-08-26 Matsushita Electric Industrial Co., Ltd. Illumination control method and illumination device
JP2000115558A (en) * 1998-10-08 2000-04-21 Mitsubishi Electric Corp Color characteristic description device, color management device, image converter and color correction method
JP2002125125A (en) * 2000-07-31 2002-04-26 Seiko Epson Corp Image display system of environment adaptation type, program and information storage medium
JP4605987B2 (en) * 2000-08-28 2011-01-05 セイコーエプソン株式会社 Projector, image processing method, and information storage medium
JP3832626B2 (en) * 2001-06-28 2006-10-11 セイコーエプソン株式会社 Image processing apparatus, image processing method, program, and recording medium
US7119786B2 (en) 2001-06-28 2006-10-10 Intel Corporation Method and apparatus for enabling power management of a flat panel display
US7595811B2 (en) * 2001-07-26 2009-09-29 Seiko Epson Corporation Environment-complaint image display system, projector, and program
EP1461645A4 (en) 2001-12-14 2006-09-06 Digital Optics Internat Corp Uniform illumination system
JP2003323610A (en) * 2002-03-01 2003-11-14 Nec Corp Color correcting method and device, for projector
JP3775666B2 (en) * 2002-03-18 2006-05-17 セイコーエプソン株式会社 Image display device
CN1666242A (en) 2002-04-26 2005-09-07 东芝松下显示技术有限公司 Drive circuit for el display panel
JP4009850B2 (en) * 2002-05-20 2007-11-21 セイコーエプソン株式会社 Projection-type image display system, projector, program, information storage medium, and image projection method
KR20040009966A (en) * 2002-07-26 2004-01-31 삼성전자주식회사 Apparatus and method for correcting color
JP2004157302A (en) * 2002-11-06 2004-06-03 Fujitsu Ltd Image display system
US7397485B2 (en) * 2002-12-16 2008-07-08 Eastman Kodak Company Color OLED display system having improved performance
US20040135790A1 (en) * 2003-01-15 2004-07-15 Xerox Corporation Correcting the color cast of an image
US6897876B2 (en) * 2003-06-26 2005-05-24 Eastman Kodak Company Method for transforming three color input signals to four or more output signals for a color display
US7262753B2 (en) * 2003-08-07 2007-08-28 Barco N.V. Method and system for measuring and controlling an OLED display element for improved lifetime and light output
KR100839959B1 (en) 2003-09-01 2008-06-20 삼성전자주식회사 Display apparatus
EP1667063A4 (en) * 2003-09-11 2008-11-19 Image processing apparatus, image processing method, and image processing program
US7164397B2 (en) * 2003-09-30 2007-01-16 Texas Instruments Incorporated Discrete light color processor
WO2005069638A1 (en) * 2004-01-05 2005-07-28 Koninklijke Philips Electronics, N.V. Flicker-free adaptive thresholding for ambient light derived from video content mapped through unrendered color space
WO2005081187A1 (en) * 2004-02-25 2005-09-01 Matsushita Electric Industrial Co., Ltd. Image processor, image processing system, image processing method, image processing program and integrated circuit device
JP2005249820A (en) * 2004-03-01 2005-09-15 Seiko Epson Corp Color correcting circuit and image display device with the same
US7091523B2 (en) * 2004-05-13 2006-08-15 Eastman Kodak Company Color OLED device having improved performance
TWI251152B (en) * 2004-07-15 2006-03-11 Au Optronics Corp Method for compensating the color difference of display device
KR100633144B1 (en) * 2004-11-09 2006-10-11 삼성전자주식회사 Method for managing color and apparatus thereof
US7362336B2 (en) * 2005-01-12 2008-04-22 Eastman Kodak Company Four color digital cinema system with extended color gamut and copy protection
KR100708129B1 (en) * 2005-05-03 2007-04-16 삼성전자주식회사 Apparatus and method for editing color profile
JP4304623B2 (en) 2005-06-01 2009-07-29 ソニー株式会社 Imaging apparatus and method of processing imaging result in imaging apparatus
EP1964389A2 (en) * 2005-12-21 2008-09-03 Thomson Licensing Constrained color palette in a color space
EP1821275A1 (en) 2006-02-20 2007-08-22 Deutsche Thomson-Brandt Gmbh Method for driving a plasma display panel with attenuation estimation and compensation and corresponding apparatus
JP4676364B2 (en) * 2006-03-17 2011-04-27 富士通株式会社 Color correction method, color correction apparatus, and color correction program
US20070222740A1 (en) 2006-03-22 2007-09-27 Sharp Kabushiki Kaisha Display apparatus, image data providing apparatus, and controlling method
JP2007267084A (en) * 2006-03-29 2007-10-11 Seiko Epson Corp Color converting apparatus and image display device provided with the same
JP2007272146A (en) * 2006-03-31 2007-10-18 National Institute Of Information & Communication Technology Spectrum reproduction type multi-primary color display method
US20070252849A1 (en) * 2006-04-28 2007-11-01 Microtek International Inc. Device and method modifying image of color display
WO2007132635A1 (en) * 2006-05-15 2007-11-22 Sharp Kabushiki Kaisha Color image display device and color conversion device
US7592996B2 (en) 2006-06-02 2009-09-22 Samsung Electronics Co., Ltd. Multiprimary color display with dynamic gamut mapping
KR101216176B1 (en) * 2006-06-30 2012-12-28 엘지디스플레이 주식회사 Apparatus and Method of Organic Light Emitting Diode
US7893945B2 (en) * 2006-08-21 2011-02-22 Texas Instruments Incorporated Color mapping techniques for color imaging devices
US8018476B2 (en) 2006-08-28 2011-09-13 Samsung Electronics Co., Ltd. Subpixel layouts for high brightness displays and systems
JP2008085980A (en) 2006-08-31 2008-04-10 Sony Corp Color conversion device, emulation method, formation method of three-dimension lookup table, and image processor
WO2008049907A1 (en) 2006-10-26 2008-05-02 Seereal Technologies S.A. 3d content generation system
US7982827B2 (en) * 2006-12-14 2011-07-19 Texas Instruments Incorporated System and method for dynamically altering a color gamut
US8164597B2 (en) * 2007-04-10 2012-04-24 Kawasaki Microelectronics, Inc. Color conversion circuit and method of color conversion using interpolation from conversion coefficients some of which are substituted
US20080259099A1 (en) 2007-04-17 2008-10-23 Seiko Epson Corporation Display device, method for driving display device, and electronic apparatus
US20080266314A1 (en) * 2007-04-26 2008-10-30 Mark Melvin Butterworth Nonlinearly extending a color gamut of an image
US20080303918A1 (en) * 2007-06-11 2008-12-11 Micron Technology, Inc. Color correcting for ambient light
US7777760B2 (en) * 2007-06-29 2010-08-17 Apple Inc. Display color correcting system
US20090122087A1 (en) 2007-11-02 2009-05-14 Junichi Maruyama Display device
JP2009133926A (en) 2007-11-28 2009-06-18 Kyocera Corp Optical projector
US8121405B2 (en) * 2007-11-30 2012-02-21 Sharp Laboratories Of America, Inc. Systems and methods for skin-color-cognizant color mapping
KR100924121B1 (en) 2007-12-15 2009-10-29 한국전자통신연구원 Multi-view camera color calibration using color checker chart
JP5320757B2 (en) 2008-02-01 2013-10-23 セイコーエプソン株式会社 Electrophoretic display device driving method, electrophoretic display device, and electronic apparatus
US20090273614A1 (en) 2008-04-15 2009-11-05 Michael Francis Higgins Gamut mapping and subpixel rendering systems and methods
US20100245381A1 (en) * 2009-03-28 2010-09-30 Ramin Samadani Color gamut mapping
KR101651620B1 (en) 2009-08-11 2016-08-29 엘지이노텍 주식회사 Calibration method of display device using camera module and apparatus thereof
WO2011028626A2 (en) * 2009-09-01 2011-03-10 Entertainment Experience Llc Method for producing a color image and imaging device employing same
WO2012106122A1 (en) 2011-01-31 2012-08-09 Malvell World Trade Ltd. Systems and methods for performing color adjustments of pixels on a color display

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of EP2474166A4 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102611897A (en) * 2012-03-04 2012-07-25 北京佳泰信业技术有限公司 Method and system for carrying out vision perception high-fidelity transformation on color digital image
CN102611897B (en) * 2012-03-04 2015-01-07 侯克杰 Method and system for carrying out vision perception high-fidelity transformation on color digital image
WO2013139067A1 (en) * 2012-03-22 2013-09-26 Hou Kejie Method and system for carrying out visual stereo perception enhancement on color digital image
TWI612334B (en) * 2016-08-10 2018-01-21 國立臺灣大學 Photometric compensation method and system for a see-through device
TWI676164B (en) * 2018-08-31 2019-11-01 友達光電股份有限公司 Color tuning system, method and display driver

Also Published As

Publication number Publication date
US8520023B2 (en) 2013-08-27
WO2011028626A3 (en) 2011-06-16
CN102598114A (en) 2012-07-18
US8767006B2 (en) 2014-07-01
KR20120091050A (en) 2012-08-17
KR101786161B1 (en) 2017-11-06
US20110050695A1 (en) 2011-03-03
JP2015200896A (en) 2015-11-12
KR20130125403A (en) 2013-11-18
EP2474166A2 (en) 2012-07-11
CN102598114B (en) 2015-09-16
KR101354400B1 (en) 2014-01-22
US9418622B2 (en) 2016-08-16
US20130314435A1 (en) 2013-11-28
US20140253545A1 (en) 2014-09-11
EP2474166A4 (en) 2014-01-01
JP2013504080A (en) 2013-02-04

Similar Documents

Publication Publication Date Title
US9418622B2 (en) Method for producing a color image and imaging device employing same
US9997135B2 (en) Method for producing a color image and imaging device employing same
US10761371B2 (en) Display device
WO2014088975A1 (en) Method for producing a color image and imaging device employing same
US10957239B2 (en) Gray tracking across dynamically changing display characteristics
KR102176398B1 (en) A image processing device and a image processing method
JP4364281B2 (en) Display device
US20050031199A1 (en) System and method of data conversion for wide gamut displays
CN103763456A (en) Method and apparatus for image data transformation
US11302288B2 (en) Ambient saturation adaptation
US11817063B2 (en) Perceptually improved color display in image sequences on physical displays
Laine et al. Illumination-adaptive control of color appearance: a multimedia home platform application
Dutta Color Primary Correction of Image and Video Between Different Source and Destination Color Spaces
Chenery The Validity and Relevance of Reference Displays
IL159233A (en) Device, system and method of data conversion for wide gamut displays

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201080049185.5

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10814312

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2012527000

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2010814312

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2010814312

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 20127008557

Country of ref document: KR

Kind code of ref document: A