IMAGE CAPTURE DEVICE
INVENTOR
Jon Y. Hardeberg
BACKGROUND OF THE INVENTION
1. Claim of Priority.
This application claims priority to copending U.S. provisional application entitled,
"AN sRGB IMAGE SCANNING ENGINE," having serial no. 60/230,842, filed
September 7, 2000, attorney docket no. 050321-8010 (99RSS440 PID). The disclosure of U.S. provisional application entitled, "AN sRGB IMAGE SCANNING ENGINE," having serial no. 60/230,842, is incorporated in its entirety by reference.
2. Technical Field.
The invention is generally related to a system for obtaining consistent device
independent image output results. More particularly, the invention relates to a device independent color image capture device and related methods in which the captured image data is converted from a device dependent color space format into a reference color space format. This is accomplished by implementing a device profile and using predetermined conversion algorithms to convert the captured image data to an output color space in a device independent color space format.
3. Related Art.
In modern electronic image capture devices (imaging devices), such as flat bed scanners, film scanners, fax machines, video cameras or digital still cameras, electronic image data is typically generated via scanning (capturing) an image, document or scene of interest and generating image data in a predetermined device dependent color space,
such as, for example, RGB (Red, Green, Blue) color space. Because of this device dependent nature, colors in images captured and processed in a device dependent color space tend to vary from device to device. In other words, an image captured and printed out on one printer device may appear very different, in terms of color presentation, from the same image printed via a second printer device.
Color space is a geometric representation of colors in a space, for example, a three-dimensional space. There are two classes of color space: device dependent color space and device independent color space. A device dependent color space describes color with regard to a particular device (image capture device). Some examples of device dependent color space include RGB or CMY color space. A device independent color space quantifies the color itself, independent of the image capture device used to capture, measure or render the color of an image. Some examples of device independent color space include sRGB and CIELAB.
The sRGB color space is a standardized RGB color space that is device independent. It is defined with regard to a reference display device or CRT monitor as used under reference viewing conditions. The sRGB color space is based on the average performance of a typical personal computer display monitor under predefined reference
viewing conditions. The sRGB color space format is defined and disclosed in International Electrotechnical Commission (IEC) standard 61966-2-1 (1999-10), Multimedia systems and equipment - Colour measurement and management - Part 2-1: Colour management - Default RGB colour space - sRGB, and is otherwise widely accepted and in use in the consumer imaging industry.
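By way of background only, the nonlinear encoding defined in IEC 61966-2-1 maps linear light values to sRGB code values with a piecewise curve. The short sketch below illustrates that standard encoding and its inverse; it is provided for illustration and is not part of the capture device described herein.

```python
def srgb_encode(linear):
    """Encode a linear-light value in [0.0, 1.0] to a nonlinear sRGB value,
    per the piecewise function of IEC 61966-2-1."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * (linear ** (1.0 / 2.4)) - 0.055

def srgb_decode(encoded):
    """Inverse transform: nonlinear sRGB code value back to linear light."""
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4

# Example: a mid-grey linear value of 0.5 encodes to roughly 0.735.
print(round(srgb_encode(0.5), 3))
```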
The sRGB color space is used, for example, by such popular image processing software products as Adobe Photoshop™, and Eastman Kodak's "Imaging for Windows"™. Kodak's "Imaging for Windows"™ is the standard imaging software currently delivered with the Microsoft Windows 98™ operating system and is thus widely available.
The CIELAB color space is a color model based on human perception developed by the Commission Internationale de l'Eclairage (CIE) committee and defined in CIE
publication 15.2, Colorimetry, 1986. The disclosure of CIE publication 15.2, Colorimetry, 1986, is incorporated in its entirety by reference. While widely regarded as a useful color model, CIELAB is not widely implemented in common color printing devices, particularly consumer printing or display devices, as it would require that a color conversion be applied to image data prior to presenting the CIELAB color space image data to a printing device for direct printing. CIELAB is a device independent color space that is defined with regard to a standard human observer.
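For illustration only, the standard conversion from CIE XYZ tristimulus values to CIELAB coordinates may be sketched as follows; the use of the D50 reference white in this sketch is an assumption and not taken from the present disclosure.

```python
def xyz_to_cielab(x, y, z, white=(0.9642, 1.0, 0.8249)):
    """Convert CIE XYZ tristimulus values to CIELAB (L*, a*, b*).
    The default reference white is D50, chosen here only for illustration."""
    def f(t):
        # Cube root above the breakpoint, linear segment below it.
        if t > (6.0 / 29.0) ** 3:
            return t ** (1.0 / 3.0)
        return t / (3 * (6.0 / 29.0) ** 2) + 4.0 / 29.0

    fx, fy, fz = f(x / white[0]), f(y / white[1]), f(z / white[2])
    l_star = 116.0 * fy - 16.0
    a_star = 500.0 * (fx - fy)
    b_star = 200.0 * (fy - fz)
    return l_star, a_star, b_star

# Example: the reference white itself maps to L* = 100, a* = b* = 0.
print(xyz_to_cielab(0.9642, 1.0, 0.8249))
```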
Due to differences in each imaging device, even devices of the same manufacture and model, it is common for an image to be captured by different imaging devices and delivered to output with image results that vary considerably, particularly with regard to the way colors appear in the output image. More particularly, the colors of the output image may be very different from the original image and not accurately representative of the original image. Examples of image output results may include: printed output where a scanned image is output in a printed format, or displayed format where the image is output as an image displayed on a display device such as a computer or video monitor.
One current technique for avoiding complications caused by differences between image capture devices has been the use of a standard color management system (CMS). More particularly, the International Color Consortium (ICC) color management system has been used to provide a device profile that characterizes the imaging device. Additionally, it has been proposed to perform image data exchange in the CIELAB color space, as proposed in International Telecommunications Standard ITU-T.42 (Continuous-tone Colour Representation Method for Facsimile, 1994), which addresses color fax imaging, since the CIELAB color space provides the advantage of increased gamut and pseudo-uniformity. Although these techniques of processing image data are relatively competent, they currently lack widespread use, particularly in imaging devices and image processing applications (software), and are not particularly easy to implement or carry out by a typical image capture device user. For example, very few consumer printing or display devices are configured to receive and process image data in, for example, a CIELAB
color space format. To do so would require use of computationally intensive conversion software in conjunction with the typical consumer imaging device or application to accommodate image data in, for example, the CIELAB color space format.
In typical image capture devices such as scanners or digital cameras, an imager is used to convert light reflected from a desired image or scene into electronic data representative of the image or scene. These imagers may be, for example, charge coupled device (CCD) imagers, photo multiplier tubes (PMT) or Complementary Metal
Oxide Semiconductor (CMOS) imagers.
SUMMARY
The invention provides a system for capturing an image or scene and outputting image data in a device independent color space format, such as the sRGB color space format. Briefly described, in architecture, the system can be implemented as follows. An imager is provided for capturing an image, scene or object and generating image data representative of the captured image, scene or object. A controller converts the image data from a device dependent color space format to a device independent color space format and outputs it for further processing, display or printing. Conversion of image data to a device independent color space format is carried out in accordance with device profile data stored in a look up table (LUT) to allow for quick conversion of image data to the device independent color space. The invention can also be viewed as providing a method for processing image data. In this regard, the method can be broadly summarized by the following steps: image data representative of a captured image, scene or object is received in a predetermined device dependent color space format. The image data is then converted to a device independent color space format based on a predetermined device profile. Also, the invention provides a methodology for calibrating an imaging device so as to provide for output of image data in a device independent color space format. This technique can be broadly summarized by the following steps: a known calibration target is captured and representative image data is generated. The generated image data is then evaluated in conjunction with known image data representative of the target. Based upon
this evaluation, a device profile representative of the response of the imaging device to the known colors in the calibration target is generated. The device profile may then be stored for future reference during the process of capturing an image and outputting image data representative thereof in a device independent color space format. Other features and advantages of the invention will become apparent to one with skill in the art upon examination of the following drawings and detailed description. It is intended that all such additional features and advantages be included within the scope of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
The components in the figures are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the invention. Moreover, in the figures, like reference numerals designate corresponding parts throughout the several views.
FIG. 1 is an illustration of an image capture device. FIG. 2 is a flowchart illustrating the calibration process.
FIG. 3 is a diagram illustrating an image capture device configured to carry out the calibration process of FIG. 2.
FIG. 4 is a flowchart illustrating the process of capturing an image and generating device independent image data. FIG. 5 is a diagram illustrating a further embodiment of an image capture device.
FIG. 6 is a graph illustrating conversion data for converting a given color input value into a corresponding color output value.
FIG. 7 is a graphical representation of a color space.
FIG. 8 is a diagram illustrating a TWAIN filter. FIG. 9 is a diagram illustrating a TWAIN data source.
DETAILED DESCRIPTION
The invention provides for an image capture device capable of producing an sRGB color space output representative of a captured image or scene. There is also
provided a methodology of producing device independent sRGB color space output representative of a captured image or scene. In one embodiment of the image capture device of the invention, an image is captured via an imager such as, for example, a charge coupled device (CCD) or complementary metal oxide semiconductor (CMOS) imager, configured to output image data in an sRGB color space format. In another embodiment of the image capture device, an image or scene is captured via an imager and output in a device dependent color space format, such as, for example, RGB color space format. This RGB image data is then converted to the sRGB color space format and then output for further processing, printing or viewing, preferably in an sRGB color space compliant format. Conversion of image data from a device dependent image color space, for example, RGB, to the sRGB color space may be carried out via dedicated hardware or via conversion software incorporated as part of, for example, image capture software, such as a TWAIN driver; or image processing software such as Adobe Photoshop™.
In FIG. 1, an image capture device 100 according to the invention is illustrated. It can be seen that imaging device 100 includes an imager 110 that scans, or captures, an image, scene or object 60 and generates image data output D1 that is provided to a controller 120. Where image data is provided in RGB color space format, image data may be provided, for example, as 8 bits for each color channel. Thus, image data representing one pixel of image data may be provided as 8 bits of Red color data, 8 bits of Blue color data and 8 bits of Green color data.
Controller 120 converts the image data D1 received from imager 110 by referring to device profile data stored as a look-up table (LUT) 142 in a memory storage device 140. In turn, converted image data D2 is output from the imaging device 100. Image data (input image data) D1 is in a device dependent color space, such as RGB color space format. However, in another embodiment of the invention, imager 110 outputs image data D1 directly in a device independent color space, such as, for example, sRGB. Conversion of the image data D1 is carried out by the controller 120 in accordance with conversion instructions (software) 144 stored on memory 140. Device profile data may be, for example, composed of a three dimensional (3D) LUT that provides conversion data related to, for example, Red color data, Blue color data and Green color data.
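As a concrete illustration of how such device profile data might be organized, the sketch below builds a small three-dimensional LUT indexed by quantized R, G and B input values, with each grid point holding an output triplet. The grid size, data types and the identity "profile" used to fill the table are assumptions for illustration only.

```python
import numpy as np

GRID = 17  # number of grid points per color axis (an assumed density)

# A 3D LUT: one output (R, G, B) triplet per (r, g, b) grid point.  Here it is
# filled with an identity mapping as a stand-in for real device profile data
# produced by calibration.
levels = np.linspace(0, 255, GRID).astype(np.uint8)
lut = np.zeros((GRID, GRID, GRID, 3), dtype=np.uint8)
for ri, r in enumerate(levels):
    for gi, g in enumerate(levels):
        for bi, b in enumerate(levels):
            lut[ri, gi, bi] = (r, g, b)

def lookup(r, g, b):
    """Nearest-grid-point lookup of an input pixel (interpolation between
    grid points is discussed later in this description)."""
    idx = lambda v: int(round(v / 255.0 * (GRID - 1)))
    return tuple(lut[idx(r), idx(g), idx(b)])

print(lookup(200, 30, 120))
```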
Imager 110 may be implemented as, for example, a charge coupled device (CCD), a complementary metal oxide semiconductor (CMOS) imager, or a photomultiplier tube (PMT). Memory 140 may be implemented as volatile or non-volatile memory, or as a fixed or removable storage media. Device profile data stored in LUT 142 is obtained via a calibration process. Imaging device 100 may be implemented as a video camera, a flat-bed scanner, a digital still camera or a multi-function device that provides capabilities such as scanning, faxing or photo-copying. For each imaging device 100, a calibration module, or device profile, is determined.
This device profile essentially defines the relationship between the imaging device's representation of input image data in RGB color space values and a device independent color space, such as sRGB. This relationship may be defined by the transformation of Equation 1:

[R_sRGB, G_sRGB, B_sRGB] = f(R, G, B)   (EQ. 1)
An exact analytical representation of Equation 1 does not exist unless the image capture device 100 is colorimetric, that is, where the spectral sensitivities of the three image capture device color channels (Red, Green, Blue) equal the CIE XYZ color matching functions, or a non-singular linear transformation thereof. Unfortunately, colorimetric recording devices are very rare. Given this, the invention uses an analytical model to approximate the transformation of EQ. 1. This model is based on polynomial regression methods, preferably 3rd order polynomial regression. The process of characterizing an image capture device is depicted in FIG. 2. A color target, such as, for example, the IT8.7/2 44 color target containing a set of color samples with known CIELAB values, is scanned (imaged) as shown at step 210. At step 220, the imager generates color image data (input image data) in accordance with the captured target image. The image data is evaluated in conjunction with known data
representative of each color sample of the color target at step 230 to determine the response of the image capture device to the known target image. The response of the device may be determined by calculating the mean values of the image capture device's color components, for example RGB, and using 3rd order polynomial regression to define the image capture device response. This response model, or device profile, may then be used to create a look-up table (LUT) that can be used to carry out conversion of input color image data from, for example, a standard device dependent color space, such as RGB, to a device independent color space, such as, for example, sRGB color space, as shown at step 240. The device profile is preferably stored for future reference at step 250.
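A minimal sketch of this characterization step, under stated assumptions, is given below: it fits a full 3rd order polynomial (20 monomials per output channel) by ordinary least squares, assuming the known target values are already expressed in the destination color space. The function names, array shapes and the random stand-in patch data are hypothetical.

```python
import numpy as np
from itertools import combinations_with_replacement

def poly3_terms(rgb):
    """Expand an (N, 3) array of R, G, B values into the 20 monomials of a
    full 3rd order polynomial (1, R, G, B, R^2, RG, ..., B^3)."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    cols = [np.ones_like(r)]
    for degree in (1, 2, 3):
        for combo in combinations_with_replacement((r, g, b), degree):
            cols.append(np.prod(combo, axis=0))
    return np.stack(cols, axis=1)          # shape (N, 20)

def fit_device_profile(measured_rgb, known_out):
    """Least-squares fit of the 20 x 3 coefficient matrix mapping the device's
    mean patch responses (measured_rgb) to the known target values."""
    design = poly3_terms(measured_rgb)
    coeffs, *_ = np.linalg.lstsq(design, known_out, rcond=None)
    return coeffs                           # shape (20, 3)

def apply_profile(coeffs, rgb):
    """Apply the fitted profile to new device RGB values."""
    return poly3_terms(rgb) @ coeffs

# Hypothetical usage with random stand-in data for the target patches:
rng = np.random.default_rng(0)
patches_rgb = rng.random((44, 3))           # device responses for the patches
patches_ref = rng.random((44, 3))           # known reference values
profile = fit_device_profile(patches_rgb, patches_ref)
print(apply_profile(profile, patches_rgb[:2]))
```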
In FIG. 3, the imaging device 100 of FIG. 1 may be configured to carry out the calibration process shown in FIG. 2. In this embodiment, image capture device 100 may include additional instructions stored in memory 140 for carrying out calibration of the image capture device 100. For this purpose, target data 146 may be stored on memory 140. Controller 120 is configured to carry out the comparison of received color image data D1 of a captured known target image 65 with known target data 146 representative of the known target image and to generate a device profile that is stored in LUT 142 for future reference during image capture.
The flow chart of FIG. 2 shows the architecture, functionality, and operation of a possible implementation of the conversion software 144 shown in FIG. 3. In this regard, each block represents a module, segment, or portion of code that comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the blocks may occur out of the order noted in FIG. 2. For example, two blocks shown in succession in FIG. 2 may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
The image capture device response (device profile) may be determined by using polynomial regression techniques. For example, a 3rd order polynomial model involving the three parameters R (Red), G (Green) and B (Blue) may be applied. The resulting
3x20 coefficients may be optimized by minimizing the mean square error in CIELAB of the color patches of, for example, the IT8.7/2 44 color target. By performing optimization directly in the CIELAB color space, and not in the CIE XYZ color space as is typically done, errors can be minimized. More particularly, the RMS ΔE*ab color difference corresponds directly to the visual color differences.
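The quantity being minimized here can be sketched as follows: given predicted and known CIELAB values for the target patches, the root-mean-square ΔE*ab is computed and used as the objective when refining the polynomial coefficients. The function names and the small example arrays are assumptions for illustration.

```python
import numpy as np

def delta_e_ab(lab_predicted, lab_reference):
    """Per-patch CIE 1976 color difference ΔE*ab between two (N, 3) arrays
    of CIELAB values (columns L*, a*, b*)."""
    return np.sqrt(np.sum((lab_predicted - lab_reference) ** 2, axis=1))

def rms_delta_e(lab_predicted, lab_reference):
    """Root-mean-square ΔE*ab over all color patches; this is the objective
    minimized when the 3x20 polynomial coefficients are optimized directly
    in CIELAB rather than in CIE XYZ."""
    return float(np.sqrt(np.mean(delta_e_ab(lab_predicted, lab_reference) ** 2)))

# Hypothetical usage: compare a candidate profile's predictions to the known
# target CIELAB values and report the objective to be minimized.
known = np.array([[50.0, 10.0, -20.0], [75.0, -5.0, 30.0]])
predicted = np.array([[51.0, 9.0, -18.0], [74.0, -4.0, 31.0]])
print(rms_delta_e(predicted, known))
```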
A general color transformation between two color spaces can be described by Equation 2:
O = f(P)   (EQ. 2)

where P denotes the input color value and O denotes the output color value. For purposes of describing the invention, P may represent image data as generated by the imager device and, more particularly, the RGB values, P = [R, G, B], and O = [R_sRGB, G_sRGB, B_sRGB], the calibrated sRGB values.
FIG. 4 shows a flow chart that describes a preferred embodiment of capturing an image or scene. At step 410, an image is captured by an imager of an imaging device. In response, image data is generated by the imager in a first color space format at step 420 and subsequently converted to image data of a reference color space format based upon device profile data stored in a memory device as shown at step 440. Once converted to the reference color space format, the image data is then converted to a device independent color space format and output from the imaging device for further processing at step 460. The flow chart of FIG. 4 shows the architecture, functionality, and operation of a possible implementation of the conversion software 144 shown in FIG. 1. In this regard, each block represents a module, segment, or portion of code that comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the blocks may occur out of the order noted in FIG. 4. For example, two blocks shown in succession in FIG. 4 may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
In one embodiment of the invention, the transformation of input image data (D1) color values to output image data (D2) color values may be carried out by direct computation of the analytical function, without reference to a device profile data table. This direct computation may be via the third order polynomial, using floating point arithmetic. For captured images that yield large amounts of image data, such calculations may be very slow to perform for a given image capture device 100. In this embodiment and with reference to FIG. 5, controller 120 of the image capture device may be configured to carry out the transformation calculations to convert input RGB format image data D1 received from imager 110 into output image data D2 in sRGB color format. Instructions (software) 144 for carrying out such calculations may be stored on memory 140. Transformation calculations may be carried out, for example, in accordance with the following Equations 3A-3C:
R_sRGB = A0 + A1R + A2G + A3B + A4RG + A5RB + A6GB + A7RGB   (EQ. 3A)
G_sRGB = B0 + B1R + B2G + B3B + B4RG + B5RB + B6GB + B7RGB   (EQ. 3B)
B_sRGB = C0 + C1R + C2G + C3B + C4RG + C5RB + C6GB + C7RGB   (EQ. 3C)
where Ai, Bi and Ci, i = 0...7, are the polynomial coefficients. This example uses second order polynomials, but it will be recognized that 1st, 3rd or Nth order polynomials may also be used.
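A minimal sketch of this direct computation is shown below, evaluating the cross-term polynomials of Equations 3A-3C in floating point for one pixel. The coefficient values are placeholders approximating an identity transform, not real profile data.

```python
def convert_pixel(r, g, b, coeff_a, coeff_b, coeff_c):
    """Evaluate Equations 3A-3C directly for one pixel using floating point.
    coeff_a, coeff_b and coeff_c hold (A0..A7), (B0..B7) and (C0..C7);
    the values used below are placeholders, not real profile data."""
    terms = (1.0, r, g, b, r * g, r * b, g * b, r * g * b)
    r_srgb = sum(a * t for a, t in zip(coeff_a, terms))
    g_srgb = sum(bb * t for bb, t in zip(coeff_b, terms))
    b_srgb = sum(c * t for c, t in zip(coeff_c, terms))
    return r_srgb, g_srgb, b_srgb

# Placeholder coefficients approximating an identity transform.
A = (0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0)
B = (0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0)
C = (0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0)
print(convert_pixel(120.0, 64.0, 32.0, A, B, C))   # -> (120.0, 64.0, 32.0)
```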
In another embodiment, the transformation of image data D1 color values to output image data D2 color values is pre-calculated for all possible input values P and then stored in a look-up table (LUT). In this way, conversion of input image data from, for example, RGB color format to, for example, sRGB color format, can be more quickly carried out by simply referring to the LUT to determine a corresponding output value for a given input value. This embodiment requires sufficient memory storage 140 to accommodate the necessary LUT data 142; for example, approximately 50 Mb of storage space may be required to store conversion data for 24 bit per pixel input and output data. In order to provide for faster processing of image data and reduce the storage memory necessary to accommodate a LUT, another embodiment of the invention provides for a reduced size LUT. This reduced size LUT includes conversion data for only predefined input values. For example, instead of including conversion data for all input values in a predetermined range, for example, 0 - 256, only conversion data for certain predetermined input values is included. More particularly, for a range of potential input values 0 - 256, instead of including a conversion value for each of the 256 values in this range, only every 32nd value may be included (i.e., 0, 32, 64, 96, 128, 160, 192, 224 and 256). This is illustrated in FIG. 6. Of course, any set of predetermined values within a given range may also be specified as may be desired. No conversion data is provided for input values not specified. For example, input values falling between 0 and 32, or between 32 and 64, are not accommodated with corresponding conversion data. Thus, it is not possible to derive a direct equivalent output value from the look-up table.
Where an input value is not included in the LUT conversion data, and therefore does not have a corresponding output value in the LUT, an output value may be derived by interpolating an output value based upon conversion data provided for input image values that are in fact included in the LUT. For example, where the input value is 206, an output value Y206 can be interpolated based on the predetermined input values 192 and 224 that are provided for in the LUT with corresponding output values Y192 and Y224, respectively. It will be understood that this example is a function involving a single input value and a single output value. However, for image input data, such as color image input data where three values (RGB) are input, the LUT must accommodate conversion data for all three input values. In other words, the LUT should be 3D, or otherwise support conversion data for all three input values. The foregoing illustrates a 1D LUT in which only one input variable is provided.
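The single-channel case described above might be sketched as follows: a reduced LUT holds outputs only at every 32nd input value, and intermediate inputs (such as 206) are handled by linear interpolation between the two nearest stored entries. The transfer function used to fill the table is a placeholder, and the function names are assumptions.

```python
STEP = 32
GRID_INPUTS = list(range(0, 257, STEP))        # 0, 32, 64, ..., 256

def build_reduced_lut(transfer):
    """Pre-calculate the output only at the predetermined grid inputs."""
    return {x: transfer(x) for x in GRID_INPUTS}

def lookup_interpolated(lut, x):
    """Derive an output for any input 0..256 by linear interpolation
    between the nearest pre-calculated grid values."""
    lo = (x // STEP) * STEP
    hi = min(lo + STEP, 256)
    if lo == hi:
        return lut[lo]
    w = (x - lo) / STEP
    return (1.0 - w) * lut[lo] + w * lut[hi]

# Placeholder transfer function standing in for real conversion data.
lut = build_reduced_lut(lambda x: x ** 1.1)
# Input 206 falls between the stored entries for 192 and 224.
print(lookup_interpolated(lut, 206))
```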
Transformation of image data from one color space to a second color space is given by the equation y = f(x), in which an input parameter x is coded onto a byte, that is, between 0 and 255. It is possible to pre-calculate the function f for a regularly spaced subset of the input range, and thereby reduce the complexity of calculating the function f directly in real time. For example, the function may be pre-calculated for the values 0, 32, 64, 96, 128, 160, 192, 224 and 256, and these values stored in a LUT as shown in FIG. 6. An output value for a given input value
may be approximated by, for example, linearly interpolating between the nearest pre-calculated values, for example, as follows:

f(208) = 0.500 y192 + 0.500 y224   (EQ. 4A)
f(196) = 0.875 y192 + 0.125 y224   (EQ. 4B)

As the intervals between each included image data input value are equal, the weighting coefficients (0.5, 0.5, 0.875 and 0.125) depend only on the position of the input point within its sub-interval. This holds for each of the 8 sub-intervals and, as a result, the weights too may be pre-calculated and stored in a look-up table (LUT).
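Because the grid spacing is uniform, the interpolation weights depend only on an input's offset within its 32-value sub-interval and may themselves be tabulated once, as in the sketch below; the names and the identity LUT are illustrative only.

```python
STEP = 32

# Pre-calculate the pair of weights for every possible offset 0..31 within a
# sub-interval; the same table serves all 8 sub-intervals of the input range.
WEIGHTS = [((STEP - off) / STEP, off / STEP) for off in range(STEP)]

def interpolate(lut, x):
    """Weighted sum of the two nearest grid outputs using tabulated weights."""
    lo = (x // STEP) * STEP
    w_lo, w_hi = WEIGHTS[x - lo]
    return w_lo * lut[lo] + w_hi * lut[min(lo + STEP, 256)]

# Example with an identity LUT at the grid points 0, 32, ..., 256:
lut = {x: float(x) for x in range(0, 257, STEP)}
print(interpolate(lut, 208))   # 0.500 * 192 + 0.500 * 224 = 208.0
print(interpolate(lut, 196))   # 0.875 * 192 + 0.125 * 224 = 196.0
```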
Interpolation may be carried out using any one of several different interpolation techniques including, but not limited to, trilinear interpolation; tetrahedral interpolation; prism interpolation; or slant prism interpolation. As an example, the use of trilinear interpolation to derive an output image value for a non-included input value, based upon conversion data provided in a LUT, will be discussed. First, the input color space is divided into cubic sub-spaces using a regular grid. An illustration of this general concept is shown in FIG. 7. For each grid point Pi of this structure, the output values Oi can be calculated. Where a given pixel value (input value) P is to be transformed, the output values O are calculated as a weighted sum of the vertices of the cube (in this case eight) including the input point P. This is illustrated by:

O = W1 O1 + W2 O2 + ... + W8 O8

where Wk, k = 1,...,8, are weighting coefficients. It should be noted that the actual weighting coefficients Wk may be pre-calculated and stored in a look-up table (LUT).
It will be recognized that the density of the cubic grid shown in FIG. 7 will have an impact on how accurate the results of the interpolation process will be. More particularly, the denser the grid, the more accurate the results of the interpolation process become, however, at the expense of the increased memory requirements necessary to store conversion data. Further, it will be recognized that interpolation may be carried out using, for example, trilinear interpolation; tetrahedral interpolation; prism interpolation; or slant prism interpolation. In one embodiment of the invention, 3x16 bit conversion data is stored in a three dimensional (3D) LUT. Where 17x17x17 entries are provided for, a color LUT of approximately 29 kilobytes (kb) is generated. By increasing the density of the cubic grid to 33x33x33, the memory necessary for storing the LUT would be approximately 217 kb. It will be noted that it is possible to reduce the amount of data generated by reducing the number of bits used to, for example, encode the coefficients Wk and the output data Ok.
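A minimal sketch of trilinear interpolation over such a 3D grid follows, assuming a 17x17x17 grid of 3x16-bit entries (roughly 29 kb) filled with placeholder identity data; the grid fill and function names are illustrative only, not the real profile data.

```python
import numpy as np

GRID = 17                      # 17 x 17 x 17 grid points, per the example above
SPAN = 255.0 / (GRID - 1)      # input distance between neighboring grid points

# Placeholder 3D LUT of 3 x 16-bit entries; a real table would hold the
# pre-calculated device-profile conversion data.
coords = np.linspace(0, 255, GRID)
lut = np.zeros((GRID, GRID, GRID, 3), dtype=np.uint16)
for i, r in enumerate(coords):
    for j, g in enumerate(coords):
        for k, b in enumerate(coords):
            lut[i, j, k] = (round(r), round(g), round(b))

print(lut.nbytes)              # roughly 29 kb of conversion data

def trilinear(rgb):
    """Interpolate an output triplet as a weighted sum of the 8 cube vertices
    surrounding the input point."""
    out = np.zeros(3)
    idx = [min(int(c / SPAN), GRID - 2) for c in rgb]       # lower cube corner
    frac = [c / SPAN - i for c, i in zip(rgb, idx)]         # position in cube
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((frac[0] if dr else 1 - frac[0]) *
                     (frac[1] if dg else 1 - frac[1]) *
                     (frac[2] if db else 1 - frac[2]))
                out += w * lut[idx[0] + dr, idx[1] + dg, idx[2] + db]
    return out

print(trilinear((200.0, 30.0, 120.0)))
```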
In another embodiment of the invention, as shown in FIG. 8, a color conversion filter 862 is provided that allows for transparent color conversion of captured image data without the need for modifications to the TWAIN data source manager 860 or image processing applications (software) 875. Data source 866 receives image data D1 from an imager 110. The data source 866 passes the image data D1 to a filter 862. Filter 862 functions as an interpolation engine that accesses conversion data 864 to convert the image data D1 into a second color space to produce image data D2. Image data D1 is, for example, RGB color space format image data, while image data D2 is in, for example, a device independent color space format such as sRGB. The TWAIN data source manager 860 also preferably carries out the function of passing TWAIN messages between an image processing application 875 and the TWAIN data source 866, as well as sending control messages C3 to the imager 110. In another embodiment of the invention, as shown in FIG. 9, the data source manager 860 is configured to access conversion data 864 and convert the image data D1 into a second color space to produce image data D2.
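As an illustration of the filter arrangement described above, the sketch below models the data source, filter and conversion data as simple classes; it does not use the actual TWAIN API, and all class names, method names and the identity conversion are hypothetical.

```python
class DataSource:
    """Stand-in for the data source 866: yields device-dependent RGB
    pixels (D1) as they are captured."""
    def __init__(self, pixels):
        self._pixels = pixels

    def acquire(self):
        yield from self._pixels


class ColorConversionFilter:
    """Stand-in for filter 862: an interpolation engine that converts each
    D1 pixel to D2 using conversion data 864 (here a simple callable)."""
    def __init__(self, source, convert):
        self._source = source
        self._convert = convert

    def acquire(self):
        for pixel in self._source.acquire():
            yield self._convert(pixel)


# Hypothetical usage: the application sees only converted data, with no
# change to the data source or to the application itself.
identity_conversion = lambda rgb: rgb          # placeholder for a LUT lookup
source = DataSource([(10, 20, 30), (200, 100, 50)])
filtered = ColorConversionFilter(source, identity_conversion)
print(list(filtered.acquire()))
```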
The image capture device of the invention can be implemented in hardware, software, firmware, or a combination thereof. In the preferred embodiment(s), the conversion of image data to a device independent color space format is implemented in software or firmware that is stored in a memory and that is executed by a suitable instruction execution system. If implemented in hardware, as in an alternative embodiment, the controller can be implemented with any or a combination of the following technologies, which are all well known in the art: a discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit having appropriate logic gates, a programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc. It should be emphasized that the above-described embodiments of the invention, particularly any "preferred" embodiments, are merely possible examples of implementations, merely set forth for a clear understanding of the principles of the invention. Many variations and modifications may be made to the above-described embodiment(s) of the invention without departing substantially from the spirit and principles of the invention. All such modifications and variations are intended to be included within the scope of the invention and protected by the following claims.