US20140333797A1 - Device and method for processing image - Google Patents

Device and method for processing image

Info

Publication number
US20140333797A1
US20140333797A1
Authority
US
United States
Prior art keywords
data
image
matrix
color space
space conversion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US14/022,673
Other versions
US9140608B2 (en)
Inventor
Kyoung-tae Kim
Byong-tae Ryu
Jae-woo Bae
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Display Co Ltd
Original Assignee
Samsung Display Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Display Co., Ltd.
Assigned to SAMSUNG DISPLAY CO., LTD. Assignment of assignors interest (see document for details). Assignors: BAE, JAE-WOO; KIM, KYOUNG-TAE; RYU, BYONG-TAE
Publication of US20140333797A1
Application granted
Publication of US9140608B2
Legal status: Active

Classifications

    • G - PHYSICS
      • G01 - MEASURING; TESTING
        • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
          • G01J 3/00 - Spectrometry; Spectrophotometry; Monochromators; Measuring colours
            • G01J 3/46 - Measurement of colour; Colour measuring devices, e.g. colorimeters
              • G01J 3/462 - Computing operations in or between colour spaces; Colour management systems
              • G01J 3/50 - ... using electric radiation detectors
                • G01J 3/505 - ... measuring the colour produced by lighting fixtures other than screens, monitors, displays or CRTs
                • G01J 3/51 - ... using colour filters
                  • G01J 3/513 - ... having fixed filter-detector pairs
      • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
        • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
          • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
            • G09G 3/20 - ... for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
              • G09G 3/22 - ... using controlled light sources
                • G09G 3/30 - ... using electroluminescent panels
                  • G09G 3/32 - ... semiconductive, e.g. using light-emitting diodes [LED]
                    • G09G 3/3208 - ... organic, e.g. using organic light-emitting diodes [OLED]
          • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
            • G09G 5/02 - ... characterised by the way in which colour is displayed
          • G09G 2320/00 - Control of display operating conditions
            • G09G 2320/06 - Adjustment of display parameters
              • G09G 2320/0666 - ... for control of colour parameters, e.g. colour temperature
              • G09G 2320/0693 - Calibration of display systems
          • G09G 2340/00 - Aspects of display data processing
            • G09G 2340/06 - Colour space transformation
    • H - ELECTRICITY
      • H04 - ELECTRIC COMMUNICATION TECHNIQUE
        • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 9/00 - Details of colour television systems
            • H04N 9/64 - Circuits for processing colour signals
              • H04N 9/67 - ... for matrixing
              • H04N 9/68 - ... for controlling the amplitude of colour signals, e.g. automatic chroma control circuits
                • H04N 9/69 - ... for modifying the colour signals by gamma correction
              • H04N 9/735

Definitions

  • FIG. 1 is a block diagram illustrating an exemplary embodiment of an image processing device according to the invention.
  • an exemplary embodiment of an image processing device 100 includes a camera sensor unit 101, a pre-processing unit 102, a white balancing unit 103, a demosaicking unit 104, a gamma correction unit 105, a first color space conversion unit 106, a post-processing unit 107, a display unit 108, a data storage unit 109, a second color space conversion unit 110 and a data transmitting and receiving unit 111.
  • the camera sensor unit 101 includes a color filter and an image sensor, such as a complementary metal-oxide semiconductor (“CMOS”) or a charge coupled device (“CCD”), for example.
  • An image that is acquired by an image sensor such as a CMOS or a CCD is a monochrome image, and thus a color filter that passes only a predetermined frequency band of the visible spectrum may be provided, e.g., disposed, at a front end portion of the image sensor to acquire a color image.
  • the color filter allows an image having a specific frequency band in an image of a subject to pass therethrough.
  • the color filter is divided into a plurality of areas, and each area allows only a video signal having a frequency band substantially the same as the frequency band of one of the three colors red, green and blue to pass therethrough.
  • the color filter includes a Bayer filter mosaic.
  • the video signal that passes through the color filter is transferred to an image sensor such as a CMOS or a CCD.
  • the image sensor converts the received video signal into an electrical signal.
  • the electrical signal corresponds to the video signal that passes through the color filter, such that the electrical signal includes a red channel value, a green channel value and a blue channel value.
  • A Bayer image, e.g., a Bayer pattern image, which is an array of red channel values, green channel values and blue channel values, is thereby generated.
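For illustration only, the minimal sketch below simulates how such a Bayer mosaic stores a single color sample at each pixel position. The RGGB layout and the function name are assumptions; the patent does not specify a particular color filter arrangement.

```python
import numpy as np

def make_bayer_mosaic(rgb):
    """Build a single-channel Bayer image from a full-color H x W x 3 array (assumed RGGB layout)."""
    h, w, _ = rgb.shape
    bayer = np.zeros((h, w), dtype=rgb.dtype)
    bayer[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red samples at even rows, even columns
    bayer[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green samples at even rows, odd columns
    bayer[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green samples at odd rows, even columns
    bayer[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue samples at odd rows, odd columns
    return bayer
```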
  • the pre-processing unit 102 adjusts a black level of the Bayer image and removes noise in the Bayer image.
  • the white balancing unit 103 performs white balancing of the Bayer image such that color balance is adjusted for accurate color reproduction.
  • a color in the Bayer image that is distorted by the lighting may be corrected through white balancing into a color as a human perceives it, as sketched below.
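The patent does not specify particular black-level, denoising or white-balancing algorithms, so the following is only a hedged sketch using a fixed black-level offset and a gray-world white balance; the offset value and the function name are assumptions, and noise removal is omitted.

```python
import numpy as np

def preprocess_and_white_balance(bayer, black_level=64.0):
    """Subtract an assumed black level and apply gray-world white balance to an RGGB mosaic."""
    h, w = bayer.shape
    x = bayer[:h - h % 2, :w - w % 2].astype(np.float64)  # crop to even dimensions
    x = np.clip(x - black_level, 0.0, None)               # black-level adjustment
    r = x[0::2, 0::2]
    g = 0.5 * (x[0::2, 1::2] + x[1::2, 0::2])
    b = x[1::2, 1::2]
    g_mean = g.mean()
    x[0::2, 0::2] *= g_mean / max(r.mean(), 1e-6)  # scale red samples toward the green mean
    x[1::2, 1::2] *= g_mean / max(b.mean(), 1e-6)  # scale blue samples toward the green mean
    return x
```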
  • the demosaicking unit 104 converts the Bayer image into a green image, a red image and a blue image. In such an embodiment, the demosaicking unit 104 may convert the Bayer image having a single channel into a color image having three channels.
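As a rough illustration of demosaicking, the sketch below converts an RGGB mosaic into a three-channel image by collapsing each 2x2 block into one RGB pixel. Real demosaicking typically interpolates the missing samples at full resolution; the patent does not prescribe a specific algorithm, so this is an assumption made for brevity.

```python
import numpy as np

def demosaic_rggb_halfres(bayer):
    """Convert a single-channel RGGB Bayer mosaic into a half-resolution 3-channel RGB image."""
    h, w = bayer.shape
    bayer = bayer[:h - h % 2, :w - w % 2]               # crop to even dimensions
    r = bayer[0::2, 0::2]
    g = 0.5 * (bayer[0::2, 1::2] + bayer[1::2, 0::2])   # average the two green samples per block
    b = bayer[1::2, 1::2]
    return np.stack([r, g, b], axis=-1)
```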
  • the gamma correction unit 105 performs gamma correction of the green image, the red image and the blue image based on a visual perceptual property of humans.
  • the gamma correction unit 105 may nonlinearly convert each of the green image, the red image and the blue image using a nonlinear transfer function based on a nonlinear visual perceptual property of humans.
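The patent does not give the nonlinear transfer function; a simple power-law curve, commonly used for perceptual gamma encoding, is assumed in the sketch below.

```python
import numpy as np

def gamma_correct(rgb_linear, gamma=2.2):
    """Apply an assumed power-law transfer function to linear RGB values in [0, 1]."""
    return np.clip(np.asarray(rgb_linear, dtype=np.float64), 0.0, 1.0) ** (1.0 / gamma)
```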
  • the data of the red image, the green image and the blue image constitute the camera RGB data, and the camera RGB data includes a color coordinate of a color in a color gamut of the camera.
  • the first color space conversion unit 106 converts the camera RGB data into display R′G′B′ data using a first color space conversion matrix.
  • the display R′G′B′ data include green image data, red image data and blue image data based on color gamut characteristics of the display unit 108 .
  • the display R′G′B′ data includes a color coordinate of a color in a color gamut of the display.
  • the first color space conversion matrix converts the camera RGB data into the display R′G′B′ data based on color gamut characteristics of a display, such that a color of an image that is photographed in a color gamut of a camera is displayed with substantially the same color, without distortion, in the display.
  • An exemplary embodiment of a method of calculating the first color space conversion matrix will be described later in greater detail with reference to FIG. 2.
  • the first color space conversion unit 106 may inversely convert the display R′G′B′ data into the camera RGB data using an inverse matrix of the first color space conversion matrix.
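Applying the first color space conversion matrix, and its inverse, is an ordinary 3x3 matrix multiplication per pixel. The sketch below assumes pixels are stored as an N x 3 array and that m1 has been obtained as described with reference to FIG. 2; the function names are placeholders, not names from the patent.

```python
import numpy as np

def apply_csc(pixels, m):
    """Apply a 3x3 color space conversion matrix to an N x 3 array of RGB triples."""
    return np.asarray(pixels) @ np.asarray(m).T          # each output triple is m @ [R, G, B]^T

def invert_csc(pixels, m):
    """Undo the conversion with the inverse of the 3x3 matrix."""
    return np.asarray(pixels) @ np.linalg.inv(m).T

# Hypothetical usage: camera RGB -> display R'G'B' and back.
# display_rgb = apply_csc(camera_rgb, m1)
# camera_rgb_again = invert_csc(display_rgb, m1)
```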
  • the post-processing unit 107 performs a post-processing that adjusts, e.g., improves, an edge, a contrast and a saturation of the display R′G′B′ data.
  • the display unit 108 displays an image using the display R′G′B′ data, in which a post-processing is performed.
  • the display unit 108 may be an organic light emitting display, and the organic light emitting display may have a color gamut of 110% based on an NTSC.
  • the data storage unit 109 stores the display R′G′B′ data that is converted in the first color space conversion unit 106 .
  • the display R′G′B′ data corresponding to color gamut characteristics of the display unit 108 may be stored at the data storage unit 109 .
  • the second color space conversion unit 110 converts the camera RGB data to sRGB data using a second color space conversion matrix.
  • the sRGB data includes green image data, red image data and blue image data in an sRGB color gamut under daylight 6500K (“D65”).
  • the sRGB color gamut corresponds to 70% color gamut based on an NTSC.
  • the sRGB data includes a color coordinate of a color in the sRGB color gamut.
  • the second color space conversion matrix converts the camera RGB data into the sRGB data in an sRGB color gamut, which is a standard color gamut to transmit and receive a photographed image. An exemplary embodiment of a method of calculating the second color space conversion matrix will be described later in greater detail with reference to FIG. 3 .
  • the second color space conversion unit 110 may inversely convert the sRGB data into the camera RGB data using an inverse matrix of the second color space conversion matrix.
  • In general, a color gamut of a photographed image may be different from the sRGB color gamut; in an alternative exemplary embodiment, however, the color gamut of the photographed image may be substantially the same as the sRGB color gamut.
  • In such an embodiment, the camera RGB data may be substantially the same as the sRGB data of the sRGB color gamut, and the second color space conversion unit 110 may be omitted.
  • the data transmitting and receiving unit 111 transmits the sRGB data to another image processing device or receives sRGB data from another image processing device.
  • Another image processing device may use a different camera color gamut or a different display color gamut, and image data that is transmitted to and received from such an image processing device is desirably data of a standard color gamut that is widely used across arbitrary image processing devices. Therefore, image data that image processing devices including the data transmitting and receiving unit 111 transmit and receive may be converted into the sRGB data of the sRGB color gamut.
  • FIG. 2 is a flowchart illustrating an exemplary embodiment of a method of calculating a first color space conversion matrix of an image processing device according to the invention.
  • the first color space conversion matrix of the image processing device 100 may be an RGB to R′G′B′ matrix that converts the camera RGB data into the display R′G′B′ data.
  • an image of a predetermined number of colors, e.g., N number of colors, is photographed through the camera sensor unit 101, where N is a natural number.
  • the N number of colors may be arbitrarily selected.
  • the N number of colors may be selected as red, green and blue for an accurate calculation of the first color space conversion matrix.
  • the N number of colors may further include other colors, such as cyan, magenta, yellow and white, in addition to red, green and blue, for a more accurate calculation of the first color space conversion matrix.
  • the photographed image is processed in the pre-processing unit 102, the white balancing unit 103, the demosaicking unit 104 and the gamma correction unit 105, and the photographed image is generated into camera RGB data (S120).
  • a pre-processing process, a white balancing process, a demosaicking process and a gamma correction process of a Bayer image that is formed in the camera sensor unit 101 may be performed, and the camera RGB data of the N number of colors is thereby generated.
  • the N number of colors are displayed in the display unit 108, and the N number of colors are measured by a spectrophotometer, such that a tristimulus value XYZ of each of the N number of colors is obtained (S130).
  • the tristimulus value XYZ of each of the N number of colors is converted into display R′G′B′ data using an XYZ to R′G′B′ inverse conversion matrix, which converts the tristimulus value XYZ into the display R′G′B′ data (S150).
  • the R′G′B′ to XYZ conversion matrix that reflects color gamut characteristics of a display may be calculated based on measurement using the spectrophotometer. Therefore, the XYZ to R′G′B′ inverse conversion matrix, which is an inverse matrix of an R′G′B′ to XYZ conversion matrix, may be calculated.
  • the display R′G′B′ data of the N number of colors are generated using the XYZ to R′G′B′ inverse conversion matrix (S 160 ).
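One common way to obtain such an R′G′B′ to XYZ matrix from spectrophotometer measurements is to measure the tristimulus values of the display's red, green and blue primaries at full drive and use them as the matrix columns, assuming a linear (gamma-removed) display model. The patent does not state this exact procedure, and the numbers below are purely hypothetical; the sketch only illustrates constructing and inverting such a matrix.

```python
import numpy as np

# Hypothetical measured tristimulus values (X, Y, Z) of the display's full-drive primaries.
xyz_red   = np.array([41.2, 21.3,  1.9])
xyz_green = np.array([35.8, 71.5, 11.9])
xyz_blue  = np.array([18.0,  7.2, 95.0])

# Columns of the R'G'B' -> XYZ matrix are the primaries' tristimulus values
# (linear display model; gamma handling is omitted in this sketch).
rgbp_to_xyz = np.column_stack([xyz_red, xyz_green, xyz_blue])
xyz_to_rgbp = np.linalg.inv(rgbp_to_xyz)   # the "XYZ to R'G'B' inverse conversion matrix"

def xyz_to_display_rgb(xyz):
    """Convert a measured XYZ triple into display R'G'B' coordinates."""
    return xyz_to_rgbp @ np.asarray(xyz, dtype=np.float64)
```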
  • the RGB to R′G′B′ matrix is calculated based on a relationship between the camera RGB data of the N number of colors and the display R′G′B′ data of the N number of colors (S170).
  • the camera RGB data of the N number of colors and the display R′G′B′ data of the N number of colors may be represented by N×3 matrices, including a first N×3 matrix and a second N×3 matrix,
  • and may be represented by the following Equation 1:

    A = \begin{bmatrix} R_{C1} & G_{C1} & B_{C1} \\ R_{C2} & G_{C2} & B_{C2} \\ \vdots & \vdots & \vdots \\ R_{CN} & G_{CN} & B_{CN} \end{bmatrix}, \quad B = \begin{bmatrix} R_{D1} & G_{D1} & B_{D1} \\ R_{D2} & G_{D2} & B_{D2} \\ \vdots & \vdots & \vdots \\ R_{DN} & G_{DN} & B_{DN} \end{bmatrix} \quad \text{(Equation 1)}

  • A represents the N×3 matrix of the camera RGB data of the N number of colors (i.e., the first N×3 matrix), and B represents the N×3 matrix of the display R′G′B′ data of the N number of colors (i.e., the second N×3 matrix).
  • R_{C1}, G_{C1} and B_{C1} denote a color coordinate of a first color in the camera RGB data; R_{C2}, G_{C2} and B_{C2} denote a color coordinate of a second color in the camera RGB data; and R_{CN}, G_{CN} and B_{CN} denote a color coordinate of the N-th color in the camera RGB data.
  • R_{D1}, G_{D1} and B_{D1} denote a color coordinate of the first color in the display R′G′B′ data; R_{D2}, G_{D2} and B_{D2} denote a color coordinate of the second color in the display R′G′B′ data; and R_{DN}, G_{DN} and B_{DN} denote a color coordinate of the N-th color in the display R′G′B′ data.
  • The first color space conversion matrix M, i.e., the RGB to R′G′B′ matrix, relates A and B by the following Equation 2:

    M \cdot A^{T} = B^{T} \quad \text{(Equation 2)}

  • In Equation 2, A^{T} denotes a transpose of A, and B^{T} denotes a transpose of B.
  • When both sides of Equation 2 are multiplied by A, and then by the inverse matrix (A^{T} \cdot A)^{-1}, the first color space conversion matrix M satisfies the following Equation 3:

    M = B^{T} \cdot A \cdot (A^{T} \cdot A)^{-1} \quad \text{(Equation 3)}

  • Thus, the first color space conversion matrix is calculated from the relationship in which the product of the first color space conversion matrix M and a transpose of A is a transpose of B, as shown in Equation 2.
  • the calculated first color space conversion matrix may be used for converting the camera RGB data into the display R′G′B′ data in the first color space conversion unit 106 of FIG. 1 .
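A direct reading of Equations 2 and 3 leads to the least-squares fit sketched below. The helper name and the N = 4 example values are hypothetical; in practice A and B would hold the measured camera RGB and display R′G′B′ coordinates of the N test colors.

```python
import numpy as np

def fit_color_space_matrix(camera_rgb, display_rgb):
    """
    Solve M from Equation 2 (M @ A.T = B.T) in the least-squares sense,
    i.e. Equation 3: M = B.T @ A @ inv(A.T @ A).

    camera_rgb  : first N x 3 matrix A (camera RGB coordinates of the N colors)
    display_rgb : second N x 3 matrix B (display R'G'B' coordinates of the N colors)
    """
    a = np.asarray(camera_rgb, dtype=np.float64)
    b = np.asarray(display_rgb, dtype=np.float64)
    return b.T @ a @ np.linalg.inv(a.T @ a)   # 3 x 3 conversion matrix

# Hypothetical example with N = 4 test colors (values are illustrative only).
A = np.array([[0.9, 0.1, 0.1],
              [0.1, 0.8, 0.2],
              [0.1, 0.1, 0.9],
              [0.8, 0.8, 0.8]])
B = np.array([[0.95, 0.05, 0.05],
              [0.05, 0.90, 0.10],
              [0.05, 0.05, 0.95],
              [0.85, 0.85, 0.85]])
M1 = fit_color_space_matrix(A, B)   # first color space conversion matrix (RGB to R'G'B')
```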
  • FIG. 3 is a flowchart illustrating an exemplary embodiment of a method of calculating a second color space conversion matrix of an image processing device according to the invention.
  • a second color space conversion matrix of the image processing device 100 is an RGB to sRGB matrix that converts the camera RGB data into the sRGB data.
  • an image of a predetermined number of colors, e.g., N number of colors, is photographed through the camera sensor unit 101.
  • the N number of colors may be arbitrarily selected.
  • the N number of colors may be selected as red, green and blue for an accurate calculation of the second color space conversion matrix.
  • the N number of colors may further include cyan, magenta, yellow and white, in addition to red, green and blue, to more accurately calculate the second color space conversion matrix.
  • the photographed image is processed in the pre-processing unit 102, the white balancing unit 103, the demosaicking unit 104 and the gamma correction unit 105, and is generated into the camera RGB data (S220).
  • a pre-processing process, a white balancing process, a demosaicking process and a gamma correction process of a Bayer image that is formed in the camera sensor unit 101 may be performed, and camera RGB data of the N number of colors are thereby generated.
  • sRGB data of the predetermined N number of colors is generated (S230).
  • The sRGB color gamut is a standardized color gamut, and the sRGB data of the predetermined N number of colors may be derived from a look-up table of the sRGB color gamut.
  • The RGB to sRGB matrix is calculated from a relationship between the camera RGB data of the N number of colors and the sRGB data of the N number of colors (S240).
  • the RGB to sRGB matrix, e.g., the second color space conversion matrix, may be derived with the same method as the method described above with reference to Equations 1 to 3 and FIG. 2.
  • the second color space conversion matrix is calculated from a relationship in which the product of the second color space conversion matrix and a transpose of the first N×3 matrix A is a transpose of a third N×3 matrix B′, where B′ represents the sRGB data of the N number of colors, as sketched below.
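Under the same assumptions, the second matrix can be fitted by reusing the hypothetical fit_color_space_matrix helper and matrix A from the sketch after FIG. 2, with the sRGB coordinates of the N colors (e.g., read from an sRGB look-up table) as the target matrix B′.

```python
import numpy as np

# Continuing the previous sketch (fit_color_space_matrix and A are defined there).
# Hypothetical third N x 3 matrix B' holding the sRGB coordinates of the same N = 4 colors.
B_prime = np.array([[0.92, 0.08, 0.08],
                    [0.08, 0.88, 0.12],
                    [0.08, 0.08, 0.92],
                    [0.84, 0.84, 0.84]])

M2 = fit_color_space_matrix(A, B_prime)   # second color space conversion matrix (RGB to sRGB)
```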
  • FIG. 4 is a flowchart illustrating an exemplary embodiment of a method of processing an image in a first image processing device and a second image processing device according to the invention.
  • FIG. 4 illustrates an exemplary embodiment of a method including transmitting an image that is photographed in the first image processing device to the second image processing device.
  • an image that is photographed in the first image processing device is generated into first camera RGB data (S 310 ).
  • an image that is photographed in the first camera unit of the first image processing device may be generated into the first camera RGB data through a pre-processing unit, a white balancing unit, a demosaicking unit and a gamma correction unit of the first image processing device.
  • the first camera RGB data is converted into the first display R′G′B′ data using the first color space conversion matrix of the first image processing device, and the first display R′G′B′ data may be displayed as an image through a display unit of the first image processing device.
  • the first camera RGB data is generated into sRGB data using a second color space conversion matrix of the first image processing device (S 320 ).
  • the first camera RGB data may be converted into sRGB data through the second color space conversion unit of the first image processing device.
  • a second color space conversion matrix of the first image processing device converts the first camera RGB data into sRGB data.
  • the sRGB data is transmitted to the second image processing device through the data transmitting and receiving unit of the first image processing device (S 330 ).
  • the sRGB data is received in the second image processing device (S 340 ).
  • the second image processing device may receive the sRGB data through the data transmitting and receiving unit.
  • the sRGB data is generated into second camera RGB data using an inverse matrix of the second color space conversion matrix of the second image processing device (S 350 ).
  • sRGB data may be converted into the second camera RGB data through the second color space conversion unit of the second image processing device.
  • the second color space conversion matrix of the second image processing device converts the second camera RGB data into sRGB data, and an inverse matrix thereof converts sRGB data into second camera RGB data.
  • the second camera RGB data is generated into second display R′G′B′ data using the first color space conversion matrix of the second image processing device (S 360 ).
  • the second camera RGB data is converted into the second display R′G′B′ data through the first color space conversion unit of the second image processing device.
  • the first color space conversion matrix converts the second camera RGB data into second display R′G′B′ data.
  • the second display R′G′B′ data may be displayed as an image through a display unit of the second image processing device.
  • In such an embodiment, the first image processing device converts the first camera RGB data into sRGB data of the standardized sRGB color gamut and transmits the sRGB data,
  • and the second image processing device converts the sRGB data into second camera RGB data, converts the second camera RGB data into the second display R′G′B′ data, and displays the second display R′G′B′ data in the display unit, such that image processing devices having different camera color gamuts and display color gamuts may transmit and receive image data without distortion of a color, as sketched below.
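Putting the FIG. 4 flow together, the sketch below strings the conversions end to end. The matrices m2_dev1, m2_dev2 and m1_dev2 stand for the respective devices' second and first color space conversion matrices, obtained as described with reference to FIGS. 2 and 3; all names are placeholders.

```python
import numpy as np

def device1_transmit(camera_rgb_1, m2_dev1):
    """First device: convert its camera RGB data into sRGB data for transmission (S320-S330)."""
    return np.asarray(camera_rgb_1) @ np.asarray(m2_dev1).T

def device2_receive_and_display(srgb, m2_dev2, m1_dev2):
    """Second device: sRGB -> second camera RGB with the inverse of its second matrix (S350),
    then second camera RGB -> second display R'G'B' with its first matrix (S360)."""
    camera_rgb_2 = np.asarray(srgb) @ np.linalg.inv(m2_dev2).T
    return camera_rgb_2 @ np.asarray(m1_dev2).T
```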
  • FIG. 5 is a flowchart illustrating an alternative exemplary embodiment of a method of processing an image in a first image processing device and a second image processing device according to the invention.
  • FIG. 5 illustrates an exemplary embodiment of a method including transmitting an image that is stored at the first image processing device to the second image processing device.
  • an image that is photographed in the first image processing device may be stored as first display R′G′B′ data that may be displayed in a display unit of the first image processing device (S410).
  • an image that is photographed in the first camera unit of the first image processing device is generated into first camera RGB data through a pre-processing unit, a white balancing unit, a demosaicking unit and a gamma correction unit of the first image processing device.
  • the first camera RGB data is converted into first display R′G′B′ data through a first color space conversion unit of the first image processing device, and the first display R′G′B′ data may be displayed as an image through a display unit of the first image processing device.
  • the first display R′G′B′ data that is converted through the first color space conversion unit may be stored at a data storage unit of the first image processing device.
  • the first display R′G′B′ data that is stored at the first image processing device is generated into first camera RGB data using an inverse matrix of the first color space conversion matrix of the first image processing device (S 420 ).
  • the first display R′G′B′ data may be converted into first camera RGB data through the first color space conversion unit of the first image processing device.
  • the first color space conversion matrix of the first image processing device converts the first camera RGB data into first display R′G′B′ data, and an inverse matrix thereof converts the first display R′G′B′ data into first camera RGB data.
  • the first camera RGB data is generated into sRGB data using a second color space conversion matrix of the first image processing device (S 430 ).
  • the first camera RGB data is converted into sRGB data through a second color space conversion unit of the first image processing device.
  • the second color space conversion matrix of the first image processing device converts the first camera RGB data into sRGB data.
  • the sRGB data is transmitted to the second image processing device through a data transmitting and receiving unit of the first image processing device (S 440 ).
  • the sRGB data is received in the second image processing device (S 450 ).
  • the second image processing device may receive sRGB data through the data transmitting and receiving unit.
  • the sRGB data is generated into second camera RGB data using an inverse matrix of a second color space conversion matrix of the second image processing device (S 460 ).
  • the sRGB data is converted into second camera RGB data through the second color space conversion unit of the second image processing device.
  • the second color space conversion matrix of the second image processing device converts the second camera RGB data into sRGB data, and an inverse matrix thereof converts the sRGB data into second camera RGB data.
  • the second camera RGB data is generated into second display R′G′B′ data using the first color space conversion matrix of the second image processing device (S 470 ).
  • the second camera RGB data is converted into second display R′G′B′ data through the first color space conversion unit of the second image processing device.
  • the first color space conversion matrix may convert the second camera RGB data into second display R′G′B′ data in the second image processing device.
  • the second display R′G′B′ data may be displayed as an image through a display unit of the second image processing device.
  • In such an embodiment, the first display R′G′B′ data that is stored at the first image processing device is converted into first camera RGB data,
  • the first camera RGB data is converted into sRGB data of the standardized sRGB color gamut and transmitted,
  • and the second image processing device converts the sRGB data into second camera RGB data, converts the second camera RGB data into second display R′G′B′ data, and displays an image based on the second display R′G′B′ data on the display unit.
  • Accordingly, image processing devices having different camera color gamuts and display color gamuts may transmit and receive image data without distortion of a color; a sketch of the stored-data path follows.
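For the stored-image path of FIG. 5, only the first device's side differs from the previous sketch: the stored display R′G′B′ data is first taken back to camera RGB with the inverse of that device's first matrix, then converted to sRGB. Names are again placeholders.

```python
import numpy as np

def device1_transmit_stored(display_rgb_stored, m1_dev1, m2_dev1):
    """First device: stored display R'G'B' -> camera RGB with the inverse of its first matrix (S420),
    then camera RGB -> sRGB for transmission with its second matrix (S430-S440)."""
    camera_rgb_1 = np.asarray(display_rgb_stored) @ np.linalg.inv(m1_dev1).T
    return camera_rgb_1 @ np.asarray(m2_dev1).T

# The receiving side is unchanged: device2_receive_and_display(...) from the previous sketch.
```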

Abstract

An image processing device includes: a demosaicking unit which converts a Bayer image from a camera sensor into red, green and blue images; a first color space conversion unit which converts first camera RGB data into first display R′G′B′ data using a first color space conversion matrix; and a display unit which displays an image using the first display R′G′B′ data, where the first color space conversion matrix is calculated from a relationship in which the product of the first color space conversion matrix and a transpose of a first N×3 matrix is a transpose of a second N×3 matrix, where the first N×3 matrix represents camera RGB data of N number of colors, and the second N×3 matrix represents display R′G′B′ data of the N number of colors calculated by measuring the N number of colors displayed in the display unit using a spectrophotometer.

Description

  • This application claims priority to Korean Patent Application No. 10-2013-0053222 filed on May 10, 2013, and all the benefits accruing therefrom under 35 U.S.C. §119, the content of which in its entirety is herein incorporated by reference.
  • BACKGROUND
  • (a) Field
  • Exemplary embodiments of the invention relate to a device and a method for processing an image. More particularly, the invention relates to a device and a method for processing an image, in which a color of an image that is photographed by a camera is substantially accurately reproduced in a display.
  • (b) Description of the Related Art
  • In order for a display device to accurately reproduce a color of an image that is photographed by a camera of a mobile phone, a digital camera or a camcorder, the camera and the display device are desired to have the same color gamut. In order for a color gamut of the camera to correspond to a color gamut of the display, a standardized sRGB color gamut under daylight 6500K (“D65”) is typically used as the color gamut for storage and reproduction of an image. This corresponds to a 70% color gamut based on the national television system committee (“NTSC”) standard.
  • However, an organic light emitting display typically has a wide color gamut (a 110% color gamut based on the NTSC standard), compared with the sRGB color gamut under D65. Therefore, an image that is generated using the sRGB color gamut under D65 as a standard may not be substantially accurately displayed by an organic light emitting display. That is, when an image that is photographed by a camera having a narrow color gamut is displayed in an organic light emitting display having a wide color gamut, a color of the image may appear exaggerated, i.e., more saturated than the actual color.
  • SUMMARY
  • Exemplary embodiments of the invention relate to a device and a method for processing an image with improved accuracy in reproducing a color of an image that is photographed in a camera in an organic light emitting display.
  • An exemplary embodiment of an image processing device includes: a camera sensor unit which generates a Bayer image; a demosaicking unit which converts the Bayer image into a red image, a green image and a blue image; a first color space conversion unit which converts first camera RGB data including data of the red image, the green image and the blue image into first display R′G′B′ data using a first color space conversion matrix; and a display unit which displays an image using the first display R′G′B′ data, where the first color space conversion matrix is calculated from a relationship, in which the product of the first color space conversion matrix and a transpose of a first N×3 matrix is a transpose of a second N×3 matrix, where the first N×3 matrix represents camera RGB data of N number of colors, and the second N×3 matrix represents display R′G′B′ data of the N number of colors, which are calculated by measuring the N number of colors displayed in the display unit using a spectrophotometer, and where N is a natural number.
  • In an exemplary embodiment, the image processing device may further include a second color space conversion unit which converts the first camera RGB data into first sRGB data including red image data, green image data and blue image data in an sRGB color gamut using a second color space conversion matrix.
  • In an exemplary embodiment, the second color space conversion matrix may be calculated from a relationship, in which the product of the second color space conversion matrix and a transpose of the first N×3 matrix is a transpose of a third N×3 matrix, where the third N×3 matrix represents sRGB data of the N number of colors in the sRGB color gamut.
  • In an exemplary embodiment, the image processing device may further include a data transmitting and receiving unit which transmits the first sRGB data and receives second sRGB data.
  • In an exemplary embodiment, the second color space conversion unit may convert the second sRGB data into second camera RGB data using an inverse matrix of the second color space conversion matrix, the first color space conversion unit may convert the second camera RGB data into second display R′G′B′ data using the first color space conversion matrix, and the display unit may display an image using the second display R′G′B′ data.
  • In an exemplary embodiment, the image processing device may further include a data storage unit which stores the first display R′G′B′ data.
  • In an exemplary embodiment, the first color space conversion unit may convert the first display R′G′B′ data, which are stored at the data storage unit, into the first camera RGB data using an inverse matrix of the first color space conversion matrix.
  • In an exemplary embodiment, the image processing device may further include a pre-processing unit which adjusts a black level of the Bayer image and which removes noise in the Bayer image.
  • In an exemplary embodiment, the image processing device may further include a white balancing unit which performs white balancing of the Bayer image.
  • In an exemplary embodiment, the image processing device may further include a gamma correction unit which performs gamma correction of the red image, the green image and the blue image based on a visual perceptual property of humans.
  • In an exemplary embodiment, the image processing device may further include a post-processing unit which performs a post-processing for adjusting an edge, a contrast and saturation of the first display R′G′B′ data.
  • An exemplary embodiment of a method of processing an image includes: generating a Bayer image including a red channel value, a green channel value and a blue channel value; converting the Bayer image into a red image, a green image and a blue image; converting camera RGB data including data of the red image, the green image and the blue image into display R′G′B′ data using a first color space conversion matrix; and displaying an image in a display unit using the display R′G′B′ data, where the first color space conversion matrix is calculated from a relationship, in which the product of the first color space conversion matrix and a transpose of a first N×3 matrix is a transpose of a second N×3 matrix, where the first N×3 matrix represents camera RGB data of N number of colors, and the second N×3 matrix represents display R′G′B′ data of the N number of colors, which are calculated by measuring the N number of colors displayed in the display unit using a spectrophotometer, and where N is a natural number.
  • In an exemplary embodiment, the generating the Bayer image may include adjusting a black level of the Bayer image and removing noise in the Bayer image; and performing white balancing of the Bayer image.
  • In an exemplary embodiment, the converting the Bayer image may include performing gamma correction of the red image, the green image and the blue image based on a visual perceptual property of humans.
  • In an exemplary embodiment, the converting the camera RGB data may include performing a post-processing for adjusting an edge, a contrast and saturation of the display R′G′B′ data.
  • In an exemplary embodiment, the method may further include: converting the camera RGB data into sRGB data including red image data, green image data and blue image data in an sRGB color gamut using a second color space conversion matrix; and transmitting the sRGB data through a data transmitting and receiving unit.
  • In an exemplary embodiment, the second color space conversion matrix may be calculated from a relationship, in which the product of the second color space conversion matrix and a transpose of the first N×3 matrix is a transpose of a third N×3 matrix, where the third N×3 matrix represents sRGB data of the N number of colors in the sRGB color gamut.
  • In an exemplary embodiment, the method may further include storing the display R′G′B′ data at a data storage unit.
  • An alternative exemplary embodiment of a method of processing an image includes: receiving sRGB data including red image data, green image data and blue image data in an sRGB color gamut through a data transmitting and receiving unit; converting the sRGB data into camera RGB data using an inverse matrix of a second color space conversion matrix; and converting the camera RGB data into display R′G′B′ data using the first color space conversion matrix, where the first color space conversion matrix is calculated from a relationship, in which the product of the first color space conversion matrix and a transpose of a first N×3 matrix is a transpose of a second N×3 matrix, where the first N×3 matrix represents camera RGB data of N number of colors, and the second N×3 matrix represents display R′G′B′ data of the N number of colors, which are calculated by measuring the N number of colors displayed in the display unit using a spectrophotometer, and where N is a natural number.
  • In an exemplary embodiment, the second color space conversion matrix may be calculated from a relationship, in which the product of the second color space conversion matrix and a transpose of the first N×3 matrix is a transpose of a third N×3 matrix, where the third N×3 matrix represents sRGB data of the N number of colors in the sRGB color gamut.
  • In an exemplary embodiment, the receiving the sRGB data may include generating a Bayer image including a red channel value, a green channel value and a blue channel value in a first image processing device; converting the Bayer image into a red image, a green image and a blue image; converting camera RGB data including data of the red image, the green image and the blue image into the sRGB data using a second color space conversion matrix of the first image processing device; and receiving the sRGB data from the first image processing device.
  • In an exemplary embodiment, the receiving the sRGB data may include converting first display R′G′B′ data, which is stored at the first image processing device, into first camera RGB data using an inverse matrix of the first color space conversion matrix of the first image processing device; converting the first camera RGB data into the sRGB data using a second color space conversion matrix of the first image processing device; and receiving the sRGB data from the first image processing device.
  • In exemplary embodiments, when an image that is photographed by a camera having a narrow color gamut is displayed in an organic light emitting display having a wide color gamut, a color of the image is reproduced to be substantially equal to an actual color.
  • In exemplary embodiments of an organic light emitting display, an image that is generated with the standardized sRGB color gamut under daylight 6500K (“D65”) may be effectively displayed with an accurate color.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other features of the invention will become more apparent by describing in further detail exemplary embodiments thereof with reference to the accompanying drawings, in which:
  • FIG. 1 is a block diagram illustrating an exemplary embodiment of an image processing device according to the invention;
  • FIG. 2 is a flowchart illustrating an exemplary embodiment of a method of calculating a first color space conversion matrix of an image processing device according to the invention;
  • FIG. 3 is a flowchart illustrating an exemplary embodiment of a method of calculating a second color space conversion matrix of an image processing device according to the invention;
  • FIG. 4 is a flowchart illustrating an exemplary embodiment of a method of processing an image in a first image processing device and a second image processing device according to the invention; and
  • FIG. 5 is a flowchart illustrating an alternative exemplary embodiment of a method of processing an image in a first image processing device and a second image processing device according to the invention.
  • DETAILED DESCRIPTION
  • The invention will be described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown. This invention may, however, be embodied in many different forms, and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like reference numerals refer to like elements throughout.
  • It will be understood that when an element or layer is referred to as being “on,” “connected to” or “coupled to” another element or layer, the element or layer can be directly on, connected or coupled to the other element or layer or intervening elements or layers may be present. In contrast, when an element is referred to as being “directly on,” “directly connected to” or “directly coupled to” another element or layer, there are no intervening elements or layers present. Like numbers refer to like elements throughout. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
  • It will be understood that, although the terms first, second, third, etc., may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the invention.
  • Spatially relative terms, such as “beneath”, “below”, “lower”, “above”, “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation, in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the exemplary term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
  • The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “includes” and/or “including,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
  • Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
  • Embodiments of the invention are described herein with reference to cross-section illustrations that are schematic illustrations of idealized embodiments (and intermediate structures) of the invention. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, embodiments of the invention should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. For example, a region illustrated or described as flat may, typically, have rough and/or nonlinear features. Moreover, sharp angles that are illustrated may be rounded. Thus, the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the precise shape of a region and are not intended to limit the scope of the claims set forth herein.
  • All methods described herein can be performed in a suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., “such as”), is intended merely to better illustrate the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention as used herein.
  • Hereinafter, exemplary embodiments of the invention will be described in further detail with reference to the accompanying drawings.
  • FIG. 1 is a block diagram illustrating an exemplary embodiment of an image processing device according to the invention.
  • Referring to FIG. 1, an exemplary embodiment of an image processing device 100 includes a camera sensor unit 101, a pre-processing unit 102, a white balancing unit 103, a demosaicking unit 104, a gamma correction unit 105, a first color space conversion unit 106, a post-processing unit 107, a display unit 108, a data storage unit 109, a second color space conversion unit 110 and a data transmitting and receiving unit 111.
  • In an exemplary embodiment, the camera sensor unit 101 includes a color filter and an image sensor, such as a complementary metal-oxide semiconductor (“CMOS”) or a charge coupled device (“CCD”), for example. An image that is acquired by an image sensor such as a CMOS or a CCD is a monochrome image and thus a color filter for passing through only a predetermined frequency band in a visible spectrum may be provided, e.g., disposed, at a front end portion of the image sensor to acquire a color image. The color filter allows an image having a specific frequency band in an image of a subject to pass therethrough. The color filter is divided into a plurality of areas, and each area allows only a video signal having a frequency band substantially the same as a frequency band of one color of three colors of red, green and blue to pass therethrough. In one exemplary embodiment, for example, the color filter includes a Bayer filter mosaic. The video signal that passes through the color filter is transferred to an image sensor such as a CMOS or a CCD. The image sensor converts the received video signal into an electrical signal. In such an embodiment, the electrical signal corresponds to the video signal that passes through the color filter, such that the electrical signal includes a red channel value, a green channel value and a blue channel value. In such an embodiment, a bayer image (e.g., a Bayer pattern image) including an array of a red channel value, a green channel value and a blue channel value is generated.
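  • As a concrete illustration of the bayer image described above, the short sketch below simulates the single-channel mosaic that a sensor behind a color filter would record; the RGGB cell layout and all function names are assumptions made for the example and are not specified in this disclosure.

```python
import numpy as np

def rggb_masks(h, w):
    """Boolean masks for an assumed RGGB filter layout (illustrative only)."""
    r = np.zeros((h, w), dtype=bool)
    g = np.zeros((h, w), dtype=bool)
    b = np.zeros((h, w), dtype=bool)
    r[0::2, 0::2] = True   # red samples on even rows, even columns
    g[0::2, 1::2] = True   # green samples on even rows, odd columns
    g[1::2, 0::2] = True   # green samples on odd rows, even columns
    b[1::2, 1::2] = True   # blue samples on odd rows, odd columns
    return r, g, b

def make_bayer(rgb):
    """Collapse an H x W x 3 color image into the single-channel bayer image
    that the image sensor would output behind the color filter."""
    h, w, _ = rgb.shape
    r, g, b = rggb_masks(h, w)
    bayer = np.zeros((h, w), dtype=rgb.dtype)
    bayer[r] = rgb[..., 0][r]
    bayer[g] = rgb[..., 1][g]
    bayer[b] = rgb[..., 2][b]
    return bayer
```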
  • In an exemplary embodiment, the pre-processing unit 102 adjusts a black level of the bayer image and removes noise in the bayer image.
  • In such an embodiment, the white balancing unit 103 performs white balancing of the bayer image such that color balance is adjusted for accurate color reproduction. Through white balancing, a color that is distorted by the lighting in the bayer image may be corrected into a color as a human would perceive it.
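  • The disclosure does not name a particular white balancing algorithm; the sketch below uses the common gray-world assumption on the bayer samples, with the channel masks treated as given, purely as an illustration.

```python
import numpy as np

def gray_world_white_balance(bayer, r_mask, g_mask, b_mask):
    """Illustrative gray-world white balance on a bayer image: scale the red
    and blue samples so that each channel's mean matches the green mean."""
    out = bayer.astype(np.float64).copy()
    g_mean = out[g_mask].mean()
    out[r_mask] *= g_mean / out[r_mask].mean()
    out[b_mask] *= g_mean / out[b_mask].mean()
    return out
```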
  • In an exemplary embodiment, the demosaicking unit 104 converts the bayer image into a green image, a red image and a blue image. In such an embodiment, the demosaicking unit 104 may convert the bayer image having a single channel into a color image having 3 channels.
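  • A minimal demosaicking sketch, assuming simple bilinear interpolation (the disclosure does not mandate a particular interpolation scheme): each missing sample is filled with the average of the recorded neighbors of the same color, turning the single-channel bayer image into a 3-channel image.

```python
import numpy as np
from scipy.signal import convolve2d

def bilinear_demosaic(bayer, r_mask, g_mask, b_mask):
    """Illustrative bilinear demosaicking of a bayer image into an
    H x W x 3 image (red, green, blue)."""
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]], float) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], float) / 4.0

    def interp(mask, kernel):
        sparse = np.where(mask, bayer, 0.0)   # keep only this channel's samples
        return convolve2d(sparse, kernel, mode="same", boundary="symm")

    red   = interp(r_mask, k_rb)
    green = interp(g_mask, k_g)
    blue  = interp(b_mask, k_rb)
    return np.stack([red, green, blue], axis=-1)
```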
  • In an exemplary embodiment, the gamma correction unit 105 performs gamma correction of the green image, the red image and the blue image based on a visual perceptual property of humans. In such an embodiment, the gamma correction unit 105 may nonlinearly convert each of the green image, the red image and the blue image using a nonlinear transfer function based on a nonlinear visual perceptual property of humans.
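  • A minimal sketch of the gamma correction step, assuming a plain power-law transfer function with exponent 1/2.2; the exact nonlinear transfer function is not fixed by the disclosure.

```python
import numpy as np

def gamma_encode(linear_rgb, gamma=2.2):
    """Illustrative gamma correction: nonlinearly encode linear channel values
    (normalized to [0, 1]) with a simple power-law transfer function."""
    return np.clip(linear_rgb, 0.0, 1.0) ** (1.0 / gamma)
```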
  • Hereinafter, data of the green image, data of the red image, and data of the blue image, in which gamma correction is performed will be collectively referred to as “camera RGB data.” The camera RGB data includes a color coordinate of a color in a color gamut of a camera.
  • In an exemplary embodiment, the first color space conversion unit 106 converts the camera RGB data into display R′G′B′ data using a first color space conversion matrix. The display R′G′B′ data include green image data, red image data and blue image data based on color gamut characteristics of the display unit 108. The display R′G′B′ data includes a color coordinate of a color in a color gamut of the display. In such an embodiment, the first color space conversion matrix converts the camera RGB data into the display R′G′B′ data based on color gamut characteristics of a display such that a color of an image that is photographed by a color gamut of a camera is displayed with substantially the same color without distortion in a display. An exemplary embodiment of a method of calculating the first color space conversion matrix will be described later in greater detail with reference to FIG. 2.
  • In such an embodiment, the first color space conversion unit 106 may inversely convert the display R′G′B′ data into the camera RGB data using an inverse matrix of the first color space conversion matrix.
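  • The conversion performed by the first color space conversion unit 106, and its inverse, amounts to multiplying each pixel's 3-component color vector by a 3×3 matrix; a minimal sketch with illustrative names follows.

```python
import numpy as np

def apply_csc(image, m):
    """Apply a 3x3 color space conversion matrix m to every pixel of an
    H x W x 3 image, e.g. camera RGB -> display R'G'B'."""
    return np.einsum("ij,hwj->hwi", m, image)

# display_rgb = apply_csc(camera_rgb, M)                  # forward conversion
# camera_rgb  = apply_csc(display_rgb, np.linalg.inv(M))  # inverse conversion
```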
  • In an exemplary embodiment, the post-processing unit 107 performs a post-processing that adjusts, e.g., improves, an edge, a contrast and saturation of the display R′G′B′ data.
  • The display unit 108 displays an image using the display R′G′B′ data, in which a post-processing is performed. In one exemplary embodiment, for example, the display unit 108 may be an organic light emitting display, and the organic light emitting display may have a color gamut of 110% based on an NTSC.
  • In an exemplary embodiment, the data storage unit 109 stores the display R′G′B′ data that is converted in the first color space conversion unit 106. In such an embodiment, the display R′G′B′ data corresponding to color gamut characteristics of the display unit 108 may be stored at the data storage unit 109.
  • In an exemplary embodiment, the second color space conversion unit 110 converts the camera RGB data to sRGB data using a second color space conversion matrix. The sRGB data includes green image data, red image data and blue image data in an sRGB color gamut under daylight 6500K (“D65”). The sRGB color gamut corresponds to 70% color gamut based on an NTSC. The sRGB data includes a color coordinate of a color in the sRGB color gamut. The second color space conversion matrix converts the camera RGB data into the sRGB data in an sRGB color gamut, which is a standard color gamut to transmit and receive a photographed image. An exemplary embodiment of a method of calculating the second color space conversion matrix will be described later in greater detail with reference to FIG. 3.
  • In such an embodiment, the second color space conversion unit 110 may inversely convert the sRGB data into the camera RGB data using an inverse matrix of the second color space conversion matrix.
  • As described above, a color gamut of a photographed image may be different from the sRGB color gamut; in some embodiments, however, the color gamut of the photographed image may be substantially the same as the sRGB color gamut. In such an embodiment, the camera RGB data may be substantially the same as the sRGB data of the sRGB color gamut, and the second color space conversion unit 110 may be omitted.
  • The data transmitting and receiving unit 111 transmits the sRGB data to another image processing device or receives sRGB data from another image processing device. The other image processing device may use its own camera color gamut and display color gamut, so image data that is transmitted to and received from an image processing device is desirably data of a standard color gamut that is widely used by arbitrary image processing devices. Therefore, image data that image processing devices including the data transmitting and receiving unit 111 transmit and receive may be converted into sRGB data of the sRGB color gamut.
  • FIG. 2 is a flowchart illustrating an exemplary embodiment of a method of calculating a first color space conversion matrix of an image processing device according to the invention.
  • Referring to FIG. 2, in an exemplary embodiment, the first color space conversion matrix of the image processing device 100 may be an RGB to R′G′B′ matrix that converts the camera RGB data into the display R′G′B′ data.
  • A predetermined number of colors, e.g., N number of colors, are photographed by the camera sensor unit 101 of the image processing device 100 (S110). Here, N is a natural number. The N number of colors may be selected arbitrarily. In an exemplary embodiment, for example, the N number of colors may be selected as red, green and blue for an accurate calculation of the first color space conversion matrix. In an alternative exemplary embodiment, the N number of colors may further include other colors, such as cyan, magenta, yellow and white, in addition to red, green and blue, for a more accurate calculation of the first color space conversion matrix.
  • In an exemplary embodiment, the photographed image is processed in the pre-processing unit 102, the white balancing unit 103, the demosaicking unit 104 and the gamma correction unit 105, and the photographed image is generated into camera RGB data (S120). In such an embodiment, a pre-processing process, a white balancing process, a demosaicking process and a gamma correction process of a bayer image that is formed in the camera sensor unit 101 may be performed, and the camera RGB data of the N number of colors is thereby generated.
  • The N number of colors are displayed in the display unit 108, and the N number of colors may be measured by a spectrophotometer (S130).
  • When intensity of each wavelength of the N number of colors that are displayed in the display unit 108 is measured by the spectrophotometer, a tristimulus value XYZ of each of the N number of colors is calculated (S140).
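  • A minimal sketch of the tristimulus calculation, assuming the spectrophotometer returns the spectral intensity sampled at fixed wavelength steps and that sampled CIE 1931 color matching functions are available (they are not reproduced here); absolute normalization constants are omitted.

```python
import numpy as np

def tristimulus_xyz(intensity, cmf, delta_lambda):
    """Numerically integrate a measured spectrum against the CIE 1931 color
    matching functions to obtain a tristimulus value XYZ.
    intensity    : shape (W,)   measured spectral intensity per wavelength
    cmf          : shape (W, 3) sampled x-bar, y-bar, z-bar values
    delta_lambda : wavelength sampling step in nm"""
    return delta_lambda * (intensity @ cmf)   # -> array([X, Y, Z])
```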
  • The tristimulus value XYZ of each of the N number of colors is converted into display R′G′B′ data using an XYZ to R′G′B′ inverse conversion matrix (S150). An R′G′B′ to XYZ conversion matrix that reflects the color gamut characteristics of the display may be calculated based on the measurement using the spectrophotometer, and the XYZ to R′G′B′ inverse conversion matrix is the inverse matrix of that R′G′B′ to XYZ conversion matrix.
  • The display R′G′B′ data of the N number of colors are generated using the XYZ to R′G′B′ inverse conversion matrix (S160).
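  • A sketch of one way to obtain the XYZ to R′G′B′ inverse conversion matrix, assuming the R′G′B′ to XYZ conversion matrix is assembled with the measured tristimulus values of the display driven at full red, full green and full blue as its columns, and assuming additive behavior in those drive values; the function name is illustrative.

```python
import numpy as np

def xyz_to_display_matrix(xyz_red, xyz_green, xyz_blue):
    """Build the R'G'B' -> XYZ conversion matrix from the measured XYZ of the
    display's red, green and blue primaries (as columns), then invert it to
    obtain the XYZ -> R'G'B' inverse conversion matrix."""
    rgb_to_xyz = np.column_stack([xyz_red, xyz_green, xyz_blue])   # 3 x 3
    return np.linalg.inv(rgb_to_xyz)

# display_rgb = xyz_to_display_matrix(xyz_r, xyz_g, xyz_b) @ measured_xyz
```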
  • The RGB to R′G′B′ matrix is calculated based on a relationship between the camera RGB data of the N number of colors and the display R′G′B′ data of the N number of colors (S170).
  • In an exemplary embodiment, the camera RGB data of the N number of colors and the display R′G′B′ data of the N number of colors may be represented by N×3 matrices including a first N×3 matrix and a second N×3 matrix, and the camera RGB data of the N number of colors and the display R′G′B′ data of the N number of colors may be represented by Equation 1.
  • $$A = \begin{bmatrix} RC_1 & GC_1 & BC_1 \\ RC_2 & GC_2 & BC_2 \\ \vdots & \vdots & \vdots \\ RC_N & GC_N & BC_N \end{bmatrix}, \qquad B = \begin{bmatrix} RD_1 & GD_1 & BD_1 \\ RD_2 & GD_2 & BD_2 \\ \vdots & \vdots & \vdots \\ RD_N & GD_N & BD_N \end{bmatrix} \qquad \text{(Equation 1)}$$
  • In Equation 1, A represents an N×3 matrix of the camera RGB data of the N number of colors (i.e., the first N×3 matrix), and B represents an N×3 matrix of the display R′G′B′ data of the N number of colors (i.e., the second N×3 matrix). RC1, GC1 and BC1 denote a color coordinate of a first color in the camera RGB data, RC2, GC2 and BC2 denote a color coordinate of a second color in the camera RGB data, and RCN, GCN and BCN denote a color coordinate of the N-th color in the camera RGB data. RD1, GD1 and BD1 denote a color coordinate of the first color in the display R′G′B′ data, RD2, GD2 and BD2 denote a color coordinate of the second color in the display R′G′B′ data, and RDN, GDN and BDN denote a color coordinate of the N-th color in the display R′G′B′ data.
  • When an RGB to R′G′B′ matrix, e.g., a first color space conversion matrix is represented with M, a relationship of A, B and M satisfy the following Equation 2.

  • $M \cdot A^{T} = B^{T}$  (Equation 2)
  • In Equation 2, $A^{T}$ denotes the transpose of A, and $B^{T}$ denotes the transpose of B.
  • In Equation 2, multiplying both sides on the right by A, which is the transpose of $A^{T}$, and then by the inverse matrix of $(A^{T} \cdot A)$, that is, $(A^{T} \cdot A)^{-1}$, shows that the first color space conversion matrix M satisfies the following Equation 3.

  • $M = B^{T} \cdot A \cdot (A^{T} \cdot A)^{-1}$  (Equation 3)
  • As described above, in an exemplary embodiment where the camera RGB data of the N number of colors is represented by A, an N×3 matrix, and the display R′G′B′ data of the N number of colors, calculated by measuring the N number of colors displayed in the display unit 108 with the spectrophotometer, is represented by B, an N×3 matrix, the first color space conversion matrix may be calculated from the relationship in which the product of the first color space conversion matrix M and the transpose of A is the transpose of B, as shown in Equation 2.
  • The calculated first color space conversion matrix may be used for converting the camera RGB data into the display R′G′B′ data in the first color space conversion unit 106 of FIG. 1.
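  • A minimal sketch of the calculation of Equation 3, assuming the N measured colors are already arranged row-wise in the two N×3 matrices; the numeric values are placeholders, not measured data.

```python
import numpy as np

def fit_csc_matrix(A, B):
    """Equation 3: M = B^T . A . (A^T . A)^(-1), the least-squares solution of
    M . A^T = B^T for the 3x3 first color space conversion matrix.
    A: N x 3 camera RGB coordinates of the N colors
    B: N x 3 display R'G'B' coordinates of the same colors"""
    return B.T @ A @ np.linalg.inv(A.T @ A)

# Placeholder example with N = 3 colors:
A = np.array([[0.9, 0.1, 0.1],
              [0.1, 0.8, 0.2],
              [0.1, 0.1, 0.9]])
B = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
M = fit_csc_matrix(A, B)   # 3 x 3 first color space conversion matrix
```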
  • FIG. 3 is a flowchart illustrating an exemplary embodiment of a method of calculating a second color space conversion matrix of an image processing device according to the invention.
  • Referring to FIG. 3, a second color space conversion matrix of the image processing device 100 is an RGB to sRGB matrix that converts the camera RGB data into the sRGB data.
  • The predetermined number of colors, e.g., N number of colors, are photographed by the camera sensor unit 101 of the image processing device 100 (S210). The N number of colors may be selected arbitrarily. In one exemplary embodiment, for example, the N number of colors may be selected as red, green and blue for an accurate calculation of the second color space conversion matrix. In an alternative exemplary embodiment, the N number of colors may further include other colors, such as cyan, magenta, yellow and white, in addition to red, green and blue, to more accurately calculate the second color space conversion matrix.
  • In an exemplary embodiment, the photographed image is processed in the pre-processing unit 102, the white balancing unit 103, the demosaicking unit 104 and the gamma correction unit 105, and is generated into the camera RGB data (S220). In such an embodiment, a pre-processing process, a white balancing process, a demosaicking process and a gamma correction process of a bayer image that is formed in the camera sensor unit 101 may be performed, and camera RGB data of the N number of colors are thereby generated.
  • In an exemplary embodiment, sRGB data of the predetermined N number of colors is generated (S230). The sRGB color gamut is a standardized color gamut, and the sRGB data of the predetermined N number of colors may be derived from a look-up table of the sRGB color gamut.
  • An RGB to sRGB matrix is calculated from a relationship between the camera RGB data of the N number of colors and the sRGB data of the N number of colors (S240). The RGB to sRGB matrix, e.g., the second color space conversion matrix, may be derived with the same method as described with reference to Equations 1 to 3 and FIG. 2. In such an embodiment, when the camera RGB data of the N number of colors is represented with an N×3 matrix, e.g., the first N×3 matrix A, and the sRGB data of the N number of colors in the sRGB color gamut is represented with an N×3 matrix, e.g., a third N×3 matrix B′, the second color space conversion matrix is calculated from the relationship in which the product of the second color space conversion matrix and the transpose of the first N×3 matrix A is the transpose of the third N×3 matrix B′.
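  • The same least-squares relationship yields the second color space conversion matrix when the target matrix holds sRGB coordinates instead of display R′G′B′ coordinates; a short sketch with placeholder values follows.

```python
import numpy as np

# A  : N x 3 camera RGB coordinates of the N colors (placeholder values)
# Bp : N x 3 sRGB coordinates of the same colors, e.g. taken from a look-up
#      table of the sRGB color gamut (placeholder values)
A  = np.array([[0.9, 0.1, 0.1],
               [0.1, 0.8, 0.2],
               [0.1, 0.1, 0.9]])
Bp = np.array([[1.0, 0.0, 0.0],
               [0.0, 1.0, 0.0],
               [0.0, 0.0, 1.0]])
M2 = Bp.T @ A @ np.linalg.inv(A.T @ A)       # second color space conversion matrix
srgb_pixel = M2 @ np.array([0.5, 0.4, 0.3])  # camera RGB column -> sRGB
```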
  • Hereinafter, an exemplary embodiment of a method of processing image data in a process in which the image processing devices 100 transmit and receive image data will be described.
  • FIG. 4 is a flowchart illustrating an exemplary embodiment of a method of processing an image in a first image processing device and a second image processing device according to the invention.
  • FIG. 4 illustrates an exemplary embodiment of a method including transmitting an image that is photographed in the first image processing device to the second image processing device.
  • In an exemplary embodiment, an image that is photographed in the first image processing device is generated into first camera RGB data (S310). In such an embodiment, an image that is photographed by the camera sensor unit of the first image processing device may be generated into the first camera RGB data through a pre-processing unit, a white balancing unit, a demosaicking unit and a gamma correction unit of the first image processing device. In such an embodiment, in the first image processing device, the first camera RGB data is converted into the first display R′G′B′ data using the first color space conversion matrix of the first image processing device, and the first display R′G′B′ data may be displayed as an image through a display unit of the first image processing device.
  • In an exemplary embodiment, the first camera RGB data is generated into sRGB data using a second color space conversion matrix of the first image processing device (S320). In such an embodiment, the first camera RGB data may be converted into sRGB data through the second color space conversion unit of the first image processing device. A second color space conversion matrix of the first image processing device converts the first camera RGB data into sRGB data.
  • The sRGB data is transmitted to the second image processing device through the data transmitting and receiving unit of the first image processing device (S330).
  • In an exemplary embodiment, the sRGB data is received in the second image processing device (S340). In such an embodiment, the second image processing device may receive the sRGB data through the data transmitting and receiving unit.
  • In an exemplary embodiment, the sRGB data is generated into second camera RGB data using an inverse matrix of the second color space conversion matrix of the second image processing device (S350). In such an embodiment, sRGB data may be converted into the second camera RGB data through the second color space conversion unit of the second image processing device. The second color space conversion matrix of the second image processing device converts the second camera RGB data into sRGB data, and an inverse matrix thereof converts sRGB data into second camera RGB data.
  • In an exemplary embodiment, the second camera RGB data is generated into second display R′G′B′ data using the first color space conversion matrix of the second image processing device (S360). In such an embodiment, the second camera RGB data is converted into the second display R′G′B′ data through the first color space conversion unit of the second image processing device. In the second image processing device, the first color space conversion matrix converts the second camera RGB data into the second display R′G′B′ data. The second display R′G′B′ data may be displayed as an image through a display unit of the second image processing device.
  • In an exemplary embodiment, as described above, the first image processing device converts the first camera RGB data into sRGB data of the standardized sRGB color gamut and transmits the sRGB data, and the second image processing device converts the sRGB data into second camera RGB data, converts the second camera RGB data into the second display R′G′B′ data, and displays the second display R′G′B′ data in the display unit, such that image processing devices having different camera color gamuts and display color gamuts may transmit and receive image data without distortion of a color.
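  • The FIG. 4 flow can be summarized as a chain of 3×3 matrix products; the sketch below uses hypothetical per-device matrix names (M2_dev1, M2_dev2, M1_dev2) only to make the step numbers concrete.

```python
import numpy as np

def relay_photographed_image(camera_rgb_1, M2_dev1, M2_dev2, M1_dev2):
    """Sketch of the FIG. 4 flow on a single pixel (column vector):
    S320: device 1 converts its camera RGB to sRGB with its second matrix,
    S350: device 2 undoes its own second matrix to get its camera RGB,
    S360: device 2 applies its first matrix to get display R'G'B'."""
    srgb = M2_dev1 @ camera_rgb_1
    camera_rgb_2 = np.linalg.inv(M2_dev2) @ srgb
    return M1_dev2 @ camera_rgb_2
```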
  • FIG. 5 is a flowchart illustrating an alternative exemplary embodiment of a method of processing an image in a first image processing device and a second image processing device according to the invention.
  • FIG. 5 illustrates an exemplary embodiment of a method including transmitting an image that is stored at the first image processing device to the second image processing device.
  • In an exemplary embodiment, an image that is photographed in the first image processing device may be stored as first display R′G′B′ data that may be displayed in a display unit of the first image processing device (S410). In such an embodiment, an image that is photographed by the camera sensor unit of the first image processing device is generated into first camera RGB data through a pre-processing unit, a white balancing unit, a demosaicking unit and a gamma correction unit of the first image processing device. The first camera RGB data is converted into first display R′G′B′ data through a first color space conversion unit of the first image processing device, and the first display R′G′B′ data may be displayed as an image through a display unit of the first image processing device. In an exemplary embodiment, the first display R′G′B′ data that is converted through the first color space conversion unit may be stored at a data storage unit of the first image processing device.
  • In an exemplary embodiment, the first display R′G′B′ data that is stored at the first image processing device is generated into first camera RGB data using an inverse matrix of the first color space conversion matrix of the first image processing device (S420). In such an embodiment, the first display R′G′B′ data may be converted into first camera RGB data through the first color space conversion unit of the first image processing device. The first color space conversion matrix of the first image processing device converts the first camera RGB data into first display R′G′B′ data, and an inverse matrix thereof converts the first display R′G′B′ data into first camera RGB data.
  • In an exemplary embodiment, the first camera RGB data is generated into sRGB data using a second color space conversion matrix of the first image processing device (S430). In such an embodiment, the first camera RGB data is converted into sRGB data through a second color space conversion unit of the first image processing device. The second color space conversion matrix of the first image processing device converts the first camera RGB data into sRGB data.
  • The sRGB data is transmitted to the second image processing device through a data transmitting and receiving unit of the first image processing device (S440).
  • The sRGB data is received in the second image processing device (S450). In such an embodiment, the second image processing device may receive sRGB data through the data transmitting and receiving unit.
  • The sRGB data is generated into second camera RGB data using an inverse matrix of a second color space conversion matrix of the second image processing device (S460). In such an embodiment, the sRGB data is converted into second camera RGB data through the second color space conversion unit of the second image processing device. The second color space conversion matrix of the second image processing device converts the second camera RGB data into sRGB data, and an inverse matrix thereof converts the sRGB data into second camera RGB data.
  • The second camera RGB data is generated into second display R′G′B′ data using the first color space conversion matrix of the second image processing device (S470). In such an embodiment, the second camera RGB data is converted into second display R′G′B′ data through the first color space conversion unit of the second image processing device. In such an embodiment, the first color space conversion matrix may convert the second camera RGB data into second display R′G′B′ data in the second image processing device. The second display R′G′B′ data may be displayed as an image through a display unit of the second image processing device.
  • In an exemplary embodiment, as described above, after the first display R′G′B′ data that is stored at the first image processing device is converted into first camera RGB data, the first camera RGB data is converted into sRGB data of the standardized sRGB color gamut and transmitted, the second image processing device converts the sRGB data into second camera RGB data and converts the second camera RGB data into second display R′G′B′ data, and the display unit of the second image processing device displays an image based on the second display R′G′B′ data. Accordingly, in such an embodiment, image processing devices having different camera color gamuts and display color gamuts may transmit and receive image data without distortion of a color.
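  • For the stored-image path of FIG. 5, the whole chain collapses into one composite 3×3 matrix; again, the per-device matrix names in the sketch below are hypothetical.

```python
import numpy as np

def relay_stored_image(display_rgb_1, M1_dev1, M2_dev1, M1_dev2, M2_dev2):
    """Sketch of the FIG. 5 flow on a single pixel (column vector): the stored
    display R'G'B' of device 1 is taken back to camera RGB (S420), converted
    to sRGB (S430), and re-mapped on device 2 (S460, S470)."""
    composite = (M1_dev2
                 @ np.linalg.inv(M2_dev2)
                 @ M2_dev1
                 @ np.linalg.inv(M1_dev1))
    return composite @ display_rgb_1
```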
  • While this invention has been described in connection with what is presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments, but, on the contrary, is intended to cover various modifications and equivalent arrangements within the spirit and scope of the appended claims.

Claims (22)

What is claimed is:
1. An image processing device, comprising:
a camera sensor unit which generates a bayer image;
a demosaicking unit which converts the bayer image into a red image, a green image and a blue image;
a first color space conversion unit which converts first camera RGB data comprising data of the red image, the green image and the blue image into first display R′G′B′ data using a first color space conversion matrix; and
a display unit which displays an image using the first display R′G′B′ data,
wherein the first color space conversion matrix is calculated from a relationship, in which the product of the first color space conversion matrix and a transpose of a first N×3 matrix is a transpose of a second N×3 matrix,
wherein the first N×3 matrix represents camera RGB data of N number of colors, and the second N×3 matrix represents display R′G′B′ data of the N number of colors, which are calculated by measuring the N number of colors displayed in the display unit using a spectrophotometer.
2. The image processing device of claim 1, further comprising:
a second color space conversion unit which converts the first camera RGB data into first sRGB data comprising red image data, green image data and blue image data in an sRGB color gamut using a second color space conversion matrix.
3. The image processing device of claim 2, wherein
the second color space conversion matrix is calculated from a relationship, in which the product of the second color space conversion matrix and a transpose of the first N×3 matrix, is a transpose of a third N×3 matrix,
wherein the third N×3 matrix represents sRGB data of the N number of colors in the sRGB color gamut.
4. The image processing device of claim 2, further comprising:
a data transmitting and receiving unit which transmits the first sRGB data and receives second sRGB data.
5. The image processing device of claim 4, wherein
the second color space conversion unit converts the second sRGB data into second camera RGB data using an inverse matrix of the second color space conversion matrix,
the first color space conversion unit converts the second camera RGB data into second display R′G′B′ data using the first color space conversion matrix, and
the display unit displays an image using the second display R′G′B′ data.
6. The image processing device of claim 4, further comprising:
a data storage unit which stores the first display R′G′B′ data.
7. The image processing device of claim 6, wherein
the first color space conversion unit converts the first display R′G′B′ data, which are stored at the data storage unit, into the first camera RGB data using an inverse matrix of the first color space conversion matrix.
8. The image processing device of claim 1, further comprising:
a pre-processing unit which adjusts a black level of the bayer image and removes noise in the bayer image.
9. The image processing device of claim 1, further comprising:
a white balancing unit which performs white balancing of the bayer image.
10. The image processing device of claim 1, further comprising:
a gamma correction unit which performs gamma correction of the red image, the green image and the blue image based on a visual perceptual property of humans.
11. The image processing device of claim 1, further comprising:
a post-processing unit which performs a post-processing for adjusting an edge, a contrast and saturation of the first display R′G′B′ data.
12. A method of processing an image, the method comprising:
generating a bayer image comprising a red channel value, a green channel value and a blue channel value;
converting the bayer image into a red image, a green image and a blue image;
converting camera RGB data comprising data of the red image, the green image and the blue image into display R′G′B′ data using a first color space conversion matrix; and
displaying an image in a display unit using the display R′G′B′ data,
wherein the first color space conversion matrix is calculated from a relationship, in which the product of the first color space conversion matrix and a transpose of a first N×3 matrix, is a transpose of a second N×3 matrix,
wherein the first N×3 matrix represents camera RGB data of N number of colors, and the second N×3 matrix represents display R′G′B′ data of the N number of colors, which are calculated by measuring the N number of colors displayed in the display unit using a spectrophotometer.
13. The method of claim 12, wherein the generating the bayer image comprises:
adjusting a black level of the bayer image and removing noise in the bayer image; and
performing white balancing of the bayer image.
14. The method of claim 12, wherein the converting the bayer image comprises:
performing gamma correction of the red image, the green image and the blue image based on a visual perceptual property of humans.
15. The method of claim 12, wherein the converting the camera RGB data comprises:
performing a post-processing for adjusting an edge, a contrast and saturation of the display R′G′B′ data.
16. The method of claim 12, further comprising:
converting the camera RGB data into sRGB data comprising red image data, green image data and blue image data in an sRGB color gamut using a second color space conversion matrix; and
transmitting the sRGB data through a data transmitting and receiving unit.
17. The method of claim 16, wherein
the second color space conversion matrix is calculated from a relationship, in which the product of the second color space conversion matrix and a transpose of the first N×3 matrix, is a transpose of a third N×3 matrix,
wherein the third N×3 matrix represents sRGB data of the N number of colors in the sRGB color gamut.
18. The method of claim 12, further comprising:
storing the display R′G′B′ data at a data storage unit.
19. A method of processing an image, the method comprising:
receiving sRGB data comprising red image data, green image data and blue image data in an sRGB color gamut through a data transmitting and receiving unit;
converting the sRGB data into camera RGB data using an inverse matrix of a second color space conversion matrix; and
converting the camera RGB data into display R′G′B′ data using a first color space conversion matrix,
wherein the first color space conversion matrix is calculated from a relationship, in which the product of the first color space conversion matrix and a transpose of a first N×3 matrix, is a transpose of a second N×3 matrix,
wherein the first N×3 matrix represents camera RGB data of N number of colors, and the second N×3 matrix represents display R′G′B′ data of the N number of colors, which are calculated by measuring the N number of colors displayed in the display unit using a spectrophotometer.
20. The method of claim 19, wherein
the second color space conversion matrix is calculated from a relationship, in which the product of the second color space conversion matrix and a transpose of the first N×3 matrix, is a transpose of a third N×3 matrix,
wherein the third N×3 matrix represents sRGB data of the N number of colors in the sRGB color gamut.
21. The method of claim 19, wherein the receiving the sRGB data comprises:
generating a bayer image comprising a red channel value, a green channel value and a blue channel value in a first image processing device;
converting the bayer image into a red image, a green image and a blue image;
converting camera RGB data comprising data of the red image, the green image and the blue image into the sRGB data using the second color space conversion matrix in the first image processing device; and
receiving the sRGB data from the first image processing device.
22. The method of claim 19, wherein the receiving the sRGB data comprises:
converting first display R′G′B′ data, which is stored at the first image processing device, into first camera RGB data using an inverse matrix of the first color space conversion matrix in the first image processing device;
converting the first camera RGB data into the sRGB data using the second color space conversion matrix in the first image processing device; and
receiving the sRGB data from the first image processing device.
US14/022,673 2013-05-10 2013-09-10 Device and method for processing image for substantially accurately reproduced color images from a camera Active 2033-11-09 US9140608B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2013-0053222 2013-05-10
KR1020130053222A KR20140133272A (en) 2013-05-10 2013-05-10 Device for image processing and method thereof

Publications (2)

Publication Number Publication Date
US20140333797A1 true US20140333797A1 (en) 2014-11-13
US9140608B2 US9140608B2 (en) 2015-09-22

Family

ID=51864520

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/022,673 Active 2033-11-09 US9140608B2 (en) 2013-05-10 2013-09-10 Device and method for processing image for substantially accurately reproduced color images from a camera

Country Status (2)

Country Link
US (1) US9140608B2 (en)
KR (1) KR20140133272A (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102581850B1 (en) * 2016-11-30 2023-09-22 엘지디스플레이 주식회사 Method and apparatus for generating compensation data of display panel


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100278642B1 (en) 1998-10-22 2001-01-15 윤종용 Color image processing apparatus and method
KR20050006424A (en) 2003-07-08 2005-01-17 삼성전자주식회사 Color signal processing method and photographing apparatus of using the same
KR101279086B1 (en) 2007-06-01 2013-06-27 삼성전자주식회사 Image forming apparatus and method for color converting
KR20140133272A (en) * 2013-05-10 2014-11-19 삼성디스플레이 주식회사 Device for image processing and method thereof

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050088550A1 (en) * 2003-10-23 2005-04-28 Tomoo Mitsunaga Image processing apparatus and image processing method, and program
US20060104505A1 (en) * 2004-11-15 2006-05-18 Chih-Lung Chen Demosaicking method and apparatus for color filter array interpolation in digital image acquisition systems
US7420663B2 (en) * 2005-05-24 2008-09-02 Bwt Property Inc. Spectroscopic sensor on mobile phone
US20070053607A1 (en) * 2005-08-11 2007-03-08 Tomoo Mitsunaga Image processing apparatus and method, recording medium, and program
US20070081182A1 (en) * 2005-10-07 2007-04-12 Seiko Epson Corporation Printer and image processing apparatus
US20080247662A1 (en) * 2007-04-05 2008-10-09 Fumihito Yasuma Image Processing Device
US8837821B2 (en) * 2010-05-11 2014-09-16 Olympus Corporation Image processing apparatus, image processing method, and computer readable recording medium
US8644605B2 (en) * 2011-02-28 2014-02-04 International Business Machines Corporation Mapping colors of an image

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Stone et al, RAW File Treatment and Color Management, 05/12/2012, https://web.archive.org/web/20120512113748/http://docs.kde.org/development/en/extragear-graphics/digikam/raw-decoding.html *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9140608B2 (en) * 2013-05-10 2015-09-22 Samsung Display Co., Ltd. Device and method for processing image for substantially accurately reproduced color images from a camera
CN107092842A (en) * 2017-04-14 2017-08-25 苏州三星电子电脑有限公司 Secrecy display method and private display device
US20190355098A1 (en) * 2018-05-18 2019-11-21 Gopro, Inc. Multiscale denoising of raw images with noise estimation
US10902558B2 (en) * 2018-05-18 2021-01-26 Gopro, Inc. Multiscale denoising of raw images with noise estimation
CN112399254A (en) * 2019-08-18 2021-02-23 海信视像科技股份有限公司 Display device and color gamut space dynamic adjustment method
US10964240B1 (en) * 2019-10-23 2021-03-30 Pixelworks, Inc. Accurate display panel calibration with common color space circuitry
US20230379067A1 (en) * 2022-05-18 2023-11-23 Rohde & Schwarz Gmbh & Co. Kg Augmented reality spectrum monitoring system

Also Published As

Publication number Publication date
KR20140133272A (en) 2014-11-19
US9140608B2 (en) 2015-09-22

Similar Documents

Publication Publication Date Title
US9140608B2 (en) Device and method for processing image for substantially accurately reproduced color images from a camera
US10535125B2 (en) Dynamic global tone mapping with integrated 3D color look-up table
US7400332B2 (en) Hexagonal color pixel structure with white pixels
CN107197225B (en) Color digital camera white balance correcting based on chromatic adaptation model
EP3054675B1 (en) Imaging systems with clear filter pixels
US8723995B2 (en) Extended dynamic range in color imagers
US9386189B2 (en) Device for converting color gamut and method thereof
KR101433952B1 (en) Test pattern signal generator and generation method, color measurement system and display device
CN103004211B (en) The method of imaging device and process photographic images
US9219894B2 (en) Color imaging element and imaging device
US9961236B2 (en) 3D color mapping and tuning in an image processing pipeline
JP4874752B2 (en) Digital camera
CN104933706B (en) A kind of imaging system color information scaling method
US9143747B2 (en) Color imaging element and imaging device
US9185375B2 (en) Color imaging element and imaging device
US20120188399A1 (en) Methods and Systems for Automatic White Balance
JP5878586B2 (en) Image color adjustment method and electronic apparatus therefor
US8416258B2 (en) Method for adjusting the color of images
US9036030B2 (en) Color calibration of an image capture device in a way that is adaptive to the scene to be captured
US20140314317A1 (en) Method and apparatus for converting gray level of color image
CN102231787B (en) Image color correction method and device
JP2002109523A (en) Image pickup device, optical filter group, and image data converter
JP2006222783A (en) Preparation of color conversion table
US9531919B2 (en) Image processing apparatus, image processing method, and recording medium that color-convert an input image into an output image suitable for a color gamut of a printing apparatus
Wen P‐46: A Color Space Derived from CIELUV for Display Color Management

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG DISPLAY CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KIM, KYOUNG-TAE;RYU, BYONG-TAE;BAE, JAE-WOO;REEL/FRAME:031200/0400

Effective date: 20130828

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8