US20020001409A1 - Interpolation processing apparatus and recording medium having interpolation processing program recorded therein - Google Patents

Interpolation processing apparatus and recording medium having interpolation processing program recorded therein

Info

Publication number
US20020001409A1
Authority
US
United States
Prior art keywords
interpolation
color component
color
target pixel
similarity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09/877,002
Other languages
English (en)
Inventor
Zhe-Hong Chen
Kenichi Ishiga
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nikon Corp
Original Assignee
Nikon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corp filed Critical Nikon Corp
Assigned to NIKON CORPORATION reassignment NIKON CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHEN, ZHE-HONG, ISHIGA, KENICHI
Publication of US20020001409A1 publication Critical patent/US20020001409A1/en
Priority to US11/367,583 priority Critical patent/US7236628B2/en
Priority to US11/477,666 priority patent/US7362897B2/en
Abandoned legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformation in the plane of the image
    • G06T 3/40 Scaling the whole image or part thereof
    • G06T 3/4015 Demosaicing, e.g. colour filter array [CFA], Bayer pattern
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformation in the plane of the image
    • G06T 3/40 Scaling the whole image or part thereof
    • G06T 3/4007 Interpolation-based scaling, e.g. bilinear interpolation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/20 Image enhancement or restoration by the use of local operators
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H04N 23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N 23/843 Demosaicing, e.g. interpolating colour pixel values
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N 25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N 25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N 25/134 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements

Definitions

  • The present invention relates to an interpolation processing apparatus that engages in interpolation processing on color image data to supplement a color component and a luminance component missing in pixels, and to a computer-readable recording medium having recorded therein an interpolation processing program for achieving the interpolation processing on a computer.
  • Some electronic cameras generate color image data by employing an image-capturing sensor having color filters in three colors (R, G and B: red, green and blue) provided at specific positions (e.g., in a Bayer array).
  • RGB: red, green and blue
  • a green color interpolation value G5 for the interpolation target pixel is calculated through one formula among formula 1 through formula 3 when the color information corresponding to individual pixels is provided as shown below, with A5 representing the color information at the interpolation target pixel (a pixel with the green color component missing), A1, A3, A7 and A9 representing color information from pixels provided with color filters in the same color as the color of the filter at the interpolation target pixel and G2, G4, G6 and G8 representing color information from pixels provided with green color filters.
  • G5 = (G4+G6)/2 + (-A3+2A5-A7)/4 (formula 1).
  • G5 = (G2+G8)/2 + (-A1+2A5-A9)/4 (formula 2).
  • the green color interpolation value G5 for the interpolation target pixel is calculated through
  • G5 = (G2+G4+G6+G8)/4 + (-A1-A3+4A5-A7-A9)/8 (formula 3).
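As a point of reference only, the three prior-art formulas can be transcribed into a small Python sketch. The function and its argument names are ours; the pixel labels follow the A1-A9 / G2-G8 neighborhood described above, and formula 1 is associated with the horizontal direction as in the discussion of FIG. 19A below.

    def green_interpolation_prior_art(A1, A3, A5, A7, A9, G2, G4, G6, G8, direction):
        """Prior-art green interpolation value G5 at the interpolation target pixel.

        direction: 'horizontal' selects formula 1, 'vertical' selects formula 2,
        anything else selects formula 3.
        """
        if direction == 'horizontal':
            # formula 1: average of the horizontal G neighbors plus a correctional term
            return (G4 + G6) / 2 + (-A3 + 2 * A5 - A7) / 4
        if direction == 'vertical':
            # formula 2: average of the vertical G neighbors plus a correctional term
            return (G2 + G8) / 2 + (-A1 + 2 * A5 - A9) / 4
        # formula 3: average of the four G neighbors plus a correctional term
        return (G2 + G4 + G6 + G8) / 4 + (-A1 - A3 + 4 * A5 - A7 - A9) / 8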
  • a green color interpolation value is calculated by assuming that the color difference between the green color component and the color component (the red color component or the blue color component) at the interpolation target pixel is constant ((A4-G4), (A5-G5) and (A6-G6) in FIG. 17 match) and correcting the average of the values indicated by the color information from the pixels that are adjacent along the direction in which a high degree of similarity is manifested with color information corresponding to the same color component as that of the interpolation target pixel.
  • Optical systems such as lenses are known to manifest magnification chromatic aberration. For instance, if there is magnification chromatic aberration at the photographic lens of an electronic camera having an image-capturing sensor provided with color filters in three colors, i.e., R, G and B, arranged in a Bayer array, images corresponding to the red color component and the blue color component are formed at positions slightly offset from the position at which the image corresponding to the green color component is formed, as shown in FIGS. 18B and 18C.
  • If the photographic lens is free of any magnification chromatic aberration and the color information corresponding to the individual pixels is provided as indicated in FIG. 19A (the image data undergoing the interpolation processing manifest marked similarity along the horizontal direction, the color information corresponding to the green color component indicates a constant value, and the values indicated by the color information corresponding to the red color component and the color information corresponding to the blue color component both change gently in the vicinity of the interpolation target pixel, i.e., the pixel at which A5 is present), the value of the correctional term in formula 1 is 0 and, as a result, the average of G4 and G6 (the primary term) is used directly as the green color interpolation value G5 without correction.
  • If A3, A5 and A7 each represent color information corresponding to the blue color component and each set of color information corresponding to the blue color component is offset by one pixel to the left due to magnification chromatic aberration, the color information from the individual pixels changes as shown in FIG. 19C.
  • In this case the value of the correctional term in formula 1 is not 0 and, through an over-correction, the green color interpolation value G5, which should be similar to the G4 and G6 values, becomes smaller than those values (hereafter this phenomenon is referred to as an “undershoot”).
  • An over-correction also occurs at a color boundary where the color difference changes, as well as when there is magnification chromatic aberration.
  • If the color information corresponding to the individual pixels is provided as indicated in FIGS. 20A and 20B (the color information corresponding to the green color component is constant and the values indicated by the color information corresponding to the red color component or the blue color component change drastically near the interpolation target pixel, i.e., the pixel at which A5 is present), the value of the correctional term in formula 1 is not 0, and an overshoot or an undershoot occurs due to an over-correction with regard to the green color interpolation value G5, which should be similar to the values indicated by G4 and G6.
  • a color artifact occurs as a result of interpolation processing even if there is no magnification chromatic aberration. It is to be noted that such a color artifact as that described above may occur when calculating a red color interpolation value or a blue color interpolation value as well as when calculating a green color interpolation value.
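The undershoot described above is easy to reproduce numerically. The values below are illustrative only and are not taken from the patent: the green signal is flat, while the blue (A) samples contain an edge shifted into the window by magnification chromatic aberration.

    # Illustrative values only: flat green signal, blue edge shifted one pixel.
    G4, G6 = 100, 100            # horizontal green neighbors
    A3, A5, A7 = 40, 40, 120     # blue samples along the same direction

    G5 = (G4 + G6) / 2 + (-A3 + 2 * A5 - A7) / 4   # formula 1
    print(G5)   # 100 + (-40 + 80 - 120) / 4 = 100 - 20 = 80: undershoot below G4 and G6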
  • An object of the present invention is to provide an interpolation processing apparatus capable of preventing occurrence of color artifacts and a recording medium having recorded therein an interpolation processing program with which occurrence of color artifacts can be prevented.
  • an object of the present invention is to suppress the occurrence of color artifacts by reducing the problems of the prior art while retaining the advantages of the interpolation processing in the prior art and reducing the degree of the adverse effect of magnification chromatic aberration.
  • a first interpolation processing apparatus that engages in processing on image data which are provided in a colorimetric system constituted of first through nth (n≥2) color components and include color information corresponding to a single color component provided at each pixel to determine an interpolation value equivalent to color information corresponding to the first color component for a pixel at which the first color component is missing, comprises: an interpolation value calculation section that uses color information at pixels located in a local area containing an interpolation target pixel to undergo interpolation processing to calculate an interpolation value including, at least (1) local average information of the first color component with regard to the interpolation target pixel and (2) local curvature information corresponding to at least two color components with regard to the interpolation target pixel.
  • the first interpolation processing apparatus calculates an interpolation value by correcting the “local average information of the first color component with regard to the interpolation target pixel” with the “local curvature information corresponding to at least two color components with regard to the interpolation target pixel.”
  • the “local average information of the first color component with regard to the interpolation target pixel” may be the average of the values indicated by the color information corresponding to the first color component present in the local area containing the interpolation target pixel or a value within the range of the values indicated by the color information corresponding to the first color component in the local area containing the interpolation target pixel.
  • the “local curvature information corresponding to at least two color components with regard to the interpolation target pixel” refers to information that indicates how the color information corresponding to at least two color components in the local area containing the interpolation target pixel changes.
  • the local curvature information corresponding to a given color component is information that indicates the degree of change in the rate of change occurring with regard to the color component in the local area, and when the values corresponding to each color component are plotted and rendered as a curve (or a polygonal line), the information indicates the curvature and the degree of change in the curvature (this definition applies in the subsequent description).
  • The information, which may be obtained by calculating a quadratic differential or a higher-order differential of the color component, indicates a value reflecting structural information with regard to fluctuations in the values corresponding to the color component.
  • When the information is rendered as a polygonal line, it indicates changes in the inclinations of the individual line segments.
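A minimal reading of the first interpolation processing apparatus can be sketched as follows, assuming that the local average information is the mean of the nearby first-color-component pixels, that the local curvature information of each color component is a second difference (quadratic differential) along the chosen direction, and that the correction is a simple sum. The coefficients and helper names are ours, not the patent's.

    def quadratic_differential(left, center, right):
        """Local curvature information of one color component along one direction
        (a second difference; the 1/4 scaling follows formulas 1 through 3)."""
        return (-left + 2 * center - right) / 4

    def interpolate_with_curvature(local_average, curvature_terms):
        """First-apparatus sketch: local average information of the first color
        component corrected by local curvature information of two or more
        color components (here simply summed)."""
        return local_average + sum(curvature_terms)

    # e.g. G5 = interpolate_with_curvature((G4 + G6) / 2,
    #                                      [quadratic_differential(A3, A5, A7), ...])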
  • a second interpolation processing apparatus achieves that in the first interpolation processing apparatus the interpolation value calculation section calculates, as the local curvature information corresponding to at least two color components, (1) local curvature information based upon a color component matching a color component at the interpolation target pixel and (2) local curvature information based upon a color component other than the color component at the interpolation target pixel.
  • the interpolation value is calculated by correcting the “local average information of the first color component with regard to the interpolation target pixel” with the “local curvature information based upon a color component matching the color component at the interpolation target pixel” and the “local curvature information based upon a color component other than the color component at the interpolation target pixel.”
  • the “local curvature information based upon a color component matching the color component at the interpolation target pixel (or a color component other than the color component at the interpolation target pixel)” refers to information that indicates how the color information corresponding to the color component matching the color component at the interpolation target pixel (or a color component other than the color component at the interpolation target pixel) in the local area containing the interpolation target pixel changes and is represented as a value reflecting the structural information regarding the fluctuations obtained through calculation of a quadratic differential or a higher differential of the color component.
  • a third interpolation processing apparatus that engages in processing on image data which are provided in a colorimetric system constituted of first through nth (n≥2) color components and include color information corresponding to a single color component provided at each pixel to determine an interpolation value equivalent to color information corresponding to the first color component for a pixel at which the first color component is missing, comprises: an interpolation value calculation section that uses color information at pixels located in a local area containing an interpolation target pixel to undergo interpolation processing to calculate an interpolation value including, at least (1) local average information of the first color component with regard to the interpolation target pixel and (2) local curvature information based upon a color component other than a color component at the interpolation target pixel.
  • the interpolation value is calculated by correcting the “local average information of the first color component with regard to the interpolation target pixel” with the “local curvature information based upon a color component other than the color component at the interpolation target pixel.”
  • the “local curvature information based upon a color component other than the color component at the interpolation target pixel” refers to information that indicates how the color information corresponding to the color component other than the color component at the interpolation target pixel in the local area containing the interpolation target pixel changes and is represented as a value reflecting the structural information regarding the fluctuations obtained through calculation of a quadratic differential or a higher differential of the color component.
  • a fourth interpolation processing apparatus that engages in processing on image data which are provided in a colorimetric system constituted of first through nth (n≥2) color components and include color information corresponding to a single color component provided at each pixel to determine an interpolation value equivalent to color information corresponding to the first color component for a pixel at which the first color component is missing, comprises: an interpolation value calculation section that uses color information at pixels located in a local area containing an interpolation target pixel to undergo interpolation processing to calculate an interpolation value including, at least (1) local average information of the first color component with regard to the interpolation target pixel and (2) local curvature information corresponding to the first color component with respect to the interpolation target pixel.
  • the interpolation value is calculated by correcting the “local average information of the first color component with regard to the interpolation target pixel” with the “local curvature information corresponding to the first color component with regard to the interpolation target pixel”
  • the “local curvature information corresponding to the first color component with respect to the interpolation target pixel” refers to information that indicates how the color information corresponding to the first color component in the local area containing the interpolation target pixel changes and is represented as a value reflecting the structural information regarding the fluctuations obtained through calculation of a quadratic differential or a higher differential of the color component.
  • a fifth interpolation processing apparatus achieves that in the first through third interpolation processing apparatus a first similarity judgment section that judges degrees of similarity to the interpolation target pixel along at least two directions in which pixels with color information corresponding to the first color component are connected with the interpolation target pixel; and a second similarity judgment section that judges degrees of similarity to the interpolation target pixel along at least two directions other than the directions in which the degrees of similarity are judged by the first similarity judgment section, are further provided, and: the interpolation value calculation section selects a direction along which pixels having color information to be used to calculate the local average information of the first color component are set based upon results of a judgment made by the first similarity judgment section; (1) the interpolation value calculation section selects a direction along which pixels having color information to be used to calculate the local curvature information are set based upon results of the judgment made by the first similarity judgment section if the local curvature information is “local curvature information constituted of a single color component and manifesting directionality along a direction in which degrees of similarity are judged by the first similarity judgment section”; and (2) the interpolation value calculation section selects a direction along which pixels having color information to be used to calculate the local curvature information are set based upon results of a judgment made by the second similarity judgment section if the local curvature information is “local curvature information constituted of a single color component and manifesting directionality along a direction in which degrees of similarity are judged by the second similarity judgment section”.
  • any of the “local curvature information corresponding to at least two color components with regard to the interpolation target pixel” in the first interpolation processing apparatus and the “local curvature information based upon a color component matching the color component at the interpolation target pixel” and the “local curvature information based upon a color component other than the color component at the interpolation target pixel” in the second or the third interpolation processing apparatus that constitutes “local curvature information constituted of a single color component and manifesting directionality along a direction in which degrees of similarity are judged by the first similarity judgment section” is calculated by using color information at pixels present along a direction selected based upon the results of the judgment made by the first similarity judgment section, whereas any of the information listed above that constitutes “local curvature information constituted of a single color component and manifesting directionality along a direction in which degrees of similarity are judged by the second similarity judgment section” is calculated by using color information at pixels present along a direction selected based upon the results of the judgment made by the second similarity judgment section.
  • the color information used to calculate local curvature information can be selected in correspondence to degrees of similarity to the interpolation target pixel in the fifth interpolation processing apparatus.
  • the color information used to calculate the local average information of the first color component too, can be selected in correspondence to the degrees of similarity to the interpolation target pixel.
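Under the fifth apparatus, the direction chosen by the first similarity judgment determines which pixels supply both the local average information and the same-direction local curvature information. A rough sketch under our own assumptions (a NumPy array holding the raw Bayer data, indexed [i, j], with green pixels at distance 1 and same-color pixels at distance 2 along either axis) might look like this:

    import numpy as np

    def average_and_curvature(bayer: np.ndarray, i: int, j: int, direction: str):
        """Sketch only: pixels used once the first similarity judgment has selected
        a direction. Returns (local average of the green component, local curvature
        of the same color component as the target pixel)."""
        if direction == 'vertical':
            avg = (bayer[i, j - 1] + bayer[i, j + 1]) / 2
            curv = (-bayer[i, j - 2] + 2 * bayer[i, j] - bayer[i, j + 2]) / 4
        else:  # 'horizontal'
            avg = (bayer[i - 1, j] + bayer[i + 1, j]) / 2
            curv = (-bayer[i - 2, j] + 2 * bayer[i, j] - bayer[i + 2, j]) / 4
        return avg, curv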
  • a sixth interpolation processing apparatus that engages in processing on image data which are provided in a colorimetric system constituted of first through nth (n≥2) color components and include color information corresponding to a single color component provided at each pixel to determine an interpolation value equivalent to color information corresponding to the first color component for a pixel at which the first color component is missing, comprises: an interpolation value calculation section that calculates an interpolation value including at least two terms, i.e., a first term and a second term, by using color information at pixels set in a local area containing an interpolation target pixel to undergo interpolation processing; a first similarity judgment section that judges degrees of similarity to the interpolation target pixel along at least two directions in which pixels having color information corresponding to the first color component are connected to the interpolation target pixel; and a second similarity judgment section that judges degrees of similarity to the interpolation target pixel along at least two directions other than the directions in which the degrees of similarity are judged by the first similarity judgment section, wherein: the interpolation value calculation section selects a direction along which pixels having color information to be used to calculate the first term are set based upon results of a judgment made by the first similarity judgment section and selects a direction along which pixels having color information to be used to calculate the second term are set based upon results of a judgment made by the second similarity judgment section.
  • the processing can be performed by using color information from pixels set along more directions including the direction along which color information used to calculate the second term is provided as well as the direction along which color information used to calculate the first term is provided.
  • the interpolation value can be calculated by using color information at pixels located along a finely differentiated plurality of directions in the sixth interpolation processing apparatus.
  • the first term and the second term can be calculated by using color information at pixels set along the direction in which a high degree of similarity is manifested or through weighted synthesis of color information from pixels located along a plurality of directions, which is performed in correspondence to varying degrees of similarity, in the sixth interpolation processing apparatus.
  • a seventh interpolation processing apparatus achieves that in the sixth interpolation processing apparatus the interpolation value calculation section: calculates a term containing (a) local average information of the first color component with regard to the interpolation target pixel and (b) local curvature information constituted of a single color component and manifesting directionality along a direction in which degrees of similarity are judged by the first similarity judgment section, as the first term; and calculates a term containing local curvature information constituted of a single color component and manifesting directionality along a direction in which degrees of similarity are judged by the second similarity judgment section, as the second term.
  • the interpolation value is calculated by correcting the “local average information of the first color component with regard to the interpolation target pixel” with the “local curvature information constituted of a single color component and manifesting directionality along a direction in which degrees of similarity are judged by the first similarity judgment section” and the “local curvature information constituted of a single color component and manifesting directionality along a direction in which degrees of similarity are judged by the second similarity judgment section.”
  • An eighth interpolation processing apparatus achieves that in the fifth or seventh interpolation processing apparatus: when image data are provided in a colorimetric system constituted of first through third color components with the first color component achieving a higher spatial frequency than the second color component and the third color component, the first color component set in a checker-board pattern, the second color component and the third color component each set in a line sequence between pixels at which color information corresponding to the first color component is present, and with color information corresponding to the second color component present at the interpolation target pixel; the first similarity judgment section calculates similarity degrees manifested by the interpolation target pixel along two directions, i.e., a vertical direction and a horizontal direction, in which pixels with color information corresponding to the first color component that are closest to the interpolation target pixel are connected to the interpolation target pixel and makes a judgment with regard to degrees of similarity manifested by the interpolation target pixel along the vertical direction and the horizontal direction based upon a difference between the similarity degrees; the second similarity judgment section calculates similarity degrees manifested by the interpolation target pixel along two diagonal directions and makes a judgment with regard to degrees of similarity manifested by the interpolation target pixel along the two diagonal directions based upon a difference between those similarity degrees.
  • the color components manifesting directionality along the two directions, i.e., the vertical direction and the horizontal direction, in which degrees of similarity are judged by the first similarity judgment section include the first color component and the second color component.
  • the color components manifesting directionality along the two diagonal directions in which degrees of similarity are judged by the second similarity judgment section include the second color component and the third color component.
  • the “local curvature information constituted of a single color component and manifesting directionality along a direction in which degrees of similarity are judged by the first similarity judgment section” in the fifth or seventh interpolation processing apparatus is calculated with respect to at least either the second color component or the first color component based upon the results of the judgment made by the first similarity judgment section and the “local curvature information constituted of a single color component and manifesting directionality along a direction in which degrees of similarity are judged by the second similarity judgment section” is calculated with respect to at least either the second color component or the third color component based upon the results of the judgment made by the second similarity judgment section.
  • the similarity manifesting along the diagonal directions is reflected with a high degree of reliability when calculating the “local curvature information based upon a color component achieving similarity along a direction in which degrees of similarity are judged by the second similarity judgment section.”
  • a ninth interpolation processing apparatus achieves that in the eighth interpolation processing apparatus: when the local curvature information is “local curvature information based upon a color component other than the color component at the interpolation target pixel”, the interpolation value calculation section selects the first color component or the third color component to which the local curvature information is to correspond in conformance to the degrees of similarity judged by the second similarity judgment section.
  • the third color component is present at pixels adjacent to the interpolation target pixel along the two diagonal directions and the similarity judged by the second similarity judgment section is the similarity manifested by the interpolation target pixel along the two diagonal directions.
  • the similarity along the diagonal directions can be reflected in the “local curvature information based upon a color component other than the color component at the interpolation target pixel” by switching the “local curvature information based upon a color component other than the color component at the interpolation target pixel” to correspond to the first color component or the third color component.
  • a tenth interpolation processing apparatus achieves that in the ninth interpolation processing apparatus: the interpolation value calculation section calculates local curvature information based upon the first color component if the second similarity judgment section judges that roughly equal degrees of similarity manifest along the two diagonal directions and calculates local curvature information based upon the third color component if the second similarity judgment section judges that a higher degree of similarity manifests along one of the two diagonal directions compared to the other diagonal direction.
  • the similarity manifesting along the diagonal directions is reflected with a high degree of reliability when calculating the “local curvature information based upon a color component other than the color component at the interpolation target pixel.”
  • An 11th interpolation processing apparatus achieves that in the eighth interpolation processing apparatus: the first similarity judgment section judges that roughly equal degrees of similarity manifest along the vertical direction and the horizontal direction if a difference between the similarity degrees along the vertical direction and the horizontal direction is smaller than a specific threshold value; and the second similarity judgment section judges that roughly equal degrees of similarity manifest along the two diagonal directions if a difference between the similarity degrees along the two diagonal directions is smaller than a specific threshold value.
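The “roughly equal” judgments of the 11th apparatus amount to threshold comparisons. The sketch below uses conventions of our own (a smaller similarity degree is taken to mean stronger similarity, and the returned codes loosely mirror the (HV[i,j], DN[i,j]) indices referred to later with FIG. 7):

    def judge_directions(cv0, ch0, c45, c135, th1, th2):
        """Sketch of the threshold judgments: HV = 0 means vertical and horizontal
        similarity are roughly equal, 1 means vertical is stronger, -1 horizontal;
        DN behaves the same way for the two diagonal (45/135 degree) directions."""
        if abs(cv0 - ch0) < th1:
            hv = 0
        else:
            hv = 1 if cv0 < ch0 else -1
        if abs(c45 - c135) < th2:
            dn = 0
        else:
            dn = 1 if c45 < c135 else -1
        return hv, dn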
  • a 12th interpolation processing apparatus achieves that in the eighth interpolation processing apparatus: the first similarity judgment section calculates the similarity degrees along the vertical direction and the horizontal direction by using color information corresponding to a plurality of color components for a single interpolation target pixel; and the second similarity judgment section calculates the similarity degrees along the two diagonal directions by using color information corresponding to a plurality of color components for a single interpolation target pixel.
  • color information corresponding to a plurality of color components is reflected in the judgement of the similarity manifesting along the vertical and horizontal directions and the similarity manifesting along the two diagonal directions.
  • a 13th interpolation processing apparatus achieves that in the twelfth interpolation processing apparatus: the second similarity judgment section calculates a similarity degree manifesting along each of the two diagonal directions through weighted addition of: (1) a similarity degree component constituted of color information corresponding to the first color component alone; (2) a similarity degree component constituted of color information corresponding to the second color component alone; (3) a similarity degree component constituted of color information corresponding to the third color component alone; and (4) a similarity degree component constituted of color information corresponding to the second color component and the third color component.
  • a 14th interpolation processing apparatus achieves that in the eighth interpolation processing apparatus: the first similarity judgment section calculates similarity degrees along the vertical direction and the horizontal direction for each pixel and makes a judgment on similarity manifested by the interpolation target pixel along the vertical direction and the horizontal direction based upon differences in similarity degrees manifesting at nearby pixels as well as at the interpolation target pixel; and the second similarity judgment section calculates similarity degrees along the two diagonal directions for each pixel and makes a judgment on similarity manifested by the interpolation target pixel along the two diagonal directions based upon differences in similarity degrees manifesting at nearby pixels as well as at the interpolation target pixel.
  • a 15th interpolation processing apparatus that engages in processing on image data which are provided in a colorimetric system constituted of first through nth (n≥2) color components and include color information corresponding to a single color component provided at each pixel to determine an interpolation value equivalent to color information corresponding to the first color component for a pixel at which the first color component is missing, comprises: a first term calculation section that calculates a first term representing average information of the first color component with regard to an interpolation target pixel to undergo interpolation processing by using color information corresponding to color components at pixels set in a local area containing the interpolation target pixel; a second term calculation section that calculates a second term representing local curvature information based upon a color component matching the color component at the interpolation target pixel with regard to the interpolation target pixel by using color information corresponding to color components at pixels set in a local area containing the interpolation target pixel; and an interpolation value calculation section that calculates an interpolation value by adding the second term, multiplied by a weighting coefficient constituted of color information corresponding to a plurality of color components provided at pixels set within a local area containing the interpolation target pixel, to the first term.
  • the interpolation value is calculated by correcting the “average information of the first color component with regard to the interpolation target pixel” with the “local curvature information based upon a color component matching the color component at the interpolation target pixel with regard to the interpolation target pixel” multiplied by a weighting coefficient constituted of color information corresponding to a plurality of color components present at pixels within a local area containing the interpolation target pixel.
  • a 16th interpolation processing apparatus achieves that in the 15th interpolation processing apparatus: the interpolation value calculation section uses color information corresponding to a plurality of color components provided at the interpolation target pixel and at a plurality of pixels set along a predetermined direction relative to the interpolation target pixel to ascertain inclinations manifesting in color information corresponding to the individual color components along the direction and calculates the weighting coefficient in conformance to a correlation manifesting among the inclinations in the color information corresponding to the individual color components.
  • the weighting coefficient is calculated in conformance to the correlation among the inclinations of the color information corresponding to the different color components in the local area containing the interpolation target pixel.
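The 16th apparatus leaves the exact weighting rule to the embodiments; as one hedged illustration (not the patent's rule), a weighting coefficient could be derived from whether the inclinations of the different color components along the chosen direction agree:

    def weighting_coefficient(slope_first, slope_other):
        """Illustrative only: weight the same-color local curvature term by the
        agreement (correlation in sign) between the inclination of the first
        color component and that of the color component at the target pixel."""
        if slope_first == 0 or slope_other == 0:
            return 0.0
        return 1.0 if (slope_first > 0) == (slope_other > 0) else 0.0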
  • a 17th interpolation processing apparatus that implements processing for supplementing a color component value at a pixel at which information corresponding to a color component is missing in image data provided in a colorimetric system constituted of a luminance component and the color component, with the luminance component having a higher spatial frequency than the color component and the luminance component present both at pixels having information corresponding to the color component and at pixels lacking information corresponding to the color component, comprises: a hue value calculation section that calculates hue values at a plurality of pixels located near an interpolation target pixel to undergo interpolation processing and having both the luminance component and the color component by using luminance component values and color component values at the individual pixels; a hue value interpolation section that calculates a hue value at the interpolation target pixel by using a median of the hue values at the plurality of pixels calculated by the hue value calculation section; and a color conversion section that interpolates a color component at the interpolation target pixel by using the luminance component at the interpolation target pixel to convert the hue value of the interpolation target pixel calculated by the hue value interpolation section to a color component value.
  • the hue value of the interpolation target pixel is calculated by using the median of the hue values of a plurality of pixels located near the interpolation target pixel.
  • an 18th interpolation processing apparatus that implements processing for supplementing a luminance component at a pixel at which information corresponding to a luminance component is missing and supplementing a color component at a pixel at which information corresponding to a color component is missing, on image data provided in a colorimetric system constituted of the luminance component and the color component, with the luminance component having a higher spatial frequency than the color component and a given pixel having only information corresponding to either the luminance component or the color component, comprises: a luminance component interpolation section that interpolates a luminance component at a luminance component interpolation target pixel to undergo luminance component interpolation processing by using at least either “similarity manifesting between the luminance component interpolation target pixel and a pixel near the luminance component interpolation target pixel” or “a plurality of color components within a local area containing the luminance component interpolation target pixel”; a hue value calculation section that calculates hue values at a plurality of pixels located near an interpolation target pixel to undergo color component interpolation processing, having color component values and having luminance component values interpolated by the luminance component interpolation section, by using the luminance component values and color component values at the individual pixels; a hue value interpolation section that calculates a hue value at the interpolation target pixel by using a median of the hue values at the plurality of pixels calculated by the hue value calculation section; and a color conversion section that interpolates a color component at the interpolation target pixel by using the luminance component at the interpolation target pixel to convert the hue value calculated by the hue value interpolation section to a color component value.
  • the hue value of the interpolation target pixel is calculated by using the median of the hue values of a plurality of pixels located near the interpolation target pixel.
  • a 19th interpolation processing apparatus achieves that in the 17th or 18th interpolation processing apparatus: when the luminance component in the image data corresponds to a green color component and the color component in the image data corresponds to a red color component and a blue color component, the hue value interpolation section calculates a hue value for the interpolation target pixel by using a median of hue values containing the red color component at pixels near the interpolation target pixel if the green color component is present but the red color component is missing at the interpolation target pixel and calculates a hue value for the interpolation target pixel by using a median of hue values containing the blue color component at pixels near the interpolation target pixel if the green color component is present but the blue color component is missing at the interpolation target pixel.
  • the hue value of the interpolation target pixel at which the green color component is present but the red color component is missing is calculated by using the median of the hue values containing the red color component from pixels located near the interpolation target pixel
  • the hue value of the interpolation target pixel at which the green color component is present but the blue color component is missing is calculated by using the median of the hue values containing the blue color component from pixels located near the interpolation target pixel.
  • a 20th interpolation processing apparatus achieves that in the 17th or 18th interpolation processing apparatus: when the luminance component in the image data corresponds to a green color component and the color component in the image data corresponds to a red color component and a blue color component, the hue value interpolation section calculates a hue value for the interpolation target pixel by using a median of hue values containing the red color component at pixels set near the interpolation target pixel if the blue color component is present but the red color component is missing at the interpolation target pixel.
  • the hue value of the interpolation target pixel at which the blue color component is present but the red color component is missing is calculated by using the median of the hue values containing the red color component from pixels located near the interpolation target pixel.
  • a 21st interpolation processing apparatus achieves that in the 17th or 18th interpolation processing apparatus: when the luminance component in the image data corresponds to a green color component and the color component in the image data corresponds to a red color component and a blue color component, the hue value interpolation section calculates a hue value for the interpolation target pixel by using a median of hue values containing the blue color component at pixels set near the interpolation target pixel if the red color component is present but the blue color component is missing at the interpolation target pixel.
  • the hue value of the interpolation target pixel at which the red color component is present but the blue color component is missing is calculated by using the median of the hue values containing the blue color component from pixels located near the interpolation target pixel.
  • a 22nd interpolation processing apparatus achieves that in any one of the 17th through 21st interpolation processing apparatuses: a color component is missing at the interpolation target pixel present at only one pixel among four pixels set symmetrically along the vertical direction and the horizontal direction, and the hue value interpolation section comprises: a first hue value interpolation unit that calculates a hue value for the interpolation target pixel by using a median of hue values at a plurality of diagonally adjacent pixels if the hue values of the plurality of diagonally adjacent pixels adjacent to the interpolation target pixel along diagonal directions have been calculated by the hue value calculation section; and a second hue value interpolation unit that calculates a hue value for the interpolation target pixel by using a median of hue values at a plurality of vertically and horizontally adjacent pixels if the hue values of the plurality of vertically and horizontally adjacent pixels adjacent to the interpolation target pixel in the vertical direction and the horizontal direction have been calculated by the hue value calculation section or the first hue value interpolation unit.
  • the hue value of the interpolation target pixel is calculated by using the median of the hue values at the diagonally adjacent pixels, whereas if the hue values at pixels adjacent in the vertical and horizontal directions are already calculated, the hue value of the interpolation target pixel is calculated by using the median of the hue values at the vertically and horizontally adjacent pixels.
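A compact sketch of the median-based hue interpolation of the 17th through 22nd apparatuses follows. Here the “hue value” is assumed to be the color difference (color component minus luminance component), and the color conversion simply adds the interpolated hue back to the luminance at the target pixel; the names and that definition are ours.

    import statistics

    def interpolate_color_by_hue_median(target_luminance, neighbors):
        """neighbors: (luminance, color) pairs at nearby pixels that already hold
        both components. Returns the interpolated color component value."""
        hues = [color - lum for lum, color in neighbors]     # hue value calculation
        hue_at_target = statistics.median(hues)              # hue value interpolation
        return target_luminance + hue_at_target              # color conversion

    # e.g. blue at a red pixel from the four diagonally adjacent pixels:
    # B = interpolate_color_by_hue_median(G_target, [(G1, B1), (G2, B2), (G3, B3), (G4, B4)])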
  • a first recording medium has an interpolation processing program recorded therein to implement on a computer processing for determining an interpolation value equivalent to color information corresponding to a first color component missing at a pixel, on image data provided in a colorimetric system constituted of first through nth (n≥2) color components with color information corresponding to a single color component present at each pixel.
  • the interpolation processing program comprises: an interpolation value calculation step in which an interpolation value including, at least (1) local average information of the first color component with regard to an interpolation target pixel to undergo interpolation processing and (2) local curvature information corresponding to at least two color components with regard to the interpolation target pixel, is calculated by using color information provided at pixels set within a local area containing the interpolation target pixel.
  • the interpolation value is calculated by correcting the “local average information of the first color component with regard to the interpolation target pixel” with the “local curvature information corresponding to at least two color components with regard to the interpolation target pixel.”
  • a second recording medium has an interpolation processing program recorded therein to implement on a computer processing for determining an interpolation value equivalent to color information corresponding to a first color component missing at a pixel, on image data provided in a colorimetric system constituted of first through nth (n≥2) color components with color information corresponding to a single color component present at each pixel.
  • the interpolation processing program comprises: an interpolation value calculation step in which an interpolation value including, at least (1) local average information of the first color component with regard to an interpolation target pixel to undergo the interpolation processing; and (2) local curvature information based upon a color component other than a color component at the interpolation target pixel, is calculated by using color information provided at pixels set within a local area containing the interpolation target pixel.
  • the interpolation value is calculated by correcting the “local average information of the first color component with regard to the interpolation target pixel” with the “local curvature information based upon a color component other than the color component at the interpolation target pixel.”
  • a third recording medium has an interpolation processing program recorded therein to implement on a computer processing for determining an interpolation value equivalent to color information corresponding to a first color component missing at a pixel, on image data provided in a colorimetric system constituted of first through nth (n≥2) color components with color information corresponding to a single color component present at each pixel.
  • the interpolation processing program comprises: an interpolation value calculation step in which an interpolation value including, at least (1) local average information of the first color component with regard to an interpolation target pixel to undergo the interpolation processing, and (2) local curvature information corresponding to the first color component with respect to the interpolation target pixel, is calculated by using color information provided at pixels set within a local area containing the interpolation target pixel.
  • the interpolation value is calculated by correcting the “local average information of the first color component with regard to the interpolation target pixel” with the “local curvature information corresponding to the first color component with respect to the interpolation target pixel.”
  • a fourth recording medium has an interpolation processing program recorded therein to implement on a computer processing for determining an interpolation value equivalent to color information corresponding to a first color component missing at a pixel, on image data provided in a colorimetric system constituted of first through nth (n≥2) color components with color information corresponding to a single color component present at each pixel.
  • the interpolation processing program comprises: an interpolation value calculation step in which an interpolation value including at least two terms, i.e., a first term and a second term, is calculated by using color information at pixels set within a local area containing an interpolation target pixel to undergo interpolation processing; a first similarity judgment step in which degrees of similarity to the interpolation target pixel are judged along at least two directions in which pixels having color information corresponding to the first color component are connected with the interpolation target pixel; and a second similarity judgment step in which degrees of similarity to the interpolation target pixel are judged along at least two directions other than the directions along which the degrees of similarity are judged in the first similarity judgment step, wherein: in the interpolation value calculation step, a direction in which pixels having color information to be used to calculate the first term are set is selected based upon results of a judgment made in the first similarity judgment step and a direction in which pixels having color information to be used to calculate the second term are set is selected based upon results of a judgment made in the second similarity judgment step.
  • a fifth recording medium has an interpolation processing program recorded therein to implement on a computer processing for determining an interpolation value equivalent to color information corresponding to a first color component missing at a pixel, on image data provided in a colorimetric system constituted of first through nth (n≥2) color components with color information corresponding to a single color component present at each pixel.
  • the interpolation processing program comprises: a first term calculation step in which a first term representing average information of the first color component with regard to an interpolation target pixel to undergo interpolation processing is calculated by using color information corresponding to a color component at pixels set within a local area containing the interpolation target pixel; a second term calculation step in which a second term representing local curvature information based upon a color component matching the color component at the interpolation target pixel is calculated with regard to the interpolation target pixel by using color information corresponding to a color component at pixels set within a local area containing the interpolation target pixel; and an interpolation value calculation step in which an interpolation value is calculated by adding the second term multiplied by a weighting coefficient constituted of color information corresponding to a plurality of color components provided at pixels set within a local area containing the interpolation target pixel to the first term.
  • the interpolation value is calculated by correcting the “average information of the first color component with regard to the interpolation target pixel” with the “local curvature information based upon a color component matching the color component at the interpolation target pixel with regard to the interpolation target pixel” multiplied by a weighting coefficient constituted of color information corresponding to a plurality of color components at the interpolation target pixel and at pixels located in the local area containing the interpolation target pixel.
  • a sixth recording medium has an interpolation processing program recorded therein for implementing on a computer processing for supplementing a color component value at a pixel at which information corresponding to a color component is missing, on image data provided in a colorimetric system constituted of a luminance component and the color component, with the luminance component having a higher spatial frequency than the color component and the luminance component present both at pixels having information corresponding to the color component and at pixels lacking information corresponding to the color component.
  • the interpolation processing program comprises: a hue value calculation step in which hue values for a plurality of pixels near an interpolation target pixel to undergo interpolation processing and having information corresponding to both the luminance component and the color component are calculated by using luminance component values and color component values at the individual pixels; a hue value interpolation step in which a hue value for the interpolation target pixel is calculated by using a median of the hue values at the plurality of pixels calculated in the hue value calculation step; and a color conversion step in which a color component value at the interpolation target pixel is interpolated by using a value indicated by the luminance component present at the interpolation target pixel to convert the hue value of the interpolation target pixel calculated in the hue value interpolation step to a color component value.
  • the hue value of the interpolation target pixel is calculated by using the median of the hue values at a plurality of pixels present near the interpolation target pixel.
  • a seventh recording medium has an interpolation processing program recorded therein for implementing on a computer processing for supplementing a luminance component value at a pixel at which information corresponding to a luminance component is missing and a color component value at a pixel at which information corresponding to a color component is missing, on image data provided in a colorimetric system constituted of the luminance component and the color component, with the luminance component having a higher spatial frequency than the color component and information corresponding to either the luminance component or the color component present at each pixel.
  • the interpolation processing program comprises: a luminance component interpolation step in which a luminance component value is interpolated for a luminance component interpolation target pixel to undergo luminance component interpolation processing by using at least either “similarity between the luminance component interpolation target pixel and a pixel near the luminance component interpolation target pixel” or “information corresponding to a plurality of color components within a local area containing the luminance component interpolation target pixel”; a hue value calculation step in which hue values at a plurality of pixels located near an interpolation target pixel to undergo color component interpolation processing, having color component values and having luminance component values interpolated in the luminance component interpolation step, are calculated by using the luminance component values and color component values at the individual pixels; a hue value interpolation step in which a hue value for the interpolation target pixel is calculated by using a median of the hue values at the plurality of pixels calculated in the hue value calculation step; and a color conversion step in which a color component value is interpolated for the interpolation target pixel by using a value indicated by the luminance component present at the interpolation target pixel to convert the hue value of the interpolation target pixel calculated in the hue value interpolation step to a color component value.
  • the hue value of the interpolation target pixel is calculated by using the median of the hue values at a plurality of pixels present near the interpolation target pixel.
  • the first, third, fourth, sixth, 15th, 17th and 19th interpolation processing apparatuses may be realized on a computer.
  • the second, fifth, seventh through 14th, 16th, 18th, and 20th through 22nd interpolation processing apparatuses may be realized through interpolation processing programs recorded at recording media.
  • These interpolation processing programs may be provided to a computer through a communication line such as the Internet.
  • FIG. 1 is a functional block diagram of an electronic camera corresponding to first through fifth embodiments
  • FIGS. 2A and 2B show the arrangements of the color components in the image data adopted in the first embodiment, the second embodiment and the fourth embodiment;
  • FIGS. 3A and 3B show the arrangements of the color components in the image data adopted in the third embodiment and the fifth embodiment
  • FIG. 4 is a flowchart (1) of the operation achieved at the interpolation processing unit in the first embodiment
  • FIGS. 6A and 6B illustrate methods of weighted addition of similarity degree components
  • FIG. 7 shows the directions along which marked similarity manifests in correspondence to values (HV[i,j], DN[i,j]);
  • FIGS. 9A and 9B show how the adverse effect of magnification chromatic aberration is eliminated
  • FIGS. 10A through 10C illustrate median processing of the prior art
  • FIGS. 12A and 12B illustrate the ranges of the median processing implemented in the first embodiment
  • FIG. 14 shows the positions of the color information used to calculate local curvature information
  • FIG. 16 is a functional block diagram of a sixth embodiment
  • FIG. 17 illustrates an example of the interpolation processing in the prior art
  • FIG. 1 is a functional block diagram of the electronic camera corresponding to the first through fifth embodiments.
  • Although FIG. 1 shows only the interpolation processing unit 17 in the image processing unit 15 to simplify the illustration, a functional block that engages in other image processing such as gradation conversion processing may also be provided in the image processing unit 15.
  • the control unit 11 is connected to the image-capturing unit 13 , the A/D conversion unit 14 , the image processing unit 15 and the recording unit 16 .
  • an optical image obtained at the photographic optical system 12 is formed at the image-capturing sensor in the image-capturing unit 13 .
  • An output from the image-capturing unit 13 is quantized at the A/D conversion unit 14 and is provided to the image processing unit 15 as image data.
  • the image data provided to the image processing unit 15 undergo interpolation processing at the interpolation processing unit 17 and after having undergone image compression as necessary, they are recorded via the recording unit 16 .
  • the image data with the degrees of resolution corresponding to the individual color components improved through the interpolation processing are ultimately output as image data in a colorimetric system that corresponds to the type of device that is connected, such as a display or a printer.
  • FIGS. 2A and 2B show the arrangements of the color components in the image data adopted in the first embodiment, the second embodiment and the fourth embodiment, and FIGS. 3A and 3B show the arrangements of the color components in the image data adopted in the third embodiment and the fifth embodiment. It is to be noted that in FIGS. 2A and 2B and in FIGS. 3A and 3B, the individual color components are indicated as R, G and B, with the positions of pixels at which the various color components are present indicated with i and j.
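  • as a point of reference, the following minimal sketch (Python; the exact phase of the array is an illustrative assumption, since FIGS. 2A and 2B are described here only through the relative positions of the color components) reproduces the Bayer-array relationships relied upon below, namely that green pixels sit above, below, to the left and to the right of a red pixel while blue pixels sit along its diagonals:

        def bayer_color(i, j, phase=(0, 0)):
            """Return 'R', 'G' or 'B' for pixel [i, j] of a Bayer array.

            phase is an assumed offset that places an 'R' pixel at [0, 0]."""
            di, dj = (i + phase[0]) % 2, (j + phase[1]) % 2
            if (di, dj) == (0, 0):
                return 'R'
            if (di, dj) == (1, 1):
                return 'B'
            return 'G'    # half of the pixels carry the green color component

        # 5x5 neighborhood around an R-centered pixel: G at the four adjacent
        # positions, B along the diagonals (R and B are diagonal to each other).
        for row in range(5):
            print([bayer_color(row, col) for col in range(5)])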
  • B interpolation processing: the interpolation processing implemented to supplement blue color interpolation values
  • R interpolation processing: the interpolation processing for supplementing red color interpolation values
  • FIGS. 4 and 5 present a flowchart of the operation achieved in the interpolation processing unit 17 in the first embodiment, with FIG. 4 corresponding to the operation of the interpolation processing unit 17 during the G interpolation processing and FIG. 5 corresponding to the operation of the interpolation processing unit 17 during the R interpolation processing.
  • the interpolation processing unit 17 calculates a similarity degree Cv[i,j] along the vertical direction and a similarity degree Ch[i,j] along the horizontal direction for an interpolation target pixel at which the green component is missing (FIG. 4 S 1 ).
  • Ch1[i,j] through Ch6[i,j] denote the plurality of types of similarity degree components along the horizontal direction, and Cv1[i,j] through Cv6[i,j] the corresponding components along the vertical direction, including the G-R (G-B) similarity degree components along the vertical and horizontal directions; the formulas defining the individual components are not reproduced here.
  • Y[i,j] represents a luminance value calculated from the color information at pixel [i,j] and at the nearby pixels (the defining formula is not reproduced here)
  • A[i,j] represents an arbitrary set of color information on the Bayer array which may assume a G value or a Z value depending upon the position at which the color information is provided.
  • the interpolation processing unit 17 performs weighted addition of the plurality of types of similarity degree components along each direction by using weighting coefficients a1, a2, a3, a4, a5 and a6, as expressed in the following formulae 23 and 24, to calculate a similarity degree Cv0[i,j] along the vertical direction and a similarity degree Ch0[i,j] along the horizontal direction for the interpolation target pixel.
  • Cv0[i,j] = (a1×Cv1[i,j]+a2×Cv2[i,j]+a3×Cv3[i,j]+a4×Cv4[i,j]+a5×Cv5[i,j]+a6×Cv6[i,j])/(a1+a2+a3+a4+a5+a6) formula 23
  • Ch0[i,j] = (a1×Ch1[i,j]+a2×Ch2[i,j]+a3×Ch3[i,j]+a4×Ch4[i,j]+a5×Ch5[i,j]+a6×Ch6[i,j])/(a1+a2+a3+a4+a5+a6) formula 24
  • a further improvement is achieved in the accuracy with which the similarity degrees are calculated by calculating the similarity degree components along the vertical and horizontal directions and performing weighted addition of the similarity degree components for nearby pixels around the interpolation target pixel as well as for the interpolation target pixel.
  • the interpolation processing unit 17 performs weighted addition of the results obtained by implementing weighted addition of the similarity degree components at the interpolation target pixel and the nearby pixels (Cv0[i,j], Cv0[i-1,j-1], Cv0[i-1,j+1], Cv0[i+1,j-1], Cv0[i+1,j+1] and the like), through either (method 1) or (method 2) detailed below, to obtain a similarity degree Cv[i,j] along the vertical direction and a similarity degree Ch[i,j] along the horizontal direction manifesting at the interpolation target pixel.
  • (method 1)
  • Cv[i,j] = (4×Cv0[i,j]+Cv0[i-1,j-1]+Cv0[i-1,j+1]+Cv0[i+1,j-1]+Cv0[i+1,j+1])/8 formula 25
  • Ch[i,j] = (4×Ch0[i,j]+Ch0[i-1,j-1]+Ch0[i-1,j+1]+Ch0[i+1,j-1]+Ch0[i+1,j+1])/8 formula 26
  • (method 2)
  • Cv[i,j] = (4×Cv0[i,j]+2×(Cv0[i-1,j-1]+Cv0[i+1,j-1]+Cv0[i-1,j+1]+Cv0[i+1,j+1])+Cv0[i,j-2]+Cv0[i,j+2]+Cv0[i-2,j]+Cv0[i+2,j])/16 formula 27
  • Ch[i,j] = (4×Ch0[i,j]+2×(Ch0[i-1,j-1]+Ch0[i+1,j-1]+Ch0[i-1,j+1]+Ch0[i+1,j+1])+Ch0[i,j-2]+Ch0[i,j+2]+Ch0[i-2,j]+Ch0[i+2,j])/16 formula 28
  • (method 1) corresponds to implementing weighted addition of the similarity degree components at the interpolation target pixel and the nearby pixels as illustrated in FIG. 6A
  • (method 2) corresponds to implementing weighted addition of the similarity degree components at the interpolation target pixel and the nearby pixels as illustrated in FIG. 6B.
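  • a rough sketch of these two weightings (Python; the function names are illustrative, and Cv0/Ch0 are assumed to hold the per-pixel results of formulae 23 and 24) may make the difference between (method 1) and (method 2) concrete:

        def weight_method1(C0, i, j):
            # (method 1): the interpolation target pixel weighted 4x and the four
            # diagonally adjacent pixels weighted 1x, as illustrated in FIG. 6A
            return (4 * C0[i, j]
                    + C0[i - 1, j - 1] + C0[i - 1, j + 1]
                    + C0[i + 1, j - 1] + C0[i + 1, j + 1]) / 8

        def weight_method2(C0, i, j):
            # (method 2): the target pixel weighted 4x, the diagonal neighbors 2x and
            # the pixels two positions away 1x (FIG. 6B), covering a wider range
            return (4 * C0[i, j]
                    + 2 * (C0[i - 1, j - 1] + C0[i + 1, j - 1]
                           + C0[i - 1, j + 1] + C0[i + 1, j + 1])
                    + C0[i, j - 2] + C0[i, j + 2]
                    + C0[i - 2, j] + C0[i + 2, j]) / 16

        # e.g. Cv[i, j] = weight_method2(Cv0, i, j); Ch[i, j] = weight_method2(Ch0, i, j)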
  • the similarity degree components each calculated by using color information corresponding to the same color component such as the G-G similarity degree components, B-B (R-R) similarity degree component and the R-R (B-B) similarity degree components (hereafter referred to as “same-color similarity degree components”) have been confirmed through testing to be suitable for use in the evaluation of similarity manifesting in an image with a low spatial frequency and a large colored area.
  • the similarity degree components each calculated by using color information corresponding to different color components such as the G-R (G-B) similarity degree components and B-G (R-G) similarity degree components (hereafter referred to as “different-color similarity degree components”) have been confirmed through testing to be suitable for use in the evaluation of similarity manifesting in an image with a high spatial frequency and a large achromatic image area.
  • the luminance similarity degree components have been confirmed through testing to be suitable for use in the evaluation of similarity manifesting in an image containing both a colored area and an image area with a fairly high spatial frequency.
  • the evaluation of similarity manifesting in various types of images can be achieved with a high degree of accuracy by using similarity degrees obtained through weighted addition of same-color similarity degree components, different-color similarity degree components and luminance similarity degree components.
  • the functions of the three types of similarity degree components calculated as the same-color similarity degree components (the G-G similarity degree components, the B-B (R-R) similarity degree components and the R-R (B-B) similarity degree components) in the similarity evaluation can be complemented by one another and the functions of the two types of similarity degree components calculated as the different-color similarity degree components (the G-R (G-B) similarity degree components and the B-G (R-G) similarity degree components) in the similarity evaluation, too, can be complemented by each other.
  • the vertical similarity degree Cv[i,j] and the horizontal similarity degree Ch[i,j] are calculated through weighted addition of the results of weighted addition of similarity degree components at the interpolation target pixel and the results of weighted addition of similarity degree components at nearby pixels.
  • the continuity between the color information at the interpolation target pixel and the color information at the pixels located near the interpolation target pixel is readily reflected in the vertical similarity degree Cv[i,j] and the horizontal similarity degree Ch[i,j].
  • the vertical similarity degree Cv[i,j] and the horizontal similarity degree Ch[i,j] calculated through (method 2) reflect color information corresponding to the color components at pixels over a wide range and thus, are effective in the similarity evaluation of an image manifesting a pronounced magnification chromatic aberration.
  • the vertical similarity degree Cv[i,j] and the horizontal similarity degree Ch[i,j] in the first embodiment indicate more marked similarity as their values become smaller.
  • the interpolation processing unit 17 compares the similarity along the vertical direction and the similarity along the horizontal direction manifesting at the interpolation target pixel (hereafter referred to as the “vertical/horizontal similarity”) based upon the vertical similarity degree Cv[i,j] and the horizontal similarity degree Ch[i,j] (FIG. 4 S 2 ). Then, it sets one of the following values for an index HV[i,j] which indicates the vertical/horizontal similarity based upon the results of the comparison.
  • the threshold value T1 is used to prevent an erroneous judgment that the similarity along either direction is more marked from being made due to noise when the difference between the vertical similarity degree Cv[i,j] and the horizontal similarity degree Ch[i,j] is very small. Accordingly, by setting a high value for the threshold value T1 when processing a color image with a great deal of noise, an improvement in the accuracy of the vertical/horizontal similarity judgment is achieved.
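  • a minimal sketch of this judgment (Python; it assumes that a smaller similarity degree indicates more marked similarity, and the numeric codes 1 for vertical, -1 for horizontal and 0 for neither are an assumption consistent with the index values discussed for FIG. 7):

        def judge_vertical_horizontal(Cv, Ch, T1):
            # if the two similarity degrees differ by no more than the threshold T1,
            # noise could flip the comparison, so neither direction is selected
            if abs(Cv - Ch) <= T1:
                return 0
            # smaller similarity degree = more marked similarity
            return 1 if Cv < Ch else -1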
  • the interpolation processing unit 17 calculates a similarity degree C45[i,j] along the diagonal 45° direction and a similarity degree C135[i,j] along the diagonal 135° direction for the interpolation target pixel (FIG. 4 S 6).
  • the interpolation processing unit 17 calculates a plurality of types of similarity degree components along the diagonal 45° direction and the diagonal 135° direction as defined in the following formulae 29 through 36;
  • R-R (B-B) similarity degree component along the diagonal 135° direction [0163]
  • the interpolation processing unit 17 calculates a similarity degree C45_0[i,j] along the diagonal 45° direction and a similarity degree C135_0[i,j] along the diagonal 135° direction through weighted addition of the plurality of types of similarity degree components performed along each of the two directions by using weighting coefficients b1, b2, b3 and b4, as expressed in the following formulae 37 and 38.
  • a further improvement is achieved in the accuracy with which the similarity degrees are calculated by calculating the similarity degree components along the diagonal 45° direction and the diagonal 135° direction and performing weighted addition of the similarity degree components for nearby pixels around the interpolation target pixel as well as for the interpolation target pixel.
  • the interpolation processing unit 17 performs weighted addition of the results obtained by implementing weighted addition of the similarity degree components at the interpolation target pixel and the nearby pixels (C45_0[i,j], C45_0[i-1,j-1], C45_0[i-1,j+1], C45_0[i+1,j-1], C45_0[i+1,j+1] and the like) through either (method 1) or (method 2) detailed below, to obtain a similarity degree C45[i,j] along the diagonal 45° direction and a similarity degree C135[i,j] along the diagonal 135° direction manifesting at the interpolation target pixel (equivalent to implementing weighted addition of similarity degree components at the interpolation target pixel and the nearby pixels as illustrated in FIGS. 6A and 6B).
  • C45[i,j] = (4×C45_0[i,j]+2×(C45_0[i-1,j-1]+C45_0[i+1,j-1]+C45_0[i-1,j+1]+C45_0[i+1,j+1])+C45_0[i,j-2]+C45_0[i,j+2]+C45_0[i-2,j]+C45_0[i+2,j])/16 formula 41
  • C135[i,j] = (4×C135_0[i,j]+2×(C135_0[i-1,j-1]+C135_0[i+1,j-1]+C135_0[i-1,j+1]+C135_0[i+1,j+1])+C135_0[i,j-2]+C135_0[i,j+2]+C135_0[i-2,j]+C135_0[i+2,j])/16 formula 42
  • the weighted addition of the plurality of similarity degree components and the consideration of the evaluation of similarity degrees at the nearby pixels with regard to the diagonal 45° similarity degree C45[i,j] and the diagonal 135° similarity degree C135[i,j] thus calculated achieves the same function as that with regard to the vertical similarity degree Cv[i,j] and the horizontal similarity degree Ch[i,j].
  • the diagonal 45° similarity degree C45[i,j] and the diagonal 135° similarity degree C135[i,j] in the first embodiment indicate more marked similarity as their values become smaller.
  • the interpolation processing unit 17 compares the similarity along the diagonal 45° direction and the similarity along the diagonal 135° direction manifesting at the interpolation target pixel (hereafter referred to as the “diagonal similarity”) based upon the diagonal 45° similarity degree C45[i,j] and the diagonal 135° similarity degree C135[i,j] (FIG. 4 S 7 ). Then, it sets one of the following values for an index DN[i,j] which indicates the diagonal similarity based upon the results of the comparison.
  • the interpolation processing unit 17 judges that a more marked similarity is manifested along the diagonal 45° direction than along the diagonal 135° direction and sets 1 for the index DN[i,j] (FIG.
  • the threshold value T2 is used to prevent an erroneous judgment that the similarity along either direction is more marked from being made due to noise.
  • the interpolation processing unit 17 ascertains the specific values of the index HV[i,j] indicating the vertical/horizontal similarity and the index DN[i,j] indicating the diagonal similarity (FIG. 4 S 11) and classifies the similarity manifesting at the interpolation target pixel as one of the following: case 1 through case 9.
  • FIG. 7 illustrates the directions along which marked similarity manifests, as indicated by the values of HV[i,j], DN[i,j]
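  • one way to read FIG. 7 (the pairing below is an assumption consistent with the case-by-case descriptions that follow rather than a reproduction of the figure) is as a table from the index pair (HV[i,j], DN[i,j]) to the directions along which marked similarity manifests:

        SIMILAR_DIRECTIONS = {
            ( 1,  1): "vertical and diagonal 45°",
            ( 1,  0): "vertical",
            ( 1, -1): "vertical and diagonal 135°",
            (-1,  1): "horizontal and diagonal 45°",
            (-1,  0): "horizontal",
            (-1, -1): "horizontal and diagonal 135°",
            ( 0,  1): "diagonal 45°",
            ( 0,  0): "no single marked direction",
            ( 0, -1): "diagonal 135°",
        }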
  • the interpolation processing unit 17 calculates the green color interpolation value G[i,j] as indicated below based upon the results of the judgment explained above.
  • Gv[i,j] = (G[i,j-1]+G[i,j+1])/2+(2×Z[i,j]-Z[i,j-2]-Z[i,j+2])/8+(2×G[i-1,j]-G[i-1,j-2]-G[i-1,j+2]+2×G[i+1,j]-G[i+1,j-2]-G[i+1,j+2])/16 formula 43
  • Gv45[i,j] = (G[i,j-1]+G[i,j+1])/2+(2×Z[i,j]-Z[i,j-2]-Z[i,j+2])/8+(2×Z[i-1,j+1]-Z[i-1,j-1]-Z[i-1,j+3]+2×Z[i+1,j-1]-Z[i+1,j-3]-Z[i+1,j+1])/16 formula 44
  • Gv135[i,j] = (G[i,j-1]+G[i,j+1])/2+(2×Z[i,j]-Z[i,j-2]-Z[i,j+2])/8+(2×Z[i-1,j-1]-Z[i-1,j-3]-Z[i-1,j+1]+2×Z[i+1,j+1]-Z[i+1,j-1]-Z[i+1,j+3])/16 formula 45
  • Gh[i,j] = (G[i-1,j]+G[i+1,j])/2+(2×Z[i,j]-Z[i-2,j]-Z[i+2,j])/8+(2×G[i,j-1]-G[i-2,j-1]-G[i+2,j-1]+2×G[i,j+1]-G[i-2,j+1]-G[i+2,j+1])/16 formula 46
  • Gh45[i,j] = (G[i-1,j]+G[i+1,j])/2+(2×Z[i,j]-Z[i-2,j]-Z[i+2,j])/8+(2×Z[i+1,j-1]-Z[i-1,j-1]-Z[i+3,j-1]+2×Z[i-1,j+1]-Z[i-3,j+1]-Z[i+1,j+1])/16 formula 47
  • Gh135[i,j] = (G[i-1,j]+G[i+1,j])/2+(2×Z[i,j]-Z[i-2,j]-Z[i+2,j])/8+(2×Z[i-1,j-1]-Z[i-3,j-1]-Z[i+1,j-1]+2×Z[i+1,j+1]-Z[i-1,j+1]-Z[i+3,j+1])/16 formula 48
  • the first term constitutes the “local average information of the green color component” and corresponds to the primary terms in formulae 1 and 2.
  • the second term represents the “local curvature information based upon a color component matching a color component at the interpolation target pixel”
  • the third term represents the “local curvature information based upon a color component other than the color component at the interpolation target pixel.” It is to be noted that the curvature information in the second and third terms is obtained through quadratic differentiation of the color components.
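  • a short sketch of formulae 43 and 46 (Python; G is assumed to be the green plane holding values only where green color information was captured, Z the plane holding the red or blue value present at [i,j], and handling of the image borders is omitted) shows how the three terms are combined:

        def gv(G, Z, i, j):
            average = (G[i, j - 1] + G[i, j + 1]) / 2                   # local average of green
            same    = (2 * Z[i, j] - Z[i, j - 2] - Z[i, j + 2]) / 8     # curvature of the color present at [i,j]
            other   = (2 * G[i - 1, j] - G[i - 1, j - 2] - G[i - 1, j + 2]
                       + 2 * G[i + 1, j] - G[i + 1, j - 2] - G[i + 1, j + 2]) / 16
            return average + same + other                               # formula 43

        def gh(G, Z, i, j):
            average = (G[i - 1, j] + G[i + 1, j]) / 2
            same    = (2 * Z[i, j] - Z[i - 2, j] - Z[i + 2, j]) / 8
            other   = (2 * G[i, j - 1] - G[i - 2, j - 1] - G[i + 2, j - 1]
                       + 2 * G[i, j + 1] - G[i - 2, j + 1] - G[i + 2, j + 1]) / 16
            return average + same + other                               # formula 46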
  • the difference between the color information Z[i-1,j+1] and the color information Z[i-1,j-1] and the difference between the color information Z[i-1,j+3] and the color information Z[i-1,j+1] are obtained, and the difference between these two differences is then ascertained; likewise, the difference between the color information Z[i+1,j-1] and the color information Z[i+1,j-3] and the difference between the color information Z[i+1,j+1] and the color information Z[i+1,j-1] are obtained, and the difference between these two differences is then ascertained.
  • the “local curvature information based upon a color component matching the color component at the interpolation target pixel” is local curvature information with directionality manifesting along the vertical direction
  • the “local curvature information based upon a color component other than the color component at the interpolation target pixel” is local curvature information with directionality manifesting along the vertical direction and the diagonal 45° direction.
  • the “local curvature information based upon a color component matching the color component at the interpolation target pixel” is local curvature information with directionality manifesting along the vertical direction
  • the “local curvature information based upon a color component other than the color component at the interpolation target pixel” is local curvature information with directionality manifesting along the vertical direction and the diagonal 135° direction.
  • the “local curvature information based upon a color component matching the color component at the interpolation target pixel” is local curvature information with the directionality manifesting along the horizontal direction
  • the “local curvature information based upon a color component other than the color component at the interpolation target pixel” is local curvature information with directionality manifesting along the horizontal direction and the diagonal 45° direction.
  • the “local curvature information based upon a color component matching the color component at the interpolation target pixel” is local curvature information with directionality manifesting along the horizontal direction
  • the “local curvature information based upon a color component other than the color component at the interpolation target pixel” is local curvature information with directionality manifesting along the horizontal direction and the diagonal 135° direction.
  • the “local curvature information based upon a color component matching the color component at the interpolation target pixel” and the “local curvature information based upon a color component other than the color component at the interpolation target pixel” in Gv[i,j] are both local curvature information with directionality manifesting along the vertical direction
  • the “local curvature information based upon a color component matching the color component at the interpolation target pixel” and the “local curvature information based upon a color component other than the color component at the interpolation target pixel” in Gh[i,j] are both local curvature information with directionality manifesting along the horizontal direction.
  • the local average information of the green color component is corrected by using the “local curvature information based upon a color component matching the color component at the interpolation target pixel” and the “local curvature information based upon a color component other than the color component at the interpolation target pixel.”
  • the local average information of the green color component (the primary term) is corrected by using the local curvature information based upon the red color component and the local curvature information based upon the blue color component at phases that are opposite from each other.
  • color information in the individual color components to be used to calculate the local curvature information corresponding to each color component is obtained from pixels that are present on both sides of a line drawn along a direction judged to manifest marked similarity.
  • the primary term can be corrected for a desired pixel even if there is magnification chromatic aberration at the photographic optical system 12 , with the overshoot and the undershoot occurring as a result of the G interpolation processing disclosed in U.S. Pat. No. 5,629,734 canceled out by each other. Consequently, the occurrence of color artifacts attributable to over correction can be reduced in the first embodiment.
  • overshoot values corresponding to the individual color components are averaged in the first embodiment and thus, the average value does not exceed an overshoot value resulting from the G interpolation processing disclosed in U.S. Pat. No. 5,629,734.
  • the undershoot value in the first embodiment never exceeds the undershoot value resulting from the G interpolation processing disclosed in U.S. Pat. No. 5,629,734.
  • the image data to undergo the G interpolation processing are arranged in a Bayer array as shown in FIGS. 2A and 2B, with the color information corresponding to the red color component and the color information corresponding to the blue color component positioned diagonally to each other.
  • the local curvature information based upon the red color component to be used to correct the primary term is calculated by using color information corresponding to the red color component at pixels positioned along a diagonal direction along which marked similarity to the interpolation target pixel manifests.
  • the green color interpolation value is calculated by using the color information at pixels set along a diagonal direction distanced from the interpolation target pixel, such as Z[i-1,j+3] and Z[i+1,j-3] in formula 44, Z[i-1,j-3] and Z[i+1,j+3] in formula 45, Z[i+3,j-1] and Z[i-3,j+1] in formula 47 and Z[i-3,j-1] and Z[i+3,j+1] in formula 48.
  • the interpolation processing unit 17 achieves a high degree of accuracy in the judgement of the diagonal similarity by using a plurality of sets of color information when calculating a plurality of types of similarity degree components along the diagonal 45° direction and the diagonal 135° direction.
  • the red color component, the green color component and the blue color component may have relative positional offsets in order of wavelength, i.e., in the order of red, green and blue, with the green color component positioned between the red color component and the blue color component.
  • the local curvature information based upon the green color component can be used as a component at a phase opposite from the phase of the local curvature information based upon the red color component to reduce the occurrence of color artifacts resulting from over correction.
  • the local curvature information based upon the green color component can be used as a component at a phase opposite from the phase of the local curvature information based upon the blue color component to reduce the occurrence of color artifacts resulting from over correction.
  • a known example of the RB interpolation processing in the prior art is linear interpolation processing implemented in a color difference space in which, after calculating color differences at all the pixels (values each obtained by subtracting the value indicated by color information corresponding to the green color component from the value indicated by color information corresponding to the red color component (or the blue color component)), one of the three different types of processing (1) through (3) described below is implemented on each interpolation target pixel to calculate the interpolation value.
  • the interpolation target value is calculated as a value achieved by adding the color information corresponding to the green color component at the interpolation target pixel to the average of the color differences at the two pixels.
  • the interpolation target value is calculated as a value achieved by adding the value indicated by the color information corresponding to the green color component at the interpolation target pixel to the average of the color differences at the two pixels.
  • the interpolation target value is calculated as a value achieved by adding the value indicated by the color information corresponding to the green color component at the interpolation target pixel to the average of the color differences at the four pixels.
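  • in code form, this prior-art approach reduces to something along the following lines (Python; which neighboring pixels contribute in cases (1) through (3) depends on the position of the interpolation target pixel and is not spelled out here):

        def interpolate_prior_art(G_at_target, neighbor_pairs):
            """neighbor_pairs: (R, G) value pairs at the nearby pixels where the
            red (or blue) color component is present (two or four pairs)."""
            diffs = [r - g for r, g in neighbor_pairs]        # color differences R - G
            return G_at_target + sum(diffs) / len(diffs)      # add the average back to green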
  • the pixel marked X alone is interpolated by using the median value (median) of the values at four nearby pixels, the pixels marked O are each interpolated by using the average of the values at the pixels adjacent along the horizontal direction, and the pixels marked with a third symbol are each interpolated by using the average of the values at the pixels adjacent along the vertical direction.
  • the interpolation processing on the green color component which is equivalent to the luminance component with a high spatial frequency can be implemented with a very high degree of accuracy by using similarity manifesting between the interpolation target pixel and nearby pixels and calculating the interpolation value using a plurality of color components, as explained earlier.
  • the interpolation processing on the red color component and the blue color component is achieved through linear interpolation in color difference spaces relative to the green color component to reduce color artifacts by reflecting the high-frequency information in the image data in the red color component and the blue color component.
  • R2 = (R1+R3)/2+(2×G2-G1-G3)/2 formula 49.
  • G2 represents color information corresponding to the green color component in the original image and G1 and G3 each represent a green color interpolation value obtained through the G interpolation processing.
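  • expanding formula 49 shows that it amounts to interpolation in the color difference space described above: (R1+R3)/2 + (2×G2 - G1 - G3)/2 = G2 + ((R1 - G1) + (R3 - G3))/2, i.e., the average of the color differences at the two neighboring pixels is added back to the green value at the interpolation target pixel.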
  • this RB interpolation processing poses a problem in that the color artifact is allowed to remain in the vicinity of an isolated point (an image area manifesting only slight similarity to nearby pixels and having a high spatial frequency).
  • this type of color artifact is often eliminated in post processing, in which the a and b hue planes obtained by converting the image data to the Lab colorimetric system individually undergo median filtering after the G interpolation processing and the RB interpolation processing are implemented.
  • RB interpolation processing through which red color and blue color interpolation values can be calculated quickly with a high degree of accuracy, without allowing any color artifacts to remain in the vicinity of an isolated point or losing the color structure, is proposed. It is to be noted that the following is an explanation of only the R interpolation processing in the RB interpolation processing, given in reference to FIG. 5.
  • the interpolation processing unit 17 calculates a color difference that contains the red color component for each pixel at which color information corresponding to the red color component is present by subtracting the green color interpolation value (the value obtained through the G interpolation processing explained earlier) from the value indicated by the color information corresponding to the red color component (FIG. 5 S 1 ).
  • the interpolation processing unit 17 calculates a color difference Cr[i,j] containing the red color component at a pixel at given coordinates [i,j] with color information corresponding to the red color component as;
  • the color differences containing the red color component are set so as to surround pixels at which color information corresponding to the red color component is missing and color information corresponding to the blue color component is present from the four diagonal directions.
  • the interpolation processing unit 17 interpolates the color difference containing the red color component for each of the pixels surrounded by color differences containing the red color component from the four diagonal directions (each pixel at which color information corresponding to the red color component is missing and color information corresponding to the blue color component is present in the first embodiment) by using the median of the color differences containing the red color component at the pixels set diagonally to the target pixel (FIG. 5 S 2 ).
  • median{ } represents a function through which the median of a plurality of elements is calculated and, if there are an even number of elements, it takes the average of the two middle elements.
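  • for instance (a made-up numeric illustration of that even-count behavior):

        from statistics import median
        print(median([10, 2, 7, 40]))    # sorted: 2, 7, 10, 40 -> (7 + 10) / 2 = 8.5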
  • the color differences containing the red color component are set so as to surround pixels at which color information corresponding to the red color component and color information corresponding to the blue color component are both missing from the four directions; i.e., from above, from below and from the left and the right.
  • the interpolation processing unit 17 interpolates the color difference containing the red color component for each of the pixels surrounded by color differences containing the red color component from the four directions; i.e., from above, from below and from the left and the right (each pixel at which color information corresponding to the red color component and color information corresponding to the blue color component are both missing in the first embodiment) by using the median of the color differences containing the red color component at the pixels set above, below and to the left and the right of the pixel (FIG. 5 S 3 ).
  • the interpolation processing unit 17 converts the color difference containing the red color component calculated through formula 51 or formula 52 for each pixel at which color information corresponding to the red color component is missing to a red color interpolation value by using color information corresponding to the green color component (or the green color interpolation value) (FIG. 5 S 4 ).
  • the interpolation processing unit 17 calculates the red color interpolation value R[m,n] for the pixel at given coordinates [m,n] by adding the color information corresponding to the green color component (or the green color interpolation value) G[m,n] to the color difference Cr[m,n], i.e., R[m,n] = Cr[m,n] + G[m,n].
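  • the R interpolation steps S 1 through S 4 in FIG. 5 can be condensed into a sketch along the following lines (Python; is_r and is_b are boolean masks marking the pixels carrying red and blue color information, image borders are ignored, and the function name is illustrative):

        import numpy as np
        from statistics import median

        def interpolate_red(R, G, is_r, is_b):
            """R: red values where present; G: the full green plane after the
            G interpolation processing; returns the interpolated red plane."""
            h, w = G.shape
            Cr = np.zeros_like(G, dtype=float)
            Cr[is_r] = R[is_r] - G[is_r]                       # S1: color differences Cr = R - G
            for i in range(1, h - 1):
                for j in range(1, w - 1):
                    if is_b[i, j]:                             # S2: surrounded by Cr diagonally
                        Cr[i, j] = median([Cr[i - 1, j - 1], Cr[i - 1, j + 1],
                                           Cr[i + 1, j - 1], Cr[i + 1, j + 1]])
            for i in range(1, h - 1):
                for j in range(1, w - 1):
                    if not is_r[i, j] and not is_b[i, j]:      # S3: surrounded above/below/left/right
                        Cr[i, j] = median([Cr[i, j - 1], Cr[i, j + 1],
                                           Cr[i - 1, j], Cr[i + 1, j]])
            return Cr + G                                      # S4: convert back to red values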
  • the median processing described above is implemented on the color differences representing the hue alone and is not implemented on the luminance component.
  • the color differences containing the red color component at the pixels marked X are calculated by using the color differences Cr over a 3×5 range, and thus, the color difference containing the red color component at the pixel marked O represents a value which is close to the results of median processing implemented by weighting the color differences Cr within the 3×5 range.
  • the color differences containing the red color component at the pixels marked X are calculated by using the color differences Cr over a 5×3 range, and thus, the color difference containing the red color component at the pixel marked with the other symbol represents a value which is close to the results of median processing implemented by weighting the color differences Cr within the 5×3 range.
  • RB interpolation processing is implemented after the G interpolation processing in the first embodiment
  • RB interpolation processing similar to that in the embodiment can be implemented without having to perform G interpolation processing on image data provided in the YCbCr colorimetric system with Y, Cb and Cr culled at a ratio of 4:2:0 since the luminance component Y is left intact in the image data.
  • the second embodiment may be adopted when implementing processing on an interpolation target pixel at which the blue color component is present, as shown in FIG. 2B.
  • the interpolation processing unit 17 ascertains the degrees of similarity manifesting at the interpolation target pixel as in the first embodiment (corresponds to FIG. 4 S 1 through S 11) and classifies the type of the similarity at the interpolation target pixel as one of cases 1 through 9 explained earlier. Then, the interpolation processing unit 17 calculates the green color interpolation value G[i,j] as indicated below.
  • Gv135[i,j] = gv[i,j]+αred×δRv135[i,j]+αgreen×δGv[i,j]+αblue×δBv135[i,j] formula 56
  • Gh45[i,j] = gh[i,j]+αred×δRh45[i,j]+αgreen×δGh[i,j]+αblue×δBh45[i,j] formula 58
  • the local average information of the green color component and the local curvature information based upon the individual color components are calculated as indicated below, depending upon the direction along which similarity manifests.
  • FIGS. 13 and 14 show the positions of the color information used when calculating local curvature information based upon the individual color components. Namely, local curvature information corresponding to a given color component is obtained through weighted addition of the components of the curvature information calculated by using color information at the pixels contained within the area enclosed by the oval in FIG. 13 or 14 .
  • δRv45[i,j] is local curvature information with directionality manifesting along the vertical direction and the diagonal 45° direction
  • δRv[i,j] is local curvature information with directionality manifesting along the vertical direction
  • δRv135[i,j] is local curvature information with directionality manifesting along the vertical direction and the diagonal 135° direction
  • δRh45[i,j] is local curvature information with directionality manifesting along the horizontal direction and the diagonal 45° direction
  • δRh[i,j] is local curvature information with directionality manifesting along the horizontal direction
  • δRh135[i,j] is local curvature information with directionality manifesting along the horizontal direction and the diagonal 135° direction.
  • δGv[i,j] is local curvature information with directionality manifesting along the vertical direction and δGh[i,j] is local curvature information with directionality manifesting along the horizontal direction.
  • δBv45[i,j] is local curvature information with directionality manifesting along the vertical direction and the diagonal 45° direction
  • δBv135[i,j] is local curvature information with directionality manifesting along the vertical direction and the diagonal 135° direction
  • δBh45[i,j] is local curvature information with directionality manifesting along the horizontal direction and the diagonal 45° direction
  • δBh135[i,j] is local curvature information with directionality manifesting along the horizontal direction and the diagonal 135° direction.
  • the method of calculating the local curvature information based upon the red color component is changed from that adopted in the first embodiment, and the local curvature information based upon the red color component extracted from a wider range than in the first embodiment is made to undergo mild low pass filtering as appropriate while taking into consideration the directionality.
  • the overall effect for reducing over correction is improved over the first embodiment.
  • the local curvature information based upon the blue color component which prevents an over correction attributable to the local curvature information based upon the red color component when similarity manifests along the diagonal direction in the first embodiment is all substituted with local curvature information based upon the green color component.
  • the need for making a judgment with regard to similarity manifesting along the diagonal directions is eliminated to simplify the algorithm and, at the same time, the extraction of structural information at a sufficient level is achieved while preventing over correction.
  • the local curvature information based upon the green color component used to prevent over correction of the red color component in example 2 is now used as the main element in the correctional term.
  • the structural information can be extracted even when the correctional term is constituted only of the curvature information based upon the green color component.
  • the curvature information based upon the green color component too, contains a great deal of structural information equivalent to the curvature information based upon the red color component that passes through the center.
  • a correction is performed with local curvature information based upon the same color component, i.e., the green color component, as the color component of the average information constituting the primary term.
  • These settings represent an example of ratios of the coefficients effective even when the local curvature information based upon the red color component that passes through the center is not used as a measure against over-correction occurring at the settings in example 4.
  • the degree of over correction attributable to local curvature information based upon the blue color component can be reduced with the local curvature information based upon the red color component obtained from nearby pixels when similarity manifests along the diagonal directions, while achieving the advantage of extracting the local structural information as in example 3 and example 4.
  • since the closest pixels at which the green color component (the closest green color component) is present are located along the horizontal direction relative to the pixel to undergo the G interpolation processing, as shown in FIGS. 3A and 3B, in the third embodiment it is not necessary to calculate the similarity degrees or to judge the direction along which similarity manifests during the G interpolation processing, as is required in the first embodiment.
  • the calculation of similarity degrees and judgment with regard to the direction along which similarity manifests may be performed along the diagonal 45° direction and the diagonal 135° direction in which the second closest pixels with green color component (the second closest green color component) are present.
  • the interpolation processing unit 17 calculates the green color interpolation value G[i,j] through the following formula 74 based upon the image data arranged as shown in FIGS. 3A and 3B.
  • G[i,j] = (G[i-1,j]+G[i+1,j])/2+(2×Z[i,j]-Z[i-2,j]-Z[i+2,j])/8+(2×Z[i,j-1]-Z[i-2,j-1]-Z[i+2,j-1]+2×Z[i,j+1]-Z[i-2,j+1]-Z[i+2,j+1])/16 formula 74
  • the first term represents the “local average information of the green color component”, which is equivalent to the primary terms in formula 1 and formula 2.
  • the second term represents the “local curvature information based upon a color component matching the color component at the interpolation target pixel”
  • the third term represents the “local curvature information based upon a color component other than the color component at the interpolation target pixel”
  • the third term constitutes the “local curvature information based upon the blue color component” if color information corresponding to the red color component is present at the interpolation target pixel (FIG. 3A)
  • the third term constitutes the “local curvature information based upon the red color component” if color information corresponding to the blue color component is present at the interpolation target pixel (FIG. 3B).
  • the local average information of the green color component (the primary term) is corrected by using the local curvature information based upon the red color component and the local curvature information based upon the blue color component at phases opposite from each other in the third embodiment.
  • the primary term is corrected in correspondence to the average change in quantity in the color information corresponding to the red color component and the color information corresponding to the blue color component in the third embodiment (see FIGS. 9A and 9B).
  • the primary term can be corrected for a desired pixel with the overshoot and undershoot occurring in the G interpolation processing disclosed in U.S. Pat. No. 5,629,734 canceling out each other in the third embodiment. Consequently, the occurrence of color artifacts due to over correction can be reduced by adopting the third embodiment.
  • the interpolation processing unit 17 ascertains degrees of similarity manifesting at the interpolation target pixel as in the first embodiment (corresponds to FIG. 4 S 1 through S 11) and classifies the type of similarity at the interpolation target pixel as one of cases 1 through 9 explained earlier.
  • the interpolation processing unit 17 calculates the inclination Gk[i,j] of the green color component and the inclination Zk[i,j] of the red color component (or the blue color component) relative to the direction perpendicular to the direction judged to manifest marked similarity as indicated below.
  • the interpolation processing unit 17 calculates the green color interpolation value G[i,j] as follows.
  • Gvk[i,j] (G[i,j ⁇ 1]+G[i,j+1])/2+Gk[i,j]/Zk[i,j] (2 Z[i,j] ⁇ Z[i,j ⁇ 2] ⁇ Z[i,j+2])/4 formula 83
  • Ghk[i,j] (G[i ⁇ 1, j]+G[i+1,j])/2+Gk[i,j]/Zk[i,j](2 Z [i, j] ⁇ Z[i ⁇ 2,j] ⁇ Z[i+2,j])/4 formula 84.
  • the first term represents the “local average information of the green color component” which is equivalent to the primary term in formula 1 and formula 2.
  • the second term is the “local curvature information based upon a color component matching the color component at the interpolation target pixel” multiplied by a weighting coefficient (a value indicating the correlation between the inclination Gk[i,j] of the green color component and the inclination Zk[i,j] of the red color component (or the blue color component): Gk[i,j]/Zk[i,j]), and is equivalent to the correctional term.
  • the local average information of the green color component is corrected by using the “local curvature information based upon a color component matching the color component at the interpolation target pixel” multiplied by the weighting coefficient.
  • if the “local curvature information based upon a color component matching the color component at the interpolation target pixel” is added to the “local average information of the green color component” without first multiplying it by the weighting coefficient, the “local average information of the green color component”, which should be corrected along the negative direction, becomes corrected along the positive direction, as indicated in FIG. 15, resulting in an overshoot.
  • the range for the weighting coefficient may be set;
  • the closest pixels at which color information corresponding to the green color component is present are set along the horizontal direction to a pixel to undergo the G interpolation processing as shown in FIGS. 3A and 3B
  • interpolation processing is achieved in the simplest manner by using the color information at the pixels set along the horizontal direction. Accordingly, the green color interpolation value G[i,j] is calculated in the fifth embodiment as in case 8 in the fourth embodiment.
  • the interpolation processing unit 17 calculates the green color interpolation value G[i,j] through formula 85.
  • G[i,j] = (G[i-1,j]+G[i+1,j])/2+Gk[i,j]/Zk[i,j]×(2×Z[i,j]-Z[i-2,j]-Z[i+2,j])/4 formula 85,
  • the local average information of the green color component is corrected by using the “local curvature information based upon a color component matching the color component at the interpolation target pixel” multiplied by the weighting coefficient (a value representing the correlation between the inclination Gk[i,j] corresponding to the green color component and the inclination Zk[i,j] corresponding to the red color component (or the blue color component): Gk[i,j]/Zk[i,j]) as in the fourth embodiment.
  • the weighting coefficient is a value representing the correlation between the inclination Gk[i,j] corresponding to the green color component and the inclination Zk[i,j] corresponding to the red color component (or the blue color component): Gk[i,j]/Zk[i,j].
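  • a sketch of formula 85 (Python; Gk and Zk are assumed to have been computed beforehand for the pixel, since their defining formulae are not reproduced here, and Zk is assumed to be kept away from zero so that the ratio Gk/Zk stays within its permitted range):

        def g_interpolate_weighted(G, Z, Gk, Zk, i, j):
            weight  = Gk / Zk                                          # correlation between the two inclinations
            average = (G[i - 1, j] + G[i + 1, j]) / 2                  # local average of the green color component
            curve   = (2 * Z[i, j] - Z[i - 2, j] - Z[i + 2, j]) / 4    # curvature of the color component at [i,j]
            return average + weight * curve                            # formula 85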
  • G interpolation processing and RB interpolation processing can be achieved in a similar manner by using a color ratio or the like as a hue instead of a color difference.
  • curvature information based upon each color component is calculated through quadratic differentiation
  • the present invention is not limited to this example, and curvature information may be obtained through differentiation of a higher order. In other words, any method may be adopted as long as the degree of change in the rate of change occurring in each color component is ascertained.
  • FIG. 16 is a functional block diagram representing the sixth embodiment.
  • the same reference numbers are assigned to components achieving identical functions to those in the functional block diagram in FIG. 1 to preclude the necessity for repeated explanation of their structures.
  • the structure of an electronic camera 20 shown in FIG. 16 differs from that of the electronic camera 10 in FIG. 1 in that a control unit 21 and an image processing unit 22 in FIG. 16 replace the control unit 11 and the image processing unit 15 in FIG. 1, with an interface unit 23 in FIG. 16 provided as an additional component.
  • a personal computer 30 is provided with a CPU 31 , an interface unit 32 , a hard disk 33 , a memory 34 , a CD-ROM drive device 35 and a communication interface unit 36 with the CPU 31 connected to the interface unit 32 , the hard disk 33 , the memory 34 , the CD-ROM drive device 35 and the communication interface unit 36 via a bus.
  • an interpolation processing program (an interpolation processing program for executing interpolation processing similar to that implemented at the interpolation processing unit 17 in the various embodiments explained earlier) recorded at a recording medium such as a CD-ROM 37 is pre-installed at the personal computer 30 via the CD-ROM drive device 35 .
  • the interpolation processing program is stored at the hard disk 33 in an execution-ready state.
  • image data generated as in the electronic camera 10 shown in FIG. 1 are provided to the image processing unit 22 in the electronic camera 20 .
  • the image data undergo image processing (e.g., gradation conversion processing) other than interpolation processing at the image processing unit 22 , and the image data having undergone the image processing are then recorded at the recording unit 16 in an image file format.
  • This image file is provided to the personal computer 30 via the interface unit 23 .
  • the CPU 31 in the personal computer 30 executes the interpolation processing program.
  • the image data with resolutions corresponding to the individual color components enhanced through the interpolation processing then undergo image compression and the like as necessary, are recorded at the hard disk 33 or the like and are finally output as data in a colorimetric system corresponding to the type of individual device connected, such as a display or a printer.
  • the recording medium that may be used is not limited to a CD-ROM and any of various types of recording media including magnetic tape and a DVD may be used instead.
  • programs may be provided via a transmission medium such as a communication line 38 , a typical example of which is the Internet.
  • the programs may be transmitted after first being converted to signals on a carrier wave carried through a transmission medium.
  • the personal computer 30 shown in FIG. 16 has such a function as well.
  • the personal computer 30 is provided with the communication interface unit 36 that connects with the communication line 38 .
  • a server computer 39 which provides the interpolation processing program, has the interpolation processing program stored at a recording medium such as an internal hard disk.
  • the communication line 38 may be a communication line for connection with the Internet or for a personal computer communication or it may be a dedicated communication line 38 .
  • the communication line 38 may be a telephone line or a wireless telephone line for a mobile telephone or the like.
  • the interpolation processing program according to the present invention that is executed within the electronic camera 10 in FIG. 1 is normally installed in a ROM (not shown) or the like at the time of camera production.
  • the ROM in which the interpolation processing program is installed may be an overwritible ROM, and the electronic camera may be then connected to a computer assuming a structure similar to that shown in FIG. 16, to allow an upgrade program to be provided from a recording medium such as a CD-ROM via the computer.
  • an upgrade program may be obtained via the Internet or the like as described earlier.
US09/877,002 1999-12-21 2001-06-11 Interpolation processing apparatus and recording medium having interpolation processing program recorded therein Abandoned US20020001409A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US11/367,583 US7236628B2 (en) 1999-12-21 2006-03-06 Interpolation processing apparatus and recording medium having interpolation processing program recording therein
US11/477,666 US7362897B2 (en) 1999-12-21 2006-06-30 Interpolation processing apparatus and recording medium having interpolation processing program recorded therein

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP36300799 1999-12-21
JP11-363007 1999-12-21
JP2000-204768 2000-07-06
JP2000204768A JP4599672B2 (ja) 1999-12-21 2000-07-06 補間処理装置および補間処理プログラムを記録した記録媒体
PCT/JP2000/009040 WO2001047244A1 (fr) 1999-12-21 2000-12-20 Dispositif d'interpolation et support d'enregistrement sur lequel un programme d'interpolation est enregistre

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2000/009040 Continuation WO2001047244A1 (fr) 1999-12-21 2000-12-20 Dispositif d'interpolation et support d'enregistrement sur lequel un programme d'interpolation est enregistre

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/367,583 Continuation US7236628B2 (en) 1999-12-21 2006-03-06 Interpolation processing apparatus and recording medium having interpolation processing program recording therein

Publications (1)

Publication Number Publication Date
US20020001409A1 true US20020001409A1 (en) 2002-01-03

Family

ID=26581442

Family Applications (3)

Application Number Title Priority Date Filing Date
US09/877,002 Abandoned US20020001409A1 (en) 1999-12-21 2001-06-11 Interpolation processing apparatus and recording medium having interpolation processing program recorded therein
US11/367,583 Expired - Lifetime US7236628B2 (en) 1999-12-21 2006-03-06 Interpolation processing apparatus and recording medium having interpolation processing program recording therein
US11/477,666 Expired - Fee Related US7362897B2 (en) 1999-12-21 2006-06-30 Interpolation processing apparatus and recording medium having interpolation processing program recorded therein

Family Applications After (2)

Application Number Title Priority Date Filing Date
US11/367,583 Expired - Lifetime US7236628B2 (en) 1999-12-21 2006-03-06 Interpolation processing apparatus and recording medium having interpolation processing program recording therein
US11/477,666 Expired - Fee Related US7362897B2 (en) 1999-12-21 2006-06-30 Interpolation processing apparatus and recording medium having interpolation processing program recorded therein

Country Status (3)

Country Link
US (3) US20020001409A1 (ja)
JP (1) JP4599672B2 (ja)
WO (1) WO2001047244A1 (ja)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030052981A1 (en) * 2001-08-27 2003-03-20 Ramakrishna Kakarala Digital image system and method for implementing an adaptive demosaicing method
US6724945B1 (en) 2000-05-24 2004-04-20 Hewlett-Packard Development Company, L.P. Correcting defect pixels in a digital image
US20050073591A1 (en) * 2001-03-05 2005-04-07 Kenichi Ishiga Image processing device and image processing program
US20060050956A1 (en) * 2004-09-09 2006-03-09 Fuji Photo Film Co., Ltd. Signal processing apparatus, signal processing method, and signal processing program
US20060055794A1 (en) * 2002-05-15 2006-03-16 Nobuyuki Sato Image processing system, and image processing method, recording medium, and program
US20060098869A1 (en) * 2003-06-30 2006-05-11 Nikon Corporation Signal correcting method
US20060188165A1 (en) * 2002-06-12 2006-08-24 Marta Karczewicz Spatial prediction based intra-coding
WO2006093266A1 (ja) 2005-03-04 2006-09-08 Nikon Corporation 色ズレを補正する画像処理装置、画像処理プログラム、画像処理方法、および電子カメラ
US20070035636A1 (en) * 2005-08-09 2007-02-15 Sunplus Technology Co., Ltd. Method and system of eliminating color noises caused by an interpolation
US20070126885A1 (en) * 2005-12-01 2007-06-07 Megachips Lsi Solutions Inc. Pixel Interpolation Method
US20070229676A1 (en) * 2006-01-16 2007-10-04 Futabako Tanaka Physical quantity interpolating method, and color signal processing circuit and camera system using the same
US20080002915A1 (en) * 2006-06-30 2008-01-03 Samsung Electronics Co., Ltd. Image processing apparatus, method and medium
US20080013629A1 (en) * 2002-06-11 2008-01-17 Marta Karczewicz Spatial prediction based intra coding
US20080123999A1 (en) * 2004-07-07 2008-05-29 Nikon Corporation Image Processor and Computer Program Product
US20080247643A1 (en) * 2003-06-12 2008-10-09 Nikon Corporation Image processing method, image processing program and image processor
US20090135267A1 (en) * 2005-09-29 2009-05-28 Nikon Corporation Image Processing Apparatus and Image Processing Method
US20090310884A1 (en) * 2005-05-19 2009-12-17 Mstar Semiconductor, Inc. Noise reduction method and noise reduction apparatus
US20110199507A1 (en) * 2009-12-17 2011-08-18 Nikon Corporation Medium storing image processing program and imaging apparatus
US8175387B1 (en) * 2007-09-19 2012-05-08 Trend Micro, Inc. Image similarity detection using approximate pattern matching
US20130028538A1 (en) * 2011-07-29 2013-01-31 Simske Steven J Method and system for image upscaling
US20130077825A1 (en) * 2011-09-27 2013-03-28 Fuji Jukogyo Kabushiki Kaisha Image processing apparatus
WO2015104667A1 (en) * 2014-01-08 2015-07-16 Marvell World Trade Ltd. Methods and apparatus for demosaicking artifacts suppression
US11233921B2 (en) * 2017-03-28 2022-01-25 Brother Kogyo Kabushiki Kaisha Image processing apparatus that specifies edge pixel in target image using single-component image data

Families Citing this family (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6950469B2 (en) 2001-09-17 2005-09-27 Nokia Corporation Method for sub-pixel value interpolation
WO2003101119A1 (fr) * 2002-05-24 2003-12-04 Nikon Corporation Procede de traitement d'images, programme de traitement d'images et processeur d'images
JP2004241991A (ja) * 2003-02-05 2004-08-26 Minolta Co Ltd 撮像装置、画像処理装置及び画像処理プログラム
US7440016B2 (en) * 2003-12-22 2008-10-21 Hewlett-Packard Development Company, L.P. Method of processing a digital image
CN1857008B (zh) * 2004-02-19 2010-05-05 三菱电机株式会社 图像处理方法
JP4396332B2 (ja) * 2004-03-10 2010-01-13 セイコーエプソン株式会社 ディジタルカメラ
JP4333997B2 (ja) * 2004-08-24 2009-09-16 シャープ株式会社 画像処理装置、撮影装置、画像処理方法、画像処理プログラムおよび記録媒体
KR100721338B1 (ko) * 2004-12-30 2007-05-28 엘지전자 주식회사 디지털 촬영장치의 색상 보간법
JP4580791B2 (ja) * 2005-03-17 2010-11-17 ユニデン株式会社 画素補完方法
JP4388909B2 (ja) * 2005-04-25 2009-12-24 イーストマン コダック カンパニー 画素欠陥補正装置
TWI259728B (en) * 2005-04-26 2006-08-01 Ali Corp Method of encoding and decoding image data by applying image capturing device
WO2007007878A1 (ja) 2005-07-14 2007-01-18 Nikon Corporation 画像処理装置および画像処理方法
US7668366B2 (en) * 2005-08-09 2010-02-23 Seiko Epson Corporation Mosaic image data processing
WO2007036055A1 (en) * 2005-09-30 2007-04-05 Simon Fraser University Methods and apparatus for detecting defects in imaging arrays by image analysis
JP5183880B2 (ja) * 2006-03-14 2013-04-17 ソニー株式会社 カラーフィルタおよび撮像素子
JP4709084B2 (ja) * 2006-07-07 2011-06-22 キヤノン株式会社 画像処理装置及び画像処理方法
JP4735978B2 (ja) * 2006-07-21 2011-07-27 ソニー株式会社 画像処理装置、画像処理方法、及びプログラム
TWI332798B (en) * 2006-09-01 2010-11-01 Mstar Semiconductor Inc Method and device for reconstructing missing color component
JP4958538B2 (ja) * 2006-12-25 2012-06-20 三星電子株式会社 画像処理装置、画像処理方法、および撮像装置
JP2008294714A (ja) * 2007-05-24 2008-12-04 Panasonic Corp 映像信号処理装置
US8077234B2 (en) * 2007-07-27 2011-12-13 Kabushiki Kaisha Toshiba Image pickup device and method for processing an interpolated color signal
JP4620095B2 (ja) * 2007-08-24 2011-01-26 盛群半導體股▲ふん▼有限公司 Ymcgカラー・フィルタ・アレイに用いられる色彩補間法
JP4982897B2 (ja) * 2007-08-27 2012-07-25 株式会社メガチップス 画像処理装置
WO2009066770A1 (ja) 2007-11-22 2009-05-28 Nikon Corporation デジタルカメラおよびデジタルカメラシステム
US8564680B1 (en) 2007-12-13 2013-10-22 Marvell International Ltd. Method and apparatus for noise management for color data synthesis in digital image and video capture systems
JP4962293B2 (ja) * 2007-12-14 2012-06-27 ソニー株式会社 画像処理装置、画像処理方法、プログラム
US8229212B2 (en) * 2008-04-08 2012-07-24 Qualcomm Incorporated Interpolation system and method
JP5006835B2 (ja) * 2008-05-07 2012-08-22 ルネサスエレクトロニクス株式会社 エラー低減方法及びエラー低減装置
JP5272581B2 (ja) * 2008-08-25 2013-08-28 ソニー株式会社 画像処理装置、撮像装置、画像処理方法およびプログラム
JP2010061565A (ja) * 2008-09-05 2010-03-18 Konica Minolta Business Technologies Inc 画素補間装置、画素補間方法および画像読取装置
US8131067B2 (en) * 2008-09-11 2012-03-06 Seiko Epson Corporation Image processing apparatus, image processing method, and computer-readable media for attaining image processing
WO2010088465A1 (en) * 2009-02-02 2010-08-05 Gentex Corporation Improved digital image processing and systems incorporating the same
JP5169994B2 (ja) * 2009-05-27 2013-03-27 Sony Corp Image processing device, imaging device, and image processing method
US8463035B2 (en) * 2009-05-28 2013-06-11 Gentex Corporation Digital image processing for calculating a missing color value
JP5267445B2 (ja) * 2009-12-17 2013-08-21 Nikon Corp Image processing program and imaging device
US8447105B2 (en) * 2010-06-07 2013-05-21 Microsoft Corporation Data driven interpolation using geodesic affinity
US8633942B2 (en) * 2010-06-07 2014-01-21 Microsoft Corporation View generation using interpolated values
JP5672872B2 (ja) * 2010-09-08 2015-02-18 Ricoh Co Ltd Image forming apparatus
JP5056927B2 (ja) * 2010-09-17 2012-10-24 Sony Corp Image processing device, image processing method, and imaging device
JP5353945B2 (ja) * 2011-05-13 2013-11-27 Nikon Corp Image processing device, image processing program, and electronic camera
US9769430B1 (en) 2011-06-23 2017-09-19 Gentex Corporation Imager system with median filter and method thereof
TWI463430B (zh) * 2011-08-12 2014-12-01 Univ Nat Cheng Kung Color interpolation and image magnifying device and method thereof
TWI455063B (zh) * 2011-08-12 2014-10-01 Univ Nat Cheng Kung Color interpolation device and method thereof
CN104115211B (zh) 2012-02-14 2017-09-22 Gentex Corp High dynamic range imaging system
JP6131545B2 (ja) 2012-03-16 2017-05-24 Nikon Corp Image processing device, imaging device, and image processing program
JP6131546B2 (ja) 2012-03-16 2017-05-24 Nikon Corp Image processing device, imaging device, and image processing program
WO2013161313A1 (ja) 2012-04-25 2013-10-31 Nikon Corp Image processing device, imaging device, and image processing program
CN104429056B (zh) 2012-08-10 2017-11-14 Nikon Corp Image processing method, image processing device, imaging device, and image processing program
JP6036829B2 (ja) 2012-08-10 2016-11-30 Nikon Corp Image processing device, imaging device, and control program for an image processing device
JP5749406B2 (ja) * 2012-08-23 2015-07-15 Fujifilm Corp Image processing device, imaging device, computer, image processing method, and program
JP2014110507A (ja) * 2012-11-30 2014-06-12 Canon Inc Image processing apparatus and image processing method
CN103236036B (zh) * 2013-04-01 2016-09-14 北京开明智达科技有限责任公司深圳分公司 Direction-based color plane interpolation method
JP6302272B2 (ja) * 2014-02-06 2018-03-28 Toshiba Corp Image processing device, image processing method, and imaging device
CN106165398B (zh) 2014-04-04 2019-07-02 Nikon Corp Imaging element, imaging device, and image processing device
JP6415094B2 (ja) * 2014-04-25 2018-10-31 Canon Inc Image processing device, imaging device, image processing method, and program
JP6426909B2 (ja) * 2014-05-12 2018-11-21 Japan Broadcasting Corp (NHK) Color information interpolation device and program therefor

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4605956A (en) * 1984-09-10 1986-08-12 Eastman Kodak Company Single-chip electronic color camera with color-dependent birefringent optical spatial frequency filter and red and blue signal interpolating circuit
US4663655A (en) * 1985-08-05 1987-05-05 Polaroid Corporation Method and apparatus for reconstructing missing color samples
US4724395A (en) * 1985-08-05 1988-02-09 Polaroid Corporation Median filter for reconstructing missing color samples
JPS62190994A (ja) * 1986-02-18 1987-08-21 Fuji Photo Film Co Ltd Signal interpolation device for line-sequential color-difference video signals
US4774565A (en) * 1987-08-24 1988-09-27 Polaroid Corporation Method and apparatus for reconstructing missing color samples
JPH0548886A (ja) * 1991-08-16 1993-02-26 Konica Corp Data conversion device
JP3527291B2 (ja) * 1994-08-25 2004-05-17 Mitsubishi Electric Corp Color conversion processing device, inverse color conversion processing device, and video signal processing device
JP3611890B2 (ja) * 1995-02-20 2005-01-19 Ricoh Co Ltd Color conversion device
JP3683397B2 (ja) * 1997-07-02 2005-08-17 Fuji Photo Film Co Ltd Color image data interpolation method and device
US6091851A (en) * 1997-11-03 2000-07-18 Intel Corporation Efficient algorithm for color recovery from 8-bit to 24-bit color pixels
KR100729559B1 (ko) * 1998-01-29 2007-06-18 Koninklijke Philips Electronics N.V. Color signal interpolation
US6356276B1 (en) * 1998-03-18 2002-03-12 Intel Corporation Median computation-based integrated color interpolation and color space conversion methodology from 8-bit bayer pattern RGB color space to 12-bit YCrCb color space
JP4045645B2 (ja) * 1998-05-19 2008-02-13 Nikon Corp Interpolation processing device and recording medium on which an interpolation processing program is recorded
JP2000197067A 1998-12-28 2000-07-14 Fuji Photo Film Co Ltd Solid-state imaging device and image data generation method
US6809765B1 (en) * 1999-10-05 2004-10-26 Sony Corporation Demosaicing for digital imaging device using perceptually uniform color space
JP3905708B2 (ja) * 2001-01-26 2007-04-18 Pentax Corp Image interpolation device
US7053908B2 (en) * 2001-04-12 2006-05-30 Polaroid Corporation Method and apparatus for sensing and interpolating color image data

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4642678A (en) * 1984-09-10 1987-02-10 Eastman Kodak Company Signal processing method and apparatus for producing interpolated chrominance values in a sampled color image signal
US5534919A (en) * 1993-04-15 1996-07-09 Canon Kabushiki Kaisha Image pickup apparatus for estimating a complementary color value of a target pixel
US5534914A (en) * 1993-06-03 1996-07-09 Target Technologies, Inc. Videoconferencing system
US5541653A (en) * 1993-07-27 1996-07-30 Sri International Method and apparatus for increasing resolution of digital color images using correlated decoding
US5552827A (en) * 1993-08-31 1996-09-03 Sanyo Electric Co., Ltd. Color video camera with a solid state image sensing device
US5805216A (en) * 1994-06-06 1998-09-08 Matsushita Electric Industrial Co., Ltd. Defective pixel correction circuit
US5629734A (en) * 1995-03-17 1997-05-13 Eastman Kodak Company Adaptive color plan interpolation in single sensor color electronic camera
US5799113A (en) * 1996-01-19 1998-08-25 Microsoft Corporation Method for expanding contracted video images
US5805217A (en) * 1996-06-14 1998-09-08 Iterated Systems, Inc. Method and system for interpolating missing picture elements in a single color component array obtained from a single color sensor
US5901242A (en) * 1996-07-03 1999-05-04 Sri International Method and apparatus for decoding spatiochromatically multiplexed color images using predetermined coefficients
US6091862A (en) * 1996-11-26 2000-07-18 Minolta Co., Ltd. Pixel interpolation device and pixel interpolation method
US6130960A (en) * 1997-11-03 2000-10-10 Intel Corporation Block-matching algorithm for color interpolation
US6075889A (en) * 1998-06-12 2000-06-13 Eastman Kodak Company Computing color specification (luminance and chrominance) values for images
US6744916B1 (en) * 1998-11-24 2004-06-01 Ricoh Company, Ltd. Image processing apparatus and method for interpolating missing pixels
US6724945B1 (en) * 2000-05-24 2004-04-20 Hewlett-Packard Development Company, L.P. Correcting defect pixels in a digital image

Cited By (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6724945B1 (en) 2000-05-24 2004-04-20 Hewlett-Packard Development Company, L.P. Correcting defect pixels in a digital image
GB2364461B (en) * 2000-05-24 2004-06-23 Hewlett Packard Co Correcting defect pixels in a digital image
US7289665B2 (en) 2001-03-05 2007-10-30 Nikon Corporation Image processing device and image processing program
US20050073591A1 (en) * 2001-03-05 2005-04-07 Kenichi Ishiga Image processing device and image processing program
US7088392B2 (en) * 2001-08-27 2006-08-08 Ramakrishna Kakarala Digital image system and method for implementing an adaptive demosaicing method
US20030052981A1 (en) * 2001-08-27 2003-03-20 Ramakrishna Kakarala Digital image system and method for implementing an adaptive demosaicing method
US20060055794A1 (en) * 2002-05-15 2006-03-16 Nobuyuki Sato Image processing system, and image processing method, recording medium, and program
US7826658B2 (en) * 2002-05-15 2010-11-02 Sony Corporation Image processing system, image processing method, image processing recording medium, and program suitable for extraction processing
US20080013629A1 (en) * 2002-06-11 2008-01-17 Marta Karczewicz Spatial prediction based intra coding
US20060188165A1 (en) * 2002-06-12 2006-08-24 Marta Karczewicz Spatial prediction based intra-coding
US20080247643A1 (en) * 2003-06-12 2008-10-09 Nikon Corporation Image processing method, image processing program and image processor
US7630546B2 (en) * 2003-06-12 2009-12-08 Nikon Corporation Image processing method, image processing program and image processor
US7684615B2 (en) * 2003-06-30 2010-03-23 Nikon Corporation Signal correcting method
US20100142817A1 (en) * 2003-06-30 2010-06-10 Nikon Corporation Signal correcting method
US7856139B2 (en) 2003-06-30 2010-12-21 Nikon Corporation Signal correcting method
US20060098869A1 (en) * 2003-06-30 2006-05-11 Nikon Corporation Signal correcting method
US7957588B2 (en) * 2004-07-07 2011-06-07 Nikon Corporation Image processor and computer program product
US20080123999A1 (en) * 2004-07-07 2008-05-29 Nikon Corporation Image Processor and Computer Program Product
US20060050956A1 (en) * 2004-09-09 2006-03-09 Fuji Photo Film Co., Ltd. Signal processing apparatus, signal processing method, and signal processing program
EP1855486A1 (en) * 2005-03-04 2007-11-14 Nikon Corporation Image processor correcting color misregistration, image processing program, image processing method, and electronic camera
US7945091B2 (en) 2005-03-04 2011-05-17 Nikon Corporation Image processor correcting color misregistration, image processing program, image processing method, and electronic camera
US20090207271A1 (en) * 2005-03-04 2009-08-20 Nikon Corporation Image Processor Correcting Color Misregistration, Image Processing Program, Image Processing Method, and Electronic Camera
EP1855486A4 (en) * 2005-03-04 2011-08-03 Nikon Corp Image processor correcting color misregistration, image processing program, image processing method, and electronic camera
EP3258687A1 (en) * 2005-03-04 2017-12-20 Nikon Corporation Image processor correcting color misregistration, image processing method, and electronic camera
WO2006093266A1 (ja) 2005-03-04 2006-09-08 Nikon Corporation Image processing device correcting color misregistration, image processing program, image processing method, and electronic camera
US7885478B2 (en) * 2005-05-19 2011-02-08 Mstar Semiconductor, Inc. Noise reduction method and noise reduction apparatus
US20090310884A1 (en) * 2005-05-19 2009-12-17 Mstar Semiconductor, Inc. Noise reduction method and noise reduction apparatus
US7609300B2 (en) * 2005-08-09 2009-10-27 Sunplus Technology Co., Ltd. Method and system of eliminating color noises caused by an interpolation
US20070035636A1 (en) * 2005-08-09 2007-02-15 Sunplus Technology Co., Ltd. Method and system of eliminating color noises caused by an interpolation
US7973850B2 (en) * 2005-09-29 2011-07-05 Nikon Corporation Image processing apparatus and image processing method
US20090135267A1 (en) * 2005-09-29 2009-05-28 Nikon Corporation Image Processing Apparatus and Image Processing Method
US7551214B2 (en) * 2005-12-01 2009-06-23 Megachips Lsi Solutions Inc. Pixel interpolation method
US20070126885A1 (en) * 2005-12-01 2007-06-07 Megachips Lsi Solutions Inc. Pixel Interpolation Method
US20070229676A1 (en) * 2006-01-16 2007-10-04 Futabako Tanaka Physical quantity interpolating method, and color signal processing circuit and camera system using the same
US8135213B2 (en) * 2006-01-16 2012-03-13 Sony Corporation Physical quantity interpolating method, and color signal processing circuit and camera system using the same
US20080002915A1 (en) * 2006-06-30 2008-01-03 Samsung Electronics Co., Ltd. Image processing apparatus, method and medium
US7885488B2 (en) * 2006-06-30 2011-02-08 Samsung Electronics Co., Ltd. Image processing apparatus, method and medium
US8175387B1 (en) * 2007-09-19 2012-05-08 Trend Micro, Inc. Image similarity detection using approximate pattern matching
US8654205B2 (en) 2009-12-17 2014-02-18 Nikon Corporation Medium storing image processing program and imaging apparatus
US20110199507A1 (en) * 2009-12-17 2011-08-18 Nikon Corporation Medium storing image processing program and imaging apparatus
US20130028538A1 (en) * 2011-07-29 2013-01-31 Simske Steven J Method and system for image upscaling
US9542608B2 (en) * 2011-09-27 2017-01-10 Fuji Jukogyo Kabushiki Kaisha Image processing apparatus
US20130077825A1 (en) * 2011-09-27 2013-03-28 Fuji Jukogyo Kabushiki Kaisha Image processing apparatus
WO2015104667A1 (en) * 2014-01-08 2015-07-16 Marvell World Trade Ltd. Methods and apparatus for demosaicking artifacts suppression
US9479746B2 (en) 2014-01-08 2016-10-25 Marvell World Trade Ltd. Methods and apparatus for demosaicking artifacts suppression
US11233921B2 (en) * 2017-03-28 2022-01-25 Brother Kogyo Kabushiki Kaisha Image processing apparatus that specifies edge pixel in target image using single-component image data

Also Published As

Publication number Publication date
JP2001245314A (ja) 2001-09-07
US7362897B2 (en) 2008-04-22
WO2001047244A1 (fr) 2001-06-28
US7236628B2 (en) 2007-06-26
US20060198556A1 (en) 2006-09-07
JP4599672B2 (ja) 2010-12-15
US20060245646A1 (en) 2006-11-02

Similar Documents

Publication Publication Date Title
US7362897B2 (en) Interpolation processing apparatus and recording medium having interpolation processing program recorded therein
US7565007B2 (en) Image processing method, image processing program, and image processing apparatus
US6836572B2 (en) Interpolation processing apparatus and recording medium having interpolation processing program recorded therein
US7630546B2 (en) Image processing method, image processing program and image processor
EP1977613B1 (en) Interpolation of panchromatic and color pixels
US7916937B2 (en) Image processing device having color shift correcting function, image processing program and electronic camera
EP1761072B1 (en) Image processing device for detecting chromatic difference of magnification from raw data, image processing program, and electronic camera
JP4610930B2 (ja) Image processing device and image processing program
US20020025069A1 (en) Signal processing apparatus and method for reducing generation of false color by adaptive luminance interpolation
EP1337115B1 (en) Image processor and colorimetric system converting method
US6542187B1 (en) Correcting for chrominance interpolation artifacts
EP2056607B1 (en) Image processing apparatus and image processing program
US7346210B2 (en) Image processing device and image processing program for determining similarity factors of pixels
US7801355B2 (en) Image processing method, image processing device, semiconductor device, electronic apparatus, image processing program, and computer-readable storage medium
EP1109411A1 (en) Interpolation processor and recording medium recording interpolation processing program
JP4748278B2 (ja) Interpolation processing device and recording medium on which an interpolation processing program is recorded
JP4196055B2 (ja) Image processing method, image processing program, and image processing device
JP4239483B2 (ja) Image processing method, image processing program, and image processing device
JP4239480B2 (ja) Image processing method, image processing program, and image processing device
JP2010283888A (ja) Image processing device and image processing program
JP4239484B2 (ja) Image processing method, image processing program, and image processing device

Legal Events

Date Code Title Description
AS Assignment

Owner name: NIKON CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHEN, ZHE-HONG;ISHIGA, KENICHI;REEL/FRAME:011893/0528

Effective date: 20010530

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE