US7567724B2 - Image processing apparatus, image processing method, program of image processing method, and recording medium in which program of image processing method has been recorded - Google Patents


Info

Publication number
US7567724B2
Authority
US
United States
Prior art keywords
edge
image data
pixel values
gradient
processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US10/563,919
Other languages
English (en)
Other versions
US20080123998A1 (en)
Inventor
Shinichiro Gomi
Masami Ogata
Kazuhiko Ueda
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Assigned to SONY CORPORATION reassignment SONY CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GOMI, SHINICHIRO, OGATA, MASAMI, UEDA, KAZUHIKO
Publication of US20080123998A1
Application granted
Publication of US7567724B2
Expired - Fee Related
Adjusted expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/20 Image enhancement or restoration using local operators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/403 Edge-driven scaling; Edge-based scaling
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/73 Deblurring; Sharpening
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V 10/443 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/387 Composing, repositioning or otherwise geometrically modifying originals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/40 Picture signal circuits
    • H04N 1/409 Edge or detail enhancement; Noise or error suppression
    • H04N 1/4092 Edge or detail enhancement
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20172 Image enhancement details
    • G06T 2207/20192 Edge enhancement; Edge preservation

Definitions

  • the present invention relates to image processing apparatuses, image processing methods, programs for image processing methods, and recording media recording thereon programs for image processing methods, and is applicable, for example, to resolution conversion.
  • the present invention is capable of effectively avoiding loss of high-frequency components and preventing occurrence of jaggies by detecting an edge gradient direction with the largest gradient of pixel values and an edge direction orthogonal to the edge gradient direction and by performing edge enhancement and smoothing processing in the edge gradient direction and in the edge direction, respectively, to generate output image data.
  • the present invention is designed in view of the foregoing points, and proposes an image processing apparatus, an image processing method, a program for an image processing method, and a recording medium recording thereon a program for an image processing method that are capable of effectively avoiding loss of high-frequency components and preventing occurrence of jaggies.
  • the present invention is applied to an image processing apparatus for processing input image data and for outputting output image data.
  • the image processing apparatus includes an edge detection unit for detecting an edge gradient direction with the largest gradient of pixel values and an edge direction orthogonal to the edge gradient direction for each pixel of the input image data; an edge direction processing unit for performing smoothing processing on the image data in the edge direction for each pixel of the output image data in accordance with a detection result of the edge detection unit and for sequentially detecting pixel values corresponding to respective pixels of the output image data; and an edge gradient direction processing unit for performing edge enhancement processing in the edge gradient direction on the pixel values output from the edge direction processing unit for the respective pixels of the output image data in accordance with the detection result of the edge detection unit and for sequentially outputting pixel values of the output image data.
  • the configuration according to the present invention is applied to an image processing apparatus for processing input image data and for outputting output image data.
  • the image processing apparatus includes an edge detection unit for detecting an edge gradient direction with the largest gradient of pixel values and an edge direction orthogonal to the edge gradient direction for each pixel of the input image data; an edge direction processing unit for performing smoothing processing on the image data in the edge direction for each pixel of the output image data in accordance with a detection result of the edge detection unit and for sequentially detecting pixel values corresponding to respective pixels of the output image data; and an edge gradient direction processing unit for performing edge enhancement processing in the edge gradient direction on the pixel values output from the edge direction processing unit for the respective pixels of the output image data in accordance with the detection result of the edge detection unit and for sequentially outputting pixel values of the output image data.
  • edge enhancement and smoothing processing are performed in the edge gradient direction with the largest gradient of pixel values and in the edge direction orthogonal to the edge gradient direction, respectively, to generate the output image data. Therefore, loss of high-frequency components can be effectively avoided, and occurrence of jaggies can be prevented.
  • the present invention is applied to an image processing method for processing input image data and for outputting output image data.
  • the image processing method includes an edge detection step of detecting an edge gradient direction with the largest gradient of pixel values and an edge direction orthogonal to the edge gradient direction for each pixel of the input image data; an edge direction processing step of performing smoothing processing on the image data in the edge direction for each pixel of the output image data in accordance with a detection result by the edge detection step and sequentially detecting pixel values corresponding to respective pixels of the output image data; and an edge gradient direction processing step of performing edge enhancement processing in the edge gradient direction on the pixel values detected by the edge direction processing step for the respective pixels of the output image data in accordance with the detection result by the edge detection step and sequentially outputting pixel values of the output image data.
  • an image processing method capable of effectively avoiding loss of high-frequency components and preventing occurrence of jaggies can be provided.
  • the present invention is applied to a program for an image processing method performed by arithmetic processing means for processing input image data and for outputting output image data.
  • the program includes an edge detection step of detecting an edge gradient direction with the largest gradient of pixel values and an edge direction orthogonal to the edge gradient direction for each pixel of the input image data; an edge direction processing step of performing smoothing processing on the image data in the edge direction for each pixel of the output image data in accordance with a detection result by the edge detection step and sequentially detecting pixel values corresponding to respective pixels of the output image data; and an edge gradient direction processing step of performing edge enhancement processing in the edge gradient direction on the pixel values detected by the edge direction processing step for the respective pixels of the output image data in accordance with the detection result by the edge detection step and sequentially outputting pixel values of the output image data.
  • a program for an image processing method capable of effectively avoiding loss of high-frequency components and preventing occurrence of jaggies can be provided.
  • the present invention is applied to a recording medium recording thereon a program for an image processing method performed by arithmetic processing means for processing input image data and for outputting output image data.
  • the program for the image processing method includes an edge detection step of detecting an edge gradient direction with the largest gradient of pixel values and an edge direction orthogonal to the edge gradient direction for each pixel of the input image data; an edge direction processing step of performing smoothing processing on the image data in the edge direction for each pixel of the output image data in accordance with a detection result by the edge detection step and sequentially detecting pixel values corresponding to respective pixels of the output image data; and an edge gradient direction processing step of performing edge enhancement processing in the edge gradient direction on the pixel values detected by the edge direction processing step for the respective pixels of the output image data in accordance with the detection result by the edge detection step and sequentially outputting pixel values of the output image data.
  • a recording medium recording thereon a program for an image processing method capable of effectively avoiding loss of high-frequency components and preventing occurrence of jaggies can be provided.
  • loss of high-frequency components can be effectively avoided and occurrence of jaggies can be prevented.
  • FIG. 1 is a functional block diagram showing the configuration of an image processing apparatus according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram used for explaining generation of a brightness gradient matrix.
  • FIG. 3 is a schematic diagram for explaining an edge gradient direction and an edge direction.
  • FIG. 4 is a schematic diagram for explaining an operation of an edge direction processing unit.
  • FIG. 5 is a characteristic curve diagram showing a parameter used for setting a smoothing processing range.
  • FIG. 6 is a characteristic curve diagram showing another parameter used for setting the smoothing processing range.
  • FIG. 7 is a schematic diagram used for explaining an operation of an edge gradient direction processing unit.
  • FIG. 8 is a characteristic curve diagram showing a parameter used for setting a blend ratio in the blend processing unit.
  • FIG. 9 is a characteristic curve diagram showing another parameter used for setting the blend ratio in the blend processing unit.
  • FIG. 1 is a functional block diagram showing an image processing apparatus according to an embodiment of the present invention.
  • the image processing apparatus 1 is formed by, for example, a digital signal processor functioning as arithmetic processing means.
  • the resolution of image data D 1 , which is the input image data, is converted into a resolution designated by a controller (not shown), and the converted image data is output.
  • the image processing apparatus 1 generates output image data D 2 obtained by magnifying or reducing an image based on the image data D 1 , and outputs the generated output image data D 2 to displaying means or the like.
  • although the processing program for the arithmetic processing means is supplied by being pre-installed in the image processing apparatus in this embodiment, such a processing program may instead be supplied by being downloaded via a network, such as the Internet, or may be supplied via various recording media.
  • such recording media include various types of media, such as optical disks, memory cards, and detachable hard disk drives.
  • an edge detection unit 2 detects an edge gradient direction with the largest gradient of pixel values and an edge direction orthogonal to the edge gradient direction for each pixel of the image data D 1 .
  • a gradient matrix generation unit 3 sequentially selects a target pixel, for example, in a raster scanning order, and, as shown in FIG. 2 , generates a brightness gradient matrix G represented by the next condition by performing arithmetic processing using pixel values in a range W centered on the target pixel.
  • FIG. 2 shows an example in which ±3 pixels in the x and y directions centered on a target pixel are set as the range W.
  • w (i,j) represents a Gaussian weight represented by condition (2)
  • g represents a brightness gradient represented by condition (3) using a partial differential “gx” in the x direction of an image brightness I and a partial differential “gy” in the y direction of the image brightness I.
  • in this manner, the gradient matrix generation unit 3 detects the brightness gradients in the predetermined range W centered on the target pixel, weighted toward the target pixel.
  • a subsequent eigenvalue and eigenvector detection unit 4 detects an edge gradient direction v 1 with the largest gradient of pixel values and an edge direction v 2 orthogonal to the edge gradient direction v 1 for a target pixel, as shown in FIG. 3 , by processing the brightness gradient matrix G generated by the gradient matrix generation unit 3 .
  • the eigenvalue and eigenvector detection unit 4 detects eigenvalues λ 1 and λ 2 representing dispersions of gradients of the pixel values in the edge gradient direction v 1 and in the edge direction v 2 , respectively.
  • the eigenvalue and eigenvector detection unit 4 detects the edge gradient direction v 1 , the edge direction v 2 , and the eigenvalues λ 1 and λ 2 (λ 1 ≥ λ 2 ) by an eigen-decomposition of the brightness gradient matrix G in accordance with the corresponding arithmetic processing (an illustrative computation is given in the first sketch following this list).
  • An edge direction processing unit 5 calculates an edge direction vc for a pixel of the image data D 2 acquired after resolution conversion in accordance with the edge direction v 2 for a target pixel of the image data D 1 detected as described above by the edge detection unit 2 , and sequentially calculates a pixel value corresponding to each pixel Pc of the output image data D 2 by interpolation processing based on the edge direction vc.
  • the edge direction processing unit 5 performs arithmetic processing based on the next condition for pixels of the image data D 1 (in the example shown in FIG. 4 , P 3 , P 4 , P 9 , and P 10 ) adjacent to a pixel Pc of the output image data D 2 to be calculated (hereinafter, referred to as a target pixel of the image data D 2 ), and calculates the edge direction vc of the target pixel Pc by performing interpolation processing using the edge directions v 2 (v 3 , v 4 , v 9 , and v 10 ) of the adjacent pixels of the image data D 1 .
  • v c = (1−t y )((1−t x ) v 3 + t x v 4 ) + t y ((1−t x ) v 9 + t x v 10 )  (9)
  • t x and t y represent coordinate values of a target pixel for internally dividing the sampling points P3, P4, P9, and P10 based on the image data D 1 in the x and y directions, and the conditions 0 ≤ t x ≤ 1 and 0 ≤ t y ≤ 1 are satisfied.
  • the edge direction processing unit 5 sets a predetermined number of sampling points P −2 , P −1 , P 1 , and P 2 based on a sampling pitch in the image data D 1 from the sampling points for the target pixel Pc on a line in the edge direction vc.
  • pixel values of the sampling points P −2 , P −1 , P 1 , and P 2 and the target pixel Pc are calculated by an interpolation operation using pixel values of the image data D 1 .
  • interpolated image data in the edge direction based on interpolation processing for the input image data D 1 is generated on the line extending in the edge direction vc for each pixel of the output image data D 2 .
  • the number of sampling points set as described above is changed, and the subsequent filtering processing is changed.
  • the number of taps for the filtering processing is changed in accordance with reliability of an edge in the edge direction vc of the target pixel.
  • a pixel value of the target pixel Pc is calculated by linear interpolation using the peripheral pixels P 3 , P 4 , P 9 , and P 10 , and pixel values of the previous and subsequent sampling points P −1 and P 1 are calculated by linear interpolation using P 2 , P 3 , P 8 , and P 9 ; and P 4 , P 5 , P 10 , and P 11 , respectively.
  • a pixel value of the target pixel Pc is calculated by linear interpolation using the peripheral pixels P 3 , P 4 , P 9 , and P 10 , and pixel values of the sampling points P −2 , P −1 , P 1 , and P 2 are calculated in a similar way.
  • the edge direction processing unit 5 smoothes the calculated pixel values of the sampling points P −2 , P −1 , P 1 , and P 2 and the target pixel Pc by filtering processing, and determines a pixel value Pc′ of the target pixel Pc.
  • a pixel value corresponding to a pixel of the output image data D 2 is calculated by performing smoothing processing on interpolated image data in an edge direction.
  • an interpolation operation for generating such interpolated image data need not be linear interpolation using pixel values of adjacent neighboring pixels, and various interpolation operation methods using various peripheral pixels are widely applicable.
  • arithmetic processing for filtering processing using the interpolated image data need not be the arithmetic processing based on condition (10) or (11), and interpolation operations using various weighting coefficients are widely applicable.
  • smoothing processing may be performed in a direction orthogonal to a brightness gradient in a portion other than an edge.
  • if filtering processing is performed with a large number of taps covering a wide range in such a portion other than an edge, the image quality is reduced.
  • at an edge, on the other hand, filtering processing performed over a wide range prevents occurrence of jaggies more reliably and thus forms a smooth edge.
  • the number of taps for filtering processing is changed for each pixel.
  • a range for smoothing processing in an edge direction is changed for each pixel.
  • such a filtering range is changed in accordance with reliability of an edge in the edge direction vc. Therefore, a reduction in the image quality due to smoothing processing can be prevented.
  • the edge direction processing range determination unit 6 generates a parameter p that increases substantially linearly in accordance with a decrease in the value of the ratio λ 2 /λ 1 when the ratio λ 2 /λ 1 is within a predetermined range between (λ 2 /λ 1 )min and (λ 2 /λ 1 )max and that exhibits the maximum value pmax or the minimum value pmin when the ratio λ 2 /λ 1 is outside the predetermined range between (λ 2 /λ 1 )min and (λ 2 /λ 1 )max. Accordingly, the parameter p changing depending on reliability of an edge in an edge direction is generated.
  • the edge direction processing range determination unit 6 generates a parameter q that increases substantially linearly in accordance with the eigenvalue λ 1 in a predetermined range between λ 1 min and λ 1 max and that exhibits a lower limit value qmin or an upper limit value qmax in a range smaller or larger than the range between λ 1 min and λ 1 max. Accordingly, the parameter q changing in accordance with rising of an edge is generated.
  • the edge direction processing range determination unit 6 converts the eigenvalues λ 1 and λ 2 based on the sampling points of the image data D 1 into eigenvalues based on the sampling points of the image data D 2 , and calculates a filtering range r.
  • a filtering range r based on the sampling points of the image data D 2 may be calculated by interpolation processing for the calculation result.
  • a filtering range r based on the sampling points of the image data D 2 may be calculated from the calculation result.
  • the edge direction processing unit 5 changes the number of taps for the filtering processing in accordance with the range r calculated as described above, and calculates the pixel value Pc′ of the target pixel Pc of the image data D 2 .
  • the edge direction processing unit 5 performs filtering processing based on the real number of taps by blending filtering processing results.
  • the edge direction processing unit 5 overcomes unnaturalness in changing the number of taps when the filtering processing is performed based on the integral number of taps.
  • n integer−1 = floor( n real )
  • n integer+1 = ceil( n real )  (13)
  • floor(n) represents the maximum integral number of taps not exceeding n
  • ceil(n) represents the minimum integral number of taps larger than n.
  • the range r calculated based on condition (12) is applied as "n real". Since the number of taps takes odd values here, when n real is 3.5, the number of taps "floor(n)" is 3, and the number of taps "ceil(n)" is 5.
  • Blending of the filtering processing results is performed by calculating a filtering processing result f(n) based on a real number by performing arithmetic processing represented by the next condition using the two types of filtering processing results.
  • the edge direction processing unit 5 calculates the pixel value Pc′ of the target pixel Pc of the image data D 2 by performing filtering processing based on the two types of tap numbers in the filtering range r and by further performing arithmetic processing represented by condition (14) using the two types of filtering processing results.
  • the edge direction processing unit 5 thus calculates the pixel value Pc′ of the target pixel Pc of the image data D 2 by performing filtering processing based on the number of taps corresponding to reliability of an edge in an edge direction, and changes the number of taps in a decimal fractional part (an illustrative computation is given in the second sketch following this list).
  • f ( n ) = α·f ( n integer−1 ) + (1−α)·f ( n integer+1 )  (14)
  • An edge gradient direction processing unit 7 performs edge enhancement processing in the edge gradient direction v 1 using the pixel value Pc′ of the target pixel Pc of the image data D 2 calculated as described above by the edge direction processing unit 5 .
  • the edge gradient direction processing unit 7 calculates an edge gradient direction vg for a target pixel Pcc of the image data D 2 in accordance with the edge gradient direction v 1 based on adjacent sampling points of the image data D 1 , as in calculation of the edge direction vc for the target pixel Pc of the image data D 2 performed by the edge direction processing unit 5 .
  • the edge gradient direction processing unit 7 sets a predetermined number of sampling points Pc −1 and Pc +1 based on a sampling pitch in the image data D 2 from the sampling points for the target pixel Pcc on a line in the edge gradient direction vg. In addition, the edge gradient direction processing unit 7 calculates pixel values of the sampling points Pc −1 and Pc +1 and the target pixel Pcc by an interpolation operation using pixel values output from the edge direction processing unit 5 .
  • the edge gradient direction processing unit 7 generates interpolated image data in the edge gradient direction obtained by interpolation processing for the image data based on the pixel values output from the edge direction processing unit 5 on a line extending in the edge gradient direction vg for each pixel of the output image data D 2 .
  • the edge gradient direction processing unit 7 performs filtering processing on the pixel values of the sampling points Pc −1 and Pc +1 and the target pixel Pcc calculated as described above, and determines a pixel value Pcc′ of the target pixel Pcc.
  • a case where the pixel value Pcc′ of the target pixel Pcc is calculated using three taps is described.
  • the pixel value of the sampling point Pc −1 is generated by linear interpolation based on peripheral sampling points Pc 1 , Pc 2 , Pc 4 , and Pcc, and the pixel value of the sampling point Pc +1 is generated by linear interpolation based on peripheral sampling points Pcc, Pc 5 , Pc 7 , and Pc 8 .
  • the edge gradient direction processing unit 7 performs edge enhancement in the direction that crosses the edge.
  • linear interpolation using pixel values of adjacent neighboring pixels need not be performed, and interpolation operation methods using various peripheral pixels are widely applicable.
  • interpolation operations using various weighting coefficients are widely applicable.
  • An interpolation processing unit 8 converts a resolution of the image data D 1 , for example, by linear interpolation or bicubic interpolation, and outputs a pixel value Pa based on a sampling pitch corresponding to the image data D 2 .
  • a blend ratio determination unit 9 generates a weighting coefficient for blending in accordance with reliability of an edge in the edge direction vc.
  • a blend processing unit 10 performs weighting addition of the pixel value Pa of the image data D 2 generated by the interpolation processing unit 8 using a known procedure and the pixel value Pcc′ generated by the edge gradient direction processing unit 7 in order to generate the image data D 2 , and the blend ratio determination unit 9 changes the weighting coefficient for the weighting addition processing.
  • the blend ratio determination unit 9 changes the weighting coefficient in accordance with reliability of an edge in an edge direction, and this prevents excessive unnaturalness in processing for the edge.
  • the ratio λ 2 /λ 1 of the eigenvalue λ 2 for the edge direction v 2 to the eigenvalue λ 1 for the edge gradient direction v 1 is used as the reliability of the edge in the edge direction.
  • the blend ratio determination unit 9 generates a parameter s that increases substantially linearly in accordance with a decrease in the value of the ratio λ 2 /λ 1 when the ratio λ 2 /λ 1 is within a predetermined range between (λ 2 /λ 1 )min and (λ 2 /λ 1 )max and that exhibits the maximum value smax or the minimum value smin when the ratio λ 2 /λ 1 is outside the predetermined range between (λ 2 /λ 1 )min and (λ 2 /λ 1 )max. Accordingly, the parameter s changing depending on reliability of an edge in an edge direction is generated.
  • the blend ratio determination unit 9 generates a parameter t that increases substantially linearly in accordance with the eigenvalue λ 1 in a predetermined range between λ 1 min and λ 1 max and that exhibits a lower limit value tmin or an upper limit value tmax in a range outside the range between λ 1 min and λ 1 max. Accordingly, the parameter t changing in accordance with rising of an edge is generated.
  • the blend ratio determination unit 9 converts the eigenvalues λ 1 and λ 2 based on the sampling points of the image data D 1 into eigenvalues based on the sampling points of the image data D 2 , and calculates a weighting coefficient β for blending.
  • a weighting coefficient β for blending may be calculated by interpolation processing for the calculation result.
  • a weighting coefficient β for blending based on the sampling points of the image data D 2 may be calculated from the calculation result.
  • the blend processing unit 10 performs weighting addition of image data S 3 based on the pixel value Pcc′ calculated by the edge gradient direction processing unit 7 and image data S 11 based on the pixel value Pa calculated by the interpolation processing unit 8 , using the weighting coefficient β obtained by the blend ratio determination unit 9 , by performing arithmetic processing represented by the next condition, and outputs the processing result as the image data D 2 (an illustrative computation is given in the third sketch following this list).
  • S 4 = β·S 3 + (1−β)·S 11  (16)
  • the input image data D 1 ( FIG. 1 ) is input to the edge detection unit 2 .
  • the edge gradient direction v 1 with the largest gradient of pixel values and the edge direction v 2 orthogonal to the edge gradient direction v 1 are sequentially detected for each pixel ( FIGS. 2 and 3 ).
  • the input image data D 1 is input to the edge direction processing unit 5 .
  • the image data is smoothed in the edge direction v 2 for each pixel of the output image data D 2 in accordance with a detection result of the edge detection unit 2 , and a pixel value Pc corresponding to each pixel of the output image data D 2 is sequentially calculated.
  • the pixel value Pc is input to the edge gradient direction processing unit 7 in accordance with a calculation result.
  • edge enhancement is performed in the edge gradient direction v 1 for each pixel of the output image data D 2 , and a pixel value of the output image data D 2 is calculated. Accordingly, since the input image data D 1 is smoothed in the edge direction v 2 , occurrence of jaggies can be effectively avoided.
  • edge enhancement is performed in the edge gradient direction v 1 orthogonal to the edge direction v 2 . Thus, high-frequency components are enhanced in the direction orthogonal to the edge direction. Therefore, conversion into the image data D 2 is performed while effectively avoiding loss of high-frequency components and preventing occurrence of jaggies.
  • interpolated image data Pc in the edge direction based on interpolation processing for the input image data D 1 is generated for each pixel of the output image data D 2 on a line extending in the edge direction v 2 in accordance with a detection result of the edge detection unit 2 , and the pixel value Pc′ corresponding to each pixel of the output image data D 2 is sequentially calculated by filtering processing of the interpolated image data Pc in the edge direction generated as described above. Accordingly, by setting a characteristic for the filtering processing, smoothing processing is performed in the edge direction v 2 , and occurrence of jaggies can be effectively avoided.
  • In the edge detection unit 2 , the gradient matrix generation unit 3 generates a brightness gradient matrix G for each pixel, and the subsequent eigenvalue and eigenvector detection unit 4 processes this brightness gradient matrix G to detect the edge gradient direction v 1 and the edge direction v 2 .
  • the eigenvalues λ 1 and λ 2 representing dispersions of pixel value gradients for the edge gradient direction v 1 and the edge direction v 2 are also calculated.
  • the ratio λ 2 /λ 1 of the eigenvalue λ 2 to the eigenvalue λ 1 is calculated by the edge direction processing range determination unit 6 .
  • the ratio λ 2 /λ 1 decreases as the pixel gradient in the edge gradient direction is more predominant. Accordingly, reliability of an edge is detected.
  • The parameter p representing the reliability of the edge is generated in accordance with the ratio λ 2 /λ 1 ( FIG. 5 ).
  • the reliability of the edge is calculated by effectively using the gradient matrix G used for calculation of the edge gradient direction v 1 and the edge direction v 2 .
  • the parameter q corresponding to the parameter p is generated in accordance with the eigenvalue λ 1 ( FIG. 6 ).
  • the filtering range r is calculated by multiplication of the parameters p and q.
  • the edge direction processing unit 5 performs filtering processing on the interpolated image data Pc in the edge direction to generate the pixel value Pc′ of the output image data D 2 .
  • the number of taps for the filtering processing is changed in accordance with the filtering range r generated by the edge direction processing range determination unit 6 as described above. If the reliability of the edge is high, the pixel value Pc′ of the output image data D 2 is generated by performing filtering processing on the interpolated image data Pc in the edge direction in a wide range. If the reliability of the edge is low, the pixel value Pc′ of the output image data D 2 is generated by performing filtering processing in a narrow range.
  • the number of taps for the filtering processing for the edge direction v 2 is changed in accordance with the reliability of the edge.
  • excessive smoothing processing in a portion other than the edge can be prevented, and a reduction in the image quality can be effectively avoided.
  • the weighting coefficient α for filtering processing is changed (condition (12)) in accordance with the reliability of the edge in the edge direction v 2 .
  • the weighting coefficient α is changed in a decimal fractional part.
  • interpolated image data Pcc in the edge gradient direction obtained by interpolation processing for the image data based on the pixel values output from the edge direction processing unit 5 is generated for each pixel of the output image data D 2 on a line extending in the edge gradient direction v 1 ( FIG. 7 ), and the pixel value Pcc′ of the output image data D 2 is calculated by performing filtering processing of the interpolated image data Pcc in the edge gradient direction. Accordingly, by setting a characteristic for the filtering processing, edge enhancement is performed in the edge gradient direction v 1 , and the image data D 2 is generated.
  • the interpolation processing unit 8 calculates a pixel value Pa′ for each pixel of the output image data D 2 by interpolation processing in accordance with a known procedure. Accordingly, although high-frequency components of the image data S 11 generated by the interpolation processing unit 8 based on the known procedure are lost, the outline of the image data S 11 is not excessively enhanced.
  • the blend processing unit 10 performs weighting addition of the image data S 11 from the interpolation processing unit 8 and the image data S 3 from the edge gradient direction processing unit 7 in accordance with an image to generate the output image data D 2 .
  • a portion in which the outline is excessively enhanced is corrected using a pixel value Pa′ in accordance with the known procedure, and the output image data D 2 is generated. Therefore, in this embodiment, a reduction in the image quality due to excessive edge enhancement can be prevented.
  • the reliability of an edge is calculated using the ratio λ 2 /λ 1 of the eigenvalue λ 2 to the eigenvalue λ 1 , and the weighting coefficient β for weighting addition performed in the blend processing unit 10 is calculated in accordance with the reliability of the edge ( FIG. 8 ).
  • the weighting coefficient β is corrected using the eigenvalue λ 1 ( FIG. 9 ).
  • the gradient matrix G used for calculating the edge gradient direction v 1 and the edge direction v 2 is effectively used to calculate the reliability of the edge, and the weighting coefficient β is set in accordance with the reliability of the edge.
  • excessive edge enhancement can be prevented with such a simplified configuration.
  • the edge gradient direction v 1 with the largest gradient of pixel values and the edge direction v 2 orthogonal to the edge gradient direction v 1 are detected, and edge enhancement and smoothing processing are performed in the edge gradient direction v 1 and the edge direction v 2 , respectively, to generate the output image data D 2 .
  • edge enhancement and smoothing processing are performed in the edge gradient direction v 1 and the edge direction v 2 , respectively, to generate the output image data D 2 .
  • in filtering processing for such smoothing processing, after the interpolated image data Pc in the edge direction, obtained by interpolation processing for the input image data D 1 , is generated for each pixel of the output image data D 2 on a line extending in the edge direction v 2 in accordance with a detection result of the edge detection unit 2 , the pixel value Pc′ corresponding to each pixel of the output image data D 2 is sequentially detected by performing filtering processing of the interpolated image data Pc in the edge direction. Accordingly, a characteristic for the filtering processing can be variously set, and smoothing processing with a desired characteristic can thus be performed.
  • the number of taps for the filtering processing is changed in accordance with the reliability of the edge in the edge direction v 2 .
  • incorrect smoothing processing in a portion other than the edge can be prevented.
  • the weighting coefficient α for the filtering processing is changed in accordance with the reliability of the edge in the edge direction v 2 , and the number of taps is changed in a decimal fractional part by performing weighting addition of filtering processing results of different numbers of taps using the weighting coefficient α for the filtering processing.
  • since the reliability of the edge in the edge direction v 2 is the ratio of the dispersion λ 2 of the pixel value gradient in the edge direction v 2 to the dispersion λ 1 of the pixel value gradient in the edge gradient direction v 1 , the configuration for detecting the edge gradient direction v 1 and the edge direction v 2 is effectively used, and unnaturalness in changing the number of taps can be effectively prevented.
  • processing of edge enhancement is performed by filtering processing of the interpolated data Pcc in the edge gradient direction.
  • a characteristic for the filtering processing is variously set, and edge enhancement with a desired characteristic can thus be performed.
  • a series of processing is performed such that a sampling pitch in the output image data D 2 is different from a sampling pitch in the input image data D 1 , and the conversion of the sampling pitch is performed together with the smoothing processing and the edge enhancement processing.
  • the interpolation processing unit 8 converts the input image data D 1 into the interpolated image data S 11 .
  • the weighting coefficient β for blending is changed in accordance with the reliability of the edge in the edge direction v 2 .
  • Weighting addition of the image data S 3 output from the edge gradient direction processing unit and the interpolated image data S 11 is performed in accordance with the weighting coefficient β for blending, and the output image data D 2 is output. Accordingly, excessive edge enhancement in a natural image can be effectively prevented.
  • the present invention is not limited to this.
  • the number of taps may be equally changed in accordance with a change in a weighting coefficient for smoothing processing.
  • the present invention is not limited to this.
  • the present invention is also widely applicable to processing for edge correction.
  • a series of processing is performed in accordance with a sampling pitch of output image data that is equal to a sampling pitch of the input image data.
  • the configuration for interpolation processing is omitted, and the blend processing unit 10 performs weighting addition of output data of the edge gradient direction processing unit 7 and the input image data.
  • edge enhancement of input image data can be simply performed.
  • the present invention is not limited to this.
  • the present invention is also widely applicable to a case where image data is processed by a hardware configuration.
  • the present invention is applicable, for example, to resolution conversion.
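First sketch: the arithmetic processing used by the edge detection unit (the Gaussian-weighted brightness gradient matrix G and its eigen-decomposition) is given in the patent drawings rather than in this text. The Python/NumPy fragment below is only an illustration of how such a matrix and its eigenvalues and eigenvectors could be computed for one target pixel; the function name, the Gaussian width sigma, and the derivative operator are assumptions, not taken from the patent.

```python
import numpy as np

def edge_detect_at(I, x, y, radius=3, sigma=1.5):
    """Sketch of the edge detection unit for one target pixel (x, y).

    I is a 2-D brightness array. Returns the edge gradient direction v1,
    the edge direction v2 (orthogonal to v1), and eigenvalues lambda1 >= lambda2.
    """
    # Partial derivatives gx, gy of the brightness (central differences here;
    # the derivative operator is not specified in the text above).
    gy, gx = np.gradient(I.astype(np.float64))

    # Gaussian-weighted accumulation of g * g^T over the range W around the target pixel.
    G = np.zeros((2, 2))
    for j in range(-radius, radius + 1):
        for i in range(-radius, radius + 1):
            yj, xi = y + j, x + i
            if 0 <= yj < I.shape[0] and 0 <= xi < I.shape[1]:
                w = np.exp(-(i * i + j * j) / (2.0 * sigma * sigma))   # Gaussian weight w(i, j)
                g = np.array([gx[yj, xi], gy[yj, xi]])                 # brightness gradient g
                G += w * np.outer(g, g)

    # Eigen-decomposition of the symmetric 2x2 matrix G.
    eigvals, eigvecs = np.linalg.eigh(G)        # ascending eigenvalue order
    lambda1, lambda2 = eigvals[1], eigvals[0]   # lambda1 >= lambda2
    v1 = eigvecs[:, 1]                          # edge gradient direction (largest gradient)
    v2 = eigvecs[:, 0]                          # edge direction, orthogonal to v1
    return v1, v2, lambda1, lambda2
```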
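Second sketch: the edge direction processing unit smooths samples interpolated along the edge direction vc with a tap count that follows the reliability of the edge (parameters p and q of FIGS. 5 and 6, filtering range r of condition (12)) and blends two integer-tap results according to condition (14). The fragment below illustrates that flow under explicit assumptions: the clamp limits used for p and q, the plain box filter, and the way α is derived from the fractional position of the real-valued tap count are placeholders rather than values from the patent.

```python
import numpy as np

def clamp_linear(x, x_min, x_max, y_min, y_max):
    """Piecewise-linear mapping that saturates outside [x_min, x_max]
    (the shape of the curves in FIGS. 5 and 6); the limits are placeholders."""
    t = float(np.clip((x - x_min) / (x_max - x_min), 0.0, 1.0))
    return y_min + t * (y_max - y_min)

def filtering_range(lambda1, lambda2):
    """Range r: p grows as lambda2/lambda1 shrinks (reliable edge), q grows with lambda1."""
    ratio = lambda2 / max(lambda1, 1e-12)
    p = clamp_linear(ratio, 0.05, 0.5, 5.0, 1.0)       # pmax for a small ratio, pmin for a large one
    q = clamp_linear(lambda1, 10.0, 200.0, 0.2, 1.0)   # qmin .. qmax
    return p * q                                       # r obtained by multiplying p and q

def smooth_along_edge(samples, n_real):
    """Blend two odd-tap box-filter results so the effective tap count is n_real.

    samples: odd-length array of values interpolated on the line in the edge
    direction vc, centred on the target pixel Pc and long enough for the taps used.
    """
    def box_filter(n_taps):
        half = n_taps // 2
        centre = len(samples) // 2
        return float(np.mean(samples[centre - half:centre + half + 1]))

    n_lo = max(1, int(np.floor((n_real - 1.0) / 2.0)) * 2 + 1)   # largest odd tap count <= n_real ("floor")
    n_hi = n_lo + 2                                              # next odd tap count ("ceil")
    alpha = float(np.clip((n_hi - n_real) / 2.0, 0.0, 1.0))      # assumed: alpha from the fractional position
    # Condition (14): f(n) = alpha * f(n_integer-1) + (1 - alpha) * f(n_integer+1)
    return alpha * box_filter(n_lo) + (1.0 - alpha) * box_filter(n_hi)
```

With n_real = 3.5 this blends a 3-tap and a 5-tap result, matching the worked example given for conditions (13) and (14).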
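Third sketch: the edge gradient direction processing unit sharpens each output pixel using the samples Pc −1 , Pcc, and Pc +1 taken along the edge gradient direction vg, and the blend processing unit then mixes that result with the conventionally interpolated value Pa according to condition (16). In the fragment below, the 3-tap enhancement kernel, the limits behind the parameters s and t (FIGS. 8 and 9), and the assumption that β is simply their product are illustrative assumptions only.

```python
import numpy as np

def enhance_across_edge(pc_minus1, pcc, pc_plus1, k=0.5):
    """3-tap edge enhancement along the edge gradient direction.
    The unsharp-style kernel [-k, 1 + 2k, -k] is an assumption; the description
    only states that an edge enhancement characteristic is applied."""
    return -k * pc_minus1 + (1.0 + 2.0 * k) * pcc - k * pc_plus1

def blend_with_conventional(pcc_enhanced, pa, lambda1, lambda2,
                            ratio_min=0.05, ratio_max=0.5,
                            l1_min=10.0, l1_max=200.0):
    """Weighting addition of condition (16): S4 = beta * S3 + (1 - beta) * S11."""
    ratio = lambda2 / max(lambda1, 1e-12)
    # s is high for a small ratio (reliable edge) and low otherwise (FIG. 8 analogue).
    s = 1.0 - float(np.clip((ratio - ratio_min) / (ratio_max - ratio_min), 0.0, 1.0))
    # t rises with the edge strength lambda1 (FIG. 9 analogue).
    t = float(np.clip((lambda1 - l1_min) / (l1_max - l1_min), 0.0, 1.0))
    beta = s * t                      # assumed combination of s and t, mirroring r = p * q
    return beta * pcc_enhanced + (1.0 - beta) * pa
```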

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Geometry (AREA)
  • Image Processing (AREA)
  • Facsimile Image Signal Circuits (AREA)
  • Editing Of Facsimile Originals (AREA)
  • Image Analysis (AREA)
US10/563,919 2004-05-19 2005-04-21 Image processing apparatus, image processing method, program of image processing method, and recording medium in which program of image processing method has been recorded Expired - Fee Related US7567724B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2004148991A JP4534594B2 (ja) 2004-05-19 2004-05-19 Image processing apparatus, image processing method, program of image processing method, and recording medium in which program of image processing method has been recorded
JP2004-148991 2004-05-19
PCT/JP2005/008106 WO2005111933A1 (ja) 2004-05-19 2005-04-21 Image processing apparatus, image processing method, program of image processing method, and recording medium in which program of image processing method has been recorded

Publications (2)

Publication Number Publication Date
US20080123998A1 US20080123998A1 (en) 2008-05-29
US7567724B2 true US7567724B2 (en) 2009-07-28

Family

ID=35394354

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/563,919 Expired - Fee Related US7567724B2 (en) 2004-05-19 2005-04-21 Image processing apparatus, image processing method, program of image processing method, and recording medium in which program of image processing method has been recorded

Country Status (7)

Country Link
US (1) US7567724B2 (ja)
EP (1) EP1748386A1 (ja)
JP (1) JP4534594B2 (ja)
KR (1) KR101128661B1 (ja)
CN (1) CN100565579C (ja)
RU (1) RU2367021C2 (ja)
WO (1) WO2005111933A1 (ja)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080063064A1 (en) * 2006-09-13 2008-03-13 Macinnis Alexander Method and System for Motion Compensated Temporal Filtering Using IIR Filtering
US20110075035A1 (en) * 2006-09-13 2011-03-31 Macinnis Alexander Method and System for Motion Compensated Temporal Filtering Using Both FIR and IIR Filtering
US20120038800A1 (en) * 2010-08-16 2012-02-16 Industry-University Cooperation Foundation Sogang University Method of image processing and image processing apparatus
US20140105517A1 (en) * 2008-09-04 2014-04-17 Silicon Image, Inc. System, method, and apparatus for smoothing of edges in images to remove irregularities
US20150146064A1 (en) * 2013-11-27 2015-05-28 Sony Corporation Image processing apparatus, image processing method, solid-state imaging device, and electronic apparatus
US20160117804A1 (en) * 2013-07-30 2016-04-28 Byd Company Limited Method and device for enhancing edge of image and digital camera
US9892509B2 (en) 2015-09-29 2018-02-13 Koninklijke Philips N.V. Visualization of projection X-ray image
US20200349713A1 (en) * 2018-10-31 2020-11-05 Fei Company Smart metrology on microscope images
US11283963B2 (en) 2018-09-10 2022-03-22 Canon Kabushiki Kaisha Image processing apparatus and image processing method and storage medium

Families Citing this family (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI288890B (en) * 2005-04-01 2007-10-21 Realtek Semiconductor Corp Method and apparatus for pixel interpolation
KR100765751B1 (ko) * 2005-06-08 2007-10-15 Samsung Electronics Co., Ltd. Color interpolation method for CMYG color format of interlaced scanning type, and apparatus therefor
US8120703B2 (en) * 2005-09-08 2012-02-21 Silicon Image/BSTZ Source-adaptive video deinterlacer
US7982798B2 (en) * 2005-09-08 2011-07-19 Silicon Image, Inc. Edge detection
US8004606B2 (en) * 2005-09-08 2011-08-23 Silicon Image, Inc. Original scan line detection
JP4774265B2 (ja) * 2005-09-30 2011-09-14 Fujitsu Ltd Image encoding device
JP4987355B2 (ja) * 2005-10-14 2012-07-25 Kyocera Corp Imaging apparatus and imaging method
JP4703504B2 (ja) 2006-07-21 2011-06-15 Sony Corp Image processing apparatus, image processing method, and program
JP4778859B2 (ja) 2006-08-10 2011-09-21 Fujitsu Ltd Image processing apparatus, image processing method, and image processing program
KR100818988B1 (ko) 2006-09-05 2008-04-04 Samsung Electronics Co., Ltd. Image signal processing method and apparatus
CN101163250B (zh) * 2006-10-09 2011-07-13 Beihang University Video stream fault tolerance method based on boundary gradient
CN101197911B (zh) * 2006-12-05 2010-08-25 Quanta Computer Inc. Image edge enhancement method and device thereof
WO2008076566A1 (en) 2006-12-20 2008-06-26 Anchor Bay Technologies, Inc. Noise cancellation
JP4846644B2 (ja) * 2007-03-30 2011-12-28 Toshiba Corp Video signal interpolation device and video signal interpolation method
JP4863505B2 (ja) * 2007-08-07 2012-01-25 MegaChips Corp Image processing device
WO2009038559A1 (en) * 2007-09-19 2009-03-26 Thomson Licensing System and method for scaling images
JP2009098925A (ja) * 2007-10-17 2009-05-07 Sony Corp Image processing apparatus, image processing method, and program
US8452117B2 (en) * 2009-02-10 2013-05-28 Silicon Image, Inc. Block noise detection and filtering
JP2010211466A (ja) * 2009-03-10 2010-09-24 Canon Inc Image processing apparatus, image processing method, and program
WO2011077334A1 (en) * 2009-12-22 2011-06-30 Koninklijke Philips Electronics N.V. Bone suppression in x-ray radiograms
JP5724185B2 (ja) * 2010-03-04 2015-05-27 Sony Corp Image processing apparatus, image processing method, and program
GB2492039B (en) * 2010-05-11 2016-01-06 Zoran France Two dimensional super resolution scaling
CN101930599B (zh) * 2010-08-24 2013-02-13 Huang Weiping Medical image enhancement method and system
JP5632680B2 (ja) * 2010-08-25 2014-11-26 Hitachi Aloka Medical Ltd Ultrasonic image processing apparatus
US8494308B2 (en) * 2011-03-10 2013-07-23 Sharp Laboratories Of America, Inc. Image upscaling based upon directional interpolation
JP2011125757A (ja) * 2011-03-30 2011-06-30 Hitachi Aloka Medical Ltd Ultrasonic image data processing apparatus
WO2012175010A1 (en) * 2011-06-24 2012-12-27 Technicolor (China) Technology Co., Ltd. Method and device for processing of an image
CN102521803B (zh) * 2011-11-29 2013-12-11 Qingdao Hisense Xinxin Technology Co., Ltd. Anti-aliasing method and device for image scaling
CN103456255B (zh) * 2012-05-31 2016-06-15 欣德洺企业有限公司 Display pixel driving system and display sub-pixel driving process
KR101934261B1 (ko) * 2012-07-09 2019-01-03 Samsung Electronics Co., Ltd. Method and apparatus for changing image resolution, and electronic device including the apparatus
WO2014078985A1 (en) * 2012-11-20 2014-05-30 Thomson Licensing Method and apparatus for image regularization
US9659348B1 (en) * 2014-05-12 2017-05-23 Marvell International Ltd. Methods and systems for high definition scaling with low hardware complexity
JP6473608B2 (ja) 2014-11-27 2019-02-20 Samsung Display Co., Ltd. Image processing apparatus, image processing method, and program
US20170061580A1 (en) * 2015-08-28 2017-03-02 Qualcomm Incorporated Rescaling and/or reconstructing image data with directional interpolation
JP6619638B2 (ja) * 2015-12-09 2019-12-11 Eizo Corp Image processing apparatus and program
CN106023160B (zh) * 2016-05-11 2018-11-02 Central South University Blast furnace burden surface image edge detection method and device
CN106482739B (zh) * 2016-11-30 2020-07-17 Inventec (Shanghai) Technology Co., Ltd. Automated guided vehicle navigation method
JP7265316B2 (ja) * 2017-12-28 2023-04-26 Samsung Electronics Co., Ltd. Image processing apparatus and image processing method
GB2570872A (en) * 2018-01-31 2019-08-14 Res & Innovation Uk Radar image processing

Citations (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS55133179A (en) 1979-04-03 1980-10-16 Ricoh Co Ltd Picture processing system
US5072384A (en) * 1988-11-23 1991-12-10 Arch Development Corp. Method and system for automated computerized analysis of sizes of hearts and lungs in digital chest radiographs
JPH06195457A (ja) 1992-08-26 1994-07-15 Minolta Camera Co Ltd Image processing device
US5373322A (en) * 1993-06-30 1994-12-13 Eastman Kodak Company Apparatus and method for adaptively interpolating a full color image utilizing chrominance gradients
JPH06348842A (ja) 1993-06-11 1994-12-22 Hitachi Ltd Noise reduction filter
US5392137A (en) * 1992-04-30 1995-02-21 Ricoh Company, Ltd. Image processing apparatus in which filtering is selected for input image characteristics
JPH07262368A (ja) 1994-02-18 1995-10-13 Siemens Ag Image signal processing method
JPH0822539A (ja) 1994-07-06 1996-01-23 Hitachi Ltd Image processing method
US5506619A (en) * 1995-03-17 1996-04-09 Eastman Kodak Company Adaptive color plan interpolation in single sensor color electronic camera
US5515181A (en) * 1992-03-06 1996-05-07 Fuji Xerox Co., Ltd. Image reading apparatus providing high quality images through synthesis of segmented image data
US5600731A (en) * 1991-05-09 1997-02-04 Eastman Kodak Company Method for temporally adaptive filtering of frames of a noisy image sequence using motion estimation
US5668888A (en) * 1990-11-21 1997-09-16 Arch Development Corporation Method and system for automatic detection of ribs and pneumothorax in digital chest radiographs
US5771318A (en) * 1996-06-27 1998-06-23 Siemens Corporate Research, Inc. Adaptive edge-preserving smoothing filter
US5798830A (en) * 1993-06-17 1998-08-25 Ultrapointe Corporation Method of establishing thresholds for image comparison
US5808735A (en) * 1993-06-17 1998-09-15 Ultrapointe Corporation Method for characterizing defects on semiconductor wafers
US5867592A (en) * 1994-02-23 1999-02-02 Matsushita Electric Works, Ltd. Method of utilizing edge images of a circular surface for detecting the position, posture, and shape of a three-dimensional objective having the circular surface part
US5870103A (en) * 1996-09-25 1999-02-09 Eastman Kodak Company Method for creating realistic-looking composite images
US5903660A (en) * 1997-07-16 1999-05-11 The Regents Of The University Of California Automatic background recognition and removal (ABRR) in projection digital radiographic images (PDRI)
US5930007A (en) * 1993-10-08 1999-07-27 Matsushita Electric Industrial Co., Ltd. Area recognizing device for image signals
US5960371A (en) * 1997-09-04 1999-09-28 Schlumberger Technology Corporation Method of determining dips and azimuths of fractures from borehole images
US6144697A (en) * 1998-02-02 2000-11-07 Purdue Research Foundation Equalization techniques to reduce intersymbol interference
US6337925B1 (en) * 2000-05-08 2002-01-08 Adobe Systems Incorporated Method for determining a border in a complex scene with applications to image masking
US6388706B1 (en) * 1996-09-18 2002-05-14 Konica Corporation Image processing method for actively edge-enhancing image data obtained by an electronic camera
US6445832B1 (en) * 2000-10-10 2002-09-03 Lockheed Martin Corporation Balanced template tracker for tracking an object image sequence
US6463167B1 (en) * 1996-09-19 2002-10-08 Philips Medical Systems Technologies Ltd. Adaptive filtering
US20020159650A1 (en) * 2000-07-06 2002-10-31 Seiko Epson Corporation Image processing apparatus and recording medium, and image processing apparatus
US20020167602A1 (en) * 2001-03-20 2002-11-14 Truong-Thao Nguyen System and method for asymmetrically demosaicing raw data images using color discontinuity equalization
US20030053161A1 (en) * 2001-09-13 2003-03-20 Guo Li Method and system for enhancing images using edge orientation
US20030107770A1 (en) * 2001-07-11 2003-06-12 Applied Materials, Inc. Algorithm for adjusting edges of grayscale pixel-map images
JP2003224715A (ja) 2002-01-31 2003-08-08 Sony Corp Image processing circuit and image processing method
US20030185420A1 (en) * 2002-03-29 2003-10-02 Jason Sefcik Target detection method and system
US20030190075A1 (en) * 2001-10-31 2003-10-09 Infowrap, Inc. Method for illumination independent change detection in a pair of registered gray images
US6678405B1 (en) * 1999-06-08 2004-01-13 Sony Corporation Data processing apparatus, data processing method, learning apparatus, learning method, and medium
US6681053B1 (en) * 1999-08-05 2004-01-20 Matsushita Electric Industrial Co., Ltd. Method and apparatus for improving the definition of black and white text and graphics on a color matrix digital display device
US6718072B1 (en) * 1999-12-22 2004-04-06 International Business Machines Corporation Image conversion method, image processing apparatus, and image display apparatus
US6735341B1 (en) * 1998-06-18 2004-05-11 Minolta Co., Ltd. Image processing device and method and recording medium for recording image processing program for same
US20040119884A1 (en) * 2002-12-19 2004-06-24 Hong Jiang Edge adaptive spatial temporal deinterlacing
US6757442B1 (en) * 2000-11-22 2004-06-29 Ge Medical Systems Global Technology Company, Llc Image enhancement method with simultaneous noise reduction, non-uniformity equalization, and contrast enhancement
US6778698B1 (en) * 1999-06-11 2004-08-17 Pts Corporation Method and apparatus for digital image segmentation
US20040190787A1 (en) * 2002-12-27 2004-09-30 Yoshihiro Nakami Image noise reduction
US20050002570A1 (en) * 2002-07-10 2005-01-06 Northrop Grumman Corporation System and method for analyzing a contour of an image by applying a sobel operator thereto
US20050036689A1 (en) * 2003-07-22 2005-02-17 L-3 Communications Security And Detection Systems Methods and apparatus for detecting objects in baggage
US20050134730A1 (en) * 2003-12-23 2005-06-23 Lsi Logic Corporation Method and apparatus for video deinterlacing and format conversion
US20050201616A1 (en) * 2004-03-15 2005-09-15 Microsoft Corporation High-quality gradient-corrected linear interpolation for demosaicing of color images
US20060140497A1 (en) * 2003-02-28 2006-06-29 Sony Corporation Image processing device and method, recording medium, and program
US20060291741A1 (en) * 2005-02-10 2006-12-28 Sony Corporation Image processing apparatus, image processing method, program, and recording medium therefor
US7158632B2 (en) * 2003-08-20 2007-01-02 Intel Corporation Adaptive scaling and echo reduction
US7379626B2 (en) * 2004-08-20 2008-05-27 Silicon Optix Inc. Edge adaptive image expansion and enhancement system and method

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02162475A (ja) * 1988-12-15 1990-06-22 Dainippon Screen Mfg Co Ltd Image contour correction method
JPH05225323A (ja) * 1992-02-10 1993-09-03 Fuji Photo Film Co Ltd Image interpolation method
JP2858530B2 (ja) * 1993-12-27 1999-02-17 Nec Corp Edge enhancement device
JP3494764B2 (ja) * 1995-08-09 2004-02-09 Fuji Photo Film Co Ltd Image data interpolation calculation method and apparatus
JPH1063824A (ja) * 1996-08-16 1998-03-06 Fujitsu Ltd Image data interpolation and smoothing device
JP2000013596A (ja) * 1998-06-18 2000-01-14 Minolta Co Ltd Image processing apparatus and method, and recording medium recording image processing program
JP2000090268A (ja) * 1998-09-08 2000-03-31 Nec Corp Vehicle area detection method
JP4679710B2 (ja) * 2000-10-25 2011-04-27 Fujifilm Corp Noise suppression processing apparatus and recording medium
JP2003016442A (ja) * 2001-07-03 2003-01-17 Sony Corp Image processing apparatus and method, recording medium, and program
JP4053362B2 (ja) * 2002-07-08 2008-02-27 Sharp Corp Interpolation processing method, interpolation processing program, recording medium recording the same, image processing apparatus, and image forming apparatus provided therewith

Patent Citations (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS55133179A (en) 1979-04-03 1980-10-16 Ricoh Co Ltd Picture processing system
US5072384A (en) * 1988-11-23 1991-12-10 Arch Development Corp. Method and system for automated computerized analysis of sizes of hearts and lungs in digital chest radiographs
US5668888A (en) * 1990-11-21 1997-09-16 Arch Development Corporation Method and system for automatic detection of ribs and pneumothorax in digital chest radiographs
US5600731A (en) * 1991-05-09 1997-02-04 Eastman Kodak Company Method for temporally adaptive filtering of frames of a noisy image sequence using motion estimation
US5515181A (en) * 1992-03-06 1996-05-07 Fuji Xerox Co., Ltd. Image reading apparatus providing high quality images through synthesis of segmented image data
US5392137A (en) * 1992-04-30 1995-02-21 Ricoh Company, Ltd. Image processing apparatus in which filtering is selected for input image characteristics
JPH06195457A (ja) 1992-08-26 1994-07-15 Minolta Camera Co Ltd Image processing apparatus
JPH06348842A (ja) 1993-06-11 1994-12-22 Hitachi Ltd Noise reduction filter
US5808735A (en) * 1993-06-17 1998-09-15 Ultrapointe Corporation Method for characterizing defects on semiconductor wafers
US5798830A (en) * 1993-06-17 1998-08-25 Ultrapointe Corporation Method of establishing thresholds for image comparison
US5373322A (en) * 1993-06-30 1994-12-13 Eastman Kodak Company Apparatus and method for adaptively interpolating a full color image utilizing chrominance gradients
US5930007A (en) * 1993-10-08 1999-07-27 Matsushita Electric Industrial Co., Ltd. Area recognizing device for image signals
JPH07262368A (ja) 1994-02-18 1995-10-13 Siemens Ag Image signal processing method
US5867592A (en) * 1994-02-23 1999-02-02 Matsushita Electric Works, Ltd. Method of utilizing edge images of a circular surface for detecting the position, posture, and shape of a three-dimensional objective having the circular surface part
JPH0822539A (ja) 1994-07-06 1996-01-23 Hitachi Ltd Image processing method
US5506619A (en) * 1995-03-17 1996-04-09 Eastman Kodak Company Adaptive color plan interpolation in single sensor color electronic camera
US5771318A (en) * 1996-06-27 1998-06-23 Siemens Corporate Research, Inc. Adaptive edge-preserving smoothing filter
US6388706B1 (en) * 1996-09-18 2002-05-14 Konica Corporation Image processing method for actively edge-enhancing image data obtained by an electronic camera
US6463167B1 (en) * 1996-09-19 2002-10-08 Philips Medical Systems Technologies Ltd. Adaptive filtering
US5870103A (en) * 1996-09-25 1999-02-09 Eastman Kodak Company Method for creating realistic-looking composite images
US5903660A (en) * 1997-07-16 1999-05-11 The Regents Of The University Of California Automatic background recognition and removal (ABRR) in projection digital radiographic images (PDRI)
US5960371A (en) * 1997-09-04 1999-09-28 Schlumberger Technology Corporation Method of determining dips and azimuths of fractures from borehole images
US6144697A (en) * 1998-02-02 2000-11-07 Purdue Research Foundation Equalization techniques to reduce intersymbol interference
US6735341B1 (en) * 1998-06-18 2004-05-11 Minolta Co., Ltd. Image processing device and method and recording medium for recording image processing program for same
US6678405B1 (en) * 1999-06-08 2004-01-13 Sony Corporation Data processing apparatus, data processing method, learning apparatus, learning method, and medium
US6778698B1 (en) * 1999-06-11 2004-08-17 Pts Corporation Method and apparatus for digital image segmentation
US6681053B1 (en) * 1999-08-05 2004-01-20 Matsushita Electric Industrial Co., Ltd. Method and apparatus for improving the definition of black and white text and graphics on a color matrix digital display device
US6718072B1 (en) * 1999-12-22 2004-04-06 International Business Machines Corporation Image conversion method, image processing apparatus, and image display apparatus
US6337925B1 (en) * 2000-05-08 2002-01-08 Adobe Systems Incorporated Method for determining a border in a complex scene with applications to image masking
US20020159650A1 (en) * 2000-07-06 2002-10-31 Seiko Epson Corporation Image processing apparatus and recording medium, and image processing apparatus
US6445832B1 (en) * 2000-10-10 2002-09-03 Lockheed Martin Corporation Balanced template tracker for tracking an object image sequence
US6757442B1 (en) * 2000-11-22 2004-06-29 Ge Medical Systems Global Technology Company, Llc Image enhancement method with simultaneous noise reduction, non-uniformity equalization, and contrast enhancement
US20020167602A1 (en) * 2001-03-20 2002-11-14 Truong-Thao Nguyen System and method for asymmetrically demosaicing raw data images using color discontinuity equalization
US20030107770A1 (en) * 2001-07-11 2003-06-12 Applied Materials, Inc. Algorithm for adjusting edges of grayscale pixel-map images
US20030053161A1 (en) * 2001-09-13 2003-03-20 Guo Li Method and system for enhancing images using edge orientation
US20030190075A1 (en) * 2001-10-31 2003-10-09 Infowrap, Inc. Method for illumination independent change detection in a pair of registered gray images
JP2003224715A (ja) 2002-01-31 2003-08-08 Sony Corp Image processing circuit and image processing method
US20030185420A1 (en) * 2002-03-29 2003-10-02 Jason Sefcik Target detection method and system
US20050002570A1 (en) * 2002-07-10 2005-01-06 Northrop Grumman Corporation System and method for analyzing a contour of an image by applying a sobel operator thereto
US20040119884A1 (en) * 2002-12-19 2004-06-24 Hong Jiang Edge adaptive spatial temporal deinterlacing
US20040190787A1 (en) * 2002-12-27 2004-09-30 Yoshihiro Nakami Image noise reduction
US20060140497A1 (en) * 2003-02-28 2006-06-29 Sony Corporation Image processing device and method, recording medium, and program
US20050036689A1 (en) * 2003-07-22 2005-02-17 L-3 Communications Security And Detection Systems Methods and apparatus for detecting objects in baggage
US7158632B2 (en) * 2003-08-20 2007-01-02 Intel Corporation Adaptive scaling and echo reduction
US20050134730A1 (en) * 2003-12-23 2005-06-23 Lsi Logic Corporation Method and apparatus for video deinterlacing and format conversion
US20050201616A1 (en) * 2004-03-15 2005-09-15 Microsoft Corporation High-quality gradient-corrected linear interpolation for demosaicing of color images
US7379626B2 (en) * 2004-08-20 2008-05-27 Silicon Optix Inc. Edge adaptive image expansion and enhancement system and method
US20060291741A1 (en) * 2005-02-10 2006-12-28 Sony Corporation Image processing apparatus, image processing method, program, and recording medium therefor

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110075035A1 (en) * 2006-09-13 2011-03-31 Macinnis Alexander Method and System for Motion Compensated Temporal Filtering Using Both FIR and IIR Filtering
US8503812B2 (en) * 2006-09-13 2013-08-06 Broadcom Corporation Method and system for motion compensated temporal filtering using both FIR and IIR filtering
US20080063064A1 (en) * 2006-09-13 2008-03-13 Macinnis Alexander Method and System for Motion Compensated Temporal Filtering Using IIR Filtering
US9305337B2 (en) * 2008-09-04 2016-04-05 Lattice Semiconductor Corporation System, method, and apparatus for smoothing of edges in images to remove irregularities
US20140105517A1 (en) * 2008-09-04 2014-04-17 Silicon Image, Inc. System, method, and apparatus for smoothing of edges in images to remove irregularities
US20120038800A1 (en) * 2010-08-16 2012-02-16 Industry-University Cooperation Foundation Sogang University Method of image processing and image processing apparatus
US8472748B2 (en) * 2010-08-16 2013-06-25 Samsung Electronics Co., Ltd. Method of image processing and image processing apparatus
US20160117804A1 (en) * 2013-07-30 2016-04-28 Byd Company Limited Method and device for enhancing edge of image and digital camera
US9836823B2 (en) * 2013-07-30 2017-12-05 Byd Company Limited Method and device for enhancing edge of image and digital camera
US20150146064A1 (en) * 2013-11-27 2015-05-28 Sony Corporation Image processing apparatus, image processing method, solid-state imaging device, and electronic apparatus
US9659346B2 (en) * 2013-11-27 2017-05-23 Sony Semiconductor Solutions Corporation Image processing apparatus, image processing method, solid-state imaging device, and electronic apparatus configured to calculate a pixel value of a target position in accordance with a weighted value of each pixel on a candidate line of a plurality of candidate lines
US9892509B2 (en) 2015-09-29 2018-02-13 Koninklijke Philips N.V. Visualization of projection X-ray image
US10360680B2 (en) 2015-09-29 2019-07-23 Koninklijke Philips N.V. Visualization of projection X-ray image
US11283963B2 (en) 2018-09-10 2022-03-22 Canon Kabushiki Kaisha Image processing apparatus and image processing method and storage medium
US20200349713A1 (en) * 2018-10-31 2020-11-05 Fei Company Smart metrology on microscope images
US11494914B2 (en) * 2018-10-31 2022-11-08 Fei Company Smart metrology on microscope images

Also Published As

Publication number Publication date
KR101128661B1 (ko) 2012-03-27
CN100565579C (zh) 2009-12-02
US20080123998A1 (en) 2008-05-29
JP2005332130A (ja) 2005-12-02
WO2005111933A1 (ja) 2005-11-24
RU2006101393A (ru) 2007-08-10
RU2367021C2 (ru) 2009-09-10
EP1748386A1 (en) 2007-01-31
CN1806257A (zh) 2006-07-19
JP4534594B2 (ja) 2010-09-01
KR20070011223A (ko) 2007-01-24

Similar Documents

Publication Publication Date Title
US7567724B2 (en) Image processing apparatus, image processing method, program of image processing method, and recording medium in which program of image processing method has been recorded
US7792384B2 (en) Image processing apparatus, image processing method, program, and recording medium therefor
US6664973B1 (en) Image processing apparatus, method for processing and image and computer-readable recording medium for causing a computer to process images
US7406208B2 (en) Edge enhancement process and system
US7483040B2 (en) Information processing apparatus, information processing method, recording medium, and program
US6999099B2 (en) Image processing apparatus and method, recording medium, and program thereof
EP1137258B1 (en) Image processing circuit and method for processing image
US7319496B2 (en) Signal processing apparatus, image display apparatus and signal processing method
JP4216830B2 (ja) Video processing apparatus and method, and computer-readable storage medium
US7813587B2 (en) Method and apparatus for shoot suppression in image detail enhancement
US8369624B2 (en) Image processing apparatus, image processing method, program of image processing method, and recording medium having program of image processing method recorded thereon
KR20050041886A (ko) Noise reduction apparatus and method controlled by global and local statistics
KR20160069453A (ko) Image processing apparatus and image processing method
US8345163B2 (en) Image processing device and method and image display device
JP2002290773A (ja) Image enhancement apparatus and image enhancement program
US20110243464A1 (en) Image processing control device and method
JP4804271B2 (ja) Image processing apparatus and control method therefor
JP4635652B2 (ja) Image processing apparatus, image processing method, program of image processing method, and recording medium recording program of image processing method
JP5901350B2 (ja) Image processing apparatus
JP4176681B2 (ja) Image edge correction apparatus and image edge correction method
JP5014514B2 (ja) Image processing apparatus
JPH103539A (ja) Image processing method

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GOMI, SHINICHIRO;OGATA, MASAMI;UEDA, KAZUHIKO;REEL/FRAME:017464/0152

Effective date: 20051102

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20210728