WO2017098832A1 - Image processing apparatus and program - Google Patents
- Publication number
- WO2017098832A1 (PCT/JP2016/081977)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- ringing
- pixel
- image data
- target pixel
- degree
- Prior art date
Classifications
- G06T5/70—
- G06T5/73—
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration by the use of local operators
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/40—Picture signal circuits
- H04N1/409—Edge or detail enhancement; Noise or error suppression
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20004—Adaptive image processing
- G06T2207/20012—Locally adaptive
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20192—Edge enhancement; Edge preservation
Definitions
- The present invention relates to an image processing apparatus and a program capable of suppressing ringing that occurs in converted image data obtained by applying image conversion to input image data, while taking the sharpness of the image data into account.
- Patent Document 1 discloses an image sharpening device whose object is to sharpen a blurred image while suppressing the enhancement of ringing. The device includes: a storage unit that stores a temporarily sharpened image obtained by sharpening the input image, initialized with the pixel values of the input image; a determination unit that determines, for each pixel of the input image, whether the pixel belongs to an edge region; a unit that obtains the upper limit value and the lower limit value of a limited range of pixel values from the distribution of pixel values of the pixels around an edge pixel determined to belong to an edge region in the input image, or from the distribution of pixel values of the pixels around the corresponding pixel of the temporarily sharpened image; and a conversion unit that writes the pixel values of the sharpened image.
- The present invention has been made in view of such circumstances, and aims to provide an image processing apparatus and a program capable of suppressing ringing occurring in converted image data subjected to image conversion processing, in consideration of the sharpness of the image data.
- According to an aspect of the present invention, there is provided an image processing apparatus including: a ringing occurrence degree estimation unit that estimates, based on input image data, a ringing occurrence degree representing the degree to which ringing occurs; a sharpness estimation unit that estimates, based on the input image data, a sharpness enhancement degree representing the strength of sharpness of the input image represented by the input image data; and an adjustment unit that executes, for each pixel of converted image data obtained by image conversion of the input image data, an adjustment process that suppresses the degree of the image conversion based on the ringing occurrence degree and/or the sharpness enhancement degree.
- The inventors of the present invention noted that if the image represented by the input image data is an unblurred image (that is, an image with a high degree of sharpness), correction in the image conversion processing becomes excessive and ringing is likely to occur in the output image. They also noted that if the image represented by the input image data is a blurred image (that is, an image with a low degree of sharpness), ringing is likely to occur around edges even when the image corrected by the image conversion processing is output. The inventors thus arrived at the concept of suppressing ringing in consideration of the sharpness of the image data in order to further improve image quality, and completed the present invention.
- Preferably, the ringing occurrence degree estimation unit includes a ringing flag setting unit that takes one pixel included in the input image data as a target pixel and sets, based on a feature amount in a peripheral pixel region including the target pixel and a plurality of peripheral pixels around the target pixel, a ringing flag for a target pixel in which ringing is estimated to occur, and the adjustment unit executes a different adjustment process for the target pixel depending on whether or not the ringing flag is set for the target pixel.
- Preferably, the adjustment unit executes, for a target pixel for which the ringing flag is set, an adjustment process that suppresses the degree of the image conversion more than for a target pixel for which the ringing flag is not set.
- Preferably, the adjustment unit executes the adjustment process based on the sharpness enhancement degree for a target pixel for which the ringing flag is not set.
- Preferably, the adjustment unit executes the adjustment process based on the ringing occurrence degree and the sharpness enhancement degree for a target pixel for which the ringing flag is set.
- Preferably, the adjustment process by the adjustment unit calculates the gradation value of the target pixel included in output image data by mixing, at a predetermined ratio, the gradation value of the target pixel included in the input image data and the gradation value of the target pixel included in the converted image data.
- Preferably, the image processing apparatus includes an adjustment coefficient calculation unit that calculates an adjustment coefficient by which the gradation value of the target pixel included in the converted image data is multiplied, and the adjustment unit multiplies the gradation value of the target pixel included in the converted image data by the adjustment coefficient and multiplies the gradation value of the target pixel included in the input image data by a value derived from the adjustment coefficient.
- Preferably, the adjustment coefficient calculation unit calculates the adjustment coefficient based on the sharpness enhancement degree, or on the sharpness enhancement degree and the ringing occurrence degree.
- Preferably, the ringing occurrence degree estimation unit estimates the ringing occurrence degree of the target pixel based on the number of pixels in the peripheral pixel region for which the ringing flag is set.
- Preferably, the sharpness estimation unit estimates the sharpness enhancement degree based on the sum of the variance values of the pixel values included in image data obtained by processing the input image data with a low-pass filter, and the sum of the variance values of the pixel values included in the input image data.
- Preferably, there is provided a ringing estimation device including: a big edge determination unit that determines that a target pixel is a big edge when a feature amount in a peripheral pixel region including the target pixel in the input image data and a plurality of peripheral pixels around it exceeds a predetermined first threshold; a flat pixel determination unit that determines that the target pixel is a flat pixel when the feature amount in the peripheral pixel region is below a predetermined second threshold; a sum calculation unit that calculates, when the target pixel is determined to be a flat pixel, the total number of flat pixels included in the peripheral pixel region; a flat portion determination unit that determines that the periphery of the target pixel is a flat portion when the total number of flat pixels included in the peripheral pixel region exceeds a predetermined third threshold; and an estimation unit that estimates that ringing occurs in the target pixel when the big edge determination unit determines that the target pixel is a big edge, or when the flat portion determination unit determines that the periphery of the target pixel is a flat portion and the peripheral pixel region includes a big edge.
- Preferably, there is provided a program that causes a computer to function as: a ringing occurrence degree estimation unit that estimates, based on input image data, a ringing occurrence degree representing the degree to which ringing occurs; a sharpness estimation unit that estimates, based on the input image data, a sharpness enhancement degree representing the strength of sharpness of the input image represented by the input image data; and an adjustment unit that executes, for each pixel of converted image data obtained by image conversion of the input image data, an adjustment process that suppresses the degree of the image conversion based on the ringing occurrence degree and/or the sharpness enhancement degree.
- FIG. 1 is a block diagram of an image processing apparatus 1 according to an embodiment of the present invention. FIG. 2 is a flowchart showing the process of setting a ringing flag for a target pixel according to an embodiment of the present invention. FIG. 3 is a conceptual diagram explaining the process of setting a ringing flag for a target pixel according to an embodiment of the present invention. FIG. 4 is a diagram showing a ringing estimation result according to an embodiment of the present invention; in FIG. 4, pixels estimated to cause ringing are shown in white.
- FIG. 1 is a block diagram of an image processing apparatus according to an embodiment of the present invention.
- the image processing apparatus 1 is realized by a computer having at least a control unit and a storage unit (not shown).
- The control unit performs various kinds of arithmetic processing and is configured by, for example, a CPU.
- the storage unit stores various data and programs, and includes, for example, a memory, an HDD, or an SSD.
- The program may be preinstalled at the time of shipment of the image processing apparatus 1, may be downloaded as an application from a site on the Web, or may be transferred from another information processing apparatus by wired or wireless communication.
- the image processing apparatus 1 includes an input image output unit 30, an image conversion processing unit 10, and a ringing suppression unit 20.
- the ringing suppression unit 20 includes a ringing occurrence degree estimation unit 21, a sharpness estimation unit 22, an adjustment coefficient calculation unit 23, and an adjustment unit 24.
- the input image output unit 30 outputs input image data S1 that is image data input to the image processing apparatus 1 to the image conversion processing unit 10 and the ringing suppression unit 20.
- the image conversion processing unit 10 receives the input image data S1 output from the input image output unit 30, and executes various image conversion processes on the input image data S1.
- The image conversion processing is not particularly limited; examples include edge sharpening processing that enhances edges, image conversion processing that separates the frequency components of the input image data S1 into a low-frequency component and a high-frequency component and corrects the high-frequency component, image conversion processing that corrects the luminance of the high-frequency component of the input image data S1, and learning-based image reconstruction processing using sparse representation.
- The ringing suppression unit 20 executes, for each pixel of the image data obtained by the image conversion processing of the image conversion processing unit 10 (hereinafter, converted image data S2), an adjustment process that suppresses ringing based on the ringing occurrence degree and/or the sharpness enhancement degree. For example, if the image represented by the input image data S1 is an unblurred image (that is, an image with a high degree of sharpness), the correction in the image conversion processing by the image conversion processing unit 10 becomes excessive, and ringing may occur in the converted image data S2. The ringing suppression unit 20 therefore suppresses the occurrence of ringing by suppressing such excessive correction.
- the ringing suppression unit 20 executes processing for suppressing such ringing.
- The ringing occurrence degree estimation unit 21 estimates the ringing occurrence degree based on a feature amount in a peripheral pixel region including a target pixel included in the input image data S1 and a plurality of peripheral pixels around the target pixel. First, the ringing occurrence degree estimation unit 21 sets a ringing flag for a target pixel in which ringing is estimated to occur, based on the feature amount in the peripheral pixel region. Then, the ringing occurrence degree of the target pixel is estimated based on the number of pixels for which the ringing flag is set in the peripheral pixel region centered on the target pixel. The process of setting the ringing flag for the target pixel is described with reference to FIG. 2.
- FIG. 2 is a flowchart showing a process in which the ringing occurrence degree estimation unit 21 according to an embodiment of the present invention sets a ringing flag for a pixel of interest.
- The ringing occurrence degree estimation unit 21 exploits the characteristics that ringing (1) occurs around a big edge and (2) is conspicuous in flat parts around the edge, and estimates, for a target pixel included in the input image data S1, a ringing occurrence degree representing the degree to which ringing occurs. Specifically, the ringing occurrence degree is estimated based on a feature amount in a peripheral pixel region including the target pixel and a plurality of peripheral pixels around the target pixel.
- the peripheral pixel region is a region including a plurality of pixels around the pixel of interest with the pixel of interest at the center.
- The peripheral pixel region can be, for example, 3 × 3, 5 × 5, 7 × 7, or 9 × 9 pixels centered on the target pixel.
- the peripheral pixel region is not limited to these, and may be a plurality of pixels centered on the target pixel, and may have different numbers of pixels in the main scanning direction and the sub-scanning direction. Further, any number of pixels can be adopted as the number of pixels.
- In S1, one pixel included in the input image data S1 output by the input image output unit 30 is set as the target pixel, and the feature amount in the peripheral pixel region of the target pixel is calculated.
- any amount can be used as the feature amount.
- the distribution of luminance in the peripheral pixel region of 3 ⁇ 3 pixels centered on the target pixel, the average, or the difference in luminance between the target pixel and the adjacent pixel adjacent to the target pixel may be used as the feature amount.
- brightness or saturation may be used instead of luminance.
- the number of pixels constituting the peripheral pixel region can be an arbitrary number of pixels.
- the process in S1 is executed by a feature amount calculation unit controlled by a control unit (not shown).
- In S2, it is determined whether or not the target pixel is a big edge. This determination is made by comparing the feature amount of the target pixel with a predetermined first threshold (hereinafter, BIG_EDGE_TH). BIG_EDGE_TH is used for the big edge determination; the larger its value, the less likely the target pixel is estimated to cause ringing. If feature amount > BIG_EDGE_TH (Yes), the target pixel is determined to be a big edge and the process proceeds to S7, where a ringing flag is set for the target pixel determined to be a big edge; otherwise (No), the process proceeds to S3.
- In S3, it is determined whether or not the target pixel is a flat pixel. This determination is made by comparing the feature amount of the target pixel with a predetermined second threshold (hereinafter, FLAT_TH). FLAT_TH is used for the flat pixel determination; the smaller its value, the less likely the target pixel is estimated to cause ringing. If feature amount < FLAT_TH (Yes), the target pixel is determined to be a flat pixel, and the process proceeds to S4. Otherwise (No), it is determined that the target pixel is not a flat pixel, and the process ends. The process in S3 is executed by a flat pixel determination unit controlled by a control unit (not shown).
- In S4, the total number of flat pixels in the peripheral pixel region centered on the target pixel is calculated. For example, as shown in FIG. 3(a), if the target pixel is determined to be a flat pixel, whether feature amount < FLAT_TH is satisfied is determined for each of the other pixels in the peripheral pixel region around the target pixel. The feature amounts of the other pixels may be those previously calculated in S1.
- the peripheral pixel region can be, for example, 3 ⁇ 3, 5 ⁇ 5, 7 ⁇ 7, 9 ⁇ 9, 11 ⁇ 11, 13 ⁇ 13, and 15 ⁇ 15 pixels. Preferably, it is 5 ⁇ 5 to 13 ⁇ 13. More preferably, it is 7 ⁇ 7 to 11 ⁇ 11.
- processing is performed using a 9 ⁇ 9 peripheral pixel region.
- the number of pixels constituting the peripheral pixel region is arbitrary.
- the process in S4 is executed by a sum calculation unit controlled by a control unit (not shown).
- In S5, it is determined whether or not the periphery of the target pixel is a flat portion. This determination is made by comparing the total number of flat pixels in the peripheral pixel region with a predetermined third threshold (hereinafter, SUM_FLAT_TH). SUM_FLAT_TH is used to determine whether the periphery of the target pixel is a flat portion; the larger its value, the less likely the target pixel is estimated to cause ringing. For example, when SUM_FLAT_TH is set to 45, in the example of FIG. 3(a) the total number of flat pixels in the peripheral pixel region is 53 including the target pixel (52 pixels other than the target pixel satisfy feature amount < FLAT_TH), so the total number of flat pixels > SUM_FLAT_TH is satisfied. The peripheral pixel region centered on the target pixel is therefore determined to be a flat portion (Yes), and the process proceeds to S6. On the other hand, if the total number of flat pixels > SUM_FLAT_TH is not satisfied (No), it is determined that the peripheral pixel region centered on the target pixel is not a flat portion, and the process ends.
- the process in S5 is executed by a flat portion determination unit controlled by a control unit (not shown).
- the peripheral pixel region can be, for example, 3 ⁇ 3, 5 ⁇ 5, 7 ⁇ 7, 9 ⁇ 9, 11 ⁇ 11, 13 ⁇ 13, and 15 ⁇ 15 pixels.
- Preferably, it is 5 × 5 to 13 × 13, and more preferably 7 × 7 to 11 × 11.
- processing is performed using a 9 ⁇ 9 peripheral pixel region.
- In S6, it is determined whether or not the peripheral pixel region includes a big edge. As in S2, this determination is made based on whether feature amount > BIG_EDGE_TH is satisfied for each of the other pixels in the peripheral pixel region. The feature amounts of the other pixels may be those previously calculated in S1. For example, in the case of FIG. 3(b), a pixel satisfying feature amount > BIG_EDGE_TH (a pixel determined to be a big edge) exists in the peripheral pixel region (Yes), so the process proceeds to S7. On the other hand, when there is no big edge in the peripheral pixel region (No), the process ends.
- the process in S6 is executed by a peripheral pixel confirmation unit controlled by a control unit (not shown).
- In S7, a ringing flag is set for the target pixel. A target pixel that reaches S7 via S3 to S6 satisfies the characteristics that ringing (1) exists around a big edge and (2) is conspicuous in the flat portion around the edge, so ringing is estimated to occur there and a ringing flag is set. A target pixel that reaches S7 via the S2 route has been determined to be a big edge, where ringing is likewise estimated to occur, and a ringing flag is set.
- the process in S7 is executed by a ringing flag setting unit controlled by a control unit (not shown).
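The flag-setting flow of S1 to S7 can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the feature amount is assumed here to be the 3 × 3 luminance range (the patent also allows variance, averages, or neighbor differences), the threshold values are placeholders, and border pixels use clipped windows.

```python
def set_ringing_flags(img, big_edge_th=64, flat_th=4, sum_flat_th=45, r=4):
    """Sketch of S1-S7: flag pixels where ringing is estimated to occur.

    img: 2D list of luminance values. r=4 gives a 9x9 peripheral pixel
    region for S4-S6. All names and thresholds are illustrative.
    """
    h, w = len(img), len(img[0])

    def feature(y, x):  # S1: local contrast (max - min) in a 3x3 window
        vals = [img[j][i]
                for j in range(max(0, y - 1), min(h, y + 2))
                for i in range(max(0, x - 1), min(w, x + 2))]
        return max(vals) - min(vals)

    feat = [[feature(y, x) for x in range(w)] for y in range(h)]
    flags = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if feat[y][x] > big_edge_th:      # S2: big edge -> S7
                flags[y][x] = True
                continue
            if feat[y][x] >= flat_th:         # S3: not a flat pixel -> end
                continue
            region = [(j, i)
                      for j in range(max(0, y - r), min(h, y + r + 1))
                      for i in range(max(0, x - r), min(w, x + r + 1))]
            # S4: total number of flat pixels (target pixel included)
            n_flat = sum(feat[j][i] < flat_th for j, i in region)
            if n_flat <= sum_flat_th:         # S5: not a flat portion -> end
                continue
            if any(feat[j][i] > big_edge_th for j, i in region):  # S6
                flags[y][x] = True            # S7: set the ringing flag
    return flags
```

With a synthetic half-black, half-white image, the edge columns are flagged as big edges and flat pixels near the edge are flagged as ringing candidates, while flat pixels far from any big edge stay unflagged.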
- The ringing occurrence degree estimation unit 21 then counts the number of ringing flags set for the pixels in the peripheral pixel region centered on the target pixel. The count is normalized by dividing it by the number of pixels constituting the peripheral pixel region, yielding a value of 0 to 1 as the ringing occurrence degree representing the degree to which ringing occurs. For example, in a 7 × 7 peripheral pixel region, the number of pixels for which the ringing flag is set takes a value of 0 to 49, and dividing by 49 normalizes it to the range 0 to 1, giving the ringing occurrence degree. The number of pixels constituting the peripheral pixel region is arbitrary, and may be, for example, 9 × 9 pixels.
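The normalization just described (counting flags in the peripheral pixel region and dividing by its pixel count) can be sketched as follows; skipping out-of-image positions at the border is an assumption, since the patent does not specify border handling.

```python
def ringing_occurrence_degree(flags, y, x, r=3):
    """Normalized count of ringing flags in the (2r+1) x (2r+1) peripheral
    pixel region centered on (y, x); r=3 gives the 7x7 / 49-pixel case.
    flags: 2D list of booleans from the flag-setting step."""
    h, w = len(flags), len(flags[0])
    n = (2 * r + 1) ** 2  # 49 for r=3
    count = sum(flags[j][i]
                for j in range(max(0, y - r), min(h, y + r + 1))
                for i in range(max(0, x - r), min(w, x + r + 1)))
    return count / n  # 0.0 (no flags) to 1.0 (all pixels flagged)
```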
- the sharpness estimation unit 22 estimates a sharpness enhancement degree representing the strength of the sharpness of the input image data S1.
- The sharpness enhancement degree is an index representing the degree of blurring of an image. For example, an image that does not blur even when enlarged has a high sharpness enhancement degree and is a high-sharpness image, whereas an image that blurs when enlarged has a low sharpness enhancement degree and is a low-sharpness image.
- The sharpness enhancement degree estimation process by the sharpness estimation unit 22 is not particularly limited. For example, the sharpness enhancement degree can be estimated based on the sum S_diff of the variance values of the pixel values included in image data obtained by processing the input image data S1 with a low-pass filter, and the sum S_org of the variance values of the pixel values included in the input image data S1. Specifically, the value of S_diff / S_org is used as the sharpness enhancement degree. In general, image data processed with a low-pass filter has a smaller sum of variance values of pixel values than the original image data, so the relationship S_diff < S_org holds; the sharpness enhancement degree (S_diff / S_org) therefore falls in the range of 0 to 1.
- the sharpness estimation unit 22 outputs the sharpness enhancement degree to the adjustment coefficient calculation unit 23.
- the sharpness enhancement degree has the same value for all the pixels included in the input image data S1.
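A hedged sketch of this S_diff / S_org estimate, assuming a simple 3 × 3 box blur as the low-pass filter and taking the "sum of variance values" to be the variance of all pixel values of each image; the patent leaves the exact filter and aggregation open.

```python
def sharpness_enhancement_degree(img):
    """Sketch: ratio of variance after low-pass filtering (S_diff) to
    variance of the original image (S_org), in [0, 1]."""
    h, w = len(img), len(img[0])

    def box_blur(src):  # assumed low-pass filter: 3x3 mean, clipped at borders
        out = [[0.0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                vals = [src[j][i]
                        for j in range(max(0, y - 1), min(h, y + 2))
                        for i in range(max(0, x - 1), min(w, x + 2))]
                out[y][x] = sum(vals) / len(vals)
        return out

    def variance(src):  # population variance of all pixel values
        vals = [v for row in src for v in row]
        mean = sum(vals) / len(vals)
        return sum((v - mean) ** 2 for v in vals) / len(vals)

    s_org = variance(img)
    s_diff = variance(box_blur(img))  # blurring reduces variance
    return s_diff / s_org if s_org else 0.0
```

A sharp step-edge image loses a noticeable share of its variance under the blur, so its ratio lands strictly between 0 and 1; a perfectly flat image is treated as degree 0 here to avoid division by zero (another assumption).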
- the adjustment coefficient calculation unit 23 calculates an adjustment coefficient ⁇ that multiplies the gradation value of the target pixel included in the converted image data.
- ⁇ is a value to be multiplied by the converted image data S2 in the adjustment process described later. As the value of ⁇ is smaller, excessive correction to the converted image data S2 is suppressed.
- ⁇ is calculated by a different method depending on whether or not a ringing flag is set for the pixel of interest. Specifically, ⁇ is calculated using the following equation.
- both (1-ringing occurrence degree) and (1-sharpness enhancement degree) are in the range of 0 to 1. Therefore, ⁇ is smaller when the ringing flag is set for the pixel of interest than when the ringing flag is not set for the pixel of interest. In other words, when the ringing flag is set for the target pixel, ⁇ is set so that excessive correction on the converted image data S2 is suppressed more than when the ringing flag is not set for the target pixel. Calculated.
- the adjustment coefficient calculation unit 23 calculates ⁇ based on the sharpness enhancement degree when the ringing flag is not set for the target pixel. Further, when the ringing flag is set for the target pixel, the adjustment coefficient calculation unit 23 calculates ⁇ based on the ringing occurrence degree and the sharpness enhancement degree. Then, the adjustment coefficient calculation unit 23 outputs the calculated ⁇ to the adjustment unit 24.
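The two α equations given in the description (α = (1 − ringing occurrence degree) × (1 − sharpness enhancement degree) when the flag is set; α = 1 − sharpness enhancement degree otherwise) can be written directly; the function and parameter names are illustrative.

```python
def adjustment_coefficient(sharpness, ringing=0.0, flag=False):
    """alpha per the description: both inputs lie in [0, 1], so alpha
    with the ringing flag set is never larger than alpha without it."""
    if flag:
        return (1.0 - ringing) * (1.0 - sharpness)
    return 1.0 - sharpness
```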
- The adjustment unit 24 executes a different adjustment process for the target pixel depending on whether or not a ringing flag is set for it. Specifically, the adjustment unit 24 calculates the gradation value of the target pixel included in the output image data S3 by mixing, at a predetermined ratio, the gradation value of the target pixel included in the input image data S1 and the gradation value of the target pixel included in the converted image data S2. More specifically, the adjustment unit 24 executes the adjustment process by multiplying (or dividing) the gradation values of the target pixel included in the input image data S1 and in the converted image data S2 by α or a value derived from α. The image data obtained by the adjustment process is output to the subsequent stage as the gradation value of the target pixel included in the output image data S3, according to the following equation:
- gradation value of the target pixel in the output image data S3 = gradation value of the target pixel in the converted image data S2 × α + gradation value of the target pixel in the input image data S1 × (1 − α)
- This makes it possible to suppress excessive correction in the image conversion processing executed by the image conversion processing unit 10. When the ringing flag is set for the target pixel, α is smaller than when it is not set, so the component of the gradation value of the target pixel in the output image data S3 that originates from the converted image data S2 can be further suppressed.
- In the above, the gradation value of the target pixel included in the converted image data S2 is multiplied by α and the gradation value of the target pixel included in the input image data S1 is multiplied by (1 − α), but the adjustment process is not limited to this. For example, the gradation value of the target pixel included in the converted image data S2 or the gradation value of the target pixel included in the input image data S1 may instead be divided by 1/α or 1/(1 − α), respectively.
- Such adjustment processing may also be executed only on the gradation value of the target pixel included in the high-frequency component, after separating the input image data S1 and the converted image data S2 into a high-frequency component and a low-frequency component using a low-pass filter or a high-pass filter. In this case, the high-frequency component on which the adjustment process has been executed and the low-frequency component on which it has not been executed are combined to obtain the gradation value of the target pixel included in the output image data S3.
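A minimal sketch of the per-pixel mixing described above, using the α-weighted blend of converted and input gradation values; variable names are illustrative.

```python
def adjust_pixel(input_value, converted_value, alpha):
    """Output gradation value = converted * alpha + input * (1 - alpha).
    alpha near 1 keeps the converted (sharpened) result; alpha near 0
    falls back to the input, suppressing the conversion and its ringing."""
    return converted_value * alpha + input_value * (1.0 - alpha)
```

In a full pipeline this would be applied per pixel, with α recomputed per target pixel from the ringing flag, ringing occurrence degree, and sharpness enhancement degree.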
- Next, an actual image on which the ringing occurrence degree estimation unit 21 executed the ringing estimation process is described with reference to FIG. 4. FIG. 4(a) is the input image represented by the input image data S1, and FIG. 4(b) is an image representing the ringing estimation result by the ringing occurrence degree estimation unit 21; the ringing occurrence degree estimation unit 21 executed the ringing flag setting process shown in FIG. 2, and pixels for which the ringing flag was set are shown in white. As shown in FIG. 4(a), the difference between the gradation values of pixels at the boundary between the creature and the background (the sea) is large (that is, a big edge exists near the boundary), and ringing flags could be set around this boundary.
- The ringing suppression unit 20 can thus suppress, for the gradation values of target pixels included in the converted image data S2, the excessive correction by the image conversion processing unit 10 that makes ringing conspicuous in the vicinity of big edges.
- The image processing apparatus 1 can be provided as a PC, as cloud computing, or as a set-top box connected to a computer.
- The functions of the image processing apparatus 1 may also be implemented in hardware such as an ASIC (application-specific integrated circuit), an FPGA (field-programmable gate array), or a DRP (dynamically reconfigurable processor). A program that realizes the functions of the image processing apparatus 1 can also be distributed via the Internet or the like.
- In the above embodiment, the ringing suppression unit 20 includes the ringing occurrence degree estimation unit 21, the sharpness estimation unit 22, the adjustment coefficient calculation unit 23, and the adjustment unit 24, but some of these elements may be configured externally. For example, the ringing occurrence degree estimation unit 21 and the sharpness estimation unit 22 may be configured externally, with the ringing suppression unit 20 including the adjustment coefficient calculation unit 23 and the adjustment unit 24.
- Although the ringing suppression unit 20 includes both the ringing occurrence degree estimation unit 21 and the sharpness estimation unit 22, one of them may be stopped and the adjustment process performed using only the other. For example, when the sharpness estimation unit 22 is stopped, the sharpness enhancement degree used when calculating α may be a constant (0 to 1). In this case, the constant may be made smaller when the ringing flag is set than when it is not set.
- 1: Image processing apparatus, 10: Image conversion processing unit, 20: Ringing suppression unit, 21: Ringing occurrence degree estimation unit, 22: Sharpness estimation unit, 23: Adjustment coefficient calculation unit, 24: Adjustment unit, 30: Input image output unit
Abstract
Description
好ましくは、前記リンギング発生度推定部は、前記入力画像データに含まれる一の画素を注目画素とし、前記注目画素及び前記注目画素の周辺の複数の周辺画素を含む周辺画素領域における特徴量に基づいて、リンギングが発生すると推定される前記注目画素にリンギングフラグを設定するリンギングフラグ設定部を備え、前記調整部は、前記注目画素に前記リンギングフラグが設定されているか否かにより、前記注目画素に対して異なる調整処理を実行する。
好ましくは、前記調整部は、前記リンギングフラグが設定された注目画素に対して、前記リンギングフラグが設定されていない注目画素と比べて、前記画像変換の度合いをより抑制する調整処理を実行する。
好ましくは、前記調整部は、前記リンギングフラグが設定されていない前記注目画素に対して、前記シャープネス強調度に基づいて前記調整処理を実行する。
好ましくは、前記調整部は、前記リンギングフラグが設定された前記注目画素に対して、前記リンギング発生度及び前記シャープネス強調度に基づいて前記調整処理を実行する。
好ましくは、前記調整部による前記調整処理は、前記入力画像データに含まれる前記注目画素の階調値と、前記変換画像データに含まれる前記注目画素の階調値と、を所定の比率で混合することにより、出力画像データに含まれる前記注目画素の階調値を算出する。
好ましくは、前記変換画像データに含まれる前記注目画素の階調値に乗算する調整係数を算出する調整係数算出部を有し、前記調整部は、前記変換画像データに含まれる前記注目画素の階調値に前記調整係数を乗算し、前記入力画像データに含まれる前記注目画素の階調値に前記調整係数に起因する値を乗算する。
好ましくは、前記調整係数算出部は、前記シャープネス強調度、又は前記シャープネス強調度及び前記リンギング発生度に基づいて調整係数を算出する。
好ましくは、前記リンギング発生度推定部は、前記周辺画素領域のうち、前記リンギングフラグが設定された画素の数に基づいて、前記注目画素のリンギング発生度を推定する。
好ましくは、前記シャープネス推定部は、前記入力画像データをローパスフィルタで処理して得られた画像データに含まれる画素値の分散値の総和と、前記入力画像データに含まれる画素値の分散値の総和と、に基づいて、前記シャープネス強調度を推定する。
好ましくは、入力画像データに含まれる注目画素及び前記注目画素の周辺の複数の周辺画素を含む周辺画素領域における特徴量が、予め定められた第1閾値を上回る場合、前記注目画素をビッグエッジと判定するビッグエッジ判定部と、前記周辺画素領域における特徴量が、予め定められた第2閾値を下回る場合、前記注目画素を平坦画素と判定する平坦画素判定部と、前記注目画素が前記平坦画素と判定された場合、前記周辺画素領域に含まれる前記平坦画素の総和を計算する総和計算部と、前記周辺画素領域に含まれる前記平坦画素の総和が予め定められた第3閾値を上回る場合、前記注目画素の周囲は平坦部分であると判定する平坦部分判定部と、前記ビッグエッジ抽出部により前記注目画素が前記ビッグエッジと判定された場合、又は、前記平坦部分判定部により前記注目画素の周囲は平坦部分であると判定され且つ前記周囲画素領域に前記ビッグエッジが含まれる場合、前記注目画素にリンギングが発生すると推定する推定部と、を含むリンギング推定装置が提供される。
好ましくは、コンピュータを、入力画像データに基づいて、リンギングが発生する度合いを表すリンギング発生度を推定するリンギング発生度推定部、前記入力画像データに基づいて、前記入力画像データが表す入力画像のシャープネスの強さを表すシャープネス強調度を推定するシャープネス推定部、前記入力画像データが画像変換された変換画像データに対し、前記リンギング発生度及び/又は前記シャープネス強調度に基づいて、前記画像変換の度合いを抑制する調整処理を画素毎に実行する調整部、として機能させるプログラムが提供される。
画像処理装置1は、入力画像出力部30と、画像変換処理部10と、リンギング抑制部20と、を備える。リンギング抑制部20は、リンギング発生度推定部21と、シャープネス推定部22と、調整係数算出部23と、調整部24と、を備える。
次に、リンギング抑制部20を構成するリンギング発生度推定部21と、シャープネス推定部22と、調整係数算出部23と、調整部24について説明する。
リンギング発生度推定部21は、入力画像データS1に含まれる注目画素及び注目画素の周辺の複数の周辺画素を含む周辺画素領域における特徴量に基づいて、リンギング発生度を推定する。まず、リンギング発生度推定部21は、周辺画素領域における特徴量に基づいて、リンギングが発生すると推定される注目画素にリンギングフラグを設定する。そして、注目画素を中心とした周辺画素領域のうち、リンギングフラグが設定された画素の数に基づいて、注目画素のリンギング発生度を推定する。注目画素にリンギングフラグを設定する処理については、図2を用いて説明する。
次に、図1を用いて、シャープネス推定部22について説明する。シャープネス推定部22は、入力画像データS1のシャープネスの強さを表すシャープネス強調度を推定する。シャープネス強調度は、画像のぼやけ具合を表す指標であり、例えば画像を拡大してもぼやけない場合、かかる画像はシャープネス強調度が強く、高シャープネス画像である。一方、画像を拡大することで画像がぼやける場合、かかる画像はシャープネス強調度が弱く、低シャープネス画像である。シャープネス推定部22によるシャープネス強調度の推定処理は特に限定されないが、例えば、入力画像データS1をローパスフィルタで処理して得られた画像データに含まれる画素値の分散値の総和S_diffと、入力画像データS1に含まれる画素値の分散値の総和S_orgと、に基づいて、シャープネス強調度を推定することができる。具体的には、S_diff/S_orgの値をシャープネス強調度とする。なお、一般的に、ローパスフィルタで処理されて得られた画像データの方が、元の画像データと比べて画素値の分散値の総和が小さくなるので、S_diff<S_orgの関係を満たす。よって、シャープネス強調度(S_diff/S_org)は、0~1の範囲となる。そして、シャープネス推定部22は、シャープネス強調度を調整係数算出部23に出力する。なお、シャープネス強調度は、入力画像データS1に含まれる全ての画素で同じ値をとる。
Next, the adjustment coefficient calculation unit 23 is described with reference to FIG. 1. The adjustment coefficient calculation unit 23 calculates an adjustment coefficient α to be multiplied by the gradation value of the target pixel in the converted image data. α is the value multiplied into the converted image data S2 in the adjustment process described later; the smaller α is, the more strongly excessive correction of the converted image data S2 is suppressed. To suppress ringing appropriately while taking the sharpness enhancement degree of the input image data S1 into account, α is calculated by different methods depending on whether the ringing flag is set on the target pixel. Specifically, α is calculated using the following formulas.
α = (1 - ringing occurrence degree) × (1 - sharpness enhancement degree)  (when the ringing flag is set on the target pixel)
α = 1 - sharpness enhancement degree  (when the ringing flag is not set on the target pixel)
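As a sketch, the two formulas combine into a single per-pixel function, assuming both degrees are already normalized to [0, 1]:

```python
def adjustment_coefficient(ringing_degree, sharpness_degree, ringing_flag):
    """Adjustment coefficient alpha: the ringing term only applies to
    pixels that carry the ringing flag."""
    alpha = 1.0 - sharpness_degree
    if ringing_flag:
        alpha *= 1.0 - ringing_degree
    return alpha
```

A flagged pixel in a high-ringing, high-sharpness neighborhood thus gets a small α, i.e. strong suppression of the conversion.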
Next, the adjustment unit 24 is described with reference to FIG. 1. The adjustment unit 24 executes a different adjustment process on the target pixel depending on whether the ringing flag is set on it. Specifically, the adjustment unit 24 calculates the gradation value of the target pixel in the output image data S3 by mixing, at a predetermined ratio, the gradation value of the target pixel in the input image data S1 and the gradation value of the target pixel in the converted image data S2. More specifically, the adjustment unit 24 performs the adjustment process by multiplying or dividing the gradation value of the target pixel in the input image data S1 and the gradation value of the target pixel in the converted image data S2 by α or by a value derived from α. The image data obtained by this adjustment process is output downstream as the gradation value of the target pixel in the output image data S3, which is obtained by the adjustment process based on the following formula.
gradation value of the target pixel in the output image data S3 = (gradation value of the target pixel in the converted image data S2) × α + (gradation value of the target pixel in the input image data S1) × (1 - α)
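This blend can be sketched per pixel as follows; gradation values are assumed to be floats here:

```python
def adjust_pixel(input_val, converted_val, alpha):
    """Output = converted*alpha + input*(1-alpha): alpha=1 keeps the
    image conversion in full, alpha=0 reverts to the input pixel."""
    return converted_val * alpha + input_val * (1.0 - alpha)
```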
Next, an actual image on which the ringing estimation process of the ringing occurrence degree estimation unit 21 was executed is described with reference to FIG. 4. FIG. 4(a) shows the input image represented by the input image data S1, and FIG. 4(b) shows the result of the ringing estimation by the ringing occurrence degree estimation unit 21. Specifically, the ringing occurrence degree estimation unit 21 executed the ringing flag setting process shown in FIG. 2, and the pixels for which the ringing flag was set are rendered in white. As shown in FIG. 4(a), the boundary between the creature and the background (the sea) has a large difference in pixel gradation values (that is, a big edge exists near the boundary), and ringing flags could be set centered on that boundary.
Claims (12)
- An image processing apparatus comprising: a ringing occurrence degree estimation unit that estimates, based on input image data, a ringing occurrence degree representing the degree to which ringing occurs; a sharpness estimation unit that estimates, based on the input image data, a sharpness enhancement degree representing the sharpness of the input image represented by the input image data; and an adjustment unit that executes, pixel by pixel on converted image data obtained by applying an image conversion to the input image data, an adjustment process that suppresses the degree of the image conversion based on the ringing occurrence degree and/or the sharpness enhancement degree.
- The image processing apparatus according to claim 1, wherein the ringing occurrence degree estimation unit includes a ringing flag setting unit that takes one pixel of the input image data as a target pixel and, based on a feature value in a peripheral pixel region including the target pixel and a plurality of peripheral pixels surrounding it, sets a ringing flag on the target pixel when ringing is estimated to occur there, and wherein the adjustment unit executes a different adjustment process on the target pixel depending on whether the ringing flag is set on the target pixel.
- The image processing apparatus according to claim 2, wherein the adjustment unit executes, on a target pixel for which the ringing flag is set, an adjustment process that suppresses the degree of the image conversion more strongly than for a target pixel for which the ringing flag is not set.
- The image processing apparatus according to claim 2 or 3, wherein the adjustment unit executes the adjustment process, based on the sharpness enhancement degree, on a target pixel for which the ringing flag is not set.
- The image processing apparatus according to any one of claims 2 to 4, wherein the adjustment unit executes the adjustment process, based on the ringing occurrence degree and the sharpness enhancement degree, on a target pixel for which the ringing flag is set.
- The image processing apparatus according to any one of claims 2 to 5, wherein the adjustment process by the adjustment unit calculates the gradation value of the target pixel in output image data by mixing, at a predetermined ratio, the gradation value of the target pixel in the input image data and the gradation value of the target pixel in the converted image data.
- The image processing apparatus according to any one of claims 2 to 6, further comprising an adjustment coefficient calculation unit that calculates an adjustment coefficient to be multiplied by the gradation value of the target pixel in the converted image data, wherein the adjustment unit multiplies the gradation value of the target pixel in the converted image data by the adjustment coefficient, and multiplies the gradation value of the target pixel in the input image data by a value derived from the adjustment coefficient.
- The image processing apparatus according to any one of claims 2 to 7, wherein the adjustment coefficient calculation unit calculates the adjustment coefficient based on the sharpness enhancement degree, or based on the sharpness enhancement degree and the ringing occurrence degree.
- The image processing apparatus according to any one of claims 2 to 8, wherein the ringing occurrence degree estimation unit estimates the ringing occurrence degree of the target pixel based on the number of pixels in the peripheral pixel region for which the ringing flag is set.
- The image processing apparatus according to any one of claims 2 to 9, wherein the sharpness estimation unit estimates the sharpness enhancement degree based on the sum of the variances of the pixel values in image data obtained by processing the input image data with a low-pass filter and the sum of the variances of the pixel values in the input image data.
- A ringing estimation device comprising: a big-edge determination unit that determines a target pixel in input image data to be a big edge when a feature value in a peripheral pixel region including the target pixel and a plurality of peripheral pixels surrounding it exceeds a predetermined first threshold; a flat-pixel determination unit that determines the target pixel to be a flat pixel when the feature value in the peripheral pixel region falls below a predetermined second threshold; a sum calculation unit that calculates the total number of flat pixels in the peripheral pixel region when the target pixel is determined to be a flat pixel; a flat-region determination unit that determines that the surroundings of the target pixel form a flat region when the total number of flat pixels in the peripheral pixel region exceeds a predetermined third threshold; and an estimation unit that estimates that ringing will occur at the target pixel when the big-edge determination unit determines the target pixel to be a big edge, or when the flat-region determination unit determines that the surroundings of the target pixel form a flat region and the peripheral pixel region contains a big edge.
- A program causing a computer to function as: a ringing occurrence degree estimation unit that estimates, based on input image data, a ringing occurrence degree representing the degree to which ringing occurs; a sharpness estimation unit that estimates, based on the input image data, a sharpness enhancement degree representing the sharpness of the input image represented by the input image data; and an adjustment unit that executes, pixel by pixel on converted image data obtained by applying an image conversion to the input image data, an adjustment process that suppresses the degree of the image conversion based on the ringing occurrence degree and/or the sharpness enhancement degree.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201680071817.5A CN108475418B (zh) | 2015-12-09 | 2016-10-28 | Image processing device and recording medium |
US16/060,597 US10664956B2 (en) | 2015-12-09 | 2016-10-28 | Image processing apparatus and program |
EP16872732.9A EP3385902B1 (en) | 2015-12-09 | 2016-10-28 | Image processing apparatus and program |
RU2018121948A RU2679542C1 (ru) | 2015-12-09 | 2016-10-28 | Image processing apparatus and program |
AU2016368003A AU2016368003B2 (en) | 2015-12-09 | 2016-10-28 | Image processing apparatus and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-240071 | 2015-12-09 | ||
JP2015240071A JP6619638B2 (ja) | 2015-12-09 | Image processing apparatus and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017098832A1 (ja) | 2017-06-15 |
Family
ID=59013970
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/081977 WO2017098832A1 (ja) | Image processing apparatus and program | 2015-12-09 | 2016-10-28 |
Country Status (7)
Country | Link |
---|---|
US (1) | US10664956B2 (ja) |
EP (1) | EP3385902B1 (ja) |
JP (1) | JP6619638B2 (ja) |
CN (1) | CN108475418B (ja) |
AU (1) | AU2016368003B2 (ja) |
RU (1) | RU2679542C1 (ja) |
WO (1) | WO2017098832A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107767326A (zh) * | 2017-09-28 | 2018-03-06 | Beijing Qihoo Technology Co Ltd | Object transformation processing method and apparatus in images, and computing device |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018148378A (ja) * | 2017-03-03 | 2018-09-20 | Brother Industries Ltd | Image processing apparatus and computer program |
CN115877808B (zh) * | 2023-01-30 | 2023-05-16 | Chengdu Qinchuan IoT Technology Co Ltd | Industrial Internet of Things and control method for thin-sheet workpiece machining |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06152992A (ja) * | 1992-10-29 | 1994-05-31 | Canon Inc | Image processing method and apparatus |
JP2000101846A (ja) * | 1998-09-18 | 2000-04-07 | Fuji Xerox Co Ltd | Image information encoding apparatus |
JP2006507775A (ja) * | 2002-11-25 | 2006-03-02 | Sarnoff Corporation | Method and apparatus for measuring the quality of a compressed video sequence without a reference |
JP2008306656A (ja) * | 2007-06-11 | 2008-12-18 | Sony Corp | Image signal processing apparatus, image signal processing method, and program |
WO2010131296A1 (ja) * | 2009-05-14 | 2010-11-18 | Toshiba Corporation | Image processing apparatus |
JP2012249079A (ja) * | 2011-05-27 | 2012-12-13 | Semiconductor Components Industries Llc | Contour correction apparatus |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4534594B2 (ja) * | 2004-05-19 | 2010-09-01 | Sony Corporation | Image processing apparatus, image processing method, program for the image processing method, and recording medium on which the program is recorded |
EP2030167A1 (en) * | 2005-02-24 | 2009-03-04 | Bang & Olufsen A/S | A filter for adaptive noise reduction and sharpness enhancement for electronically displayed pictures |
US7876973B2 (en) * | 2006-01-12 | 2011-01-25 | Integrity Applications Incorporated | Edge ringing artifact suppression methods and apparatuses |
EP2059902B1 (en) * | 2006-08-28 | 2010-04-07 | Koninklijke Philips Electronics N.V. | Method and apparatus for image enhancement |
TR201810171T4 (tr) | 2006-10-03 | 2018-08-27 | Vestel Elektronik Sanayi Ve Ticaret Anonim Sirketi | Method and apparatus for minimizing ringing artifacts in an input image |
EP1909221A1 (en) * | 2006-10-06 | 2008-04-09 | A.P. Moller - Maersk A/S | Container vessel stowage planning |
JP5150224B2 (ja) * | 2006-11-29 | 2013-02-20 | Panasonic Corporation | Image processing method and image processing apparatus |
JP5060447B2 (ja) * | 2008-10-07 | 2012-10-31 | Toshiba Corporation | Noise cancellation processing circuit and solid-state imaging device |
JP2010212782A (ja) | 2009-03-06 | 2010-09-24 | Toshiba Corp | Image sharpening apparatus and image sharpening method |
EP2851866A4 (en) * | 2012-05-14 | 2016-05-25 | Nat Inst Japan Science & Technology Agency | Image processing device, image processing method, program, printing medium and recording medium |
US9118932B2 (en) * | 2013-06-14 | 2015-08-25 | Nvidia Corporation | Adaptive filtering mechanism to remove encoding artifacts in video data |
WO2015156152A1 (ja) * | 2014-04-11 | 2015-10-15 | Fujifilm Corporation | Image processing apparatus, imaging apparatus, image processing method, and program |
2015
- 2015-12-09 JP JP2015240071A patent/JP6619638B2/ja active Active

2016
- 2016-10-28 WO PCT/JP2016/081977 patent/WO2017098832A1/ja active Application Filing
- 2016-10-28 EP EP16872732.9A patent/EP3385902B1/en active Active
- 2016-10-28 AU AU2016368003A patent/AU2016368003B2/en not_active Ceased
- 2016-10-28 CN CN201680071817.5A patent/CN108475418B/zh active Active
- 2016-10-28 RU RU2018121948A patent/RU2679542C1/ru not_active IP Right Cessation
- 2016-10-28 US US16/060,597 patent/US10664956B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN108475418A (zh) | 2018-08-31 |
CN108475418B (zh) | 2022-07-29 |
EP3385902A1 (en) | 2018-10-10 |
AU2016368003A1 (en) | 2018-07-05 |
RU2679542C1 (ru) | 2019-02-11 |
US20180374198A1 (en) | 2018-12-27 |
AU2016368003B2 (en) | 2019-08-15 |
EP3385902B1 (en) | 2023-08-16 |
EP3385902A4 (en) | 2019-03-06 |
JP6619638B2 (ja) | 2019-12-11 |
JP2017107366A (ja) | 2017-06-15 |
US10664956B2 (en) | 2020-05-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7817875B2 (en) | Image processing apparatus and method, recording medium, and program | |
US8831372B2 (en) | Image processing device, image processing method and storage medium storing image processing program | |
US8335396B2 (en) | Image output apparatus, edge enhancement method, and recording medium | |
TWI519151 (zh) | Image processing method and image processing device | |
US8619330B2 (en) | Image processing apparatus and image processing method | |
JP2009059118 (ja) | Gradation correction apparatus, gradation correction method, and gradation correction program | |
WO2017098832A1 (ja) | Image processing apparatus and program | |
JP2015099546 (ja) | Image processing apparatus, image processing method, program, and storage medium | |
JP2008011286 (ja) | Image processing program and image processing apparatus | |
JP5652272 (ja) | Image processing apparatus, image processing program, and image processing method | |
US10438323B2 (en) | Image brightness correction and noise suppression method, device, and recording medium for storing image processing program | |
JP4381240 (ja) | Image processing apparatus, image display apparatus using the same, image processing method, and program for causing a computer to execute the method | |
KR102482225 (ko) | Image processing apparatus and method | |
JP2003281534 (ja) | Image processing apparatus and image output apparatus | |
JP6701687 (ja) | Image processing apparatus, image processing method, and image processing program | |
JP2015114729 (ja) | Image processing apparatus, display apparatus, image processing method, and program | |
CN112184583 (zh) | Image noise reduction method and apparatus | |
JP6190152 (ja) | Image processing apparatus and image processing method | |
US8625923B2 (en) | Image processing apparatus, imaging apparatus, and image processing program | |
JP2011022779 (ja) | Image correction apparatus, image correction method, and program | |
JP2010063059 (ja) | Image processing apparatus and method | |
JP6843510 (ja) | Image processing apparatus, image processing method, and program | |
JP4992438 (ja) | Image processing apparatus and image processing program | |
KR20160103213 (ko) | Fast image processing method using the Retinex technique | |
JP2013242817 (ja) | Image processing apparatus and method | |
Legal Events
Code | Title | Description
---|---|---
121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 16872732; Country of ref document: EP; Kind code of ref document: A1
NENP | Non-entry into the national phase | Ref country code: DE
ENP | Entry into the national phase | Ref document number: 2016368003; Country of ref document: AU; Date of ref document: 20161028; Kind code of ref document: A
WWE | WIPO information: entry into national phase | Ref document number: 2016872732; Country of ref document: EP
WWE | WIPO information: entry into national phase | Ref document number: 2018121948; Country of ref document: RU
ENP | Entry into the national phase | Ref document number: 2016872732; Country of ref document: EP; Effective date: 20180706