WO2017086121A1 - Image processing device and program - Google Patents

Image processing device and program

Info

Publication number
WO2017086121A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
edge
corner
target pixel
size
Prior art date
Application number
PCT/JP2016/081976
Other languages
English (en)
Japanese (ja)
Inventor
正史 東
Original Assignee
Eizo株式会社
Priority date
Filing date
Publication date
Application filed by Eizo株式会社 filed Critical Eizo株式会社
Publication of WO2017086121A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/40 Picture signal circuits
    • H04N1/409 Edge or detail enhancement; Noise or error suppression
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20172 Image enhancement details
    • G06T2207/20192 Edge enhancement; Edge preservation

Definitions

  • the present invention relates to an image processing apparatus and a program.
  • MSAA Multi-Sampling Anti-Aliasing
  • MLAA Morphological Anti-Aliasing
  • Patent Document 1 Japanese Patent Application Laid-Open No. 2014-174867
  • an image processing apparatus performs anti-aliasing processing on pixels included in the input image.
  • the image processing apparatus may include a processing control unit that controls the anti-aliasing process for a target pixel based on at least one of size information regarding the size of an object formed by the target pixel and corner information indicating whether the target pixel forms a corner of the object.
  • the processing control unit may include a selector unit that selectively outputs first pixel data, acquired by performing the anti-aliasing processing on the target pixel, or second pixel data, acquired without performing the anti-aliasing processing on the target pixel. The selector unit may output the first pixel data or the second pixel data based on at least one of the size information and the corner information; for example, when the size information indicates that the size of the object is larger than a predetermined size, the selector unit may output the first pixel data.
  • the selector unit may output the second pixel data when the size information indicates that the size of the object is not larger than a predetermined size.
  • the selector unit may output the second pixel data when the corner information indicates that the pixel of interest constitutes a corner of the object.
  • the processing control unit may include an image adjustment unit that controls the processing intensity of the anti-aliasing processing for the pixel of interest based on at least one of the size information and the corner information.
  • for example, when the size information indicates that the size of the object is larger than a predetermined size, the image adjustment unit may output, for the target pixel, pixel data whose value is closer to the pixel data acquired by performing the anti-aliasing process on the target pixel than to the pixel data acquired without performing it. Conversely, when the size information indicates that the size of the object is not larger than the predetermined size, or when the corner information indicates that the target pixel constitutes a corner of the object, the image adjustment unit may output, for the target pixel, pixel data whose value is closer to the pixel data acquired without performing the anti-aliasing process than to the pixel data acquired by performing it.
  • the image processing apparatus may further include an edge pixel detection unit that detects, from the input image, edge pixels that constitute an edge of an object; a same object constituent pixel extraction unit that extracts edge pixels forming the edge of the same object as the target pixel by comparing the pixel value of the target pixel, which is an edge pixel, with the pixel values of edge pixels adjacent to the target pixel; and a size information generation unit that generates the size information indicating that the size of the object is larger than a predetermined size when the length of the edge formed by the target pixel is longer than a predetermined length.
  • the image processing device may further include an edge pixel detection unit that detects, from the input image, edge pixels that form an edge of an object; a same object constituent pixel extraction unit that extracts edge pixels forming the edge of the same object as the target pixel by comparing the pixel value of the target pixel, which is an edge pixel, with the pixel values of edge pixels adjacent to the target pixel; and a corner information generation unit that generates the corner information indicating that the target pixel constitutes a corner of the object when, for the edge pixel located at the end of the edge pixel group that includes the target pixel and is arranged in the vertical direction or the horizontal direction of the input image, an edge pixel exists in the oblique direction opposite to the direction in which the edge pixel group exists, and edge pixels continue from that edge pixel in the direction perpendicular to the direction in which the edge pixel group is arranged.
  • the image processing device may further include an edge pixel detection unit that detects, from the input image, edge pixels that form an edge of an object; a same object constituent pixel extraction unit that extracts edge pixels forming the edge of the same object as the target pixel by comparing the pixel value of the target pixel, which is an edge pixel, with the pixel values of edge pixels adjacent to the target pixel; and an anti-aliasing processing unit that executes the anti-aliasing process on the target pixel. The anti-aliasing processing unit may select one pixel from among the pixels that are adjacent to the edge pixel group, which includes the target pixel and is arranged in the vertical direction or the horizontal direction of the input image, and that do not belong to the same object as the target pixel, and may output the pixel data of the target pixel by blending the pixel value of the target pixel and the pixel value of the one pixel at a ratio based on the distance between the one pixel and the target pixel and on the length of the edge pixel group.
  • An example of the functional configuration of the display device 100 is schematically shown.
  • An example of processing by the edge detection unit 112 is schematically shown.
  • An example of the flow of processing by the same object constituent pixel extraction unit 114 is schematically shown.
  • An example of processing by the object information extraction unit 116 is schematically shown.
  • An example of MLAA processing is schematically shown.
  • An example of MLAA processing is schematically shown.
  • An example of the flow of processing by the corner pixel determination unit 118 is schematically shown.
  • An example of MLAA processing is schematically shown.
  • An example of processing by the anti-aliasing processing unit 120 is schematically shown.
  • An example of processing by the anti-aliasing processing unit 120 is schematically shown.
  • An example of processing by the anti-aliasing processing unit 120 is schematically shown.
  • An example of processing by the anti-aliasing processing unit 120 is schematically shown.
  • An example of processing by the anti-aliasing processing unit 120 is schematically shown.
  • FIG. 1 schematically shows an example of a functional configuration of the display device 100.
  • the display device 100 may be a liquid crystal display, a plasma display, an organic EL display, or the like.
  • the display device 100 according to the present embodiment performs an anti-aliasing process on the pixels included in the input image, and displays the input image subjected to the anti-aliasing process.
  • the display device 100 may be an example of an image processing device.
  • the display device 100 includes an image processing unit 102, a display control unit 104, and a display unit 106.
  • the image processing unit 102 performs anti-aliasing processing on the pixels included in the input image.
  • the image processing unit 102 may be an example of an image processing apparatus.
  • the image processing unit 102 includes an image input unit 110, an edge detection unit 112, a same object constituent pixel extraction unit 114, an object information extraction unit 116, a corner pixel determination unit 118, an anti-aliasing processing unit 120, and a processing control unit 130.
  • the image input unit 110 inputs an image.
  • the input image may be a moving image or a still image, or may be a frame included in the moving image.
  • the image input unit 110 may input RGB data, YUV data, or HSV data.
  • the image input unit 110 may sequentially transmit pixel data of pixels included in the input image to the edge detection unit 112, the anti-aliasing processing unit 120, and the processing control unit 130.
  • the edge detection unit 112 detects edge pixels constituting the edge of the object.
  • the object may be an edge, and in this case, the edge detection unit 112 may detect the edge pixels constituting the object.
  • the edge detection unit 112 calculates the absolute difference between the pixel value of the target pixel and the pixel value of each of the four neighboring pixels of the target pixel; if the absolute difference for even one of the four neighboring pixels is larger than a predetermined threshold, it determines the target pixel to be an edge pixel. Note that the edge detection unit 112 may instead detect edge pixels using any known method.
  • the absolute difference value of the pixel value may be calculated by any method.
  • the absolute difference value of the pixel value may be the sum of the absolute difference values of RGB.
  • the absolute difference value of the pixel value may be the maximum value of the absolute difference values of RGB.
  • the absolute difference value of the pixel value may be an absolute difference value of the luminance signal Y.
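The alternative difference metrics above can be sketched as follows; the function names and the tuple representation of an RGB pixel are illustrative assumptions, not taken from the patent.

```python
def rgb_abs_diff_sum(p, q):
    """Sum of the per-channel absolute differences of two RGB pixels."""
    return sum(abs(a - b) for a, b in zip(p, q))

def rgb_abs_diff_max(p, q):
    """Maximum of the per-channel absolute differences of two RGB pixels."""
    return max(abs(a - b) for a, b in zip(p, q))
```

Either metric (or the luminance-only difference) can serve as the comparison used by the edge detection unit.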
  • the same object constituent pixel extraction unit 114 extracts edge pixels that constitute the same object as the target pixel determined to be an edge pixel by the edge detection unit 112. For example, the same object constituent pixel extraction unit 114 compares the pixel value of the target pixel determined to be an edge pixel with the pixel values of the edge pixels adjacent to the target pixel, and thereby extracts edge pixels that constitute the same object as the target pixel. The same object constituent pixel extraction unit 114 may extract edge pixels that form the edge of the same object as the target pixel.
  • the object information extraction unit 116 extracts information on an object that is formed by the target pixel.
  • the object information extraction unit 116 may extract information on an object composed of the pixel of interest and the edge pixels extracted by the same object component pixel extraction unit 114.
  • the object information extraction unit 116 may generate size information related to the size of the object.
  • the size information may indicate whether the size of the object is larger than a predetermined size.
  • the predetermined size may be changeable by a user of the display device 100, for example.
  • the object information extraction unit 116 may be an example of a size information generation unit.
  • the corner pixel determination unit 118 determines whether or not the target pixel constitutes a corner of the object.
  • the corner pixel determination unit 118 may generate corner information indicating whether or not the target pixel constitutes a corner of the object.
  • the corner pixel determination unit 118 generates corner information indicating that the target pixel constitutes a corner of the object when, for the edge pixel located at the end of the edge pixel group that includes the target pixel and is arranged in the vertical direction or the horizontal direction of the input image, an edge pixel exists in the oblique direction opposite to the direction in which the edge pixel group exists, and a predetermined number or more of edge pixels continue from that edge pixel in the direction perpendicular to the direction in which the edge pixel group is arranged.
  • the corner pixel determination unit 118 may determine whether or not the pixel of interest constitutes a corner of the object by using a known method, and may generate corner information according to the determination result.
  • the corner pixel determination unit 118 may be an example of a corner information generation unit.
  • the anti-aliasing processing unit 120 performs anti-aliasing processing on the pixels included in the input image.
  • the anti-aliasing processing unit 120 executes conventional anti-aliasing processing such as MLAA, for example. Further, as will be described later, the anti-aliasing processing unit 120 may execute the anti-aliasing process by selecting one pixel, in the horizontal direction, from the pixels that are adjacent to the edge pixel group arranged in the horizontal direction of the input image and including the target pixel, and that do not belong to the same object as the target pixel, and by blending the pixel value of the target pixel and the pixel value of the one pixel at a ratio based on the distance between the target pixel and the one pixel and on the length of the edge pixel group including the target pixel.
  • the anti-aliasing processing unit 120 selects one pixel from pixels that are adjacent to the edge pixel group arranged in the vertical direction of the input image including the target pixel and that do not belong to the same object as the target pixel. Perform anti-aliasing processing by blending the pixel value of the target pixel and the pixel value of the one pixel according to the ratio based on the distance between the pixel and the one pixel and the length of the edge pixel group including the target pixel May be.
  • the processing control unit 130 controls the anti-aliasing processing for the target pixel based on at least one of size information regarding the size of the object formed by the target pixel and corner information indicating whether the target pixel forms a corner of the object.
  • the processing control unit 130 may include a selector unit 132 and an image adjustment unit 134.
  • the processing control unit 130 may include only one of the selector unit 132 and the image adjustment unit 134 or both.
  • the selector unit 132 selectively outputs pixel data acquired by performing anti-aliasing processing on the target pixel and pixel data acquired without performing anti-aliasing processing on the target pixel.
  • the pixel data acquired by performing the anti-aliasing process on the target pixel may be an example of the first pixel data.
  • the pixel data acquired without performing anti-aliasing processing on the target pixel may be an example of second pixel data.
  • the selector unit 132 outputs the first pixel data or the second pixel data based on at least one of the size information generated by the object information extraction unit 116 and the corner information generated by the corner pixel determination unit 118. For example, when the size information indicates that the size of the object formed by the pixel of interest is larger than a predetermined size and the corner information indicates that the pixel of interest does not form the corner of the object, the selector unit 132 The first pixel data is output.
  • when the size information indicates that the size of the object formed by the target pixel is not larger than the predetermined size, the selector unit 132 outputs the second pixel data. Likewise, the selector unit 132 outputs the second pixel data when the corner information indicates that the target pixel forms a corner of the object.
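The selector behavior described above can be sketched as follows; the function name and the boolean inputs are illustrative simplifications (the patent defines the size and corner information more concretely).

```python
def select_pixel_data(aa_data, raw_data, size_is_large, is_corner):
    """Selector sketch: output the anti-aliased (first) pixel data only when
    the object is larger than the predetermined size and the target pixel is
    not a corner; otherwise output the (second) data acquired without
    anti-aliasing."""
    if size_is_large and not is_corner:
        return aa_data   # first pixel data
    return raw_data      # second pixel data
```

This preserves small objects and corners exactly while still smoothing long edges.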
  • the image adjustment unit 134 controls the processing strength of the anti-aliasing process for the target pixel based on at least one of the size information generated by the object information extraction unit 116 and the corner information generated by the corner pixel determination unit 118. For example, when the size information indicates that the size of the object is not larger than the predetermined size, or when the corner information indicates that the target pixel constitutes a corner of the object, the image adjustment unit 134 reduces the processing strength of the anti-aliasing process compared to the case where the size information indicates that the size is larger than the predetermined size and the corner information indicates that the target pixel does not constitute a corner of the object.
  • for example, when the size information indicates that the size of the object is larger than the predetermined size and the corner information indicates that the target pixel does not constitute a corner of the object, the image adjustment unit 134 outputs, for the target pixel, pixel data whose value is closer to the first pixel data than to the second pixel data.
  • for example, when the size information indicates that the size of the object is not larger than the predetermined size, the image adjustment unit 134 outputs, for the target pixel, pixel data whose value is closer to the second pixel data than to the first pixel data. Likewise, when the corner information indicates that the target pixel forms a corner of the object, the image adjustment unit 134 outputs, for the target pixel, pixel data whose value is closer to the second pixel data than to the first pixel data.
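One plausible sketch of the image adjustment unit, assuming a simple linear blend whose weights are illustrative (the patent only specifies which pixel data the output should be closer to):

```python
def adjust_pixel_data(aa_value, raw_value, size_is_large, is_corner,
                      strong=0.9, weak=0.1):
    """Blend the anti-aliased and non-anti-aliased values, weighting the
    anti-aliased value heavily when the object is large and the target pixel
    is not a corner, and lightly otherwise. The weights 0.9/0.1 are assumed
    for illustration only."""
    k = strong if (size_is_large and not is_corner) else weak
    return k * aa_value + (1.0 - k) * raw_value
```

Unlike the selector unit, this variant changes the processing strength continuously rather than switching between two outputs.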
  • the display control unit 104 causes the display unit 106 to display an image using the pixel data output by the processing control unit 130.
  • the display unit 106 displays an image according to control by the display control unit 104.
  • FIG. 2 schematically shows an example of processing by the edge detection unit 112.
  • a process for detecting edge pixels included in a part 210 of the input image will be described as an example.
  • the edge detection unit 112 first calculates an absolute difference between the pixel value of the target pixel 211 and the pixel values of the pixels 212, 213, 214, and 215 in the vicinity of the target pixel 211. Next, the edge detection unit 112 compares the calculated absolute difference value with a threshold value. Then, if any one of the four difference absolute values is larger than the threshold value, the edge detection unit 112 determines the target pixel 211 as an edge pixel.
  • the threshold value may be set in advance. The threshold value may be changeable by a user of the display device 100 or the like.
  • the edge detection unit 112 detects all the edge pixels included in the example 210 by performing the above-described processing for each pixel included in the example 210.
  • the hatched pixels in Example 220 indicate detected edge pixels.
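The 4-neighbor edge detection walked through above can be sketched as follows. This is a minimal single-channel illustration; the patent operates on RGB or YUV pixel values, and the function name is an assumption.

```python
def detect_edge_pixels(img, threshold):
    """Mark a pixel as an edge pixel if the absolute difference between its
    value and that of at least one of its 4-neighbors exceeds the threshold."""
    h, w = len(img), len(img[0])
    edges = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < h and 0 <= nx < w
                        and abs(img[y][x] - img[ny][nx]) > threshold):
                    edges[y][x] = True
                    break  # one neighbor over threshold is enough
    return edges
```

A bright pixel on a dark background is detected as an edge pixel, as are the pixels bordering it.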
  • FIG. 3 schematically shows an example of the flow of processing by the same object constituent pixel extraction unit 114.
  • the process of extracting the edge pixel constituting the edge of the same object as the target pixel 231 will be described as an example.
  • the same object constituent pixel extraction unit 114 first detects the edge pixels located in the 8-neighborhood of the target pixel 231. When the number of detected edge pixels is two, it selects those two edge pixels; when the number is three or more, it selects the two edge pixels whose pixel values are closest to the pixel value of the target pixel 231; and when the number is less than two, it determines that the process ends.
  • next, the same object constituent pixel extraction unit 114 calculates the absolute difference between the pixel value of the target pixel 231 and each of the pixel values of the two selected edge pixels. Then, when the calculated absolute differences are smaller than the threshold value, it determines the two selected pixels to be contour constituent pixels.
  • the threshold value may be set in advance. The threshold value may be changeable by a user of the display device 100 or the like. In the example 230, the pixels 232 and 233 are determined to be contour constituent pixels.
  • next, the same object constituent pixel extraction unit 114 detects the edge pixels located in the 8-neighborhood of the pixel 232 determined to be a contour constituent pixel. If the number of detected edge pixels, including the target pixel 231, is two, it selects the edge pixel other than the target pixel 231; if the number is three or more, it selects, from the edge pixels other than the target pixel 231, the one edge pixel whose pixel value is closest to the pixel value of the pixel 232.
  • the same object constituent pixel extraction unit 114 then calculates the absolute difference between the pixel value of the selected edge pixel and the pixel value of the pixel 232. If the calculated absolute difference is smaller than the threshold value, it determines the selected pixel to be a contour constituent pixel. It performs the same processing on the pixel 233.
  • the same object constituent pixel extraction unit 114 repeats the above processing on the pixels determined to be contour constituent pixels until it determines that the process ends. In this way, the edge pixels that constitute the edge of the same object as the target pixel 231 can be extracted.
  • the pixels 234, 235, 236, and 237 are determined to be contour constituent pixels.
  • the hatched pixels in the example 240 indicate edge pixels extracted by the same object component pixel extraction unit 114.
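The contour-tracing flow above can be approximated as follows. This is a simplified sketch: where the patent selects at most two neighbors per step by pixel-value closeness, this version simply follows every 8-neighboring edge pixel whose value is within the threshold of the current pixel's value.

```python
from collections import deque

def extract_same_object_edges(img, edges, start, threshold):
    """Simplified same-object extraction: starting from the target edge
    pixel, repeatedly visit 8-neighboring edge pixels whose pixel values
    differ from the current pixel's value by less than the threshold."""
    h, w = len(img), len(img[0])
    found = {start}
    queue = deque([start])
    while queue:
        y, x = queue.popleft()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (dy or dx) and 0 <= ny < h and 0 <= nx < w:
                    if (edges[ny][nx] and (ny, nx) not in found
                            and abs(img[ny][nx] - img[y][x]) < threshold):
                        found.add((ny, nx))
                        queue.append((ny, nx))
    return found
```

Running this on a horizontal run of similar-valued edge pixels returns the whole run as one object edge.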
  • FIG. 4 schematically shows an example of processing by the object information extraction unit 116.
  • the object information extraction unit 116 extracts a region 250 of a predetermined size centered on the target pixel 251. When an edge pixel included in the edge formed by the target pixel 251 exists at the end of the region 250, the object information extraction unit 116 may generate size information indicating that the size of the object formed by the target pixel 251 is larger than the predetermined size. When no such edge pixel exists at the end of the region 250, the object information extraction unit 116 may generate size information indicating that the size of the object is not larger than the predetermined size. In the illustrated example, an edge pixel of the edge formed by the target pixel 251 exists at the end of the region 250, so the object information extraction unit 116 generates size information indicating that the size of the object formed by the target pixel 251 is larger than the predetermined size.
  • the size of the region 250 is 7 ⁇ 7 pixels has been described as an example, but the present invention is not limited thereto.
  • the size of the region 250 may be other than 7 ⁇ 7 pixels. Further, the size of the region 250 may be changeable by a user of the display device 100 or the like.
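The region-based size check can be sketched as follows, assuming a square region of side 2*half+1 (half=3 gives the 7x7 example) and edge pixels given as (y, x) coordinates; the representation and function name are assumptions.

```python
def object_is_large(edge_coords, region_center, half=3):
    """Size-information sketch: the object is considered larger than the
    predetermined size if any edge pixel of the traced edge lies on the
    border of the (2*half+1) x (2*half+1) region centered on the target
    pixel."""
    cy, cx = region_center
    for y, x in edge_coords:
        # A traced edge reaching the region border means the object extends
        # beyond (or at least up to) the region, i.e. it is "large".
        if abs(y - cy) == half or abs(x - cx) == half:
            return True
    return False
```

An edge confined near the center of the region yields False, so anti-aliasing can be suppressed for small objects.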
  • the target pixel 261 is determined to be a part of the Z-shaped contour 262. Then, the pixel value TargetValue of the pixel of interest 261 and the pixel value OppositeValue of the pixel 264 adjacent to the pixel of interest 261 are blended according to the following formula 1, and the output value OutValue of the pixel value of the pixel of interest 261 is calculated.
  • a represents the area of the region 263.
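The blend of TargetValue and OppositeValue weighted by the area a matches the standard MLAA coverage blend; Formula 1 itself is not reproduced in this text, so the form below is an assumption based on that standard formulation, with a normalized to 0 <= a <= 1.

```python
def mlaa_blend(target_value, opposite_value, a):
    """Assumed form of the MLAA blend: weight the opposite pixel by the
    coverage area a of the contour crossing the target pixel."""
    return (1.0 - a) * target_value + a * opposite_value
```

With a = 0 the target pixel is unchanged; larger coverage pulls the output toward the opposite pixel's value.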
  • in example 270, the target pixel 271 is determined to be a part of the Z-shaped contour 272, as in example 260, so the pixel value TargetValue of the target pixel 271 and the pixel value OppositeValue of the pixel 274 adjacent to the target pixel 271 may end up being blended to calculate the output value OutValue of the target pixel 271. In this case, excessive blending of pixels may occur, causing problems such as loss of texture fineness and a decrease in local contrast.
  • in the present embodiment, the same object constituent pixel extraction unit 114 extracts the edge pixels that constitute the edge of the same object as the target pixel, and when the size of that object is not larger than the predetermined size, the selector unit 132 outputs pixel data acquired without performing anti-aliasing processing on the target pixel. Thereby, the occurrence of the problems described above can be suppressed.
  • likewise, when the size of the object is not larger than the predetermined size, the image adjustment unit 134 outputs, for the target pixel, pixel data whose value is closer to the pixel data acquired without performing anti-aliasing processing on the target pixel than to the pixel data acquired by performing it. Thereby, the influence of the problems described above can be reduced.
  • FIG. 7 schematically shows an example of the flow of processing by the corner pixel determination unit 118.
  • the corner pixel determination unit 118 may generate corner information indicating that the target pixel constitutes a corner of the object when, for the edge pixel located at the end of the edge pixel group that includes the target pixel and is arranged in the vertical direction or the horizontal direction of the input image, an edge pixel exists in the oblique direction opposite to the direction in which the edge pixel group exists, and a predetermined number or more of edge pixels continue from that edge pixel in the direction perpendicular to the direction in which the edge pixel group is arranged. Otherwise, the corner pixel determination unit 118 may generate corner information indicating that the target pixel does not constitute a corner of the object.
  • the corner pixel determination unit 118 may use a partial area having a predetermined size as a processing unit.
  • the partial area 280 having a size of 7 ⁇ 7 pixels will be described as an example, but the size of the partial area is not limited to this and may be other sizes.
  • the corner pixel determination unit 118 specifies the direction of the edge pixel group arranged in the vertical direction or the horizontal direction of the input image from the target pixel 281.
  • the corner pixel determination unit 118 specifies a direction in which the number of continuous edge pixels is larger. That is, the corner pixel determination unit 118 specifies the direction of a longer straight line from the straight line extending in the horizontal direction and the straight line extending in the vertical direction from the target pixel 281.
  • among the vertical and horizontal directions of the target pixel 281, the corner pixel determination unit 118 specifies the horizontal direction when adjacent edge pixels exist only in the horizontal direction, and the vertical direction when they exist only in the vertical direction. In example 280, the horizontal direction is specified.
  • the corner pixel determining unit 118 specifies an edge pixel that is located at the end in the specified direction and is not located at the end of the partial region in the edge pixel group.
  • next, the corner pixel determination unit 118 determines whether an edge pixel adjacent to the specified edge pixel exists in the oblique direction opposite to the direction in which the edge pixel group exists. In example 280, it is determined that the adjacent edge pixel 283 exists diagonally up and to the right of the target pixel 281.
  • the corner pixel determination unit 118 determines whether or not there are a predetermined number or more of edge pixels in the direction in which the edge pixels exist and in a direction perpendicular to the specified direction. In the example 280, it is determined whether or not there are more than a predetermined number of edge pixels in the direction indicated by the arrow 284.
  • if the corner pixel determination unit 118 determines that the predetermined number or more of edge pixels exist, it determines that the target pixel forms a corner of the object; if it determines that they do not, it determines that the target pixel does not form a corner of the object.
  • for example, if the predetermined number is 1, then in example 280, since no edge pixel exists in the direction indicated by the arrow 284, it is determined that the target pixel does not form a corner of the object.
  • the predetermined number may be changeable by a user of the display device 100 or the like.
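The corner check walked through above can be sketched as follows for a horizontal edge pixel group; the direction parameters and the exact traversal are assumptions made for illustration.

```python
def forms_corner(edges, end_y, end_x, dy, dx, min_count):
    """Corner sketch for the edge pixel at the end of a horizontal edge pixel
    group. (dy, dx) points in the oblique direction away from the group,
    e.g. (-1, 1) for diagonally up-right."""
    h, w = len(edges), len(edges[0])
    ny, nx = end_y + dy, end_x + dx
    # An edge pixel must exist at the oblique neighbor of the end pixel.
    if not (0 <= ny < h and 0 <= nx < w and edges[ny][nx]):
        return False
    # Count the edge pixels continuing perpendicular to the group
    # (here: vertically, in the dy direction) from that oblique neighbor.
    count, y = 0, ny + dy
    while 0 <= y < h and edges[y][nx]:
        count += 1
        y += dy
    return count >= min_count
```

If the perpendicular run is too short (as in example 280 with a predetermined number of 1), the pixel is not treated as a corner.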
  • FIG. 8 schematically shows an example of MLAA processing.
  • in the MLAA process, as illustrated in example 300, the L-shaped contours 301, 302, 303, and 304 are determined, and the pixel value of each pixel is calculated as illustrated in example 310.
  • in the MLAA process, however, color blur can occur near a corner, destroying the composition of an intentionally created object.
  • in the present embodiment, when the corner information indicates that the target pixel constitutes a corner of the object, the selector unit 132 outputs the pixel data acquired without anti-aliasing processing being performed on the target pixel. Thereby, the occurrence of the problem described above can be suppressed.
  • likewise, when the corner information indicates that the target pixel forms a corner of the object, the image adjustment unit 134 outputs, for the target pixel, pixel data whose value is closer to the pixel data acquired without performing the anti-aliasing process on the target pixel than to the pixel data acquired by performing it. Thereby, the influence of the problem described above can be reduced.
  • The anti-aliasing processing unit 120 may select one pixel from the pixels that are adjacent in the horizontal direction to the edge pixel group that includes the target pixel and is aligned in the horizontal direction of the input image, and that do not belong to the same object as the target pixel, and may output the pixel data of the target pixel by combining the pixel value of the target pixel and the pixel value of the selected pixel at a ratio based on the distance between the selected pixel and the target pixel and the length of the pixel group.
  • Likewise, the anti-aliasing processing unit 120 may select one pixel from the pixels that are adjacent to the edge pixel group that includes the target pixel and is aligned in the vertical direction of the input image, and that do not belong to the same object as the target pixel, and may output the pixel data of the target pixel by combining the pixel value of the target pixel and the pixel value of the selected pixel at a ratio based on the distance between the selected pixel and the target pixel and the length of the pixel group.
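Finding the horizontal edge pixel group that contains the target pixel, and the candidate neighbors just outside it that belong to another object, could look like the sketch below; the grid representation and helper name are assumptions for illustration.

```python
def horizontal_edge_group(edge_flags, object_ids, y, x):
    """Locate the run of horizontally consecutive edge pixels containing
    (y, x), then return that run's extent and the candidate neighbor
    pixels just outside it that belong to a different object.

    edge_flags: 2D grid of booleans, True where a pixel is an edge pixel.
    object_ids: 2D grid of object labels produced by object extraction.
    """
    # Walk left and right while still inside the edge pixel group.
    left = x
    while left - 1 >= 0 and edge_flags[y][left - 1]:
        left -= 1
    right = x
    while right + 1 < len(edge_flags[y]) and edge_flags[y][right + 1]:
        right += 1
    # Candidates are the pixels horizontally adjacent to the group ends
    # that do not belong to the same object as the target pixel.
    candidates = []
    for nx in (left - 1, right + 1):
        if 0 <= nx < len(object_ids[y]) and object_ids[y][nx] != object_ids[y][x]:
            candidates.append((y, nx))
    return (left, right), candidates
```

The vertical case is symmetric, walking up and down a column instead of along a row.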
  • When the target pixel is included in an edge pixel group arranged in the horizontal direction, the anti-aliasing processing unit 120 specifies the pixels that are adjacent to the edge pixel group in the horizontal direction and do not belong to the same object as the target pixel. Similarly, when the target pixel is included in an edge pixel group arranged in the vertical direction, the anti-aliasing processing unit 120 specifies the pixels that are adjacent to the edge pixel group in the vertical direction and do not belong to the same object as the target pixel. When the number of specified pixels is one, the anti-aliasing processing unit 120 selects that pixel.
  • When two pixels are specified, the anti-aliasing processing unit 120 selects one of the two pixels as follows.
  • The anti-aliasing processing unit 120 first identifies, from the two pixels, the pixel whose pixel value is closer to the pixel value of the target pixel.
  • the anti-aliasing processing unit 120 calculates the absolute value of the difference between the pixel value of the identified pixel and the pixel value of the target pixel and compares it with a threshold value.
  • the threshold value may be set in advance.
  • the threshold value may be changeable by a user of the display device 100 or the like.
  • When the difference absolute value is equal to or smaller than the threshold value, the anti-aliasing processing unit 120 selects, from the two pixels, the pixel whose pixel value is not close to the pixel value of the target pixel. When the difference absolute value is larger than the threshold value, the anti-aliasing processing unit 120 selects, from the two pixels, the pixel whose distance to the target pixel is smaller.
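The selection rule among one or two candidate pixels can be sketched as follows; the candidate tuples and the function name are illustrative assumptions.

```python
def select_blend_partner(target_value, candidates, threshold):
    """Select the blend partner among up to two adjacent pixels.

    candidates: list of (pixel_value, distance_to_target) tuples for the
    pixels adjacent to the edge pixel group that belong to another object.
    """
    if len(candidates) == 1:
        return candidates[0]
    # Two candidates: identify which one's value is closer to the target.
    closer, farther = sorted(candidates,
                             key=lambda c: abs(c[0] - target_value))
    if abs(closer[0] - target_value) <= threshold:
        # Difference within threshold: pick the pixel whose value is
        # NOT close to the target pixel's value.
        return farther
    # Otherwise pick the candidate nearer in distance to the target pixel.
    return min(candidates, key=lambda c: c[1])
```

The two branches correspond to examples 320 and 330 described below: a within-threshold difference selects the dissimilar pixel, and an over-threshold difference selects the nearer pixel.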
  • When the target pixel 321 is set as the processing target, the anti-aliasing processing unit 120 first specifies pixels 322 and 323, which are adjacent in the horizontal direction to the edge pixel group that includes the target pixel 321 and is aligned in the horizontal direction of the input image, and which do not belong to the same object as the target pixel 321. Next, the anti-aliasing processing unit 120 identifies pixel 322, whose pixel value is closer to the pixel value of the target pixel 321, of the pixels 322 and 323. Next, the anti-aliasing processing unit 120 calculates the absolute difference value between the pixel value of the target pixel 321 and the pixel value of the pixel 322 and compares it with the threshold value.
  • In this case, the anti-aliasing processing unit 120 selects pixel 323, whose pixel value is not close to the pixel value of the target pixel 321, of the pixels 322 and 323.
  • When the target pixel 331 of example 330 is set as the processing target, the anti-aliasing processing unit 120 first specifies pixels 332 and 333, which are adjacent in the horizontal direction to the edge pixel group that includes the target pixel 331 and is aligned in the horizontal direction of the input image, and which do not belong to the same object as the target pixel 331. Next, the anti-aliasing processing unit 120 identifies the pixel whose pixel value is closer to the pixel value of the target pixel 331, of the pixels 332 and 333. Here, the description is continued assuming that the pixel value of the pixel 333 is closer to the pixel value of the target pixel 331, so the anti-aliasing processing unit 120 identifies the pixel 333.
  • the anti-aliasing processing unit 120 calculates a difference absolute value between the pixel value of the target pixel 331 and the pixel value of the pixel 333, and compares it with a threshold value.
  • the description will be continued assuming that the absolute difference value is larger than the threshold value.
  • Accordingly, the anti-aliasing processing unit 120 selects pixel 332, whose distance to the target pixel 331 is smaller, of the pixels 332 and 333.
  • the anti-aliasing processing unit 120 calculates the output pixel value OutValue of the target pixel using the following formula 2.
  • the anti-aliasing processing unit 120 outputs the calculated OutValue as pixel data of the target pixel.
  • A is the pixel value of the target pixel
  • B is the pixel value of the selected pixel
  • Distance is the distance between the target pixel and the selected pixel
  • Length is a value based on the length of the pixel group including the target pixel.
  • weight is a preset weight value, which may be arbitrarily set by a user of the display device 100 or the like.
  • The anti-aliasing processing unit 120 may change the method of deriving Length depending on the reason the pixel was selected.
  • For example, when the pixel was selected because its pixel value is not close to the pixel value of the target pixel, the anti-aliasing processing unit 120 may set the length of the edge pixel group as Length.
  • When the pixel was selected because its distance to the target pixel is smaller, the anti-aliasing processing unit 120 may set half of the length of the edge pixel group as Length.
  • In the case where pixel 323 is selected, the distance between the target pixel 321 and the selected pixel 323 is set as Distance, and the length of the pixel group including the target pixel 321 is set as Length.
  • In example 330, the distance between the target pixel 331 and the selected pixel 332 is set as Distance, and half the length of the pixel group including the target pixel 331 is set as Length.
  • As described above, the anti-aliasing processing unit 120 may calculate the output pixel value such that the greater the Distance value, the higher the blend ratio of the pixel value of the selected pixel, and such that the smaller the Length value, the higher the blend ratio of the pixel value of the selected pixel.
  • When OutValue calculated by Formula 2 does not fall between the pixel value of the target pixel and the pixel value of the selected pixel, the anti-aliasing processing unit 120 may adjust it so that it falls between those two pixel values. For example, the anti-aliasing processing unit 120 sets the larger and smaller of the pixel value of the target pixel and the pixel value of the selected pixel as the upper and lower limit values of OutValue, and sets OutValue to the upper limit value when it exceeds the upper limit value and to the lower limit value when it falls below the lower limit value.
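Formula 2 itself is not reproduced in this text, so the sketch below is only a plausible reconstruction consistent with the stated behavior: a larger Distance and a smaller Length both raise the blend ratio of the selected pixel's value B, and the result is clamped between A and B. The exact expression is an assumption.

```python
def blend_output_value(a, b, distance, length, weight=1.0):
    """Hedged reconstruction of the OutValue blend (Formula 2 is not
    shown in the text; this exact expression is an assumption).

    a: pixel value of the target pixel (A).
    b: pixel value of the selected pixel (B).
    """
    # Larger Distance and smaller Length both increase the ratio of b.
    ratio = min(max(weight * distance / length, 0.0), 1.0)
    out = a + (b - a) * ratio
    # Clamp OutValue between the two pixel values, as described above.
    lower, upper = min(a, b), max(a, b)
    return min(max(out, lower), upper)
```

Because the ratio is clamped to [0, 1], the final clamp only matters for alternative formulas that can overshoot; it is kept here to mirror the upper/lower limit adjustment described in the text.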
  • In the above description, the case where the image processing unit 102 is disposed inside the display device 100 has been described as an example, but the present invention is not limited thereto.
  • the image processing unit 102 may be a computer outside the display device 100.
  • FIG. 13 shows an example of a hardware configuration of a computer 900 that functions as the image processing unit 102.
  • The computer 900 includes a CPU peripheral unit including a CPU 1000, a RAM 1020, and a graphic controller 1075 connected to each other by a host controller 1082; an input/output unit including a communication I/F 1030, a hard disk drive 1040, and a DVD drive 1060 connected to the host controller 1082 by an input/output controller 1084; and a legacy input/output unit including a ROM 1010, an FD drive 1050, and an input/output chip 1070 connected to the input/output controller 1084.
  • the CPU 1000 operates based on programs stored in the ROM 1010 and the RAM 1020 and controls each unit.
  • the graphic controller 1075 acquires image data generated by the CPU 1000 or the like on a frame buffer provided in the RAM 1020 and displays it on the display 1080.
  • the graphic controller 1075 may include a frame buffer for storing image data generated by the CPU 1000 or the like.
  • the communication I / F 1030 communicates with other apparatuses via a wired or wireless network.
  • the communication I / F 1030 functions as hardware that performs communication.
  • the hard disk drive 1040 stores programs and data used by the CPU 1000.
  • the DVD drive 1060 reads a program or data from the DVD-ROM 1095 and provides it to the hard disk drive 1040 via the RAM 1020.
  • the ROM 1010 stores a boot program that the computer 900 executes at startup, a program that depends on the hardware of the computer 900, and the like.
  • the FD drive 1050 reads a program or data from the flexible disk 1090 and provides it to the hard disk drive 1040 via the RAM 1020.
  • The input/output chip 1070 connects the FD drive 1050 to the input/output controller 1084, and connects various input/output devices to the input/output controller 1084 via, for example, a parallel port, a serial port, a keyboard port, and a mouse port.
  • the program provided to the hard disk drive 1040 via the RAM 1020 is stored in a recording medium such as the flexible disk 1090, the DVD-ROM 1095, or an IC card and provided by the user.
  • the program is read from the recording medium, installed in the hard disk drive 1040 via the RAM 1020, and executed by the CPU 1000.
  • a program that is installed in the computer 900 and causes the computer 900 to function as the image processing unit 102 may work on the CPU 1000 or the like to cause the computer 900 to function as each unit of the image processing unit 102.
  • By the computer 900 reading the information processing described in these programs, the computer functions as the image input unit 110, the edge detection unit 112, the same-object constituent pixel extraction unit 114, the object information extraction unit 116, the corner pixel determination unit 118, the anti-aliasing processing unit 120, and the processing control unit 130, which are concrete means in which the software and the various hardware resources described above cooperate.
  • By realizing, with these concrete means, the calculation or processing of information according to the purpose of use of the computer 900 in this embodiment, a specific image processing unit 102 corresponding to that purpose of use is constructed.
  • 100 display device 102 image processing unit, 104 display control unit, 106 display unit, 110 image input unit, 112 edge detection unit, 114 identical object component pixel extraction unit, 116 object information extraction unit (size information generation unit), 118 corner Pixel determination unit (corner information generation unit), 120 anti-aliasing processing unit, 130 processing control unit, 132 selector unit, 134 image adjustment unit, 210 example, 211 pixel of interest, 212 pixel, 213 pixel, 214 pixel, 215 pixel, 220 Example, 230 example, 231 pixel of interest, 232 pixel, 233 pixel, 234 pixel, 235 pixel, 236 pixel, 237 pixel, 240 example, 250 region, 251 pixel of interest, 252 pixel, 253 pixel, 260 example, 261 pixel of interest 262 Z-shaped contour, 263 area, 264 pixels, 270 examples, 271 attention pixel, 272 Z-shaped contour, 273 area, 274 pixels, 280 partial area, 281 attention pixel, 28


Abstract

The invention addresses the problem of needing a technology that makes jaggies less noticeable more suitably than conventional technologies. The solution according to the invention is an image processing device that performs an anti-aliasing process on pixels included in an input image, and that comprises a processing control unit that controls the anti-aliasing process on a pixel of interest based on at least one of size information concerning the size of an object that includes the pixel of interest, and corner information indicating whether or not the pixel of interest forms a corner of the object.
PCT/JP2016/081976 2015-11-17 2016-10-28 Dispositif et programme de traitement d'images WO2017086121A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-225126 2015-11-17
JP2015225126A JP6549971B2 (ja) 2015-11-17 2015-11-17 画像処理装置およびプログラム

Publications (1)

Publication Number Publication Date
WO2017086121A1 true WO2017086121A1 (fr) 2017-05-26

Family

ID=58718832

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/081976 WO2017086121A1 (fr) 2015-11-17 2016-10-28 Dispositif et programme de traitement d'images

Country Status (2)

Country Link
JP (1) JP6549971B2 (fr)
WO (1) WO2017086121A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220085067A (ko) 2020-12-14 2022-06-22 삼성디스플레이 주식회사 화소 휘도 결정 방법 및 이를 채용한 표시 장치

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10202952A (ja) * 1997-01-28 1998-08-04 Fuji Xerox Co Ltd 画像処理装置
JP2012015669A (ja) * 2010-06-30 2012-01-19 Hitachi Consumer Electronics Co Ltd 表示装置および画像処理装置
JP2012043049A (ja) * 2010-08-16 2012-03-01 Dainippon Printing Co Ltd ジャギー緩和処理装置及びジャギー緩和処理方法
JP2013093783A (ja) * 2011-10-27 2013-05-16 Kyocera Document Solutions Inc 画像処理装置および画像形成装置

Also Published As

Publication number Publication date
JP2017091475A (ja) 2017-05-25
JP6549971B2 (ja) 2019-07-24


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16866123

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16866123

Country of ref document: EP

Kind code of ref document: A1