CN105282461A - Image processing apparatus, image acquiring apparatus, image processing method and image acquiring method - Google Patents

Image processing apparatus, image acquiring apparatus, image processing method and image acquiring method

Info

Publication number
CN105282461A
Authority
CN
China
Prior art keywords
value
brightness value
pixel
image
object pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510340961.1A
Other languages
Chinese (zh)
Other versions
CN105282461B (en)
Inventor
安田拓矢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dainippon Screen Manufacturing Co Ltd
Original Assignee
Dainippon Screen Manufacturing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dainippon Screen Manufacturing Co Ltd filed Critical Dainippon Screen Manufacturing Co Ltd
Publication of CN105282461A publication Critical patent/CN105282461A/en
Application granted granted Critical
Publication of CN105282461B publication Critical patent/CN105282461B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Image Processing (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Image Analysis (AREA)
  • Facsimile Image Signal Circuits (AREA)

Abstract

In an image that shows a pattern region and contains particle elements whose contrast is higher than that of the edges of the pattern region, a representative value acquiring part (411) of an image processing part (41) takes each pixel as a pixel of interest and obtains a representative value of the brightness values of the multiple object pixels contained in a region of a prescribed size centered on the pixel of interest. A reference value setting part (412) sets the brightness value of the pixel of interest as the reference value when that brightness value lies within a brightness value range set around the representative value, and sets the representative value as the reference value when the brightness value lies outside that range. A filter processing part (413) calculates a new brightness value for the pixel of interest by a filtering process in which the smaller the difference between the brightness value of each object pixel and the reference value, the larger the weight given to that object pixel's brightness value. In this way, the edges of the pattern region are preserved in the image while the high-contrast particle elements are removed.

Description

Image processing apparatus, image acquiring apparatus, image processing method and image acquiring method
Technical field
The present invention relates to a technique for processing an image and a technique for acquiring an image.
Background art
In recent years, flat panel displays (FPDs) have been incorporated into a wide range of electronic equipment. In the manufacture of such display devices, when the appearance of a transparent pattern such as a transparent electrode thin film formed on a transparent base material is inspected, an image of the pattern is acquired by, for example, irradiating the base material with light and receiving the reflected light. Japanese Patent Application Laid-Open No. 2013-68460 (Document 1) discloses an apparatus that acquires an image while keeping the illumination angle, formed between the normal of the base material and the optical axis from the illumination part to the imaging region on the base material, equal to the detection angle, formed between that normal and the optical axis from the imaging region to a line sensor. In this apparatus, a setting of the illumination angle and the detection angle that gives a high image contrast is used.
Japanese Patent Application Laid-Open No. 2008-205737 (Document 2) discloses an edge-preserving smoothing method. In this method, a pixel-value deviation degree and a positional deviation degree between the pixel of interest and each surrounding pixel in a region of interest are calculated, and a correlation index value is obtained from them. A filter coefficient for each surrounding pixel is then obtained from the correlation index value, a filter coefficient for the pixel of interest is obtained from the pixel-value deviation degree, and a filtering operation is performed using these coefficients. Bilateral filtering and ε-filtering are also known as smoothing methods that remove noise while preserving edges.
Incidentally, when the transparent base material contains minute particulate matter (so-called filler), or when the surface of the base material has minute irregularities, many particle elements appear in the captured image, and the pattern cannot be inspected accurately. Removing the particle elements by the above-described edge-preserving, noise-removing image processing might be considered, but when the contrast of a particle element is higher than that of the edge of the pattern region, the particle element is preserved together with the edge of the pattern region.
Summary of the invention
An object of the present invention is to provide an image processing apparatus that, in an image containing particle elements whose contrast is higher than that of the edge of a pattern region, preserves the edge of the pattern region while removing the high-contrast particle elements.
The image processing apparatus of the present invention includes: a representative value acquiring part that, in an image which shows a pattern region and contains particle elements whose contrast is higher than that of the edge of the pattern region, takes each pixel as a pixel of interest and obtains a representative value of the brightness values of a plurality of object pixels contained in a region of a prescribed size centered on the pixel of interest; a reference value setting part that sets the brightness value of the pixel of interest as a reference value when that brightness value lies within a brightness value range set around the representative value and sets the representative value as the reference value when the brightness value lies outside the range, or that sets the representative value as the reference value regardless of the brightness value of the pixel of interest; and a filter processing part that obtains a new brightness value for the pixel of interest by a filtering process which uses the brightness values of the plurality of object pixels and in which the smaller the difference between the brightness value of each object pixel and the reference value, the larger the weight given to that object pixel's brightness value.
According to the present invention, in an image which shows a pattern region and contains particle elements whose contrast is higher than that of the edge of the pattern region, the edge of the pattern region can be preserved while the high-contrast particle elements are removed.
In a preferred embodiment of the present invention, the representative value acquiring part obtains a value representing the dispersion of the brightness values of the plurality of object pixels, and the reference value setting part determines the width of the brightness value range based on the value representing the dispersion and sets the reference value using that brightness value range.
In another preferred embodiment of the present invention, in the filtering process, the filter processing part excludes the brightness values of object pixels, among the plurality of object pixels, whose brightness values lie outside a brightness value range set around the representative value.
In still another preferred embodiment of the present invention, in the filtering process, the smaller the distance between each object pixel and the pixel of interest, the larger the weight given to that object pixel's brightness value.
In one embodiment of the present invention, the image shows a transparent electrode film formed on a transparent base material.
The present invention is also directed to an image acquiring apparatus for acquiring an image of a thin-film pattern formed on a base material. The image acquiring apparatus of the present invention includes: an imaging unit that acquires a captured image by imaging the base material; and the above-described image processing apparatus, which processes the captured image.
The present invention is also directed to an image processing method and to an image acquiring method for acquiring an image of a thin-film pattern formed on a base material.
The above and other objects, features, embodiments, and advantages of the present invention will become apparent from the following description with reference to the accompanying drawings.
Brief description of the drawings
Fig. 1 is a diagram showing the structure of an image acquiring apparatus.
Fig. 2 is a diagram showing the structure of a computer.
Fig. 3 is a block diagram showing the functional structure implemented by the computer.
Fig. 4 is a diagram showing the flow of processing for inspecting a pattern on a base material.
Fig. 5 is a sectional view of the base material.
Fig. 6 is a diagram showing a captured image.
Fig. 7 is a diagram showing a brightness value profile of a target region.
Fig. 8 is a diagram showing a brightness value profile of a target region.
Fig. 9 is a diagram showing a processed image.
Fig. 10 is a diagram showing a processed image produced by the processing of a comparative example.
Description of reference numerals:
1 image acquiring apparatus
8 captured image
8a processed image
9 base material
13 imaging unit
41 image processing part
71, 72 pixels of interest
81 pattern region
91 pattern
411 representative value acquiring part
412 reference value setting part
413 filter processing part
711, 721 target regions
811 edge
831 to 833 particle elements
P1, P2 brightness values (of the pixels of interest)
PA representative value
R1, R2 reference-value determination ranges
S11 to S18 steps
Embodiment
Fig. 1 is a diagram showing the structure of an image acquiring apparatus 1 according to one embodiment of the present invention. The image acquiring apparatus 1 acquires an image of a thin-film pattern formed on a base material 9 and performs visual inspection of the thin-film pattern based on the image; in other words, the image acquiring apparatus 1 also functions as a pattern inspection apparatus. In the present embodiment, the base material 9 is a glass substrate or a transparent film, and the thin-film pattern is, for example, a transparent electrode film. Another film such as an antireflection film may also be provided on the base material 9. In the following description, the thin-film pattern is simply referred to as the "pattern". The base material 9 is used, for example, in manufacturing a capacitive touch panel.
The image acquiring apparatus 1 includes a moving mechanism 11 that moves the base material 9, an imaging unit 13, and a computer 3. The moving mechanism 11 includes a stage 21 that holds the base material 9 on its upper surface, an X-direction moving part 22 that moves the stage 21 in the X direction of Fig. 1 parallel to the main surface of the base material 9, and a Y-direction moving part 23 that moves the X-direction moving part 22 in the Y direction, which is parallel to the main surface of the base material 9 and perpendicular to the X direction. The moving mechanism 11 is a mechanism that moves the base material 9 relative to an imaging region 90 described later. A mechanism that moves the stage 21 in the Z direction of Fig. 1, perpendicular to the X and Y directions, and a mechanism that rotates the stage 21 about an axis parallel to the Z direction may also be added to the moving mechanism 11. In the image acquiring apparatus 1, the computer 3 serves as an overall control part that governs the overall control of the image acquiring apparatus 1.
The imaging unit 13 includes an illumination part 131 that emits light toward the imaging region 90 on the base material 9, a line sensor 132 that receives reflected light from the imaging region 90, and an angle changing mechanism 133 that changes the illumination angle of the light from the illumination part 131 and the detection angle of the line sensor 132. Here, the illumination angle is the angle θ1 formed between the optical axis J1 from the illumination part 131 to the imaging region 90 and the normal N of the base material 9, and the detection angle is the angle θ2 formed between the optical axis J2 from the imaging region 90 to the line sensor 132 and the normal N.
The illumination part 131 emits light of a wavelength to which the pattern is transmissive, and the light irradiates at least the linear imaging region 90. The illumination part 131 includes a plurality of LEDs arranged along the X direction and an optical system that homogenizes the light from the LEDs and guides it to the imaging region 90 extending in the X direction. The line sensor 132 includes a one-dimensional imaging element and an optical system that makes the imaging region 90 optically conjugate with the light-receiving surface of the imaging element. The imaging unit 13 may also be provided with an autofocus mechanism that moves the illumination part 131, the line sensor 132, and the angle changing mechanism 133 together along the direction of the normal N of the base material 9.
When a captured image described later is acquired, the moving mechanism 11 moves the base material 9 in a direction intersecting the imaging region 90; that is, the moving mechanism 11 moves the base material 9 relative to the imaging region 90. While the base material 9 moves, the line sensor 132 repeatedly acquires line images of the linear imaging region 90 at high speed, and a two-dimensional captured image is thereby obtained. In the present embodiment, the base material 9 moves in the Y direction perpendicular to the imaging region 90, but the imaging region 90 may be inclined with respect to the moving direction. The moving mechanism 11 may also be regarded as part of the imaging unit 13 that images the base material 9.
The angle changing mechanism 133 changes the illumination angle θ1 and the detection angle θ2 while keeping them equal to each other. Accordingly, in the following description the magnitude of the detection angle is also the magnitude of the illumination angle, and vice versa. The illumination part 131 and the line sensor 132 are supported on a base wall 134 via the angle changing mechanism 133. The base wall 134 is a plate-like member parallel to the Y and Z directions.
The base wall 134 is provided with a first opening 201 and a second opening 202, each having an arc shape centered on the imaging region 90. The angle changing mechanism 133 includes a motor 135, a guide, a support, and a transmission gear (not shown) for moving the illumination part 131 along the first opening 201, and a motor 136, a guide, a support, and a transmission gear (not shown) for moving the line sensor 132 along the second opening 202.
Fig. 2 is a diagram showing the structure of the computer 3. The computer 3 has the structure of a general computer system and includes a CPU 31 that performs various kinds of arithmetic processing, a ROM 32 that stores a basic program, and a RAM 33 that stores various kinds of information. The computer 3 further includes: a fixed disk 34 that stores information; a display 35 that displays various kinds of information such as images; a keyboard 36a and a mouse 36b that receive input from an operator; a reading device 37 that reads information from a computer-readable recording medium 30 such as an optical disk, a magnetic disk, or a magneto-optical disk; and a communication part 38 that transmits and receives signals to and from the other components of the image acquiring apparatus 1. In Fig. 2, I/F denotes an input/output interface.
In the computer 3, a program 300 is read in advance from the recording medium 30 via the reading device 37 and stored on the fixed disk 34. The CPU 31 executes arithmetic processing in accordance with the program 300 while using the RAM 33 and the fixed disk 34, thereby implementing the functions described later.
Fig. 3 is a block diagram showing the functional structure implemented by the computer 3, that is, the functional structure implemented using the CPU 31, the ROM 32, the RAM 33, the fixed disk (hard disk) 34, and so on. The computer 3 includes an image processing part 41, an inspection part 42, and a storage part 49. The image processing part 41 includes a representative value acquiring part 411, a reference value setting part 412, and a filter processing part 413, and performs image processing, described later, on the captured image acquired by the imaging unit 13. The storage part 49 stores captured image data 491. The details of the functions of these components are described later. These functions may also be implemented wholly or partly by dedicated circuits.
Fig. 4 is a diagram showing the flow of processing for inspecting the pattern on the base material 9. In the pattern inspection, first, the base material 9 on which the pattern to be inspected is formed is prepared and placed on the stage 21 (step S11). Fig. 5 is a sectional view of the base material 9. As already mentioned, the pattern 91 of the transparent electrode film is formed on the main surface of the base material 9, and minute particulate matter 92 (so-called filler) is contained inside the base material 9. The particulate matter 92 is dispersed throughout the base material 9.
In the present embodiment, for each of various film structures (combinations of film types and thicknesses) on the base material 9, the illumination angle and detection angle that increase the contrast of the edge of the region in which the pattern appears in the captured image (hereinafter referred to as the "pattern region") are obtained in advance. The contrast of the edge of the pattern region is the brightness difference (absolute value) between the pattern region and the background region near the edge of the pattern region in the captured image. The computer 3 determines the illumination angle and detection angle that match the film structure of the base material 9 on the stage 21 (hereinafter referred to as the "set angle") and, by controlling the angle changing mechanism 133, sets the illumination angle and the detection angle to this set angle (step S12).
Next, the emission of light from the illumination part 131 is started, and the base material 9 is moved continuously in the Y direction by the moving mechanism 11 so that the positions to be inspected on the base material 9 pass through the imaging region 90. While the base material 9 moves, the line sensor 132 repeatedly acquires line images of the linear imaging region 90 at high speed. A two-dimensional captured image showing the pattern is thereby obtained and stored in the storage part 49 as captured image data 491 (step S13).
Fig. 6 is a diagram showing part of the captured image. In Fig. 6, the narrower the pitch of the parallel hatching added to a region, the lower the brightness (the average of the brightness values) of that region. The captured image 8 includes a pattern region 81 and a background region 82, the background region 82 being the region other than the pattern region. The captured image 8 also contains a plurality of minute particle elements 831, 832, and 833, which are caused by the particulate matter 92 inside the base material 9, by minute irregularities on the main surface of the base material 9, by noise, and so on. Among the particle elements 831 to 833, the brightness of the particle elements denoted by reference numeral 831 in Fig. 6 is the lowest, and the brightness of the particle element denoted by reference numeral 832 is the highest. Taking the brightness difference between each particle element 831 to 833 and its surroundings as the contrast of that particle element, the contrast of the particle elements 831 is higher than the contrast of the edge 811 of the pattern region 81, while the contrast of the particle elements 832 and 833 is lower than the contrast of the edge 811.
The representative value acquiring part 411 takes each pixel in the captured image 8 as a pixel of interest, sets a target region of a prescribed size centered on that pixel of interest, and identifies the plurality of pixels contained in the target region (hereinafter referred to as "object pixels"). For example, when the pixel denoted by reference numeral 71 in Fig. 6 is the pixel of interest, all the pixels contained in the target region denoted by reference numeral 711 (shown by the broken-line rectangle) are object pixels; when the pixel denoted by reference numeral 72 is the pixel of interest, all the pixels contained in the target region denoted by reference numeral 721 (shown by the broken-line rectangle) are object pixels. The target region is preferably sufficiently large compared with the particle elements. In the following processing, the pixels of interest 71 and 72 are themselves also treated as object pixels. The target region of each pixel of interest corresponds to the filter range of the filtering process, described later, applied to that pixel of interest.
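The relation between a pixel of interest and its target region can be illustrated with a short sketch (Python with NumPy is used here only for illustration; the window half-size `half` and the clipping at the image border are assumptions, since the patent does not fix the size of the target region):

```python
import numpy as np

def target_region(image, y, x, half):
    """Return the brightness values of the object pixels in the target
    region of size (2*half + 1) x (2*half + 1) centred on the pixel of
    interest at (y, x).  The region is clipped at the image border,
    which is an assumption not specified by the patent."""
    y0, y1 = max(0, y - half), min(image.shape[0], y + half + 1)
    x0, x1 = max(0, x - half), min(image.shape[1], x + half + 1)
    return image[y0:y1, x0:x1].astype(np.float64)
```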
Figs. 7 and 8 are diagrams showing brightness value profiles of target regions. Fig. 7 shows the brightness values of the object pixels arranged in the horizontal direction through the pixel of interest 71 in the target region 711 of Fig. 6, and Fig. 8 shows those arranged in the horizontal direction through the pixel of interest 72 in the target region 721. In Figs. 7 and 8, the positions of the pixels of interest are denoted by reference numerals 71 and 72, and the ranges of the pattern region 81, the background region 82, and the particle elements 831 and 832 are indicated by arrows bearing the same reference numerals. As shown in Fig. 8, the brightness difference between the particle element 831 and the portion of the background region 82 that contains no particle element (the particle-free region) is larger than the brightness difference between the particle-free region of the background region 82 and the pattern region 81.
When the plurality of object pixels corresponding to a pixel of interest have been identified, a representative value of the brightness values of these object pixels and a value representing the dispersion of the brightness values are obtained (step S14). The representative value of the brightness values is, for example, the mean, and the value representing the dispersion is, for example, the standard deviation. In practice, each pixel of the captured image 8 is taken in turn as the pixel of interest, the processing of step S14 and of steps S15 to S17 described later is performed, and the pixel of interest is then changed to another pixel; this series of processing is repeated. In Fig. 4, the block representing this repetition is omitted. In the following, to contrast the case where the pixel 71 is the pixel of interest with the case where the pixel 72 is the pixel of interest, the processing for these two pixels of interest is described together.
The reference value setting part 412 denotes the representative value of the brightness values by m, the value representing the dispersion by d, and a prescribed coefficient by s, and defines the brightness value range with (m − s·d) as its lower limit and (m + s·d) as its upper limit as the reference-value determination range. Then, as in the example of Fig. 7, when the brightness value P1 of the pixel of interest 71 is within the reference-value determination range R1 (including the case where it equals the upper or lower limit), the brightness value P1 of the pixel of interest 71 is set as the reference value used in the filtering process described later. As in the example of Fig. 8, when the brightness value P2 of the pixel of interest 72 is outside the reference-value determination range R2, the representative value PA of the brightness values is set as the reference value (step S15). In the target region 721 of Fig. 6, the area of the background region 82 is sufficiently large compared with the area of the pattern region 81, and within the background region 82 the particle-free region is sufficiently large compared with the region occupied by the particle element 831, so the representative value PA in Fig. 8 is close to the brightness value of the particle-free region of the background region 82. The coefficient s is determined as appropriate in consideration of the size of the particle elements, their distribution, their contrast, and so on (the same applies to the size of the target region).
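Steps S14 and S15 can be summarized in the following sketch, which takes the window returned by `target_region` above; the mean and standard deviation stand in for the representative value and the dispersion, and the coefficient `s = 2.0` is an assumed example value rather than one prescribed by the patent:

```python
def reference_value(window, center_value, s=2.0):
    """Steps S14 and S15 as a sketch: m is the representative value,
    d the dispersion, and [m - s*d, m + s*d] the reference-value
    determination range."""
    m = window.mean()                      # representative value (step S14)
    d = window.std()                       # value representing the dispersion
    lower, upper = m - s * d, m + s * d    # reference-value determination range
    if lower <= center_value <= upper:     # pixel of interest lies inside the range
        return center_value, (lower, upper)
    return m, (lower, upper)               # otherwise the representative value becomes the reference value
```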
Further, among the plurality of object pixels corresponding to the pixel of interest 71 or 72, the reference value setting part 412 identifies those whose brightness values lie within the reference-value determination range R1 or R2 as extraction pixels (step S16). In other words, the object pixels remaining after excluding those whose brightness values lie outside the reference-value determination range R1 or R2 are all extraction pixels. In the example of Fig. 7, the object pixels contained in the background region 82 are excluded; in the example of Fig. 8, the object pixels contained in the particle element 831 are excluded.
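Step S16 then reduces to a simple mask over the same window; this is only a sketch of the extraction-pixel selection, reusing the range bounds returned above:

```python
def extraction_pixels(window, bounds):
    """Step S16 as a sketch: a boolean mask selecting the object pixels
    whose brightness values lie inside the reference-value determination
    range; the remaining object pixels are excluded from the filtering."""
    lower, upper = bounds
    return (window >= lower) & (window <= upper)
```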
Next, the filter processing part 413 performs a prescribed filtering process on the pixel of interest. In the filtering process, the brightness values of the plurality of pixels in the filter range are multiplied by coefficients (weights) and used in the calculation. Here, bilateral filtering, one of the commonly used filtering processes, is described. In bilateral filtering, when the size of the filter range in the row direction is expressed as (2fx + 1), its size in the column direction as (2fy + 1), and the brightness value of the pixel of interest in the image as i(x, y), the new brightness value i0(x, y) of the pixel of interest is obtained by Formula 1 using prescribed coefficients σd and σv.
(Formula 1)

$$
i_0(x,y) = \frac{\displaystyle\sum_{k=-f_x}^{f_x}\sum_{l=-f_y}^{f_y} i(x+k,\,y+l)\,\exp\!\left(-\frac{k^2+l^2}{2\sigma_d^2}\right)\exp\!\left(-\frac{\bigl(i(x,y)-i(x+k,\,y+l)\bigr)^2}{2\sigma_v^2}\right)}{\displaystyle\sum_{k=-f_x}^{f_x}\sum_{l=-f_y}^{f_y} \exp\!\left(-\frac{k^2+l^2}{2\sigma_d^2}\right)\exp\!\left(-\frac{\bigl(i(x,y)-i(x+k,\,y+l)\bigr)^2}{2\sigma_v^2}\right)}
$$
In bilateral filtering, the smaller the distance between each pixel in the filter range and the pixel of interest (for pixels of equal brightness value), the larger the weight given to that pixel's brightness value, so the filter has the effect of smoothing centered on the pixels near the pixel of interest. The coefficient σd in Formula 1 adjusts the degree to which the bilateral filtering smooths the image. In addition, the smaller the difference between the brightness value of each pixel in the filter range and the brightness value of the pixel of interest (for pixels at equal distances from the pixel of interest), the larger the weight given to that pixel's brightness value. Therefore, when the filter range contains two regions of different brightness, the influence of the region other than the one containing the pixel of interest on the new brightness value is reduced, so the edge between the two regions is not easily blurred; that is, the edge is easily preserved. The coefficient σv in Formula 1 adjusts the degree to which the bilateral filtering preserves edges.
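For reference, a direct implementation of Formula 1 (ordinary bilateral filtering) might look as follows; the treatment of pixels outside the image border is an assumption, as the patent does not specify it:

```python
import numpy as np

def bilateral_value(image, y, x, fy, fx, sigma_d, sigma_v):
    """Ordinary bilateral filtering (Formula 1) as a sketch: a weighted
    mean over the (2*fx + 1) by (2*fy + 1) filter range, the weight
    falling off with spatial distance (sigma_d) and with the brightness
    difference from the pixel of interest (sigma_v)."""
    img = image.astype(np.float64)
    center = img[y, x]
    num = den = 0.0
    for k in range(-fx, fx + 1):            # offsets in the row direction
        for l in range(-fy, fy + 1):        # offsets in the column direction
            yy, xx = y + l, x + k
            if not (0 <= yy < img.shape[0] and 0 <= xx < img.shape[1]):
                continue                    # pixels outside the image are skipped (an assumption)
            w = (np.exp(-(k * k + l * l) / (2.0 * sigma_d ** 2))
                 * np.exp(-((center - img[yy, xx]) ** 2) / (2.0 * sigma_v ** 2)))
            num += w * img[yy, xx]
            den += w
    return num / den
```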
The filter processing part 413 of the image acquiring apparatus 1 performs a filtering process obtained by modifying this bilateral filtering. Specifically, for the combinations of k and l for which the object pixel corresponding to i(x + k, y + l) in Formula 1 is not an extraction pixel, the calculation is not performed (that is, the terms associated with that combination of k and l are treated as 0), and i(x, y) in Formula 1 is replaced by the reference value. Accordingly, in the example of Fig. 8, in which the representative value PA of the brightness values is set as the reference value, when the new brightness value of the pixel of interest 72 contained in the particle element 831 is obtained, the smaller the difference between the brightness value of each extraction pixel and the reference value PA, the larger the weight given to that extraction pixel's brightness value. The new brightness value of the pixel of interest 72 is therefore a value closer to the representative value PA than to the brightness value of the pattern region 81 or of the particle element 831, that is, a value close to the brightness value of the particle-free region of the background region 82.
On the other hand, as in the example of Fig. 7, when the brightness value P1 of the pixel of interest 71 is set as the reference value, the smaller the difference between the brightness value of each extraction pixel and the brightness value P1 of the pixel of interest 71, the larger the weight given to that extraction pixel's brightness value, so the new brightness value of the pixel of interest 71 is close to the original brightness value of the pixel of interest 71. In this way, the filtering process using the brightness values of the plurality of extraction pixels is performed on the pixel of interest, and the new brightness value of the pixel of interest is obtained (step S17).
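Putting steps S14 to S17 together, the modified filtering performed by the filter processing part 413 can be sketched as follows; this is an illustrative reading of the description, not the apparatus's actual implementation, and the coefficient `s`, the window size, and the border handling are assumptions:

```python
import numpy as np

def filtered_value(image, y, x, fy, fx, sigma_d, sigma_v, s=2.0):
    """Modified bilateral filtering of steps S14 to S17: i(x, y) in
    Formula 1 is replaced by the reference value, and object pixels that
    are not extraction pixels are excluded from both sums."""
    img = image.astype(np.float64)
    y0, y1 = max(0, y - fy), min(img.shape[0], y + fy + 1)
    x0, x1 = max(0, x - fx), min(img.shape[1], x + fx + 1)
    window = img[y0:y1, x0:x1]
    m, d = window.mean(), window.std()          # representative value and dispersion (step S14)
    lower, upper = m - s * d, m + s * d         # reference-value determination range
    ref = img[y, x] if lower <= img[y, x] <= upper else m   # reference value (step S15)
    num = den = 0.0
    for yy in range(y0, y1):
        for xx in range(x0, x1):
            v = img[yy, xx]
            if not (lower <= v <= upper):       # not an extraction pixel (step S16): excluded
                continue
            w = (np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2.0 * sigma_d ** 2))
                 * np.exp(-((ref - v) ** 2) / (2.0 * sigma_v ** 2)))
            num += w * v
            den += w
    return num / den if den > 0.0 else ref      # new brightness value (step S17)
```

On this sketch, the pixel of interest 72 of Fig. 8 receives a value close to the representative value PA, while the pixel of interest 71 of Fig. 7 receives a value close to its original brightness value P1, matching the behaviour described above.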
As already mentioned, in practice the processing of steps S14 to S17 is repeated with each pixel of the captured image 8 taken in turn as the pixel of interest. As shown in Fig. 9, by obtaining new brightness values for all the pixels of the captured image 8, an image 8a to which the above filtering process has been applied (hereinafter referred to as the "processed image 8a") is generated from the captured image 8 and stored in the storage part 49 as processed image data 492.
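Generating the processed image 8a then amounts to applying the per-pixel sketch above to every pixel of the captured image 8; the window size and the σ values below are illustrative assumptions only:

```python
import numpy as np

def process_image(image, fy=7, fx=7, sigma_d=3.0, sigma_v=20.0, s=2.0):
    """Apply the per-pixel filtered_value sketch to every pixel of the
    captured image to obtain the processed image (a sketch; all
    parameter values here are assumed, not taken from the patent)."""
    out = np.empty(image.shape, dtype=np.float64)
    for y in range(image.shape[0]):
        for x in range(image.shape[1]):
            out[y, x] = filtered_value(image, y, x, fy, fx, sigma_d, sigma_v, s)
    return out
```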
In the captured image 8 of Fig. 6, in the filtering process for a pixel of interest contained in a high-contrast particle element 831, the representative value of the brightness values is set as the reference value, so a value close to the brightness values surrounding the particle element 831 is obtained as the new brightness value. The particle elements 831 are thereby substantially removed in the processed image 8a of Fig. 9. The low-contrast minute particle elements 832 and 833 of the captured image 8 of Fig. 6 are also substantially removed in the processed image 8a of Fig. 9 by the ordinary smoothing effect of the filtering. The processed image 8a can be displayed on the display 35 as required.
The inspection part 42 stores reference image data representing a defect-free pattern and determines whether the pattern has a defect by comparing the processed image with the reference image (step S18). The inspection of the pattern based on the processed image may also be performed by a method other than comparison with a reference image. It is also possible to measure, for example, the distance between edges of the pattern region (the width of the pattern) and to judge the quality of the pattern based on the measurement result.
Here, the processing of a comparative example using ordinary bilateral filtering is described. In the processing of the comparative example, the filtering process of Formula 1 is performed. As already mentioned, in the filtering process of Formula 1, the smaller the difference between the brightness value of each object pixel in the target region corresponding to a pixel of interest and the brightness value of that pixel of interest, the larger the weight given to that object pixel's brightness value. Consequently, as shown in the example of Fig. 8, the new brightness value corresponding to the pixel of interest 72 contained in the particle element 831 is close to the brightness value of the particle element 831, and, as shown in Fig. 10, the particle elements 831 of the captured image 8 remain in the processed image 99 produced by the processing of the comparative example.
In the image acquiring apparatus 1, by contrast, the representative value of the brightness values of the target region centered on each pixel of interest is obtained; when the brightness value of the pixel of interest is within the reference-value determination range set around the representative value, the brightness value of the pixel of interest is set as the reference value, and when it is outside that range, the representative value is set as the reference value. The new brightness value of the pixel of interest is then obtained by the filtering process in which the smaller the difference between the brightness value of each object pixel in the target region and the reference value, the larger the weight given to that object pixel's brightness value. In this way, in a captured image which shows a pattern region and contains particle elements whose contrast is higher than that of the edge of the pattern region, the edge of the pattern region can be preserved while the high-contrast particle elements are removed. The inspection of the pattern (including the detection of defects and the measurement of shapes) can therefore be performed accurately and stably.
Since the width of the reference-value determination range used to set the reference value is determined based on the value representing the dispersion of the brightness values of the plurality of object pixels in the target region, an appropriate reference-value determination range can be set easily. Further, by excluding from the filtering process the brightness values of object pixels whose brightness values lie outside the reference-value determination range, the use of abnormal brightness values in the filtering process is prevented and the new brightness value of the pixel of interest is obtained appropriately.
Various modifications of the image acquiring apparatus 1 described above are possible.
The representative value of the brightness values obtained by the representative value acquiring part 411 may be any value representing the center of the distribution of the brightness values of the plurality of object pixels in the target region, and may be another statistic such as the median. The representative value may also be the mean, the median, or the like of the brightness values of the object pixels (extraction pixels) whose brightness values lie within the reference-value determination range. Similarly, the value representing the dispersion of the brightness values of the plurality of object pixels may be a statistic other than the standard deviation, as long as it represents statistical dispersion.
In the target region corresponding to each pixel of interest, the above filtering process in which i(x, y) of Formula 1 is replaced by the reference value may be performed only when the value representing the dispersion of the brightness values is greater than or equal to a prescribed value, and ordinary bilateral filtering (the processing of the comparative example) may be performed when the value representing the dispersion is less than the prescribed value. In this case, the above filtering process using the reference value is performed only on regions of the captured image that contain high-contrast particle elements 831; in other words, the above filtering process is performed only on the image (the part of the captured image) representing such regions.
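A sketch of this variation, reusing the `bilateral_value` and `filtered_value` sketches above, is shown below; the dispersion threshold `d_min` is an assumed tuning parameter:

```python
def adaptive_value(image, y, x, fy, fx, sigma_d, sigma_v, d_min, s=2.0):
    """Apply the reference-value based filtering only where the
    dispersion of the target region is at least d_min; elsewhere use
    ordinary bilateral filtering (Formula 1)."""
    img = image.astype('float64')
    y0, y1 = max(0, y - fy), min(img.shape[0], y + fy + 1)
    x0, x1 = max(0, x - fx), min(img.shape[1], x + fx + 1)
    if img[y0:y1, x0:x1].std() >= d_min:    # high dispersion: region likely contains a high-contrast particle
        return filtered_value(image, y, x, fy, fx, sigma_d, sigma_v, s)
    return bilateral_value(image, y, x, fy, fx, sigma_d, sigma_v)
```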
In the above embodiment, the width of the brightness value range (the reference-value determination range) used to set the reference value is determined based on the value representing the dispersion of the brightness values of the plurality of object pixels, but the width of the brightness value range may instead be set in advance according to the type of the base material 9 or the like. In other words, the brightness value range used to set the reference value is set to a variable width obtained by calculation or to a fixed (prescribed) width set in advance. Furthermore, the brightness value range used to select the object pixels to be excluded from the filtering process (to determine the extraction pixels) may differ from the reference-value determination range, provided that it is set around the representative value.
The reference value setting part 412 may also set the representative value of the brightness values of the target region as the reference value regardless of the brightness value of the pixel of interest. In this case as well, the filtering process in which the smaller the difference between the brightness value of each object pixel and the reference value, the larger the weight given to that object pixel's brightness value makes it possible to preserve the edge of the pattern region while removing the high-contrast particle elements in the processed image. To prevent the edge from becoming blurred in the processed image, however, it is preferable to set the brightness value of the pixel of interest as the reference value when it is within the brightness value range set around the representative value, and to set the representative value as the reference value when the brightness value of the pixel of interest is outside that range.
In the above embodiment, a process obtained by modifying bilateral filtering, which is one type of edge-preserving smoothing filter, is performed, but the filter processing part 413 may perform another filtering process in which the smaller the difference between the brightness value of each object pixel and the reference value, the larger the weight given to that object pixel's brightness value. Examples of such filtering processes include a process in which the brightness value of the pixel of interest in ε-filtering is replaced by the reference value, and a process in which the brightness value of the pixel of interest in the method described in Japanese Patent Application Laid-Open No. 2008-205737 (Document 2 above) is replaced by the reference value.
Depending on the intended use of the processed image and other factors, the processing of step S16 for determining the extraction pixels from the plurality of object pixels contained in the target region of each pixel of interest may be omitted, and in the filtering process of the filter processing part 413 the new brightness value of the pixel of interest may be obtained using the brightness values of all the object pixels contained in the target region.
As already mentioned, in an image obtained by imaging a transparent base material, particle elements with a contrast higher than that of the edge of the pattern region tend to appear owing to the filler contained inside the base material, minute irregularities on its surface, and so on. The above image processing, which preserves the edge of the pattern region in the image while removing the high-contrast particle elements, is therefore particularly suitable for processing an image showing a transparent electrode film formed on a transparent base material. Of course, the above image processing may also be performed when an image containing particle elements whose contrast is higher than that of the edge of the pattern region is obtained by imaging a colored base material or an object other than a base material. The processing of the image processing part 41 serving as an image processing apparatus may also be used for various purposes other than inspection of the pattern on the base material 9.
The structures of the above embodiment and of the modifications may be combined as appropriate as long as they do not contradict one another.
While the invention has been shown and described in detail, the foregoing description is illustrative and not restrictive. Numerous modifications and variations can therefore be devised without departing from the scope of the invention.

Claims (12)

1. An image processing apparatus, characterized by comprising:
a representative value acquiring part that, in an image which shows a pattern region and contains particle elements whose contrast is higher than that of an edge of said pattern region, takes each pixel as a pixel of interest and obtains a representative value of brightness values of a plurality of object pixels contained in a region of a prescribed size centered on said pixel of interest;
a reference value setting part that sets the brightness value of said pixel of interest as a reference value when the brightness value of said pixel of interest is within a brightness value range set around said representative value and sets said representative value as the reference value when the brightness value of said pixel of interest is outside said brightness value range, or that sets said representative value as the reference value regardless of the brightness value of said pixel of interest; and
a filter processing part that obtains a new brightness value of said pixel of interest by performing a filtering process which uses the brightness values of said plurality of object pixels and in which the smaller the difference between the brightness value of each object pixel and said reference value, the larger the weight given to the brightness value of said each object pixel.
2. The image processing apparatus according to claim 1, characterized in that
said representative value acquiring part obtains a value representing a dispersion of the brightness values of said plurality of object pixels, and
said reference value setting part determines a width of said brightness value range based on the value representing said dispersion and sets said reference value using said brightness value range.
3. The image processing apparatus according to claim 1 or 2, characterized in that
in said filtering process, said filter processing part excludes the brightness value of any object pixel, among said plurality of object pixels, whose brightness value is outside a brightness value range set around said representative value.
4. The image processing apparatus according to claim 1 or 2, characterized in that
in said filtering process, the smaller the distance between said each object pixel and said pixel of interest, the larger the weight given to the brightness value of said each object pixel.
5. The image processing apparatus according to claim 1 or 2, characterized in that
said image shows a transparent electrode film formed on a transparent base material.
6. An image acquiring apparatus that acquires an image of a thin-film pattern formed on a base material, characterized by comprising:
an imaging unit that acquires a captured image by imaging said base material; and
the image processing apparatus according to claim 1 or 2, which processes said captured image.
7. An image processing method, characterized by comprising:
an operation a) of, in an image which shows a pattern region and contains particle elements whose contrast is higher than that of an edge of said pattern region, taking each pixel as a pixel of interest and obtaining a representative value of brightness values of a plurality of object pixels contained in a region of a prescribed size centered on said pixel of interest;
an operation b) of setting the brightness value of said pixel of interest as a reference value when the brightness value of said pixel of interest is within a brightness value range set around said representative value and setting said representative value as the reference value when the brightness value of said pixel of interest is outside said brightness value range, or setting said representative value as the reference value regardless of the brightness value of said pixel of interest; and
an operation c) of obtaining a new brightness value of said pixel of interest by performing a filtering process which uses the brightness values of said plurality of object pixels and in which the smaller the difference between the brightness value of each object pixel and said reference value, the larger the weight given to the brightness value of said each object pixel.
8. The image processing method according to claim 7, characterized in that
in said operation a), a value representing a dispersion of the brightness values of said plurality of object pixels is obtained, and
in said operation b), a width of said brightness value range is determined based on the value representing said dispersion, and said reference value is set using said brightness value range.
9. The image processing method according to claim 7 or 8, characterized in that
in said operation c), in said filtering process, the brightness value of any object pixel, among said plurality of object pixels, whose brightness value is outside a brightness value range set around said representative value is excluded.
10. The image processing method according to claim 7 or 8, characterized in that
in said filtering process, the smaller the distance between said each object pixel and said pixel of interest, the larger the weight given to the brightness value of said each object pixel.
11. The image processing method according to claim 7 or 8, characterized in that
said image shows a transparent electrode film formed on a transparent base material.
12. An image acquiring method for acquiring an image of a thin-film pattern formed on a base material, characterized by comprising:
an operation of acquiring a captured image by imaging said base material; and
an operation of processing said captured image by the image processing method according to claim 7 or 8.
CN201510340961.1A 2014-06-19 2015-06-18 Image processing apparatus, image acquiring apparatus, image processing method and image acquiring method Expired - Fee Related CN105282461B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-126237 2014-06-19
JP2014126237A JP6286291B2 (en) 2014-06-19 2014-06-19 Image processing apparatus, image acquisition apparatus, image processing method, and image acquisition method

Publications (2)

Publication Number Publication Date
CN105282461A true CN105282461A (en) 2016-01-27
CN105282461B CN105282461B (en) 2018-06-26

Family

ID=55088046

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510340961.1A Expired - Fee Related CN105282461B (en) 2015-06-18 Image processing apparatus, image acquiring apparatus, image processing method and image acquiring method

Country Status (4)

Country Link
JP (1) JP6286291B2 (en)
KR (1) KR101671112B1 (en)
CN (1) CN105282461B (en)
TW (1) TWI582415B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106776877A (en) * 2016-11-28 2017-05-31 常州市星网计算机技术有限公司 Motorcycle parts model automatic identifying method
CN110645909A (en) * 2019-08-16 2020-01-03 广州瑞松北斗汽车装备有限公司 Vehicle body appearance defect detection method and detection system

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019036821A (en) * 2017-08-14 2019-03-07 キヤノン株式会社 Image processing system, image processing method, and program
JP7173763B2 (en) * 2018-06-20 2022-11-16 株式会社日本マイクロニクス Image generation device and image generation method
JPWO2022163859A1 (en) * 2021-02-01 2022-08-04
KR20230170294A (en) 2022-06-10 2023-12-19 마채준 Wheelchair-accessible Healing Agricultural Plant Box(Flowerbed)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006250536A (en) * 2005-03-08 2006-09-21 Matsushita Electric Ind Co Ltd Pattern recognition device and pattern recognition method
JP2008205737A (en) * 2007-02-19 2008-09-04 Olympus Corp Imaging system, image processing program, and image processing method
JP2009182735A (en) * 2008-01-30 2009-08-13 Kyocera Corp Noise eliminating method, image processor, and information code reader
CN101782530A (en) * 2010-01-29 2010-07-21 天津大学 Optical detection method for microcosmic defect expansion of film surfaces and implementing device
KR20120103878A (en) * 2011-03-11 2012-09-20 이화여자대학교 산학협력단 Method for removing noise in image
WO2012153568A1 (en) * 2011-05-10 2012-11-15 オリンパスメディカルシステムズ株式会社 Medical image processing device and medical image processing method
CN102809567A (en) * 2011-06-01 2012-12-05 大日本网屏制造株式会社 Image acquisition apparatus, pattern inspection apparatus, and image acquisition method
KR20130031572A (en) * 2011-09-21 2013-03-29 삼성전자주식회사 Image processing method and image processing apparatus
JP2013065946A (en) * 2011-09-15 2013-04-11 Ricoh Co Ltd Image processing apparatus and image processing method
JP2013235517A (en) * 2012-05-10 2013-11-21 Sharp Corp Image processing device, image display device, computer program and recording medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1310189C (en) * 2003-06-13 2007-04-11 金宝电子工业股份有限公司 Method for reducing image video and reinforcing edge in digital camera
KR20070099398A (en) * 2006-04-03 2007-10-09 삼성전자주식회사 Apparatus for inspecting substrate and method of inspecting substrate using the same
JP5052301B2 (en) * 2007-11-21 2012-10-17 オリンパス株式会社 Image processing apparatus and image processing method
JP2010052304A (en) * 2008-08-28 2010-03-11 Canon Inc Image formation device
JP5416377B2 (en) * 2008-08-28 2014-02-12 アンリツ産機システム株式会社 Image processing apparatus, X-ray foreign object detection apparatus including the same, and image processing method
JP5315158B2 (en) * 2008-09-12 2013-10-16 キヤノン株式会社 Image processing apparatus and image processing method
KR101590868B1 (en) * 2009-07-17 2016-02-02 삼성전자주식회사 A image processing method an image processing apparatus a digital photographing apparatus and a computer-readable storage medium for correcting skin color
JP5728348B2 (en) * 2011-09-21 2015-06-03 株式会社Screenホールディングス Pattern image display device and pattern image display method
WO2014049667A1 (en) * 2012-09-28 2014-04-03 株式会社島津製作所 Digital image processing method and imaging device
TW201826611A (en) * 2012-10-25 2018-07-16 美商應用材料股份有限公司 Diffractive optical elements and methods for patterning thin film electrochemical devices


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106776877A (en) * 2016-11-28 2017-05-31 常州市星网计算机技术有限公司 Motorcycle parts model automatic identifying method
CN106776877B (en) * 2016-11-28 2020-05-01 常州市星网计算机技术有限公司 Automatic identification method for motorcycle part models
CN110645909A (en) * 2019-08-16 2020-01-03 广州瑞松北斗汽车装备有限公司 Vehicle body appearance defect detection method and detection system

Also Published As

Publication number Publication date
CN105282461B (en) 2018-06-26
KR101671112B1 (en) 2016-10-31
JP6286291B2 (en) 2018-02-28
JP2016004020A (en) 2016-01-12
TWI582415B (en) 2017-05-11
TW201612505A (en) 2016-04-01
KR20150145695A (en) 2015-12-30


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20180626

Termination date: 20200618

CF01 Termination of patent right due to non-payment of annual fee