WO2010044433A1 - Image processing method, image processing device, and surface inspection device using the image processing device - Google Patents


Info

Publication number
WO2010044433A1
WO2010044433A1 PCT/JP2009/067808 JP2009067808W
Authority
WO
WIPO (PCT)
Prior art keywords
image
edge
rectangular
processed
line
Prior art date
Application number
PCT/JP2009/067808
Other languages
English (en)
Japanese (ja)
Inventor
博之 若葉
義典 林
浩一 宮園
祥三 川崎
秀樹 森
和彦 浜谷
Original Assignee
芝浦メカトロニクス株式会社
Priority date
Filing date
Publication date
Application filed by 芝浦メカトロニクス株式会社 (Shibaura Mechatronics Corporation)
Priority to KR1020117006596A (granted as KR101227706B1)
Priority to CN2009801404438A (granted as CN102177428B)
Publication of WO2010044433A1

Classifications

    • G01N 21/88: Investigating the presence of flaws or contamination (by use of optical means)
    • G01N 21/95: Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N 21/9501: Semiconductor wafers
    • G01N 21/956: Inspecting patterns on the surface of objects
    • G06T 1/00: General purpose image data processing
    • G06T 7/0004: Industrial image inspection
    • G06T 2207/30121: CRT, LCD or plasma display
    • G06T 2207/30148: Semiconductor; IC; Wafer

Definitions

  • The present invention relates to an image processing method, an image processing apparatus, and a surface inspection apparatus using the image processing apparatus, each of which processes an image including an object surface image composed of grayscale values in pixel units.
  • A surface inspection apparatus has been proposed in which the surface of an object is photographed and defects on the object surface are inspected based on the obtained image (see Patent Document 1).
  • In this apparatus, the surface of an object to be inspected, such as a semiconductor wafer or a liquid crystal glass substrate, is scanned by a line sensor, and a two-dimensional surface image of the object is generated based on the imaging signal output from the line sensor.
  • Each part of the surface image is processed, and defects on the surface of the object (film thickness unevenness, pattern defects, etc.) are detected from the grayscale state of each processed part of the surface image.
  • A filtering process can be performed to emphasize the density of a defective portion in the object surface image.
  • This filtering process emphasizes the gray value of each pixel of interest while taking the gray values of the surrounding pixels into account; that is, the object surface image undergoes two-dimensional image processing in which the gray values of a plurality of two-dimensionally arranged pixels (a target pixel and its surrounding pixels) are processed sequentially.
  • Near the edge of the object surface image, however, the gray value changes abruptly between the inside of the edge line (the object image) and the outside (the background image), and two-dimensional processing that mixes gray values from both sides of the edge cannot produce an appropriate processed image in that region.
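To make the problem concrete, the following sketch (illustrative only, not taken from the patent) applies a 3x3 mean filter, a simple two-dimensional process, near a step edge; the background gray values leak into the result for the pixel on the edge:

```python
# Illustrative sketch: a 3x3 mean filter (two-dimensional processing of a
# target pixel and its surrounding pixels) applied across a step edge.

def mean3x3(img, y, x):
    """Average of the target pixel and its eight surrounding pixels."""
    vals = [img[y + dy][x + dx] for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    return sum(vals) / len(vals)

# 5x5 image: left columns are background (gray value 0), right columns are
# the object surface (gray value 100); the edge runs vertically.
img = [[0, 0, 100, 100, 100] for _ in range(5)]

inside = mean3x3(img, 2, 3)    # well inside the object
on_edge = mean3x3(img, 2, 2)   # target pixel on the edge
print(inside, on_edge)
```

Inside the object the filter returns the true gray value (100), but on the edge the background pixels pull the result down to about 66.7, so the edge region is not processed appropriately by the two-dimensional filter.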
  • The present invention has been made in view of such circumstances, and provides an image processing method and an image processing apparatus capable of obtaining an appropriately image-processed image in the region of the edge portion of the object surface image, as well as a surface inspection apparatus using the image processing apparatus.
  • An image processing method according to the present invention includes: an image acquisition step of acquiring a processed image that includes an object surface image composed of grayscale values in pixel units; an edge image specifying step of specifying, as an edge image, an image of a region that includes an edge line of the object surface image set in the processed image and that has a predetermined width in the direction orthogonal to the edge line; a conversion step of converting the edge image into a rectangular image having a width corresponding to the predetermined width; and an image processing step of performing one-dimensional image processing that sequentially processes the grayscale values of a plurality of pixels arranged in the direction orthogonal to the width direction of the rectangular image.
  • According to this method, the image of the region that includes the edge line of the object surface image set in the image to be processed, and that has a predetermined width in the direction orthogonal to the edge line, is converted into a rectangular image having a width corresponding to that predetermined width.
  • The converted rectangular image is then subjected to one-dimensional image processing that sequentially processes the gray values of a plurality of pixels arranged in the direction orthogonal to the width of the rectangular image.
  • Because those pixels lie in the direction orthogonal to the width direction, the rectangular image is processed sequentially in the direction along the straight line that corresponds to the edge line of the object surface image.
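The direction of the one-dimensional processing can be sketched as follows (an illustrative three-tap mean filter over a hand-made rectangular image; the layout, with rows running along the unwrapped edge line, is an assumption made for the example):

```python
# Illustrative sketch: one-dimensional processing of the rectangular image.
# Each row runs in the direction orthogonal to the width direction, i.e.
# along the straight line corresponding to the edge line, so the filter
# never mixes gray values from across the edge.

def smooth_row(row, k=1):
    """1-D mean filter with window 2*k + 1, applied along one row."""
    out = []
    for i in range(len(row)):
        window = row[max(0, i - k):i + k + 1]
        out.append(sum(window) / len(window))
    return out

# Rectangular image: the width direction (across the rows of this list)
# crosses the edge; each row is processed independently, along the edge.
rect = [
    [100, 100, 100, 100],   # just inside the edge line
    [60, 60, 200, 60],      # on the edge line, with one bright defect pixel
    [0, 0, 0, 0],           # just outside (background)
]
filtered = [smooth_row(r) for r in rect]
print(filtered[1])
```

The uniform inner and outer rows are left unchanged, while the defect on the edge line is smoothed only against its neighbours along the line.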
  • The image processing method according to the present invention can further include an inverse conversion step of inversely converting the rectangular image subjected to the one-dimensional image processing into an image of the original region, thereby generating a processed edge image.
  • Since the one-dimensional image processing of the rectangular image is performed in the direction along the straight line corresponding to the edge line of the object surface image, the processed edge image obtained by the inverse conversion has, in effect, been image-processed along the edge line of the object surface image.
  • In the edge image specifying step, an image of a region having a first predetermined width inward of the edge line, toward the inside of the object surface image, and a second predetermined width outward of the edge line can be specified as the edge image.
  • Thus, even if an edge line that exactly matches the edge of the object surface image is not set in the image to be processed, an image of a region that always includes that edge can be specified as the edge image.
  • The conversion step can include a position conversion step of acquiring the position in the edge image corresponding to each pixel of the rectangular image, and a step of determining, based on the gray values of the pixels in the edge image, the gray value of the rectangular-image pixel corresponding to each acquired position.
  • When an acquired position coincides with a pixel of the edge image, the gray value of the rectangular-image pixel can simply be set to the gray value of that pixel.
  • Otherwise, the gray value of the rectangular-image pixel can be determined, for example, by interpolation based on the gray values of the pixels surrounding the corresponding position in the edge image.
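For example, the interpolation can be bilinear over the four pixels surrounding the position (the patent leaves the interpolation method open; this sketch is one common choice):

```python
# Illustrative sketch: bilinear interpolation of a gray value at a
# non-integer position (x, y) of the edge image.

def bilinear(img, x, y):
    """Gray value at real-valued (x, y), from the four surrounding pixels."""
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    top = img[y0][x0] * (1 - fx) + img[y0][x0 + 1] * fx
    bottom = img[y0 + 1][x0] * (1 - fx) + img[y0 + 1][x0 + 1] * fx
    return top * (1 - fy) + bottom * fy

# Dark left column (0), bright right column (100): a quarter of the way
# across, the interpolated gray value is 25.
edge_img = [[0, 100],
            [0, 100]]
val = bilinear(edge_img, 0.25, 0.5)
print(val)
```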
  • A conversion table representing the position in the edge image corresponding to each pixel of the rectangular image can be created in advance, and the position conversion step can obtain those positions using the conversion table.
  • Because the positions are looked up in the table rather than computed, the position in the edge image corresponding to each pixel of the rectangular image can be obtained faster.
  • The inverse conversion step can include a position inverse conversion step of acquiring the position in the rectangular image corresponding to each pixel of the image of the original region, and a step of determining, based on the gray values of the pixels in the one-dimensionally processed rectangular image, the gray value of the original-region pixel corresponding to each acquired position, thereby generating the processed edge image.
  • The processed edge image is thus constituted by the gray values determined in this way for the pixels of the image of the original region.
  • When an acquired position coincides with a pixel of the rectangular image, the gray value of the original-region pixel can be set to the gray value of that pixel; otherwise it can be determined, for example, by interpolation based on the gray values of the pixels surrounding the corresponding position in the rectangular image.
  • An inverse conversion table representing the position in the rectangular image corresponding to each pixel of the image of the original region can likewise be created in advance and used in the position inverse conversion step.
  • Because the positions are looked up in the inverse conversion table rather than computed, the position in the rectangular image corresponding to each pixel of the original-region image can be obtained faster.
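For a ring-shaped region (the semiconductor wafer case described below), the position inverse conversion can be sketched as follows; the geometry, the function names, and the convention that columns follow the angle are assumptions made for illustration:

```python
# Illustrative sketch: map each pixel of the original ring-shaped region back
# to its (column, row) position in the rectangular image via angle and radius.
import math

def annulus_to_rect(px, py, cx, cy, r_in, r_out, n_cols, n_rows):
    """(col, row) in the rectangle for an annulus pixel (px, py).

    Columns follow the circular line (angle); rows follow the width
    direction, from the outer radius inward.
    """
    dx, dy = px - cx, py - cy
    theta = math.atan2(dy, dx) % (2 * math.pi)
    r = math.hypot(dx, dy)
    col = theta / (2 * math.pi) * n_cols
    row = (r_out - r) / (r_out - r_in) * (n_rows - 1)
    return col, row

# A pixel on the outer circle, a quarter turn around, maps to row 0 and a
# column a quarter of the way along the rectangle.
col, row = annulus_to_rect(100, 150, 100, 100, 40, 50, 360, 11)
print(col, row)
```

The resulting (col, row) is generally non-integer, which is exactly where the interpolation described above comes in.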
  • In the image acquisition step, an image including the surface image of a disk-shaped semiconductor wafer can be acquired as the processed image; in the edge image specifying step, an image of all or part of a ring-shaped region having a predetermined width in the direction orthogonal to a circular line set along the edge of the wafer surface image can be specified as the edge image; and in the inverse conversion step, the rectangular image subjected to the one-dimensional image processing can be inversely converted into an image of all or part of the original ring-shaped region to generate the processed edge image.
  • In this way, a processed edge image can be obtained in which all or part of the ring-shaped region containing the circular line has, in effect, been image-processed sequentially in the direction along that circular line.
  • In the edge image specifying step, an image of all or part of a ring-shaped region having a first predetermined width inside the circular line and a second predetermined width outside the circular line can be specified as the edge image.
  • The conversion step can be configured to convert the edge image, which is bounded by two division lines parallel to the edge line of the object image, into a rectangular image made up of pixels arranged in a rectangle, each pixel corresponding to one of the positions set at a predetermined interval along one division line and at a predetermined interval, from each such position, along the normal orthogonal to that division line toward the other division line.
  • Each pixel of the rectangular image thus corresponds to a position in the edge image bounded by the two division lines parallel to the edge line of the object image.
  • Alternatively, the conversion step can be configured to set the positions at a predetermined interval along the longer of the two division lines bounding the edge image across the edge line, and to set further positions at a predetermined interval, from each of those positions, along the normal orthogonal to that division line toward the shorter division line, converting the edge image into a rectangular image made up of pixels corresponding to these positions.
  • Because the positions are based on the longer of the two division lines, more positions are set in the edge image, and the edge image can be converted into a rectangular image composed of more pixels.
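For a ring-shaped edge image the outer circle is the longer division line, and the positions can be generated as in the following sketch (geometry and names are assumptions made for illustration):

```python
# Illustrative sketch: sample positions at a fixed angular interval along the
# longer (outer) division circle, and at a fixed interval from each of them
# along the inward normal toward the shorter (inner) circle. Each position
# becomes one pixel of the rectangular image.
import math

def ring_sample_positions(cx, cy, r_in, r_out, n_cols, n_rows):
    """n_rows x n_cols grid of (x, y) sample positions in the edge image."""
    grid = []
    for j in range(n_rows):               # width direction (along the normal)
        r = r_out - (r_out - r_in) * j / (n_rows - 1)
        row = []
        for i in range(n_cols):           # along the circular division line
            theta = 2 * math.pi * i / n_cols
            row.append((cx + r * math.cos(theta), cy + r * math.sin(theta)))
        grid.append(row)
    return grid

grid = ring_sample_positions(0.0, 0.0, 40.0, 50.0, 360, 11)
x0, y0 = grid[0][0]     # on the outer circle, at angle 0
x1, y1 = grid[10][90]   # on the inner circle, a quarter turn around
print((x0, y0), (x1, y1))
```

Sampling along the longer (outer) circle gives 360 columns here; sampling along the inner circle at the same interval would give fewer, so fewer pixels.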
  • A conversion table can be created in advance that associates each position along one division line in the edge image with the one column of rectangular-image pixels corresponding to all the positions set on the normal passing through that position, and the conversion step can include a step of determining the position in the edge image corresponding to each pixel of the rectangular image using this conversion table.
  • Because the conversion table holds one entry per position along the division line, rather than one entry for every position in the edge image, it can be configured on a smaller scale, which reduces the capacity of the memory needed to store it.
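The saving can be sketched as follows (the names and the choice of what each entry stores are assumptions): one table entry per position along the division line, from which the whole column of positions on the normal is regenerated:

```python
# Illustrative sketch: a per-column conversion table for a ring-shaped edge
# image. Each entry stores only the outer-circle point and the inward unit
# normal; all positions in that column are derived from the entry.
import math

def build_column_table(cx, cy, r_out, n_cols):
    """One entry per position along the outer division circle."""
    table = []
    for i in range(n_cols):
        theta = 2 * math.pi * i / n_cols
        start = (cx + r_out * math.cos(theta), cy + r_out * math.sin(theta))
        normal = (-math.cos(theta), -math.sin(theta))  # points to the centre
        table.append((start, normal))
    return table

def column_positions(entry, step, n_rows):
    """Expand one table entry into the n_rows positions of its column."""
    (sx, sy), (nx, ny) = entry
    return [(sx + nx * step * j, sy + ny * step * j) for j in range(n_rows)]

table = build_column_table(0.0, 0.0, 50.0, 360)
col0 = column_positions(table[0], step=1.0, n_rows=11)
print(len(table), col0[0], col0[10])
```

Here the table holds 360 entries instead of 360 * 11, so its memory footprint shrinks in proportion to the number of rows.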
  • Another image processing method according to the present invention includes an image acquisition step of acquiring a processed image bounded by two parallel division lines, and a conversion step of converting the processed image into a rectangular image made up of pixels arranged in a rectangle, each corresponding to one of the positions set at a predetermined interval along one of the two division lines and at a predetermined interval, from each such position, along the normal orthogonal to that division line toward the other division line.
  • A conversion table that associates each position along the division line in the processed image with the one column of rectangular-image pixels corresponding to all the positions set on the normal passing through that position can be created in advance, and the conversion step can include a step of determining the position in the processed image corresponding to each pixel of the rectangular image using this table.
  • Here too, because the conversion table holds one entry per position along the division line rather than one entry per pixel, it can be configured on a smaller scale, which reduces the capacity of the memory needed to store it.
  • An image processing apparatus according to the present invention includes a photographing unit that photographs the surface of an object, and a processing unit that processes the photographed image obtained by the photographing unit.
  • The processing unit includes: image acquisition means for acquiring a processed image that includes the object surface image composed of grayscale values in pixel units; edge image specifying means for specifying, as an edge image, an image of a region that includes an edge line of the object surface image set in the processed image and that has a predetermined width in the direction orthogonal to the edge line; conversion means for converting the edge image into a rectangular image having a width corresponding to the predetermined width; and image processing means for performing one-dimensional image processing that sequentially processes the gray values of a plurality of pixels arranged in the direction orthogonal to the width direction of the rectangular image.
  • As in the method described above, the rectangular image is processed sequentially in the direction along the straight line corresponding to the edge line of the object surface image.
  • The image processing apparatus can further include inverse conversion means for inversely converting the rectangular image subjected to the one-dimensional image processing into an image of the original region to generate a processed edge image.
  • Since the one-dimensional image processing of the rectangular image is performed in the direction along the straight line corresponding to the edge line set in the object surface image, the processed edge image obtained by the inverse conversion has, in effect, been image-processed along the edge line of the object surface image.
  • The conversion means can include position conversion means for acquiring the position in the edge image corresponding to each pixel of the rectangular image, and means for determining, based on the gray values of the pixels in the edge image, the gray value of the rectangular-image pixel corresponding to each acquired position.
  • The image processing apparatus can further include storage means that stores a conversion table representing the position in the edge image corresponding to each pixel of the rectangular image, the position conversion means acquiring those positions using the conversion table.
  • Because the processing unit looks the positions up in the stored conversion table rather than computing them, the position in the edge image corresponding to each pixel of the rectangular image can be obtained faster.
  • The photographing unit can photograph the surface of a disk-shaped semiconductor wafer, the image acquisition means can acquire an image including the wafer surface image as the processed image, the edge image specifying means can specify, as the edge image, all or part of an image of a ring-shaped region having a predetermined width in the direction orthogonal to a circular line set along the edge of the wafer surface image, and the inverse conversion means can be configured to inversely convert the rectangular image subjected to the one-dimensional image processing into an image of all or part of the original ring-shaped region to generate a processed edge image.
  • The processing unit can thereby obtain a processed edge image in which all or part of the ring-shaped region containing the circular line has, in effect, been image-processed sequentially in the direction along that circular line.
  • The edge image specifying means can specify, as the edge image, all or part of an image of a ring-shaped region having a first predetermined width inside the circular line and a second predetermined width outside the circular line.
  • The processing unit can thus specify, as the edge image, an image of all or part of a ring-shaped region that always includes the edge of the semiconductor wafer surface image.
  • Another image processing apparatus according to the present invention includes a photographing unit that photographs the surface of an object, and a processing unit that processes the photographed image obtained by the photographing unit.
  • The processing unit includes image acquisition means for acquiring, from the photographed image, an image to be processed bounded by two parallel division lines, and conversion means for converting the image to be processed into a rectangular image made up of pixels arranged in a rectangle, each corresponding to one of the positions set at a predetermined interval along one of the two division lines and at a predetermined interval, from each such position, along the normal orthogonal to that division line toward the other division line.
  • The apparatus can further include storage means that stores a conversion table associating each position along the division line in the image to be processed with the one column of rectangular-image pixels corresponding to all the positions set on the normal passing through that position, the conversion means determining the positions in the image to be processed using this table.
  • Because the conversion table holds one entry per position along the division line rather than one entry per pixel, it can be configured on a smaller scale, which reduces the capacity of the storage means.
  • A surface inspection apparatus according to the present invention includes an imaging unit that images the surface of an object, and a processing unit that processes the captured image obtained by the imaging unit and detects defects on the object surface.
  • The processing unit includes: image acquisition means for acquiring a processed image that includes the surface image of the object; edge image specifying means for specifying, as an edge image, an image of a region that includes an edge line of the object surface image set in the processed image and that has a predetermined width in the direction orthogonal to the edge line; conversion means for converting the edge image into a rectangular image having a width corresponding to the predetermined width; first image processing means for performing one-dimensional image processing that sequentially processes the grayscale values of a plurality of pixels arranged in the direction orthogonal to the width direction of the rectangular image; inverse conversion means for inversely converting the rectangular image subjected to the one-dimensional image processing to generate a processed edge image; second image processing means for generating a processed main-body image by performing, on the main-body image obtained by removing the edge image from the object surface image, two-dimensional image processing that sequentially processes the gray values of a plurality of two-dimensionally arranged pixels; and defect specifying means for specifying defective portions in the processed edge image and the processed main-body image.
  • In this surface inspection apparatus too, the image of the region that includes the edge line of the object surface image set in the image to be processed, and that has a predetermined width in the direction orthogonal to the edge line, is converted into a rectangular image having a corresponding width, and the rectangular image is subjected to one-dimensional image processing that sequentially processes the gray values of a plurality of pixels arranged in the direction orthogonal to its width.
  • Since that processing runs in the direction along the straight line corresponding to the edge line of the object surface image, the processed edge image obtained by inversely converting the processed rectangular image has, in effect, been processed sequentially in the direction along the edge line.
  • The processing unit generates the processed main-body image by applying, to the main-body image obtained by removing the edge image from the object surface image, two-dimensional image processing that sequentially processes the gray values of a plurality of two-dimensionally arranged pixels, and specifies defective portions in the processed edge image and the processed main-body image.
  • Because both the edge portion and the main body of the object surface image are inspected in their image-processed state, defects over the entire object surface can be inspected with higher accuracy.
  • In the surface inspection apparatus, the imaging unit can image the surface of a disk-shaped semiconductor wafer, and the processing unit can detect defects on the wafer surface.
  • In that case, the image acquisition means acquires an image including the wafer surface image as the processed image; the edge image specifying means specifies, as the edge image, all or part of a ring-shaped region having a predetermined width in the direction orthogonal to a circular line set along the edge of the wafer surface image; the inverse conversion means generates the processed edge image by inversely converting the rectangular image subjected to the one-dimensional image processing into an image of all or part of the original ring-shaped region; and the second image processing means generates the processed main-body image by performing the two-dimensional image processing on the main-body image.
  • As described above, according to the present invention, the edge image including the edge line of the object surface image specified in the image to be processed is converted into a rectangular image, and one-dimensional image processing is performed on the rectangular image.
  • Since this one-dimensional image processing sequentially processes the gray values of the pixels arranged in the direction orthogonal to the width direction of the rectangular image, it corresponds to performing image processing sequentially in the direction along the edge line; if the processed rectangular image is then inversely converted, an image that has undergone appropriate image processing in the region of the edge portion of the object surface image can easily be obtained.
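The whole flow can be illustrated end to end with the following minimal sketch; the nearest-neighbour sampling, the three-tap mean filter, and the ring geometry are all assumptions made for the example, not choices prescribed by the patent:

```python
# Illustrative sketch: convert a ring-shaped edge image to a rectangle,
# apply one-dimensional processing along each row (i.e. along the circular
# line), then inversely convert the result back to the original region.
import math

def process_ring(img, cx, cy, r_in, r_out, n_cols, n_rows):
    def pos(j, i):
        """Nearest pixel of the original image for rectangle cell (j, i)."""
        r = r_out - (r_out - r_in) * j / max(n_rows - 1, 1)
        t = 2 * math.pi * i / n_cols
        return round(cy + r * math.sin(t)), round(cx + r * math.cos(t))

    # Conversion: nearest-neighbour sampling of the ring into a rectangle.
    rect = [[img[pos(j, i)[0]][pos(j, i)[1]] for i in range(n_cols)]
            for j in range(n_rows)]
    # One-dimensional processing: circular 3-tap mean along each row.
    proc = [[(row[i - 1] + row[i] + row[(i + 1) % n_cols]) / 3
             for i in range(n_cols)] for row in rect]
    # Inverse conversion: write the processed gray values back.
    out = [list(r) for r in img]
    for j in range(n_rows):
        for i in range(n_cols):
            y, x = pos(j, i)
            out[y][x] = proc[j][i]
    return out

# 21x21 image, uniform gray value 50, with one bright defect pixel lying on
# the outer circular line (radius 8 around the centre (10, 10)).
img = [[50] * 21 for _ in range(21)]
img[10][18] = 200
smoothed = process_ring(img, 10, 10, 6, 8, n_cols=48, n_rows=3)
print(smoothed[10][18])
```

The defect is averaged only with its neighbours along the circular line, (50 + 200 + 50) / 3 = 100, while pixels away from the ring keep their original gray value.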
  • The drawings include: a figure showing the state in which the circular line is set at the edge of the wafer surface image in the captured image; a figure showing the ring-shaped region; FIG. 10B, an enlarged view of the angle range 0° to 90° in FIG. 10A; an enlarged figure showing the positions set on the outer periphery of the edge image; a figure showing the positions set on the normal line NL; an enlarged figure showing the positions set on the normal line NL; a figure showing the relationship between each pixel of the rectangular image and the positions in the edge image; a figure showing an example of the information contained in the conversion table; and a figure showing the relationship between the rectangular image and the image of the ring-shaped region (edge image).
  • An image processing apparatus that performs image processing according to the image processing method of the present invention is configured, for example, in a surface inspection apparatus for a semiconductor wafer.
  • the mechanism part of this surface inspection apparatus is configured as shown in FIGS.
  • FIG. 1 is a side view showing the configuration of the mechanism
  • FIG. 2 is a front view thereof.
  • the surface inspection apparatus has a base 50, and a carriage 60 and a shift movement mechanism 70 are provided in the base 50 (see particularly FIG. 1).
  • The shift moving mechanism 70 includes a stage 71, and the carriage 60 can reciprocate by running on the stage 71 (left and right in FIG. 1, perpendicular to the paper surface in FIG. 2).
  • the shift movement mechanism 70 also has a drive unit 72 including a motor, a gear mechanism, and the like.
  • The drive unit 72 moves the stage 71 in a direction perpendicular to the self-running direction of the carriage 60 (perpendicular to the paper surface in FIG. 1, left and right in FIG. 2).
  • the carriage 60 is provided with a support mechanism 20 that supports a semiconductor wafer (hereinafter simply referred to as a wafer) 10 to be inspected, and the support mechanism 20 protrudes from the upper surface of the base 50.
  • the support mechanism 20 includes a circular table 21 on which the wafer 10 is set, and two support legs 22 a and 22 b that support the table 21 side by side along a direction orthogonal to the moving direction of the carriage 60. These support legs 22 a and 22 b are fixed to the carriage 60.
  • A guide mechanism is provided on the upper surface of the base 50 to guide the support legs 22a and 22b in the moving direction when the carriage 60 self-runs, and
  • a shift guide mechanism is provided to guide the support legs 22a and 22b in the shift direction of the stage 71.
  • An arch-shaped frame 55 is provided at a substantially central portion in the self-running direction of the carriage 60 on the upper surface of the base 50.
  • a camera unit 30 is provided in a substantially central portion of the frame 55 facing downward.
  • An illumination unit 31 is provided in the vicinity of the installation position of the camera unit 30 on the frame 55. The illumination unit 31 illuminates the surface of the wafer 10 supported by the support mechanism 20 that moves below the frame 55.
  • the camera unit 30 has a line sensor (for example, a one-dimensional CCD image sensor) and photographs the surface of the wafer 10 supported by the support mechanism 20 that moves as the carriage 60 moves. Specifically, as shown in FIG. 3, the camera unit 30 images the entire surface of the wafer 10 by four scans.
  • the camera unit 30 scans a predetermined width (corresponding to the length of the line sensor) on the surface of the wafer 10 in the direction S1.
  • the carriage 60 is shifted in the direction B to the shift position Ps2 by the shift moving mechanism 70.
  • the camera unit 30 scans the wafer 10 in the direction S2 (reverse to the direction S1).
  • the carriage 60 is further shifted in the direction B to the shift position Ps3 by the shift mechanism 70.
  • the camera unit 30 scans the wafer 10 in the direction S1.
  • the carriage is further shifted to the shift position Ps4 by the shift mechanism 70.
  • the camera unit 30 scans the wafer 10 in the direction S2.
  • In this way, the camera unit 30 scans the surface of the wafer 10 four times while the shift moving mechanism 70 repeats the shift movement in the direction B. In the process, the camera unit 30 images the surface of the wafer 10 and sequentially outputs image signals.
  • the control system of the surface inspection apparatus is configured as shown in FIG.
  • The processing unit 100 inputs the image signal from the camera unit 30 described above and processes it as information representing the surface image of the wafer 10. A drive control unit 120 is connected to the processing unit 100, and under the control of the processing unit 100 it drives the moving mechanism 200, which includes the carriage 60 and the shift moving mechanism 70 described above. An operation unit 111 and a display unit 112 are also connected to the processing unit 100. The processing unit 100 receives operation signals from the operation unit 111 operated by the user and executes various processes based on them. It also causes the display unit 112 to display various types of information, for example a captured image including the surface image of the wafer 10, based on the image data obtained by the processing.
  • First, processing related to the registration of various information used in the image processing of the wafer surface image is performed.
  • In this registration process, the surface of a wafer 10 representative of the wafers to be inspected is photographed by the camera unit 30 in the mechanism described above (see FIGS. 1 to 3), and the processing unit 100, which inputs the image signal from the camera unit 30, executes processing according to the procedure shown in FIG. 5.
  • The processing unit 100 converts the image signals sequentially supplied from the camera unit 30 into image data composed of gray values in pixel units, and acquires, as the image to be processed, a captured image including a wafer surface image (data) representing the entire surface of the wafer 10 (S1).
  • The captured image including the wafer surface image is developed in an image memory (not shown).
  • As shown in FIG. 6A, when the processing unit 100 acquires a captured image (image to be processed) I including a wafer surface image I10, the position of the captured image I on the image memory is corrected with respect to the X (horizontal coordinate axis), Y (vertical coordinate axis), and θ (counterclockwise angle) references (S2). For example, as shown in FIG. 6B,
  • the position is corrected so that the notch portion I11 of the wafer surface image I10 points in a predetermined direction.
  • The processing unit 100 then causes the display unit 112 to display the captured image I including the position-corrected wafer surface image I10.
  • For the captured image I displayed on the display unit 112, the user operates the operation unit 111 to set a circular line C0 (edge line, shown dashed) along the edge Edg of the wafer surface image I10, for example as shown in FIG. 6C.
  • The processing unit 100 obtains the radius R of the set circular line C0 and the position (Xo, Yo) of its center O (S3).
  • The processing unit 100 sets, concentrically with the circular line C0 specified by the radius R and the center position (Xo, Yo), an inner circular division line C1 of radius (R − L1) and an outer circular division line C2 of radius (R + L2).
  • The processing unit 100 then recognizes the ring-shaped region E Ling partitioned by the inner circular division line C1 and the outer circular division line C2, which has a width L1 from the circular line C0 toward the inside of the wafer surface image I10 and a width L2 toward the outside,
  • and the image within the ring-shaped region E Ling in the captured image I is specified as the edge image IE (S4).
  • The widths L1 and L2 are set, for example, by the user operating the operation unit 111.
  • The processing unit 100 creates a conversion table (HLUT) to be used for converting the edge image IE of the ring-shaped region E Ling into a rectangular image IR having a width corresponding to the width (L1 + L2) (S5).
  • This conversion table (HLUT) associates each position of the ring-shaped edge image IE, whose outer circumference is 2π(R + L2) (the length of the outer circular division line C2) and whose inner circumference is 2π(R − L1) (the length of the inner circular division line C1), with a pixel Px of a rectangular image IR whose length corresponds to the outer circumference 2π(R + L2) and whose width corresponds to (L1 + L2), as shown in FIG. 8.
  • Specifically, each position of the first edge image portion IE (1) in the first angle range (0° ≤ θ ≤ 90°) of the ring-shaped edge image IE is associated with a pixel of the first rectangular image portion IR (1), whose length corresponds to the first angle range within the length 2π(R + L2); similarly, each position of the second edge image portion IE (2) in the second angle range (90° ≤ θ ≤ 180°) is associated with a pixel of the second rectangular image portion IR (2), whose length corresponds to the second angle range.
  • Likewise, each position of the third edge image portion IE (3) in the third angle range (180° ≤ θ ≤ 270°) is associated with a pixel of the third rectangular image portion IR (3), whose length corresponds to the third angle range,
  • and each position of the fourth edge image portion IE (4) in the fourth angle range (270° ≤ θ ≤ 360°, 0°) is associated with a pixel of the fourth rectangular image portion IR (4), whose length corresponds to the fourth angle range.
  • the conversion table represents the position of the edge image IE corresponding to each pixel of the rectangular image IR, and is created as follows.
  • The coordinate values of these positions are Pj (Xj, Yj), Pj1 (Xj − ΔXj, Yj − ΔYj), ..., Pjm (Xj − m·ΔXj, Yj − m·ΔYj), where (ΔXj, ΔYj) is the displacement per position interval along the normal line.
  • Each position set in the edge image IE is associated with a pixel of the rectangular image IR shown in FIG. 8. Specifically, as shown in FIG. 13, the n pixels Px0 to Pxn-1 aligned along the side V(C2) of the rectangular image IR, which corresponds to the outer peripheral edge (outer circular division line C2) of the edge image IE, are associated with the positions P0 (X0, Y0) to Pn-1 (Xn-1, Yn-1) set along that outer peripheral edge.
  • The n pixels Px01 to Px(n-1)1 arranged one pixel inward from the side V(C2) of the rectangular image IR are associated with the positions one interval inward along the normal lines from the outer peripheral edge (outer circular division line C2) of the edge image IE.
  • Taking this relationship into account, the conversion table (HLUT) records, as shown in FIG. 14, for the pixel columns (Px0 to Px0m), (Px1 to Px1m), ..., (Pxn-1 to Px(n-1)m) running in the width direction of the rectangular image IR, the positions P0 to Pn-1 set along the outer peripheral edge (outer circular division line C2) of the edge image IE together with the displacements along the corresponding normal lines.
  • Thus each position of the edge image IE corresponding to the pixels (Px0 to Px0m) of the first column in the width direction of the rectangular image IR is (X0, Y0), (X0 − ΔX0, Y0 − ΔY0), ..., (X0 − m·ΔX0, Y0 − m·ΔY0), and
  • each position of the edge image IE corresponding to the pixels (Pxi to Pxim) of the (i + 1)-th column is (Xi, Yi), (Xi − ΔXi, Yi − ΔYi), ..., (Xi − m·ΔXi, Yi − m·ΔYi).
  • Because the conversion table (HLUT) holds, not the correspondence between every position set in the edge image IE and every pixel of the rectangular image IR as in FIG. 13, but, as in FIG. 14, only each position P0 (X0, Y0) to Pn-1 (Xn-1, Yn-1) along the outer edge line (outer circular division line C2) of the edge image IE together with the position interval set on the normal line passing through each position, the conversion table (HLUT) can be configured on a smaller scale, and as a result the memory capacity required to store it can be reduced.
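The space saving described here can be sketched as follows. This is a hypothetical layout, not the patent's exact table: the names (`build_hlut`, `delta`) are assumptions. Only one outer-edge point and one normal-direction step are stored per column, and any table entry is expanded on demand:

```python
import numpy as np

def build_hlut(cx, cy, r_out, n, delta):
    """Compact forward table: one outer-edge point per column plus the
    per-step displacement along the inward normal (interval delta)."""
    theta = 2 * np.pi * np.arange(n) / n
    px = cx + r_out * np.cos(theta)       # P_j on the outer division line
    py = cy + r_out * np.sin(theta)
    dx = delta * np.cos(theta)            # inward step = delta * unit normal
    dy = delta * np.sin(theta)
    return px, py, dx, dy

def hlut_position(hlut, j, i):
    """Image position for pixel (row i, column j) of the rectangular image:
    start at the outer-edge point of column j and step i intervals inward."""
    px, py, dx, dy = hlut
    return px[j] - i * dx[j], py[j] - i * dy[j]
```

The table thus stores O(n) values instead of O(n·m) coordinate pairs, at the cost of one multiply-subtract per lookup.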
  • Next, an inverse conversion table (NLUT) is created, which associates each position of the rectangular image IR, whose length corresponds to 2π(R + L2) and whose width corresponds to (L1 + L2), with the pixels of the ring-shaped edge image IE having the outer circumference 2π(R + L2) and the inner circumference 2π(R − L1).
  • In this inverse conversion table, each position of the first rectangular image portion IR (1), whose length corresponds to the first angle range (0° ≤ θ ≤ 90°) within the length 2π(R + L2), is associated with a pixel of the first edge image portion IE (1) in the first angle range of the ring-shaped edge image IE; similarly, each position of the second rectangular image portion IR (2), whose length corresponds to the second angle range (90° ≤ θ ≤ 180°), is associated with a pixel of the second edge image portion IE (2) in the second angle range of the edge image IE.
  • The inverse conversion table likewise associates each position of the third rectangular image portion IR (3), whose length corresponds to the third angle range (180° ≤ θ ≤ 270°), with a pixel of the third edge image portion IE (3), and
  • each position of the fourth rectangular image portion IR (4), whose length corresponds to the fourth angle range (270° ≤ θ ≤ 360°, 0°), with a pixel of the fourth edge image portion IE (4) in the fourth angle range of the edge image IE.
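Conceptually, the inverse table encodes the opposite mapping. A minimal sketch (the function name and the fractional-index convention are assumptions) computes, for a pixel of the ring, the column (angle) and row (depth along the normal) it corresponds to in the rectangular image:

```python
import numpy as np

def nlut_indices(x, y, cx, cy, r_out, n, delta):
    """Inverse mapping: ring pixel (x, y) -> (column j, row i) in a
    rectangular image of n columns whose rows are spaced delta along
    the inward normal from the outer radius r_out."""
    theta = np.arctan2(y - cy, x - cx) % (2 * np.pi)
    j = theta / (2 * np.pi) * n                       # fractional column
    i = (r_out - np.hypot(x - cx, y - cy)) / delta    # fractional row
    return j, i
```

The fractional results are exactly the positions that the interpolation step described below resolves to gray values.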
  • To verify the registered contents, the processing unit 100 converts the edge image IE specified in the captured image I including the wafer surface image I10 into a rectangular image IR using the conversion table (HLUT), and then converts the resulting rectangular image IR back into the image of the original ring-shaped region E Ling (edge image IE) using the inverse conversion table (NLUT).
  • A captured image I including the image obtained by the inverse conversion is displayed on the display unit 112 (S7). If there is no problem in the displayed captured image I, particularly in the image portion of the ring-shaped region E Ling, the user performs a predetermined registration execution operation with the operation unit 111.
  • When the processing unit 100 receives the operation signal of the registration execution operation from the operation unit 111, it registers in an internal memory: the center position O (Xo, Yo) and radius R specifying the circular line C0 (edge line) set along the edge of the wafer surface image I10 as described above; the widths L1 and L2 specifying the boundary lines (C1, C2) of the ring-shaped region E Ling; the conversion table (HLUT); the inverse conversion table (NLUT); and the circular main body region of radius (R − L1) obtained by removing the ring-shaped region E Ling from the circular region of the wafer surface image I10 centered on (Xo, Yo) (S8). The process then ends. This registration process can be performed for each diameter of wafer to be inspected.
  • If there is a problem, the user photographs the surface of the wafer 10 again, determines the circular line C0 and the widths L1 and L2 for the wafer surface image I10 anew, and regenerates the conversion table (HLUT) and the inverse conversion table (NLUT).
  • When the registration process has ended, the surface inspection apparatus can perform the surface inspection of each wafer 10.
  • In the inspection, the surface of the wafer 10 to be inspected is photographed by the camera unit 30 in the mechanism described above (see FIGS. 1 to 3), and the processing unit 100, which inputs the image signal from the camera unit 30, executes processing according to the illustrated procedure.
  • The processing unit 100 converts the image signals sequentially supplied from the camera unit 30, which images the surface of the wafer 10 to be inspected, into image data composed of gray values in pixel units, and acquires, as the image to be processed, a captured image including a wafer surface image (data) representing the entire surface of the wafer 10 (S11).
  • the image to be processed including the wafer surface image is developed in an image memory (not shown).
  • When the processing unit 100 acquires a captured image I including a wafer surface image I20 as shown in FIG. 17A,
  • the position of the captured image I on the image memory is corrected with respect to the X (horizontal coordinate axis), Y (vertical coordinate axis), and θ (counterclockwise angle) references (S12).
  • Specifically, the notch portion I21 of the wafer surface image I20 is oriented in a predetermined direction, so that the wafer surface image I20 is positioned in the same manner as during the registration process (see FIGS. 5 and 6B).
  • The position-corrected captured image I is developed on the image memory. In this example, a defective portion D1 exists in the outer edge portion of the wafer surface image I20.
  • Based on the registered center position (Xo, Yo) and radius R specifying the circular line C0, and the registered widths L1 and L2 specifying the ring-shaped region E Ling, the processing unit 100
  • maps the inspection regions onto the position-corrected wafer surface image I20 (S13).
  • By this mapping, the image of the ring-shaped region E Ling is specified as the edge image IE, and the circular main body region (the wafer main body image of radius (R − L1)) obtained by removing the ring-shaped region E Ling (edge image IE) from the region of the wafer surface image I20 is also specified.
  • the processing unit 100 converts the edge image IE, which is an image of the ring-shaped region E Ling obtained by the mapping, into a rectangular image IR using the conversion table (HLUT) (see FIGS. 13 and 14) described above. (S14).
  • Specifically, the position in the ring-shaped edge image IE corresponding to each pixel of the rectangular image IR (see FIG. 8) is acquired from the conversion table (HLUT).
  • The gray value at the acquired position in the edge image IE is then obtained, and that gray value is determined as the gray value of the pixel of the rectangular image IR corresponding to the position.
  • When the position in the edge image IE coincides with a pixel of the edge image IE, the gray value of that pixel is used as the gray value of the corresponding pixel of the rectangular image IR. When the position does not coincide with a pixel of the edge image IE, the gray value at the position is determined by interpolation using the gray values of the surrounding pixels, and that interpolated value is used as the gray value of the corresponding pixel of the rectangular image IR.
  • As the gray value of each pixel of the rectangular image IR is determined from the gray value of the position in the edge image IE associated with it by the conversion table (HLUT), the edge image IE is converted into the rectangular image IR.
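The interpolation step can be sketched like this. Bilinear interpolation over the four surrounding pixels is one common choice; the patent only says the gray value is interpolated from the surrounding pixels, so this specific scheme is an assumption:

```python
import numpy as np

def bilinear(img, x, y):
    """Gray value at a fractional position (x, y), interpolated from the
    four surrounding pixels; used when a table position falls between
    pixel centres. Coordinates are clamped to stay inside the image."""
    x0 = min(int(np.floor(x)), img.shape[1] - 2)
    y0 = min(int(np.floor(y)), img.shape[0] - 2)
    fx, fy = x - x0, y - y0
    return ((1 - fx) * (1 - fy) * img[y0, x0]
            + fx * (1 - fy) * img[y0, x0 + 1]
            + (1 - fx) * fy * img[y0 + 1, x0]
            + fx * fy * img[y0 + 1, x0 + 1])
```

When the position coincides with a pixel centre, the weights collapse so that the pixel's own gray value is returned, matching the two cases described above.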
  • By this conversion, the first edge image portion IE (1) in the first angle range (0° ≤ θ ≤ 90°) of the edge image IE is converted into the first rectangular image portion IR (1), whose length corresponds to the first angle range within the length 2π(R + L2), and the second edge image portion IE (2) in the second angle range (90° ≤ θ ≤ 180°) is similarly converted into the second rectangular image portion IR (2), whose length corresponds to the second angle range.
  • Further, the third edge image portion IE (3) in the third angle range (180° ≤ θ ≤ 270°) of the ring-shaped edge image IE is converted into the third rectangular image portion IR (3), whose length corresponds to the third angle range, and the fourth edge image portion IE (4) in the fourth angle range (270° ≤ θ ≤ 360°, 0°) is converted into the fourth rectangular image portion IR (4), whose length corresponds to the fourth angle range. In this case, the defective portion D1 present in the second edge image portion IE (2) is converted into the defective portion DR1 of the corresponding second rectangular image portion IR (2).
  • When the rectangular image IR has been generated, the processing unit 100 performs one-dimensional filter processing on the rectangular image IR in order to emphasize defective portions (S15).
  • In this one-dimensional filter processing, the gray value of each pixel of interest is determined based on the gray values of the pixels lined up before and after it in the direction orthogonal to the width direction of the rectangular image IR, that is, in the direction parallel to the line V (C0) corresponding to the circular line C0 set as the edge line of the edge image IE (see FIG. 8).
  • By such one-dimensional filter processing, a processed rectangular image IR P in which the defective portions are emphasized is obtained. Specifically, as shown in FIG. 19,
  • the first processed rectangular image portion IR P (1) is generated from the first rectangular image portion IR (1) corresponding to the first angle range (0° ≤ θ ≤ 90°),
  • the second processed rectangular image portion IR P (2) from the second rectangular image portion IR (2) corresponding to the second angle range (90° ≤ θ ≤ 180°), the third processed rectangular image portion IR P (3) from the third rectangular image portion IR (3) corresponding to the third angle range (180° ≤ θ ≤ 270°),
  • and the fourth processed rectangular image portion IR P (4) from the fourth rectangular image portion IR (4) corresponding to the fourth angle range (270° ≤ θ ≤ 360°, 0°).
  • At this time, the defective portion DR1 included in the second rectangular image portion IR (2) is emphasized and appears as the defective portion DR1 P in the second processed rectangular image portion IR P (2).
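A simple filter of this kind can be sketched as follows. The patent does not specify the kernel, so the moving-average high-pass below is an assumed example: within each row it subtracts a local average taken along the direction of the line V (C0), cancelling the slow gray-level change along the edge while leaving local defects emphasized:

```python
import numpy as np

def highpass_1d(rect, k=5):
    """Per-row high-pass along the unwrapped edge direction: each pixel
    minus a k-pixel moving average of its row (axis 1 = circumference)."""
    kernel = np.full(k, 1.0 / k)
    smooth = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="same"), 1, rect)
    return rect - smooth
```

Because each row is a fixed depth from the wafer edge, the filter never mixes gray values across the edge boundary, which is exactly the point of the rectangular conversion.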
  • Next, the processing unit 100 uses the inverse conversion table (NLUT) to inversely convert the processed rectangular image IR P into an image of the original ring-shaped region E Ling (S16).
  • Specifically, the position in the processed rectangular image IR P corresponding to each pixel of the original ring-shaped region E Ling is acquired from the inverse conversion table (NLUT).
  • The gray value at the acquired position in the processed rectangular image IR P is then obtained, and that gray value is determined as the gray value of the pixel of the ring-shaped region E Ling corresponding to the position.
  • When the position coincides with a pixel of the processed rectangular image IR P, the gray value of that pixel is used as the gray value of the corresponding pixel of the ring-shaped region E Ling. When the position does not coincide with a pixel, the gray value at the position is determined by interpolation using the gray values of the surrounding pixels, and that value is used as the gray value of the corresponding pixel of the ring-shaped region E Ling.
  • In this way, the processed rectangular image IR P is converted into an image of the ring-shaped region E Ling, and a processed edge image IE P is obtained.
  • By this inverse conversion, the first processed rectangular image portion IR P (1) corresponding to the first angle range (0° ≤ θ ≤ 90°) is inversely converted into the first processed edge image portion IE P (1) in the same angle range,
  • the second processed rectangular image portion IR P (2) corresponding to the second angle range (90° ≤ θ ≤ 180°) into the second processed edge image portion IE P (2), the third processed rectangular image portion IR P (3) corresponding to the third angle range (180° ≤ θ ≤ 270°)
  • into the third processed edge image portion IE P (3), and the fourth processed rectangular image portion IR P (4) corresponding to the fourth angle range (270° ≤ θ ≤ 360°, 0°) into the fourth processed edge image portion IE P (4) in the same angle range.
  • At this time, the defective portion DR1 P emphasized in the second processed rectangular image portion IR P (2) is converted into the emphasized defective portion D1 P in the second processed edge image portion IE P (2).
  • Following the generation of the processed edge image IE P described above, the processing unit 100 performs
  • two-dimensional filter processing on the image of the circular main body region obtained by removing the ring-shaped region E Ling from the region of the wafer surface image I20 in the mapping (see S13) (S17). Specifically, in the image of the circular main body region (wafer main body image), the gray value of each pixel of interest is determined based on the gray values of its two-dimensionally arranged surrounding pixels (for example, the eight pixels surrounding the pixel of interest). By this two-dimensional filter processing, a processed wafer main body image in which the defective portions are emphasized is obtained.
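The two-dimensional counterpart for the body region can be sketched as follows, again with an assumed kernel: each interior pixel minus the mean of its eight neighbours, matching the "eight pixels surrounding the pixel of interest" example:

```python
import numpy as np

def highpass_2d(img):
    """3x3 high-pass for the circular body region: each interior pixel
    minus the mean of its eight neighbours, computed with shifted views."""
    out = np.zeros_like(img, dtype=float)
    nb = (img[:-2, :-2] + img[:-2, 1:-1] + img[:-2, 2:] +
          img[1:-1, :-2] +                 img[1:-1, 2:] +
          img[2:, :-2]  + img[2:, 1:-1]  + img[2:, 2:]) / 8.0
    out[1:-1, 1:-1] = img[1:-1, 1:-1] - nb
    return out
```

Away from the edge of the wafer there is no circular boundary to respect, so an isotropic two-dimensional neighbourhood is appropriate here, unlike in the ring region.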
  • The processing unit 100 then combines the processed edge image IE P and the processed wafer main body image to generate a single processed wafer surface image I20 P (S18), and performs defect detection processing on the generated processed wafer surface image I20 P (S19). For example, defective portions are identified based on the gray-value distribution of the processed wafer surface image I20 P.
  • The processing unit 100 obtains, from the defect detection processing, the information specifying the defective portions, that is, defect information (for example, the defect number (No.), the center coordinates of the defective portion, the area of the defective portion, and the flatness of the defective portion),
  • and causes the display unit 112 to display the defective portions D1 P, D2 P, D3 P, D4 P specified by the defect information together with the processed wafer surface image I20 P.
  • Among these, the defective portion D1 P is identified on the processed edge image IE P,
  • while the other defective portions D2 P, D3 P, and D4 P are identified on the processed wafer main body image.
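As an illustration of step S19, a toy defect-detection pass could look like the following: thresholding plus 4-connected labelling. The threshold, the connectivity, and the reported fields are assumptions, since the patent only states that defective portions are identified from the gray-value distribution and characterized by number, center coordinates, and area:

```python
import numpy as np

def detect_defects(proc, thresh):
    """Threshold the emphasized gray values, then report each 4-connected
    blob's pixel count and centroid (stand-ins for the defect area and
    center coordinates in the defect information)."""
    mask = np.abs(proc) > thresh
    seen = np.zeros_like(mask, dtype=bool)
    defects = []
    for sy, sx in zip(*np.nonzero(mask)):
        if seen[sy, sx]:
            continue
        stack, blob = [(sy, sx)], []
        seen[sy, sx] = True
        while stack:                      # iterative flood fill
            y, x = stack.pop()
            blob.append((y, x))
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    stack.append((ny, nx))
        ys, xs = zip(*blob)
        defects.append({"area": len(blob),
                        "center": (float(np.mean(xs)), float(np.mean(ys)))})
    return defects
```

Running this on the combined processed wafer surface image would yield one record per emphasized defective portion, analogous to the defect information listed above.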
  • As described above, the edge image IE, which includes the circular line C0 along the edge of the wafer surface image specified in the captured image I (image to be processed), is converted into the rectangular image IR,
  • and one-dimensional filter processing is performed on the rectangular image IR. Since this one-dimensional filter processing operates in the direction along the straight line V (C0) corresponding to the circular line C0 set with respect to the edge of the wafer surface image I20, the processed edge image IE P obtained by inversely converting the processed rectangular image IR P is effectively an image that has been filtered along the edge line of the wafer surface image I20. Accordingly, an image that has undergone appropriate filter processing in the region of the edge portion of the wafer surface image I20 (the processed edge image IE P) can be obtained easily.
  • As a result, the defective portion D1 P in the edge portion of the wafer surface image I20 can be detected as accurately as the defective portions D2 P, D3 P, and D4 P in the inner wafer main body image.
  • In the example described above, the image of the ring-shaped region E Ling containing the circular line C0 set as the edge line in the wafer surface image I20 is treated as the edge image IE, but it is also possible to process only a part of the image of the ring-shaped region E Ling as the edge image IE. Even when the edge line of the object surface is not a circle but an arbitrary curve, an edge line along that curve can be set, and the image of a region having a predetermined width in the direction orthogonal to the edge line can be specified as the edge image.
  • In that case, the edge image is converted into a rectangular image based on the correspondence between each position of the edge image along the arbitrary curve and each pixel of the rectangular image, and the processed rectangular image is inversely converted into an image of the original curved region (a processed edge image).
  • The one-dimensional image processing performed on the rectangular image IR is not limited to filter processing; any processing that sequentially processes the gray values of a plurality of pixels arranged in the direction orthogonal to the width direction of the rectangular image IR may be used.
  • Processing of defects detected in the processed rectangular image IR P can also be performed.
  • In the embodiment described above, the image to be processed is a captured image of a semiconductor wafer, but the image to be processed is not particularly limited as long as it includes a surface image of some object.
  • As described above, the image processing method and the image processing apparatus according to the present invention have the effect that an image subjected to appropriate image processing can be obtained in the region of the edge portion of an object surface image; they are useful as an image processing method and an image processing apparatus for processing object surface images composed of gray values in pixel units, and are suitable for a surface inspection apparatus.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Biochemistry (AREA)
  • Immunology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The invention relates to an image processing method for obtaining an appropriately processed image of the edge portion of an object surface. The image processing method comprises an image acquisition step (S11) of acquiring an image to be processed that includes the surface image of an object and is represented by a gray value for each pixel, an edge image specification step (S13) of specifying, as an edge image, the image of a region that contains the edge line set for the object surface image in the image to be processed and has a predetermined width in the direction orthogonal to the edge line, a conversion step (S14) of converting the edge image into a rectangular image having a width corresponding to the predetermined width, an image processing step (S15) of performing one-dimensional image processing on the rectangular image in which the gray values of the pixels arranged in the direction orthogonal to the width direction are processed sequentially, and an inverse conversion step (S16) of generating a processed edge image by inversely converting the rectangular image subjected to the one-dimensional image processing into the image of the original region.
PCT/JP2009/067808 2008-10-14 2009-10-14 Image processing method, image processing device, and surface inspection device using the image processing device WO2010044433A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020117006596A KR101227706B1 (ko) 2008-10-14 2009-10-14 화상 처리 방법, 화상 처리 장치 및 이 화상 처리 장치를 이용한 표면 검사 장치
CN2009801404438A CN102177428B (zh) 2008-10-14 2009-10-14 图像处理方法、图像处理装置及使用该图像处理装置的表面检查装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008-265808 2008-10-14
JP2008265808 2008-10-14

Publications (1)

Publication Number Publication Date
WO2010044433A1 true WO2010044433A1 (fr) 2010-04-22

Family

ID=42106597

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/067808 WO2010044433A1 (fr) 2008-10-14 2009-10-14 Image processing method, image processing device, and surface inspection device using the image processing device

Country Status (4)

Country Link
JP (1) JP2010118046A (fr)
KR (1) KR101227706B1 (fr)
CN (1) CN102177428B (fr)
WO (1) WO2010044433A1 (fr)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011110783A1 * 2011-08-22 2013-02-28 Focke & Co. (Gmbh & Co. Kg) Method and device for testing rod-shaped tobacco products
CN103149215B * 2013-02-27 2016-04-13 中国计量学院 Tempered glass insulator defect detection method and device
CN106767425B * 2016-11-07 2019-07-26 无锡市莱科自动化科技有限公司 Visual measurement method for the notch of a bearing circlip
CN107680048B * 2017-09-05 2020-11-10 信利(惠州)智能显示有限公司 Edge display effect processing method
CN108844471B * 2018-08-02 2019-05-07 成都天衡智造科技有限公司 Method and device for measuring the length by which a concave region at the edge of a circular workpiece extends toward the circle center
CN112884769B * 2021-04-12 2021-09-28 深圳中科飞测科技股份有限公司 Image processing method and device, optical system, and computer-readable storage medium
CN115775241B * 2022-12-04 2023-07-07 武汉惠强新能源材料科技有限公司 Casting thickness uniformity detection method for lithium battery separator production

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0772097A * 1993-09-03 1995-03-17 Dainippon Printing Co Ltd Defect inspection method
JPH0968502A * 1995-08-30 1997-03-11 Dainippon Screen Mfg Co Ltd Inspection method and inspection apparatus for perforated plates
JPH10311776A * 1997-03-13 1998-11-24 Asahi Optical Co Ltd Optical member inspection apparatus
JP2000136916A * 1998-10-15 2000-05-16 Wacker Siltronic Corp Method and apparatus for detecting, monitoring, and characterizing edge defects on semiconductor wafers
JP2001221749A * 2000-02-10 2001-08-17 Hitachi Ltd Observation apparatus and observation method
JP2002328094A * 2001-05-02 2002-11-15 Nidec Tosok Corp LED ring illumination and image inspection apparatus equipped with it
JP2007234932A * 2006-03-02 2007-09-13 Olympus Corp Appearance inspection apparatus
JP2008216248A * 2007-02-28 2008-09-18 Vistec Semiconductor Systems Gmbh Method for acquiring high-resolution images of defects on the upper surface of a wafer edge
US20080309927A1 * 2007-06-15 2008-12-18 Qimonda Ag Wafer inspection system and method
JP2009133797A * 2007-12-03 2009-06-18 Shibaura Mechatronics Corp Substrate surface inspection apparatus and substrate surface inspection method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0718811B2 * 1989-03-15 1995-03-06 松下電工株式会社 Defect inspection method
JP2803930B2 * 1991-11-22 1998-09-24 株式会社三協精機製作所 Circular pattern identification method and apparatus
JP3580088B2 * 1997-06-25 2004-10-20 松下電工株式会社 Appearance inspection method
JP2002325246A * 2001-04-25 2002-11-08 Univ Waseda Relay broadcasting system for stadiums
JP3385276B2 * 2001-08-27 2003-03-10 東芝アイティー・ソリューション株式会社 Shape inspection apparatus for circular bodies
JP2007147441A * 2005-11-28 2007-06-14 Nikon Corp Inspection apparatus

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103530885A * 2013-10-23 2014-01-22 北京倍肯恒业科技发展有限责任公司 One-dimensional image adaptive hierarchical edge detection and extraction algorithm
CN103530885B * 2013-10-23 2015-10-07 北京倍肯恒业科技发展有限责任公司 One-dimensional image adaptive hierarchical edge detection and extraction method
US20220208580A1 * 2020-12-28 2022-06-30 Semes Co., Ltd. Wafer inspection method

Also Published As

Publication number Publication date
JP2010118046A (ja) 2010-05-27
KR20110057170A (ko) 2011-05-31
CN102177428A (zh) 2011-09-07
KR101227706B1 (ko) 2013-01-29
CN102177428B (zh) 2013-01-02

Similar Documents

Publication Publication Date Title
WO2010044433A1 (fr) Image processing method, image processing device, and surface inspection device using the image processing device
JP5997039B2 (ja) Defect inspection method and defect inspection apparatus
JP4652391B2 (ja) Pattern inspection apparatus and pattern inspection method
JP5193113B2 (ja) MTF measuring apparatus and MTF measuring program
US7359546B2 (en) Defect inspection apparatus and defect inspection method
JP2010118046A5 (fr)
JP5296967B2 (ja) Three-dimensional shape measuring apparatus
JP2008098968A (ja) Defect inspection apparatus
IL189010A (en) Advanced cell-to-cell inspection
JP4906128B2 (ja) Camera unit inspection apparatus and camera unit inspection method
JP6241052B2 (ja) Image processing system and image processing program
JP7017207B2 (ja) Image inspection apparatus and image inspection method
JP2006276454A (ja) Image correction method and pattern defect inspection method using the same
JP2008040705A (ja) Blur filter design method
JP2004212221A (ja) Pattern inspection method and pattern inspection apparatus
JP2004191112A (ja) Defect inspection method
JP5178781B2 (ja) Sensor output data correction apparatus and sensor output data correction method
JP2009031006A (ja) Appearance inspection apparatus and method
JP4277026B2 (ja) Pattern inspection apparatus and pattern inspection method
JP2010091425A (ja) Defect inspection apparatus and defect inspection method
JP2009188175A (ja) Appearance inspection apparatus and appearance inspection method
JP2017138246A (ja) Inspection apparatus, inspection method, and image sensor
JP2008154195A (ja) Method for creating a lens calibration pattern, lens calibration pattern, lens calibration method using the calibration pattern, lens calibration apparatus, imaging apparatus calibration method, and imaging apparatus calibration apparatus
JP2012068761A (ja) Image processing apparatus
JP4235756B2 (ja) Positional deviation detection method, positional deviation detection apparatus, image processing method, image processing apparatus, and inspection apparatus using the same

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200980140443.8

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09820611

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20117006596

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09820611

Country of ref document: EP

Kind code of ref document: A1