US20100150426A1 - Apparatus and method for inspecting pattern - Google Patents

Apparatus and method for inspecting pattern

Info

Publication number
US20100150426A1
US20100150426A1 (application US12/710,844)
Authority
US
United States
Prior art keywords
image
defect candidate
feature value
inspection
masked
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/710,844
Inventor
Hiroyuki Onishi
Hiroshi Asai
Hiroshi Ogi
Current Assignee
Screen Holdings Co Ltd
Original Assignee
Screen Holdings Co Ltd
Priority date
Filing date
Publication date
Priority to JP P2004-283003 (Critical)
Priority to JP 2004-283003 (published as JP2006098151A)
Priority to US 11/235,288 (published as US7689029B2)
Application filed by Screen Holdings Co Ltd
Priority to US 12/710,844 (published as US20100150426A1)
Publication of US20100150426A1
Application status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/001 Industrial image inspection using an image reference approach
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30148 Semiconductor; IC; Wafer

Abstract

An operation part in a pattern inspection apparatus includes a defect candidate image generator for generating a binary defect candidate image representing a defect candidate area in an inspection image by comparing the inspection image with a reference image. In an inspection image masking part, the inspection image is masked with the defect candidate image to obtain a masked inspection image. In a feature value calculation part, an autocorrelation feature value is obtained from the masked inspection image and outputted to a classifying part. The classifying part comprises a classifier which outputs a classification result on the basis of the autocorrelation feature value and a classifier construction part for constructing the classifier by learning. It is thereby possible to easily perform highly accurate classification of defect candidates using the autocorrelation feature value, which is hard to characterize compared with a geometric feature value or a feature value representing density.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to a technique for inspecting pattern on an object.
  • 2. Description of the Background Art
  • A comparison check method has conventionally been the main approach in the field of inspection of patterns formed on semiconductor substrates, glass substrates, printed circuit boards and the like. For example, for binary images, an image of the exclusive OR of an inspection image (an image to be inspected) and a reference image is obtained; for grayscale images, an absolute difference value image, which represents the absolute values of the differences between pixel values in an inspection image and pixel values in a reference image, is obtained and each pixel is binarized by a predetermined threshold value, to detect an area of a defect.
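The two comparison checks above can be sketched as follows; the function names and the NumPy formulation are illustrative, not part of the patent:

```python
import numpy as np

def binary_defect_map(inspection, reference):
    """Binary images: a defect pixel is any pixel where the two images
    differ, i.e. the exclusive OR of inspection and reference."""
    return np.logical_xor(inspection, reference)

def grayscale_defect_map(inspection, reference, threshold):
    """Grayscale images: binarize the absolute difference value image
    by a predetermined threshold value."""
    diff = np.abs(inspection.astype(np.int32) - reference.astype(np.int32))
    return diff > threshold
```

For grayscale inputs, the cast to a signed integer type avoids wrap-around when subtracting unsigned pixel values.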
  • In the circuit formation process of a semiconductor substrate or the like, a pattern inspection apparatus with a function to automatically classify the class of a detected defect (i.e., to classify a defect automatically) is used to identify the cause of a low yield and improve the manufacturing conditions of each operation. In general, classification of a defect is performed in accordance with classification conditions given by a user in advance (on a so-called rule base), or by using a learning-based method such as discriminant analysis or a neural network, on the basis of a feature value obtained from an area in an inspection image or a reference image corresponding to the defect.
  • Japanese Examined Patent Application Laid Open Gazette No. 5-13256 (Document 1) discloses a method where a binary image which represents defects is divided into partial areas by using a reference image divided into areas and a feature value of the defect is obtained in each partial area, to classify a class of the defect on a rule base. Japanese Patent Application Laid Open Gazette No. 9-186208 (Document 2) suggests a method where a grayscale inspection image is separated into a wire portion and a background portion, and a feature value representing density of defect is acquired in each portion, to classify the defect on the basis of the feature value. Japanese Patent Application Laid Open Gazette No. 8-21803 (Document 3) discloses a method where geometric feature values such as an area (i.e., the number of pixels), length of circumference, feret diameter, roundness, centroid position are mainly obtained and inputted to neural network, to perform a high-level classification of a defect.
  • Japanese Patent Application Laid Open Gazette No. 2001-99625 (Document 4) suggests a method where average density, an autocorrelation feature value or the like of an area in an inspection image or a reference image corresponding to a specified defect candidate is obtained, to classify whether the defect candidate is true or false on a rule base. Japanese Patent Application Laid Open Gazette No. 2002-22421 (Document 5) discloses a technique where two differential images between an inspection image and two reference images are generated, values of pixels in the two differential images are converted into error probability values by using a standard deviation of the pixel values to generate two error probability value images. Further, products of the values of corresponding pixels in the two error probability value images are obtained to generate a probability product image, and value of each pixel in the probability product image is compared with a predetermined threshold value to determine whether there is a defect or not on an object.
  • Recently, there has been a demand for higher accuracy in the classification of defects. However, even when geometric feature values or feature values representing density are obtained as in the methods disclosed in Documents 1 to 3, it is difficult in some cases to classify the classes of defects with high accuracy. In such cases, even if a learning-based method such as the neural network of Document 3 is used, it is not always possible to meet the required accuracy, since the accuracy of classification largely depends on the type of the inputted feature value.
  • By contrast, applying the method of Document 4, which obtains an autocorrelation feature value, to the classification of defects can be expected to achieve a certain accuracy; however, depending on the object to be inspected, the accuracy of classification may still need to be increased. Furthermore, to classify by using an autocorrelation feature value on a rule base, the classification conditions normally have to be determined through complicated operations, so the classification of defects cannot be performed easily.
  • SUMMARY OF THE INVENTION
  • The present invention is intended for an apparatus for inspecting pattern on an object. It is an object of the present invention to classify a class of a defect candidate of pattern on an object with high accuracy. It is another object of the present invention to perform a classification of a defect candidate easily.
  • The apparatus comprises a differential image generator for generating a differential image representing a difference between a grayscale inspection image representing pattern on an object and a grayscale reference image or a difference between two images obtained from the inspection image and the reference image, respectively, or a differential image obtained from an image representing the difference, a defect candidate image generator for generating a defect candidate image representing an area which includes a defect candidate in the inspection image by comparing the inspection image with the reference image, a differential image masking part for masking the differential image with the defect candidate image to obtain a masked differential image, a feature value calculation part for obtaining an autocorrelation feature value from the masked differential image and a classifying part for performing a classification of the defect candidate on the basis of the autocorrelation feature value.
  • According to the present invention, it is possible to classify a class of the defect candidate on the basis of the autocorrelation feature value with high accuracy.
  • According to one preferred embodiment of the present invention, the apparatus further comprises an inspection image masking part for masking the inspection image with the defect candidate image to obtain a masked inspection image, and in the apparatus, the classification is also based on an autocorrelation feature value obtained from the masked inspection image.
  • According to another preferred embodiment of the present invention, the apparatus further comprises a reference image masking part for masking the reference image with the defect candidate image to obtain a masked reference image, and in the apparatus, the classification is also based on an autocorrelation feature value obtained from the masked reference image.
  • According to still another preferred embodiment of the present invention, the classification is also based on an autocorrelation feature value obtained from the defect candidate image.
  • The other apparatus comprises a defect candidate image generator for generating a defect candidate image representing an area which includes a defect candidate in a grayscale inspection image representing pattern on an object by comparing the inspection image with a reference image, an inspection image masking part for masking the inspection image with the defect candidate image to obtain a masked inspection image, a feature value calculation part for obtaining an autocorrelation feature value from the masked inspection image and a classifying part for performing a classification of the defect candidate on the basis of the autocorrelation feature value, and in the apparatus, the classifying part comprises a classifier construction part for constructing, by learning, a classifier which outputs a classification result in accordance with the autocorrelation feature value. It is thereby possible to easily perform highly accurate classification of the defect candidate.
  • The present invention is also intended for a method of inspecting pattern on an object.
  • These and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a view showing a construction of a pattern inspection apparatus;
  • FIG. 2 is a block diagram showing a structure of a computer;
  • FIG. 3 is a block diagram showing a structure of an operation part in accordance with a first preferred embodiment;
  • FIG. 4 is a flowchart showing an operation flow for inspecting pattern on a substrate;
  • FIG. 5 is a view showing a state where an inspection image and a reference image are masked;
  • FIG. 6 is a view illustrating an autocorrelation matrix;
  • FIGS. 7A, 7B and 7C are views illustrating other examples of autocorrelation matrixes;
  • FIG. 8 is a block diagram showing another exemplary structure of an operation part;
  • FIG. 9 is a flowchart showing part of an operation flow for inspecting pattern on a substrate;
  • FIG. 10 is a block diagram showing still another exemplary structure of an operation part;
  • FIG. 11 is a block diagram showing a structure of an operation part in accordance with a second preferred embodiment;
  • FIG. 12 is a flowchart showing an operation flow for inspecting pattern on a substrate;
  • FIG. 13 is a block diagram showing another exemplary structure of an operation part;
  • FIG. 14 is a block diagram showing still another exemplary structure of an operation part; and
  • FIG. 15 is a block diagram showing still another exemplary structure of an operation part.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • FIG. 1 is a view showing the construction of a pattern inspection apparatus 1 in accordance with the first preferred embodiment of the present invention. The pattern inspection apparatus 1 comprises a stage 2 for holding a semiconductor substrate (hereinafter referred to as a “substrate”) 9 on which a predetermined wiring pattern is formed, an image pickup part 3 for picking up an image of the substrate 9 to acquire a grayscale image of the substrate 9, a stage driving part 21 for moving the stage 2 relative to the image pickup part 3, and a computer 4 constituted of a CPU which performs various computations, memories which store various information, and the like. Each constituent element of the pattern inspection apparatus 1 is controlled by the computer 4.
  • The image pickup part 3 comprises a lighting part 31 for emitting an illumination light, an optical system 32 for guiding the illumination light to the substrate 9 and receiving the light from the substrate 9 and an image pickup device 33 for converting an image of the substrate 9 formed by the optical system 32 into an electrical signal, and an image data of the substrate 9 is outputted from the image pickup device 33. The stage driving part 21 has mechanisms for moving the stage 2 in the X direction and the Y direction of FIG. 1. Though the image is acquired by the image pickup part 3 with the illumination light which is a visible light in the first preferred embodiment, for example, an electron beam, an X-ray or the like may be used to acquire images.
  • FIG. 2 is a block diagram showing the structure of the computer 4. The computer 4 has the constitution of a general computer system in which a CPU 41 for performing various computations, a ROM 42 for storing a basic program and a RAM 43 for storing various information are connected to a bus line, as shown in FIG. 2. To the bus line, a fixed disk 44 for storing information, a display 45 for displaying various information such as images, a keyboard 46a and a mouse 46b for receiving input from a user, a reader 47 for reading information from a computer-readable recording medium 8 such as an optical disk, a magnetic disk or a magneto-optical disk, and a communication part 48 for transmitting and receiving signals to and from the other constituent elements of the pattern inspection apparatus 1 are further connected through an interface (I/F) or the like as appropriate.
  • A program 80 is read out from the recording medium 8 through the reader 47 into the computer 4 and stored into the fixed disk 44 in advance. The program 80 is copied to the RAM 43 and the CPU 41 executes computation in accordance with a program stored in the RAM 43 (in other words, the computer executes the program), and the computer 4 thereby serves as an operation part in the pattern inspection apparatus 1.
  • FIG. 3 is a block diagram showing a structure of functions implemented by the CPU 41, the ROM 42, the RAM 43, the fixed disk 44 and the like in an operation by the CPU 41 according to the program 80. In FIG. 3, each constituent (an inspection part 51, an inspection image masking part 521, a feature value calculation part 531 and a classifying part 54) of the operation part 50 is a function implemented by the CPU 41 and the like. These functions of the operation part 50 may be implemented by dedicated electric circuits, or may be partially implemented by dedicated electric circuits.
  • FIG. 4 is a flowchart showing an operation flow of the pattern inspection apparatus 1 for inspecting pattern on the substrate 9. In the pattern inspection apparatus 1, first, a predetermined inspection area (hereinafter, referred to as a “first inspection area”) on the substrate 9 is moved to an image pickup position of the image pickup part 3 by the stage driving part 21 and an image of the first inspection area is acquired. Subsequently, a second inspection area which is located on the substrate 9, away from the first inspection area by a period of patterns (for example, a distance between centers of dies arranged on the substrate 9) and a third inspection area away from the second inspection area by the same distance as the period are subsequently adjusted to the image pickup position and images of the second inspection area and the third inspection area are thereby acquired (Step S11). In the first inspection area and the third inspection area on the substrate 9, the same pattern as that in the second inspection area is formed, and in the following operation, the image of the second inspection area serves as an inspection image (an image to be inspected) and the images of the first inspection area and the third inspection area serve as reference images. One inspection image and two reference images which are acquired thus are outputted to the operation part 50. In FIG. 3, arrows assigned reference signs 61 and 62 show signals representing one inspection image and two reference images, respectively. An image which can be acquired in advance by picking up an image of a substrate with no defect or an image which can be obtained from design data may be prepared as a reference image.
  • In a defect candidate image generator 511 in the inspection part 51, a defect candidate image representing areas (or an area) each of which includes a defect candidate(s) in the inspection image (hereinafter referred to as a “defect candidate area”) is generated from the one inspection image and the two reference images (Step S12). As a process for generating the defect candidate image, for example, the method of Japanese Patent Application Laid Open Gazette No. 2002-22421 (Document 5) can be used in part; the disclosure of that document is herein incorporated by reference. In this case, first, a first image which is a simple differential image representing absolute difference values between the inspection image and the reference image of the first inspection area, and a second image which is a simple differential image representing absolute difference values between the inspection image and the reference image of the third inspection area, are generated. Subsequently, a standard deviation of the values of the pixels of the first image is obtained and the value of each pixel in the first image is divided by the standard deviation, to generate a first error probability value image. Similarly, a standard deviation of the values of the pixels of the second image is obtained and the value of each pixel in the second image is divided by the standard deviation, to generate a second error probability value image.
  • After the two error probability value images are generated, one probability product image is generated by obtaining the square root of a product of value of each pixel in the first error probability value image and value of the corresponding pixel in the second error probability value image. The value of each pixel in the probability product image is compared with a predetermined threshold value, for example, “1” is assigned to pixels of values which are larger than the threshold value and “0” is assigned to pixels of values which are not larger than the threshold value, to binarize the probability product image. In the binarized probability product image, spatially continuously extending pixels of the value “1” are labeled as an area and dilation is performed on this area, to generate a binary defect candidate image representing defect candidate areas which include defect candidates.
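The generation of the defect candidate image described above can be sketched as follows, assuming float-valued grayscale images; the labeling of connected areas is omitted, a plain 3×3 dilation is applied directly to the binarized image, and the function names are illustrative:

```python
import numpy as np

def error_probability_image(diff):
    """Divide each pixel of a simple differential image by the image's
    standard deviation (the normalization described above)."""
    sigma = diff.std()
    return diff / sigma if sigma > 0 else np.zeros_like(diff, dtype=float)

def defect_candidate_image(insp, ref1, ref2, threshold):
    """Two simple differential images, two error probability value images,
    their probability product, binarization and a 3x3 dilation."""
    d1 = np.abs(insp - ref1)          # first simple differential image
    d2 = np.abs(insp - ref2)          # second simple differential image
    prod = np.sqrt(error_probability_image(d1) * error_probability_image(d2))
    cand = (prod > threshold).astype(np.uint8)   # "1" above the threshold
    # 3x3 dilation so each candidate area gains a one-pixel margin
    padded = np.pad(cand, 1)
    dilated = np.zeros_like(cand)
    h, w = cand.shape
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            dilated = np.maximum(dilated, padded[1 + dy:1 + dy + h,
                                                 1 + dx:1 + dx + w])
    return dilated
```

A single anomalous pixel thus yields, after dilation, a 3×3 defect candidate area around it.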
  • The defect candidate image is not necessarily obtained by binarization of the probability product image; for example, a defect candidate image may be generated by binarizing, with a predetermined threshold value, an absolute difference value image which represents the absolute value of the difference between the value of each pixel of the inspection image and the value of the corresponding pixel of one reference image. In other words, a defect candidate image 71 representing defect candidate areas 7 may be generated by using various methods in which the inspection image 61 is compared with the reference image 62, as conceptually shown in the upper and middle positions of FIG. 5. The defect candidate image need not be a binary image.
  • After the defect candidate image 71 is generated, the inspection image 61 and the defect candidate image 71 are inputted to the inspection image masking part 521, and the inspection image 61 is masked with the defect candidate image 71 to obtain a masked inspection image 611 (Step S13). With this operation, as shown at the lower left of FIG. 5, the values of pixels included in each area of the inspection image 61 corresponding to a defect candidate area 7 (at the lower left of FIG. 5, the same reference sign 7 as in the middle of FIG. 5 is assigned) do not change, and the values of pixels included in the other areas change to, for example, “0”. In the feature value calculation part 531, an autocorrelation feature value is obtained from the masked inspection image 611 (Step S14).
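The masking operation of Step S13 amounts to a single element-wise selection; a minimal sketch assuming a binary (0/1) defect candidate image:

```python
import numpy as np

def mask_image(image, candidate_image):
    """Pixels inside a defect candidate area (value "1" in the binary
    defect candidate image) keep their values; all others become 0."""
    return np.where(candidate_image == 1, image, 0)
```

The same function can mask the inspection image, the reference image or a differential image, since all are masked with the same defect candidate image.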
  • Discussion will now be made on a basic method for obtaining an autocorrelation feature value. FIG. 6 is a view illustrating an exemplary autocorrelation matrix 81 used in obtaining an autocorrelation feature value. In the autocorrelation matrix 81 of FIG. 6, the elements are arranged in 3×3 (3 rows and 3 columns), where the value “1” is assigned to the center element and to the element directly above it, and the value “0” is assigned to the other elements. In the feature value calculation part 531, the center element of the autocorrelation matrix 81 is overlaid on one pixel in the masked inspection image 611 (hereinafter referred to as a “target pixel”), which is the image to be operated on, to obtain the product of the values of the pixels in the masked inspection image 611 corresponding to the elements of value “1” in the autocorrelation matrix 81. Then, a new value of the target pixel is determined by dividing the obtained value by a predetermined value such as the average pixel value or the largest pixel value of the inspection image 611.
  • The above operation is performed in turn for each pixel included in (the area corresponding to) each defect candidate area 7 in the masked inspection image 611, and the sum, the average or the like of the new values of the pixels included in one defect candidate area 7 is extracted as the autocorrelation feature value of that defect candidate area 7. Since the image operated on in the feature value calculation part 531 is masked with the defect candidate image, the obtained feature value is not affected by areas other than the defect candidate area 7. In the feature value calculation part 531, the autocorrelation feature value is obtained only for the defect candidate areas 7, and thus the efficiency of the operation can be increased. When the function of the feature value calculation part 531 is implemented by electric circuits, the operation using the autocorrelation matrix 81 may be performed on all the pixels of the masked inspection image 611.
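The per-pixel operation described above can be sketched as follows, using the sum over the candidate area as the feature value; the matrix layout follows FIG. 6, while the function name and the choice of `norm` (e.g. the average or largest pixel value) are left to the caller:

```python
import numpy as np

# Autocorrelation matrix of FIG. 6: "1" at the center element and at the
# element directly above it, "0" elsewhere.
MATRIX_FIG6 = np.array([[0, 1, 0],
                        [0, 1, 0],
                        [0, 0, 0]])

def autocorrelation_feature(masked_image, candidate, matrix, norm):
    """For each candidate-area pixel, take the product of the pixels under
    the matrix's 1-elements, divide by `norm`, and sum the results."""
    padded = np.pad(masked_image.astype(float), 1)  # pad so the 3x3 fits at edges
    offsets = [(r - 1, c - 1) for r, c in zip(*np.nonzero(matrix))]
    total = 0.0
    for y, x in zip(*np.nonzero(candidate)):        # candidate-area pixels only
        prod = 1.0
        for dy, dx in offsets:
            prod *= padded[y + 1 + dy, x + 1 + dx]
        total += prod / norm
    return total
```

A vertical line of bright pixels gives a large value with this matrix, while a horizontal line gives zero, illustrating the directivity discussed next.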
  • When the operation using the autocorrelation matrix 81 of FIG. 6 is performed on areas which represent a longitudinally (i.e., in the column direction of the arrangement of pixels) continuous pattern, the autocorrelation feature value becomes large, and when the operation is performed on areas which represent a transversely (i.e., in the row direction of the arrangement of pixels) continuous pattern, the autocorrelation feature value becomes small. Thus, the feature value obtained by the operation using the autocorrelation matrix 81 indicates a feature including a geometric feature with directivity in a specific direction (the longitudinal direction in the case of the autocorrelation matrix 81 of FIG. 6).
  • The autocorrelation matrixes 81A, 81B and 81C of FIGS. 7A, 7B and 7C may be used as other examples; with them, autocorrelation feature values which indicate features with directivity in the directions of 0°, 45° and 135° from the transverse direction can be obtained, respectively.
  • Various kinds of autocorrelation matrixes, including combinations of some of the autocorrelation matrixes 81 and 81A to 81C, are known. In practice, some (or all) kinds of autocorrelation matrixes are selected according to the features of the pattern to be inspected or of the supposed defects, a plurality of autocorrelation matrixes are prepared, and an autocorrelation feature value for each defect candidate area 7 is then obtained as a high-dimensional vector whose element values are obtained by using the plurality of autocorrelation matrixes, respectively. Such an autocorrelation feature value can also be used for a defect candidate having no geometric feature (for example, a textured one). In this case, an autocorrelation feature value may be obtained after removing the effect of rotation of a defect candidate by, for example, mapping the inspection image onto an Rθ plane.
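A sketch of building such a high-dimensional feature vector; the 3×3 element layouts below are assumptions standing in for the matrixes of FIGS. 7A to 7C (each pairs the center pixel with one neighbor in the named direction; the actual figures are not reproduced here), and the function name is illustrative:

```python
import numpy as np

# Hypothetical matrixes for the 0 deg, 45 deg and 135 deg directivities.
MATRICES = {
    "0deg":   np.array([[0, 0, 0], [0, 1, 1], [0, 0, 0]]),
    "45deg":  np.array([[0, 0, 1], [0, 1, 0], [0, 0, 0]]),
    "135deg": np.array([[1, 0, 0], [0, 1, 0], [0, 0, 0]]),
}

def feature_vector(masked_image, candidate, matrices, norm):
    """One vector element per autocorrelation matrix: the sum over the
    candidate-area pixels of the normalized neighbor products."""
    padded = np.pad(masked_image.astype(float), 1)
    ys, xs = np.nonzero(candidate)
    vector = []
    for matrix in matrices.values():
        offsets = [(r - 1, c - 1) for r, c in zip(*np.nonzero(matrix))]
        total = 0.0
        for y, x in zip(ys, xs):
            prod = 1.0
            for dy, dx in offsets:
                prod *= padded[y + 1 + dy, x + 1 + dx]
            total += prod / norm
        vector.append(total)
    return np.array(vector)
```

For a transversely continuous pattern, only the 0° element of the vector is large, which is how the vector encodes directional features.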
  • The autocorrelation feature value of each defect candidate area 7 is obtained in the feature value calculation part 531 and outputted to the classifying part 54. The classifying part 54 comprises a classifier 541 which outputs a classification result in accordance with an inputted autocorrelation feature value, and a classifier construction part 542 for constructing the classifier 541 by learning. The classifier construction part 542 uses a discriminant analysis method, a neural network, genetic algorithms or the like, together with prepared training data. The classifier construction part 542 generates, by learning, defect classification conditions corresponding to the classifier 541 and inputs the defect classification conditions to the classifier 541. In the classifying part 54, the autocorrelation feature value of each defect candidate area 7 is inputted to the classifier 541, and a classification result on the class of each defect candidate is outputted as a signal R according to the defect classification conditions. As mentioned above, the classification of the defect candidate included in each defect candidate area 7 is performed on the basis of the autocorrelation feature value in the classifying part 54 (Step S15). The classification result is then reported to the user if necessary. By treating a false defect (pseudo defect) as one of the classes of defects, a defect candidate can also be classified as a false defect.
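The learn-then-classify flow of the classifier construction part 542 and classifier 541 can be sketched with a deliberately simple learner; the patent names discriminant analysis, neural networks and genetic algorithms, so the nearest-centroid rule below is only an illustrative stand-in, and the class names are hypothetical:

```python
import numpy as np

class NearestCentroidClassifier:
    """Minimal stand-in for the classifier 541 and its construction part
    542: fit() learns one centroid per defect class from training feature
    vectors; classify() returns the class whose centroid is closest."""

    def fit(self, features, labels):
        feats = np.asarray(features, dtype=float)
        labs = np.asarray(labels)
        self.classes_ = sorted(set(labels))
        # one mean feature vector (centroid) per defect class
        self.centroids_ = {c: feats[labs == c].mean(axis=0)
                           for c in self.classes_}
        return self

    def classify(self, feature):
        feat = np.asarray(feature, dtype=float)
        return min(self.classes_,
                   key=lambda c: np.linalg.norm(feat - self.centroids_[c]))
```

Including a "false defect" class among the training labels lets the same classifier mark a candidate as a pseudo defect, as the text notes.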
  • As discussed above, in the pattern inspection apparatus 1, the inspection image 61 is masked with the defect candidate image 71, which represents the defect candidate area(s) 7, to obtain the autocorrelation feature value of each defect candidate area 7 from the masked inspection image 611. The autocorrelation feature value is inputted to the classifier 541, constructed in advance by learning, and the defect candidate included in the defect candidate area 7 is classified automatically. In many cases, since an autocorrelation feature value obtained by using a plurality of autocorrelation matrixes includes elemental features which are hard to grasp by human vision, the autocorrelation feature value is hard to characterize compared with a geometric feature value or a feature value representing density. It is therefore difficult for a user to determine classification conditions for an autocorrelation feature value when classifying on a so-called rule base. Conversely, in the pattern inspection apparatus 1, since the classifier 541 is constructed by learning and outputs a classification result in accordance with an autocorrelation feature value, various classifications can be performed flexibly, and the classification of defect candidates using autocorrelation feature values can be performed easily and with high accuracy.
  • In the feature value calculation part 531, other types of feature values, such as a geometric feature value or a feature value representing density in each defect candidate area 7, may be further obtained, and classification of a defect candidate may be performed with high accuracy together with the autocorrelation feature value. When it is important that true defects be detected by classifying whether a defect candidate is true or false, it is preferable to make the threshold value used in binarization of the probability product image or the simple differential image small. By this operation, more defect candidate areas 7 are extracted and the possibility of missing a true defect is reduced. Also in this case, the probability of detecting a false defect can be reduced by classifying on the basis of the autocorrelation feature value in the classifying part 54.
  • FIG. 8 is a block diagram showing another exemplary structure of an operation part. FIG. 9 is a flowchart showing part of an operation flow for inspecting pattern on a substrate; this operation is performed between Step S13 and Step S14 of FIG. 4. The operation part 50a of FIG. 8, in comparison with the operation part 50 of FIG. 3, further comprises a reference image masking part 522, to which signals of a reference image 62 and the defect candidate image 71 are inputted, and a feature value calculation part 532, to which a signal of a masked reference image 621 is inputted. The other constituent elements are the same as those described for the operation part 50 of FIG. 3 and are represented by the same reference signs.
  • In the operation part 50a, when the defect candidate image 71 is generated (FIG. 4: Step S12), concurrently with the masking of the inspection image 61 in the inspection image masking part 521 (Step S13), the reference image 62 (when two reference images are acquired, either one of the reference images or one new reference image generated by averaging the values of corresponding pixels in the two reference images) is masked with the defect candidate image 71 by the reference image masking part 522, to obtain a masked reference image 621 as shown at the lower right of FIG. 5 (Step S13a). Then, an autocorrelation feature value of each defect candidate area 7 is obtained from the masked inspection image 611, and an autocorrelation feature value of each defect candidate area 7 is also obtained from the masked reference image 621 (FIG. 4: Step S14). In the classifying part 54, the two autocorrelation feature values of each defect candidate area 7 obtained from the two masked images 611 and 621 are inputted to the classifier 541, to classify the class of the defect candidate (Step S15).
  • As discussed above, in the operation part 50a of FIG. 8, the classification by the classifying part 54 is also based on the autocorrelation feature value obtained from the masked reference image 621, in addition to the autocorrelation feature value obtained from the masked inspection image 611. As a result, in the pattern inspection apparatus 1 with the operation part 50a, it becomes possible to classify the classes of defect candidates with high accuracy.
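Feeding both feature values to the classifier can be sketched as a simple concatenation of the two feature vectors into one classifier input (an illustrative formulation, not from the patent):

```python
import numpy as np

def combined_feature(inspection_features, reference_features):
    """Concatenate the feature vectors from the masked inspection image and
    the masked reference image into one classifier input vector."""
    return np.concatenate([np.asarray(inspection_features, dtype=float),
                           np.asarray(reference_features, dtype=float)])
```

The same pattern extends to the operation part 50b, where a third vector from the defect candidate image is appended as well.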
  • FIG. 10 is a block diagram showing still another exemplary structure of an operation part. An operation part 50 b of FIG. 10 is the operation part 50 a of FIG. 8 to which a feature value calculation part 530 is added. In the operation part 50 b, the defect candidate image generated in the inspection part 51 is outputted to the feature value calculation part 530. In Step S14 of FIG. 4, autocorrelation feature values of each defect candidate area are extracted from the defect candidate image, the masked inspection image and the masked reference image, respectively, in the feature value calculation parts 530 to 532. Thus, in the classifying part 54, the classification is performed also on the basis of the autocorrelation feature value obtained from the defect candidate image. As a result, in the pattern inspection apparatus 1 with the operation part 50 b, it is possible to classify classes of defect candidates with high accuracy.
  • FIG. 11 is a block diagram showing a structure of an operation part 50 c of the pattern inspection apparatus 1 in accordance with the second preferred embodiment of the present invention. The operation part 50 c of FIG. 11 comprises an inspection part 55 and the inspection part 55 comprises the same defect candidate image generator 551 as the defect candidate image generator 511 of FIG. 3 and a differential image generator 552 for generating a differential image. The operation part 50 c further comprises a differential image masking part 523 to which a differential image is inputted and a feature value calculation part 533 to which a masked differential image is inputted.
  • FIG. 12 is a flowchart showing an operation flow of the pattern inspection apparatus 1 with the operation part 50 c for inspecting pattern on a substrate 9, and this operation is performed instead of Step S13 of FIG. 4. In the pattern inspection apparatus 1, after a grayscale inspection image representing the pattern on the substrate 9 and a grayscale reference image(s) are acquired (FIG. 4: Step S11), a defect candidate image representing defect candidate areas (or a defect candidate area) in the inspection image is generated in the defect candidate image generator 551 by comparing the inspection image with the reference image (Step S12) and, at the same time, a differential image between the inspection image and the reference image is generated in the differential image generator 552 (Step S21). Actually, as described in the first preferred embodiment, the first image and the second image, which are simple differential images, are generated in generating the defect candidate image, and either of these images serves as the differential image generated in Step S21. Thus, in the inspection part 55 in the second preferred embodiment, the functions of the defect candidate image generator 551 and the differential image generator 552 need not be clearly distinguished.
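A simple differential image of the kind generated in Step S21 can be illustrated as a per-pixel absolute difference of the inspection image and the reference image (a sketch under that assumption; as noted later, the disclosure also contemplates other kinds of differential images):

```python
import numpy as np

def simple_differential_image(inspection_image, reference_image):
    """Per-pixel absolute difference of two grayscale images."""
    a = np.asarray(inspection_image, dtype=float)
    b = np.asarray(reference_image, dtype=float)
    return np.abs(a - b)

diff = simple_differential_image([[100, 120], [90, 200]],
                                 [[100, 110], [95, 120]])
# diff: [[0., 10.], [5., 80.]]
```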
  • The differential image and the defect candidate image are inputted to the differential image masking part 523, and the differential image is masked with the defect candidate image (Step S22). In other words, a differential image in which the areas other than the defect candidate area(s) are masked is generated. In the feature value calculation part 533, an autocorrelation feature value of each defect candidate area is obtained from the masked differential image (Step S14), and the autocorrelation feature value(s) is outputted to the classifying part 54. At this time, other type(s) of feature value(s) such as a geometric feature value or a feature value representing density may be further obtained. Then, the autocorrelation feature value is inputted to the classifier 541 constructed in advance by the learning performed in the classifier construction part 542 (refer to FIG. 3), the signal R representing the classification result is outputted by the classifier 541, and the defect candidate(s) included in the defect candidate area(s) 7 is classified.
  • As discussed above, in the pattern inspection apparatus 1 with the operation part 50 c of FIG. 11, the simple differential image which represents the difference of density information between the inspection image and the reference image is generated, and the classification of the defect candidate is performed on the basis of the autocorrelation feature value obtained from the simple differential image masked with the defect candidate image. Generally, a simple differential image directly represents a geometric feature of a defect candidate in a defect candidate area; however, when the defect candidate is unclear, textured or the like in the simple differential image, it is hard to classify the defect candidate with high accuracy even if a geometric feature value or a feature value representing density is obtained from the simple differential image. Conversely, in the pattern inspection apparatus 1, since an autocorrelation feature value is obtained from the simple differential image, it is possible to classify a class of defect candidate with high accuracy. Since the classifying part 54 comprises the classifier construction part 542 for constructing, by learning, the classifier 541 which outputs a classification result in accordance with the autocorrelation feature value, the highly accurate classification of the defect candidate using the autocorrelation feature value can be performed easily.
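One simple way to realize an autocorrelation feature value of the kind extracted in Step S14 is to sum products of pixel values over the displacements of a 3×3 neighbourhood. The sketch below is a simplified, HLAC-style illustration; the disclosure does not specify this exact formula:

```python
import numpy as np

def autocorrelation_features(img):
    """Feature vector of sums f(x, y) * f(x + dx, y + dy) over the nine
    displacements (dx, dy) of a 3x3 neighbourhood, borders excluded."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    feats = []
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            # overlapping crops implement the shift without wrap-around
            a = img[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
            b = img[max(0, dy):h - max(0, -dy), max(0, dx):w - max(0, -dx)]
            feats.append(float((a * b).sum()))
    return feats

feats = autocorrelation_features([[1, 2], [3, 4]])
# feats[4] is the zero-displacement term: 1 + 4 + 9 + 16 = 30
```

Applied to a masked image, the zero-valued pixels outside the defect candidate areas contribute nothing to the sums, so the feature vector reflects only the candidate area.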
  • The differential image generated in the differential image generator 552 may be an image other than the simple differential image between the inspection image and the reference image. For example, the differential image may be the error probability value image or the probability product image in the first preferred embodiment, an image representing the difference between two images which represent edge information generated from the inspection image and the reference image, respectively, or a derivative image of an image representing this difference, of the error probability value image or of the probability product image, or the like. In other words, in the differential image generator 552, a differential image representing a difference between the inspection image and the reference image, or a difference between two images obtained from the inspection image and the reference image, respectively, or a differential image obtained from the image representing this difference is generated by any of various methods as an image from which the autocorrelation feature value is extracted.
  • In inspecting pattern, since the inspection image and the reference image are acquired and the differential image is generated on the basis of these images, even if the density of an acquired image changes due to a change over time in the light intensity of the illumination light emitted from the lighting part 31 (refer to FIG. 1) or the like, it is possible to perform a robust classification. Depending on the methods for generating the differential image and the defect candidate image, either image may be generated first.
  • FIG. 13 is a block diagram showing another exemplary structure of an operation part. An operation part 50 d of FIG. 13 is the operation part 50 c of FIG. 11 to which the inspection image masking part 521 and the feature value calculation part 531 of the operation part 50 of FIG. 3 are added. In the operation of the operation part 50 d, the operation of FIG. 12 is performed between Step S12 and Step S13 of FIG. 4.
  • In the pattern inspection performed by the pattern inspection apparatus 1 with the operation part 50 d of FIG. 13, concurrently with the masking of the differential image (FIG. 12: Step S22), the inspection image is masked with the defect candidate image in the inspection image masking part 521 to obtain a masked inspection image (FIG. 4: Step S13), and autocorrelation feature values of each defect candidate area are obtained from the masked differential image and the masked inspection image, respectively (Step S14). In the classifying part 54, a classification of the defect candidate is performed, and the classification is based on the autocorrelation feature value obtained from the masked inspection image in addition to the autocorrelation feature value obtained from the masked differential image. As a result, in the pattern inspection apparatus 1 with the operation part 50 d, it is possible to classify classes of defect candidates with high accuracy.
  • FIG. 14 is a block diagram showing still another exemplary structure of an operation part. An operation part 50 e of FIG. 14 is the operation part 50 d of FIG. 13 to which the reference image masking part 522 and the feature value calculation part 532 of the operation part 50 a of FIG. 8 are added. In the operation of the operation part 50 e, the operation of FIG. 12 is performed between Step S12 and Step S13 of FIG. 4, and the operation of FIG. 9 is performed between Step S13 and Step S14 of FIG. 4.
  • In the pattern inspection apparatus 1 with the operation part 50 e of FIG. 14, the differential image, the inspection image and the reference image are masked with the defect candidate image in the differential image masking part 523, the inspection image masking part 521 and the reference image masking part 522, respectively (FIG. 12: Step S22, FIG. 4: Step S13, FIG. 9: Step S13 a), to obtain a masked differential image, a masked inspection image and a masked reference image, and autocorrelation feature values of each defect candidate area are obtained from the masked differential image, the masked inspection image and the masked reference image, respectively (FIG. 4: Step S14). In the classifying part 54, a classification of the defect candidate is performed on the basis of the autocorrelation feature value obtained from the masked reference image in addition to the autocorrelation feature values obtained from the masked differential image and the masked inspection image. It is therefore possible to classify classes of defect candidates with high accuracy.
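The combined classification described above, in which the feature values from several masked images are inputted together to a learned classifier, can be pictured as concatenating the feature vectors and comparing them against per-class statistics learned in advance. The nearest-mean sketch below is only an illustration; the disclosure does not specify the classifier 541 to be of this type, and the class names are hypothetical:

```python
import numpy as np

def train_nearest_mean(features, labels):
    """Learn one mean feature vector per defect class (the 'learning')."""
    features = np.asarray(features, dtype=float)
    labels = np.asarray(labels)
    return {c: features[labels == c].mean(axis=0) for c in set(labels.tolist())}

def classify(model, feature):
    """Return the class whose learned mean is nearest to the feature vector."""
    feature = np.asarray(feature, dtype=float)
    return min(model, key=lambda c: np.linalg.norm(model[c] - feature))

# Each row: concatenated autocorrelation features from the masked
# differential, inspection and reference images (toy values).
training = [[1, 1, 1], [1, 2, 1], [8, 9, 8], [9, 8, 9]]
labels = ["scratch", "scratch", "particle", "particle"]
model = train_nearest_mean(training, labels)
result = classify(model, [1, 1, 2])
# result: "scratch"
```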
  • FIG. 15 is a block diagram showing still another exemplary structure of an operation part 50 f. The operation part 50 f of FIG. 15 is the operation part 50 e of FIG. 14 to which the feature value calculation part 530 of the operation part 50 b of FIG. 10 is added, and an autocorrelation feature value of the defect candidate image is further obtained. Since the autocorrelation feature values of each defect candidate area obtained from the masked differential image, the masked inspection image, the masked reference image and the defect candidate image, respectively, are inputted to the classifying part 54 in the operation part 50 f of FIG. 15, it is possible to classify classes of defect candidates with high accuracy.
  • Though the preferred embodiment of the present invention has been discussed above, the present invention is not limited to the above-discussed preferred embodiment, but allows various variations.
  • In the second preferred embodiment, if the autocorrelation feature value extracted from the differential image is obtained as a low-dimensional vector (i.e., only a few autocorrelation matrices are used), the classification of the defect candidate may be performed on a rule basis.
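A rule-based classification over a low-dimensional autocorrelation feature vector could look like the following sketch; the threshold, rule and class names are hypothetical illustrations, not values from the disclosure:

```python
def classify_rule_based(feature_vector, energy_threshold=100.0):
    """Toy rule: classify by the total autocorrelation energy of the vector."""
    energy = sum(feature_vector)
    return "large defect" if energy >= energy_threshold else "small defect"

label = classify_rule_based([30.0, 14.0, 14.0])
# label: "small defect"
```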
  • While an autocorrelation matrix in which the elements are arranged in 3×3 is used in the preferred embodiments, high-level texture analysis or the like may be performed by using an autocorrelation matrix representing a larger area in which the elements are arranged in 5×5, 9×9 or the like. In this case, it is preferable to enlarge the defect candidate area from which a feature value is extracted, by performing a larger dilation or the like on the defect candidate area in the operation of the inspection part.
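Enlarging a defect candidate area by dilation, as suggested above for larger autocorrelation matrices, can be sketched with a square structuring element. This is a plain-NumPy illustration; an actual implementation might instead use a morphology library:

```python
import numpy as np

def dilate(mask, radius=1):
    """Binary dilation with a (2*radius + 1) x (2*radius + 1) square element."""
    mask = np.asarray(mask, dtype=bool)
    h, w = mask.shape
    out = np.zeros_like(mask)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            # shift the mask by (dx, dy) via overlapping crops and OR it in
            src = mask[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
            out[max(0, dy):h - max(0, -dy), max(0, dx):w - max(0, -dx)] |= src
    return out

candidate = np.zeros((5, 5), dtype=bool)
candidate[2, 2] = True                 # a one-pixel defect candidate area
enlarged = dilate(candidate, radius=2) # enlarged for a 5x5 analysis window
# enlarged is now True over the whole 5x5 area
```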
  • When a defect candidate image is divided into a plurality of areas, each whole divided area which includes a defect candidate specified by the operation of the inspection part may serve as a defect candidate area.
  • The functions of the operation part in the pattern inspection apparatus may be added to a defect reviewing apparatus which picks up images of defects detected by another inspection apparatus or the like and confirms the defects. In this case, for example, an image picked up by the defect reviewing apparatus serves as an inspection image and a classification of a defect is performed on the basis of the inspection image.
  • In the preferred embodiments, the inspection is performed on a pattern formed on a semiconductor substrate, but the pattern inspection apparatus can also be utilized to inspect a pattern formed on, for example, a printed circuit board, a glass substrate for manufacturing a flat panel display, or the like. An object inspected by the pattern inspection apparatus may be something other than a substrate.
  • While the invention has been shown and described in detail, the foregoing description is in all aspects illustrative and not restrictive. It is therefore understood that numerous modifications and variations can be devised without departing from the scope of the invention.
  • This application claims priority benefit under 35 U.S.C. Section 119 of Japanese Patent Application No. 2004-283003 filed in the Japan Patent Office on Sep. 29, 2004, the entire disclosure of which is incorporated herein by reference.

Claims (12)

1. An apparatus for inspecting pattern on an object, comprising:
a differential image generator for generating a differential image representing a difference between a grayscale inspection image representing pattern on an object and a grayscale reference image or a difference between two images obtained from said inspection image and said reference image, respectively, or a differential image obtained from an image representing said difference;
a defect candidate image generator for generating a defect candidate image representing an area which includes a defect candidate in said inspection image by comparing said inspection image with said reference image;
a differential image masking part for masking said differential image with said defect candidate image to obtain a masked differential image;
a feature value calculation part for obtaining an autocorrelation feature value from said masked differential image; and
a classifying part for performing a classification of said defect candidate on the basis of said autocorrelation feature value.
2. The apparatus according to claim 1, further comprising
an inspection image masking part for masking said inspection image with said defect candidate image to obtain a masked inspection image, wherein
said classification is also based on an autocorrelation feature value obtained from said masked inspection image.
3. The apparatus according to claim 2, further comprising
a reference image masking part for masking said reference image with said defect candidate image to obtain a masked reference image, wherein
said classification is also based on an autocorrelation feature value obtained from said masked reference image.
4. The apparatus according to claim 1, wherein
said classification is also based on an autocorrelation feature value obtained from said defect candidate image.
5. The apparatus according to claim 1, wherein
said classifying part comprises a classifier construction part for constructing a classifier by learning which outputs a classification result in accordance with autocorrelation feature value.
6-8. (canceled)
9. A method for inspecting pattern on an object, comprising the steps of:
a) generating a differential image representing a difference between a grayscale inspection image representing pattern on an object and a grayscale reference image or a difference between two images obtained from said inspection image and said reference image, respectively, or a differential image obtained from an image representing said difference;
b) generating a defect candidate image representing an area which includes a defect candidate in said inspection image by comparing said inspection image with said reference image;
c) masking said differential image with said defect candidate image to obtain a masked differential image;
d) obtaining an autocorrelation feature value from said masked differential image; and
e) performing a classification of said defect candidate on the basis of said autocorrelation feature value.
10. The method according to claim 9, further comprising the step of
masking said inspection image with said defect candidate image to obtain a masked inspection image, wherein
said classification is also based on an autocorrelation feature value obtained from said masked inspection image.
11. The method according to claim 10, further comprising the step of
masking said reference image with said defect candidate image to obtain a masked reference image, wherein
said classification is also based on an autocorrelation feature value obtained from said masked reference image.
12. The method according to claim 9, wherein
said classification is also based on an autocorrelation feature value obtained from said defect candidate image.
13. The method according to claim 9, wherein
said classification is performed by a classifier constructed by learning which outputs a classification result in accordance with said autocorrelation feature value.
14-16. (canceled)
US12/710,844 2004-09-29 2010-02-23 Apparatus and method for inspecting pattern Abandoned US20100150426A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JPP2004-283003 2004-09-29
JP2004283003A JP2006098151A (en) 2004-09-29 2004-09-29 Pattern inspection device and pattern inspection method
US11/235,288 US7689029B2 (en) 2004-09-29 2005-09-27 Apparatus and method for inspecting pattern
US12/710,844 US20100150426A1 (en) 2004-09-29 2010-02-23 Apparatus and method for inspecting pattern


Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/235,288 Division US7689029B2 (en) 2004-09-29 2005-09-27 Apparatus and method for inspecting pattern

Publications (1)

Publication Number Publication Date
US20100150426A1 true US20100150426A1 (en) 2010-06-17

Family

ID=36099142

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/235,288 Expired - Fee Related US7689029B2 (en) 2004-09-29 2005-09-27 Apparatus and method for inspecting pattern
US12/710,844 Abandoned US20100150426A1 (en) 2004-09-29 2010-02-23 Apparatus and method for inspecting pattern


Country Status (2)

Country Link
US (2) US7689029B2 (en)
JP (1) JP2006098151A (en)






Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101996405A (en) * 2010-08-30 2011-03-30 Institute of Computing Technology, Chinese Academy of Sciences; Saint-Gobain Research (Shanghai) Co., Ltd. Method and device for rapidly detecting and classifying defects of glass image
US20120314910A1 (en) * 2011-06-09 2012-12-13 Carl Zeiss Smt Gmbh Method and device for determining the position of a first structure relative to a second structure or a part thereof
US9014505B2 (en) * 2011-06-09 2015-04-21 Carl Zeiss Smt Gmbh Method and device for determining the position of a first structure relative to a second structure or a part thereof
CN102305798A (en) * 2011-08-02 2012-01-04 上海交通大学 Method for detecting and classifying glass defects based on machine vision
US20130101221A1 (en) * 2011-10-25 2013-04-25 International Business Machines Corporation Anomaly detection in images and videos
US8724904B2 (en) * 2011-10-25 2014-05-13 International Business Machines Corporation Anomaly detection in images and videos

Also Published As

Publication number Publication date
US20060067570A1 (en) 2006-03-30
JP2006098151A (en) 2006-04-13
US7689029B2 (en) 2010-03-30

Similar Documents

Publication Publication Date Title
Jiao et al. A configurable method for multi-style license plate recognition
KR930003403B1 (en) Technique for object orientation detection using a feed-forward neural network
US7330248B2 (en) Method and apparatus for inspecting defects
US7084968B2 (en) Method for analyzing defect data and inspection apparatus and review system
US7813539B2 (en) Method and apparatus for analyzing defect data and a review system
JP3566470B2 (en) Pattern inspection method and apparatus
US7127099B2 (en) Image searching defect detector
CN1195978C (en) Surface state checking method and circuit board checker
US6628808B1 (en) Apparatus and method for verifying a scanned image
US20070176927A1 (en) Image Processing method and image processor
US7602962B2 (en) Method of classifying defects using multiple inspection machines
JP4616864B2 (en) Appearance inspection method and apparatus, and image processing evaluation system
US20060238755A1 (en) Method for analyzing defect data and inspection apparatus and review system
US8660340B2 (en) Defect classification method and apparatus, and defect inspection apparatus
US7155052B2 (en) Method for pattern inspection
TW571246B (en) System and method for dynamic image recognition
US7231079B2 (en) Method and system for inspecting electronic circuit pattern
US7440605B2 (en) Defect inspection apparatus, defect inspection method and program
Huang et al. Automated visual inspection in the semiconductor industry: A survey
JP4479478B2 (en) Pattern recognition method and apparatus
US8577124B2 (en) Method and apparatus of pattern inspection and semiconductor inspection system using the same
EP0862132A2 (en) Robust identification code recognition system
US20030152276A1 (en) Defect classification/inspection system
JP5599387B2 (en) System and method for detecting defects on a wafer and generating inspection results
US7424146B2 (en) Defect inspection method

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION