WO2017087571A1 - Systems and methods for region-adaptive defect detection - Google Patents


Info

Publication number
WO2017087571A1
Authority
WO
WIPO (PCT)
Prior art keywords
comparative
defect detection
defect
target
reference image
Application number
PCT/US2016/062355
Other languages
English (en)
French (fr)
Inventor
Christopher Maher
Bjorn BRAUER
Vijay Ramachandran
Laurent Karsenti
Eliezer Rosengaus
John Jordan
Roni MILLER
Original Assignee
Kla-Tencor Corporation
Application filed by Kla-Tencor Corporation filed Critical Kla-Tencor Corporation
Priority to KR1020187017023A priority Critical patent/KR102445535B1/ko
Priority to JP2018525650A priority patent/JP6873129B2/ja
Priority to CN201680066077.6A priority patent/CN108352063B/zh
Priority to SG11201803667RA priority patent/SG11201803667RA/en
Publication of WO2017087571A1 publication Critical patent/WO2017087571A1/en
Priority to IL258804A priority patent/IL258804B/en


Classifications

    • G06T7/001 Industrial image inspection using an image reference approach
    • G01N21/1717 Systems in which incident light is modified in accordance with the properties of the material investigated, with a modulation of one or more physical properties of the sample during the optical investigation, e.g. electro-reflectance
    • G01N21/8803 Investigating the presence of flaws or contamination by visual inspection
    • G06T7/11 Region-based segmentation
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06V10/758 Involving statistics of pixels or of feature values, e.g. histogram matching
    • G06V10/88 Image or video recognition using optical means, e.g. reference filters, holographic masks, frequency domain filters or spatial domain filters
    • G01N2021/1765 Method using an image detector and processing of image signal
    • G01N2021/177 Detector of the video camera type
    • G06T2207/20182 Noise reduction or smoothing in the temporal domain; spatio-temporal filtering
    • G06T2207/30148 Semiconductor; IC; Wafer

Definitions

  • the present disclosure relates generally to defect detection, and, more particularly, to region-adaptive defect detection.
  • Inspection systems identify and classify defects on semiconductor wafers to generate a defect population on a wafer.
  • a given semiconductor wafer may include hundreds of chips, each chip containing thousands of components of interest, and each component of interest may have millions of instances on a given layer of a chip.
  • inspection systems may generate vast numbers of data points (e.g. hundreds of billions of data points for some systems) on a given wafer.
  • the demands include the need for increased resolution and capacity without sacrificing inspection speed or sensitivity.
  • defect detection is critically dependent on sources of noise in the defect detection method.
  • typical defect detection systems generate a difference image between a test image and a reference image in which defects in the test image are manifest as a difference between pixel values in the test image and the reference image.
  • noise associated with the reference image and/or the test image reduces the defect detection sensitivity.
  • Some additional defect detection systems utilize multiple reference images (e.g. from different wafers, different dies, different regions of a repeating pattern within a die, or the like) in an attempt to increase the sensitivity. Even so, such systems are inherently susceptible to reference data noise, which ultimately limits the defect detection sensitivity. Therefore, it would be desirable to provide a system and method for curing shortcomings such as those identified above.
  • the method includes acquiring a reference image.
  • the method includes selecting a target region of the reference image.
  • the method includes identifying, based on a matching metric, one or more comparative regions of the reference image corresponding to the target region.
  • the method includes acquiring a test image.
  • the method includes masking the test image with the target region of the reference image and the one or more comparative regions of the reference image.
  • the method includes defining a defect threshold for the target region in the test image based on the one or more comparative regions in the test image. In another illustrative embodiment, the method includes determining whether the target region of the test image contains a defect based on the defect threshold.
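The steps above are algorithm-agnostic: the disclosure does not fix a particular matching metric or threshold rule. As a minimal, hypothetical sketch (not the claimed implementation), the following uses sum of squared differences as the matching metric and a mean plus k-sigma rule over the comparative regions' pixel values as the defect threshold; all function names are illustrative:

```python
import numpy as np

def comparative_regions(reference, target, num_matches=3):
    """Find the regions of `reference` most similar to `target`.

    Matching metric: sum of squared differences (one hypothetical
    choice; the disclosure leaves the metric open)."""
    th, tw = target.shape
    scores = []
    for i in range(reference.shape[0] - th + 1):
        for j in range(reference.shape[1] - tw + 1):
            patch = reference[i:i + th, j:j + tw]
            scores.append((float(np.sum((patch - target) ** 2)), i, j))
    scores.sort(key=lambda s: s[0])
    # Skip the exact self-match at the target's own location.
    return [(i, j) for _, i, j in scores[1:num_matches + 1]]

def is_defective(test, target_slice, comp_offsets, region_shape, k=3.0):
    """Flag the target region of `test` as defective when its mean pixel
    value deviates from the comparative regions' statistics by more than
    k standard deviations; the threshold comes from the test image only."""
    th, tw = region_shape
    comp_pixels = np.concatenate(
        [test[i:i + th, j:j + tw].ravel() for i, j in comp_offsets])
    mu, sigma = comp_pixels.mean(), comp_pixels.std()
    return bool(abs(test[target_slice].mean() - mu) > k * max(sigma, 1e-6))
```

Note that `comparative_regions` would run on the reference image while `is_defective` reads pixel values only from the test image, matching the separation of roles described above.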
  • the system includes an inspection sub-system.
  • the inspection sub-system includes an illumination source to generate a beam of illumination.
  • the inspection sub-system includes a set of illumination optics to direct the beam of illumination to a sample.
  • the system includes a detector to collect illumination emanating from the sample.
  • the system includes a controller communicatively coupled to the detector. In another illustrative embodiment, the controller is configured to acquire a reference image.
  • the controller is configured to select a target region of the reference image. In another illustrative embodiment, the controller is configured to identify, based on a matching metric, one or more comparative regions of the reference image corresponding to the target region. In another illustrative embodiment, the controller is configured to acquire a test image. In another illustrative embodiment, the controller is configured to mask the test image with the target region of the reference image and the one or more comparative regions of the reference image. In another illustrative embodiment, the controller is configured to define a defect threshold for the target region in the test image based on the one or more comparative regions in the test image. In another illustrative embodiment, the controller is configured to determine whether the target region of the test image contains a defect based on the defect threshold.
  • the system includes an inspection sub-system.
  • the inspection sub-system includes an illumination source to generate a beam of illumination.
  • the inspection sub-system includes a set of illumination optics to direct the beam of illumination to a sample.
  • the system includes a detector to collect illumination emanating from the sample.
  • the system includes a controller communicatively coupled to the detector. In another illustrative embodiment, the controller is configured to acquire a reference image.
  • the controller is configured to select a target pixel of the reference image.
  • the controller is configured to define a vicinity pattern including a defined layout of pixels.
  • the controller is configured to define a target vicinity in the reference image arranged according to the vicinity pattern.
  • the target vicinity includes the target pixel.
  • the controller is configured to identify, based on a matching metric, one or more comparative vicinities of the reference image corresponding to the target vicinity.
  • the matching metric includes a pixel value distribution of the target vicinity.
  • the controller is configured to acquire a test image. In another illustrative embodiment, the controller is configured to mask the test image with the target vicinity of the reference image and the one or more comparative vicinities of the reference image. In another illustrative embodiment, the controller is configured to calculate one or more pixel value distributions of the one or more comparative vicinities of the test image. In another illustrative embodiment, the controller is configured to estimate a pixel value distribution in the target vicinity of the test image based on the pixel value distributions of the one or more comparative vicinities of the test image. In another illustrative embodiment, the controller is configured to define the defect threshold for the target pixel based on the estimated pixel value distribution in the target vicinity. In another illustrative embodiment, the controller is configured to determine whether the target pixel of the test image contains a defect based on the defect threshold.
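The vicinity-based embodiment can be sketched in the same spirit. In this hypothetical illustration, a vicinity pattern is a list of (dy, dx) pixel offsets, the comparative vicinity centers are assumed to have been identified already (e.g. from the reference image), and the estimated pixel value distribution is summarized by its mean and standard deviation:

```python
import numpy as np

def vicinity_values(image, y, x, pattern):
    """Pixel values of the vicinity around (y, x), arranged according to
    `pattern`, a list of (dy, dx) offsets (a hypothetical layout)."""
    return np.array([image[y + dy, x + dx] for dy, dx in pattern], dtype=float)

def pixel_is_defective(test, y, x, comp_centers, pattern, k=3.0):
    """Estimate the expected pixel-value distribution at (y, x) by pooling
    the comparative vicinities of the test image itself, then flag the
    target pixel if it deviates by more than k standard deviations."""
    pooled = np.concatenate(
        [vicinity_values(test, cy, cx, pattern) for cy, cx in comp_centers])
    mu, sigma = pooled.mean(), pooled.std()
    return bool(abs(float(test[y, x]) - mu) > k * max(sigma, 1e-6))
```

The k-sigma summary is only one way to compare distributions; a histogram distance or quantile test would fit the same structure.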
  • FIG. 1 is a conceptual view of an inspection system, in accordance with one or more embodiments of the present disclosure.
  • FIG. 2 is a schematic view of a sample including a defect suitable for detection by the inspection system, in accordance with one or more embodiments of the present disclosure.
  • FIG. 3 is a flow diagram illustrating steps performed in a method for inspecting a sample, in accordance with one or more embodiments of the present disclosure.
  • FIG. 4A is a conceptual view of a reference image having multiple target regions, in accordance with one or more embodiments of the present disclosure.
  • FIG. 4B is a plot of a distribution of pixel values associated with the reference image illustrated in FIG. 4A, in accordance with one or more embodiments of the present disclosure.
  • FIG. 5A is a conceptual view of a mask illustrating mask pattern elements corresponding to a target region and associated comparative regions, in accordance with one or more embodiments of the present disclosure.
  • FIG. 5B is a conceptual view of a mask illustrating mask pattern elements corresponding to a target region and associated comparative regions, in accordance with one or more embodiments of the present disclosure.
  • FIG. 5C is a conceptual view of a mask illustrating mask pattern elements corresponding to a target region and associated comparative regions, in accordance with one or more embodiments of the present disclosure.
  • FIG. 6 is a conceptual view of a test image of the sample including a defect, in accordance with one or more embodiments of the present disclosure.
  • FIG. 7A is a conceptual view of the test image as masked by mask, in accordance with one or more embodiments of the present disclosure.
  • FIG. 7B is a plot of the pixel value distribution of the remaining pixels, in accordance with one or more embodiments of the present disclosure.
  • FIG. 7C is a conceptual view of the test image as masked by mask, in accordance with one or more embodiments of the present disclosure.
  • FIG. 7D is a plot of the pixel value distribution of the remaining pixels, in accordance with one or more embodiments of the present disclosure.
  • FIG. 7E is a conceptual view of the test image as masked by mask, in accordance with one or more embodiments of the present disclosure.
  • FIG. 7F is a plot of the pixel value distribution of the remaining pixels, in accordance with one or more embodiments of the present disclosure.
  • FIG. 8 is a conceptual view of a defect map of the sample, in accordance with one or more embodiments of the present disclosure.
  • FIG. 9A is a conceptual view of an inspection measurement sub-system configured as an optical inspection sub-system, in accordance with one or more embodiments of the present disclosure.
  • FIG. 9B is a simplified schematic view of an inspection sub-system configured as a particle beam inspection sub-system in accordance with one or more embodiments of the present disclosure.
  • Embodiments of the present disclosure are directed to systems and methods for region-adaptive defect detection.
  • redundant information in a test image may be used to determine regions of the test image with similar characteristics (e.g. comparative regions). Accordingly, defect detection thresholds may be generated based on the pixel values in the comparative regions of the test image.
  • Embodiments of the present disclosure are directed to determining comparative regions on a test image directly without using a reference image. Additional embodiments of the present disclosure are directed to determining comparative regions on a reference image using reference data (e.g. reference images, design data, or the like) and masking the test image with the comparative regions.
  • the reference data may facilitate accurate determination of the comparative regions in the test image, but defect detection thresholds may be based solely on pixel values of the test image.
  • Further embodiments of the present disclosure are directed to the determination of pixel-specific defect detection thresholds. For example, for each pixel in a reference image, a distribution of pixel values in a vicinity of surrounding pixels may be compared to similar vicinities throughout the test image to determine whether a defect is present for that pixel. Additional embodiments of the present disclosure are directed to segmenting the test image into multiple segments and determining a region-specific defect detection threshold for each segment. For example, a test image may be segmented according to regions with similar pixel values (e.g. graylevel pixel values, or the like) and region-specific defect detection thresholds may be determined for each segment.
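The graylevel segmentation described above can be sketched as follows; the bin edges and the mean-plus-k-sigma threshold rule are illustrative assumptions, not part of the disclosure:

```python
import numpy as np

def segment_by_graylevel(image, bin_edges):
    """Label each pixel by quantizing its gray level into `bin_edges`
    (a simple stand-in for the segmentation described above)."""
    return np.digitize(image, bin_edges)

def per_segment_thresholds(image, labels, k=3.0):
    """Region-specific defect thresholds: mean + k * std of the pixel
    values within each segment of the test image."""
    return {int(s): float(image[labels == s].mean()
                          + k * image[labels == s].std())
            for s in np.unique(labels)}
```

Each pixel would then be compared against the threshold of its own segment rather than a single global threshold.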
  • typical defect detection systems may detect defects by comparing a test image to one or more reference images (e.g. by the generation of one or more difference images, or the like).
  • a reference image may be associated with another die (e.g. die-to-die detection, or the like) or another cell (e.g. cell-to-cell detection, or the like).
  • noise associated with reference images may critically reduce the defect detection sensitivity.
  • Attempts to overcome reference image noise by utilizing multiple reference images may negatively impact additional system performance metrics such as, but not limited to, throughput or processing requirements. For example, comparing a test image to two reference images presents challenges because a defect is only flagged if identified by a comparison to both images. Such a system remains limited by reference image noise.
  • an "optimized" reference image may include a statistical aggregation of multiple reference images such as, but not limited to, a median die in which each pixel of a reference image represents a median of corresponding pixels in multiple reference images.
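The "median die" aggregation described above reduces to a per-pixel median over aligned reference images; this sketch illustrates the prior approach under discussion, not the disclosed method:

```python
import numpy as np

def median_die(reference_stack):
    """Per-pixel median over a stack of aligned reference images (e.g.
    the same die imaged at several locations), so that an outlier in any
    single reference image does not corrupt the aggregate."""
    return np.median(np.stack(reference_stack), axis=0)
```

As the next bullet notes, this suppresses reference-image outliers but does nothing about noise in the test image itself.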
  • Such systems may still suffer from low sensitivity, particularly for test images that also exhibit noise.
  • Additional embodiments of the present disclosure provide for the determination of defect thresholds based on pixel values of the test image rather than a reference image, which may circumvent reference image noise.
  • defect detection systems may divide the sample into multiple portions such that structures on the sample within each portion may be expected to have similar characteristics.
  • Such systems may separately interrogate each portion to provide a statistical comparison of similar structures within each portion.
  • defect detection systems may employ, but are not limited to, Segmented Auto-Thresholding (SAT), Multi-die Adaptive-Thresholding (MDAT/MDAT2), Hierarchical and Local Auto-Thresholding (HLAT), Context-Based Imaging (CBI), Standard Reference Patch (SRP), or Template-Based Inspection (TBI). The use of design data of a sample to facilitate inspection is generally described in U.S. Patent No. 7,676,077, issued on March 9, 2010, U.S. Patent No. 6,154,714, issued on November 28, 2000, and U.S. Patent No. 8,041,103, issued on October 18, 2011, which are incorporated herein by reference in their entirety.
  • in particle-based imaging methods (e.g. scanning electron microscopy, focused ion beam imaging, and the like), the choice of measurement area may introduce artificial effects: a pixel surrounded by an edge and imaged over a large measurement area may behave differently than a pixel imaged over a small measurement area defined by the same edge.
  • Additional embodiments of the present disclosure identify comparative regions within a full image with a large measurement area, which may circumvent artificial measurement effects.
  • sample generally refers to a substrate formed of a semiconductor or non-semiconductor material (e.g. a wafer, or the like).
  • a semiconductor or non-semiconductor material may include, but is not limited to, monocrystalline silicon, gallium arsenide, and indium phosphide.
  • a sample may include one or more layers.
  • such layers may include, but are not limited to, a resist, a dielectric material, a conductive material, and a semiconductive material. Many different types of such layers are known in the art, and the term sample as used herein is intended to encompass a sample on which all types of such layers may be formed.
  • One or more layers formed on a sample may be patterned or unpatterned.
  • a sample may include a plurality of dies, each having repeatable patterned features. Formation and processing of such layers of material may ultimately result in completed devices.
  • Many different types of devices may be formed on a sample, and the term sample as used herein is intended to encompass a sample on which any type of device known in the art is being fabricated.
  • the terms sample and wafer should be interpreted as interchangeable.
  • the terms patterning device, mask and reticle should be interpreted as interchangeable.
  • FIG. 1 is a conceptual view of an inspection system 100, in accordance with one or more embodiments of the present disclosure.
  • the inspection system 100 includes an inspection measurement sub-system 102 to interrogate a sample 104.
  • the inspection measurement sub-system 102 may detect one or more defects on the sample 104.
  • inspection measurement sub-system 102 may be any type of inspection system known in the art suitable for detecting defects on a sample 104.
  • the inspection measurement sub-system 102 may include a particle- beam inspection sub-system.
  • inspection measurement sub-system 102 may direct one or more particle beams (e.g. electron beams, ion beams, or the like) to the sample 104 such that one or more defects are detectable based on detected radiation emanating from the sample 104 (e.g. secondary electrons, backscattered electrons, luminescence, or the like).
  • inspection measurement sub-system 102 may include an optical inspection sub-system.
  • inspection measurement sub-system 102 may direct optical radiation to the sample 104 such that one or more defects are detectable based on detected radiation emanating from the sample 104 (e.g. reflected radiation, scattered radiation, diffracted radiation, luminescent radiation, or the like).
  • the inspection measurement sub-system 102 may operate in an imaging mode or a non-imaging mode.
  • in an imaging mode, individual objects (e.g. defects) may be resolved within the illuminated spot on the sample (e.g. as part of a bright-field image, a dark-field image, a phase-contrast image, or the like).
  • radiation collected by one or more detectors may be associated with a single illuminated spot on the sample and may represent a single pixel of an image of the sample 104.
  • an image of the sample 104 may be generated by acquiring data from an array of sample locations.
  • the inspection measurement sub-system 102 may operate as a scatterometry-based inspection system in which radiation from the sample is analyzed at a pupil plane to characterize the angular distribution of radiation from the sample 104 (e.g. associated with scattering and/or diffraction of radiation by the sample 104).
  • the inspection system 100 includes a controller 106 coupled to the inspection measurement sub-system 102.
  • the controller 106 may be configured to receive data including, but not limited to, inspection data from the inspection measurement sub-system 102.
  • the controller 106 includes one or more processors 108.
  • the one or more processors 108 may be configured to execute a set of program instructions maintained in a memory device 110, or memory.
  • the one or more processors 108 of a controller 106 may include any processing element known in the art. In this sense, the one or more processors 108 may include any microprocessor-type device configured to execute algorithms and/or instructions.
  • the memory device 110 may include any storage medium known in the art suitable for storing program instructions executable by the associated one or more processors 108.
  • the memory device 110 may include a non-transitory memory medium.
  • the memory device 110 may include, but is not limited to, a read-only memory, a random access memory, a magnetic or optical memory device (e.g., disk), a magnetic tape, a solid state drive and the like. It is further noted that memory device 110 may be housed in a common controller housing with the one or more processors 108.
  • FIG. 2 is a schematic view of a sample 104 including a defect suitable for detection by the inspection system 100, in accordance with one or more embodiments of the present disclosure.
  • the sample 104 includes a defect 200 associated with a structure 202.
  • the defect 200 may be identified by comparing an image of the sample 104 (e.g. a test image) to a reference image.
  • noise associated with the reference image may negatively impact the detection sensitivity.
  • a defect may be detected by utilizing repeated information (e.g. small motif repetition) associated with the sample 104.
  • the sample 104 may include one or more sets of repeated structures. Pixels in a test image associated with a particular structure on the sample 104 may be expected to have pixel values similar to pixels in comparative regions associated with other repeated structures. As illustrated in FIG. 2, the structures 202-208 may form a first set of similar structures, structures 210-212 may form a second set of similar structures, and so on. Further, defect 200 may be detected based on a comparison of pixel values of the defect 200 and pixel values of comparative regions across the sample.
  • defect 200 may be, but is not required to be, detected based on a comparison of pixel values of the defect 200 and pixels in comparative regions 216-220 associated with structures 204-208.
  • defect 200 may be, but is not required to be, detected based on a comparison of pixel values within a surrounding vicinity 214 (e.g. a pixel neighborhood) and pixels in comparative regions 216-220 associated with structures 204-208.
  • repeating structures need not have the same orientation on a sample in order to provide repeated data suitable for the detection of defects.
  • structures 202, 208 do not have the same orientation on the sample 104 as structures 204-206.
  • repeated information associated with the sample 104 may not be limited to repeated structures. Rather, it may be the case that a sample 104 may contain a large number of comparable vicinities (e.g. pixel neighborhoods) across the sample associated with portions of a variety of structures such that pixels in comparative vicinities may be expected to have similar pixel values.
  • pixel statistics are estimated for regions of interest (e.g. target regions) of a test image based on comparative regions across the test image.
  • defects on a sample may be detected based on an analysis of pixel values within comparative regions of the test image, in which the comparative regions may be provided by reference data (e.g. a reference image, design data, or the like).
  • the inspection system 100 may detect defects in a test image of a sample 104 without a direct comparison of pixels between the test image and a reference image (e.g. without generating a difference image between the test image and a reference image). Accordingly, noise associated with a reference image may be circumvented to provide highly sensitive defect detection.
  • design data generally refers to the physical design of an integrated circuit and data derived from the physical design through complex simulation or simple geometric and Boolean operations.
  • an image of a reticle acquired by a reticle inspection system and/or derivatives thereof may be used as a proxy or proxies for the design data.
  • Such a reticle image or a derivative thereof may serve as a substitute for the design layout in any embodiments described herein that uses design data.
  • Design data and design data proxies are described in U.S. Patent No. 7,676,077 by Kulkarni, issued on March 9, 2010; U.S. Patent Application Ser. No. 13/115,957 by Kulkarni, filed on May 25, 2011; U.S.
  • Design data may include characteristics of individual components and/or layers on the sample 104 (e.g. an insulator, a conductor, a semiconductor, a well, a substrate, or the like), a connectivity relationship between layers on the sample 104, or a physical layout of components and connections (e.g. wires) on the sample 104.
  • design data may include a plurality of design pattern elements corresponding to printed pattern elements on the sample 104.
  • design data may include what is known as a "floorplan," which contains placement information for pattern elements on the sample 104. It is further noted herein that this information may be extracted from the physical design of a chip, usually stored in GDSII or OASIS file formats.
  • the structural behavior or process-design interactions may be a function of the context (surroundings) of a pattern element.
  • the proposed analysis can identify pattern elements within the design data, such as polygons describing features to be constructed on a semiconductor layer. Further, the proposed method may provide the coordinate information of these repeating blocks as well as contextual data (e.g. the positions of adjacent structures, or the like).
  • design data includes one or more graphical representations (e.g. visual representations, symbolic representations, diagrammatic representations, or the like) of pattern elements.
  • design data may include a graphical representation of the physical layout of components (e.g. descriptions of one or more polygons corresponding to printed pattern elements fabricated on the sample 104).
  • design data may include a graphical representation of one or more layers of a sample design (e.g. one or more layers of printed pattern elements fabricated on the sample 104) or the connectivity between the one or more layers.
  • design data may include a graphical representation of electrical connectivity of components on the sample 104.
  • the design data may include a graphical representation of one or more circuits or sub-circuits associated with the sample.
  • design data includes one or more image files containing graphical representations of one or more portions of the sample 104.
  • design data includes one or more textual descriptions (e.g. one or more lists, one or more tables, one or more databases, or the like) of the connectivity of pattern elements of the sample 104.
  • design data may include, but is not limited to, netlist data, circuit simulation data, or hardware description language data.
  • Netlists may include any type of netlist known in the art for providing a description of the connectivity of an electrical circuit including, but not limited to, physical netlists, logical netlists, instance-based netlists, or net-based netlists.
  • a netlist may include one or more sub-netlists (e.g. in a hierarchical configuration) to describe circuits and/or sub-circuits on a sample 104.
  • netlist data associated with a netlist may include, but is not limited to, a list of nodes (e.g. nets, wires between components of a circuit, or the like), a list of ports (e.g. terminals, pins, connectors, or the like), a description of electrical components between the nets (e.g. a resistor, capacitor, inductor, transistor, diode, power source, or the like), and values associated with the electrical components (e.g. a resistance value in ohms of a resistor, a voltage value in volts of a power source, frequency characteristics of a voltage source, initial conditions of components, or the like).
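The netlist fields listed above could be represented, for instance, as a plain mapping; every name and field in this sketch is hypothetical, chosen only to mirror the bullet above:

```python
# A minimal, hypothetical netlist with the fields described above:
# nodes (nets), ports, components between nets, and component values.
netlist = {
    "nodes": ["n1", "n2", "gnd"],
    "ports": ["vin", "vout"],
    "components": [
        {"type": "resistor",  "value_ohms": 1000.0,  "nets": ("n1", "n2")},
        {"type": "capacitor", "value_farads": 1e-9,  "nets": ("n2", "gnd")},
        {"type": "source",    "value_volts": 1.8,    "nets": ("n1", "gnd")},
    ],
}
```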
  • design data may include one or more netlists associated with specific steps of a semiconductor process flow.
  • a sample 104 may be inspected (e.g. by inspection system 100) at one or more intermediate points in a semiconductor process flow.
  • design data utilized to generate care areas may be specific to the layout of the sample 104 at a current point in the semiconductor process flow.
  • a netlist associated with a particular intermediate point in a semiconductor process flow may be derived (e.g. extracted, or the like) from either the physical design layout in combination with a technology file (layer connectivity, electrical properties of each of the layers, and the like) or a netlist associated with a final layout of a sample 104 to include only components present on the wafer at the particular intermediate point in the semiconductor process flow.
  • FIG. 3 is a flow diagram illustrating steps performed in a method 300 for inspecting a sample, in accordance with one or more embodiments of the present disclosure. Applicant notes that the embodiments and enabling technologies described previously herein in the context of inspection system 100 should be interpreted to extend to method 300. It is further noted, however, that the method 300 is not limited to the architecture of inspection system 100.
  • Each of the steps of the method 300 may be performed as described further herein.
  • the steps may be performed by one or more controllers (e.g. controller 106, or the like), which may be configured according to any of the embodiments described herein.
  • controllers e.g. controller 106, or the like
  • the method described above may be performed by any of the system embodiments described herein.
  • the method 300 may also include one or more additional steps that may be performed by controller or any system embodiments described herein.
  • the method 300 includes a step 302 of acquiring a reference image.
  • the reference image may be representative of a portion of the sample 104 to be inspected for defects.
  • a reference image may include an image of a portion of the sample 104 (e.g. a die, a cell, or the like).
  • the reference image may be formed as an aggregate of multiple reference sub-images.
  • each pixel of the reference image may have a value corresponding to a statistical aggregation of corresponding pixels of multiple reference sub-images (e.g. a median of pixel values of corresponding pixels, an average of pixel values of corresponding pixels, or the like).
  • the reference image includes noise data.
  • the reference image may include noise data associated with multiple reference sub-images.
  • the reference image may include data indicating a relative measure of noise for multiple regions of the sample 104 (e.g. variations between pixel values of multiple reference sub-images, or the like).
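The aggregation described above, in which each reference-image pixel is a statistical aggregate of corresponding pixels across multiple reference sub-images, can be sketched as follows. The sub-image count, image size, and use of the per-pixel standard deviation as the noise measure are illustrative assumptions.

```python
import numpy as np

# Sketch: build a reference image as a per-pixel median of several
# reference sub-images, and keep a per-pixel noise estimate (standard
# deviation across sub-images) as a relative measure of noise per region.
rng = np.random.default_rng(0)
sub_images = rng.normal(loc=128.0, scale=5.0, size=(8, 64, 64))  # 8 sub-images

reference_image = np.median(sub_images, axis=0)  # statistical aggregation
noise_map = np.std(sub_images, axis=0)           # noise data per pixel
```

A per-pixel mean could be substituted for the median, per the alternatives listed above.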
  • FIG. 4A is a conceptual view of a reference image 400 having multiple target regions, in accordance with one or more embodiments of the present disclosure.
  • the sample 104 may include a first set of similar structures 402, a second set of similar structures 404, and a third set of similar structures 406.
  • the first set of similar structures 402, the second set of similar structures 404, and the third set of similar structures 406 are illustrated in FIG. 4A as sets of circles solely for illustrative purposes and that the pixels within each of the first set of similar structures 402, the second set of similar structures 404, and the third set of similar structures 406 may be arranged in any pattern.
  • the reference image may be formed by any method known in the art.
  • the reference image may be, but is not required to be, generated at least in part using the inspection measurement sub-system 102. Accordingly, the reference image may correspond to an optical image, a scanning electron microscope image, a particle-beam image, or the like.
  • the reference image may be stored by the inspection system 100.
  • the reference image may be stored within the memory device 110 of controller 106.
  • the reference image may be retrieved from an external source (e.g. a data storage system, a server, an additional inspection system, or the like).
  • the reference image is generated at least in part using design data.
  • the reference image may include one or more aspects of the intended layout (e.g. physical layout, electrical layout, or the like) of one or more features to be inspected.
  • the method 300 includes a step 304 of selecting a target region of the reference image.
  • the target region may include one or more pixels of interest to be inspected for defects (e.g. in a given iteration of the method 300).
  • a reference image may include any number of target regions.
  • each pixel of the test image may be a separate target region.
  • each pixel of the reference image may be considered separately.
  • the target region may include a set of pixels in the test image. In this regard, all pixels in the target region may be considered simultaneously.
  • Referring again to FIG. 4A, the target region may include, but is not limited to, pixel set 408 of the first set of similar structures 402, pixel set 410 of the second set of similar structures 404, or pixel set 412 of the third set of similar structures 406. Further, multiple target regions may be considered sequentially or in parallel (e.g. by the controller 106, or the like).
  • the method 300 includes a step 306 of identifying, based on a matching metric, one or more comparative regions of the reference image corresponding to the target region.
  • the matching metric may control the selection of comparative regions of the reference image that are expected to have similar pixel statistics as the target region.
  • the method 300 includes a step 308 of acquiring a test image.
  • the method 300 includes a step 310 of masking the test image with the target region of the reference image and the one or more comparative regions of the reference image. In this regard, location data associated with the one or more comparative regions of the reference image may be used to select relevant portions of the test image. Further, a direct comparison of pixel values of the reference image and the test image (e.g. in a difference image), which may be a significant source of noise, may be avoided.
  • the test image may be an image (e.g. of sample 104) to be inspected for defects.
  • the test image may be formed by any method known in the art.
  • the test image may be, but is not required to be, generated at least in part using the inspection measurement sub-system 102.
  • the test image may correspond to an optical image, a scanning electron microscope image, a particle-beam image, or the like.
  • the test image may be stored by the inspection system 100.
  • the test image may be stored within the memory device 110 of controller 106.
  • the inspection system 100 may operate as a virtual inspection system.
  • the test image may be retrieved from an external source (e.g. a data storage system, a server, an additional inspection system, or the like).
  • the matching metric may be any type of metric known in the art to compare regions of pixels in images.
  • the matching metric includes location data of similarly designed regions of the sample 104 based on design data.
  • design data may be utilized to determine the locations of possible comparative regions.
  • the locations of possible comparative regions for each pixel of the reference image may be stored by the inspection system 100 (e.g. in the memory device 110, or the like).
  • the possible comparative regions for each pixel of the reference image may be stored in a data storage device (e.g. an indexed data storage device, or the like) for efficient retrieval.
  • the matching metric includes a distribution of pixel values around a target pixel (e.g. a target region including a single pixel).
  • a particular distribution of pixel values associated with a matching metric may include, but is not limited to, a particular histogram of pixel values associated with a particular vicinity (e.g. a pixel neighborhood), or a particular spatial distribution of pixels having a relative or absolutely defined set of pixel values.
  • comparative regions may include vicinities of pixels throughout the reference image having the same (or substantially similar) pixel value distribution as pixels surrounding the target pixel.
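The vicinity-histogram matching described above can be sketched with a brute-force scan: compute the histogram of pixel values around the target pixel, then keep every other vicinity whose histogram matches. The vicinity size, bin count, and exact-match criterion are illustrative assumptions.

```python
import numpy as np

def vicinity_histogram(image, row, col, half=2, bins=8):
    """Histogram of pixel values in the square vicinity of (row, col)."""
    patch = image[row - half:row + half + 1, col - half:col + half + 1]
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    return hist

def find_comparative_vicinities(image, target_rc, half=2):
    """Brute-force scan for vicinities whose histogram matches the target's."""
    target_hist = vicinity_histogram(image, *target_rc, half=half)
    rows, cols = image.shape
    matches = []
    for r in range(half, rows - half):
        for c in range(half, cols - half):
            if (r, c) == target_rc:
                continue
            if np.array_equal(vicinity_histogram(image, r, c, half=half),
                              target_hist):
                matches.append((r, c))
    return matches

# On a uniform image, every vicinity matches the target vicinity.
uniform = np.full((10, 10), 100.0)
matches = find_comparative_vicinities(uniform, (5, 5))
```

A tolerance on the histogram difference would implement the "substantially similar" variant mentioned above.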
  • a vicinity matching metric may have a particular vicinity pattern (e.g. a layout of pixels with a defined size and/or shape, such as rectangular, circular, or the like).
  • the one or more comparative regions of the reference image may include comparative vicinities across the reference image arranged according to the vicinity pattern in which the comparative vicinities have a pixel value distribution corresponding to that of a target vicinity.
  • the dimensions and/or layout of the vicinity pattern are based on the inspection measurement sub-system 102 used to generate the reference image and/or an image of the sample 104 (e.g. a test image).
  • the dimensions and/or layout of the vicinity pattern may be based on an interaction function and/or a resolution of a lithography system used to fabricate the sample.
  • the vicinity may, but is not required to, represent approximately 100 nm on the sample 104 and may thus represent an approximate resolution of a lithography system (e.g. a 193-nm lithography system, or the like).
  • dimensions and/or layout of the vicinity pattern may be based on an interaction function and/or a resolution of the inspection measurement sub-system 102.
  • the one or more comparative regions may be identified in a reference image based on any method known in the art.
  • comparative regions may be identified based on a pattern matching technique. It is recognized herein that identifying comparative regions in a reference image may potentially be time and/or computationally intensive. For example, the time and/or computational resources required to identify comparative regions using a brute force pattern matching process may negatively impact the overall performance of the inspection system 100.
  • the inspection system 100 (e.g. via the controller 106) may identify comparative regions using a locality-sensitive hashing technique.
  • the inspection system 100 may utilize kernelized locality-sensitive hashing to efficiently identify comparative regions as "nearest neighbors" of a target vicinity.
  • the search method is computationally bound and exploits the stationary nature of the reference image.
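The locality-sensitive hashing idea can be sketched with a random-hyperplane hash over flattened vicinities: vicinities that land in the same hash bucket as the target are candidate "nearest neighbors," avoiding a full pairwise comparison. This illustrates the concept only; the disclosure does not specify a particular LSH scheme, and the patch size and plane count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def lsh_signature(patch_vector, hyperplanes):
    """Sign pattern of projections onto random hyperplanes -> hash bits."""
    return tuple(bool(b) for b in (hyperplanes @ patch_vector) > 0)

def bucket_patches(image, half=1, n_planes=8):
    """Group every vicinity center in the image by its LSH signature."""
    size = (2 * half + 1) ** 2
    planes = rng.normal(size=(n_planes, size))
    buckets = {}
    rows, cols = image.shape
    for r in range(half, rows - half):
        for c in range(half, cols - half):
            v = image[r - half:r + half + 1, c - half:c + half + 1].ravel()
            v = v - v.mean()  # compare patterns, not absolute brightness
            buckets.setdefault(lsh_signature(v, planes), []).append((r, c))
    return buckets

# On a flat image, all vicinities collapse into one bucket.
flat = np.full((8, 8), 50.0)
buckets = bucket_patches(flat)
```

Comparative-region lookup then reduces to scanning only the target's bucket, which is how the hashing bounds the search cost.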
  • the matching metric includes a defined range of pixel values (e.g. grayscale pixel values, RGB pixel values, or the like).
  • the one or more comparative regions of the reference image may include pixels having pixel values within the defined range of pixel values.
  • FIG. 4B is a plot 414 of a distribution (e.g. a histogram) 416 of pixel values associated with the reference image illustrated in FIG. 4A, in accordance with one or more embodiments of the present disclosure.
  • reference image 400 is a grayscale image such that pixel values of the reference image represent grayscale values.
  • the first set of similar structures 402, the second set of similar structures 404, and the third set of similar structures 406 have pixel values that lie in distinguishable ranges. For example, as illustrated in FIG. 4B, three ranges may be identified: pixels with grayscale values lower than cutoff 418, pixels with grayscale values between cutoff 418 and cutoff 420, and pixels with grayscale values higher than cutoff 420. Accordingly, a matching metric including a range of pixel values may be used to identify structures in the reference image having pixels with a pixel value within the same range as the target region.
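The range-based matching metric described above (analogous to cutoffs 418 and 420 in FIG. 4B) can be sketched as a simple threshold test. The pixel values and cutoff values here are illustrative assumptions.

```python
import numpy as np

# Sketch: identify comparative pixels whose grayscale values fall in the
# same range as the target pixel, using two assumed cutoff values.
reference = np.array([[ 40,  40, 120],
                      [120, 200, 200],
                      [ 40, 120, 200]])
cutoff_low, cutoff_high = 80, 160  # cutoffs separating the three ranges

target_value = reference[0, 2]     # a pixel in the middle range
same_range = (reference >= cutoff_low) & (reference < cutoff_high)

# Coordinates of all pixels in the target's range (including the target).
comparative_coords = list(zip(*np.nonzero(same_range)))
```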
  • location data of the one or more comparative regions is stored by the inspection system 100 (e.g. in the memory device 110, or the like). Location data may include, but is not limited to, the locations, sizes, or shapes of the one or more comparative regions in the reference image.
  • the test image is masked with the target region and the comparative regions. In this regard, the pixels associated with the target region and the comparative regions may be analyzed as a group.
  • step 310 includes generating a mask based on the comparative regions identified in step 306.
  • the mask may be generated by any method known in the art.
  • step 310 may include generating a mask to include a binary pattern of pixels corresponding to the locations of the target region and the comparative regions. Further, step 310 may include modifying the mask to remove unwanted artifacts such as noise associated with the reference image.
  • step 310 includes one or more image processing steps (e.g. filtering, edge detection, morphological image processing, or the like) to remove unwanted artifacts.
  • step 310 includes generating a mask at least in part using design data. In this regard, the mask may be supplemented and/or modified based on design data such that the mask patterns correspond to designed characteristics of the structures on the sample associated with the comparative regions identified in step 306.
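The mask generation and cleanup of step 310 can be sketched as follows: build a binary mask from the target/comparative region coordinates, then remove isolated single-pixel artifacts as a simple stand-in for the morphological processing mentioned above. The 4-neighbor cleanup rule is an illustrative assumption.

```python
import numpy as np

def build_mask(shape, regions):
    """Binary mask with True at each target/comparative pixel location."""
    mask = np.zeros(shape, dtype=bool)
    for r, c in regions:
        mask[r, c] = True
    return mask

def remove_isolated_pixels(mask):
    """Clear True pixels with no True 4-neighbor (simple artifact removal)."""
    padded = np.pad(mask, 1)
    neighbors = (padded[:-2, 1:-1] | padded[2:, 1:-1] |
                 padded[1:-1, :-2] | padded[1:-1, 2:])
    return mask & neighbors

# A two-pixel cluster survives cleanup; a lone pixel is treated as noise.
mask = build_mask((6, 6), [(1, 1), (1, 2), (4, 4)])
cleaned = remove_isolated_pixels(mask)
```

Design-data-driven supplementation would amount to OR-ing or AND-ing this mask with patterns derived from the design layout.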
  • FIGS. 5A through 5C are conceptual views of exemplary masks associated with different comparative regions, in accordance with one or more embodiments of the present disclosure.
  • FIG. 5A is a conceptual view of a mask 502 illustrating mask pattern elements 504 corresponding to target region 408 (e.g. selected in step 304) and associated comparative regions (e.g. identified in step 306).
  • FIG. 5B is a conceptual view of a mask 506 illustrating mask pattern elements 508 corresponding to target region 410 (e.g. selected in step 304) and associated comparative regions (e.g. identified in step 306).
  • FIG. 5C is a conceptual view of a mask 510 illustrating mask pattern elements 512 corresponding to target region 412 (e.g. selected in step 304) and associated comparative regions (e.g. identified in step 306).
  • FIG. 6 is a conceptual view of a test image of the sample 104 including a defect, in accordance with one or more embodiments of the present disclosure.
  • the test image 600 may include a first set of similar structures 602, a second set of similar structures 604, and a third set of similar structures 606.
  • a defect 608 may be present on the sample 104 and is manifest in the test image 600 as a modification of pixel values.
  • the method 300 includes a step 312 of defining a defect threshold for the target region in the test image based on the one or more comparative regions in the test image. Accordingly, the defect threshold for the target region in the test image may be based on pixel values of the test image itself and the reference image may provide locations of comparative regions of pixels. In another embodiment, the method 300 includes a step 314 of determining whether the target region of the test image contains a defect based on the defect threshold.
  • the defect threshold may be defined based on the comparative regions using any method known in the art.
  • step 312 may include defining a defect threshold (e.g. for the target region and/or the comparative regions as filtered by the mask in step 310) in order to detect outlier pixels in the pixel value distribution of pixels remaining in the test image after masking such that the outlier pixels may be determined as defects in step 314.
  • outlier pixels may be determined based on the presence (or lack thereof) of tails of a histogram of the pixel values of the remaining pixels. In this regard, pixels with pixel values associated with a tail of a histogram may be determined as defects.
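The histogram-tail criterion described above can be sketched with a robust outlier test on the masked pixel values. The median ± k·MAD rule here is a stand-in for the tail-based threshold, and the multiplier k is an assumed tuning parameter.

```python
import numpy as np

def detect_outliers(pixel_values, k=5.0):
    """Flag pixels in the tails of the masked pixel-value distribution."""
    median = np.median(pixel_values)
    mad = np.median(np.abs(pixel_values - median)) + 1e-9  # robust spread
    lower, upper = median - k * mad, median + k * mad      # defect threshold
    return (pixel_values < lower) | (pixel_values > upper)

# Masked test-image pixels; the last value sits in the histogram tail.
values = np.array([100.0, 101.0, 99.0, 100.0, 102.0, 98.0, 180.0])
flags = detect_outliers(values)
```

A tight distribution with no tail, as in FIGS. 7B and 7D, yields no flagged pixels; a tail beyond the cutoffs, as in FIG. 7F, yields defect pixels.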
  • FIG. 7A is a conceptual view of the test image 600 as masked by mask 502, in accordance with one or more embodiments of the present disclosure.
  • FIG. 7B is a plot 702 of the pixel value distribution 704 of the remaining pixels, in accordance with one or more embodiments of the present disclosure.
  • the defect threshold is defined by the cutoff values 706 designed to detect outliers in the pixel value distribution 704. For example, as illustrated in FIGS. 7A and 7B, no pixels fall outside the bounds defined by the cutoff values 706 and thus no defects are detected.
  • FIG. 7C is a conceptual view of the test image 600 as masked by mask 506, in accordance with one or more embodiments of the present disclosure.
  • FIG. 7D is a plot 708 of the pixel value distribution 710 of the remaining pixels, in accordance with one or more embodiments of the present disclosure.
  • the defect threshold is defined by the cutoff values 712 designed to detect outliers in the pixel value distribution 710. For example, as illustrated in FIGS. 7C and 7D, no pixels fall outside the bounds defined by the cutoff values 712 and thus no defects are detected.
  • FIG. 7E is a conceptual view of the test image 600 as masked by mask 510, in accordance with one or more embodiments of the present disclosure.
  • FIG. 7F is a plot 714 of the pixel value distribution 716 of the remaining pixels, in accordance with one or more embodiments of the present disclosure.
  • the defect threshold is defined by the cutoff values 718 designed to detect outliers in the pixel value distribution 716.
  • the pixel value distribution 716 includes a tail extending beyond the cutoff values 718 such that outlier pixels 720 beyond the cutoff values 718 are associated with the defect 608.
  • the defect threshold for a target region including a single target pixel may be determined by calculating pixel value distributions of the comparative vicinities of the test image, estimating a pixel value distribution in the target vicinity of the test image based on the pixel value distributions of the one or more comparative vicinities of the test image, and defining the defect threshold for the target pixel based on the estimated pixel value distribution in the target vicinity.
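The per-pixel flow described above can be sketched as: pool pixel values from the comparative vicinities of the test image, estimate the expected distribution at the target vicinity, and derive a defect threshold from it. The mean ± 3σ bound is an illustrative choice of threshold.

```python
import numpy as np

def threshold_from_vicinities(test_image, vicinity_centers, half=1, k=3.0):
    """Estimate the target-vicinity distribution from comparative vicinities
    and return (lower, upper) defect thresholds for the target pixel."""
    pooled = []
    for r, c in vicinity_centers:
        pooled.append(test_image[r - half:r + half + 1,
                                 c - half:c + half + 1].ravel())
    pooled = np.concatenate(pooled)
    mu, sigma = pooled.mean(), pooled.std()
    return mu - k * sigma, mu + k * sigma

def is_defect(test_image, target_rc, vicinity_centers, half=1):
    lo, hi = threshold_from_vicinities(test_image, vicinity_centers, half)
    value = test_image[target_rc]
    return bool(value < lo or value > hi)

# A bright single-pixel defect against comparative vicinities that are flat.
test_image = np.full((9, 9), 100.0)
test_image[4, 4] = 200.0                    # the defect pixel
centers = [(2, 2), (2, 6), (6, 2), (6, 6)]  # assumed comparative vicinities
defect_found = is_defect(test_image, (4, 4), centers)
clean_found = is_defect(test_image, (3, 0), centers)
```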
  • step 314 includes generating a defect map of the sample 104.
  • FIG. 8 is a conceptual view of a defect map 800 of the sample 104, in accordance with one or more embodiments of the present disclosure.
  • a defect map 800 may include an image of any identified defects.
  • a defect map may include data associated with any identified defects such as, but not limited to, the size, shape, or location of the identified defects.
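A defect map carrying both an image of the identified defects and their associated data, as described above, can be sketched minimally; the dictionary layout is an illustrative assumption.

```python
import numpy as np

# Sketch: a defect map holding a boolean defect image plus per-defect data.
defect_flags = np.zeros((8, 8), dtype=bool)
defect_flags[3, 4] = True
defect_flags[3, 5] = True  # a two-pixel defect

coords = np.argwhere(defect_flags)
defect_map = {
    "image": defect_flags,  # image of identified defects
    "defects": [{"location": (int(r), int(c))} for r, c in coords],
    "total_defect_pixels": int(defect_flags.sum()),
}
```

Size and shape attributes could be added per defect by grouping connected flagged pixels.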
  • the high detection sensitivity associated with method 300 may provide self-tuning defect detection based on local properties of a test image.
  • a defect detection threshold associated with a particular target region may be based on local image properties. This may avoid errors associated with global or semi-global defect detection threshold values associated with alternative techniques that may create a competition between different parts of a test image having different average grayscale values.
  • pixel statistics of various comparative regions across the test image may be adjusted (e.g. in step 310) to compensate for large-scale variations in the test image (e.g. variations of the average grayscale values, or the like).
  • Defect detection according to method 300 may additionally provide a high signal to noise ratio sufficient to detect single-pixel defects. Such high detection sensitivity may provide accurate detection of the contours of identified defects and may be highly tolerant to pattern noise.
  • the defect detection method 300 may form a part of a hybrid defect detection method.
  • the detection method 300 may be supplemented with any additional defect detection method known in the art.
  • the detection method 300 may be supplemented with an alternative method to define the defect threshold for the particular target region.
  • the inspection system 100 may include any inspection sub-system known in the art.
  • FIG. 9A is a conceptual view of an inspection measurement sub-system 102 configured as an optical inspection sub-system, in accordance with one or more embodiments of the present disclosure.
  • the inspection measurement sub-system 102 includes an illumination source 902.
  • the illumination source 902 may include any illumination source known in the art suitable for generating one or more illumination beams 904 (e.g. a beam of photons).
  • the illumination source 902 may include, but is not limited to, a monochromatic light source (e.g. a laser), a polychromatic light source with a spectrum including two or more discrete wavelengths, a broadband light source, or a wavelength-sweeping light source.
  • the illumination source 902 may be formed from, but is not limited to, a white light source (e.g. a broadband light source with a spectrum including visible wavelengths), a laser source, a free-form illumination source, a single-pole illumination source, a multi-pole illumination source, an arc lamp, an electrode-less lamp, or a laser sustained plasma (LSP) source.
  • the illumination beam 904 may be delivered via free-space propagation or guided light (e.g. an optical fiber, a light pipe, or the like).
  • the illumination source 902 directs the one or more illumination beams 904 to the sample 104 via an illumination pathway 906.
  • the illumination pathway 906 may include one or more lenses 910.
  • the illumination pathway 906 may include one or more additional optical components 908 suitable for modifying and/or conditioning the one or more illumination beams 904.
  • the one or more optical components 908 may include, but are not limited to, one or more polarizers, one or more filters, one or more beam splitters, one or more diffusers, one or more homogenizers, one or more apodizers, or one or more beam shapers.
  • the illumination pathway 906 includes a beamsplitter 914.
  • the inspection measurement sub-system 102 includes an objective lens 916 to focus the one or more illumination beams 904 onto the sample 104.
  • the illumination source 902 may direct the one or more illumination beams 904 to the sample at any angle via the illumination pathway 906. In one embodiment, as shown in FIG. 9A, the illumination source 902 directs the one or more illumination beams 904 to the sample 104 at a normal incidence angle. In another embodiment, the illumination source 902 directs the one or more illumination beams 904 to the sample 104 at a non-normal incidence angle (e.g. a glancing angle, a 45-degree angle, or the like).
  • the sample 104 is disposed on a sample stage 912 suitable for securing the sample 104 during scanning.
  • the sample stage 912 is an actuatable stage.
  • the sample stage 912 may include, but is not limited to, one or more translational stages suitable for selectably translating the sample 104 along one or more linear directions (e.g., x-direction, y-direction, and/or z-direction).
  • the sample stage 912 may include, but is not limited to, one or more rotational stages suitable for selectably rotating the sample 104 along a rotational direction.
  • the sample stage 912 may include, but is not limited to, a rotational stage and a translational stage suitable for selectably translating the sample along a linear direction and/or rotating the sample 104 along a rotational direction.
  • the illumination pathway 906 includes one or more beam scanning optics (not shown) suitable for scanning the illumination beam 904 across the sample 104.
  • the illumination pathway 906 may include any type of beam scanner known in the art including, but not limited to, one or more electro-optic beam deflectors, one or more acousto-optic beam deflectors, one or more galvanometric scanners, one or more resonant scanners, or one or more polygonal scanners.
  • the surface of a sample 104 may be scanned in an r-theta pattern.
  • the illumination beam 904 may be scanned according to any pattern on the sample.
  • the illumination beam 904 is split into one or more beams such that one or more beams may be scanned simultaneously.
  • the inspection measurement sub-system 102 includes one or more detectors 922 (e.g. one or more optical detectors, one or more photon detectors, or the like) configured to capture radiation emanating from the sample 104 through a collection pathway 918.
  • the collection pathway 918 may include multiple optical elements to direct and/or modify illumination collected by the objective lens 916 including, but not limited to, one or more lenses 920, one or more filters, one or more polarizers, one or more beam blocks, or one or more beamsplitters. It is noted herein that components of the collection pathway 918 may be oriented in any position relative to the sample 104.
  • the collection pathway includes the objective lens 916 oriented normal to the sample 104.
  • the collection pathway 918 includes multiple collection lenses oriented to collect radiation from the sample at multiple solid angles.
  • the inspection system 100 includes a bright-field inspection system.
  • a bright-field image of the sample 104, or a portion of the sample 104 may be projected onto the detector 922 (e.g. by the objective lens 916, the one or more lenses 920, or the like).
  • the inspection system 100 includes a dark-field inspection system.
  • the inspection system 100 may include one or more components (e.g. an annular beam block, a dark-field objective lens 916, or the like) to direct the illumination beam 904 to the sample 104 at a large incidence angle such that the image of the sample on the detector 922 is associated with scattered and/or diffracted light.
  • the inspection system 100 includes an oblique angle inspection system.
  • the inspection system 100 may direct the illumination beam 904 to the sample at an off-axis angle to provide contrast for the inspection of defects.
  • the inspection system 100 includes a phase contrast inspection system.
  • the inspection system 100 may include one or more phase plates and/or beam blocks (e.g. an annular beam block, or the like) to provide a phase contrast between diffracted and undiffracted light from the sample to provide contrast for defect inspection.
  • the inspection system 100 may include a luminescence inspection system (e.g. a fluorescence inspection system, a phosphorescence inspection system, or the like).
  • the inspection system 100 may direct an illumination beam 904 with a first wavelength spectrum to the sample 104, and include one or more filters to detect one or more additional wavelength spectra emanating from the sample 104 (e.g. emanating from one or more components of the sample 104 and/or one or more defects on the sample 104).
  • the inspection system includes one or more pinholes located in confocal positions such that the system 100 may operate as a confocal inspection system.
  • FIG. 9B is a simplified schematic view of an inspection sub-system configured as a particle beam inspection sub-system in accordance with one or more embodiments of the present disclosure.
  • the illumination source 902 includes a particle source configured to generate a particle beam 904.
  • the particle source 902 may include any particle source known in the art suitable for generating a particle beam 904.
  • the particle source 902 may include, but is not limited to, an electron gun or an ion gun.
  • the particle source 902 is configured to provide a particle beam 904 with a tunable energy.
  • a particle source 902 including an electron source may, but is not limited to, provide an accelerating voltage in the range of 0.1 kV to 30 kV.
  • a particle source including an ion source may, but is not required to, provide an ion beam with an energy value in the range of 1 to 50 keV.
  • the inspection measurement sub-system 102 includes two or more particle beam sources 902 (e.g. electron beam sources or ion beam sources) for the generation of two or more particle beams 904.
  • the illumination pathway 906 includes one or more particle focusing elements 924.
  • the one or more particle focusing elements 924 may include, but are not limited to, a single particle focusing element or one or more particle focusing elements forming a compound system.
  • an objective lens 916 of the system 100 is configured to direct the particle beam 904 to the sample 104.
  • the one or more particle focusing elements 924 and/or the objective lens 916 may include any type of particle lenses known in the art including, but not limited to, electrostatic, magnetic, uni-potential, or double-potential lenses.
  • the inspection measurement sub-system 102 may include, but is not limited to, one or more electron deflectors, one or more apertures, one or more filters, or one or more stigmators.
  • the inspection measurement sub-system 102 includes one or more particle beam scanning elements 926.
  • the one or more particle beam scanning elements may include, but are not limited to, one or more scanning coils or deflectors suitable for controlling a position of the beam relative to the surface of the sample 104.
  • the one or more scanning elements may be utilized to scan the particle beam 904 across the sample 104 in a selected pattern.
  • the inspection sub-system includes a detector 922 to image or otherwise detect particles emanating from the sample 104.
  • the detector 922 includes an electron collector (e.g., a secondary electron collector, a backscattered electron detector, or the like).
  • the detector 922 includes a photon detector (e.g., a photodetector, an x-ray detector, a scintillating element coupled to photomultiplier tube (PMT) detector, or the like) for detecting electrons and/or photons from the sample surface.
  • the detector 922 may include any particle detector known in the art configured to collect backscattered electrons, Auger electrons, transmitted electrons or photons (e.g., x-rays emitted by surface in response to incident electrons, cathodoluminescence of the sample 104, or the like).
  • the inspection system 100 includes a voltage contrast imaging (VCI) system.
  • particle beams may be utilized within an inspection system to image a sample (e.g. by capturing secondary electrons, backscattered electrons, or the like emanating from the sample).
  • Charging effects may include a modification of the number of electrons (e.g. secondary electrons) captured by the system and thus the VCI signal strength.
  • a voltage contrast imaging (VCI) system may generate a high-resolution image of a sample in which the intensity of each pixel of the image provides data on the electrical properties of the sample at the pixel location.
  • insulating structures and/or structures that are not connected to a ground source (e.g. are not grounded) may develop a charge (e.g. a positive charge or a negative charge) that modifies the number of particles (e.g. secondary electrons, ions, or the like) emanating from the sample and thus the detected signal.
  • a VCI image may include a grayscale image in which the grayscale value of each pixel provides data on the relative electrical characteristics of that location on the wafer.
  • the inspection system 100 includes one or more components (e.g. one or more electrodes) configured to apply one or more voltages to one or more locations of the sample 104. In this regard, the system 100 may generate active voltage contrast imaging data.
  • the inspection system 100 may include a display (not shown).
  • the display is communicatively coupled to the controller 106.
  • the display may be communicatively coupled to one or more processors 108 of controller 106.
  • the one or more processors 108 may display one or more of the various results of the present invention on the display.
  • the display device may include any display device known in the art.
  • the display device may include, but is not limited to, a liquid crystal display (LCD).
  • the display device may include, but is not limited to, an organic light-emitting diode (OLED) based display.
  • the display device may include, but is not limited to a CRT display.
  • the inspection system 100 may include a user interface device (not shown).
  • the user interface device is communicatively coupled to the one or more processors 108 of controller 106.
  • the user interface device may be utilized by controller 106 to accept selections and/or instructions from a user.
  • the display may be used to display data to a user.
  • a user may input selections and/or instructions (e.g., a user selection of inspection regions) responsive to inspection data displayed to the user via the display device.
  • the user interface device may include any user interface known in the art.
  • the user interface may include, but is not limited to, a keyboard, a keypad, a touchscreen, a lever, a knob, a scroll wheel, a track ball, a switch, a dial, a sliding bar, a scroll bar, a slide, a handle, a touch pad, a paddle, a steering wheel, a joystick, a bezel input device or the like.
  • in the case of a touchscreen interface device, those skilled in the art should recognize that a large number of touchscreen interface devices may be suitable for implementation in the present invention.
  • the display device may be integrated with a touchscreen interface, such as, but not limited to, a capacitive touchscreen, a resistive touchscreen, a surface acoustic based touchscreen, an infrared based touchscreen, or the like.
  • any touchscreen interface capable of integration with the display portion of the display device 105 is suitable for implementation in the present invention.
  • the user interface may include, but is not limited to, a bezel mounted interface.
  • FIGS. 9A and 9B are provided merely for illustration and should not be interpreted as limiting. It is anticipated that a number of equivalent or additional configurations may be utilized within the scope of the present invention.
  • the system 100 may be configured as a "real" or a "virtual" inspection system.
  • the system 100 may generate actual images or other output data associated with the sample 104.
  • the system 100 may be configured as a "real" inspection system, rather than a "virtual" system.
  • a storage medium (not shown) and the controller 106 described herein may be configured as a "virtual" inspection system. Accordingly, the system 100 may not operate on a physical sample, but may rather reproduce and/or stream stored data (e.g. data stored in a memory medium 110, or the like) as if a physical sample were being scanned.
  • the output of a "detector” may be data that was previously generated by one or more detectors (e.g. a detector 922) of an actual inspection system in a previous step.
  • detectors e.g. a detector 922
  • systems and methods configured as "virtual" inspection systems are described in commonly assigned U.S. Patent No. 8,126,255, issued on February 28, 2012, and U.S. Patent No. 9,222,895, issued on December 29, 2015, both of which are incorporated by reference in their entirety.
  • any two components so associated can also be viewed as being “connected”, or “coupled”, to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “couplable”, to each other to achieve the desired functionality.
  • Specific examples of couplable include but are not limited to physically interactable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interactable and/or logically interacting components.
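The region-adaptive defect detection named in the title — comparing a target image against a reference image, with detection sensitivity tuned independently for different regions of the sample — can be sketched as follows. This is a minimal illustrative sketch only, not the claimed method: the function name, the region-label array, and the per-region thresholds are assumptions introduced for illustration and do not appear in the patent claims.

```python
import numpy as np

def region_adaptive_detect(target, reference, region_labels, thresholds):
    """Flag pixels where |target - reference| exceeds a per-region threshold.

    target, reference : 2-D arrays (e.g. grayscale inspection images)
    region_labels     : integer array, same shape, labeling each pixel's region
    thresholds        : dict mapping region label -> detection threshold
    """
    # Difference image between the target and the reference image.
    diff = np.abs(target.astype(float) - reference.astype(float))
    defect_mask = np.zeros(target.shape, dtype=bool)
    # Apply a region-specific threshold within each labeled region.
    for label, thresh in thresholds.items():
        in_region = region_labels == label
        defect_mask |= in_region & (diff > thresh)
    return defect_mask

# Toy example: a 4x4 image with two regions of differing noise sensitivity.
target = np.array([[10, 10, 10, 10],
                   [10, 30, 10, 10],
                   [10, 10, 10, 12],
                   [10, 10, 10, 10]])
reference = np.full((4, 4), 10)
labels = np.zeros((4, 4), dtype=int)
labels[:, 2:] = 1  # right half is treated as a noisier region
mask = region_adaptive_detect(target, reference, labels,
                              {0: 5.0, 1: 15.0})
```

In this toy run, the bright pixel in the quiet region (difference 20 against a threshold of 5) is flagged, while the small deviation in the noisy region (difference 2 against a threshold of 15) is suppressed — illustrating how per-region thresholds trade off sensitivity against nuisance detections.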

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Quality & Reliability (AREA)
  • Chemical & Material Sciences (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Testing Or Measuring Of Semiconductors Or The Like (AREA)
  • Peptides Or Proteins (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
  • Image Analysis (AREA)
  • Analysing Materials By The Use Of Radiation (AREA)
PCT/US2016/062355 2015-11-18 2016-11-16 Systems and methods for region-adaptive defect detection WO2017087571A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
KR1020187017023A KR102445535B1 (ko) 2015-11-18 2016-11-16 영역 적응식 결함 검출을 위한 시스템 및 방법
JP2018525650A JP6873129B2 (ja) 2015-11-18 2016-11-16 領域適応的欠陥検出を行うシステムおよび方法
CN201680066077.6A CN108352063B (zh) 2015-11-18 2016-11-16 用于区域自适应缺陷检测的系统及方法
SG11201803667RA SG11201803667RA (en) 2015-11-18 2016-11-16 Systems and methods for region-adaptive defect detection
IL258804A IL258804B (en) 2015-11-18 2018-04-18 Systems and methods for adaptive detection - area of defects

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201562257025P 2015-11-18 2015-11-18
US62/257,025 2015-11-18
US15/350,632 US10535131B2 (en) 2015-11-18 2016-11-14 Systems and methods for region-adaptive defect detection
US15/350,632 2016-11-14

Publications (1)

Publication Number Publication Date
WO2017087571A1 true WO2017087571A1 (en) 2017-05-26

Family

ID=58690746

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/062355 WO2017087571A1 (en) 2015-11-18 2016-11-16 Systems and methods for region-adaptive defect detection

Country Status (8)

Country Link
US (1) US10535131B2 (zh)
JP (1) JP6873129B2 (zh)
KR (1) KR102445535B1 (zh)
CN (1) CN108352063B (zh)
IL (1) IL258804B (zh)
SG (1) SG11201803667RA (zh)
TW (1) TWI699521B (zh)
WO (1) WO2017087571A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019236624A1 (en) * 2018-06-06 2019-12-12 Kla-Tencor Corporation Cross layer common-unique analysis for nuisance filtering
JP7458767B2 (ja) 2019-01-09 2024-04-01 ザ・ボーイング・カンパニー リアルタイム付加製造プロセス検査

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10186026B2 (en) * 2015-11-17 2019-01-22 Kla-Tencor Corp. Single image detection
EP3560375B1 (en) * 2016-12-20 2023-08-16 Shiseido Company, Ltd. Application control device, application device, application control method, and recording medium
US11429806B2 (en) 2018-11-09 2022-08-30 Canon Virginia, Inc. Devices, systems, and methods for anomaly detection
CN109580632B (zh) * 2018-11-23 2021-03-30 京东方科技集团股份有限公司 一种缺陷确定方法、装置及存储介质
US11321846B2 (en) 2019-03-28 2022-05-03 Canon Virginia, Inc. Devices, systems, and methods for topological normalization for anomaly detection
CN111768357B (zh) * 2019-03-29 2024-03-01 银河水滴科技(北京)有限公司 一种图像检测的方法及装置
US11120546B2 (en) * 2019-09-24 2021-09-14 Kla Corporation Unsupervised learning-based reference selection for enhanced defect inspection sensitivity
US11416982B2 (en) 2019-10-01 2022-08-16 KLA Corp. Controlling a process for inspection of a specimen
US11127136B2 (en) * 2019-12-05 2021-09-21 Kla Corporation System and method for defining flexible regions on a sample during inspection
US11328435B2 (en) * 2020-06-08 2022-05-10 KLA Corp. Image alignment setup for specimens with intra- and inter-specimen variations using unsupervised learning and adaptive database generation methods
CN111986159B (zh) * 2020-07-24 2024-02-27 苏州威华智能装备有限公司 太阳能电池片的电极缺陷检测方法、设备及存储介质
US11803960B2 (en) * 2020-08-12 2023-10-31 Kla Corporation Optical image contrast metric for optical target search
DE102020123979A1 (de) 2020-09-15 2022-03-17 Carl Zeiss Smt Gmbh Defekterkennung für Halbleiterstrukturen auf einem Wafer
CN112381799B (zh) * 2020-11-16 2024-01-23 广东电网有限责任公司肇庆供电局 一种导线断股确认方法、装置、电子设备和计算机可读存储介质
US11854184B2 (en) * 2021-01-14 2023-12-26 Applied Materials Israel Ltd. Determination of defects and/or edge roughness in a specimen based on a reference image
CN113283279B (zh) * 2021-01-25 2024-01-19 广东技术师范大学 一种基于深度学习的视频中多目标跟踪方法及装置
CN112819778B (zh) * 2021-01-28 2022-04-12 中国空气动力研究与发展中心超高速空气动力研究所 一种航天材料损伤检测图像多目标全像素分割方法
CN114723701B (zh) * 2022-03-31 2023-04-18 厦门力和行自动化有限公司 基于计算机视觉的齿轮缺陷检测方法和系统
US11922619B2 (en) 2022-03-31 2024-03-05 Kla Corporation Context-based defect inspection
US20230314336A1 (en) 2022-03-31 2023-10-05 Kla Corporation Multi-mode optical inspection
CN115876823B (zh) * 2023-01-19 2023-07-14 合肥晶合集成电路股份有限公司 薄膜缺陷的检测方法、薄膜缺陷的检测装置及检测系统
CN116152257B (zh) * 2023-04-22 2023-06-27 拓普思传感器(太仓)有限公司 应用于传感器的检测信息优化方法、服务器及介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6539106B1 (en) * 1999-01-08 2003-03-25 Applied Materials, Inc. Feature-based defect detection
US20050010890A1 (en) * 2003-07-11 2005-01-13 Applied Materials Israel Ltd Design-based monitoring
US20090196490A1 (en) * 2008-02-06 2009-08-06 Fujitsu Microelectronics Limited Defect inspection method and defect inspection apparatus
US20140193065A1 (en) * 2013-01-09 2014-07-10 Kla-Tencor Corporation Detecting Defects on a Wafer Using Template Image Matching
US20150310600A1 (en) * 2011-12-21 2015-10-29 Applied Materials Israel Ltd. System, method and computer program product for classification within inspection images

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6154714A (en) 1997-11-17 2000-11-28 Heuristic Physics Laboratories Method for using wafer navigation to reduce testing times of integrated circuit wafers
US7676007B1 (en) 2004-07-21 2010-03-09 Jihoon Choi System and method for interpolation based transmit beamforming for MIMO-OFDM with partial feedback
US7570796B2 (en) 2005-11-18 2009-08-04 Kla-Tencor Technologies Corp. Methods and systems for utilizing design data in combination with inspection data
US7676077B2 (en) 2005-11-18 2010-03-09 Kla-Tencor Technologies Corp. Methods and systems for utilizing design data in combination with inspection data
US8041103B2 (en) 2005-11-18 2011-10-18 Kla-Tencor Technologies Corp. Methods and systems for determining a position of inspection data in design data space
JP2008203034A (ja) * 2007-02-19 2008-09-04 Olympus Corp 欠陥検出装置および欠陥検出方法
JP2008258982A (ja) * 2007-04-05 2008-10-23 Canon Inc 画像処理装置及びその制御方法及びプログラム
US7796804B2 (en) * 2007-07-20 2010-09-14 Kla-Tencor Corp. Methods for generating a standard reference die for use in a die to standard reference die inspection and methods for inspecting a wafer
US8126255B2 (en) 2007-09-20 2012-02-28 Kla-Tencor Corp. Systems and methods for creating persistent data for a wafer and for using persistent data for inspection-related functions
US8702566B2 (en) 2008-12-23 2014-04-22 Paul Mazzanobile Speed and/or agility training devices and systems and methods for use thereof
JP5275017B2 (ja) * 2008-12-25 2013-08-28 株式会社日立ハイテクノロジーズ 欠陥検査方法及びその装置
JP2010164487A (ja) * 2009-01-16 2010-07-29 Tokyo Seimitsu Co Ltd 欠陥検査装置及び欠陥検査方法
EP2394294B1 (en) * 2009-02-03 2014-04-02 Qcept Technologies Inc. Patterned wafer inspection system using a non-vibrating contact potential difference sensor
EP2396815A4 (en) * 2009-02-13 2012-11-28 Kla Tencor Corp DETECTION OF DEFECTS ON A WAFER
JP5767963B2 (ja) * 2011-12-28 2015-08-26 株式会社キーエンス 外観検査装置、外観検査方法及びコンピュータプログラム
JP2013160629A (ja) * 2012-02-06 2013-08-19 Hitachi High-Technologies Corp 欠陥検査方法、欠陥検査装置、プログラムおよび出力部
US9041793B2 (en) * 2012-05-17 2015-05-26 Fei Company Scanning microscope having an adaptive scan
US8977035B2 (en) * 2012-06-13 2015-03-10 Applied Materials Israel, Ltd. System, method and computer program product for detection of defects within inspection images
US9367911B2 (en) * 2012-06-13 2016-06-14 Applied Materials Israel, Ltd. Apparatus and method for defect detection including patch-to-patch comparisons
US9916653B2 (en) * 2012-06-27 2018-03-13 Kla-Tenor Corporation Detection of defects embedded in noise for inspection in semiconductor manufacturing
US9189844B2 (en) * 2012-10-15 2015-11-17 Kla-Tencor Corp. Detecting defects on a wafer using defect-specific information
US9222895B2 (en) 2013-02-25 2015-12-29 Kla-Tencor Corp. Generalized virtual inspector
WO2014165547A1 (en) * 2013-04-01 2014-10-09 Kla-Tencor Corporation Mesoscopic defect detection for reticle inspection
US10127652B2 (en) * 2014-02-06 2018-11-13 Kla-Tencor Corp. Defect detection and classification based on attributes determined from a standard reference image
US10290092B2 (en) * 2014-05-15 2019-05-14 Applied Materials Israel, Ltd System, a method and a computer program product for fitting based defect detection

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6539106B1 (en) * 1999-01-08 2003-03-25 Applied Materials, Inc. Feature-based defect detection
US20050010890A1 (en) * 2003-07-11 2005-01-13 Applied Materials Israel Ltd Design-based monitoring
US20090196490A1 (en) * 2008-02-06 2009-08-06 Fujitsu Microelectronics Limited Defect inspection method and defect inspection apparatus
US20150310600A1 (en) * 2011-12-21 2015-10-29 Applied Materials Israel Ltd. System, method and computer program product for classification within inspection images
US20140193065A1 (en) * 2013-01-09 2014-07-10 Kla-Tencor Corporation Detecting Defects on a Wafer Using Template Image Matching

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019236624A1 (en) * 2018-06-06 2019-12-12 Kla-Tencor Corporation Cross layer common-unique analysis for nuisance filtering
US11151711B2 (en) 2018-06-06 2021-10-19 Kla-Tencor Corporation Cross layer common-unique analysis for nuisance filtering
JP7458767B2 (ja) 2019-01-09 2024-04-01 ザ・ボーイング・カンパニー リアルタイム付加製造プロセス検査

Also Published As

Publication number Publication date
TWI699521B (zh) 2020-07-21
IL258804B (en) 2021-04-29
SG11201803667RA (en) 2018-06-28
JP6873129B2 (ja) 2021-05-19
US20170140516A1 (en) 2017-05-18
CN108352063B (zh) 2022-03-25
JP2019505089A (ja) 2019-02-21
KR102445535B1 (ko) 2022-09-20
US10535131B2 (en) 2020-01-14
CN108352063A (zh) 2018-07-31
KR20180071404A (ko) 2018-06-27
IL258804A (en) 2018-06-28
TW201728892A (zh) 2017-08-16

Similar Documents

Publication Publication Date Title
US10535131B2 (en) Systems and methods for region-adaptive defect detection
US10018571B2 (en) System and method for dynamic care area generation on an inspection tool
JP7169402B2 (ja) 検査ツールへのダイナミックケアエリア生成システムおよび方法
KR101600209B1 (ko) 영역 결정 장치, 검사 장치, 영역 결정 방법 및 영역 결정 방법을 사용한 검사 방법
US11010886B2 (en) Systems and methods for automatic correction of drift between inspection and design for massive pattern searching
WO2019245830A1 (en) Correlating sem and optical images for wafer noise nuisance identification
US6642726B2 (en) Apparatus and methods for reliable and efficient detection of voltage contrast defects
KR20170143485A (ko) 논리 칩 내의 전압 콘트라스트 기반 장애 및 결함 추론
EP4367632A1 (en) Method and system for anomaly-based defect inspection
CN109314067B (zh) 在逻辑及热点检验中使用z层上下文来改善灵敏度及抑制干扰的系统及方法
TWI683997B (zh) 用於在檢測工具上之動態看護區域產生的系統及方法
EP4152096A1 (en) System and method for inspection by failure mechanism classification and identification in a charged particle system
WO2023156125A1 (en) Systems and methods for defect location binning in charged-particle systems
KR20240113497A (ko) 하전 입자 시스템에서 결함 검출 및 결함 위치 식별을 위한 시스템 및 방법
KR20220153067A (ko) 웨이퍼 검사를 위한 기준 데이터 처리

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16867075

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 258804

Country of ref document: IL

WWE Wipo information: entry into national phase

Ref document number: 11201803667R

Country of ref document: SG

WWE Wipo information: entry into national phase

Ref document number: 2018525650

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 20187017023

Country of ref document: KR

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 16867075

Country of ref document: EP

Kind code of ref document: A1