CN116368377A - Multi-view wafer analysis


Info

Publication number
CN116368377A
Authority
CN
China
Prior art keywords
sub-region, regions, scan data, covariance
Legal status
Pending
Application number
CN202180067312.2A
Other languages
Chinese (zh)
Inventor
H·费尔德曼
E·内斯坦
H·伊兰
S·阿拉德
I·阿尔莫格
O·戈拉尼
Current Assignee
Applied Materials Israel Ltd
Original Assignee
Applied Materials Israel Ltd
Priority claimed from US 17/010,746 (US 11,815,470 B2)
Application filed by Applied Materials Israel Ltd
Publication of CN116368377A

Classifications

    • G01N 21/9501: Investigating the presence of flaws or contamination; semiconductor wafers
    • G01N 21/8806: Specially adapted optical and illumination features
    • G01N 21/8851: Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N 21/956: Inspecting patterns on the surface of objects
    • G01N 2021/8854: Grading and classifying of flaws
    • G01N 2021/8887: Scan or image signal processing based on image processing techniques
    • G06T 7/0004: Image analysis; industrial image inspection
    • G06T 2207/10141: Image acquisition modality; special mode during image acquisition
    • G06T 2207/20021: Dividing image into blocks, subimages or windows
    • G06T 2207/20224: Image combination; image subtraction
    • G06T 2207/30148: Subject of image; semiconductor; IC; wafer

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Biochemistry (AREA)
  • Pathology (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Immunology (AREA)
  • Analytical Chemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Testing Or Measuring Of Semiconductors Or The Like (AREA)

Abstract

Disclosed herein is a method for detecting defects on a sample. The method includes obtaining scan data of a region of the sample in multiple view angles, and performing an integrated analysis of the obtained scan data. The integrated analysis includes calculating cross-view covariance based on the obtained scan data and/or estimating the cross-view covariance, and determining the presence of defects in the region taking the cross-view covariance into account.

Description

Multi-view wafer analysis
Technical Field
The present disclosure relates generally to wafer analysis.
Background
As design rules shrink, wafer analysis tools are correspondingly required to detect smaller and smaller defects. Previously, defect detection was limited mainly by laser power and detector noise. Currently, state-of-the-art wafer analysis tools are limited primarily by wafer noise due to diffuse reflection from the wafer surface: surface irregularities on the wafer, arising from the roughness of the etched pattern, often appear as bright spots (specks) in the scanned image. These bright spots may closely resemble the "fingerprint" (signature) of a defect. Accordingly, there is a need for improved techniques for distinguishing defects from wafer noise.
Disclosure of Invention
Aspects of the present disclosure, in accordance with some embodiments thereof, relate to methods and systems for wafer analysis. More particularly, but not exclusively, aspects of the present disclosure according to some embodiments thereof relate to methods and systems for multi-perspective wafer analysis in which measurement data from multiple angles of view are subjected to an integrated analysis.
Thus, according to an aspect of some embodiments, there is provided a method for detecting defects on a sample (e.g., a wafer or photomask). The method comprises the following steps:
-obtaining scan data of a first region (e.g., a region on a surface) of the sample in multiple view angles.
-performing an integrated analysis of the obtained scan data. The integrated analysis includes:
■ Calculating cross-view covariance (i.e., covariance between different view angles) based on the obtained scan data, and/or estimating the cross-view covariance.
■ Determining the presence of defects in the first region taking the cross-view covariance into account.
According to some embodiments of the method, the sample is a patterned wafer.
According to some embodiments of the method, the sample is a bare wafer.
According to some embodiments of the method, multiple perspectives include two or more of the following: one or more angles of incidence of the one or more illuminating beams, one or more collection angles of the one or more collected beams, at least one intensity of the one or more illuminating beams, at least one intensity of the one or more collected beams, and compatible combinations thereof.
According to some embodiments of the method, the method is optical-based, scanning electron microscopy-based and/or atomic force microscopy-based.
According to some embodiments of the method, the method is optically based and the multiple view angles include two or more of: one or more illumination angles, intensity of the illumination radiation, illumination polarization, illumination wavefront, illumination spectrum, one or more focus offsets of the illumination beam, one or more collection angles, intensity of the collected radiation, collection polarization, phase of one or more collected beams, a bright field channel, a gray field channel, Fourier filtering of the returned light, and a sensing type selected from intensity, phase, or polarization, as well as compatible combinations thereof.
According to some embodiments of the method, the integrated analysis comprises:
-generating, for each of a plurality of sub-regions of the first region, a difference value in each of the multiple views based on the obtained scan data and corresponding reference data of the first region in each of the multiple views. (i.e., a set of difference values is generated, wherein each difference value in the set corresponds to a different viewing angle.)
-determining whether each of the plurality of sub-regions is defective based at least on the difference values corresponding to the sub-region and to sub-regions adjacent thereto, and on noise values (i.e., a set of noise values) corresponding to the sub-region and the adjacent sub-regions. The noise values include corresponding covariances from the cross-view covariance.
According to some embodiments of the method, the method further comprises the step of generating a difference image of the first region in each of the multiple views based on the obtained scan data and reference data. The difference value corresponding to each sub-region from the plurality of sub-regions is derived from and/or characterizes the sub-image of the difference image corresponding to that sub-region. (Thus, given N difference images, N sub-images (i.e., a set of N sub-images) correspond to each sub-region; more specifically, N sub-images, one for each of the N difference images, and N corresponding difference values correspond to each sub-region.)
According to some embodiments of the method, the noise value is calculated based at least on the difference value.
According to some embodiments of the method, the step of determining whether each of the plurality of sub-areas is defective comprises:
-generating a covariance matrix comprising noise values corresponding to sub-regions and sub-regions adjacent to the sub-regions.
-multiplying a first vector comprising difference values corresponding to the sub-region and the adjacent sub-region by the inverse of the covariance matrix to obtain a second vector.
-calculating a scalar product of the second vector and a third vector, the components of the third vector comprising values characterizing the one or more defects.
-marking the sub-region as defective if the scalar product is greater than a predetermined threshold.
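For illustration only, the decision rule recited above can be sketched in a few lines of NumPy. The names d (the first vector of difference values), C (the covariance matrix of noise values), k (the third vector), and B (the threshold) follow the notation above; the function name itself is an assumption and not part of the disclosure.

```python
import numpy as np

def is_defective(d: np.ndarray, C: np.ndarray, k: np.ndarray, B: float) -> bool:
    """Mark a sub-region defective if the scalar product k . (C^-1 d) exceeds threshold B."""
    u = np.linalg.solve(C, d)  # second vector: inverse of the covariance matrix applied to the difference values
    score = float(k @ u)       # scalar product with the defect-characterizing third vector
    return score > B
```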
According to some embodiments of the method, at least one of the plurality of sub-regions has a size corresponding to a single (image) pixel.
According to some embodiments of the method, the cross-view covariance is estimated based at least on scan data obtained in a preliminary scan of the sample, in which regions of the sample (e.g., regions on a surface) are sampled. Each sampled region represents a group of regions of the sample, and at least one of the sampled regions represents the first region.
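One plausible way of estimating the cross-view covariance from such a preliminary, sampled scan is to compute the empirical covariance of the per-view difference values collected at the sampled locations. The sketch below is an illustration under that assumption; the function and argument names are not part of the disclosure.

```python
import numpy as np

def estimate_cross_view_covariance(sampled_diffs: np.ndarray) -> np.ndarray:
    """
    sampled_diffs: shape (num_sampled_locations, num_views); each row holds the
    difference values of one sampled sub-region across the multiple view angles.
    Returns the num_views x num_views empirical covariance matrix, whose
    off-diagonal entries are the cross-view covariances.
    """
    # rowvar=False: each column is one view angle (a variable), each row an observation.
    return np.cov(sampled_diffs, rowvar=False)
```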
According to some embodiments of the method, the method further comprises the steps of: when the presence of a defect is determined, it is determined whether the defect is a defect of interest, and optionally, the defect is classified when the defect is determined to be of interest.
According to some embodiments of the method, the method is repeated for each of a plurality of additional regions in order to scan a larger region of the sample (e.g., a larger region on the surface of the sample) formed by the first region and the additional regions.
According to an aspect of some embodiments, a computerized system for obtaining and analyzing multi-view scan data of a sample (e.g., a wafer or photomask) is provided. The computerized system is configured to implement the above-described method.
According to an aspect of some embodiments, there is provided a non-transitory computer-readable storage medium storing instructions that cause a computerized analysis system (e.g., a wafer analysis system) to implement the above-described method.
According to an aspect of some embodiments, a computerized system for obtaining and analyzing multi-view scan data of a sample is provided. The system comprises:
-a scanning device configured to scan a region of the sample (e.g. a region on a surface) in multiple perspectives.
-a scan data analysis module (comprising one or more processors and memory components) configured to perform an integrated analysis of scan data obtained in a scan, wherein the integrated analysis comprises:
■ The cross-view covariance is calculated based on the obtained scan data and/or estimated.
■ The presence of defects in the region is determined taking into account cross-view covariance.
According to some embodiments of the system, the system is configured for analyzing scan data of a patterned wafer.
According to some embodiments of the system, the system is configured for analyzing scan data of a bare wafer.
According to some embodiments of the system, multiple perspectives include two or more of the following: one or more angles of incidence of the one or more illuminating beams, one or more collection angles of the one or more collected beams, at least one intensity of the one or more illuminating beams, and at least one intensity of the one or more collected beams.
According to some embodiments of the system, the scanning device comprises an optically-based imager.
According to some embodiments of the system, the scanning device comprises a scanning electron microscope.
According to some embodiments of the system, the scanning device comprises an atomic force microscope.
According to some embodiments of the system, multiple perspectives include two or more of the following: one or more illumination angles, intensity of illumination radiation, illumination polarization, illumination wavefront, illumination spectrum, one or more focus shifts of the illumination beam, one or more collection angles, intensity of collected radiation, collection polarization, phase of one or more collected beams, bright field channel, gray field channel, fourier filtering of returned light, and a sensing type selected from intensity, phase, or polarization, and compatible combinations thereof.
According to some embodiments of the system, the integrated analysis comprises:
-generating, for each of a plurality of sub-regions of the first region, a difference value in each of the multiple views based on the obtained scan data and corresponding reference data of the first region in each of the multiple views.
-determining whether each of the plurality of sub-regions is defective based at least on the difference values corresponding to the sub-region and to sub-regions adjacent thereto, and on noise values corresponding to the sub-region and the adjacent sub-regions. The noise values include corresponding covariances from the cross-view covariance.
According to some embodiments of the system, the scan data analysis module is further configured to: a difference image of the first region in each of the multiple perspectives is generated based on the obtained scan data and reference data, wherein a difference value corresponding to each sub-region from the plurality of sub-regions is derived from and/or characterizes a sub-image of the difference image corresponding to the sub-region.
According to some embodiments of the system, the scan data analysis module is configured to calculate the noise value based at least on the difference value.
According to some embodiments of the system, the step of determining whether each of the plurality of sub-areas is defective comprises:
-generating a covariance matrix comprising noise values corresponding to sub-regions and sub-regions adjacent to the sub-regions.
-multiplying a first vector comprising difference values corresponding to the sub-region and the adjacent sub-region by the inverse of the covariance matrix to obtain a second vector.
-calculating a scalar product of the second vector and a third vector, the components of the third vector comprising values characterizing the one or more defects.
-if the scalar product is greater than a predetermined threshold, marking the sub-region as defective.
According to some embodiments of the system, at least one of the plurality of sub-regions has a size corresponding to a single (image) pixel.
According to some embodiments of the system, the scan data analysis module is configured to estimate the cross-view covariance based at least on scan data obtained in a preliminary scan of the sample, in which regions of the sample (e.g., regions on a surface) are sampled. Each sampled region represents a group of regions of the sample, and at least one of the sampled regions represents the first region.
According to some embodiments of the system, the scan data analysis module is further configured to: after determining the presence of a defect, further determine whether the defect is a defect of interest, and, optionally, classify the defect when it is determined to be of interest.
According to some embodiments of the system, the system is further configured to repeat the scanning and integration analysis with respect to each of the plurality of additional regions in order to scan a larger region of the sample (e.g., a larger region on the surface) formed by the first region and the additional regions.
According to an aspect of some embodiments, there is provided a non-transitory computer-readable storage medium storing instructions that cause a computerized analysis system (e.g., a wafer analysis system) to:
-scanning a region (e.g., a region on a surface) of a sample (e.g., a wafer or a photomask) in multiple view angles.
-performing an integrated analysis of the scan data obtained in the scan, the integrated analysis comprising:
■ The cross-view covariance is calculated based on the obtained scan data and/or estimated.
■ The presence of defects in the region is determined taking into account cross-view covariance.
According to some embodiments of the storage medium, the sample is a patterned wafer.
According to some embodiments of the storage medium, the sample is a bare wafer.
According to some embodiments of the storage medium, multiple perspectives include two or more of the following: one or more angles of incidence of the one or more illuminating beams, one or more collection angles of the one or more collected beams, at least one intensity of the one or more illuminating beams, and at least one intensity of the one or more collected beams.
According to some embodiments of the storage medium, the computerized analysis system is optically based.
According to some embodiments of the storage medium, the computerized analysis system is scanning electron microscopy-based or atomic force microscopy-based.
According to some embodiments of the storage medium, multiple perspectives include two or more of the following: one or more illumination angles, intensity of illumination radiation, illumination polarization, illumination wavefront, illumination spectrum, one or more focus shifts of the illumination beam, one or more collection angles, intensity of collected radiation, collection polarization, phase of one or more collected beams, bright field channel, gray field channel, fourier filtering of returned light, and a sensing type selected from intensity, phase, or polarization, and compatible combinations thereof.
According to some embodiments of the storage medium, the integrated analysis comprises:
-generating, for each of a plurality of sub-regions of the first region, a difference value in each of the multiple views based on the obtained scan data and corresponding reference data of the first region in each of the multiple views.
-determining whether each of the plurality of sub-regions is defective based at least on the difference values corresponding to the sub-region and to sub-regions adjacent thereto, and on noise values corresponding to the sub-region and the adjacent sub-regions. The noise values include corresponding covariances from the cross-view covariance.
According to some embodiments of the storage medium, the stored instructions cause a scan data analysis module of the computerized system to: a difference image of the first region in each of the multiple perspectives is generated based on the obtained scan data and reference data, wherein a difference value corresponding to each sub-region from the plurality of sub-regions is derived from and/or characterizes a sub-image of the difference image corresponding to the sub-region.
According to some embodiments of the storage medium, the stored instructions cause the scan data analysis module to calculate the noise values based at least on the difference values.
According to some embodiments of the storage medium, the step of determining whether each of the plurality of sub-areas is defective comprises:
-generating a covariance matrix comprising noise values corresponding to sub-regions and sub-regions adjacent to the sub-regions.
-multiplying a first vector comprising difference values corresponding to the sub-region and the adjacent sub-region by the inverse of the covariance matrix to obtain a second vector.
-calculating a scalar product of the second vector and a third vector, the components of the third vector comprising values characterizing the one or more defects.
-if the scalar product is greater than a predetermined threshold, marking the sub-region as defective.
According to some embodiments of the storage medium, at least one of the plurality of sub-regions has a size corresponding to a single (image) pixel.
According to some embodiments of the storage medium, the stored instructions cause the scan data analysis module to estimate the cross-view covariance based at least on scan data obtained in a preliminary scan of the sample, in which regions of the sample (e.g., regions on a surface) are sampled. Each sampled region represents a group of regions of the sample, and at least one of the sampled regions represents the first region.
Some embodiments of the present disclosure may include some, all, or none of the above advantages. One or more other technical advantages may be readily apparent to one skilled in the art from the figures, descriptions, and claims included herein. Moreover, while specific advantages have been enumerated above, various embodiments may include all, some, or none of the enumerated advantages.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. If there is a conflict, the patent specification, including definitions, will control. The indefinite articles "a" and "an" mean "at least one" or "one or more", as used herein, unless the context clearly dictates otherwise.
Unless specifically stated otherwise, as apparent from the present disclosure, it is appreciated that in accordance with some embodiments, terms such as "processing," "computing," "determining," "estimating," "evaluating," "metering," or the like, may refer to actions and/or processes of a computer or computing system or similar electronic computing device that manipulate and/or transform data represented as physical (e.g., electronic) quantities within the computing system's registers and/or memories into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, information transmission or information display devices.
Embodiments of the present disclosure may include apparatuses for performing the operations herein. The apparatus may be specially constructed for the desired purposes, or may comprise one or more general-purpose computers selectively activated or reconfigured by a computer program stored in the computer(s). Such a computer program may be stored in a computer-readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), electrically programmable read-only memories (EPROMs), electrically erasable and programmable read-only memories (EEPROMs), magnetic or optical cards, or any other type of media suitable for storing electronic instructions and capable of being coupled to a computer system bus.
The processes and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the desired method(s). The desired configuration(s) of various of these systems will be apparent from the description below. In addition, embodiments of the present disclosure are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the disclosure as described herein.
Aspects of the disclosure may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The disclosed embodiments may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
Drawings
Some embodiments of the present disclosure are described herein with reference to the accompanying drawings. The description and the drawings enable one of ordinary skill in the art to understand how to practice some embodiments. The drawings are for illustrative purposes only and are not intended to show structural details necessary for a basic understanding of the embodiments of the present disclosure. For clarity, some of the objects depicted in the drawings are not drawn to scale. Moreover, two different objects in the same drawing may be drawn to different scales. In particular, the scale of some objects may be greatly exaggerated compared to other objects in the same drawing.
In the drawings:
FIG. 1 is a flow chart of a method for multi-view wafer analysis according to some embodiments;
FIG. 2 is a flowchart of the operation of integrated analysis of multi-view scan data according to some embodiments of the method of FIG. 1;
FIG. 3 is a flowchart of sub-operations for identifying (detecting) defects in a scanned area of a wafer, in accordance with some embodiments of the operations of FIG. 2;
FIGS. 4A-4G present algebraic representations for use in the calculations included in the sub-operations of FIG. 3, in accordance with some embodiments;
FIGS. 5A and 5B present two different ways of enumerating sub-images, according to some embodiments;
FIG. 6 presents a block diagram of a computerized system for obtaining and analyzing multi-view scan data of a wafer (which is also depicted), in accordance with some embodiments;
FIG. 7A schematically depicts a computerized system for obtaining and analyzing multi-view scan data of a wafer (which is also depicted), the computerized system depicted being a particular embodiment of the computerized system of FIG. 6;
FIG. 7B schematically depicts a mirror reflection of light off the wafer of FIG. 7A, in accordance with some embodiments;
FIG. 8 schematically depicts a computerized system for obtaining and analyzing multi-view scan data of a wafer (which is also depicted), the computerized system depicted being a particular embodiment of the computerized system of FIG. 6;
FIG. 9 schematically depicts a computerized system for obtaining and analyzing multi-view scan data of a wafer (which is also depicted), the computerized system depicted being a particular embodiment of the computerized system of FIG. 6; and
fig. 10A-10C present simulation results showing the efficacy of the method of fig. 1.
Detailed Description
The principles, uses and embodiments of the teachings herein may be better understood with reference to the accompanying description and the drawings. Those of ordinary skill in the art, with the benefit of the description and drawings presented herein, will be able to implement the teachings herein without undue effort or experimentation. In the drawings, like reference numerals refer to like parts throughout.
In the description and claims of this application, the words "include" and "have" and their forms are not limited to the members of the list that may be associated with the words.
As used herein, the term "about" may be used to designate a value of a quantity or parameter (e.g., length of an element) as being within a continuous range of values that are about (and including) the given (clarified) value. According to some embodiments, "about" may designate the value of the parameter as between 80% and 120% of the given value. For example, stating "the length of an element is equal to about 1m" is equivalent to stating "the length of an element is between 0.8m and 1.2 m. According to some embodiments, "about" may designate the value of the parameter as being between 90% and 110% of the given value. According to some embodiments, "about" may designate the value of the parameter as being between 95% and 105% of the given value.
As used herein, the terms "substantially" and "about" may be interchangeable according to some embodiments.
Referring to the drawings, in the flow chart, optional operations may occur within blocks delineated by dashed lines.
As used herein, the term "multi-view wafer analysis" is used to refer to wafer analysis that employs scan data from multiple views. For example, the different viewing angles may differ from one another in terms of polarization, collection pupil segments, phase information, focus offset, and the like. The additional information provided by multiple views (as compared to a single view) may be used to more efficiently cope with wafer noise, among other things. Scan data from several perspectives can produce a predictable or self-learning pattern that can be distinguished from wafer noise, thus resulting in improved defect detection rates.
As used herein, the terms "identify" and "detect" and their derivatives, as employed with reference to, for example, defects on a wafer, may be used interchangeably in accordance with some embodiments.
As used herein, the term "sample" may refer to a wafer or photomask according to some embodiments. The wafer may be patterned or bare.
Method
According to an aspect of some embodiments, a computer-implemented method for wafer analysis is provided, wherein scan data from multiple perspectives is subjected to an integrated analysis (as defined and explained below). Fig. 1 presents a flow chart of such a method (method 100) according to some embodiments.
According to some embodiments, the method 100 includes an operation 110, in which scan data of a region (area) of the wafer is obtained in multiple view angles. More specifically, in operation 110, a plurality of images (e.g., image frames) of the scanned region of the wafer (e.g., a slice segment corresponding to the image frames) may be obtained in multiple view angles. As described in detail below, the plurality of images may be obtained using a scanning device configured to scan the wafer in multiple view angles. In particular, the scanning device may comprise an imager (imaging module or unit) configured to irradiate (e.g., illuminate) a region of the wafer and to collect radiation returned from said region. According to some embodiments, the imager may be optically based (configured to illuminate a region of the wafer with electromagnetic radiation, such as visible and/or ultraviolet (UV) radiation). According to some embodiments, the UV radiation may be or may include deep UV radiation and/or extreme UV radiation. According to some embodiments, the imager may be configured to irradiate a region of the wafer with one or more charged particle beams (e.g., electron beams).
According to some embodiments, the imager may be configured to allow the wafer to be irradiated with multiple radiation beams simultaneously, thereby facilitating simultaneous scanning of multiple regions of the wafer.
In general, the view angles can be classified into two groups: illumination-channel view angles and collection-channel view angles. Broadly, the illumination channel determines one or more physical properties of the illumination beam incident on the wafer, such as the trajectory of the beam, the shape of the beam, and/or the polarization of the beam (when the beam is a light beam). In contrast, the collection channel includes a sensing type (intensity, polarization, phase) and a "filter", which is broadly used herein to refer to a mechanism (e.g., a segmented pupil, a Fourier filter, a polarizing beam splitter) that allows selective collection (and sensing) of components of the radiation returned from the wafer that are characterized by certain physical properties, such as the return (reflection, scattering) angle, intensity, and polarization (when the radiation is electromagnetic radiation).
According to some embodiments in which the imager is optically based, the multiple view angles may include two or more of the following: the illumination angle(s), i.e., the angle(s) of incidence of the illumination radiation; the illumination intensity (as determined by the amplitude of the illumination radiation); the illumination polarization (i.e., the polarization of the illumination radiation); the illumination wavefront (the shape of the wavefront of the illumination radiation, when monochromatic); the illumination spectrum (i.e., the spectrum of the illumination radiation); one or more focus offsets of the illumination beam (which may be slightly out of focus); the collection angle(s) (thereby allowing selective sensing of light returned at an angle or range of angles); the intensity of the collected radiation (thereby allowing selective sensing of light returned at an intensity or range of intensities); the collection polarization; the phase of the collected beam(s) (when the illumination beam(s) is/are monochromatic); the bright field channel; the gray field channel (which may be further subdivided into dark field and "pure" gray field); Fourier filtering of the returned light; the sensing type (e.g., amplitude, phase, and/or polarization); and compatible combinations of the above-listed items.
In particular, it is to be understood that a perspective may be characterized by more than one item from the above list. That is, combinations of items from the above list. For example, the viewing angle may be characterized by the angle at which the incident beam impinges on the wafer surface (i.e., the illumination angle) and the polarization of the incident beam (i.e., the illumination polarization). As another example, the viewing angle may be characterized by a collection angle and a collection phase (i.e., the phase of the collected light beam). Further, it is to be understood that the viewing angle may combine characteristics from both the illumination channel and the collection channel. For example, viewing angle may be characterized by illumination polarization and collection polarization. As another example, the viewing angle may be characterized by illumination angle and polarization, and collection intensity and phase.
Thus, the acquired images may differ from one another in at least one parameter selected from the list of view angles specified above.
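Since a view angle may thus combine several illumination-channel and collection-channel properties, it can be represented, for illustration purposes only, as a small record of those properties. The field names below are assumptions introduced here and are not terminology of the disclosure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class ViewAngle:
    """Illustrative descriptor of a single view angle (perspective)."""
    illumination_angle_deg: Optional[float] = None   # angle of incidence of the illuminating beam
    illumination_polarization: Optional[str] = None  # e.g. "s", "p", "circular"
    focus_offset_um: float = 0.0                     # slight defocus of the illumination beam
    collection_angle_deg: Optional[float] = None     # selected return angle (e.g. one pupil segment)
    collection_polarization: Optional[str] = None
    channel: str = "bright_field"                    # "bright_field" or "gray_field"
    sensing: str = "intensity"                       # "intensity", "phase" or "polarization"

# A view angle may combine illumination-channel and collection-channel properties:
view_a = ViewAngle(illumination_polarization="s", collection_polarization="p")
view_b = ViewAngle(illumination_angle_deg=30.0, collection_angle_deg=55.0, channel="gray_field")
```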
As used herein, according to some embodiments, when referring to a list that includes a sub-list of a plurality of items (e.g., elements or claim limitations) and at least one item that is not in the sub-list, terms such as "two or more of ..." and "at least two of ..." may refer to two elements of the sub-list only, to one element of the sub-list and one listed element that is not in the sub-list, to two elements that are not in the sub-list, and so on. For example, according to some embodiments in which the at least one illumination spectrum comprises two illumination spectra, the multiple view angles may consist of or include the two illumination spectra.
More generally, according to some embodiments, the reflected and/or scattered light may undergo fourier filtering before being detected. Fourier filtering can be used to increase the number of views and the amount of information available from them. According to some embodiments, multiple viewing angles may include slightly out-of-focus illumination.
According to some embodiments, for example when the illumination source is a laser, the illumination spectrum may be narrow. According to some embodiments, the illumination spectrum may be broad, for example when the illumination source is an incoherent light source such as a lamp. According to some embodiments, the at least one illumination spectrum comprises a plurality of illumination spectra. Each of the plurality of illumination spectra may be narrow (and optionally coherent, e.g., when the illumination light is coherent laser light) or broad.
According to some embodiments, multi-view scan data may be obtained from a bright field channel (i.e., bright field reflected light) and/or a gray field channel (i.e., gray field scattered light). As used herein, the term "gray field scattered light" is used broadly to refer to non-bright field reflected light according to some embodiments. In particular, according to some embodiments, the term "gray field scattered light" may also be used to refer to dark field scattered light.
According to some embodiments, images corresponding to different view angles may be obtained simultaneously or substantially simultaneously. According to some embodiments, images corresponding to different view angles may be obtained consecutively. According to some embodiments, some images corresponding to certain view angles may be obtained simultaneously or substantially simultaneously, while images corresponding to other view angles may be obtained at an earlier or later time.
According to some embodiments, the imager used to obtain the scan data in operation 110 may include a plurality of detectors. For example, the first detector may be configured to detect the intensity of the returned light beam, while the second detector may be configured to detect the polarization of the returned light beam.
According to some embodiments in which all view angles are acquired simultaneously, each detector may be assigned to a different view angle. Alternatively, a single detector may be employed according to some embodiments in which all view angles are acquired consecutively (sequentially). According to some embodiments in which some of the view angles are acquired simultaneously and some are acquired consecutively, at least some of the detectors may each be assigned to a subset of the multiple view angles, each subset comprising at least two of the view angles.
According to some embodiments, a segmented pupil may be employed to separate the returned radiation beams reaching the pupil according to the reflection or scattering angles of the sub-beams of the returned radiation beams from the wafer. Different detectors may be allocated to detect radiation from different pupil segments, one detector for each pupil segment, respectively, such that each pupil segment constitutes a different collection channel corresponding to a different collection angle (and a different viewing angle). (the detector may be positioned in a conjugate plane of the pupil plane in which the segmented pupil may be positioned.)
According to some embodiments, the method 100 includes an operation 120, in which the scan data obtained in operation 110 is subjected to an integration analysis to identify (detect) defects in the scanned region. As used herein, the term "integrated analysis" as used in connection with analysis of multi-view scan data (i.e., scan data of at least two different views) refers to analysis that utilizes scan data from multiple views in order to obtain improved defect detection rates. According to some embodiments, the integration analysis may take into account cross-view covariance, i.e., covariance between at least some of the different views.
Optionally, according to some embodiments, the method 100 may further include an operation 125, in which a determination is made as to whether an identified defect (i.e., a defect identified in operation 120) is of interest (or a nuisance). According to some such embodiments, defects determined to be of interest may be further classified. That is, operation 125 may determine the type of deformation that constitutes the defect. Some deformations may be specific to certain types of components (semiconductor devices) fabricated on the wafer, such as chips or other components, e.g., transistors. The classification may be based on measured or derived characteristics of the identified defect in the multiple view angles.
According to some embodiments, the method 100 further includes an operation 130, in which operations 110 and 120 (and optionally operation 125) may be repeated with respect to additional areas of the wafer (e.g., with respect to other slices). In particular, the additional regions may constitute one or more predefined larger regions (e.g., one or more dies) of the wafer to be scanned. Operations 110 and 120 (and optional operation 125) may be repeated until the wafer is completely scanned, according to some embodiments.
According to some embodiments in which method 100 includes both operations 125 and 130, the order of operations 125 and 130 may be reversed.
Fig. 2 presents a flowchart of operation 220, operation 220 being a particular embodiment of operation 120. According to some embodiments, operation 220 may include:
a sub-operation 220a, in which a set of difference images of the scan region is generated based on the obtained images (i.e. the plurality of images obtained in operation 110) and the corresponding reference data.
Each difference image (of the set of difference images) corresponds to one of the view angles (from the multiple view angles). Each difference image may be generated using one or more of the obtained images corresponding to that view angle and reference data (of the scanned region) corresponding to that view angle.
Sub-operation 220b, in which one or more difference values (also referred to as attributes) are calculated for each of a plurality of sub-images (e.g., pixels) of each difference image in the set. Sub-images corresponding to the same wafer sub-region of the scanned wafer region define a respective set of sub-images, such that each sub-image in the set corresponds to a different view angle (from the multiple view angles). (In particular, in sub-operation 220b, a respective difference value may be calculated for each sub-image in the set of sub-images corresponding to the same wafer sub-region, thereby producing a set of difference values corresponding to that wafer sub-region (and to that set of sub-images).)
A sub-operation 220c, in which each of the plurality of wafer sub-regions corresponding to the plurality of sub-images of sub-operation 220b may be determined to be defective (or not defective) based at least on the set of difference values and a corresponding set of noise values associated with that wafer sub-region (a sketch of sub-operations 220a and 220b is given after this list).
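For illustration, and under the simplifying assumptions that each view angle yields one registered image and one reference image of the scanned region, that the difference image is obtained by plain subtraction (the disclosure allows other combinations), and that each sub-image is a single pixel, sub-operations 220a and 220b might be sketched as follows; all names are assumptions.

```python
import numpy as np

def difference_images(scanned: list[np.ndarray], references: list[np.ndarray]) -> np.ndarray:
    """Sub-operation 220a (simplest variant): one difference image per view angle by plain subtraction."""
    return np.stack([s.astype(float) - r.astype(float) for s, r in zip(scanned, references)])

def per_pixel_difference_sets(diff_images: np.ndarray) -> np.ndarray:
    """
    Sub-operation 220b for single-pixel sub-images: diff_images has shape
    (num_views, H, W); the result has shape (H, W, num_views), i.e. for each
    wafer sub-region (pixel) the set of difference values across the view angles.
    """
    return np.moveaxis(diff_images, 0, -1)
```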
As used herein, a sub-region (e.g., whose size corresponds to a pixel or small group of pixels) is referred to as "defective" when it includes a defect or a portion of a defect, according to some embodiments.
As used herein, the term difference value(s) with respect to a sub-image and pixel value(s) with respect to the same sub-image may be used interchangeably, according to some embodiments, when the sub-image is a pixel.
According to some embodiments, the reference data may include reference images that have been obtained, for example, in an earlier scan of the wafer or in a scan of a wafer manufactured to the same design, or reference images generated based on design data of the wafer, such as CAD data.
As used herein, the term "difference image" is to be understood in a broad sense and may refer to any image obtained by combining at least two images, such as a first image (e.g., an image of a scanned area of a wafer or an image obtained from multiple images of the scanned area) and a second image (e.g., a reference image derived from reference data associated with the scanned area). The combination of two images may involve any manipulation of the two images that results in at least one "difference image" that may reveal a change (difference) between the two images, or more generally may distinguish (distinguish) between the two images (when a difference exists). In particular, it is to be understood that the term "combining" with respect to two images may be used more broadly than subtracting one image from another image, and encompasses other mathematical operations that may be implemented in addition to or instead of subtraction. Further, it is to be understood that one or both of the two images may be manipulated (i.e., preprocessed) separately before combining the two images to obtain the difference image. For example, the first image may be registered with respect to the second image.
As used herein, the term "reference data" should be construed broadly to encompass any data indicative of the physical design of the (patterned) wafer and/or data derived from the physical design (e.g., derived by simulation). According to some embodiments, the "reference data" may include or consist of "design data" of the wafer (such as, for example, CAD data in various formats).
Additionally or alternatively, "reference data" may include or consist of data obtained by scanning the wafer, for example, completely or partially during recipe set-up or even during run-time (run). For example, a scan of one die or multiple dies having the same architecture during runtime may be used as reference data for another die of the same architecture. Further, a first wafer manufactured to a certain design may be scanned during recipe set-up, and the resulting scanned data may be processed to generate reference data or additional reference data for subsequently manufactured wafers of the same design (same design as the first wafer). Such "self-generated" reference data is necessary when design data is not available, but may be beneficial even when design data is available.
More generally, it is understood that the term "difference image" may refer to any set of derived values obtained by jointly manipulating the following two sets of values: a first set of values (obtained during scanning) and a second set of values (reference values obtained from reference data) such that each derived value in the set corresponds to a sub-region (e.g., pixel) of a scanned area on the wafer. Joint manipulation may involve any mathematical operation performed on two sets of values such that the (resulting) derived set of values may reveal a difference (if any) between the two sets of values or, more generally, may distinguish between the two sets of values. (the mathematical operations may or may not include subtraction.) in particular, joint manipulation is not limited to manipulation of corresponding pairs of values. That is, each (difference) value in the set of difference values may result from joint manipulation of multiple values in the first set and multiple values in the second set.
According to some embodiments, the set of difference values associated with a sub-image may also comprise scan data associated with neighboring sub-images, or data generated based on scan data associated with neighboring sub-images. For example, according to some embodiments in which each sub-image is a pixel, the set of pixel values (e.g., intensity values) corresponding to the pixel may also include the pixel values of neighboring pixels. As used herein, according to some embodiments, two sub-images (of a given image, e.g., a difference image) may be referred to as "neighbors" when they are "nearest neighbors", that is, when the two sub-images are adjacent to each other in the sense that there are no other sub-images between them. According to some embodiments, two pixels may be said to be "neighbors" not only when they are nearest neighbors, but also when they are separated from each other by at most one pixel, at most two pixels, at most three pixels, at most five pixels, or even at most ten pixels. Each possibility corresponds to a different embodiment.
According to some embodiments, the set of difference values associated (corresponding) with the first sub-image also comprises scan data associated with the neighboring sub-image such that the first sub-image is centrally positioned with respect to the neighboring sub-image.
According to some embodiments in which the sub-images are pixels, the set of difference values associated with a first pixel also includes scan data associated with adjacent pixels, such that the first pixel and the adjacent pixels constitute a block of m × n pixels, where 3 ≤ m ≤ 11 and 3 ≤ n ≤ 11. Larger values of n and m are also possible and may be desirable, for example when the size of the defect, or the correlation length of the noise, is large. According to some such embodiments, the first pixel may be positioned at the center of the block. In particular, when the size of a suspected defect is larger than the first pixel (i.e., when the first pixel may delineate only a portion of the suspected defect), n and m may be selected such that the block (formed by the first pixel and the adjacent pixels) entirely delineates the suspected defect.
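Purely as an illustration of the m × n block described above (assuming the block lies fully inside the difference image and that pixels are indexed by row and column; the function name is an assumption):

```python
import numpy as np

def neighborhood_block(diff_image: np.ndarray, row: int, col: int, m: int = 5, n: int = 5) -> np.ndarray:
    """
    Return the m x n block of difference values centered on the first pixel at
    (row, col) of a single-view difference image (m, n odd, 3 <= m, n <= 11).
    Assumes the block does not cross the image border (no border handling).
    """
    hm, hn = m // 2, n // 2
    return diff_image[row - hm: row + hm + 1, col - hn: col + hn + 1]
```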
According to some embodiments, sub-operation 220c may include calculating a set of noise values. According to some embodiments, the set of noise values may be calculated based on a set of difference values corresponding to the sub-regions.
According to some embodiments, the method 100 may include a preliminary scanning operation in which the wafer is partially scanned. More specifically, the wafer may be "sampled" in the sense that a sample area of the wafer is scanned. Each region in the sample (i.e., each region from the sampling region) represents a wafer region characterized by a certain architecture, component type(s), and so forth.
According to some embodiments, in order to reduce computational load and speed up wafer analysis, certain computational operations may be performed only with respect to the preliminary scan data. For example, one or more dies from a group of dies manufactured to the same design may be sampled (in the preliminary scanning operation). The scan data obtained from a region within a sampled die may later (e.g., in sub-operation 220c) be used with respect to the corresponding region of a non-sampled die. In particular, according to some such embodiments, a set of noise values corresponding to the sampled region may be calculated and stored in memory (i.e., prior to operation 110). The set of noise values may later be used as part of the determination, in sub-operation 220c, of whether the scanned region includes a defect.
According to some embodiments, operation 120 may additionally include a sub-operation in which images (of the same region) that are associated with different view angles and that have been obtained at different times, in particular times differing by more than the typical time scale of high-frequency physical effects affecting the wafer analysis system (used to inspect the wafer) and/or the wafer, are registered with each other. Such "view-to-view" registration may be implemented, for example, prior to sub-operation 220a (i.e., in embodiments implementing operation 120 in accordance with FIG. 2), in addition to standard die-to-die registration and/or cell-to-cell registration. According to some embodiments, for example where different images associated with different view angles are known to be offset from each other by a sub-pixel, one pixel, or even up to ten pixels, an alignment protocol may be employed instead. This may advantageously avoid the need to apply a registration protocol, which is relatively more cumbersome.
According to some embodiments, images of the same scanned region in different view angles may be registered with each other prior to sub-operation 220a. The registration may be implemented using scan data obtained from a common channel (which does not change when switching between view angles). According to some such embodiments, the multi-view scan data is obtained from a bright field channel, while a gray field channel is used to register the images to each other. Alternatively, according to some embodiments, the multi-view scan data is obtained from gray field channels, while a bright field channel is used to register the images to each other. According to some embodiments, at least two view angles are always acquired at a time, with one view angle common to all acquisitions.
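As an illustrative sketch only (not taken from the disclosure), such common-channel registration may be approximated by estimating an integer pixel shift from the common-channel images via FFT-based cross-correlation and applying that shift to the view-specific images; sub-pixel refinement, windowing, and border handling are omitted, and all names are assumptions.

```python
import numpy as np

def estimate_shift(common_ref: np.ndarray, common_mov: np.ndarray) -> tuple[int, int]:
    """Integer (dy, dx) that, applied to the 'moving' acquisition, aligns it to the
    reference, estimated by cross-correlating the common-channel images."""
    a = common_ref - common_ref.mean()
    b = common_mov - common_mov.mean()
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > a.shape[0] // 2:  # wrap large positive shifts to negative ones
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return int(dy), int(dx)

def apply_shift(view_image: np.ndarray, dy: int, dx: int) -> np.ndarray:
    """Apply the estimated shift to a view-specific image (circular shift, for simplicity)."""
    return np.roll(np.roll(view_image, dy, axis=0), dx, axis=1)
```

For example, the shift estimated from the common channel of two acquisitions, `estimate_shift(common_view_1, common_view_2)`, would be applied to the view-specific images of the second acquisition before sub-operation 220a.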
FIG. 3 presents a flowchart of sub-operation 320c, which is a particular embodiment of sub-operation 220c. According to some embodiments, sub-operation 320c may include the calculation of a covariance matrix (which constitutes the set of noise values). According to some embodiments, the calculation of the covariance matrix may be based on the corresponding set of difference values calculated in sub-operation 220b and/or on scan data obtained in a preliminary scan of the wafer. The entries in the off-diagonal blocks of the covariance matrix are cross-view covariances (both cross-view covariances between sub-images corresponding to different (adjacent) sub-regions and cross-view covariances between sub-images corresponding to the same sub-region). According to some such embodiments, determining whether the sub-region includes a defect (or a portion of a defect) in sub-operation 220c may include:
Sub-operation 320c1: multiplying the first vector v (the components of which comprise the difference values of the set of difference values corresponding to the sub-region) by the inverse of the corresponding covariance matrix C, to obtain a second vector u. (Note that the set of difference values corresponding to a sub-region also includes difference values associated with adjacent sub-regions.)
Sub-operation 320c2: taking the scalar product of the second vector u and a third vector k, e.g., a predetermined kernel corresponding to the sub-region. The components of the third vector k may characterize a signature of the particular type of defect(s) that the sub-region is suspected to at least partially include, as that signature would appear in a difference image obtained (ideally) substantially without wafer noise.
Sub-operation 320c3: checking whether the scalar product exceeds a predetermined threshold B; if so, the sub-region is marked as comprising a defect (or a part of a defect).
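A minimal sketch of the three sub-operations, in Python with NumPy (the function name and argument layout are illustrative assumptions), could read:

```python
import numpy as np

def is_defective(v, C, k, B):
    """Decide whether a sub-region is defective.

    v: first vector of difference values for the sub-region and its
       neighbors, stacked over all views, shape (d,)
    C: covariance matrix of the noise affecting v, shape (d, d)
    k: predetermined kernel (noise-free defect signature), shape (d,)
    B: predetermined threshold
    """
    # Sub-operation 320c1: u = C^-1 v (solve a linear system rather than
    # explicitly inverting C)
    u = np.linalg.solve(C, v)
    # Sub-operation 320c2: scalar product of the kernel and u
    s = float(np.dot(k, u))
    # Sub-operation 320c3: threshold test
    return s > B
```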
FIGS. 4A-4G present algebraic expressions used in the computations involved in the sub-operations of FIG. 3, according to some embodiments. A first vector v, for the case where the number of sub-images is n and the number of views is m, is shown in FIG. 4A. v thus comprises n×m components. (Note that each of the vectors v, u, and k is defined as a column vector.) Each component of v may be labeled by a pair of indices i and j, where the index i = 1, 2, …, n labels the sub-image (e.g., pixel) and the index j = 1, 2, …, m labels the viewing angle. Thus, as defined in FIG. 4A, the first n components of v (i.e., v_11, v_12, …, v_1n) represent the difference values (of the n sub-images) in the first viewing angle. Similarly, components n+1 through 2n of v (i.e., v_21, v_22, …, v_2n) represent the difference values in the second viewing angle, and so on. The vector v is thus composed of the m n-component vectors v_j shown in FIG. 4B, each v_j corresponding to a different view (labeled by the index j).
Referring also to FIGS. 5A and 5B, FIG. 5A shows, for the case where the number of pixels p_i (i = 1, 2, …, 9) under consideration is nine, a possible way of enumerating the pixels (more generally, the sub-images) and thus the order of the entries in v (and in C and k). Besides the central pixel p_5 (which is the pixel to be analyzed), the eight pixels closest to it are shown. The set of pixel values (of the central pixel) includes not only the values associated with the central pixel, but also the values associated with the eight surrounding pixels.
FIG. 5B shows, for the case where the number of pixels p_j (j = 1, 2, …, 5) under consideration is five, a possible way of enumerating the pixels (more generally, the sub-images) and thus the order of the entries in v (and in C and k). Besides the central pixel p_1, the four pixels closest to it are shown. The set of pixel values (of the central pixel) includes not only the value associated with the central pixel, but also the values associated with the four nearest-neighbor pixels.
FIG. 4C shows the covariance matrix C. For the above choice of arrangement of the components within the first vector v (i.e., as defined in FIGS. 4A and 4B), C takes the form of a block matrix composed of m×m smaller matrices C_ab (a = 1, 2, …, m; b = 1, 2, …, m), such that each C_ab is an n×n matrix. Each of the m diagonal blocks C_aa (a = 1, 2, …, m) corresponds to the a-th view and "links" between different sub-images corresponding to the same (i.e., a-th) view. Each of the "off-diagonal" matrices C_ab with b ≠ a "links" between sub-images in different views (i.e., in the a-th and b-th views). C_ab is shown in FIG. 4D.
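The block structure described above can be estimated directly from samples of stacked difference vectors. The sketch below (Python/NumPy; the function name and data layout are hypothetical) extracts the m×m grid of n×n blocks C_ab, including the cross-view (off-diagonal) blocks, from an empirical covariance:

```python
import numpy as np

def estimate_covariance_blocks(diff_samples):
    """Estimate the covariance matrix C and its blocks C_ab.

    diff_samples: array of shape (num_samples, m, n), where m is the number
    of views and n the number of sub-images; each sample is one stacked
    difference vector drawn from a reference (nominally defect-free) region.
    Returns the full (m*n) x (m*n) covariance matrix and a nested list of
    its n x n blocks, blocks[a][b] = C_ab.
    """
    num_samples, m, n = diff_samples.shape
    C = np.cov(diff_samples.reshape(num_samples, m * n), rowvar=False)
    blocks = [[C[a * n:(a + 1) * n, b * n:(b + 1) * n] for b in range(m)]
              for a in range(m)]
    return C, blocks
```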
For the same case (i.e., where the number of sub-images is n and the number of viewing angles is m), the third vector k is shown in FIG. 4E. Similar to the first vector v, the third vector k is composed of the m n-component vectors k_j shown in FIG. 4F, each k_j corresponding to a different view (labeled by the index j).
The second vector u (obtained in sub-operation 320c1) is the matrix product of the inverse of C and the (column) vector v. In sub-operation 320c3, it is checked whether k·u > B. Note that the value of the threshold B may depend on the predetermined kernel (i.e., on the nature of the defect(s) suspected to be included, or partially included, in the sub-region). The value of the threshold B may also vary from one sub-region to an adjacent sub-region, depending on the geometry of the corresponding pattern on the sub-region. This may be the case even when the sub-region and the adjacent sub-region each correspond in size to a pixel and each comprise a respective portion of the same defect. A defect may typically have an area of at least about 10 nm by 10 nm and may affect the signal obtained from an area of about 100 nm by 100 nm measured around the defect (i.e., corresponding to at least about 3 by 3 pixels in the case where a given pixel corresponds to an area of about 10 nm by 10 nm on the wafer). The threshold B may be chosen such that the percentage of false positives, i.e., cases in which a non-defective sub-region of the wafer is erroneously determined to be defective, does not exceed a predefined (threshold) rate.
According to some embodiments, to speed up the computation, some of the off-diagonal terms or off-diagonal blocks of the covariance matrix (e.g., some of the matrices C_ab with b ≠ a) may not be calculated. (If no off-diagonal block is calculated, the computation involved is equivalent to calculating m smaller covariance matrices (e.g., as shown in FIG. 4G for the case of m = 3), each of which corresponds to one of the views, where m is the number of views.)
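Under the simplification in which no off-diagonal block is calculated, C becomes block-diagonal and each view can be handled independently. A sketch of that shortcut (Python/NumPy; illustrative only) follows:

```python
import numpy as np

def solve_block_diagonal(per_view_covs, per_view_diffs):
    """Approximate u = C^-1 v when all cross-view (off-diagonal) blocks are
    omitted: each of the m smaller n x n covariance matrices C_aa is solved
    independently, which is much cheaper than handling the full
    (m*n) x (m*n) matrix.

    per_view_covs:  list of m arrays of shape (n, n)  -- the matrices C_aa
    per_view_diffs: list of m arrays of shape (n,)    -- the per-view parts of v
    """
    return np.concatenate([np.linalg.solve(C_aa, v_a)
                           for C_aa, v_a in zip(per_view_covs, per_view_diffs)])
```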
As described above, the third vector k (i.e., the predetermined kernel) characterizes the defect or defect family (i.e., a family of similar defects) in the absence (or substantial absence) of wafer noise, and may be obtained by applying a matched filter to the signature of the defect or defect family in the presence of wafer noise, in order to maximize the signal-to-noise ratio. According to some embodiments, the third vector k characterizes the signature of a specific type of defect that the sub-region is suspected to include (or partially include).
According to some embodiments, the predetermined kernel may be derived based on one or more of the following: (i) Experimental measurements performed on areas of the wafer known to include one or more defects; (ii) a computer simulation of light scattering from defects; (iii) a physical model describing the behavior of the defect; and (iv) a machine learning algorithm designed to provide an optimized kernel.
According to some embodiments, it may be known (e.g., based on scan data obtained in a preliminary scan) that some view pairs exhibit weaker correlation than other view pairs. According to some such embodiments, in sub-operation 320c, the entries in the blocks corresponding to view pairs known to exhibit weaker correlation are not computed, in order to expedite the analysis.
According to some embodiments, in addition to the covariance, higher moments of the joint probability distribution (of the measured values obtained by the imager in operation 110) may be taken into account as part of the determination, in sub-operation 220c, of whether the sub-region includes (or partially includes) a defect. For example, skewness and/or kurtosis may be considered, in accordance with some embodiments.
Although some of the above embodiments relate to implementing the method 100 using optical scanning, as already mentioned, the method 100 may also be implemented using a Scanning Electron Microscope (SEM) according to some embodiments. According to some such embodiments, the multiple viewing angles include two or more of at least one intensity of the illuminating electron beam (e-beam), at least one intensity of the returning electron beam(s), at least one spin of the illuminating electron beam(s), at least one spin of the returning electron beam(s), one or more incident angles of the illuminating electron beam(s), and one or more collection angles of the returning electron beam(s).
According to some alternative embodiments, the method 100 may be implemented using an Atomic Force Microscope (AFM). According to some such embodiments, multiple perspectives may include different types of AFM tips, different tapping (tapping) modes, and/or applying AFM at different resonant frequencies.
According to some embodiments in which the image resolution (i.e., pixel size) provided by the imager is higher than required (e.g., when implementing method 100 using SEM or AFM), or in order to speed up wafer analysis, the difference values corresponding to the pixels within a sub-image of a difference image may be averaged to obtain a single ("coarse-grained") difference value corresponding to the sub-image. In such embodiments, the set of difference values corresponding to a sub-region may include, in each of the multiple views, an average difference value associated with the sub-image of the sub-region and average difference values associated with the sub-images of the adjacent sub-regions. (Each average difference value is obtained by averaging the difference values associated with the pixels constituting the corresponding sub-image.) The covariance matrix may then be calculated based on the average difference values, thereby potentially allowing for a significant reduction in computational load.
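A possible sketch of such coarse-graining (Python/NumPy; the block size and function name are assumptions) simply averages each sub-image's pixels in every view's difference image before the covariance is formed:

```python
import numpy as np

def coarse_grain(diff_image, block):
    """Average the difference values over non-overlapping blocks of
    block x block pixels, yielding one ("coarse-grained") difference value
    per sub-image. Assumes the image dimensions are multiples of block."""
    h, w = diff_image.shape
    return diff_image.reshape(h // block, block, w // block, block).mean(axis=(1, 3))

# Applying coarse_grain to each view's difference image reduces the number of
# difference values per sub-region, and hence the size of the covariance
# matrix that must be computed and inverted.
```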
Systems
According to an aspect of some embodiments, a computerized system for obtaining and analyzing multi-view scan data of a wafer is provided. Fig. 6 is a block diagram of such a computerized system (computerized system 600) according to some embodiments. The system 600 includes a scanning device 602 and a scan data analysis module 604.
The scanning device 602 is configured to scan the wafer in each of multiple views (e.g., the multiple views listed above in the methods section). According to some embodiments, scan data relating to two or more of the multiple perspectives may be obtained simultaneously or substantially simultaneously. Additionally or alternatively, according to some embodiments, the scanning device 602 may be configured to scan the wafer one view at a time (which comes from multiple views). That is, the scanning device 602 may be configured to switch between viewing angles.
The scan data analysis module 604 is configured to (i) receive multi-view scan data obtained by the scanning device 602, and (ii) perform an integrated analysis of the multi-view scan data, as described in further detail below.
According to some embodiments, scanning device 602 includes a stage 612, a controller 614, an imager 616 (imaging apparatus), and an optical device 618. The scanning device 602 is delineated by a double-dot dashed box to indicate that the components therein (e.g., the stage 612 and the imager 616) may be separated from each other, for example, in the sense that they are not included in a common housing.
Stage 612 is configured to have placed thereon a sample to be inspected, such as a wafer 620 (or a photomask). Wafer 620 may be patterned, but the skilled artisan will appreciate that method 100 may also be used to detect defects in bare wafers. According to some embodiments, stage 612 may be movable, as set forth below. The imager 616 may include one or more light emitters (e.g., visible and/or ultraviolet light sources) configured to illuminate the wafer 620. Further, the imager 616 may include one or more photodetectors. In particular, the imager 616 may apply collection techniques including bright field collection, gray field collection, and the like. The optical device 618 may include optical filters (e.g., spatial filters, polarizing filters, Fourier filters), beam splitters (e.g., polarizing beam splitters), mirrors, lenses, prisms, gratings, deflectors, reflectors, apertures, and the like, configured to allow scan data associated with the multiple viewing angles to be obtained. According to some embodiments, the optical device 618 may be configured to allow the scanning device 602 to switch between different viewing angles. For example, the optical device 618 may include a polarization filter and/or a beam splitter configured to set the polarization of the emitted (illumination) light and/or to select the polarization of the collected (returned) light.
More specifically, according to some embodiments, optical device 618 may include any arrangement of optical components configured to determine (set) one or more optical properties (such as shape, spread, polarization) of a radiation beam from a radiation source of imager 616, as well as the trajectory of the incident radiation beam. According to some embodiments, the optical device 618 may further comprise any arrangement of optical components configured to select (e.g., by filtering) one or more optical properties of the one or more returned radiation beams (e.g., beams that are specularly reflected by wafer 620 or diffusely scattered from wafer 620) prior to their detection, and to select the trajectory followed by the one or more returned beams as they return from wafer 620. According to some embodiments, the optical device 618 may further include optical components configured to direct the one or more returned radiation beams toward the detectors of the imager 616.
The controller 614 may be functionally associated with the stage 612, the imager 616, and the optical device 618, as well as with the scan data analysis module 604. More specifically, the controller 614 is configured to control and synchronize the operation and functions of the modules and components listed above during scanning of the wafer. For example, the stage 612 is configured to support a sample under test (such as the wafer 620) and to mechanically translate the sample under test along a trajectory set by the controller 614, which also controls the imager 616.
The scan data analysis module 604 includes computer hardware (one or more processors, such as image and/or graphics processor units, and volatile and non-volatile memory components; not shown). The computer hardware is configured to analyze the multi-view scan data received from the imager 616 for areas on the wafer 620 to determine the presence of defects, substantially as described above in the method section.
The scan data analysis module 604 may further include an analog-to-digital (signal) converter (ADC) and a frame grabber (not shown). The ADC may be configured to receive analog image signals from the imager 616. Each analog image signal may correspond to a different view from multiple views. The ADC may be further configured to convert the analog image signal to a digital image signal and transmit the digital image signal to the frame grabber. The frame grabber may be configured to obtain a digital image (block image or image frame) of a scanned region on a scanned wafer (e.g., wafer 620) from the digital image signal. Each digital image may be at one of multiple viewing angles. The frame grabber may be further configured to transmit the digital image to one or more of the processor and/or the memory component.
More specifically, the scan data analysis module 604 may be configured to:
generating a set of difference values in each of the multiple perspectives based on scan data of the scan region received from the imager 616 and corresponding reference data that may be stored in the memory component(s). Each set of difference values corresponds to a sub-region (e.g., "pixel") of the scan region, substantially as described above in the description of fig. 2 in the method section.
Determining, for each sub-region, whether the sub-region is defective based at least on the corresponding set of difference values and the corresponding set of noise values, substantially as described above in the description of fig. 2 in the method section and in the description of fig. 3 according to some embodiments of the system 600.
According to some embodiments, the scan data analysis module 604 may be configured to: for each set of difference values and based at least on the set of difference values, a corresponding set of noise values is generated. According to some embodiments, the generation of the set of noise values may be based at least on scan data obtained in a preliminary scan(s) of the wafer, wherein a representative region of the wafer is scanned.
According to some embodiments, the determination of whether a sub-region is defective may be performed in consideration of the type of defect(s) that the sub-region is suspected to include or partially include. In particular, the determining may involve calculating a covariance matrix, and may further include calculating involving a predetermined kernel characterizing the suspected defect type(s) in the substantial absence of wafer noise and a corresponding threshold.
According to some alternative embodiments not depicted in the figures, a computerized system for obtaining and analyzing multi-view scan data of a wafer is provided. The system may be similar to system 600, but differs at least in that the wafer is irradiated with electron beam(s) instead of electromagnetic radiation. In such embodiments, the imager of the system may comprise a scanning electron microscope.
According to some alternative embodiments not depicted in the figures, a computerized system for obtaining and analyzing multi-view scan data of a wafer is provided. The system may be similar to system 600, but differs at least in that an atomic force microscope is utilized instead of an optical-based imager.
Fig. 7A schematically depicts a computerized system 700, the computerized system 700 being a particular embodiment of the system 600. The system 700 includes a radiation source 722, a plurality of detectors 724, the radiation source 722 and the plurality of detectors 724 together forming (or forming part of) an imager that is a particular embodiment of the imager 616 of the system 600. The system 700 further includes a scan data analysis module 704, the scan data analysis module 704 being a particular embodiment of the scan data analysis module 604 of the system 600. The system 700 further includes a beam splitter 732 and an objective 734, the beam splitter 732 and the objective 734 together forming (or forming part of) an optical device that is a particular embodiment of the optical device 618 of the system 600. Also shown is a platform 712 (which is a particular embodiment of platform 612 of system 600) and a wafer 720 disposed on platform 712.
The optical axis O of the objective 734 is also indicated. The optical axis O extends parallel to the z-axis.
In operation, light is emitted by the radiation source 722 and directed toward the beam splitter 732; some of the light is transmitted through the beam splitter 732. The transmitted light is focused by the objective 734 onto the wafer 720 to form an illumination spot S on the wafer 720. The returned light, which undergoes specular reflection from the wafer 720, is directed back toward the objective 734 and refracted through the objective 734 toward the beam splitter 732. A portion of the returned light (which has been refracted through the objective 734) is reflected by the beam splitter 732 toward the detectors 724.
For ease of illustration, the trajectories of a pair of rays are indicated. More specifically, a first light ray L1 and a second light ray L2 indicate light emitted by the radiation source 722. A third ray L3 and a fourth ray L4 indicate the (returned) light rays traveling toward the detectors 724 after having been reflected from the beam splitter 732 (following scattering from the wafer 720 and refraction through the objective lens 734). The third ray L3 constitutes the part of the first light ray L1 that remains after transmission through the beam splitter 732 and subsequent reflection by the beam splitter 732. The fourth ray L4 constitutes the part of the second light ray L2 that remains after transmission through the beam splitter 732 and subsequent reflection by the beam splitter 732.
A segmented pupil 740 (segmented aperture, which also forms part of the optical device) is also indicated. The segmented pupil 740 may be positioned in a pupil plane and the detector 724 may be positioned in a plane conjugate to the pupil plane. The segmented pupil 740 is divided into a plurality of pupil segments (or sub-apertures). The segmentation of the pupil allows separating the returned beam (e.g. reflected from the wafer) reaching the pupil into sub-beams, which is done according to the respective return angle of each of the sub-beams, such that each pupil segment will correspond to a different viewing angle. That is, each of the viewing angles produced by the segmented pupil 740 corresponds to a different collection angle.
As a non-limiting example, in FIG. 7A, the segmented pupil 740 is shown as divided into nine pupil segments 740a to 740i, arranged in a square array, and the detectors 724 comprise nine corresponding detectors 724a to 724i. The system 700 is configured such that light reaching each of the pupil segments (which originates from the radiation source 722 and has undergone specular reflection from the wafer 720) continues from that pupil segment toward the respective detector of the detectors 724. That is, light passing through the first pupil segment 740a is sensed by the first detector 724a, light passing through the second pupil segment 740b is sensed by the second detector 724b, and so on. Thus, each of the detectors 724 is configured to sense light returned at a different angle, respectively.
According to some embodiments, the optical device may further comprise an optical guiding mechanism (not shown) for guiding the light passing through each of the pupil segments. The optical guiding mechanism may be configured to ensure that light passing through a pupil segment is guided to the corresponding (target) detector of the detectors 724 without "leaking" to the other detectors.
According to some embodiments, and as depicted in fig. 7A, the optical device may be configured such that light reaching the objective 734 (directly) from the radiation source 722 arrives there as a collimated beam. The wafer 720 may be positioned at or substantially at the focal plane of the objective 734 such that light rays incident on the wafer 720 form an illumination point S on the wafer 720, which may be as small as about 100 nanometers.
Different light rays from the collimated beam that have been refracted through the objective 734 may be incident on the wafer 720 at different angles. The first light ray L1 is incident on the wafer 720 at a first incidence angle θ1 (i.e., the angle formed by its refracted portion and the optical axis O), and the second light ray L2 is incident on the wafer 720 at a second incidence angle θ2. For ease of description, θ2 is assumed to be equal to θ1, such that the refracted portion of the second light ray L2 traveling from the objective 734 to the wafer 720 follows, in the opposite direction, the trajectory followed by the refracted portion of the first ray L1 after its reflection from the wafer 720. Likewise, the refracted portion of the first light ray L1 traveling from the objective 734 to the wafer 720 follows, in the opposite direction, the trajectory followed by the refracted portion of the second ray L2 after its reflection from the wafer 720.
Thus, when the context removes any ambiguity, θ2 may be used to refer to the reflection (return) angle of the refracted portion of the first light ray L1 off the wafer 720, rather than to the incidence angle of the refracted portion of the second ray L2 on the wafer 720. Similarly, when the context removes any ambiguity, θ1 may be used to refer to the reflection (return) angle of the refracted portion of the second light ray L2 off the wafer 720, rather than to the incidence angle of the refracted portion of the first ray L1 on the wafer 720.
Note that not only the angle of incidence may be relevant to multi-view wafer analysis; the azimuth angle may also be relevant, particularly when the wafer 720 is patterned (due to one or more asymmetries introduced by the pattern relative to the wafer surface). The azimuth angle is the angle formed by the "projection" of the incident light ray onto the wafer surface and the x-axis of the orthogonal coordinate system parameterizing (the lateral dimensions of) the wafer surface. In FIG. 7B, the incidence angle (or polar angle) θi and a first azimuth angle φi of a light ray Li incident on the wafer 720 are indicated. Also indicated are the reflection angle (or polar angle) θr = θi and a second azimuth angle φr of the light ray Lr reflected from the wafer 720. Each of the detectors 724 is positioned to detect light rays that hit the wafer 720 at a polar angle θ (or, more precisely, within a continuous range of polar angles centered about θ) and at an azimuth angle φ (or, more precisely, within a continuous range of azimuth angles centered about φ).
According to some embodiments, the system 700 may further comprise an infrastructure (e.g., a suitably positioned detector) for sensing light that has been diffusely scattered from the wafer 720 (in particular, light rays outside the cone of light generated by the objective 734). According to some embodiments, the system 700 may be configured to use an image generated based on sensed gray field scattered light as additional view angle(s) and/or reference image for view-to-view registration.
The scan data analysis module 704 is configured to receive scan data from the detectors 724 and to determine whether the scan region includes one or more defects based on the scan data, substantially as described with respect to the scan data analysis module 604 of the system 600. Scan data from each of the detectors 724a through 724i may be used to generate difference images I_1 through I_9, respectively, each difference image being at a different viewing angle.
Because in FIG. 7A the segmented pupil 740 is depicted as comprising nine pupil segments, with a detector from the detectors 724 corresponding to each of the pupil segments, the number of viewing angles is nine. Thus, the set of difference values associated with a first "pixel" on the wafer (i.e., a sub-region of a size corresponding to an image pixel) comprises 9×(N+1) elements (difference values), where N is the number of adjacent pixels considered, that is, the number of neighboring pixels whose difference values are included in the set of difference values associated with the first pixel. For example, when the number of adjacent pixels is eight (substantially as depicted in FIG. 5A), the set of difference values includes 81 elements. (The predetermined kernel also includes 81 elements.) The covariance matrix is then an 81×81 matrix.
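By way of illustration only (Python/NumPy; the names are hypothetical), the 81-element first vector for a given pixel could be assembled from the nine per-view difference images as follows:

```python
import numpy as np

def stack_first_vector(diff_images, row, col):
    """Build the first vector v for the pixel at (row, col): from each of the
    nine per-view difference images, take the 3 x 3 neighborhood (the pixel
    plus its eight nearest neighbors) and concatenate the patches view by
    view. With nine views this yields 9 * 9 = 81 difference values; the
    corresponding covariance matrix is then 81 x 81."""
    patches = [d[row - 1:row + 2, col - 1:col + 2].ravel() for d in diff_images]
    return np.concatenate(patches)
```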
Although in fig. 7A, the pupil segments are depicted as having equal shapes and sizes, it is to be understood that, in general, the shapes and/or sizes of the different pupil segments of the segmented pupil 740 may differ from one another. In particular, according to some embodiments, different pupil segments may differ in area (i.e., the lateral dimension of the pupil segment parallel to the zx plane) and corresponding longitudinal extension of the pupil segments (e.g., the y-coordinates of the entrance and/or exit of a pupil segment may vary from one pupil segment to another).
Fig. 8 schematically depicts a computerized system 800, the computerized system 800 being a particular embodiment of the system 600. System 800 is similar to system 700, but differs in that it includes optical components that separate the light returning from the wafer into different polarizations, thereby allowing the number of viewing angles to be doubled. More specifically, system 800 includes a radiation source 822, a first plurality of detectors 824, and a second plurality of detectors 826, where radiation source 822, first plurality of detectors 824, and second plurality of detectors 826 together form (or form part of) an imager, which is a particular embodiment of imager 616 of system 600. The system 800 further includes a scan data analysis module 804, the scan data analysis module 804 being a particular embodiment of the scan data analysis module 604 of the system 600. The system 800 further includes a first beam splitter 832, an objective 834, a second beam splitter 836, a first segmented pupil 840, and a second segmented pupil 850, which together form (or form part of) an optical device that is a particular embodiment of the optical device 618 of the system 600. The second beam splitter 836 is a polarizing beam splitter. Also shown is a platform 812 (which is a particular embodiment of the platform 612 of the system 600) and a wafer 820 disposed on the platform 812.
According to some embodiments, radiation source 822 may be similar to radiation source 722, and each of plurality of detectors 824 and 826 may be similar to plurality of detectors 724. The first beam splitter 832 and the objective lens 834 may be similar to the beam splitter 732 and the objective lens 734, and each of the segmented pupils 840 and 850 may be similar to the segmented pupil 740.
In operation, a portion of the beam emitted by radiation source 822 is transmitted through first beam splitter 832, focused by objective lens 834 (to form an illumination spot S' on wafer 820), returned by wafer 820, focused again by objective lens 834, and reflected from first beam splitter 832, substantially as described above with respect to system 700. The portion of the return beam reflected from the first beam splitter 832 travels toward the second beam splitter 836 and is split by the second beam splitter 836 into two beams of different polarization (e.g., s-polarized light and p-polarized light): a first polarized light beam and a second polarized light beam. The first polarized light beam travels toward the first segmented pupil 840 and the first plurality of detectors 824, and the second polarized light beam travels toward the second segmented pupil 850 and the second plurality of detectors 826 (such that each combination of pupil segments and polarization is assigned a detector).
The arrows indicating the trajectory of the light emitted by the radiation source 822 are not numbered.
The scan data analysis module 804 is configured, substantially as described with respect to the scan data analysis module 604 of the system 600, to receive scan data from the detectors 824 and 826 and to determine whether the scan area includes one or more defects based on the scan data. The scan data from each of the first detectors 824a through 824i may be used to generate difference images J_1 through J_9, respectively, each difference image being at a different viewing angle. The scan data from each of the second detectors 826a through 826i may be used to generate difference images J_10 through J_18, respectively, each difference image being at a different viewing angle (and at a different polarization than difference images J_1 through J_9). Thus, from each pair of polar and azimuth angles characterizing light returning from the wafer 820, two difference images at two different perspectives can be obtained: a first difference image corresponding to the first polarization and a second difference image corresponding to the second polarization.
Because each of the segmented pupils 840 and 850 is depicted in FIG. 8 as including nine pupil segments, with detectors from the detectors 824 and 826 respectively corresponding to each of the pupil segments, the number of viewing angles is eighteen. Thus, the set of difference values associated with a first "pixel" on the wafer includes 18×(N'+1) elements (difference values), where N' is the number of adjacent pixels considered. For example, when the number of adjacent pixels is eight, the set of difference values includes 162 elements. (The predetermined kernel also includes 162 elements.) The covariance matrix is then a 162×162 matrix.
Fig. 9 schematically depicts a computerized system 900, the computerized system 900 being a particular embodiment of the system 600. The system 900 includes a radiation source 922, a first detector 924, a second detector 926, and a third detector 928, which together form (or form part of) an imager that is a particular embodiment of the imager 616 of the system 600. The system 900 further includes a scan data analysis module 904, the scan data analysis module 904 being a particular embodiment of the scan data analysis module 604 of the system 600. The system 900 further includes a first beam splitter 932, an objective lens 934, a second beam splitter 936, a third beam splitter 938, a first polarizer 942, and a second polarizer 944, which together form (or form part of) an optical device that is a particular embodiment of the optical device 618 of the system 600. The (non-segmented) pupil before each of the detectors 924, 926 and 928 is not shown. Also shown is a platen 912 (which is a particular embodiment of the platen 612 of the system 600) and a wafer 920 mounted on the platen 912.
The first polarizer 942 is positioned before the second detector 926 and the second polarizer 944 is positioned before the third detector 928. The first polarizer 942 is configured to filter out light of a first polarization and the second polarizer 944 is configured to filter out light of a second polarization, the second polarization being different from the first polarization.
In operation, a portion of the beam emitted by radiation source 922 is transmitted through first beam splitter 932, focused by objective 934 (to form an illumination spot S on wafer 920), returned by wafer 920, focused again by objective 934, and reflected from first beam splitter 932, substantially as described above with respect to system 700. The portion of the return beam reflected from the first beam splitter 932 travels toward the second beam splitter 936 and is split into a first return sub-beam and a second return sub-beam by the second beam splitter 936. The first return sub-beam constitutes part of the return beam transmitted through the second beam splitter 936. The second return sub-beam forms part of the return beam reflected by the second beam splitter 936.
The first return sub-beam travels toward the first detector 924 and is sensed by the first detector 924. The second return sub-beam travels toward the third beam splitter 938 and is split into a transmitted portion and a reflected portion by the third beam splitter 938. The transmitted portion travels toward the first polarizer 942 and the reflected portion travels toward the second polarizer 944. Polarizers 942 and 944 may be aligned at different angles such that each of the second detector 926 and the third detector 928 senses light of a different polarization. Thus, detectors 924, 926, and 928 can be configured to provide readings sufficient to fully characterize the polarization of the return beam (which is reflected from the wafer 920).
The arrows indicating the trajectory line of the light emitted by the radiation source 922 are not numbered.
The scan data analysis module 904 is configured, substantially as described with respect to the scan data analysis module 604 of the system 600, to receive scan data from the detectors 924, 926, and 928 and to determine whether the scan region includes one or more defects based on the scan data. Scan data from each of the detectors 924, 926, and 928 may be used to generate difference images K_1, K_2, and K_3, respectively, each difference image being at a different viewing angle.
Because, unlike the pupils of systems 700 and 800, the pupil (not shown) of system 900 is not segmented, the number of viewing angles is three (one viewing angle per detector). Thus, the set of difference values associated with a first "pixel" on the wafer includes 3×(N''+1) elements (difference values), where N'' is the number of adjacent pixels considered. For example, when the number of adjacent pixels is eight, the set of difference values includes 27 elements. (The predetermined kernel also includes 27 elements.) The covariance matrix is then a 27×27 matrix.
Note that according to some embodiments, a single polarizing beam splitter may be used in place of the combination of third beam splitter 938 with first polarizer 942 and second polarizer 944.
Simulation results
This section describes simulation results that demonstrate the efficacy of the above-described methods (e.g., method 100) and systems. FIG. 10A presents multi-view scan data obtained by a simulated computerized system, such as system 700. The multi-view scan data includes nine images (enumerated by Roman numerals I through IX) of a square area of the (simulated) wafer, each image at a different viewing angle. The area is intended to be uniform except for a deformation in the center of the area (i.e., at the center pixel), such as may be introduced by a dust particle. The dimensions of the area were set to 1 μm². Each of images I through IX is an intensity image corresponding to a different collection angle, as may be obtained by way of a segmented pupil such as segmented pupil 740. Also indicated is the intensity scale, ranging from black to white, with black corresponding to zero (I = 0) or minimum intensity and white corresponding to the maximum intensity reading (I = I_max) or above.
In each of images I through IX, the intensity typically varies from one pixel to the next, and the center pixel on average appears neither brighter nor darker than the surrounding pixels. In other words, in none of the images is the defect noticeable to the naked eye, even when the images are viewed side by side.
As explained above, a pixel may be determined to be defective when the quantity s_ij exceeds a corresponding threshold, where s_ij = k_ij·((C_ij)^(-1) v_ij) and the indices i and j label the pixel (i and j represent the row and column of the pixel, respectively). Here, v_ij is the first vector corresponding to the (i,j)-th pixel, C_ij is the covariance matrix corresponding to the (i,j)-th pixel, and k_ij is the third vector (kernel) corresponding to the (i,j)-th pixel.
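The following sketch (Python/NumPy; the container names and the per-pixel lookup are assumptions made for illustration) computes such an s_ij map over a simulated region, excluding the one-pixel border for which a full 3×3 neighborhood is unavailable:

```python
import numpy as np

def detection_map(diff_images, covariances, kernels):
    """Compute s_ij = k_ij . (C_ij^-1 v_ij) for every interior pixel (i, j).

    diff_images: list of per-view difference images, each of shape (H, W)
    covariances: mapping (i, j) -> covariance matrix C_ij
    kernels:     mapping (i, j) -> kernel k_ij
    """
    H, W = diff_images[0].shape
    s = np.zeros((H, W))
    for i in range(1, H - 1):
        for j in range(1, W - 1):
            # Stack the 3 x 3 neighborhood of (i, j) across all views
            v = np.concatenate([d[i - 1:i + 2, j - 1:j + 2].ravel()
                                for d in diff_images])
            u = np.linalg.solve(covariances[(i, j)], v)
            s[i, j] = np.dot(kernels[(i, j)], u)
    return s
```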
FIG. 10B is a graphical representation of the s_ij corresponding to the simulated area when the cross-view covariance is not taken into account. This is in effect equivalent to setting the off-diagonal (cross-view) blocks of each C_ij to zero. The s_ij are arranged in a square array according to the values assumed by i and j. (Note that since the simulated area is intended to be uniform according to its "bare design", the threshold B can be taken to be the same for all pixels, and no additional information is obtained by subtracting B from the s_ij.) Also indicated is the scale of s = k·(C^(-1)v), ranging from black to white, with black corresponding to s = s_min and white corresponding to s = s_max.
FIG. 10C is a graphical representation of the s_ij corresponding to the simulated area when the cross-view covariance is taken into account (i.e., when all components of each C_ij are calculated).
A dashed circle D is drawn in fig. 10B around the center pixel (which corresponds to the defective pixel), where the center pixel is indicated by arrow D. A dashed circle D 'is drawn around the center pixel in fig. 10C, where the center pixel is indicated by arrow D'. It can be readily seen that the center pixel appears much brighter in fig. 10C than in fig. 10B, i.e., the defect signal is much stronger in fig. 10C than in fig. 10B, demonstrating the improved defect detection capability of the disclosed method. The signal-to-noise ratio is increased from 0.7 to 2.2 taking into account the cross-view covariance.
According to an aspect of some embodiments, a method for obtaining information about a region of a sample (e.g., a wafer) is provided. The method comprises the following steps:
-obtaining a plurality of images of the region by an imager. The plurality of images may differ from each other in at least one parameter selected from the group consisting of: illumination spectrum, collection spectrum, illumination polarization, collection polarization, illumination angle, collection angle, and type of sensing (e.g., intensity, phase, polarization). The step of obtaining the plurality of images includes illuminating the region and collecting radiation from the region. The region includes a plurality of region pixels (i.e., the region includes a plurality of sub-regions, each of the sub-regions having a size corresponding to a pixel).
-receiving or generating a plurality of reference pictures.
-generating, by an image processor (e.g. a scan data analysis module), a plurality of difference images representing differences between the plurality of images and a plurality of reference images.
Calculate a set of region pixel attributes (set of pixel values) for each region pixel of the plurality of region pixels (i.e. each pixel in the region). The computation is performed based on pixels of the plurality of difference images.
-calculating a set of noise attributes based on a plurality of sets of region pixel attributes of a plurality of region pixels (i.e. based on a set of pixel values corresponding to each of the plurality of region pixels). Note that the covariance matrix (and its inverse) is a set of numbers that characterize the statistical properties of noise. Those statistical properties may be generally referred to as "attributes". The use of covariance matrices as statistical properties is a specific non-limiting example.
-determining for each region pixel whether the region pixel represents a defect based on a relation between the noise property set and the region pixel property set of the pixel.
According to some embodiments of the method, the step of determining whether the region pixel represents a defect is further performed in response to a set of attributes of the actual defect.
According to some embodiments of the method, the step of determining whether the region pixel represents a defect is further performed in response to estimating a set of attributes of the defect.
According to some embodiments of the method, the method comprises the steps of: the set of noise properties is calculated by calculating a covariance matrix.
According to some embodiments of the method, the step of calculating the covariance matrix comprises: a set of covariance values is calculated for each region pixel, the set of covariance values representing covariance between different attributes (i.e., between different views) of a region pixel attribute set of the region pixel, and a given covariance matrix is calculated based on a plurality of sets of covariance values for a plurality of region pixels.
According to some embodiments of the method, after the covariance matrix is calculated, the inverse of the covariance matrix is used for further calculation. The inverse of the covariance matrix is multiplied with a set of attributes representing the defect of interest (rather than noise).
According to some embodiments of the method, the method further comprises the steps of: for each region pixel, determining whether the region pixel represents a defect by comparing, with a corresponding threshold, the product of the multiplication between: (i) the set of attributes of the region pixel (e.g., the first vector v), (ii) the inverse of the covariance matrix (e.g., the matrix C^(-1)) corresponding to the noise affecting the set of attributes of the region pixel, and (iii) a set of attributes of the defect of interest.
According to some embodiments of the method, the set of pixel attributes of the region pixel comprises data about the region pixel and neighboring region pixels of the region pixel.
According to some embodiments of the method (e.g., as shown in fig. 7A and 8), the imager includes a plurality of detectors for generating a plurality of images, and the method further includes the steps of: different detectors are allocated to detect radiation from different ones of the plurality of pupil segments (of the segmented pupil).
According to some embodiments of the method, the plurality of pupil segments comprises more than four different pupil segments.
According to some embodiments of the method (e.g., as shown in fig. 8), the imager comprises a plurality of detectors for generating a plurality of images, and the method further comprises the steps of: different detectors are assigned to detect radiation from different combinations of (a) polarization and (b) different ones of the plurality of pupil segments.
According to some embodiments of the method, the method comprises the steps of: multiple images are obtained at the same point in time.
According to some embodiments of the method, the method comprises the steps of: multiple images are obtained at different points in time.
According to some embodiments of the method, the method further comprises the steps of: defects are classified.
According to some embodiments of the method, the method further comprises the steps of: it is determined whether the defect is a defect of interest or is not a defect of interest.
According to an aspect of some embodiments, a computerized system for obtaining information about a region of a sample (e.g., a region on a wafer) is provided. The system includes an imager including optics and an image processor. The imager is configured to obtain a plurality of images of the region. The plurality of images may differ from each other in at least one parameter selected from the group consisting of: illumination spectrum, collection spectrum, illumination polarization, collection polarization, illumination angle, and collection angle. The step of obtaining a plurality of images includes illuminating the area and collecting radiation from the area. The region includes a plurality of region pixels. The computerized system is configured to receive or generate a plurality of reference images. The image processor is configured to:
-generating a plurality of difference images representing differences between the plurality of images and a plurality of reference images.
-calculating a set of region pixel attributes for each region pixel of the plurality of region pixels. The regional pixel attribute set is calculated based on pixels of the plurality of difference images.
-calculating a set of noise properties based on a plurality of sets of region pixel properties of a plurality of region pixels.
-for each region pixel, determining whether the region pixel represents a defect based on a relationship between the noise attribute set and the region pixel attribute set of the pixel.
According to an aspect of some embodiments, there is provided a non-transitory computer-readable medium storing instructions that cause a computerized system to:
obtaining a plurality of images of an area of an object (e.g. an area on a wafer) by an imager of a computerized system (as described above). The plurality of images differ from each other in at least one parameter selected from the group consisting of: illumination spectrum, collection spectrum, illumination polarization, collection polarization, illumination angle, collection angle, and type of sensing. The step of obtaining a plurality of images includes illuminating the area and collecting radiation from the area. The region includes a plurality of region pixels.
-receiving or generating a plurality of reference pictures.
-generating, by an image processor of the computerized system, a plurality of difference images representing differences between the plurality of images and a plurality of reference images.
-calculating a set of region pixel attributes for each region pixel of the plurality of region pixels, wherein the calculating step is performed based on pixels of the plurality of difference images.
-calculating a set of noise properties based on a plurality of sets of region pixel properties of a plurality of region pixels.
-for each region pixel, determining whether the region pixel represents a defect based on a relationship between a noise property set of the pixel and a region pixel property set.
While the present disclosure focuses on scanning and inspection of wafers, the skilled artisan will appreciate that the disclosed methods and systems may also be applicable to detecting irregularities in optical masks used in wafer fabrication ("mask inspection").
As used herein, the terms "collection channel" and "detection channel" may be used interchangeably according to some embodiments. According to some embodiments, the symbols "V data", "Cov" and "V defect" may be used to indicate the first vector V, the covariance matrix C and the third vector k, respectively.
As used herein, the term "group" may refer not only to multiple elements (e.g., components, features), but also to a single element, according to some embodiments. In the latter case, the group may be referred to as a "single member group".
It is appreciated that certain features of the disclosure, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the disclosure, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination or in any suitable manner in any other described embodiment of the disclosure. No feature described in the context of an embodiment is to be considered an essential feature of the embodiment unless explicitly so specified.
Although the operations of methods according to some embodiments may be described in a particular sequence, the methods of the present disclosure may also include some or all of the described operations implemented in a different order. The methods of the present disclosure may include some or all of the operations. No particular operation in the disclosed methods is to be construed as an essential operation of the methods unless explicitly specified as such.
While the present disclosure has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, the present disclosure is intended to embrace all such alternatives, modifications and variances which fall within the scope of the appended claims. It is to be understood that this disclosure is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth herein. Other embodiments may be practiced, and the embodiments may be implemented in various ways.
The phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting. Citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present disclosure. Section headings are used herein to simplify understanding of the specification and should not be construed as necessarily limiting.

Claims (20)

1. A method for detecting defects on a sample, the method comprising the steps of:
obtaining scan data of a first region of a sample in multiple view angles; and
performing an integrated analysis of the obtained scan data, the integrated analysis comprising:
Calculating cross-view covariance based on the obtained scan data, and/or estimating the cross-view covariance; and
the presence of defects in the first region is determined taking into account the cross-view covariance.
2. The method of claim 1, wherein the sample is a patterned wafer.
3. The method of claim 1, wherein the multiple perspectives comprise two or more of: one or more angles of incidence of one or more illuminating beams, one or more collection angles of one or more collected beams, at least one intensity of the one or more illuminating beams, and at least one intensity of the one or more collected beams, and compatible combinations thereof.
4. The method of claim 1, wherein the method is optically-based, and wherein the multiple viewing angles comprise two or more of: one or more illumination angles, intensity of the illumination radiation, illumination polarization, illumination wavefront, illumination spectrum, one or more focus shifts of the illumination beam, one or more collection angles, intensity of the collected radiation, collection polarization, phase of the one or more collected beams, bright field channel, gray field channel, fourier filtering of returned light, and a sensing type selected from intensity, phase, or polarization, and compatible combinations thereof.
5. The method of claim 1, wherein the integrated analysis comprises:
generating, for each of a plurality of sub-regions of the first region, a difference value in each of the multiple views based on the obtained scan data and corresponding reference data for the first region in each of the multiple views; and
determining whether each of the plurality of sub-regions is defective based at least on the difference values corresponding to the sub-region and sub-regions adjacent to the sub-region and a noise value corresponding to the sub-region and the adjacent sub-region, the noise value including a corresponding covariance from the cross-view covariance.
6. The method of claim 5, further comprising the step of: generating a difference image of the first region in each of the multiple perspectives based on the obtained scan data and the reference data, and wherein the difference value corresponding to each sub-region from the plurality of sub-regions is derived from and/or characterizes a sub-image of the difference image corresponding to the sub-region.
7. The method of claim 5, wherein the noise value is calculated based at least on the difference value.
8. The method of claim 5, wherein the step of determining whether each of the plurality of sub-regions is defective comprises:
generating a covariance matrix comprising the noise values corresponding to the sub-regions and the sub-regions adjacent to the sub-regions;
multiplying a first vector comprising the difference value corresponding to the sub-region and the adjacent sub-region by an inverse of the covariance matrix to obtain a second vector;
calculating a scalar product of the second vector and a third vector, components of the third vector comprising values characterizing defects; and
if the scalar product is greater than a predetermined threshold, the sub-region is marked as defective.
9. The method of claim 5, wherein at least one of the plurality of sub-regions has a size corresponding to a single pixel.
10. The method of claim 1, wherein the cross-view covariance is estimated based at least on scan data obtained at a preliminary scan of the sample in which regions of the sample are sampled, each sampled region representing a group of regions of the sample, wherein at least one of the sampled regions represents the first region.
11. The method of claim 1, further comprising the steps of: when the presence of a defect is determined, determining whether the defect is a defect of interest, and optionally, when the defect is determined to be of interest, classifying the defect.
12. The method of claim 1, wherein the method is repeated with respect to each of a plurality of additional regions, in order to scan a larger region of the sample formed by the first region and the additional regions.
13. A computerized system for obtaining and analyzing multi-view scan data of a sample, the system comprising:
a scanning device configured to scan a region of a sample in a multiple view angle; and
a scan data analysis module configured to perform an integrated analysis of scan data obtained in the scan, the integrated analysis comprising:
calculating cross-view covariance based on the obtained scan data, and/or estimating the cross-view covariance; and
determining the presence of defects in the region, taking into account the cross-view covariance.
14. The system of claim 13, wherein the scanning device comprises an optically based imager, and wherein the multiple views comprise two or more of: one or more illumination angles, intensity of the illumination radiation, illumination polarization, illumination wavefront, illumination spectrum, one or more focus shifts of the illumination beam, one or more collection angles, intensity of the collected radiation, collection polarization, phase of the one or more collected beams, a bright-field channel, a gray-field channel, Fourier filtering of returned light, and a sensing type selected from intensity, phase, or polarization, and compatible combinations thereof.
15. The system of claim 13, wherein the integrated analysis comprises:
generating, for each of a plurality of sub-regions of the region, a difference value in each of the multiple views based on the obtained scan data and corresponding reference data for the region in each of the multiple views; and
determining whether each of the plurality of sub-regions is defective based at least on the difference values corresponding to the sub-region and to sub-regions adjacent to the sub-region, and on noise values corresponding to the sub-region and the adjacent sub-regions, the noise values including corresponding covariances from the cross-view covariance.
16. The system of claim 15, wherein the scan data analysis module is further configured to generate a difference image of the region in each of the multiple views based on the obtained scan data and the reference data, and wherein the difference value corresponding to each sub-region of the plurality of sub-regions is derived from and/or characterizes a sub-image of the difference image corresponding to that sub-region.
17. The system of claim 15, wherein the step of determining whether each of the plurality of sub-regions is defective comprises:
generating a covariance matrix comprising the noise values corresponding to the sub-region and the sub-regions adjacent to the sub-region;
multiplying a first vector comprising the difference values corresponding to the sub-region and the adjacent sub-regions by an inverse of the covariance matrix to obtain a second vector;
calculating a scalar product of the second vector and a third vector, components of the third vector comprising values characterizing defects; and
marking the sub-region as defective if the scalar product is greater than a predetermined threshold.
18. A non-transitory computer-readable storage medium storing instructions that cause a sample analysis system to:
scan a region of the sample in multiple views; and
perform an integrated analysis of scan data obtained in the scan, the integrated analysis comprising:
calculating cross-view covariance based on the obtained scan data, and/or estimating the cross-view covariance; and
determining the presence of defects in the region, taking into account the cross-view covariance.
19. The storage medium of claim 18, wherein the multiple views comprise two or more of: one or more illumination angles, intensity of the illumination radiation, illumination polarization, illumination wavefront, illumination spectrum, one or more focus shifts of the illumination beam, one or more collection angles, intensity of the collected radiation, collection polarization, phase of the one or more collected beams, a bright-field channel, a gray-field channel, Fourier filtering of returned light, and a sensing type selected from intensity, phase, or polarization, and compatible combinations thereof.
20. The storage medium of claim 18, wherein the integrated analysis comprises:
generating, for each of a plurality of sub-regions of the region, a difference value in each of the multiple views based on the obtained scan data and corresponding reference data for the region in each of the multiple views; and
determining whether each of the plurality of sub-regions is defective based at least on the difference values corresponding to the sub-region and to sub-regions adjacent to the sub-region, and on noise values corresponding to the sub-region and the adjacent sub-regions, the noise values including corresponding covariances from the cross-view covariance.
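The detection recited in claims 5 through 8 (and mirrored in claims 15 through 17 and 20) amounts to a covariance-weighted test: difference values for a sub-region and its adjacent sub-regions are stacked into a vector, weighted by an inverse of a noise covariance matrix that includes the cross-view covariances, projected onto a vector of values characterizing a defect, and compared against a threshold, while claim 10 allows the cross-view covariance to be estimated from a preliminary, sampled scan. The following is a minimal sketch under those assumptions only; it uses NumPy, and the function names, the 3-perspective / 3 x 3-neighbourhood sizes, the defect signature, and the threshold are hypothetical choices made for illustration rather than anything specified by the claims.

```python
import numpy as np

def estimate_cross_view_covariance(diff_stack):
    """Estimate a P x P cross-view covariance matrix (illustrative only).

    diff_stack: array of shape (P, N) with difference values of P perspectives
    over N sampled sub-regions, e.g. from a preliminary, sampled scan (cf. claims 7, 10).
    """
    centered = diff_stack - diff_stack.mean(axis=1, keepdims=True)
    return centered @ centered.T / (diff_stack.shape[1] - 1)

def is_defective(diff_vector, covariance, defect_signature, threshold):
    """Covariance-weighted test for a single sub-region (cf. claim 8).

    diff_vector: difference values for the sub-region and its adjacent
    sub-regions over all perspectives (the "first vector").
    covariance: covariance matrix of the corresponding noise values,
    including the cross-view covariances.
    defect_signature: values characterizing a defect (the "third vector").
    """
    second_vector = np.linalg.solve(covariance, diff_vector)  # apply the inverse covariance
    score = float(defect_signature @ second_vector)           # scalar product
    return score > threshold

# Toy usage: 3 perspectives, a 3 x 3 neighbourhood flattened to 9 sub-regions.
rng = np.random.default_rng(0)
P, K = 3, 9
samples = rng.normal(size=(P, 10_000))               # preliminary-scan difference samples
cov_views = estimate_cross_view_covariance(samples)  # P x P cross-view covariance
cov = np.kron(cov_views, np.eye(K))                  # toy covariance: perspectives covary, neighbours independent
diff = rng.multivariate_normal(np.zeros(P * K), cov) # noise-only "first vector" for one sub-region
signature = np.zeros(P * K)
signature[K // 2::K] = 1.0                           # defect energy at the centre neighbour in every perspective
print(is_defective(diff, cov, signature, threshold=5.0))  # noise-only input: expected to print False
```

The sketch uses np.linalg.solve rather than forming an explicit matrix inverse purely for numerical stability; mathematically it is the same multiplication by an inverse of the covariance matrix that claims 8 and 17 recite.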
CN202180067312.2A 2020-09-02 2021-09-02 Multi-view wafer analysis Pending CN116368377A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US17/010,746 2020-09-02
US17/010,746 US11815470B2 (en) 2019-01-17 2020-09-02 Multi-perspective wafer analysis
PCT/US2021/048935 WO2022051551A1 (en) 2020-09-02 2021-09-02 Multi-perspective wafer analysis

Publications (1)

Publication Number Publication Date
CN116368377A 2023-06-30

Family

ID=80491432

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180067312.2A Pending CN116368377A (en) 2020-09-02 2021-09-02 Multi-view wafer analysis

Country Status (4)

Country Link
KR (1) KR20230056781A (en)
CN (1) CN116368377A (en)
TW (1) TW202221314A (en)
WO (1) WO2022051551A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8223327B2 (en) * 2009-01-26 2012-07-17 Kla-Tencor Corp. Systems and methods for detecting defects on a wafer
JP2011047724A (en) * 2009-08-26 2011-03-10 Hitachi High-Technologies Corp Apparatus and method for inspecting defect
JP5498189B2 (en) * 2010-02-08 2014-05-21 Hitachi High-Technologies Corp Defect inspection method and apparatus
JP5921990B2 (en) * 2012-08-23 2016-05-24 NuFlare Technology Inc Defect detection method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101171506A (en) * 2005-05-06 2008-04-30 KLA-Tencor Technologies Corp Wafer edge inspection
US20180106723A1 (en) * 2012-06-26 2018-04-19 Kla-Tencor Corporation Scanning in angle-resolved reflectometry and algorithmically eliminating diffraction from optical metrology
CN107003249A (en) * 2014-11-04 2017-08-01 KLA Corp Wafer defect discovery
CN110603626A (en) * 2017-05-23 2019-12-20 KLA Corp Wafer inspection using difference images
US20190304851A1 (en) * 2018-03-30 2019-10-03 Nanometrics Incorporated Sample inspection using topography
US20200232934A1 (en) * 2019-01-17 2020-07-23 Applied Materials Israel, Ltd. Multi-perspective wafer analysis

Also Published As

Publication number Publication date
WO2022051551A1 (en) 2022-03-10
KR20230056781A (en) 2023-04-27
TW202221314A (en) 2022-06-01

Similar Documents

Publication Publication Date Title
US20200232934A1 (en) Multi-perspective wafer analysis
US11815470B2 (en) Multi-perspective wafer analysis
TWI677679B (en) Methods and apparatus for speckle suppression in laser dark-field systems
JP5199539B2 (en) Multispectral technique for defocus detection
JP6617143B2 (en) Defect detection system and method using structure information
KR101338837B1 (en) Defect inspection method and device thereof
JP5570530B2 (en) Defect detection on wafer
JP4996856B2 (en) Defect inspection apparatus and method
US7664608B2 (en) Defect inspection method and apparatus
JP5182090B2 (en) Defect detection apparatus and defect detection method
US11790510B2 (en) Material testing of optical test pieces
WO2010073453A1 (en) Defect inspection method and device thereof
TWI497032B (en) Defect inspection apparatus
CN111164646A (en) Multi-step image alignment method for large offset die-to-die inspection
WO2012035852A1 (en) Defect inspection method and device thereof
US9194811B1 (en) Apparatus and methods for improving defect detection sensitivity
Liu et al. Microscopic scattering imaging measurement and digital evaluation system of defects for fine optical surface
EP2587313B1 (en) Optical measurement system and method for measuring critical dimension of nanostructure
TWI778258B (en) Methods, systems, and non-transitory computer readable medium of defect detection
KR20100110321A (en) Inspecting apparatus and inspecting method
US9702827B1 (en) Optical mode analysis with design-based care areas
TW201945721A (en) Combining simulation and optical microscopy to determine inspection mode
Yang et al. Surface defects evaluation system based on electromagnetic model simulation and inverse-recognition calibration method
CN116368377A (en) Multi-view wafer analysis
US20220237758A1 (en) Methods and systems for analysis of wafer scan data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination