WO2022051551A1 - Multi-perspective wafer analysis - Google Patents

Multi-perspective wafer analysis

Info

Publication number
WO2022051551A1
Authority
WO
WIPO (PCT)
Prior art keywords
sub
region
regions
perspectives
scan data
Prior art date
Application number
PCT/US2021/048935
Other languages
English (en)
Inventor
Haim Feldman
Eyal Neistein
Harel Ilan
Shahar ARAD
Ido Almog
Ori GOLANI
Original Assignee
Applied Materials Israel Ltd.
Priority date
Filing date
Publication date
Priority claimed from US17/010,746 (published as US11815470B2)
Application filed by Applied Materials Israel Ltd. filed Critical Applied Materials Israel Ltd.
Priority to KR1020237010768A (published as KR20230056781A)
Priority to CN202180067312.2A (published as CN116368377A)
Publication of WO2022051551A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8806Specially adapted optical and illumination features
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/95Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/9501Semiconductor wafers
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/95Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/956Inspecting patterns on the surface of objects
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8854Grading and classifying of flaws
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10141Special mode during image acquisition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20021Dividing image into blocks, subimages or windows
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20224Image subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30148Semiconductor; IC; Wafer

Definitions

  • the present disclosure relates generally to wafer analysis.
  • wafer analysis tools are accordingly required to detect increasingly smaller defects.
  • defect detection was mainly limited by laser power and detector noise.
  • state-of-the-art wafer analysis tools are mostly limited by wafer noise due to diffuse reflection from the surface of the wafer: Surface irregularities on the wafer, constituted by the roughness of the etched patterns, are often manifested as bright spots (speckles) in a scanned image. These bright spots may highly resemble the “thumbprint” (signature) of a defect. There is thus a need for improved techniques of distinguishing defects from wafer noise.
  • aspects of the disclosure relate to methods and systems for wafer analysis. More specifically, but not exclusively, aspects of the disclosure, according to some embodiments thereof, relate to methods and systems for multi-perspective wafer analysis wherein measurement data from a plurality of perspectives are subjected to an integrated analysis.
  • a method for detecting defects on a sample (e.g. a wafer or an optical mask).
  • the method includes:
  • the integrated analysis includes:
  • the sample is a patterned wafer.
  • the sample is a bare wafer.
  • the multiplicity of perspectives includes two or more of an incidence angle(s) of an irradiating beam(s), a collection angle(s) of a collected beam(s), at least one intensity of the irradiating beam(s), at least one intensity of the collected beam(s), and compatible combinations thereof.
  • the method is optical-based, scanning electron microscopy-based, and/or atomic force microscopy-based.
  • the method is optical-based and the multiplicity of perspectives includes two or more of an illumination angle(s), an intensity of the illuminating radiation, an illumination polarization, an illumination wavefront, an illumination spectrum, one or more focus offsets of the illuminating light beam, a collection angle(s), an intensity of the collected radiation, a collection polarization, a phase of the collected beam(s), brightfield channel, grayfield channel, Fourier filtering of returned light, and a sensing type selected from intensity, phase, or polarization, and compatible combinations thereof.
  • the integrated analysis includes:
  • each of the plurality of sub-regions is defective, based at least on the difference values corresponding to the sub-region and to sub-regions neighboring the sub-region, and noise values (i.e. a set of noise values) corresponding to the sub-region and to the neighboring sub-regions.
  • the noise values include corresponding covariances from the cross-perspective covariances.
  • the method further includes generating difference images of the first region in each of the multiplicity of perspectives, based on the obtained scan data and the reference data.
  • the difference values corresponding to each sub-region, from the plurality of sub-regions, are derived from, and/or characterize, sub-images of the difference images, which correspond to the sub-region (so that, given N difference images, to each sub-region correspond N sub-images, i.e. a set of N sub-images).
  • the noise values are computed based at least on the difference values.
  • the determining of whether each of the plurality of sub-regions is defective includes:
  • At least one of the plurality of sub-regions is of a size corresponding to a single (image) pixel.
  • the cross-perspective covariances are estimated based at least on scan data obtained in a preliminary scanning of the sample wherein regions (e.g. on the surface) of the sample are sampled. Each sampled region is representative of a group of regions of the sample, with at least one of the sampled regions being representative of the first region.
  • the method further includes, when a presence of a defect is determined, determining whether the defect is a defect of interest and, optionally, when the defect is determined to be of interest, classifying the defect.
  • the method is repeated with respect to each of a plurality of additional regions, such as to scan a greater region (e.g. on the surface) of the sample formed by the first region and the additional regions.
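One plausible reading of the integrated analysis described above — stack the per-perspective difference values of each sub-region into a vector, whiten it with the cross-perspective covariance, and threshold the resulting score — can be sketched as follows. The function name and the Mahalanobis-style score are illustrative assumptions, not terminology taken from the disclosure:

```python
import numpy as np

def detect_defects(diff_stack, cov, threshold):
    """Flag defective sub-regions from multi-perspective difference values.

    diff_stack: (N, H, W) array -- one difference-value plane per perspective.
    cov:        (N, N) cross-perspective covariance (the noise model).
    Returns an (H, W) boolean defect map based on a Mahalanobis-style score.
    """
    n, h, w = diff_stack.shape
    d = diff_stack.reshape(n, -1)                      # N-vector per sub-region
    cov_inv = np.linalg.inv(cov)
    # score[p] = d[:, p]^T @ cov_inv @ d[:, p], computed for all sub-regions at once
    scores = np.einsum('ip,ij,jp->p', d, cov_inv, d)
    return (scores > threshold).reshape(h, w)
```

A defect that rises above the wafer noise in several perspectives simultaneously accumulates a large whitened score, whereas speckle that is uncorrelated across perspectives tends not to.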
  • a computerized system for obtaining and analyzing multi-perspective scan data of a sample (e.g. a wafer or an optical mask).
  • the computerized system is configured to implement the above-described method.
  • a non-transitory computer-readable storage medium storing instructions that cause a computerized analysis system (e.g. a wafer analysis system) to implement the above-described method.
  • a computerized analysis system e.g. a wafer analysis system
  • computerized system for obtaining and analyzing multi-perspective scan data of a sample includes:
  • - Scanning equipment configured to scan a region (e.g. on a surface) of a sample in a multiplicity of perspectives.
  • a scan data analysis module (including one or more processors and memory components) configured to perform an integrated analysis of scan data obtained in the scan, wherein the integrated analysis includes:
  • the system is configured for analyzing scan data of a patterned wafer.
  • the system is configured for analyzing scan data of a bare wafer.
  • the multiplicity of perspectives includes two or more of an incidence angle(s) of an irradiating beam(s), a collection angle(s) of a collected beam(s), at least one intensity of the irradiating beam(s), and at least one intensity of the collected beam(s).
  • the scanning equipment includes an optical-based imager.
  • the scanning equipment includes a scanning electron microscope.
  • the scanning equipment includes an atomic force microscope.
  • the multiplicity of perspectives includes two or more of an illumination angle(s), an intensity of the illuminating radiation, an illumination polarization, an illumination wavefront, an illumination spectrum, one or more focus offsets of the illuminating light beam, a collection angle(s), an intensity of the collected radiation, a collection polarization, a phase of the collected beam(s), brightfield channel, grayfield channel, Fourier filtering of returned light, and a sensing type selected from intensity, phase, or polarization, and compatible combinations thereof.
  • the integrated analysis includes:
  • each of the plurality of sub-regions is defective, based at least on the difference values corresponding to the sub-region and to sub-regions neighboring the sub-region, and noise values corresponding to the sub-region and to the neighboring sub-regions.
  • the noise values include corresponding covariances from the cross-perspective covariances.
  • the scan data analysis module is further configured to generate difference images of the first region in each of the multiplicity of perspectives based on the obtained scan data and the reference data, wherein the difference values corresponding to each sub-region, from the plurality of sub-regions, are derived from, and/or characterize, sub-images of the difference images, which correspond to the sub-region.
  • the scan data analysis module is configured to compute the noise values based at least on the difference values.
  • the determining of whether each of the plurality of sub-regions is defective includes:
  • At least one of the plurality of subregions is of a size corresponding to a single (image) pixel.
  • the scan data analysis module is configured to estimate the cross-perspective covariances based at least on scan data obtained in a preliminary scanning of the sample wherein regions (e.g. on the surface) of the sample are sampled. Each sampled region is representative of a group of regions of the sample, with at least one of the sampled regions being representative of the first region.
  • the scan data analysis module is further configured, upon determining a presence of a defect, to determine whether the defect is a defect of interest and, optionally, when the defect is determined to be of interest, to classify the defect.
  • the system is further configured to repeat the scanning and the integrated analysis with respect to each of a plurality of additional regions, such as to scan a greater region (e.g. on the surface) of the sample formed by the first region and the additional regions.
  • a non-transitory computer-readable storage medium storing instructions that cause a computerized analysis system (e.g. a wafer analysis system) to:
  • - Scan a region (e.g. on the surface) of a sample (e.g. a wafer or an optical mask) in a multiplicity of perspectives.
  • a sample e.g. a wafer or an optical mask
  • the sample is a patterned wafer.
  • the sample is a bare wafer.
  • the multiplicity of perspectives includes two or more of an incidence angle(s) of an irradiating beam(s), a collection angle(s) of a collected beam(s), at least one intensity of the irradiating beam(s), and at least one intensity of the collected beam(s).
  • the computerized analysis system is optical-based.
  • the computerized analysis system is scanning electron microscopy-based or atomic force microscopy-based.
  • the multiplicity of perspectives includes two or more of an illumination angle(s), an intensity of the illuminating radiation, an illumination polarization, an illumination wavefront, an illumination spectrum, one or more focus offsets of the illuminating light beam, a collection angle(s), an intensity of the collected radiation, a collection polarization, a phase of the collected beam(s), brightfield channel, grayfield channel, Fourier filtering of returned light, and a sensing type selected from intensity, phase, or polarization, and compatible combinations thereof.
  • the integrated analysis includes:
  • each of the plurality of sub-regions is defective, based at least on the difference values corresponding to the sub-region and to sub-regions neighboring the sub-region, and noise values corresponding to the sub-region and to the neighboring sub-regions.
  • the noise values include corresponding covariances from the cross-perspective covariances.
  • the stored instructions cause a scan data analysis module of the computerized system to generate difference images of the first region in each of the multiplicity of perspectives based on the obtained scan data and the reference data, wherein the difference values corresponding to each sub-region, from the plurality of sub-regions, are derived from, and/or characterize, sub-images of the difference images, which correspond to the sub-region.
  • the stored instructions cause the scan data analysis module to compute the noise values based at least on the difference values.
  • the determining of whether each of the plurality of sub-regions is defective includes:
  • At least one of the plurality of sub-regions is of a size corresponding to a single (image) pixel.
  • the stored instructions cause the scan data analysis module to estimate the cross-perspective covariances based at least on scan data obtained in a preliminary scanning of the sample wherein regions (e.g. on the surface) of the sample are sampled. Each sampled region is representative of a group of regions of the sample, with at least one of the sampled regions being representative of the first region.
  • Certain embodiments of the present disclosure may include some, all, or none of the above advantages.
  • One or more other technical advantages may be readily apparent to those skilled in the art from the figures, descriptions, and claims included herein.
  • while specific advantages have been enumerated above, various embodiments may include all, some, or none of the enumerated advantages.
  • terms such as “processing”, “computing”, “calculating”, “determining”, “estimating”, “assessing”, “gauging” or the like may refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data, represented as physical (e.g. electronic) quantities within the computing system’s registers and/or memories, into other data similarly represented as physical quantities within the computing system’s memories, registers or other such information storage, transmission or display devices.
  • Embodiments of the present disclosure may include apparatuses for performing the operations herein.
  • the apparatuses may be specially constructed for the desired purposes or may include a general-purpose computer(s) selectively activated or reconfigured by a computer program stored in the computer.
  • a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), electrically programmable read-only memories (EPROMs), electrically erasable and programmable read only memories (EEPROMs), magnetic or optical cards, or any other type of media suitable for storing electronic instructions, and capable of being coupled to a computer system bus.
  • program modules include routines, programs, objects, components, data structures, and so forth, which perform particular tasks or implement particular abstract data types.
  • Disclosed embodiments may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
  • program modules may be located in both local and remote computer storage media including memory storage devices.
  • Figure 1 is a flowchart of a method for multi-perspective wafer analysis, according to some embodiments;
  • Figure 2 is a flowchart of an operation for integrated analysis of multi-perspective scan data, according to some specific embodiments of the method of Fig. 1;
  • Figure 3 is a flowchart of a sub-operation for identifying (detecting) defects in a scanned region of a wafer, according to some specific embodiments of the operation of Fig. 2;
  • Figures 4A-4G present algebraic expressions used in computations included in the sub-operation of Fig. 3, according to some embodiments;
  • Figures 5A and 5B present two different ways of enumerating sub-images, according to some embodiments.
  • Figure 6 presents a block diagram of a computerized system for obtaining and analyzing multi-perspective scan data of a wafer (also depicted), according to some embodiments;
  • Figure 7A schematically depicts a computerized system for obtaining and analyzing multi-perspective scan data of a wafer (also depicted), the depicted computerized system is a specific embodiment of the computerized system of Fig. 6;
  • Figure 7B schematically depicts specular reflection of a light ray off of the wafer of Fig. 7A, according to some embodiments;
  • Figure 8 schematically depicts a computerized system for obtaining and analyzing multi-perspective scan data of a wafer (also depicted), the depicted computerized system is a specific embodiment of the computerized system of Fig. 6;
  • Figure 9 schematically depicts a computerized system for obtaining and analyzing multi-perspective scan data of a wafer (also depicted), the depicted computerized system is a specific embodiment of the computerized system of Fig. 6;
  • Figures 10A-10C present simulation results demonstrating the efficacy of the method of Fig. 1.

DETAILED DESCRIPTION
  • the term “about” may be used to specify a value of a quantity or parameter (e.g. the length of an element) to within a continuous range of values in the neighborhood of (and including) a given (stated) value. According to some embodiments, “about” may specify the value of a parameter to be between 80 % and 120 % of the given value. For example, the statement “the length of the element is equal to about 1 m” is equivalent to the statement “the length of the element is between 0.8 m and 1.2 m”. According to some embodiments, “about” may specify the value of a parameter to be between 90 % and 110 % of the given value. According to some embodiments, “about” may specify the value of a parameter to be between 95 % and 105 % of the given value.
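The tolerance bands defined above can be expressed as a simple check; the function name and the 20 % default band are illustrative, not from the disclosure:

```python
def within_about(value, stated, tol=0.20):
    """True if `value` lies within the 'about' band around `stated`,
    i.e. between (1 - tol) and (1 + tol) times the stated value."""
    return (1 - tol) * stated <= value <= (1 + tol) * stated

# "about 1 m" under the 80 %-120 % reading:
within_about(0.85, 1.0)            # True  (0.8 m <= 0.85 m <= 1.2 m)
within_about(1.25, 1.0)            # False
within_about(0.96, 1.0, tol=0.05)  # True under the 95 %-105 % reading
```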
  • multi-perspective wafer analysis is used to refer to wafer analysis employing scan data from a multiplicity of perspectives. Different perspectives may differ from one another, for example, by polarization, collection pupil segment, phase information, focus offset, and so on. The extra information provided by the multiplicity of perspectives (as compared to by a single perspective) may be used, in particular, to cope more efficiently with wafer noise. Scan data from several perspectives may give rise to a predictable or self-learnable pattern, which is distinguishable from wafer noise, thus, leading to improved defect detection rates.
  • the terms “identifying” and “detecting” and derivatives thereof, employed in reference to defects on e.g. a wafer, may be used interchangeably.
  • the term “sample” may refer to a wafer or an optical mask. The wafer may be patterned or bare.
  • Fig. 1 presents a flowchart of such a method, a method 100, according to some embodiments.
  • method 100 includes an operation 110, wherein scan data in a multiplicity of perspectives of a region (area) of a wafer are obtained. More specifically, in operation 110 a plurality of images (e.g. image frames) - in a multiplicity of perspectives - of a scanned region (e.g. a segment of a slice corresponding to an image frame) of a wafer may be obtained.
  • the plurality of images may be obtained using scanning equipment configured to scan a wafer in a multiplicity of perspectives, as elaborated on below.
  • the scanning equipment may include an imager (imaging module or unit) configured to irradiate (e.g. illuminate) a region of a wafer and collect radiation therefrom.
  • the imager may be optical-based (being configured to illuminate a region of a wafer with electromagnetic radiation, such as visible and/or ultraviolet (UV) radiation).
  • the UV radiation may be or include deep UV radiation and/or extreme UV radiation.
  • the imager may be configured to irradiate a region of a wafer with one or more charged-particle beams (e.g. electron beams).
  • the imager may be configured to allow simultaneously irradiating the wafer with a plurality of radiation beams, thereby facilitating simultaneously scanning a plurality of regions of the wafer.
  • an irradiation channel determines one or more physical properties of the irradiation beam incident on the wafer, such as the trajectory of the beam, the shape of the beam, and/or the polarization of the beam (when the beam is a light beam).
  • the collection channel includes sensing type (intensity, polarization, phase), as well as “filters”, which are used herein in a broad sense to refer to mechanisms (e.g.
  • the multiplicity of perspectives may include two or more of: an illumination angle(s) (i.e. the incidence angle(s) of the illuminating radiation), an illumination intensity (as determined by the amplitude of the illuminating radiation), an illumination polarization (i.e. the polarization of the illuminating radiation), an illumination wavefront (the shape of the wavefront of the illuminating radiation when monochromatic), an illumination spectrum (i.e. the spectrum of the illuminating radiation), one or more focus offsets of the illuminating light beam (which may be slightly out-of-focus), a collection angle(s) (allowing to selectively sense light returned at a certain angle or range of angles), an intensity of the collected radiation (allowing to selectively sense light returned at a certain intensity or range of intensities), a collection polarization, a phase of the collected beam(s) (when the illumination beam(s) is monochromatic), brightfield channel, grayfield channel (which may be further subdivided into darkfield and “pure” grayfield), Fourier filtering of returned light, a sensing type (for example, amplitude, phase, and/or polarization), and compatible combinations of the above-listed items.
  • a perspective may be characterized by more than one of the items from the above list. That is, a combination of items from the above list.
  • a perspective may be characterized by an angle at which the incident light beam impinges on the wafer surface (i.e. an illumination angle) and the polarization of the incident light beam (i.e. an illumination polarization).
  • a perspective may be characterized by a collection angle and a collection phase (i.e. a phase of a collected light beam).
  • a perspective may combine characteristics from both the illumination channels and the collection channels.
  • a perspective may be characterized by an illumination polarization and a collection polarization.
  • a perspective may be characterized by an illumination angle and polarization and a collection intensity and phase.
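The combinations described above can be thought of as a parameter record: each perspective is any compatible combination of illumination-channel and collection-channel settings. A minimal sketch follows; all field names are illustrative assumptions, not terminology from the disclosure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Perspective:
    """One acquisition perspective: a compatible combination of
    illumination- and collection-channel settings (fields illustrative).
    Unset fields mean the channel is not constrained in that respect."""
    illumination_angle_deg: Optional[float] = None
    illumination_polarization: Optional[str] = None   # e.g. "s" or "p"
    collection_angle_deg: Optional[float] = None
    collection_polarization: Optional[str] = None
    sensing_type: str = "intensity"                   # intensity, phase, or polarization

# e.g. an illumination angle and polarization paired with a collection polarization:
p = Perspective(illumination_angle_deg=30.0,
                illumination_polarization="s",
                collection_polarization="p")
```

Making the record hashable (frozen) lets it serve as a dictionary key for per-perspective images in later steps.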
  • acquired (obtained) images may differ from one another by at least one parameter selected from the above-specified list of perspectives.
  • expressions such as “two or more of” and “at least two of” in reference to a list including a sub-list (which includes a plurality of items (e.g. elements or claim limitations)) and at least one item not in the sub-list may refer to only two elements of the sub-list, one element of the sub-list and one listed element which is not in the sub-list, two elements not in the sub-list, and so on.
  • the at least one illumination spectrum includes two illumination spectra.
  • the multiplicity of perspectives may consist of, or include, the two illumination spectra.
  • reflected and/or scattered light may undergo Fourier filtering prior to being detected.
  • the Fourier filtering may be used to increase the number of perspectives and the amount of information obtainable therefrom.
  • the multiplicity of perspectives may include slightly out-of-focus illumination.
  • the illumination spectrum may be narrow, for example, when the illuminating light source is a laser.
  • the illumination spectrum may be wide, for example, when the illuminating light originates from an incoherent light source such as a lamp.
  • the at least one illumination spectrum includes a plurality of illumination spectra. Each illumination spectrum in the plurality of illumination spectra may be narrow - and optionally coherent (e.g. when the illuminating light is coherent laser light) - or wide.
  • multi-perspective scan data may be obtained from the brightfield channel (i.e. brightfield reflected light) and/or the grayfield channel (i.e. grayfield scattered light).
  • the term “grayfield scattered light” is used in a broad sense to refer to non-brightfield reflected light.
  • the term “grayfield scattered light” may be used to refer also to darkfield scattered light.
  • images corresponding to different perspectives may be obtained simultaneously, or substantially simultaneously. According to some embodiments, images corresponding to different perspectives may be obtained successively. According to some embodiments, some images corresponding to different perspectives may be obtained simultaneously, or substantially simultaneously, while some images corresponding to other perspectives may be obtained at an earlier or later time.
  • the imager used to obtain the scan data in operation 110 may include a plurality of detectors.
  • a first detector may be configured to detect an intensity of a returned light beam while a second detector may be configured to detect a polarization of the returned light beam.
  • each detector may be allocated (assigned) to a different perspective. Alternatively, according to some embodiments, wherein all the perspectives are obtained successively (sequentially), a single detector may be employed. According to some embodiments, wherein some of the perspectives are obtained simultaneously and some are obtained successively, at least some of the detectors may be allocated to subsets of the multiplicity of perspectives, which respectively include at least two of the perspectives.
  • a segmented pupil may be employed, such as to separate a returned radiation beam, arriving at the pupil, according to the reflection or scattering angle of sub-beams of the returned radiation beam from the wafer.
  • Different detectors may be allocated to detect radiation from different pupil segments, respectively, one detector per pupil segment, so that each pupil segment constitutes a different collection channel corresponding to a different collection angle (and different perspective).
  • the detectors may be positioned in the conjugate plane to the pupil plane on which the segmented pupil may be positioned.
  • method 100 includes an operation 120 wherein the scan data obtained in operation 110 undergo an integrated analysis to identify (detect) defects in the scanned region.
  • integrated analysis employed with respect to analysis of multi-perspective scan data (i.e. scan data in at least two different perspectives) refers to an analysis that utilizes scan data from the multiplicity of perspectives, such as to obtain improved defect detection rates.
  • the integrated analysis may take into account cross-perspective covariances, that is, covariances between at least some of the different perspectives.
  • method 100 may further include an operation 125, wherein it is determined whether identified defects (i.e. identified in operation 120) are of interest (or nuisance).
  • defects, determined to be of interest may be further classified.
  • operation 125 may determine the type of deformation giving rise to the defect.
  • Some deformations may be specific to certain types of components (semi-conductor devices) fabricated on the wafer, such as chips or other components, for example, transistors.
  • the classification may be based on measured or derived characteristics of the identified defects in the multiplicity of perspectives.
  • method 100 further includes an operation 130, wherein operations 110 and 120 (and optionally operation 125) may be repeated with respect to additional regions of the wafer (e.g. with respect to other segments of the slice).
  • the additional regions may constitute one or more predefined greater regions of the wafer, which are to be scanned (e.g. one or more dies).
  • operations 110 and 120 (and optionally operation 125) may be repeated until the wafer is fully scanned.
  • method 100 includes both operations 125 and 130
  • the order of operations 125 and 130 may be reversed.
  • Fig. 2 presents a flow chart of an operation 220, which is a specific embodiment of operation 120.
  • operation 220 may include:
  • a sub-operation 220a wherein a set of difference images of the scanned region is generated based on the obtained images (i.e. the plurality of images obtained in operation 110) and corresponding reference data.
  • Each difference image (in the set of difference images) corresponds to one of the perspectives (from multiplicity of perspectives).
  • Each difference image may be generated using one or more of the obtained images corresponding to the perspective and reference data (of the scanned region) corresponding to the perspective.
  • a sub-operation 220b wherein, for each sub-image, a difference value(s) (also referred to as “attribute(s)”) is computed.
  • Sub-images corresponding to a same wafer sub-region of a scanned wafer region define a respective set of sub-images such that each sub-image in the set of sub-images may correspond to a different perspective (from the multiplicity of perspectives).
  • a respective difference value may be computed, thereby generating a set of difference values corresponding to the wafer sub-region (and the set of sub-images).
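As an illustrative sketch (not the patent's implementation), the generation of per-perspective difference images and the assembly of a set of difference values for one sub-region might look as follows; the function names and the simple subtraction are assumptions for this toy example:

```python
import numpy as np

def difference_image(scan, reference):
    # One simple way to form a difference image for a single perspective:
    # subtract the (registered) reference image from the scanned image.
    return scan.astype(float) - reference.astype(float)

def difference_value_set(diff_images, row, col):
    # Collect the difference values of one sub-region (here, a single pixel)
    # across all perspectives into one vector.
    return np.array([d[row, col] for d in diff_images])

# Toy data: 3 perspectives of a 4 x 4 scanned region.
rng = np.random.default_rng(0)
scans = [rng.normal(100.0, 1.0, (4, 4)) for _ in range(3)]
refs = [s - 0.5 for s in scans]          # references differing by a known offset
diffs = [difference_image(s, r) for s, r in zip(scans, refs)]
v = difference_value_set(diffs, 1, 2)    # difference values of pixel (1, 2)
print(v.shape)                           # → (3,): one value per perspective
```

Here every difference value comes out as the constant offset 0.5, since each toy reference differs from its scan by exactly that amount.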
  • each of a plurality of wafer sub-regions, corresponding to the plurality of sub-images of sub-operation 220b, may be determined as being defective (or not), based at least on the set of difference values, corresponding to the wafer sub-region, and a respective (corresponding) set of noise values.
  • a sub-region (e.g. corresponding in size to a pixel or a small group of pixels) is said to be “defective” when including a defect or a part of a defect.
  • the terms “difference value(s)” - in reference to a sub-image - and “pixel value(s)” - in reference to the same sub-image - may be used interchangeably when the sub -image is a pixel.
  • the reference data may include reference images that have been obtained, for example, in scanning of the wafer or a wafer fabricated to have the same design, or generated based on design data of the wafer such as CAD data.
  • the term “difference image” is to be understood in an expansive manner and may refer to any image obtained by combining at least two images, for example, a first image (e.g. an image of a scanned region of a wafer or an image obtained from a plurality of images of the scanned region) and a second image (e.g. a reference image derived from reference data pertaining to the scanned region).
  • the combination of the two images may involve any manipulation of the two images resulting in at least one “difference image”, which may reveal variation (differences) between the two images, or, more generally, may distinguish (differentiate) between the two images (when differences are present).
  • the term “combination”, with reference to two images, may be used in a broader sense than subtraction of one image from the other and covers other mathematical operations, which may be implemented additionally or alternatively to subtraction.
  • one or both of the two images may be individually manipulated (that is, pre-processed). For example, the first image may be registered with respect to the second image.
  • reference data should be expansively construed to cover any data indicative of the physical design of a (patterned) wafer and/or data derived from the physical design (e.g. through simulation).
  • reference data may include, or consist of, “design data” of the wafer, such as, for example, the various formats of CAD data.
  • “reference data” may include, or consist of, data obtained by fully or partially scanning the wafer, e.g. during recipe setup or even in runtime. For example, scanning of one die, or multiple dies having the same architecture, during runtime may serve as reference data for another die of the same architecture. Further, a first wafer fabricated to a certain design may be scanned during recipe setup and the obtained scan data may be processed to generate reference data or additional reference data for subsequently fabricated wafers of the same design (as the first wafer). Such “self-generated” reference data is imperative when design data is not available but may also be beneficial even when design data is available.
  • the term “difference image” may refer to any set of derived values obtained by jointly manipulating two sets of values: a first set of values (obtained during a scan) and a second set of values (reference values obtained from reference data), such that each derived value in the set corresponds to a sub-region (e.g. a pixel) of a scanned region on the wafer.
  • the joint manipulation may involve any mathematical operations on the two sets of values such that the (resulting) set of derived values may reveal differences, if present, between the two sets of values, or, more generally, may distinguish between the two sets of values.
  • each (difference) value in the set of difference values may result from joint manipulation of a plurality of values in the first set and a plurality of values in the second set.
  • the set of difference values pertaining to a sub-image may also include scan data pertaining to neighboring sub-images or data generated based on scan data pertaining to neighboring sub-images.
  • the set of pixel values (e.g. intensity values) corresponding to a pixel may also include pixel values of neighboring pixels.
  • two sub-images (of a given image, e.g. a difference image) may be said to be “neighbors” when the sub-images are “nearest neighbors”. That is, the sub-images are adjacent to one another in the sense of no other sub-image being present therebetween.
  • two pixels may be said to be “neighbors” not only when the pixels are nearest neighbors, but also when separated from one another by one pixel at most, two pixels at most, three pixels at most, five pixels at most, or even ten pixels at most.
  • the set of difference values pertaining (corresponding) to a first sub-image includes also scan data pertaining to neighboring sub-images, such that the first sub-image is positioned centrally relative to the neighboring sub-images.
  • the set of difference values pertaining to a first pixel includes also scan data pertaining to neighboring pixels, such that the first pixel and the neighboring pixels constitute a block of m x n pixels with 3 ≤ m ≤ 11 and 3 ≤ n ≤ 11. Larger values of n and m are also possible and may be required, for example, when the size of the defects or the correlation length of the noise are large.
  • the first pixel may be positioned at the center of the block. In particular, when the size of a suspected defect is greater than the first pixel (i.e. when the first pixel may include (in the sense of depicting) only a part of the suspected defect), n and m may be selected such that the block (formed by the first pixel and the neighboring pixels) depicts in full the suspected defect.
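The m x n block of neighboring pixels described above can be illustrated with a small helper (hypothetical names; boundary handling omitted for brevity, and m, n assumed odd so the analyzed pixel sits at the block center):

```python
import numpy as np

def neighborhood(image, row, col, m=3, n=3):
    # Extract the m x n block of pixels centered on (row, col).
    # m and n are assumed odd so (row, col) lands at the block center.
    r0, c0 = row - m // 2, col - n // 2
    return image[r0:r0 + m, c0:c0 + n]

img = np.arange(49).reshape(7, 7)       # toy 7 x 7 difference image
block = neighborhood(img, 3, 3)         # 3 x 3 block around the center pixel
print(block.shape)                      # → (3, 3)
print(block[1, 1] == img[3, 3])         # → True: analyzed pixel at block center
```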
  • sub-operation 220c may include computing the set of noise values.
  • the set of noise values may be computed based on the set of difference values corresponding to the sub-region.
  • method 100 may include a preliminary scanning operation wherein the wafer is partially scanned. More specifically, the wafer may be “sampled” in the sense that a sample of regions thereof is scanned. Each region in the sample (i.e. each region from the sampled regions) is representative of regions of the wafer which are characterized by a certain architecture, type(s) of components, and so on.
  • certain computational operations may be implemented only with respect to preliminary scan data. For example, from a group of dies fabricated to have the same design, one or more dies may be sampled (in the preliminary scanning operation). Scan data obtained from corresponding regions within the sampled dies may be used later (e.g. in suboperation 220c) with respect to corresponding regions of dies which have not been sampled.
  • sets of noise values corresponding to the sampled regions may be computed and stored in a memory (i.e. prior to operation 110). The sets of noise values may later be used in sub-operation 220c as part of the determination of whether a scanned region includes a defect.
  • operation 120 may additionally include a sub-operation wherein images (of the same region) pertaining to different perspectives, and which have been obtained at different times (in particular, times differing by more than a timescale typical of high-frequency physical effects impacting the wafer analysis system (used to inspect the wafer) and/or the wafer), are registered with respect to one another.
  • This “perspective-to-perspective” registration may be implemented, for example, prior to sub-operation 220a (i.e. in embodiments wherein operation 120 is carried out in accordance with Fig. 2).
  • the perspective-to-perspective registration may be implemented in addition to standard die-to-die registration and/or cell-to-cell registration. According to some embodiments, images in different perspectives of a same scanned region may be registered with respect to one another.
  • the registration may be implemented using scan data obtained from a common channel (which does not change when switching between perspectives).
  • the multi-perspective scan data is obtained from the brightfield channel, while the grayfield channel is used for registering the images with respect to one another.
  • the multi-perspective scan data is obtained from the grayfield channel, while the brightfield channel is used for registering the images with respect to one another.
  • the perspectives are always acquired at least two at a time, with one perspective being common to all the acquired perspectives.
  • Fig. 3 presents a flowchart of a sub-operation 320c, which is a specific embodiment of sub-operation 220c.
  • sub-operation 320c may include computation of a covariance matrix (which constitutes the set of noise values).
  • the computation of the covariance matrix may be based on a corresponding set of difference values computed in sub-operation 220b and/or on scan data obtained in the preliminary scanning of the wafer.
  • the terms in the off-diagonal blocks in the covariance matrix include cross-perspective covariances (both between sub-images corresponding to different (neighboring) sub-regions as well as sub-images corresponding to a same sub-region).
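A sketch of how such a covariance matrix, including the off-diagonal cross-perspective blocks, might be estimated, under the assumption that many difference-value vectors (e.g. gathered in a preliminary scan) are available as samples; all names and the toy correlation model are illustrative:

```python
import numpy as np

n, m = 9, 3                 # n sub-images (e.g. a 3 x 3 neighborhood), m perspectives
num_samples = 5000          # difference-value vectors, e.g. from a preliminary scan

# Toy correlated samples: each sample concatenates the m per-perspective
# n-component vectors into one (n * m)-component vector, like v in Fig. 4A.
rng = np.random.default_rng(1)
mixing = rng.normal(size=(n * m, n * m))
samples = rng.normal(size=(num_samples, n * m)) @ mixing.T

C = np.cov(samples, rowvar=False)   # (n * m) x (n * m) covariance matrix
C_12 = C[0:n, n:2 * n]              # an off-diagonal block: cross-perspective
                                    # covariances between perspectives 1 and 2
print(C.shape)                      # → (27, 27)
```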
  • the determining of whether a sub-region includes a defect (or a part of a defect), in sub-operation 220c, may include:
  • the set of difference values, corresponding to the sub-region also includes difference values pertaining to neighboring sub-regions.
  • the components of the third vector k may characterize the signature of a specific type of defect(s) - which the sub-region is suspected of at least partially including - as would appear in a difference image (ideally) obtained in essentially the absence of wafer noise.
  • A sub-operation 320c3 wherein it is checked whether the scalar product exceeds a pre-determined threshold, and, if so, the sub-region is labeled as including a defect (or a part of a defect).
  • Figs. 4A-4G present algebraic expressions used in the computations involved in the sub-operation of Fig. 3, according to some embodiments.
  • the first vector v is shown in Fig. 4A for the case where the number of sub-images is n and the number of perspectives is m. v therefore includes n · m components.
  • each of the vectors v, u, and k is defined as a column vector.
  • the first n components of v, i.e. v11, v12, ..., v1n, represent difference values (of the n sub-images) in the first perspective.
  • components n + 1 to 2n of v, i.e. v21, v22, ..., v2n, represent difference values (of the n sub-images) in the second perspective.
  • the vector v is thus “composed” of m n-component vectors vj shown in Fig. 4B. Each of the vj corresponds to a different perspective (labeled by the index j).
  • the central pixel (which is the pixel to be analyzed) and the eight closest pixels thereto are shown.
  • the set of pixel values (of the central pixel) includes not only values pertaining to the central pixel, but also to the eight surrounding pixels.
  • the four closest pixels thereto are shown.
  • the set of pixel values (of the central pixel) includes not only values pertaining to the central pixel, but also to the four closest neighboring pixels.
  • the covariance matrix C is shown in Fig. 4C.
  • Each of the “off-diagonal” matrices Ca,b (i.e. when b ≠ a) “correlates” between sub-images in different perspectives, i.e. in the a-th and b-th perspectives, respectively.
  • the Ca,b are shown in Fig. 4D.
  • the third vector k is shown in Fig. 4E for the same case (i.e. wherein the number of sub-images is n and the number of perspectives is m). Similarly to the first vector v, the third vector k is “composed” of m n-component vectors kj shown in Fig. 4F. Each of the kj corresponds to a different perspective (labeled by the index j).
  • the second vector u (obtained in sub-operation 320c1) is the matrix product of the inverse of C with the (column vector) v.
  • In sub-operation 320c3 it is checked whether k • u > B.
  • the value of the threshold B may depend on the predetermined kernel (i.e. on characteristics of a defect(s) the sub-region is suspected of including or partially including).
  • the value of the threshold B may also vary from one sub-region to a neighboring sub-region depending on the geometries of the respective patterns thereon. This may be the case even when the sub-regions and the neighboring sub-regions each correspond in size to a pixel and each include a respective part of the same defect.
  • a defect may typically measure at least about 10 nm x 10 nm in area, and may affect the signals obtained from a region measuring at least about 100 nm x 100 nm therearound (i.e. corresponding to at least about 3 x 3 pixels given that a pixel corresponds to an area of about 10 nm x 10 nm on the wafer) when the radiation source is optical and produces an illumination spot having a diameter of about 100 nm.
  • the threshold B may be chosen, such that the percentage of false alarms (i.e. cases wherein a sub-region of a wafer, which is not defective, is mistakenly determined as defective) would not exceed a pre-defined (threshold) rate.
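Sub-operations 320c1 to 320c3 amount to a whitened matched-filter test: compute u as the inverse covariance applied to v, then compare k • u to the threshold B. A minimal sketch (the toy C, k, and B values are assumptions, not taken from the patent):

```python
import numpy as np

def is_defective(v, C, k, B):
    # Whiten the difference-value vector with the inverse covariance
    # (u = C^-1 v, sub-operation 320c1), then compare the kernel
    # response k . u (sub-operation 320c2) to threshold B (320c3).
    u = np.linalg.solve(C, v)    # numerically preferable to inv(C) @ v
    return float(k @ u) > B

n_times_m = 6
C = np.eye(n_times_m)            # toy noise model: uncorrelated, unit variance
k = np.ones(n_times_m)           # toy kernel: a flat defect signature

clean = np.zeros(n_times_m)      # difference values with no signal
defect = 0.9 * k                 # difference values resembling the signature
print(is_defective(clean, C, k, B=2.0))   # → False
print(is_defective(defect, C, k, B=2.0))  # → True (k . u = 0.9 * 6 = 5.4)
```

In practice B would be tuned, per the preceding bullet, so that the empirical false-alarm rate on known-good sub-regions stays below the pre-defined rate.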
  • some of the off-diagonal terms or blocks (e.g. some of the matrices Ca,b with b ≠ a) may be disregarded (set to zero), in which case the covariance matrix decomposes into m smaller covariance matrices. Each of the m smaller covariance matrices corresponds to one of the perspectives, with m being the number of perspectives.
  • the third vector k (i.e. the predetermined kernel) characterizes the signature of a defect, or a family of defects (i.e. similar defects), in the absence (or substantially in the absence) of wafer noise, and may be obtained by applying a matched filter to the signature of the defect, or the family of defects, in the presence of wafer noise, such as to maximize the signal-to-noise ratio.
  • the third vector k characterizes the signature of a specific type of defect the sub-region is suspected of including (or partially including).
  • the predetermined kernel may be derived based on one or more of: (i) experimental measurements implemented on wafer regions known to include one or more defects, (ii) computer simulations of light scattering from defects, (iii) physical models describing defect behavior, and (iv) machine learning algorithms designed to provide optimized kernels.
  • some pairs of perspectives may be known to exhibit weaker correlations than other pairs of perspectives (e.g. based on scan data obtained in preliminary scanning).
  • terms in blocks corresponding to pairs of perspectives, which are known to exhibit weaker correlations, are not computed in order to expedite the analysis.
  • higher moments of the joint probability distribution (relating the measured values obtained by the imager in operation 110) - beyond covariances - may be taken into account as part of the determination in sub-operation 220c of whether a sub-region includes (or partially includes) a defect.
  • skewness and/or kurtosis may be taken into account.
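For completeness, sample skewness and excess kurtosis, the higher moments mentioned above, can be estimated directly from measured values; this numpy-only sketch (illustrative function names) uses synthetic Gaussian noise, for which both moments are close to zero:

```python
import numpy as np

def skewness(x):
    # Third standardized moment of the sample.
    d = x - x.mean()
    return (d ** 3).mean() / (d ** 2).mean() ** 1.5

def excess_kurtosis(x):
    # Fourth standardized moment minus 3 (zero for a Gaussian).
    d = x - x.mean()
    return (d ** 4).mean() / (d ** 2).mean() ** 2 - 3.0

rng = np.random.default_rng(2)
noise = rng.normal(size=100_000)
print(abs(skewness(noise)) < 0.1)          # → True: near zero for Gaussian noise
print(abs(excess_kurtosis(noise)) < 0.2)   # → True: near zero for Gaussian noise
```

A markedly non-zero skewness or kurtosis in the difference values would indicate that a covariance-only (Gaussian) noise model is incomplete.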
  • method 100 may be implemented using a scanning electron microscope (SEM).
  • the multiplicity of perspectives includes two or more of at least one intensity of an irradiating electron beam(s) (e-beam(s)), at least one intensity of a returned e- beam(s), at least one spin of an irradiating e-beam(s), at least one spin of a returned e- beam(s), one or more incidence angle(s) of the irradiated e-beam(s), and one or more collection angle(s) of the returned e-beam(s).
  • method 100 may be implemented using an atomic force microscope (AFM).
  • the multiplicity of perspectives may include different types of AFM tips, different tapping modes, and/or applying the AFM at different resonance frequencies.
  • difference values corresponding to pixels within a sub-image of a difference image may be averaged over to obtain a single (“coarse-grained”) difference value corresponding to the sub-image.
  • the set of difference values corresponding to a sub-region may include averaged difference values pertaining to sub-images of the sub-region, and averaged difference values pertaining to sub-images of neighboring sub-regions, in each of the multiplicity of perspectives. (Each averaged difference value was obtained by averaging over difference values pertaining to pixels making up the respective sub-image).
  • the covariance matrix may then be computed based on the averaged difference values, thereby potentially significantly lightening the computational load.
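The coarse-graining step can be sketched as block-averaging over non-overlapping sub-images (the function name and block size are illustrative; the patent does not prescribe this exact implementation):

```python
import numpy as np

def coarse_grain(diff_image, block=4):
    # Average difference values over non-overlapping block x block
    # sub-images, yielding one coarse-grained value per sub-image.
    h, w = diff_image.shape
    assert h % block == 0 and w % block == 0
    return diff_image.reshape(h // block, block,
                              w // block, block).mean(axis=(1, 3))

d = np.arange(64, dtype=float).reshape(8, 8)   # toy 8 x 8 difference image
coarse = coarse_grain(d, block=4)
print(coarse.shape)                            # → (2, 2): 4-fold reduction per axis
```

Since the covariance matrix dimension scales with the number of difference values per set, a 4-fold reduction per axis shrinks the matrix (and the cost of inverting it) considerably.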
  • a computerized system for obtaining and analyzing multi-perspective scan data of a wafer.
  • Fig. 6 is a block diagram of such a computerized system, a computerized system 600, according to some embodiments.
  • System 600 includes scanning equipment 602 and a scan data analysis module 604
  • Scanning equipment 602 is configured to scan a wafer in each of a multiplicity of perspectives (e.g. as listed above in the Methods subsection). According to some embodiments, scan data pertaining to two or more of the multiplicity of perspectives may be obtained simultaneously or substantially simultaneously. Additionally or alternatively, according to some embodiments, scanning equipment 602 may be configured to scan a wafer, one perspective (from the multiplicity of perspectives) at a time. That is, scanning equipment 602 may be configured to switch between perspectives.
  • Scan data analysis module 604 is configured to (i) receive multi-perspective scan data obtained by scanning equipment 602, and (ii) perform an integrated analysis of the multi-perspective scan data, as further elaborated on below.
  • scanning equipment 602 includes a stage 612, a controller 614, an imager 616 (imaging device), and optical equipment 618.
  • Scanning equipment 602 is delineated by a dashed-double-dotted box to indicate that components therein (e.g. stage 612 and imager 616) may be separate from one another, e.g. in the sense of not being included in a common housing.
  • Stage 612 is configured to have placed thereon a sample to be inspected, such as a wafer 620 (or an optical mask). Wafer 620 may be patterned, but the skilled person will appreciate that method 100 may be utilized to detect defects also in bare wafers. According to some embodiments, stage 612 may be moveable, as elaborated on below.
  • Imager 616 may include one or more light emitters (e.g. a visible and/or ultraviolet light source) configured to irradiate wafer 620. Further, imager 616 may include one or more light detectors. In particular, imager 616 may apply collection techniques including brightfield collection, grayfield collection, and the like.
  • Optical equipment 618 may include optical filters (e.g.
  • optical equipment 618 may be configured to allow switching scanning equipment 602 between different perspectives.
  • optical equipment 618 may include polarizing filters and/or beam splitters configured to set the polarization of emitted (illuminating) light and/or select the polarization of the collected (returned) light.
  • optical equipment 618 may include any arrangement of optical components configured to determine (set) one or more optical properties (such as shape, spread, polarization) of the radiation beam from a radiation source of imager 616, and the trajectory of the incident radiation beam.
  • optical equipment 618 may further include any arrangement of optical components configured to select (e.g. by filtering) one or more optical properties of one or more returned radiation beams (e.g. beams specularly reflected by, or diffusely scattered off of, wafer 620) prior to the detection thereof, and the trajectories followed by the one or more returned beams when returned from wafer 620.
  • optical equipment 618 may further include optical components configured to direct the one or more returned radiation beams towards the detectors of imager 616.
  • Controller 614 may be functionally associated with stage 612, imager 616, and optical equipment 618, as well as with scan data analysis module 604. More specifically, controller 614 is configured to control and synchronize operations and functions of the above-listed modules and components during a scan of a wafer.
  • stage 612 is configured to support an inspected sample, such as wafer 620, and to mechanically translate the inspected sample along a trajectory set by controller 614, which also controls imager 616.
  • Scan data analysis module 604 includes computer hardware (one or more processors, such as image and/or graphics processor units, and volatile as well as non-volatile memory components; not shown). The computer hardware is configured to analyze multiperspective scan data received from imager 616, of a region on wafer 620, for presence of defects, essentially as described above in the Methods subsection.
  • Scan data analysis module 604 may further include an analog-to-digital (signal) converter (ADC) and a frame grabber (not shown).
  • the ADC may be configured to receive analog image signals from imager 616. Each analog image signal may correspond to a different perspective from a multiplicity of perspectives.
  • the ADC may further be configured to convert the analog image signals into digital image signals and to transmit the digital image signals to the frame grabber.
  • the frame grabber may be configured to obtain, from the digital image signals, digital images (block images or image frames) of scanned regions on a scanned wafer (e.g. wafer 620). Each digital image may be in one of the multiplicity of perspectives.
  • the frame grabber may be further configured to transmit the digital images to one or more of the processors and/or memory components.
  • scan data analysis module 604 may be configured to:
  • Each set of difference values corresponds to a sub-region (e.g. “pixel”) of the scanned region, essentially as described above in the Methods subsection in the description of Fig. 2.
  • scan data analysis module 604 may be configured to, for each set of difference values, and based at least thereon, generate the corresponding set of noise values, essentially as described above in the Methods subsection in the description of Fig. 2, and, according to some embodiments of system 600, in the description of Fig. 3.
  • the generation of the set of noise values may be based at least on scan data obtained in a preliminary scan(s) of the wafer, wherein representative regions of the wafer are scanned.
  • the determination of whether the sub-region is defective may be implemented taking into account the type of defect(s) the sub-region is suspected of including or partially including.
  • the determination may involve computation of a covariance matrix, and may further include computations involving a predetermined kernel - which characterizes the signature of the suspected type of defect(s) in essentially the absence of wafer noise - and a corresponding threshold.
  • a computerized system for obtaining and analyzing multi-perspective scan data of a wafer.
  • the system may be similar to system 600 but differs therefrom at least in utilizing an electron beam(s), rather than electromagnetic radiation, to irradiate the wafer.
  • an imager of the system may include a scanning electron microscope.
  • a computerized system for obtaining and analyzing multi-perspective scan data of a wafer.
  • the system may be similar to system 600 but differs therefrom at least in utilizing an atomic force microscope rather than an optical-based imager.
  • Fig. 7A schematically depicts a computerized system 700, which is a specific embodiment of system 600.
  • System 700 includes a radiation source 722 and a plurality of detectors 724, which together constitute (or form part of) an imager, which is a specific embodiment of imager 616 of system 600.
  • System 700 further includes a scan data analysis module 704, which is a specific embodiment of scan data analysis module 604 of system 600.
  • System 700 further includes a beam splitter 732, and an objective lens 734, which together constitute (or form part of) optical equipment, which is a specific embodiment of optical equipment 618 of system 600.
  • a stage 712 which is a specific embodiment of stage 612 of system 600
  • a wafer 720 placed thereon.
  • optical axis O of objective lens 734 is also indicated.
  • Optical axis O extends in parallel to the z-axis.
  • light is emitted by radiation source 722.
  • the light is directed towards beam splitter 732 wherethrough some of the light is transmitted.
  • the transmitted light is focused by objective lens 734 on wafer 720, such as to form an illumination spot S thereon.
  • Returned light (which underwent specular reflection off of wafer 720) is directed back towards objective lens 734, and is refracted therethrough towards beam splitter 732.
  • a portion of the returned light (having been refracted via objective lens 734) is reflected by beam splitter 732 towards detectors 724.
  • a first light ray L1 and a second light ray L2 indicate light rays emitted by radiation source 722.
  • a third light ray L3 and a fourth light ray L4 indicate (returned) light rays travelling towards detectors 724 after having been reflected off of beam splitter 732 (following scattering off of wafer 720 and refraction through objective lens 734).
  • Third light ray L3 constitutes the portion of first light ray L1, which remains after the transmission through, and subsequent reflection by, beam splitter 732.
  • Fourth light ray L4 constitutes the portion of second light ray L2, which remains after the transmission through, and subsequent reflection by, beam splitter 732.
  • segmented pupil 740 (a segmented aperture, which also forms part of the optical equipment). Segmented pupil 740 may be positioned on the pupil plane and detectors 724 may be positioned on a plane conjugate to the pupil plane. Segmented pupil 740 is partitioned into a plurality of pupil segments (or sub-apertures). The segmentation of the pupil makes it possible to separate a returned beam (e.g. a light beam reflected off of the wafer), arriving at the pupil, into sub-beams, according to the respective return angle of each of the sub-beams, so that each pupil segment corresponds to a different perspective. That is, each of the perspectives generated by segmented pupil 740 corresponds to a different collection angle.
  • segmented pupil 740 is shown partitioned into nine pupil segments 740a to 740i, which are arranged in a square array, and detectors 724 include nine corresponding detectors 724a to 724i.
  • System 700 is configured such that light (originating in radiation source 722, and which has undergone specular reflection off of wafer 720) arriving at each of the pupil segments, continues therefrom towards a respective detector from detectors 724. That is, light passing through first pupil segment 740a is sensed by first detector 724a, light passing through second pupil segment 740b is sensed by second detector 724b, and so on.
  • Each of detectors 724 is thus configured to sense light returned at a different angle, respectively.
  • the optical equipment may further include optical guiding mechanisms (not shown) for directing light passing through each of the pupil segments.
  • the optical guiding mechanisms may be configured to ensure that light passing through a pupil segment is directed to a respective (target) detector (from detectors 724) without “leakage” to the other detectors.
  • the optical equipment may be configured such that light arriving at objective lens 734 (directly) from radiation source 722, arrives thereat as a collimated light beam.
  • Wafer 720 may be positioned at, or substantially at, the focal plane of objective lens 734, so that the light rays, incident on wafer 720, form illumination spot S thereon, which may be as small as about 100 nanometers.
  • Different light rays from the collimated light beam, having been refracted through objective lens 734 may be incident on wafer 720 at different angles.
  • a refracted portion of first light ray L1 is incident on wafer 720 at a first angle of incidence θ1, and a refracted portion of second light ray L2 is incident on wafer 720 at a second angle of incidence θ2.
  • θ2 may be used to refer to the reflection (return) angle of the refracted portion of first light ray L1 off of wafer 720 instead of to the incidence angle of the refracted portion of second light ray L2 on wafer 720.
  • θ1 may be used to refer to the reflection (return) angle of the refracted portion of second light ray L2 off of wafer 720 instead of to the incidence angle of the refracted portion of first light ray L1 on wafer 720.
  • Not only the angle of incidence may be relevant for multi-perspective wafer analysis, but also the azimuthal angle, that is, the angle formed by the “projection” of the incident light ray onto the wafer surface with an x-axis of an orthogonal coordinate system parameterizing the (lateral dimensions of the) wafer surface. The azimuthal angle may be particularly relevant when wafer 720 is patterned (due to one or more asymmetries introduced by the pattern relative to the wafer surface).
  • Each of detectors 724 is positioned such as to detect light rays which have impinged on wafer 720 at a polar angle θ (or, more precisely, a continuous range of polar angles centered about θ) and an azimuthal angle φ (or, more precisely, a continuous range of azimuthal angles centered about φ).
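The relationship between a detector's pupil-segment position and the pair of angles (θ, φ) it samples can be sketched with a simple thin-lens model. The function below is only an illustrative assumption (the names, coordinates, and focal length are not taken from the disclosure):

```python
import numpy as np

def segment_angles(x, y, focal_length):
    """Map a pupil-segment center (x, y), measured in the pupil plane
    relative to the optical axis, to the polar angle theta and azimuthal
    angle phi (in degrees) of the corresponding rays at the wafer."""
    r = np.hypot(x, y)                               # radial offset in the pupil plane
    theta = np.degrees(np.arctan2(r, focal_length))  # polar angle of incidence
    phi = np.degrees(np.arctan2(y, x)) % 360.0       # azimuth relative to the x-axis
    return float(theta), float(phi)

# A segment on the optical axis corresponds to normal incidence (theta = 0);
# a segment with equal x and y offsets sits at 45 degrees azimuth.
on_axis = segment_angles(0.0, 0.0, 10.0)
off_axis = segment_angles(1.0, 1.0, 10.0)
```

In such a model, each pupil segment maps to a narrow range of (θ, φ) pairs around its center, which is what makes each detector a distinct perspective.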
  • system 700 may further include infrastructure (e.g. suitably positioned detectors) for sensing light, which has been diffusely scattered off of wafer 720 (in particular, light rays outside of the cone of light generated by objective lens 734).
  • system 700 may be configured to use images, generated based on the sensed grayfield scattered light, as one or more additional perspectives and/or as reference images for perspective-to-perspective registration.
  • Scan data analysis module 704 is configured to receive scan data from detectors 724 and based thereon, determine whether a scanned region includes one or more defects, essentially as described with respect to scan data analysis module 604 of system 600. Scan data from each of detectors 724a to 724i may be used to generate difference images I1 to I9, respectively, each in a different perspective.
  • as segmented pupil 740 is depicted as including nine pupil segments, to each of which corresponds a detector from detectors 724, the number of perspectives is nine.
  • the set of difference values associated with a first “pixel” (i.e. a sub-region of a size corresponding to an image pixel) on the wafer therefore includes 9 × (N + 1) elements (difference values), where N is the number of neighboring pixels taken into account. That is to say, N is the number of neighboring pixels whose difference values are included in the set of difference values associated with the first pixel.
  • the set of difference values includes 81 elements.
  • the predetermined kernel also includes 81 elements.
  • the covariance matrix is then an 81 × 81 matrix.
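As a concrete sketch of how such a set of difference values could be assembled, the snippet below stacks, for one pixel, the 3×3 neighborhood (N = 8) from each of nine per-perspective difference images into a single 81-element vector. The array shapes and names are illustrative assumptions, not taken from the disclosure:

```python
import numpy as np

def difference_vector(diff_images, i, j, radius=1):
    """Stack, for pixel (i, j), the difference values of the pixel and its
    neighbors from every perspective into a single vector.

    diff_images: array of shape (P, H, W), one difference image per
    perspective (here P = 9). With radius=1 the 3x3 neighborhood gives
    N = 8 neighbors, so the vector has P * (N + 1) = 81 elements."""
    window = diff_images[:, i - radius:i + radius + 1, j - radius:j + radius + 1]
    return window.reshape(-1)

rng = np.random.default_rng(0)
diffs = rng.normal(size=(9, 32, 32))   # nine simulated difference images
v = difference_vector(diffs, 16, 16)   # 81-element difference-value vector
```

The covariance matrix of the noise afflicting such vectors, pooled over many pixels, is then 81 × 81, with the off-diagonal blocks holding the cross-perspective covariances.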
  • While in Fig. 7A the pupil segments are depicted as being of equal shape and size, it is to be understood that in general different pupil segments of segmented pupil 740 may differ from one another in shape and/or in size. In particular, according to some embodiments, different pupil segments may differ in area (i.e. their lateral dimensions parallel to the zx-plane) as well as in their respective longitudinal extensions (for example, the y-coordinate of the entrance and/or exit of a pupil segment may vary from one pupil segment to another).
  • Fig. 8 schematically depicts a computerized system 800, which is a specific embodiment of system 600.
  • System 800 is similar to system 700 but differs therefrom in including optical components separating the light returned from the wafer into different polarizations, thereby allowing the number of perspectives to be doubled.
  • system 800 includes a radiation source 822, a first plurality of detectors 824, and a second plurality of detectors 826, which together constitute (or form part of) an imager, which is a specific embodiment of imager 616 of system 600.
  • System 800 further includes a scan data analysis module 804, which is a specific embodiment of scan data analysis module 604 of system 600.
  • System 800 further includes a first beam splitter 832, an objective lens 834, a second beam splitter 836, a first segmented pupil 840, and a second segmented pupil 850, which together constitute (or form part of) optical equipment, which is a specific embodiment of optical equipment 618 of system 600.
  • Second beam splitter 836 is a polarizing beam splitter. Also shown are a stage 812 (which is a specific embodiment of stage 612 of system 600) and a wafer 820 placed thereon.
  • radiation source 822 may be similar to radiation source 722, and each of plurality of detectors 824 and 826 may be similar to plurality of detectors 724.
  • First beam splitter 832 and objective lens 834 may be similar to beam splitter 732 and objective lens 734, and each of segmented pupils 840 and 850 may be similar to segmented pupil 740.
  • a portion of a light beam, emitted by radiation source 822, is transmitted through first beam splitter 832, focused by objective lens 834 (to form an illumination spot S' on wafer 820), returned by wafer 820, focused again by objective lens 834, and reflected off of first beam splitter 832, essentially as described above with respect to system 700.
  • the portion of the returned light beam reflected off of first beam splitter 832 travels towards second beam splitter 836, and is partitioned thereby into two light beams of different polarization (e.g. s-polarized light and p-polarized light): a first polarized light beam and a second polarized light beam.
  • the first polarized light beam travels towards first segmented pupil 840 and first plurality of detectors 824, and the second polarized light beam travels towards second segmented pupil 850 and second plurality of detectors 826 (so that, per combination of pupil segment and polarization, a detector is allocated).
  • Scan data analysis module 804 is configured to receive scan data from detectors 824 and detectors 826, and based on the scan data, determine whether a scanned region includes one or more defects, essentially as described with respect to scan data analysis module 604 of system 600.
  • Scan data from each of first detectors 824a to 824i may be used to generate difference images J1 to J9, respectively, each in a different perspective.
  • Scan data from each of second detectors 826a to 826i may be used to generate difference images J10 to J18, respectively, each in a different perspective (and in a different polarization to difference images J1 to J9).
  • two difference images in two different perspectives may be obtained: a first difference image corresponding to the first polarization, and a second difference image corresponding to the second polarization.
  • as each of segmented pupils 840 and 850 is depicted as including nine pupil segments, to each of which corresponds a detector from detectors 824 and detectors 826, respectively, the number of perspectives is eighteen.
  • the set of difference values associated with a first “pixel” on the wafer therefore includes 18 × (N' + 1) elements (difference values), where N' is the number of neighboring pixels taken into account.
  • the set of difference values includes 162 elements.
  • the predetermined kernel also includes 162 elements.
  • the covariance matrix is then a 162 × 162 matrix.
  • Fig. 9 schematically depicts a computerized system 900, which is a specific embodiment of system 600.
  • System 900 includes a radiation source 922, a first detector 924, a second detector 926, and a third detector 928, which together constitute (or form part of) an imager, which is a specific embodiment of imager 616 of system 600.
  • System 900 further includes a scan data analysis module 904, which is a specific embodiment of scan data analysis module 604 of system 600.
  • System 900 further includes a first beam splitter 932, an objective lens 934, a second beam splitter 936, a third beam splitter 938, a first polarizer 942, and a second polarizer 944, which together constitute (or form part of) optical equipment, which is a specific embodiment of optical equipment 618 of system 600. (Non-segmented) pupils before each of detectors 924, 926, and 928 are not shown. Also shown are a stage 912 (which is a specific embodiment of stage 612 of system 600) and a wafer 920 placed thereon.
  • First polarizer 942 is positioned before second detector 926, and second polarizer 944 is positioned before third detector 928. First polarizer 942 is configured to filter through light of a first polarization, and second polarizer 944 is configured to filter through light of a second polarization, which is different from the first polarization.
  • a portion of a light beam, emitted by radiation source 922, is transmitted through first beam splitter 932, focused by objective lens 934 (to form an illumination spot S" on wafer 920), returned by wafer 920, focused again by objective lens 934, and reflected off of first beam splitter 932, essentially as described above with respect to system 700.
  • the portion of the returned light beam reflected off of first beam splitter 932 travels towards second beam splitter 936, and is partitioned thereby into a first returned sub-beam and a second returned sub-beam.
  • the first returned sub-beam constitutes the portion of the returned light beam which is transmitted via second beam splitter 936.
  • the second returned sub-beam constitutes the portion of the returned light beam which is reflected by second beam splitter 936.
  • the first returned sub-beam travels towards first detector 924 and is sensed thereby.
  • the second returned sub-beam travels towards third beam splitter 938 and is partitioned thereby into a transmitted portion and a reflected portion.
  • the transmitted portion travels towards first polarizer 942 and the reflected portion travels towards second polarizer 944.
  • Polarizers 942 and 944 may be aligned at different angles, such that each of second detector 926 and third detector 928 sense light of different polarizations.
  • Detectors 924, 926, and 928 may thus be configured to provide readings sufficient to fully characterize the polarization of the returned light beam (reflected off of wafer 920).
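As a hedged illustration of how readings of this kind could constrain the polarization state of the returned beam, the snippet below recovers the linear Stokes parameters from a total-intensity reading and two polarizer readings at 0 and 45 degrees. This is a simplified textbook model, not the disclosed arrangement; fully characterizing the polarization (including its circular component) would additionally require a retarder measurement:

```python
def linear_stokes(i_total, i_0deg, i_45deg):
    """Recover the linear Stokes parameters (S0, S1, S2) from a
    total-intensity reading and two polarizer readings at 0 and 45 degrees.
    A polarizer at angle a transmits (S0 + S1*cos(2a) + S2*sin(2a)) / 2."""
    s0 = i_total
    s1 = 2.0 * i_0deg - i_total    # from I(0)  = (S0 + S1) / 2
    s2 = 2.0 * i_45deg - i_total   # from I(45) = (S0 + S2) / 2
    return s0, s1, s2

# Horizontally polarized light of unit intensity: I(0) = 1, I(45) = 0.5,
# giving the Stokes vector components S0 = 1, S1 = 1, S2 = 0.
s0, s1, s2 = linear_stokes(1.0, 1.0, 0.5)
```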
  • Scan data analysis module 904 is configured to receive scan data from detectors 924, 926 and 928, and based on the scan data, determine whether a scanned region includes one or more defects, essentially as described with respect to scan data analysis module 604 of system 600. Scan data from each of detectors 924, 926, and 928 may be used to generate difference images K1, K2, and K3, respectively, each in a different perspective.
  • the set of difference values associated with a first “pixel” on the wafer therefore includes 3 × (N" + 1) elements (difference values), where N" is the number of neighboring pixels taken into account.
  • the set of difference values includes 27 elements.
  • the predetermined kernel also includes 27 elements.
  • the covariance matrix is then a 27 × 27 matrix.
  • a single polarizing beam splitter may be substituted for the combination of third beam splitter 938, first polarizer 942, and second polarizer 944.
  • Fig. 10A presents multi-perspective scan data obtained by simulating a computerized system, such as system 700.
  • the multi-perspective scan data include nine images (enumerated by Roman numerals I to IX) - each in a different perspective - of a square region of a (simulated) wafer.
  • the region was taken to be uniform except for a deformity (as may be introduced by a dust particle) in the center of the region (i.e. at a central pixel).
  • the dimensions of the region were set to 1 μm².
  • a pixel may be determined as being defective when the quantity s_ij = (v_ij)^T (C_ij)^(-1) k_ij is greater than a corresponding threshold, where the indices i and j label the pixel (i and j respectively denote the row and column of the pixel).
  • v_ij is the first vector corresponding to the (i, j)-th pixel.
  • C_ij is the covariance matrix corresponding to the (i, j)-th pixel.
  • k_ij is the third vector (the kernel) corresponding to the (i, j)-th pixel.
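A minimal sketch of the resulting detection rule, scoring a pixel by the product of its difference-value vector, the inverse of the covariance matrix, and the kernel, and flagging the pixel when the score is large. The data here are synthetic and the names illustrative:

```python
import numpy as np

def defect_score(v, cov, kernel):
    """Score s = v^T C^{-1} k for one pixel; the pixel may be declared
    defective when the score exceeds a threshold."""
    return v @ np.linalg.solve(cov, kernel)  # solve() avoids forming C^{-1} explicitly

rng = np.random.default_rng(1)
n = 27                                       # e.g. 3 perspectives x 9 pixels
A = rng.normal(size=(n, n))
cov = A @ A.T + n * np.eye(n)                # a well-conditioned synthetic covariance
kernel = rng.normal(size=n)                  # expected defect signature k
clean = rng.normal(size=n)                   # noise-only difference vector
defective = clean + 5.0 * kernel             # inject a defect along k

# Injecting signal along k raises the score, since C is positive definite.
assert defect_score(defective, cov, kernel) > defect_score(clean, cov, kernel)
```

Using `np.linalg.solve` rather than explicitly inverting C is a standard numerical choice; in a real pipeline the inverse (or a factorization) would likely be reused across pixels.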
  • Fig. 10B is a graphical representation of the s_ij corresponding to the simulated region, when the cross-perspective covariances are not taken into account. This effectively amounts to setting to zero the off-diagonal blocks of the C_ij.
  • the s_ij are arranged in a square array in accordance with the values assumed by i and j.
  • the threshold B may be taken to be the same for all the pixels, in which case no extra information is gained by subtracting B from the s_ij.
  • Fig. 10C is a graphical representation of the s_ij corresponding to the simulated region, when the cross-perspective covariances are taken into account (i.e. all the components of the C_ij are computed).
  • a dashed circle D is drawn around the central pixel (which corresponds to the defective pixel) in Fig. 10B, with the central pixel being indicated by an arrow d.
  • a dashed circle D' is drawn around the central pixel in Fig. 10C, with the central pixel being indicated by an arrow d'.
  • the central pixel appears much brighter in Fig. 10C than in Fig. 10B, i.e. the defect signal is much stronger in Fig. 10C than in Fig. 10B, demonstrating the improved defect detection capacity of the disclosed methods. Taking the cross-perspective covariances into account increased the signal-to-noise ratio from ~0.7 to ~2.2.
  • a method for obtaining information about a region of a sample (e.g. a wafer).
  • the method includes:
  • the multiple images may differ from each other by at least one parameter selected out of illumination spectrum, collection spectrum, illumination polarization, collection polarization, angle of illumination, angle of collection, and sensing type (e.g. intensity, phase, polarization).
  • the obtaining of the multiple images includes illuminating (irradiating) the region and collecting radiation from the region.
  • the region includes multiple region pixels (i.e. the region includes a plurality of sub-regions, each of which is of a size corresponding to a pixel).
  • an image processor e.g. a scan data analysis module
  • multiple difference images that represent differences between the multiple images and the multiple reference images.
  • the covariance matrix (and its inverse) is a set of numbers characterizing statistical properties of the noise. Those statistical properties can generally be referred to as “attributes”. The use of the covariance matrix as statistical properties is a specific non-limiting embodiment.
  • the determining of whether the region pixel represents a defect is also responsive to a set of attributes of an actual defect.
  • the determining of whether the region pixel represents a defect is also responsive to a set of attributes of an estimated defect.
  • the method includes calculating the set of noise attributes by calculating a covariance matrix.
  • the calculating of the covariance matrix includes: calculating, for each region pixel, a set of covariance values that represent the covariance between different attributes (i.e. between different perspectives) of the set of region pixel attributes of the region pixel, and calculating the given covariance matrix based on multiple sets of covariance values of the multiple region pixels.
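Under the simplifying assumption that the noise statistics are shared by the region pixels being pooled, such a covariance matrix could be estimated from many per-pixel attribute vectors along the following lines (the names and shapes are illustrative, not taken from the disclosure):

```python
import numpy as np

def estimate_covariance(vectors):
    """Estimate the noise covariance matrix from the attribute vectors of
    many region pixels (one vector per row), assuming the noise statistics
    are shared across those pixels."""
    centered = vectors - vectors.mean(axis=0)          # remove the per-attribute mean
    return centered.T @ centered / (len(vectors) - 1)  # unbiased sample covariance

rng = np.random.default_rng(2)
samples = rng.normal(size=(5000, 4))  # 5000 pixels, 4 attributes each
C = estimate_covariance(samples)      # 4 x 4 symmetric matrix
```

This is the standard unbiased sample covariance; it agrees with `numpy.cov` applied to the transposed sample matrix.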
  • the method further includes determining, for each region pixel, whether the region pixel represents a defect by comparing, to a threshold, a product of a multiplication between (i) a set of attributes of the region pixel (e.g. the first vector v), (ii) an inverse of the covariance matrix (e.g. the matrix C^(-1)) corresponding to the noise afflicting the set of attributes of the region pixel, and (iii) a set of attributes of the defect of interest (e.g. the third vector k).
  • the set of pixel attributes of a region pixel includes data regarding the region pixel and neighboring region pixels of the region pixel.
  • the imager includes multiple detectors for generating the multiple images, and the method further includes allocating different detectors to detect radiation from different pupil segments of the multiple pupil segments (of a segmented pupil).
  • the number of pupil segments of the multiple pupil segments exceeds four.
  • the imager includes multiple detectors for generating the multiple images, and the method further includes allocating different detectors to detect radiation from different combinations of (a) polarization and (b) different pupil segments of the multiple pupil segments.
  • the method includes obtaining the multiple images at a same point in time.
  • the method includes obtaining the multiple images at different points in time.
  • the method further includes classifying the defect.
  • the method further includes determining whether the defect is a defect of interest or not a defect of interest.
  • a computerized system for obtaining information about a region of a sample (e.g. an area on a wafer).
  • the system includes an imager that includes optics and an image processor.
  • the imager is configured to obtain multiple images of the region.
  • the multiple images may differ from each other by at least one parameter selected out of illumination spectrum, collection spectrum, illumination polarization, collection polarization, angle of illumination, and angle of collection.
  • the obtaining of the multiple images includes illuminating the region and collecting radiation from the region.
  • the region includes multiple region pixels.
  • the computerized system is configured to receive or generate multiple reference images.
  • the image processor is configured to:
  • a non-transitory computer-readable medium that stores instructions that cause a computerized system to:
  • the multiple images differ from each other by at least one parameter selected out of illumination spectrum, collection spectrum, illumination polarization, collection polarization, angle of illumination, angle of collection, and sensing type.
  • the obtaining of the multiple images includes illuminating the region and collecting radiation from the region.
  • the region includes multiple region pixels.
  • the terms “collection channel” and “detection channel” may be used interchangeably.
  • the notation “Vdata”, “Cov”, and “Vdefect” may be used to indicate the first vector v, the covariance matrix C, and the third vector k, respectively.
  • the term “group” may refer not only to pluralities of elements (e.g. components, features) but also to single elements. In the latter case, the group may be referred to as a “single-member group”.

Landscapes

  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Immunology (AREA)
  • Health & Medical Sciences (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Pathology (AREA)
  • Analytical Chemistry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Testing Or Measuring Of Semiconductors Or The Like (AREA)

Abstract

A method for detecting defects on a sample is disclosed. The method includes obtaining scan data of a region of a sample in a multitude of perspectives, and performing an integrated analysis of the obtained scan data. The integrated analysis includes computing, based on the obtained scan data, and/or estimating, cross-perspective covariances, and determining a presence of defects in the region, taking the cross-perspective covariances into account.
PCT/US2021/048935 2020-09-02 2021-09-02 Analyse de tranche multi-perspective WO2022051551A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020237010768A KR20230056781A (ko) 2020-09-02 2021-09-02 다중 관점 웨이퍼 분석
CN202180067312.2A CN116368377A (zh) 2020-09-02 2021-09-02 多视角晶片分析

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17/010,746 US11815470B2 (en) 2019-01-17 2020-09-02 Multi-perspective wafer analysis
US17/010,746 2020-09-02

Publications (1)

Publication Number Publication Date
WO2022051551A1 true WO2022051551A1 (fr) 2022-03-10

Family

ID=80491432

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/048935 WO2022051551A1 (fr) 2020-09-02 2021-09-02 Analyse de tranche multi-perspective

Country Status (4)

Country Link
KR (1) KR20230056781A (fr)
CN (1) CN116368377A (fr)
TW (1) TW202221314A (fr)
WO (1) WO2022051551A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110118623A (ko) * 2009-01-26 2011-10-31 케이엘에이-텐코 코포레이션 웨이퍼 상의 결함 검출 시스템 및 방법
US20120141012A1 (en) * 2009-08-26 2012-06-07 Kaoru Sakai Apparatus and method for inspecting defect
US20120294507A1 (en) * 2010-02-08 2012-11-22 Kaoru Sakai Defect inspection method and device thereof
US9194817B2 (en) * 2012-08-23 2015-11-24 Nuflare Technology, Inc. Defect detection method
US20200232934A1 (en) * 2019-01-17 2020-07-23 Applied Materials Israel, Ltd. Multi-perspective wafer analysis

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7161667B2 (en) * 2005-05-06 2007-01-09 Kla-Tencor Technologies Corporation Wafer edge inspection
KR102330741B1 (ko) * 2012-06-26 2021-11-23 케이엘에이 코포레이션 각도 분해형 반사율 측정에서의 스캐닝 및 광학 계측으로부터 회절의 알고리즘적 제거
US9518934B2 (en) * 2014-11-04 2016-12-13 Kla-Tencor Corp. Wafer defect discovery
US11270430B2 (en) * 2017-05-23 2022-03-08 Kla-Tencor Corporation Wafer inspection using difference images
US10937705B2 (en) * 2018-03-30 2021-03-02 Onto Innovation Inc. Sample inspection using topography

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110118623A (ko) * 2009-01-26 2011-10-31 케이엘에이-텐코 코포레이션 웨이퍼 상의 결함 검출 시스템 및 방법
US20120141012A1 (en) * 2009-08-26 2012-06-07 Kaoru Sakai Apparatus and method for inspecting defect
US20120294507A1 (en) * 2010-02-08 2012-11-22 Kaoru Sakai Defect inspection method and device thereof
US9194817B2 (en) * 2012-08-23 2015-11-24 Nuflare Technology, Inc. Defect detection method
US20200232934A1 (en) * 2019-01-17 2020-07-23 Applied Materials Israel, Ltd. Multi-perspective wafer analysis

Also Published As

Publication number Publication date
CN116368377A (zh) 2023-06-30
KR20230056781A (ko) 2023-04-27
TW202221314A (zh) 2022-06-01

Similar Documents

Publication Publication Date Title
US20200232934A1 (en) Multi-perspective wafer analysis
US11815470B2 (en) Multi-perspective wafer analysis
JP5570530B2 (ja) ウェハー上の欠陥検出
US10921262B2 (en) Correlating SEM and optical images for wafer noise nuisance identification
US8274651B2 (en) Method of inspecting a semiconductor device and an apparatus thereof
TWI677679B (zh) 在雷射暗場系統中用於斑點抑制之方法及裝置
JP6617143B2 (ja) 構造情報を用いた欠陥検出システム及び方法
KR20190082911A (ko) 3차원 반도체 구조체들의 검사를 위한 결함 발견 및 레시피 최적화
WO2010073453A1 (fr) Procédé d'inspection de défaut et dispositif associé
KR20170086539A (ko) 프로세스 윈도우 특성묘사를 위한 가상 검사 시스템
CN111164646A (zh) 用于大型偏移裸片对裸片检验的多步骤图像对准方法
WO2014149197A1 (fr) Détection de défauts sur une tranche utilisant des informations propres aux défauts et multi-canal
US20140376802A1 (en) Wafer Inspection Using Free-Form Care Areas
Liu et al. Microscopic scattering imaging measurement and digital evaluation system of defects for fine optical surface
TWI778258B (zh) 缺陷偵測之方法、系統及非暫時性電腦可讀媒體
CN111819596B (zh) 组合模拟及光学显微术以确定检验模式的方法和系统
US9702827B1 (en) Optical mode analysis with design-based care areas
JPWO2020092856A5 (fr)
JP5450161B2 (ja) 欠陥検査装置および欠陥検査方法
US11688055B2 (en) Methods and systems for analysis of wafer scan data
WO2022051551A1 (fr) Analyse de tranche multi-perspective
JP2013174575A (ja) パターン検査装置、及びこれを使用した露光装置の制御方法
EP4139748A1 (fr) Procédé et système permettant de déterminer une ou plusieurs dimensions d'une ou de plusieurs structures sur une surface d'échantillon

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21865140

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20237010768

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21865140

Country of ref document: EP

Kind code of ref document: A1