WO2015159792A1 - Charged particle beam device - Google Patents

Charged particle beam device

Info

Publication number
WO2015159792A1
WO2015159792A1 (PCT/JP2015/061100, JP2015061100W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
charged particle
particle beam
detector
detectors
Prior art date
Application number
PCT/JP2015/061100
Other languages
French (fr)
Japanese (ja)
Inventor
郭介 牛場
康隆 豊田
池田 光二
安部 雄一
渉 長友
Original Assignee
Hitachi High-Technologies Corporation (株式会社 日立ハイテクノロジーズ)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi High-Technologies Corporation
Publication of WO2015159792A1 publication Critical patent/WO2015159792A1/en

Classifications

    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01J ELECTRIC DISCHARGE TUBES OR DISCHARGE LAMPS
    • H01J 37/00 Discharge tubes with provision for introducing objects or material to be exposed to the discharge, e.g. for the purpose of examination or processing thereof
    • H01J 37/02 Details
    • H01J 37/22 Optical or photographic arrangements associated with the tube
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01J ELECTRIC DISCHARGE TUBES OR DISCHARGE LAMPS
    • H01J 37/00 Discharge tubes with provision for introducing objects or material to be exposed to the discharge, e.g. for the purpose of examination or processing thereof
    • H01J 37/02 Details
    • H01J 37/244 Detectors; Associated components or circuits therefor
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01J ELECTRIC DISCHARGE TUBES OR DISCHARGE LAMPS
    • H01J 37/00 Discharge tubes with provision for introducing objects or material to be exposed to the discharge, e.g. for the purpose of examination or processing thereof
    • H01J 37/26 Electron or ion microscopes; Electron or ion diffraction tubes
    • H01J 37/28 Electron or ion microscopes; Electron or ion diffraction tubes with scanning beams

Definitions

  • The present invention relates to a charged particle beam apparatus, and more particularly to a charged particle beam apparatus including a plurality of detectors.
  • A semiconductor inspection/measurement apparatus (hereinafter sometimes simply called a semiconductor inspection apparatus) provided with a charged particle beam apparatus such as a scanning electron microscope measures or inspects fine patterns. Alignment using a reference image (template) is performed to position the field of view accurately on a fine pattern or to correct field-of-view shift.
  • Patent Document 1 describes an alignment method for detecting field-of-view shift using an inspected image captured with a secondary electron detector and design data.
  • Patent Document 2 describes a method of segmenting a secondary electron image into regions based on backscattered electron detection by a plurality of backscattered electron detectors, and of specifying an inspection position by pattern matching on the segmented image.
  • Patent Document 1: JP-A-5-324836; Patent Document 2: JP 2011-165479 A (corresponding US Pat. No. 8,653,456)
  • With the miniaturization of semiconductor patterns, pattern deformation (rounding of corners and line ends, variation in hole size, positional displacement, and the like) can shift the alignment position, because the white-band edges it affects are the most distinctive features of the captured image; Patent Documents 1 and 2 do not disclose a technique for suppressing the influence of such edge deformation on alignment.
  • Described below is a charged particle beam apparatus that, in alignment processing or in processing that specifies a measurement or inspection target, suppresses the degradation of alignment or position-identification accuracy caused by pattern deformation and the like, and the degradation of the ability to identify the measurement or inspection target.
  • As one aspect, a charged particle beam device is proposed that includes at least two detectors for detecting charged particles obtained by irradiation with a charged particle beam emitted from a charged particle source, and an image processing device for processing images formed based on the outputs of those detectors, wherein the image processing device applies mask processing to the edge regions of the images obtained from the two or more detectors.
  • Flowchart showing a process of dividing an image into regions using two or more images obtained by a charged particle beam apparatus including two or more detectors.
  • Diagram showing an example of a secondary electron image.
  • Diagram showing an example of a pattern cross section of a semiconductor device.
  • Flowchart showing a process of dividing an image into regions using two or more images obtained by a charged particle beam apparatus including two or more detectors.
  • Flowchart showing a process of dividing an image into regions using two or more images obtained by a charged particle beam apparatus including two or more detectors.
  • The embodiments described below relate to an image processing apparatus that processes image signals obtained by, for example, a scanning electron microscope included in an appearance inspection apparatus, a pattern measurement apparatus, or a defect inspection apparatus.
  • In particular, they describe an image processing technique for detecting the field-of-view shift of an inspected image captured by a measurement or inspection apparatus, using the inspected image and semiconductor design data, together with a semiconductor inspection apparatus equipped with this technique and an image processing apparatus that realizes it.
  • In this embodiment, the concave and convex (uneven) regions of a circuit pattern are segmented mainly from two or more secondary electron images captured under different imaging conditions, together with those imaging conditions.
  • A charged particle beam apparatus including an image processing apparatus that performs alignment using these uneven regions is described. With such an image processing apparatus, robust alignment is possible that is unaffected by layouts containing high-density circuit patterns, by deformation, or by the occurrence of defects.
  • The image processing apparatus can be realized by a general-purpose computer that includes basic components such as an input/output interface (not shown: display, keyboard, mouse, LAN port, USB port, etc.), a CPU (arithmetic unit), and memory (storage unit). The following description also serves as a description of a program for operating such a computer.
  • FIG. 1 is a flowchart showing a process of dividing an image into regions using two or more images obtained by a charged particle beam apparatus equipped with two or more detectors.
  • The processing illustrated in FIG. 1 is executed by a control device (image processing device) described later, an external image processing device, or the like.
  • The flowchart illustrated in FIG. 1 includes a step 101 of reading two or more SE images captured under different imaging conditions, a step 102 of extracting an edge region from each SE image, a step 103 of segmenting flat regions by combining the edge regions of the images, and a step 104 of acquiring layer information of the flat regions from the flat regions, the edge regions, and the imaging conditions.
  • The imaging conditions in step 101 include the three-dimensional position of the secondary electron detector of the inspection apparatus, the angle and direction of the electrons the detector captures, the set value of the energy of the captured electrons, and a numerical value expressing the deflection strength of the aligner described later.
  • The edge region of step 102 is the edge portion of a pattern that appears on an SE image, such as the one shown in FIG. 2, as a band shape called a white band 201.
  • The flat-region segmentation in step 103 divides the image into the edge portion, the pattern portion 202, and the space portion 203 using the edge region extracted in step 102.
  • The layer information in step 104 is information indicating, when the semiconductor is viewed in cross section as in FIG. 3, whether the convex region 301 lying higher in three-dimensional space and the concave region lying lower than the convex portion 301 correspond to the pattern portion 202 or the space portion 203. In an SE (Secondary Electron) image, as shown in FIG. 4, two or more circuit pattern layers may appear: an upper-layer pattern portion 401, a lower-layer pattern portion 402 partially hidden beneath it, and a space portion 403. In such a case, the layer information acquired in step 104 distinguishes the upper-layer pattern portion 401, the lower-layer pattern portion 402, and the space portion 403.
  • The image may also simply be divided into a pattern portion and a space portion; the layer information is not limited to the two-layer and three-layer forms described above. After step 104, alignment using the layer information is performed. The detailed processing of each step is described below for each imaging condition.
  • FIG. 5 shows an outline of a scanning electron microscope for acquiring image data.
  • In the following description a scanning electron microscope (SEM) is used as the image acquisition device, but another charged particle beam device may be used instead, such as a focused ion beam (FIB) device that forms an image based on the detection of charged particles obtained by scanning an ion beam over a sample.
  • An electron beam 503, extracted from the electron source 501 by the extraction electrode 502 and accelerated by an acceleration electrode (not shown), is focused by a condenser lens 504, one form of focusing lens, and is then scanned one-dimensionally or two-dimensionally over the sample (semiconductor wafer) 509 by a scanning deflector (not shown).
  • The electron beam 503 is decelerated by a negative voltage applied to an electrode built into the sample stage 508, focused by the lens action of the objective lens 506, and irradiated onto the sample 509.
  • When the electron beam 503 irradiates the sample 509, electrons 510 such as secondary electrons and backscattered electrons are emitted from the irradiated portion.
  • The emitted electrons 510 are accelerated in the direction of the electron source by the accelerating action of the negative voltage applied to the sample, and are deflected by the aligner 505.
  • The electrons 510 deflected by the aligner 505 collide with the conversion electrode 512 and generate secondary electrons 511.
  • The secondary electrons 511 emitted from the conversion electrode 512 are captured by the detector 513, and the output I of the detector 513 changes with the amount of captured secondary electrons. The brightness of a display device (not shown) changes according to this output I. For example, when a two-dimensional image is formed, an image of the scanning region is formed by synchronizing the deflection signal to the scanning deflector 505 with the output I of the detector 513.
  • The scanning electron microscope illustrated in FIG. 5 also includes a deflector (not shown) that moves the scanning region of the electron beam.
  • The control device 514 controls each component of the scanning electron microscope, and has functions for forming an image based on the detected electrons and for measuring the width of a pattern formed on the sample based on the intensity distribution of the detected electrons, called a line profile.
  • FIG. 22 is a detailed configuration example of the control device (image processing device) 514.
  • The control device 514 includes basic computer components such as a CPU 2201, an image memory 2202, an LSI 2203, an output device 2204, and a storage device 2205. To receive design data and the like, it is connected to a design system 2208 and its control device 2207 via a network such as a LAN.
  • FIG. 23 illustrates images obtained by the two detectors shown in FIG. 5, the Upper detector 513a and the Lower detector 513b.
  • a line and space (L & S) pattern and a hole (Hole) pattern are given as examples.
  • the L & S pattern is a pattern in which wiring portions and space portions of a circuit pattern are alternately arranged in a certain direction.
  • the Hole pattern is a pattern in which holes reaching the underlying material are opened in the planar upper pattern.
  • the L & S pattern is composed of titanium nitride TiN at the upper part (line part) and silicon Si at the lower part (space part).
  • the Hole pattern is composed of silicon nitride SiN at the top (hole top side) and silicon Si at the bottom (bottom of the hole).
  • The upper row of FIG. 23 shows the images acquired with the Upper detector 513a, and the lower row the images acquired with the Lower detector 513b.
  • Fewer electrons reach the Lower detector 513b from the space portion 701 of the pattern, so the space portion appears darker than the upper portion 702 of the pattern.
  • The electrons obtained by the Upper detector, by contrast, are mainly secondary electrons whose trajectories form a small angle with that of the irradiating beam; even electrons emitted from the bottom of a deep hole or deep groove are not captured by the side walls, so the space portion 703a and bottom portion 703b are relatively brighter than the space portion 701a and bottom portion 701b.
  • As noted above, electrons emitted from the bottoms of deep holes and deep grooves are detected more by the Upper detector than by the Lower detector, but this condition may not hold depending on the material constituting the sample.
  • Because materials irradiated under the same conditions emit different amounts of secondary electrons, the contrast may be reversed when the detected electron amounts are imaged; since the contrast changes with the material, the space portion is not always relatively dark in the Lower detector image.
  • FIG. 8 is a flowchart showing a process of dividing an image into regions using two or more images obtained by a charged particle beam apparatus including two or more detectors. In the flow of FIG. 8, the portions corresponding to pattern edges are eliminated from the image, so a loss of matching-position accuracy caused by pattern deformation and the like during pattern matching can be suppressed.
  • Step 101 of reading the images and step 102 of extracting the edge regions are the same as in FIG. 1.
  • The edge extraction in step 102 can be performed using an edge extraction filter such as a Sobel filter or a Laplacian filter. Since the edge portion appears in the image as a white band with higher luminance than the flat portions, it can also be extracted by binarization or the like, and more sophisticated edge-region extraction may combine filtering and binarization.
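As a concrete illustration of this step, the following is a minimal Python sketch of white-band extraction that combines a gradient filter with luminance binarization. The function name and both thresholds are illustrative assumptions, not values from the patent:

```python
# Hedged sketch of step 102 (white-band extraction); function and
# threshold names are illustrative, not taken from the patent.
import numpy as np
from scipy import ndimage

def extract_edge_region(img, grad_thresh=40.0, band_thresh=180):
    """Return a boolean mask of likely white-band (edge) pixels."""
    img = img.astype(np.float32)
    # Gradient magnitude from horizontal/vertical Sobel responses.
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    grad = np.hypot(gx, gy)
    # White bands are both high-gradient and bright, so combine the
    # filter response with a simple binarization of the luminance.
    return (grad > grad_thresh) | (img > band_thresh)
```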
  • In step 801 of synthesizing edge regions, the edge region obtained from the Upper image and the edge region obtained from the Lower image are combined to create a composite edge region.
  • The composite edge region may be given, for example, as a binary image in which edge pixels are 1 and non-edge pixels are 0; its form is not limited to this.
  • The combination may be performed, for example, by summing the two regions and binarizing the result.
  • In the case of a multilayer image, an edge region obtained from the Upper image may not exist in the edge region obtained from the Lower image.
  • In such a case, that region may be ignored and the image divided into the upper layer and everything else, or the regions may be divided into three or more types in consideration of the lower layer; a method that considers the lower layer is described later.
  • The edge-region synthesis method is not limited to the summation approach described above.
  • In the edge masking step 802, the edge areas of the Upper image and the Lower image are masked using the composite edge region, for example by setting the masked pixel values to a negative value.
  • The masking method is not limited to this; alternatively, during the later processing it may be decided each time whether a location belongs to the composite edge region and whether to apply the processing to each image.
  • In this embodiment, a searched image for matching is generated from the two images obtained with the two detectors, so the mask region is set from the composite edge region created as described above in order to eliminate the edges of both images.
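The synthesis (step 801) and masking (step 802) described above might look as follows. This is a hedged sketch: the sum-and-binarize combination and the negative sentinel value follow the text, while all function and variable names are assumptions:

```python
# Sketch of steps 801-802: combine the Upper/Lower edge regions by
# summation + binarization, then mask both images with a negative
# sentinel value (-1), as the text suggests.
import numpy as np

def composite_edge_region(edge_upper, edge_lower):
    # Sum of the two binary masks, re-binarized: logically an OR.
    return (edge_upper.astype(np.uint8) + edge_lower.astype(np.uint8)) >= 1

def mask_edges(img, edge_region, sentinel=-1.0):
    out = img.astype(np.float32).copy()
    out[edge_region] = sentinel   # masked pixels become negative
    return out
```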
  • In step 803 of calculating the luminance difference, the image is divided into areas using the masked edge portions as boundaries between areas; for example, adjacent pixels that are not negative are given the same label. For both the Upper and Lower images, the average pixel value is computed within each label and the difference between the averages is calculated. Alternatively, the per-pixel difference between the Upper and Lower images may be taken first and the differences then averaged.
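A sketch of this labeling and per-region difference calculation, reading "adjacent pixels that are not negative are given the same label" as connected-component labeling; names are illustrative:

```python
# Sketch of step 803: treat masked (negative) pixels as boundaries,
# label the remaining connected flat regions, and compute the mean
# Upper-minus-Lower luminance difference per label.
import numpy as np
from scipy import ndimage

def region_luminance_difference(masked_upper, masked_lower):
    flat = (masked_upper >= 0) & (masked_lower >= 0)  # non-edge pixels
    labels, n = ndimage.label(flat)                   # same label = same region
    diffs = {}
    for lab in range(1, n + 1):
        sel = labels == lab
        # Difference of per-region means; taking per-pixel differences
        # first and then averaging (the text's alternative) is identical.
        diffs[lab] = float(masked_upper[sel].mean() - masked_lower[sel].mean())
    return labels, diffs
```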
  • FIG. 9 is a diagram illustrating an example of an L & S pattern image in which a difference between an image obtained by the Upper detector and an image obtained by the Lower detector is a luminance signal.
  • A gray area 901 surrounded by a broken line is an area with a large luminance difference between the Upper and Lower images, and a black area 902 is an area with a small luminance difference.
  • Although not visible as such in the image, the vicinity of the boundary between the gray area 901 and the black area 902 is a masked, insensitive area.
  • In step 804 of region classification, each region is classified using the luminance difference obtained in step 803 as an index value.
  • For example, when the luminance differences are plotted in one dimension, the regions may be split into two classes at the largest gap between points, or an algorithm such as the k-means method may be used; the classification method is not limited to these.
  • The resulting class is assigned to each area.
  • The layer information may be held, for example, as a separate image in which the edge portions have a negative value and each flat region has, as its luminance value, an arbitrarily assigned class number.
  • The way the layer information is described is not limited to this.
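One possible rendering of the largest-gap two-class split (step 804) and the negative-edge label image just described; the helper below is an illustrative assumption, not the patent's implementation:

```python
# Sketch of steps 804-805: split the per-region luminance differences
# into two classes at the largest 1-D gap and write the class number
# into a layer-information image (edges/mask stay negative).
import numpy as np

def classify_two_classes(labels, diffs):
    labs = sorted(diffs, key=diffs.get)         # regions ordered by difference
    vals = [diffs[l] for l in labs]
    gaps = np.diff(vals)
    cut = int(np.argmax(gaps)) + 1 if len(vals) > 1 else 1
    lower = set(labs[:cut])                     # small-difference class
    layer_img = np.full(labels.shape, -1.0, np.float32)  # edges = -1
    for lab in labs:
        layer_img[labels == lab] = 0.0 if lab in lower else 1.0
    return layer_img
```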
  • By creating a difference image from the images obtained by the two detectors, in which the difference between the upper layer and the lower layer is readily manifested, the brightness difference between the upper and lower layers can be obtained regardless of the material of the sample.
  • By creating the searched image and the reference image for the matching process based on this classification by brightness difference, stable matching can be realized without being affected by changes in brightness and the like.
  • Since these images are classified into a region A having a specific brightness and a region B having a brightness different from region A, registration can also be performed, for example, by taking as the matching position the position that maximizes the overlapping area of region A and the overlapping area of region B between the searched image and the reference image.
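A brute-force sketch of this overlap-maximizing registration; a real implementation would use a faster correlation, and all names here are assumptions:

```python
# Slide the reference label image over the searched label image and
# score each offset by the summed overlap of region A and region B.
import numpy as np

def match_by_overlap(searched, reference):
    sh, sw = searched.shape
    rh, rw = reference.shape
    best, best_pos = -1, (0, 0)
    for y in range(sh - rh + 1):
        for x in range(sw - rw + 1):
            win = searched[y:y + rh, x:x + rw]
            # Count pixels where both images agree on class A (1) or
            # class B (0), ignoring masked pixels (negative values).
            valid = (win >= 0) & (reference >= 0)
            score = np.count_nonzero(valid & (win == reference))
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best
```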
  • The region division processing can be applied not only to matching using a reference image but also to the processing of specifying an edge to be measured or inspected.
  • For example, when an edge to be measured or inspected is to be specified, the following processing is executed.
  • The target measurement can easily be performed by extracting an edge close to the image area classified as, for example, the hole bottom. In this example, therefore, a method is proposed in which an edge adjacent to a segmented region of a specific layer is searched for, and the edge extracted by that search is selectively measured and inspected. With such a method, a measurement target can be selected without placing a cursor or the like on the edge to be measured; in particular, when measuring the diameters of multiple hole patterns contained in an image and calculating their average, edges can be extracted selectively without setting a cursor or the like for each of the holes.
  • For example, the center of gravity (measurement reference position) of the classified area is obtained for each segmented hole pattern, and the dimension between the edges located on a straight line extending in a specific direction from the center of gravity is measured.
  • The edge can be identified from the luminance or the luminance gradient in the normal direction of the region: for example, when searching pixels outward along the normal from the contour of a segmented region, the span over which the magnitude of the luminance gradient first rises and then falls back to a magnitude comparable to that at the contour can be taken as the edge region.
  • The magnitude comparison may use a threshold value or a statistic such as the variance. This technique exploits the fact that the edge region of a pattern in an SE image has increased brightness, and with it an edge can be specified even for a pattern with a complicated shape. By providing an algorithm that measures the dimension between a specified edge and, for example, another edge also specified along the normal direction of the first, automatic selection of the measurement target and automatic measurement become possible.
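As an illustration, the following sketch measures each bottom-classified hole from its center of gravity outward along a horizontal line, using a gradient-magnitude criterion loosely in the spirit of the text; the thresholds and names are assumptions:

```python
# Centroid-based hole measurement: for each bottom-classified region,
# find the nearest strong-gradient pixel to the left and right of the
# centroid on its row and report the edge-to-edge span (pixel units).
import numpy as np
from scipy import ndimage

def measure_holes(layer_img, image, bottom_class=0, grad_thresh=30.0):
    mask = layer_img == bottom_class
    labels, n = ndimage.label(mask)
    centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    grad = np.abs(ndimage.sobel(image.astype(np.float32), axis=1))
    sizes = []
    for cy, cx in centroids:
        row, cx = grad[int(round(cy))], int(round(cx))
        left = np.nonzero(row[:cx] > grad_thresh)[0]
        right = np.nonzero(row[cx:] > grad_thresh)[0]
        if left.size and right.size:
            sizes.append((cx + right[0]) - left[-1])  # edge-to-edge span
    return sizes
```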
  • By performing such edge-elimination and region-separation processing during the preprocessing for measurement and inspection, such as matching and position-identification processing, the subsequent measurement and inspection can be performed with high accuracy.
  • FIG. 10 is a diagram explaining the trajectories of electrons emitted from the sample when a multilayer circuit pattern is irradiated with an electron beam. Secondary electrons 1001 generated by irradiating the lower layer with the electron beam 503 are blocked by the upper side wall 1002, so the amount of detectable electrons is limited; it is nevertheless larger than for the secondary electrons 1003 generated by irradiating the space portion. Therefore, in a multilayer image such as FIG. 4, the luminance of the flat areas in the Lower image is basically upper layer 401 > lower layer 402 > space portion 403, although this relation may not hold depending on the material. Even when a lower layer is present, however, the characteristics of the Upper and Lower images do not change, so the brightness difference of the flat regions satisfies space portion > lower-layer region > upper-layer region.
  • In this case, step 804 performs region classification with one division per layer contained in the image. If it is unclear how many wiring layers appear in the image, the number of divisions may be determined automatically with the k-means method or the like, using an index value such as the Akaike information criterion (AIC) or the Bayesian information criterion (BIC). If the number of layers is given as an input parameter, or design data used for alignment is available, the number of divisions may be determined from that information; the method for determining the number of divisions is not limited to these. Once the number of divisions is determined, layer information can be formed by the same method as in the layer-information step 805. By creating the searched image and the reference image for matching based on this layer information, stable matching can be realized without being affected by changes in brightness and the like.
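A sketch of this automatic determination of the number of divisions, using 1-D k-means (here scikit-learn's KMeans) and a simple BIC-style score; this is a rough stand-in for the AIC/BIC criteria mentioned, not the patent's formulation:

```python
# Run 1-D k-means for each candidate k and keep the k minimizing a
# BIC-style score, n*ln(SSE/n) + k*ln(n). Names are illustrative.
import numpy as np
from sklearn.cluster import KMeans

def choose_layer_count(diff_values, k_max=4):
    x = np.asarray(diff_values, np.float64).reshape(-1, 1)
    n = len(x)
    best_k, best_bic, best_labels = 1, np.inf, np.zeros(n, int)
    for k in range(1, min(k_max, n) + 1):
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(x)
        sse = km.inertia_ + 1e-12          # avoid log(0) on perfect fits
        bic = n * np.log(sse / n) + k * np.log(n)
        if bic < best_bic:
            best_k, best_bic, best_labels = k, bic, km.labels_
    return best_k, best_labels
```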
  • FIG. 11 is a diagram showing the behavior of electrons when secondary electrons are deflected by the aligner 505.
  • The secondary electrons 1101 emitted from the semiconductor wafer 509 have their trajectories corrected by the aligner 505, pass through the gaps in the conversion electrodes 512, are corrected again by the aligners 505, collide with the upper conversion electrode 512, and are detected by the Upper detector.
  • Electrons on trajectories other than those detected by the Upper detector are deflected by the lower aligner 505, collide with the lower conversion electrode 512, and are detected by the Lower detector.
  • FIG. 12 shows an example in which the electrons emitted toward the left of the page are deflected by the aligner and detected.
  • The white band 1201 of the Upper detector image results from observing only the electrons emitted leftward; since no leftward electrons are emitted from the right side surface of the cross-shaped convex pattern, a black shadow region 1202 is formed.
  • The edges other than the left side of the pattern are detected as a white band 1204.
  • The left side surface portion 1205 is dark because the electrons emitted toward the left are taken by the Upper detector; however, upward-directed electrons, for example, are still detected, and a large amount of secondary electrons is emitted from the edge portion, so it does not become darker than the flat surface. As a result, as shown in FIG. 12, the white band at that edge portion becomes thin.
  • FIG. 13 shows a processing flow for performing region division on an image like that of FIG. 12. The processing up to the edge extraction is the same as described above.
  • Because the detected electrons travel leftward, the region on the right side of a white band in the Upper detector image is a pattern portion.
  • Accordingly, a provisionally determined region 1401 is set as shown in FIG. 14.
  • In step 1302, processing is performed to distinguish this from the edge 1402 of the white band obtained from the Lower detector image and from the edge obtained from the white band of the Upper detector image.
  • Region division is performed there as well.
  • The provisionally determined regions that have not been deleted are classified as pattern portions, and the areas not included in any provisional region are classified as space portions (step 1304), after which layer information is formed (step 805).
  • Alternatively, an image 1501 in which edges in a specific direction are emphasized, as shown in FIG. 15, or a composite image 1502 of such images can be created. If the white band of the composite image 1502 is used, the white band of the Upper detector image need not be used in the processing of FIG. 13.
  • As a method of creating the composite image, for example, it is conceivable to selectively combine, for each location, the portion showing the maximum luminance value among the direction-emphasized images 1501a to 1501d.
  • The method of creating the composite image and the combination of images are not limited to this. Although FIG. 15 shows an example with four directions, edges in directions other than these four may also be selected.
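The per-location maximum-luminance composition described above reduces to a per-pixel maximum over the four direction-emphasized images; a minimal sketch, with illustrative names:

```python
# Compose the direction-emphasized images 1501a-1501d by taking, per
# pixel, the maximum luminance across the four inputs.
import numpy as np

def composite_direction_images(img_a, img_b, img_c, img_d):
    stack = np.stack([img_a, img_b, img_c, img_d], axis=0)
    return stack.max(axis=0)   # brightest response in any direction wins
```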
  • FIG. 16 shows a configuration example of another inspection apparatus for acquiring inspection images.
  • The difference from FIG. 5 is that two of the Lower detectors 513b shown in FIG. 5 are installed: as shown in the figure, the left one is called the Left detector 1613a and the right one the Right detector 1613b, and the two are collectively called the LR detectors.
  • The Upper detector is omitted; it is not indispensable for region division in the apparatus configuration of FIG. 16.
  • FIG. 17 shows an example of images obtained from the Left detector 1613a and the Right detector 1613b. Each is called an L image and an R image, and a set of these images is called an LR image.
  • The LR images are similar to the image 1501b and the image 1501c shown in FIG. 15.
  • As shown in FIG. 21, the electrons imaged are those emitted in the left direction for the Left detector 2101 and those emitted in the right direction for the Right detector 2102.
  • FIG. 18 is an example of a flowchart showing a process of dividing an image into regions based on images obtained by the scanning electron microscope illustrated in FIG. 16.
  • First, the LR image is read.
  • The white bands may be extracted from the L and R images and combined to obtain the edge region, or the LR images may first be combined and the edge region obtained from the white band of the composite image.
  • As an image synthesis method, for example, it is conceivable to selectively combine, for each location, the portion showing the higher luminance value.
  • In step 1802, the same processing method as in step 102 can be applied to the white band extraction.
  • The processing illustrated in FIG. 14 may be performed on at least one of the L image and the R image to obtain the pattern portion, or on both; in this way the image can be classified into pattern portions, edge portions, and other portions.
  • Layer information is then formed based on the classification result (step 805). Further, as illustrated in FIG. 17, if multiple pattern layers are not present, the composite edge region need not be computed; instead, the provisional regions illustrated in FIG. 14 may be obtained for each of the L image and the R image, and their overlapping portion taken as the pattern portion.
  • Each process is not limited to the method described above.
  • FIG. 19 illustrates an outline of a scanning electron microscope for image acquisition.
  • In FIG. 19, a total of four of the Lower detectors 513b described with reference to FIG. 5 are installed, one in each direction.
  • In the following description, the detectors are referred to as the X+, X-, Y+, and Y- detectors.
  • The electrons that each detector images are the same as those shown in FIG. 21; in each direction, two detectors are placed so as to sandwich the beam optical axis.
  • An example of the images formed based on the electrons detected by each detector is shown in FIG. 20.
  • The characteristics of the images obtained from the detectors are similar to those of the four-direction example using the aligner, and region classification is possible by the same processing.
  • FIG. 24 is a diagram illustrating an example of the images formed based on the output of each detector for a multilayer image.
  • In some of the detector images the lower layer pattern 2401 is clearly visible, but in the images of the X+ and X- detectors, which correspond to the LR detectors of FIG. 16, the lower layer pattern may not appear. This is because the pattern density is high, so electrons emitted from the lower layer pattern at large angles are blocked by the upper layer pattern and are not detected by the X+ and X- detectors. The same situation can also occur with the LR detectors.
  • In that case, the methods described above, for example the region division method described with reference to FIG. 16, are first used to divide the image into the upper layer pattern 2403 and the other region 2404. Next, an image is created by masking the upper layer pattern 2403 and its edge region 2405 in the Y+ and Y- detector images, and that image is used as input to divide the lower layer pattern 2402 and the space portion 2406 in the same way as for the X+ and X- detector images. The result constitutes the layer information. A multilayer image can thus be segmented by combining the methods described so far; the processing for the four-direction multilayer image is not limited to this.
  • Reference numerals: 501 Electron source, 502 Extraction electrode, 503 Electron beam, 504 Condenser lens, 505 Aligner, 506 Objective lens, 507 Vacuum sample chamber, 508 Sample stage, 509 Sample (semiconductor wafer), 510 Electrons, 511 Secondary electrons, 512 Conversion electrode, 513 Detector

Abstract

The objective of the present invention is to provide a charged particle beam device in which, during a positioning process, a decrease in positioning accuracy or the like due to deformation of a pattern or the like is suppressed. The present invention proposes a charged particle beam device equipped with at least two detectors for detecting charged particles obtained based on the irradiation of a charged particle beam emitted by a charged particle source, and an image processing device for processing images formed based on the outputs of the detectors, wherein the image processing device performs masking on the edge regions of the images obtained based on the charged particles detected by the at least two detectors. Also proposed is a charged particle beam device that executes matching or the like using an image that has been subjected to the masking.

Description

Charged particle beam device
The present invention relates to a charged particle beam apparatus, and more particularly to a charged particle beam apparatus including a plurality of detectors.
A semiconductor inspection/measurement apparatus (hereinafter sometimes simply called a semiconductor inspection apparatus) provided with a charged particle beam apparatus such as a scanning electron microscope measures or inspects fine patterns. Alignment using a reference image (template) is performed to position the field of view accurately on a fine pattern or to correct field-of-view shift. For example, Patent Document 1 describes an alignment method for detecting field-of-view shift using an inspected image captured with a secondary electron detector and design data. Patent Document 2 describes a method of segmenting a secondary electron image into regions based on backscattered electron detection by a plurality of backscattered electron detectors, and of specifying an inspection position by pattern matching on the segmented image.
Patent Document 1: JP-A-5-324836. Patent Document 2: JP 2011-165479 A (corresponding US Pat. No. 8,653,456).
Meanwhile, with the recent miniaturization of semiconductor patterns, the influence of pattern deformation on alignment processing can no longer be ignored. Rounding of pattern corners and line ends, variation in hole pattern size, positional displacement, and the like can cause the reference image to be aligned at a position different from the correct alignment position on the captured image. In particular, an image obtained with an electron microscope or the like has its pattern edges emphasized by the edge effect. Since the high-luminance pattern-edge portion called the white band is a distinctive feature of the captured image, the alignment position changes greatly depending on the quality of the edges. Patent Documents 1 and 2 do not disclose a technique for suppressing the influence of such edge deformation on alignment.
Described below is a charged particle beam apparatus that, in alignment processing or in processing that specifies a measurement or inspection target, aims to suppress the degradation of alignment or position-identification accuracy caused by pattern deformation and the like, and the degradation of the ability to identify the measurement or inspection target.
As one aspect for achieving the above object, a charged particle beam device is proposed that includes at least two detectors for detecting charged particles obtained based on irradiation with a charged particle beam emitted from a charged particle source, and an image processing device for processing images formed based on the outputs of those detectors, wherein the image processing device applies mask processing to the edge regions of the images obtained from the two or more detectors.
According to the above configuration, it is possible to suppress the degradation of alignment accuracy and position-identification accuracy caused by pattern deformation and the like, and to improve the ability to identify measurement and inspection targets. Other objects, features, and advantages of the present invention will become apparent from the following description of embodiments with reference to the accompanying drawings.
FIG. 1 Flowchart showing a process of dividing an image into regions using two or more images obtained by a charged particle beam apparatus including two or more detectors.
FIG. 2 Diagram showing an example of a secondary electron image.
FIG. 3 Diagram showing an example of a pattern cross section of a semiconductor device.
FIG. 4 Diagram showing an example of a secondary electron image in which a lower layer is visible.
FIG. 5 Diagram showing an outline of a scanning electron microscope for acquiring image data.
FIG. 6 Diagram showing the trajectories of secondary electrons emitted from a sample when it is irradiated with a beam.
FIG. 7 Diagram showing an example of images formed based on the electrons detected by the lower-stage and upper-stage detectors.
FIG. 8 Flowchart showing a process of dividing an image into regions using two or more images obtained by a charged particle beam apparatus including two or more detectors.
FIG. 9 Diagram showing an example of an L&S pattern image whose luminance signal is the difference between the image obtained by the Upper detector and the image obtained by the Lower detector.
FIG. 10 Diagram showing the trajectories of secondary electrons emitted from a sample when it is irradiated with a beam.
FIG. 11 Diagram showing the behavior of electrons when secondary electrons are deflected by the secondary-electron-deflection aligner.
FIG. 12 Diagram showing an example of deflecting electrons emitted toward the left of the page with the aligner and detecting them.
FIG. 13 Flowchart showing a process of dividing an image into regions using two or more images obtained by a charged particle beam apparatus including two or more detectors.
FIG. 14 Diagram showing an example of setting provisionally determined regions on a secondary electron image.
FIG. 15 Diagram showing an example of combining images in which edges in specific directions are emphasized.
FIG. 16 Diagram showing an outline of a scanning electron microscope for acquiring image data.
FIG. 17 Diagram showing an example of images formed based on the outputs of the Left detector and the Right detector.
FIG. 18 Diagram showing an example of a flowchart for dividing an image into regions based on images obtained by the scanning electron microscope.
FIG. 19 Diagram showing an outline of a scanning electron microscope for acquiring image data.
FIG. 20 Diagram showing an example of images formed based on the electrons detected by four detectors.
FIG. 21 Diagram showing the positional relationship between the LR detectors and the sample.
FIG. 22 Diagram showing an example of the control device of the electron microscope.
FIG. 23 Diagram showing an example of the images obtained by the Upper detector and the Lower detector.
FIG. 24 Diagram showing an example of the images formed when electrons obtained by irradiating a multilayer structure pattern with a beam are detected by a plurality of detectors.
The embodiments described below relate to an image processing apparatus that processes image signals obtained by, for example, a scanning electron microscope included in an appearance inspection apparatus, a pattern measurement apparatus, or a defect inspection apparatus. In particular, they describe an image processing technique for detecting the field-of-view shift of an inspected image captured by a measurement or inspection apparatus, using the inspected image and semiconductor design data, together with a semiconductor inspection apparatus equipped with this technique and an image processing apparatus that realizes it.
In this embodiment, the concave and convex (uneven) regions of a circuit pattern are segmented mainly from two or more secondary electron images captured under different imaging conditions, together with those imaging conditions. A charged particle beam apparatus including an image processing apparatus that performs alignment using these uneven regions is described. With such an image processing apparatus, robust alignment is possible that is unaffected by layouts containing high-density circuit patterns, by deformation, or by the occurrence of defects.
The image processing apparatus can be realized by a general-purpose computer that includes basic components such as an input/output interface (not shown: display, keyboard, mouse, LAN port, USB port, etc.), a CPU (arithmetic unit), and memory (storage unit). The following description also serves as a description of a program for operating such a computer.
FIG. 1 is a flowchart showing a process of dividing an image into regions using two or more images obtained by a charged particle beam apparatus equipped with two or more detectors. The processing illustrated in FIG. 1 is executed by a control device (image processing device) described later, an external image processing device, or the like. The flowchart illustrated in FIG. 1 includes a step 101 of reading two or more SE images captured under different imaging conditions, a step 102 of extracting an edge region from each SE image, a step 103 of segmenting flat regions by combining the edge regions of the images, and a step 104 of acquiring layer information of the flat regions from the flat regions, the edge regions, and the imaging conditions.
The imaging conditions in step 101 include the three-dimensional position of the secondary electron detector of the inspection apparatus, the angle and direction of the electrons the detector captures, the set value of the energy of the captured electrons, and a numerical value expressing the deflection strength of the aligner described later.
The edge region of step 102 is the edge portion of a pattern that appears on an SE image, such as the one shown in FIG. 2, as a band shape called a white band 201. The flat-region segmentation in step 103 divides the image into the edge portion, the pattern portion 202, and the space portion 203 using the edge region extracted in step 102.
The layer information in step 104 is information indicating, when the semiconductor is viewed in cross section as in FIG. 3, whether the convex region 301 lying higher in three-dimensional space and the concave region lying lower than the convex portion 301 correspond to the pattern portion 202 or the space portion 203. In an SE (Secondary Electron) image, as shown in FIG. 4, two or more circuit pattern layers may appear: an upper-layer pattern portion 401, a lower-layer pattern portion 402 partially hidden beneath it, and a space portion 403. In such a case, the layer information acquired in step 104 distinguishes the upper-layer pattern portion 401, the lower-layer pattern portion 402, and the space portion 403.
The image may also simply be divided into a pattern portion and a space portion; the layer information is not limited to the two-layer and three-layer forms described above. After step 104, alignment using the layer information is performed. The detailed processing of each step is described below for each imaging condition.
FIG. 5 shows an outline of a scanning electron microscope for acquiring image data. In the following description a scanning electron microscope (SEM) is used as the image acquisition device, but another charged particle beam device may be used instead, such as a focused ion beam (FIB) device that forms an image based on the detection of charged particles obtained by scanning an ion beam over a sample.
An electron beam 503, extracted from the electron source 501 by the extraction electrode 502 and accelerated by an acceleration electrode (not shown), is focused by a condenser lens 504, one form of focusing lens, and is then scanned one-dimensionally or two-dimensionally over the sample (semiconductor wafer) 509 by a scanning deflector (not shown). The electron beam 503 is decelerated by a negative voltage applied to an electrode built into the sample stage 508, focused by the lens action of the objective lens 506, and irradiated onto the sample 509.
When the electron beam 503 irradiates the sample 509, electrons 510 such as secondary electrons and backscattered electrons are emitted from the irradiated portion. The emitted electrons 510 are accelerated toward the electron source by the accelerating action of the negative voltage applied to the sample, and are deflected by the aligner 505. The electrons 510 deflected by the aligner 505 collide with the conversion electrode 512 and generate secondary electrons 511.
The secondary electrons 511 emitted from the conversion electrode 512 are captured by the detector 513, and the output I of the detector 513 changes with the amount of captured secondary electrons. The brightness of a display device (not shown) changes according to this output I. For example, when a two-dimensional image is formed, an image of the scanning region is formed by synchronizing the deflection signal to the scanning deflector 505 with the output I of the detector 513. The scanning electron microscope illustrated in FIG. 5 also includes a deflector (not shown) that moves the scanning region of the electron beam.
The control device 514 controls each component of the scanning electron microscope, and has functions for forming an image based on the detected electrons and for measuring the width of a pattern formed on the sample based on the intensity distribution of the detected electrons, called a line profile.
FIG. 22 shows a detailed configuration example of the control device (image processing device) 514. The control device 514 includes basic computer components such as a CPU 2201, an image memory 2202, an LSI 2203, an output device 2204, and a storage device 2205. To receive design data and the like, it is connected to a design system 2208 and its control device 2207 via a network such as a LAN.
Depending on the hardware configuration of the image acquisition device, differences appear in the captured images. The reason lies in the behavior of secondary electrons on the wafer, so how secondary electrons are emitted on the wafer is described first. When electrons are irradiated from the electron source 501 toward the semiconductor wafer 509, secondary electrons 601 are generated as shown in FIG. 6. When the side wall portion 602 of a circuit pattern is irradiated, the generated electrons divide into electrons 603 that are observed (detected) by the detector and electrons 604 that are blocked by the circuit pattern and are not observed. The unobserved electrons 604 are the portion of the electrons whose trajectories form a certain angle with the irradiating electron beam 503.
In the upper portion 605 of a circuit pattern there is no shielding pattern, so even electrons emitted at comparatively large angles are observed by the detector. In the space portion 606 of the pattern, however, electrons are blocked by the circuit pattern, so only electrons with comparatively small angles, compared with the upper portion, reach the detector. At a side wall portion 607 with no shielding pattern on one side, electrons emitted at certain angles to the irradiating beam become unobserved electrons 604 blocked by the circuit pattern.
FIG. 23 illustrates the images obtained by the two detectors shown in FIG. 5, the Upper detector 513a and the Lower detector 513b. As examples of circuit pattern shapes, a line-and-space (L&S) pattern and a hole (Hole) pattern are given. The L&S pattern is a pattern in which the wiring portions and space portions of a circuit pattern are arranged alternately in a fixed direction. The Hole pattern is a pattern in which holes reaching the underlying material are opened in a planar upper pattern. The L&S pattern consists of titanium nitride (TiN) in the upper portion (line portion) and silicon (Si) in the lower portion (space portion). The Hole pattern consists of silicon nitride (SiN) at the top (hole top side) and silicon (Si) at the bottom (hole bottom).
The upper row of FIG. 23 shows the images acquired with the Upper detector 513a, and the lower row the images acquired with the Lower detector 513b. As described with reference to FIG. 6, fewer electrons reach the Lower detector 513b from the space portion 701 of the pattern, so the space portion appears darker than the upper portion 702 of the pattern. The electrons obtained by the Upper detector, by contrast, are mainly secondary electrons whose trajectories form a small angle with that of the irradiating beam; even electrons emitted from the bottom of a deep hole or deep groove are not captured by the side walls, so the space portion 703a and bottom portion 703b are relatively brighter than the space portion 701a and bottom portion 701b. As noted above, electrons emitted from the bottoms of deep holes and deep grooves are detected more by the Upper detector than by the Lower detector, but this condition may not hold depending on the material constituting the sample. Because materials irradiated under the same conditions emit different amounts of secondary electrons, the contrast may be reversed when the detected electron amounts are imaged; since the contrast changes with the material, the space portion is not always relatively dark in the Lower detector image.
Based on the above premises, a method for segmenting the concave and convex regions of a circuit pattern using an electron microscope equipped with an Upper detector and a Lower detector as in FIG. 5 is described below.
 図8は、2以上の検出器を備えた荷電粒子線装置によって得られた2以上の画像を用いて、画像を領域分けする工程を示すフローチャートである。図8に例示するフローチャートによれば、パターンのエッジ部分に相当する部分を画像から消失させているので、パターンマッチングを行う際のパターンの変形等に基づくマッチング位置精度の低下を抑制することが可能となる。 FIG. 8 is a flowchart showing a process of dividing an image into regions using two or more images obtained by a charged particle beam apparatus including two or more detectors. According to the flowchart illustrated in FIG. 8, since the portion corresponding to the edge portion of the pattern has disappeared from the image, it is possible to suppress a decrease in matching position accuracy based on pattern deformation or the like when performing pattern matching. It becomes.
 画像を読み込むステップ101とエッジ領域を抽出するステップ102は図1と同様である。ここで,エッジ領域を抽出するステップ102でのエッジ抽出法は,例えばソーベルフォルタやラプラシアンフィルタといったエッジ抽出フィルタを用いることで行える。また,エッジ部は平坦部より輝度の高いホワイトバンドとして画像に映るため、2値化などを用いて抽出することもできる。また、フィルタや2値化を組み合わせてより高度なエッジ領域抽出を行ってもよい。 Step 101 for reading an image and step 102 for extracting an edge region are the same as in FIG. Here, the edge extraction method in step 102 for extracting the edge region can be performed by using an edge extraction filter such as a Sobel filter or a Laplacian filter. Further, since the edge portion appears in the image as a white band having a higher luminance than the flat portion, it can be extracted using binarization or the like. Further, more advanced edge region extraction may be performed by combining a filter and binarization.
 In step 801 (synthesizing the edge regions), the edge region obtained from the Upper image and the edge region obtained from the Lower image are combined into a composite edge region. The composite edge region may be given, for example, as a binary image in which edge pixels are 1 and non-edge pixels are 0, although its form is not limited to this. The regions may be combined, for example, by taking their sum and binarizing the result. In a multilayer image, an edge region present in the Upper image may be absent from the Lower image; in such a case the region may be ignored and the image divided into the upper layer and everything else, or the lower layer may be taken into account and the image divided into three or more classes. A method that takes the lower layer into account is described later.
 The method of synthesizing the edge regions is not limited to the sum described above. In the edge masking step 802, the edge regions of the Upper and Lower images are masked using the composite edge region, for example by setting the masked pixel values to a negative value; the masking method is not limited to this. Alternatively, whether a pixel belongs to the composite edge region may be checked each time during the subsequent processing to decide whether that pixel is processed.
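 A minimal sketch of steps 801 and 802 under the same illustrative assumptions (binary edge maps and grayscale images as NumPy arrays):

```python
import numpy as np

def composite_edge_region(edge_upper, edge_lower):
    """Sketch of step 801: a pixel is edge if it is edge in either map.

    This is the binarized sum of the two edge maps described above.
    """
    return np.logical_or(edge_upper, edge_lower)

def mask_edges(image, edge_region):
    """Sketch of step 802: mark composite-edge pixels with a negative value.

    Later steps treat negative pixels as a dead zone and skip them.
    """
    masked = image.astype(np.float64)
    masked[edge_region] = -1.0
    return masked
```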
 In this embodiment, a searched image for the matching process is generated from the two images obtained with the two detectors; to eliminate the edges of both images, the mask region is set based on the composite edge region created as described above.
 In step 803 (calculating the luminance difference), the masked edge portions are used as region separators (boundaries between regions) to divide the image into regions, for example by assigning the same label to adjacent pixels that are both non-negative. For each label, the mean pixel value is computed in both the Upper and Lower images and the difference of the means is calculated; alternatively, the pixel-wise difference between the Upper and Lower images may be taken first and then averaged within each label.
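 For illustration, a sketch of step 803 (function names are assumptions; SciPy's connected-component labeling stands in for the labeling rule described above):

```python
import numpy as np
from scipy import ndimage

def region_luminance_difference(masked_upper, masked_lower):
    """Sketch of step 803: label flat regions, then compute per-region
    differences of mean luminance between the Upper and Lower images.

    Non-negative pixels are grouped into connected regions separated by
    the masked (negative) edge pixels.
    """
    valid = masked_upper >= 0              # non-edge pixels
    labels, n_regions = ndimage.label(valid)
    diffs = {}
    for lab in range(1, n_regions + 1):
        sel = labels == lab
        diffs[lab] = float(masked_upper[sel].mean() - masked_lower[sel].mean())
    return labels, diffs
```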
 FIG. 9 shows an example of an L&S pattern image in which the difference between the Upper detector image and the Lower detector image is used as the luminance signal. The gray region 901 enclosed by the broken line is a region where the luminance difference between the Upper and Lower images is large, and the black region 902 is a region where it is small. Although not visible in the image, the vicinity of the boundary between the gray region 901 and the black region 902 is a masked dead zone. In step 804 (region classification), each region is classified using the luminance difference obtained in step 803 as an index value. For example, when the luminance differences are plotted on one dimension, the regions may be split into two classes at the largest gap between points, or an algorithm such as K-means may be used; the classification method is not limited to these.
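 A minimal sketch of the largest-gap variant of step 804 (taking the per-region differences computed above; purely illustrative):

```python
def split_two_classes(diffs):
    """Sketch of step 804: split regions into two classes at the largest
    gap between neighbouring luminance-difference values on the 1-D axis.
    """
    items = sorted(diffs.items(), key=lambda kv: kv[1])
    if len(items) < 2:
        return {lab: 0 for lab, _ in items}
    gaps = [items[i + 1][1] - items[i][1] for i in range(len(items) - 1)]
    cut = gaps.index(max(gaps)) + 1        # first index of the upper class
    return {lab: (0 if i < cut else 1) for i, (lab, _) in enumerate(items)}
```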
 In step 805 (conversion to layer information), the assigned class is recorded for each region. For example, the result may be held as a separate image in which edge pixels take a negative value and each flat region takes an arbitrarily assigned class number as its luminance value; the representation of the layer information is not limited to this.
 As described above, by creating a difference image from the images of two detectors in which the distinction between upper and lower layers readily appears, the luminance difference between the upper and lower layers can be made explicit regardless of the sample material. Creating the searched image and the reference image for the matching process from the classification based on this luminance difference realizes stable matching that is insensitive to changes in brightness and the like. When difference images such as the one illustrated in FIG. 9 are used as the searched image and the reference image, they are classified into a region A of a certain brightness and a region B of a different brightness, so the alignment can also be performed, for example, by taking as the matching position the position that maximizes the overlap area of region A and the overlap area of region B between the searched image and the reference image.
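 A sketch of this overlap-based alignment (the exhaustive search and the label encoding are illustrative assumptions; counting class agreements is equivalent to summing the overlap areas of regions A and B):

```python
import numpy as np

def match_by_region_overlap(searched, template):
    """Sketch of the overlap-based alignment described above.

    Both inputs are class-label images (e.g. 0 for region A, 1 for
    region B, -1 for the masked dead zone); the score of an offset is
    the number of pixels whose class agrees.
    """
    sh, sw = searched.shape
    th, tw = template.shape
    best_score, best_pos = -1, (0, 0)
    for y in range(sh - th + 1):
        for x in range(sw - tw + 1):
            window = searched[y:y + th, x:x + tw]
            valid = (window >= 0) & (template >= 0)    # skip dead zones
            score = int(np.count_nonzero((window == template) & valid))
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score
```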
 The region segmentation described above can be applied not only to matching with a reference image but also to identifying the edges to be measured or inspected. To selectively extract the edges of a pattern set in advance as the measurement or inspection target from the regions classified in step 805, the following processing is executed.
 For example, when the measurement target is the bottom diameter of a hole pattern, the target measurement can be performed easily by extracting the edges adjacent to the image region classified as the bottom. This example therefore proposes searching for the edges adjacent to a segmented region of a specific layer and selectively measuring or inspecting the edges found by the search. With this approach, the measurement target can be selected without placing a cursor or the like on the edge to be measured. In particular, when the diameters of a plurality of hole patterns in an image are measured and averaged, the edges can be extracted selectively without setting a cursor on each hole.
 In this case, automatic selection and measurement of the target becomes possible by providing, for example, an algorithm that obtains, for each hole pattern (each segmented region), the centroid of the classified region as a measurement reference position and measures the dimension between the edges lying on a straight line extending from that centroid in a specific direction.
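 A minimal sketch of such a measurement (the horizontal scan direction and pixel units are illustrative assumptions):

```python
import numpy as np

def bottom_diameter(bottom_mask, edge_region):
    """Sketch of the automatic bottom-diameter measurement described above.

    The centroid of the region classified as the bottom is taken as the
    measurement reference position; the nearest edge pixel on each side
    along the horizontal line through it bounds the diameter.
    """
    ys, xs = np.nonzero(bottom_mask)
    cy, cx = int(round(ys.mean())), int(round(xs.mean()))
    row = edge_region[cy]                  # edge pixels on the scan line
    left = np.nonzero(row[:cx])[0]
    right = np.nonzero(row[cx:])[0]
    if left.size == 0 or right.size == 0:
        return None                        # no bounding edge on this line
    return (cx + right[0]) - left[-1]      # distance between nearest edges
```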
 An edge can also be identified from the luminance or the luminance gradient along the normal direction of a region. For example, when pixels are searched along the normal starting at the contour and moving away from it, the span up to the point where the magnitude of the luminance gradient has fallen, for the first or second time, to a level comparable to that at the contour of the segmented region is taken as the edge region. The comparison of magnitudes can be made by thresholding or by methods using statistics such as the variance. This exploits the characteristic of SE images that the luminance is high in pattern edge regions, and it allows edges to be identified even for patterns of complicated shape. Automatic selection and measurement of the target becomes possible by providing, for example, an algorithm that measures the dimension between an identified edge and another edge identified in the same way in its normal direction.
 As described above, performing the edge-removal and region-segmentation processing as preprocessing for measurement and inspection, such as matching and position identification, allows the subsequent measurement and inspection to be performed with high accuracy.
 Next, an example of defining each layer when the circuit pattern is multilayered is described. FIG. 10 illustrates the trajectories of electrons emitted from the sample when a multilayer circuit pattern is irradiated with the electron beam. Secondary electrons 1001 generated when the electron beam 503 irradiates the lower layer are partly blocked by the upper-layer side wall 1002, so the amount of detectable electrons is limited; it is nevertheless larger than the amount of secondary electrons 1003 generated when the beam irradiates the space portion. Therefore, as in the multilayer image of FIG. 4, the luminance of the flat regions of the circuit pattern in the Lower image basically satisfies upper layer 401 > lower layer 402 > space portion 403, although this relation may not hold for some materials. Because the relative characteristics of the Upper and Lower images do not change even when a lower layer is present, the luminance difference between the two images in flat regions satisfies space portion > lower-layer region > upper-layer region.
 Therefore, in the region classification step 804, the regions are divided into as many classes as there are layers present. When the number of wiring layers appearing in the image is unknown, the number of divisions can be determined automatically by combining K-means or the like with an index value such as the Akaike information criterion or the Bayesian information criterion. If the number of visible layers or the design data used for alignment is given as an input parameter, the number of divisions can be determined from that information; the determination method is not limited to these. Once the number of divisions has been determined, layer information can be generated by the same method as in step 805. Creating the searched image and the reference image for the matching process from this layer information realizes stable matching that is insensitive to changes in brightness and the like.
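 A sketch of this automatic determination (the bound k_max and the simple 1-D Gaussian BIC approximation, n·ln(SSE/n) + k·ln(n), are illustrative assumptions; scikit-learn's KMeans is used as an example implementation):

```python
import numpy as np
from sklearn.cluster import KMeans

def choose_layer_count(diffs, k_max=4):
    """Sketch of choosing the number of layer classes with K-means + BIC."""
    labs = list(diffs.keys())
    x = np.asarray([diffs[l] for l in labs], dtype=float).reshape(-1, 1)
    n = len(x)
    best_k, best_bic, best_labels = 1, np.inf, np.zeros(n, dtype=int)
    for k in range(1, min(k_max, n) + 1):
        model = KMeans(n_clusters=k, n_init=10, random_state=0).fit(x)
        bic = n * np.log(model.inertia_ / n + 1e-12) + k * np.log(n)
        if bic < best_bic:
            best_k, best_bic, best_labels = k, bic, model.labels_
    return best_k, dict(zip(labs, best_labels))
```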
 Next, an example of detecting secondary electrons of an arbitrary emission angle using the aligner 505 is described. FIG. 11 shows the behavior of secondary electrons when they are deflected by the aligner 505.
 Secondary electrons 1101 emitted from the semiconductor wafer 509 have their trajectories corrected by the aligner 505, pass through the gap in the conversion electrode 512, have their trajectories corrected again by the aligner 505, strike the upper conversion electrode 512, and are detected by the Upper detector.
 Electrons on trajectories other than those detected by the Upper detector are deflected by the lower aligner 505, strike the lower conversion electrode 512, and are detected by the Lower detector. FIG. 12 shows an observation in which the electrons emitted to the left are deflected by the aligner. The white band 1201 of the Upper detector image results from observing only the electrons emitted to the left; on the right side face of the cross-shaped convex pattern no leftward electrons are generated, producing the dark shadow region 1202.
 In the space portion 1203 there is no pattern nearby and the electrons emitted to the left are not blocked, so it is brighter than the shadow region 1202. In the Lower detector image, all faces of the pattern except the left side face are detected as the white band 1204. The left side face portion 1205 is darker because the leftward electrons are taken by the Upper detector, but electrons emitted straight upward, for example, are still detected, and given the characteristic that edge portions emit a large amount of secondary electrons, it does not become darker than the flat portions. The white band at that edge therefore becomes thin, as shown in FIG. 12.
 FIG. 13 shows the processing flow for segmenting an image such as that of FIG. 12. The processing up to edge extraction is the same as in FIG. 8. In this example the electrons emitted to the left are observed by the Upper detector, so the region to the right of the white band in the Upper detector image is a pattern portion. In step 1301, a provisional region 1401 is determined as shown in FIG. 14. In step 1302, this provisional region is partitioned by the white-band edge 1402 obtained from the Lower detector image and the edges obtained from the white band of the Upper detector image; if a white band of the Upper detector image exists inside the provisional region, the region is partitioned there as well. The result is a closed region 1403 enclosed by edges and a region 1404 outside the edges. Of the partitioned regions, the region lying in the same direction as that in which the provisional region was created (in this example, the outer region 1404) is deleted. The provisional regions that were not deleted are classified as pattern portions and the regions not included in any provisional region as space portions (step 1304), and the layer information is generated (step 805).
 When the aligner 505 is used, it also becomes possible to create images 1501 focused on edges of a specific direction, as shown in FIG. 15, and a composite image 1502 of them. If the white band of the composite image 1502 is used, the white band of the Upper detector image no longer needs to be used to partition the provisional region in the processing of FIG. 13. The composite image can be created, for example, by selectively combining, at each position, the portion showing the maximum luminance among the directional focus images 1501a to 1501d; the creation method and the combination of images are not limited to this. FIG. 15 shows an example with four directions, but edges of a plurality of directions other than four may be selected.
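 A minimal sketch of this per-pixel maximum composition (the function name is an illustrative assumption):

```python
import numpy as np

def compose_directional_images(direction_images):
    """Sketch of composing image 1502 from images 1501a to 1501d: keep,
    at each pixel, the maximum luminance over the directional images so
    that the white band of every edge direction survives.
    """
    stack = [np.asarray(img, dtype=np.float64) for img in direction_images]
    return np.maximum.reduce(stack)
```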
 FIG. 16 shows the configuration of a further inspection apparatus for acquiring the images to be inspected. The difference from FIG. 5 is that two of the Lower detectors 513b of FIG. 5 are installed. The detector on the left of the figure is called the Left detector 1613a and the one on the right the Right detector 1613b; the two together are called the LR detectors. The Upper detector is omitted in FIG. 16; it is not essential when region segmentation is performed with the configuration of FIG. 16.
 FIG. 17 shows an example of the images obtained from the Left detector 1613a and the Right detector 1613b, called the L image and the R image, respectively; the pair is called an LR image. The LR image is similar to the images 1501b and 1501c shown in FIG. 15. As shown in FIG. 21, the Left detector 2101 images the electrons emitted to the left and the Right detector 2102 the electrons emitted to the right.
 FIG. 18 is an example of a flowchart showing the process of segmenting an image based on images obtained with the scanning electron microscope illustrated in FIG. 16. In the image reading step (step 101), the LR image is read. In the edge-region extraction step (step 1801), white bands may be extracted from the L and R images and synthesized to obtain the edge region, or the L and R images may first be combined and the edge region obtained from the white band of the composite image. The images can be combined, for example, by selecting at each position the value of whichever image is brighter.
 The same processing as in step 102 can be applied to the white-band extraction. In the pattern calculation step 1802, the processing illustrated in FIG. 14 is performed on at least one of the L image and the R image, or on both, to obtain the pattern portions. This allows classification into pattern portions, edge portions, and everything else.
 Layer information is then generated from this classification result (step 805). As illustrated in FIG. 17, when there are not multiple pattern portions, the provisional regions illustrated in FIG. 14 may be obtained for the L image and the R image without calculating a composite edge region, and their overlap taken as the pattern portion. None of the steps is limited to the methods described above.
 FIG. 19 outlines a scanning electron microscope for image acquisition. It differs from the configuration illustrated in FIG. 16 in that a total of four of the Lower detectors 513b described with FIG. 5 are installed, one in each of the four directions; these are hereinafter called the X+, X-, Y+, and Y- detectors. The electrons that each detector images are the same as those shown in FIG. 21, with two further detectors arranged on either side of the beam optical axis in addition to the two arranged in the X direction. FIG. 20 shows examples of the images formed from the electrons detected by each detector. The characteristics of these images are similar to those of the four directions in the aligner example, and region classification is possible with the same processing.
 A region-segmentation method for multilayer images is shown below. FIG. 24 shows examples of the images formed from the outputs of the individual detectors for a multilayer sample. The lower-layer pattern 2401 is clearly visible in the Upper detector image, but it may not appear in the images of the X+ and X- detectors, which correspond to the LR detectors of FIG. 16. This is because the pattern density is high, so electrons emitted from the lower-layer pattern at large angles are blocked by the upper-layer pattern and are not detected by the X+ and X- detectors. The same situation can occur with the LR detectors.
 In the Y+ and Y- detector images, however, there is no obstruction and the lower layer 2402 can be observed. Therefore, the X+ and X- detector images are first segmented into the upper-layer pattern 2403 and the remaining region 2404 using the methods described so far, for example the segmentation method described with FIG. 16. Next, images are created from the Y+ and Y- detector images with the upper-layer pattern 2403 and its edge region 2405 masked, and these images are used as input to segment the lower-layer pattern 2402 and the space portion 2406 in the same way as the X+ and X- detector images. The result is taken as the layer information. A multilayer image can thus be segmented by combining the methods described so far; the processing of four-direction multilayer images is not limited to this.
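 A sketch of the second stage under stated assumptions: upper_mask and upper_edge are the upper-layer pattern 2403 and its edge region 2405 already obtained from the X+/X- images, and the simple mean-luminance threshold used here to split the remaining pixels is an illustrative stand-in for re-running the full segmentation on the masked Y+/Y- images:

```python
import numpy as np

def segment_lower_layer(y_plus, y_minus, upper_mask, upper_edge):
    """Sketch of the second stage of the multilayer segmentation above."""
    y_img = (np.asarray(y_plus, float) + np.asarray(y_minus, float)) / 2.0
    rest = ~(upper_mask | upper_edge)      # pixels left after masking
    threshold = y_img[rest].mean()         # illustrative split point
    lower = rest & (y_img > threshold)     # brighter flats: lower layer 2402
    space = rest & (y_img <= threshold)    # darker flats: space portion 2406
    return lower, space
```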
 Even with a single detector, images with the same characteristics as those obtained with the Upper and Lower detectors or the LR detectors can be acquired by controlling a function that adjusts the amount of detected electrons, such as an energy filter, or the direction of the detected electrons via the aligner, and capturing a plurality of images while varying those parameters. Such images can also be segmented using the processing described so far. Although the above description has been given with reference to embodiments, the invention is not limited to them, and it will be apparent to those skilled in the art that various changes and modifications can be made within the spirit of the invention and the scope of the appended claims.
 501  Electron source
 502  Extraction electrode
 503  Electron beam
 504  Condenser lens
 505  Aligner
 506  Objective lens
 507  Vacuum sample chamber
 508  Sample stage
 509  Sample
 510  Electrons emitted from the sample
 511  Secondary electrons
 512  Conversion electrode
 513  Detector
 514  Control device

Claims (10)

  1.  A charged particle beam apparatus comprising at least two detectors that detect charged particles obtained by irradiation with a charged particle beam emitted from a charged particle source, and an image processing apparatus that processes an image formed based on the outputs of the detectors,
     wherein the image processing apparatus applies mask processing to an edge region of an image obtained based on the charged particles detected by the two or more detectors.
  2.  The charged particle beam apparatus according to claim 1, wherein the image processing apparatus performs a difference calculation between the two or more images.
  3.  The charged particle beam apparatus according to claim 2, wherein the image processing apparatus segments the image into regions based on the difference calculation.
  4.  The charged particle beam apparatus according to claim 1, wherein the image processing apparatus forms a composite edge of the two or more images and applies mask processing to the composite edge region.
  5.  The charged particle beam apparatus according to claim 1, wherein the two or more detectors include detectors arranged along the optical axis direction of the charged particle beam.
  6.  The charged particle beam apparatus according to claim 1, wherein the two or more detectors include detectors arranged axisymmetrically with respect to the optical axis of the charged particle beam.
  7.  The charged particle beam apparatus according to claim 1, wherein the image processing apparatus executes matching processing, using a reference image stored in advance, on an image formed based on the outputs of the detectors.
  8.  The charged particle beam apparatus according to claim 7, wherein the image processing apparatus executes the matching processing using the mask-processed image as the searched image.
  9.  The charged particle beam apparatus according to claim 1, wherein the image processing apparatus segments the image into regions and selectively measures or inspects an edge adjacent to at least one of the segmented regions.
  10.  A charged particle beam apparatus comprising at least two detectors that detect charged particles obtained by irradiation with a charged particle beam emitted from a charged particle source, and an image processing apparatus that executes matching processing, using a reference image stored in advance, on an image formed based on the outputs of the detectors,
     wherein the image processing apparatus applies mask processing to each edge region of two or more images obtained based on the charged particles detected by the two or more detectors, and executes the matching processing, using the reference image, with an image obtained based on the mask processing as the searched image.