WO2015011974A1 - Pattern dimension measurement method and device - Google Patents

Pattern dimension measurement method and device

Info

Publication number
WO2015011974A1
Authority
WO
WIPO (PCT)
Prior art keywords
pattern
image
patterns
images
information
Prior art date
Application number
PCT/JP2014/063488
Other languages
French (fr)
Japanese (ja)
Inventor
慎弥 村上
高木 裕治
Original Assignee
株式会社日立ハイテクノロジーズ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立ハイテクノロジーズ filed Critical 株式会社日立ハイテクノロジーズ
Publication of WO2015011974A1

Classifications

    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01JELECTRIC DISCHARGE TUBES OR DISCHARGE LAMPS
    • H01J37/00Discharge tubes with provision for introducing objects or material to be exposed to the discharge, e.g. for the purpose of examination or processing thereof
    • H01J37/26Electron or ion microscopes; Electron or ion diffraction tubes
    • H01J37/28Electron or ion microscopes; Electron or ion diffraction tubes with scanning beams
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01JELECTRIC DISCHARGE TUBES OR DISCHARGE LAMPS
    • H01J2237/00Discharge tubes exposing object to beam, e.g. for analysis treatment, etching, imaging
    • H01J2237/26Electron or ion microscopes
    • H01J2237/28Scanning microscopes
    • H01J2237/2813Scanning microscopes characterised by the application
    • H01J2237/2817Pattern inspection

Definitions

  • The present invention relates to a method and apparatus for measuring the dimensions of a plurality of patterns formed on a semiconductor wafer, and more particularly to a pattern dimension measurement method and apparatus suitable for measuring the dimensions of a large number of fine patterns formed densely on a substrate.
  • Non-Patent Document 1 discloses a technique for improving the performance of template matching itself.
  • Patent Document 1 discloses a method of creating a statistical template image and improving the accuracy of template matching.
  • Patent Document 2 discloses a method for automatically generating a template by detecting a region having the same pattern by analyzing repeatability of a pattern image.
  • Non-Patent Document 2 discloses a method of matching a characteristic part of a pattern included in a template image using a feature amount extraction method called SIFT.
  • Non-Patent Document 3 discloses a method of obtaining the dimensions and area of a hole pattern by ellipse fitting. Patent Document 3 describes measuring the distance between patterns using a threshold method.
  • The methods disclosed in Non-Patent Documents 1 and 2 and Patent Documents 1 and 2 detect patterns either by template matching or by detecting the regular arrangement of patterns in an image based on its repeatability. With template matching, the matching accuracy decreases in proportion to the instability of the pattern shape at the matching destination.
  • The method of detecting a regular arrangement of patterns malfunctions when the regularity does not hold locally or when there is no regularity in the image as a whole.
  • The present invention solves these problems of the prior art and provides a pattern dimension measurement method and apparatus that can accurately detect the positions of patterns and measure their dimensions for a large number of fine patterns formed densely on a substrate, even when the pattern shape is unstable, when the regularity of the pattern arrangement does not hold locally, or when there is no regularity as a whole.
  • In the method of the present invention, a plurality of patterns originally having the same shape formed on a sample are imaged to acquire images of the plurality of patterns.
  • Pattern information extracted from the acquired images by template matching is integrated with pattern information extracted using the periodicity of the pattern arrangement in those images; measurement cursors are set using the integration result; dimension measurement regions are set on the acquired images using the measurement cursors; and the image of the pattern within each dimension measurement region is processed to measure the dimension of that pattern.
  • Likewise, the pattern dimension measurement apparatus of the present invention comprises: image acquisition means for imaging a plurality of patterns originally having the same shape formed on a sample and acquiring images of the plurality of patterns; measurement cursor setting means for extracting pattern information from the acquired images by template matching, extracting pattern information using the periodicity of the pattern arrangement in the images, integrating the two sets of pattern information, and setting measurement cursors using the integration result; dimension measurement region setting means for setting dimension measurement regions on the acquired images using the measurement cursors; and dimension measurement means for processing the image of the pattern within each dimension measurement region and measuring the dimension of that pattern.
  • In the present invention, two techniques are used: template matching and detection of the regular arrangement of patterns. Because array pattern detection detects the pattern arrangement statistically from the periodicity of the entire image, it can detect patterns that template matching misses due to insufficient performance. By integrating the two results, patterns can be detected stably and measurement cursors can be set automatically.
  • As a result, even when the pattern shape is unstable, when the regularity of the pattern arrangement does not hold locally, or when there is no regularity as a whole, the position of each pattern can be detected accurately and its dimensions can be measured.
  • FIG. 1 shows an overall configuration diagram of a fine pattern measuring apparatus according to the present invention.
  • The length-measurement SEM 100 of this embodiment comprises a stage 106 on which a measurement wafer (sample) 107 is placed, an irradiation optical system that controls the electron beam 101 emitted from an electron gun 102, a detector 108 that detects secondary electrons emitted from the sample, and a signal processing system for the detection signals.
  • The irradiation optical system consists of the electron gun 102 and, on the path of the electron beam 101, a condenser lens 103, a deflection coil 104, and an objective lens 105.
  • The electron beam 101 is focused by this optical system onto a predetermined region of the wafer 107 where the pattern to be measured is located.
  • Reference numeral 111 denotes a stage controller, 112 an electron optical system control unit, 113 a control unit for the entire apparatus, and 114 a control terminal connected to the control unit.
  • The image processing unit 110, the overall control unit 113, and the control terminal 114 can be connected to a recording medium (not shown), and a program executed by the image processing unit 110 can be read from this recording medium and loaded into the image processing unit 110.
  • FIG. 2 shows a configuration diagram of the image processing unit 110.
  • The secondary electron signal converted into a digital signal by the A/D converter 109 is sent to the memory 203 via the data input I/F 205 and stored so that it can be read out as image data.
  • The image processing program is read from the memory 203 or the storage medium by the image processing control unit 201.
  • The image processing control unit 201 controls the calculation unit 202 according to the read image processing program, processes the image data stored in the memory 203 or the intermediate data obtained by processing the image data, and measures the patterns.
  • The pattern measurement results are sent to the overall control unit 113 via the input/output I/F 200, and the measurement results are displayed on the control terminal 114 shown in FIG. 1.
  • Operation commands for the image processing unit 110 are input from the overall control unit 113 to the image processing control unit 201 via the input/output I/F 200.
  • Data within the image processing unit 110 are exchanged via the bus 204.
  • FIG. 3, FIG. 4, and FIG. 5 show examples of pattern images to be detected.
  • The pattern image 301 in FIG. 3 is a pattern in which the holes 302 to be measured are arranged in a square lattice.
  • The pattern image 401 in FIG. 4 is a pattern in which the holes of the pattern image 301 are arranged with a 45-degree inclination. There are also patterns in which two holes are regularly arranged as a set, such as the pattern image 501 in FIG. 5.
  • Measurement is performed by detecting each individual hole pattern in the entire image, setting a hole region 303, and applying the length measurement algorithm to the region 303.
  • A region in which the length measurement algorithm is executed, for example a region surrounded by a dotted line such as 303 or 304 in FIG. 3, is called a measurement cursor.
  • In FIG. 6, the input image 601 is the image in which measurement cursors are to be set, and the input image 602 is a template image of the measurement target pattern.
  • The array pattern detection process 603 analyzes the periodicity in the image and yields a result 605 in which the hole positions are detected.
  • Template matching 604 is performed on the image 601 using the template image 602, yielding a result 606 in which the hole positions are detected.
  • Each of the array pattern detection method and the template matching method has a pattern that is difficult to detect.
  • In the array pattern detection result 605, the region 609 is erroneously detected as containing a hole. With template matching, the region 609 is correctly recognized as containing no hole, but the partially missing hole pattern 611 is overlooked, and the region 610, which extends over a plurality of holes, is erroneously detected as a region where a hole pattern exists.
  • The overlooked pattern 611 is a hole that does not appear closed in the image because of distortion or charging. This pattern 611 is detected as a hole region in the image 605 produced by the array pattern detection method. Therefore, by integrating the two results in the integration process 607, a detection result 608 in which the length measurement regions are set by measurement cursors without omissions or erroneous detections is finally obtained.
  • For the template matching 604 in FIG. 6, the region 303 of the pattern image 301 in FIG. 3 is used as the template image. Let the height of the template image be N, its width M, I(i, j) the pixel value of the pattern image 301 at the coordinates (i, j) within the template position, and T(i, j) the pixel value of the template image 303; template matching is then performed using the normalized correlation expressed by (Equation 1) (an illustrative sketch follows this item).
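The text does not reproduce (Equation 1) itself, so the following sketch is only an illustration of the kind of normalized correlation described: a zero-mean normalized cross-correlation of the template T against every position of the image I. The function name and the exact normalization are assumptions, not the patent's definition.

```python
import numpy as np

def ncc_map(image: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Slide the N x M template over the image and return the correlation
    coefficient R_NCC at every valid position (a zero-mean normalized
    cross-correlation; the patent's exact (Equation 1) may differ).
    Straightforward, unoptimized implementation for illustration only."""
    N, M = template.shape
    H, W = image.shape
    T = template.astype(float) - template.mean()
    t_norm = np.sqrt((T ** 2).sum())
    out = np.zeros((H - N + 1, W - M + 1))
    for y in range(H - N + 1):
        for x in range(W - M + 1):
            patch = image[y:y + N, x:x + M].astype(float)
            P = patch - patch.mean()
            denom = np.sqrt((P ** 2).sum()) * t_norm
            out[y, x] = (P * T).sum() / denom if denom > 0 else 0.0
    return out  # peaks of this map (cf. peaks 802, 804) mark candidate holes
```

Thresholding this map (cf. the threshold 805) would give a template-matching detection result of the kind shown as 606.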
  • Depending on the pattern, the correlation coefficient R_NCC can show a high peak at a location where no hole exists, which may cause erroneous detection.
  • For example, the correlation coefficient has a high peak in the region 403, which is surrounded by the hole patterns.
  • The vertical axis 801 of the graph in FIG. 8 is the correlation coefficient obtained by the normalized cross-correlation of (Equation 1) while shifting the center of the template image 402 along the arrow 702, and the horizontal axis 806 is the x coordinate along the arrow 702.
  • The correlation coefficient shows peaks 802 and 804 at the points 703 and 705 where holes exist. The point 704 is surrounded by the edges indicated by 706 and contains pixel components whose shape is close to that of a hole, so the correlation coefficient also shows a peak 803 there. In the absence of noise and similar influences, the correlation coefficient at the point 704 is sufficiently lower than at the points 703 and 705, so only the hole patterns can be detected by setting the threshold 805.
  • Next, the array pattern detection 603 in FIG. 6 will be described.
  • A discrete Fourier transform is applied to the image 1101 in FIG. 11, in which the hole patterns are arranged in a lattice, and the spectrum image 1201 in FIG. 12 is calculated.
  • In the spectrum image, the frequency increases from the center outward, and lattice-like peaks appear according to the periodicity present in the pattern image.
  • The horizontal interval 1102 of the pattern in FIG. 11 corresponds to the peak point 1203 in the spectrum image of FIG. 12, and the vertical interval 1103 corresponds to the peak point 1204.
  • Let p be the frequency 1205 of the peak point 1203 in the x-axis direction, q the frequency 1206 of the peak point 1204 in the y-axis direction, P the horizontal interval 1102 of the pattern image, and Q the vertical interval 1103. P and Q can then be calculated by (Equation 2) and (Equation 3), where N is the horizontal width of the pattern image and M is its vertical width (a sketch of this calculation follows below).
  • By placing the grid 1105 at the obtained intervals P and Q with the center position of the template region 1104 as the base point, the locations where the pattern occurs periodically are obtained as the grid intersections.
  • The base point of the grid may also be obtained from the phase component of the peak frequency obtained by the Fourier transform.
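Equations 2 and 3 are not reproduced in the text; the sketch below assumes the usual discrete Fourier transform relation, P = N/p and Q = M/q, where p and q are the peak frequency indices. The function name and the restriction to an axis-aligned lattice are assumptions made for illustration.

```python
import numpy as np

def lattice_intervals(image: np.ndarray):
    """Estimate the horizontal interval P and vertical interval Q of a
    lattice-like hole arrangement from the peaks of its Fourier spectrum
    (array pattern detection 603, axis-aligned case as in FIG. 11/12)."""
    M, N = image.shape                        # M: vertical size, N: horizontal size
    spec = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    cy, cx = M // 2, N // 2
    spec[cy, cx] = 0.0                        # suppress the DC component (center)

    p = int(np.argmax(spec[cy, cx + 1:])) + 1    # peak frequency along x (cf. 1205)
    q = int(np.argmax(spec[cy + 1:, cx])) + 1    # peak frequency along y (cf. 1206)
    P = N / p                                 # assumed form of (Equation 2)
    Q = M / q                                 # assumed form of (Equation 3)
    return P, Q
```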
  • The image 1401 in FIG. 14 is the Fourier spectrum image of the hole pattern image 1301 of FIG. 13, in which the holes are arranged in an inclined lattice.
  • The grid 1305 on the pattern image 1301 of FIG. 13 corresponds to the peak point 1402 of the spectrum image 1401 of FIG. 14.
  • This peak point 1402 is the first or second largest peak excluding the center point 1403.
  • The angle 1304 formed by the X axis and the line 1303 perpendicular to the grid 1305 in FIG. 13 is equal to the angle 1404 formed by the X axis and the straight line connecting the center point 1403 to the peak point 1402 in the spectrum image of FIG. 14.
  • The period of the peak point 1402 is represented by the distance 1405 from the center point 1403 to the peak point 1402, and the grid interval 1302 may be obtained from (Equation 2) (a sketch follows below).
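For the inclined lattice of FIG. 13, the same idea can be applied to the strongest off-center peak of the spectrum. The sketch below is a hedged illustration; the generalization to non-square images and the function name are assumptions.

```python
import numpy as np

def tilted_grid(image: np.ndarray):
    """Grid direction (cf. angle 1404) and grid interval (cf. 1302) from the
    largest non-DC peak of the Fourier spectrum (cf. peak point 1402)."""
    M, N = image.shape
    spec = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    cy, cx = M // 2, N // 2
    spec[cy, cx] = 0.0                          # exclude the center point 1403
    py, px = np.unravel_index(np.argmax(spec), spec.shape)
    fy, fx = py - cy, px - cx                   # frequency offsets from the center
    angle = np.degrees(np.arctan2(fy, fx))      # direction of the line center -> peak
    # spatial frequency in cycles/pixel; its inverse is the grid interval
    interval = 1.0 / np.hypot(fx / N, fy / M)
    return angle, interval
```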
  • The arrangement of the pattern image 1501 can be decomposed into an arrangement in which one hole sits on the grids 1701 and 1702 in FIG. 17 and an arrangement defined by the grids 1801 and 1802 in FIG. 18.
  • The grid 1701 of FIG. 17 arises corresponding to the peak 1602 in the spectrum image of FIG. 16, the grid 1702 to the peak 1603, the grid 1801 of FIG. 18 to the peak 1604, and the grid 1802 to the peak 1605.
  • These four peaks are the four largest peaks in the spectrum image excluding the center. Therefore, by setting a threshold and detecting the peaks that are equal to or higher than the threshold, the top four peaks can be detected.
  • The maximum frequency of the peaks to be detected may be determined by a parameter.
  • Because the array pattern detection 603 detects patterns statistically from the arrangement of many hole patterns, it is less affected than the template matching 604 even when individual pattern shapes collapse under noise or distortion.
  • However, pattern detection cannot be performed correctly for pattern images such as 1901 and 2101, in which the regularity of the arrangement is partially broken, as shown in FIGS. 19 and 21. Therefore, as shown in FIG. 6, the integration process 607 is performed to compensate for the weaknesses of the template matching 604 and the array pattern detection 603.
  • The image 2201 in FIG. 22 is the correlation coefficient image obtained by performing template matching on the hole pattern image 301 shown in FIG. 3.
  • The pixel value at the pixel 2204 of the correlation coefficient image 2201 is the correlation coefficient obtained when the center of the template image 303 is aligned with the pixel 2204, and corresponds to a peak value of the waveform shown in the graph of FIG. 8.
  • The correlation coefficient image 2201 contains peaks 2202 generated at the centers of the holes and peaks 2203 generated in the regions between holes.
  • The image 2301 in FIG. 23 represents the array pattern detection result image for the hole pattern image 301 shown in FIG. 3.
  • The band-like region 2303 is the region obtained by widening the grid 2302 obtained by the array detection by the width 2304.
  • A similar band-like region 2308 is obtained for the grid 2305.
  • The regions 2306 where the grid bands intersect are the regions where, according to the array pattern detection, holes are likely to exist; in this example they are indeed the hole positions of the pattern image 301.
  • Consider, for example, the case where the correlation coefficient 2202 at the hole center positions in FIG. 22 is high and the correlation coefficient 2203 between the holes is 0.7.
  • In the array detection result, a weighting factor corresponding to the probability that a pattern exists at each position is set: 1.0 in the regions 2306 where the bands overlap and 0.6 elsewhere.
  • The value (integrated value) obtained by multiplying the correlation coefficient and the weighting factor at the same coordinates of the image 2201 and the image 2301 is taken as the integration result.
  • In the integration result, the integrated value at the hole positions is higher than the integrated value between the holes, and by applying a threshold to the integrated value the holes can be detected without erroneous detection (a rough code sketch of this integration follows).
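As a rough illustration of the integration process 607 just described (correlation coefficient multiplied by a position-dependent weighting factor derived from the grid bands, then thresholded), a sketch follows. The band width, the weight values 1.0 and 0.6, and the threshold are only the example figures used above; the helper names are hypothetical.

```python
import numpy as np

def array_weight_map(shape, pitch_x, pitch_y, offset=(0.0, 0.0), band=3.0,
                     w_in=1.0, w_out=0.6):
    """Weight image from array pattern detection: positions within `band`
    pixels of a grid intersection (cf. regions 2306) get w_in, others w_out."""
    H, W = shape
    y, x = np.mgrid[0:H, 0:W]
    # distance to the nearest grid line in each direction
    dx = np.abs((x - offset[1] + pitch_x / 2) % pitch_x - pitch_x / 2)
    dy = np.abs((y - offset[0] + pitch_y / 2) % pitch_y - pitch_y / 2)
    return np.where((dx <= band) & (dy <= band), w_in, w_out)

def integrate(corr_image: np.ndarray, weight_image: np.ndarray, threshold=0.55):
    """Integrated value = correlation coefficient x weighting factor at the
    same coordinates; positions above the threshold are taken as holes."""
    integrated = corr_image * weight_image
    return integrated, integrated >= threshold
```

With the example numbers above, a between-hole correlation peak of 0.7 that lies off the grid intersections gives an integrated value of 0.7 × 0.6 = 0.42, while a hole-center peak on an intersection keeps its full correlation value, so a single threshold separates the two cases.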
  • The image 2401 in FIG. 24 represents the hole arrangement detection result image for the hole pattern image 1501 shown in FIG. 15.
  • Band-like regions can be defined with the widths 2406, 2407, 2408, and 2409, and the region 2410 where all of them overlap can likewise be defined as a region where a hole exists.
  • Lines 1904, 2003, and 2104 in each pattern image are grids obtained by array pattern detection.
  • The hole pattern image 1901 in FIG. 19 is a pattern image in which holes are arranged in a lattice, but no holes exist in the region 1902.
  • In template matching, the region 1902 contains even fewer hole-like components than the region 1903, which is prone to erroneous detection, so its correlation coefficient is close to 0.0.
  • Although the weighting factor there is 1.0, the integrated value obtained by multiplying it by the correlation coefficient is close to 0.0, so no erroneous detection occurs.
  • In the regions of FIG. 19 where holes are present, the result is equivalent to the processing result for the pattern image 301, so the patterns are detected correctly by the integration process.
  • The hole pattern image 2001 in FIG. 20 contains the end of a repetitive pattern.
  • In the region 2002 the correlation coefficient is close to 0.0, so the integrated value is also close to 0.0 regardless of the array pattern detection result, and no erroneous detection occurs.
  • In the regions where holes are present, the same result as for the pattern image 301 is obtained, so the patterns are detected correctly by the integration process.
  • The hole pattern image 2101 in FIG. 21 is a pattern image in which holes are arranged in a lattice and in which there is a hole 2102 unrelated to the repetitive pattern.
  • Suppose the correlation coefficient of the hole 2102 obtained by template matching is 0.8.
  • Because the hole 2102 does not lie on the detected grid, its weighting factor is 0.6.
  • Since the holes 2103 forming the repetitive pattern of the pattern image 2101 are spaced widely enough that another hole 2102 fits between them, the correlation coefficient does not rise at locations other than holes. The integrated value of the hole 2102 is therefore larger than that of regions where no hole exists, and the pattern can be detected correctly by thresholding the integrated value.
  • An image 2501 in FIG. 25 is a pattern image in which lines are regularly arranged.
  • For the image 2501, template matching is performed with the region 2502 as the pattern region to be measured.
  • The image 2601 in FIG. 26 is a pattern image in which lines are regularly arranged horizontally and are partially connected in the vertical direction; the region 2602 lies inside the pattern.
  • When template matching is performed with a line end such as the region 2603 as the measurement target, there is a high possibility of erroneous detection in the region 2604, which has a similar shape. That is, it is difficult to detect patterns such as the image 2601 in FIG. 26 with a template matching method alone.
  • With the measurement cursor detection algorithm described with reference to FIG. 6, however, the targets in these pattern images 2501 and 2601 can be detected without erroneous detection. That is, template matching is performed on the image 2601 of FIG. 26 to create a correlation coefficient image as described for FIG. 22; next, an array pattern detection result image as described for FIG. 23 is created; a weighting factor is defined for each region of the array pattern detection result image, the integrated value is obtained by multiplying it by the correlation coefficient at the same coordinates of the correlation coefficient image, and threshold processing of the integrated value then yields correct pattern detection.
  • The template matching 604 in FIG. 6 may instead detect patterns by feature point matching, which performs matching at characteristic parts of the pattern contained in the template image.
  • The extraction of characteristic parts may be performed using a feature quantity extraction method called SIFT, as described in Non-Patent Document 2.
  • The correlation is evaluated by computing, from the matching target image, the feature quantity at the coordinates corresponding to each feature point of the template image and comparing it with the feature quantity at that feature point of the template image.
  • In SIFT, the feature quantity is represented as a histogram over luminance gradient directions, so matching between feature quantities is evaluated with the Kullback-Leibler distance D(n) expressed by (Equation 5).
  • Here, i is the bin number of the histogram, P(n, i) is the histogram of the feature quantity at the n-th feature point, Q(n, i) is the histogram of the feature quantity at the matching destination coordinate corresponding to the n-th feature point, and N is the total number of extracted feature points (a sketch of this distance computation follows below).
  • Feature point matching can be performed faster than ordinary template matching such as normalized cross-correlation, because only the feature points need to be processed when obtaining the correlation, rather than all the pixels of the template image.
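(Equation 5) is not reproduced in the text; the sketch below assumes the standard Kullback-Leibler divergence between the two orientation histograms at each feature point. Whether the patent uses this exact form or a symmetrized variant is not stated, so the formula, like the function name, is an assumption.

```python
import numpy as np

def kl_distance(P: np.ndarray, Q: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """D(n) between the histogram P(n, i) of the n-th template feature point
    and the histogram Q(n, i) at the corresponding matching-destination
    coordinate. P and Q have shape (N, bins); each row is normalized to 1.
    Smaller D(n) means a better match at that feature point."""
    P = P / (P.sum(axis=1, keepdims=True) + eps)
    Q = Q / (Q.sum(axis=1, keepdims=True) + eps)
    return np.sum(P * np.log((P + eps) / (Q + eps)), axis=1)
```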
  • The array pattern detection 603 may also use an autocorrelation function instead of the Fourier transform.
  • The waveform 2702 in FIG. 27 represents the autocorrelation function in the x-axis direction of the pattern image 2701, in which holes are arranged in a lattice.
  • The center 2703 of the graph corresponds to a positional shift of zero.
  • The autocorrelation function R(t) is given by (Equation 7), where t is the displacement 2802 of the shifted image 2803 in the x-axis direction relative to the original image 2801 in FIG. 28, W is the lateral width 2805 of the region 2804 where the shifted image 2803 and the original image 2801 overlap, H is its vertical width 2806, and f(x, y) is the pixel value at the coordinates (x, y) of the original image 2801 (a sketch follows below).
  • The threshold 2704 is applied to the autocorrelation function 2702 obtained in this way, and the peak points 2705 other than the center 2703 are detected.
  • The distance 2706 between the center 2703 and a peak point 2705 is the pattern arrangement interval in the x-axis direction.
  • The autocorrelation function 2707 is also calculated in the y-axis direction, and the pattern arrangement interval is obtained in the same way.
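(Equation 7) is not reproduced in the text; the sketch below assumes an autocorrelation normalized by the size of the W x H overlap region 2804, consistent with the quantities defined above. The peak-picking helper uses a simple local-maximum rule; both function names are hypothetical.

```python
import numpy as np

def autocorr_x(image: np.ndarray, max_shift: int) -> np.ndarray:
    """R(t) for shifts t = 0..max_shift along the x axis: the mean of
    f(x, y) * f(x + t, y) over the overlap region of width W = width - t."""
    H, W = image.shape
    f = image.astype(float)
    R = np.empty(max_shift + 1)
    for t in range(max_shift + 1):
        R[t] = np.mean(f[:, :W - t] * f[:, t:])
    return R

def interval_from_autocorr(R: np.ndarray, threshold: float) -> int:
    """First local maximum other than t = 0 that exceeds the threshold
    (cf. threshold 2704); its shift is the pattern interval (cf. 2706)."""
    for t in range(1, len(R) - 1):
        if R[t] >= threshold and R[t] >= R[t - 1] and R[t] >= R[t + 1]:
            return t
    return 0  # no peak found
```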
  • For an inclined arrangement, the shifting directions are likewise inclined, along the direction of the arrow 2902 and the direction of the arrow 2904 perpendicular to it, and the autocorrelation functions 2903 and 2905 are obtained in the same way.
  • The method using the discrete Fourier transform analyzes the pattern image on the basis of sine waves.
  • The autocorrelation function, by contrast, is based on the pattern shape itself and analyzes the periodicity using more of the image's signal components than the Fourier analysis. Therefore, for images with much noise and distortion, array pattern detection by the autocorrelation function can detect the arrangement more accurately than the method using the discrete Fourier transform.
  • The recipe setting flow will be described with the pattern image 301 of FIG. 3 as the setting target.
  • In step S3001, the pattern region to be measured on the wafer is imaged to obtain the image 301 of FIG. 3.
  • In step S3002, the acquired image 301 is displayed on the GUI.
  • In step S3003, the region 303 to be used as the template image is designated on the image 301 using the GUI.
  • In step S3004, the measurement cursor detection algorithm shown in FIG. 6 is executed using the image 301 and the template image 303 to detect the measurement cursors.
  • In step S3005, the measurement cursor detection result of S3004 and the results of the template matching 604 and the array pattern detection 603, which are intermediate steps, are output on the GUI.
  • In step S3006, the user can change the parameters of the array pattern detection 603, the template matching 604, and the integration process 607 using the GUI.
  • In step S3007, the measurement cursors are detected again using the adjusted parameters, and the display of the cursor detection result and the intermediate processing results is updated. If the user determines that further parameter correction is necessary, the process returns to step S3006 and the parameters are corrected again.
  • In step S3008, the user can add, delete, and correct the positions of the measurement cursors as necessary with respect to the measurement cursor detection result.
  • In step S3009, the imaging position and the relative coordinates of the measurement cursors with respect to the imaging position are stored as recipe data. Using this recipe data, the same pattern region as the image 301 on another chip or on a different wafer can be measured. When the storage of the recipe data is completed in step S3009, this series of processing ends.
  • In step S3101, the recipe data stored in step S3009 are read from the storage device 206.
  • In step S3102, the wafer 107 is moved, under the control of the stage 106, to the same imaging region as the image from which the recipe data were created.
  • In step S3103, the measurement region is imaged at low magnification for alignment and then imaged at the measurement magnification to obtain a pattern image.
  • In step S3104, measurement cursors are set on the acquired image using the cursor coordinates in the recipe data, and measurement processing is performed for each cursor region.
  • In step S3105, if there is a measurement region for which measurement has not been completed, the process returns to S3102; if measurement of all measurement regions has been completed, the process proceeds to S3106.
  • In step S3106, statistics of the measurement results are calculated, the result data are output to the storage device 206, and the measurement ends.
  • The measurement processing performed in S3104 is executed by the calculation unit 202.
  • Within each measurement cursor, ellipse fitting as described in Non-Patent Document 3 is performed, and the major axis, minor axis, and area are calculated. When the measurement target is the distance between patterns, such as between line ends, it is calculated by a threshold method as described in Patent Document 3 (an illustrative sketch of such a threshold method follows below).
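Neither the ellipse fitting of Non-Patent Document 3 nor the threshold method of Patent Document 3 is spelled out in this text. As a rough illustration of a threshold method for pattern-to-pattern distance, the sketch below measures the spacing between the two crossings of a 1-D intensity profile at a level set between its minimum and maximum; the profile direction, the threshold fraction, and the sub-pixel interpolation are assumptions.

```python
import numpy as np

def threshold_distance(profile: np.ndarray, fraction: float = 0.5) -> float:
    """Distance between the outermost points where a line profile crosses the
    level min + fraction * (max - min), with linear sub-pixel interpolation."""
    p = profile.astype(float)
    level = p.min() + fraction * (p.max() - p.min())
    above = p >= level
    idx = np.where(above[1:] != above[:-1])[0]   # samples just before a crossing
    if len(idx) < 2:
        return 0.0                               # fewer than two edges found
    def cross(i):                                # interpolated crossing position
        return i + (level - p[i]) / (p[i + 1] - p[i])
    return cross(idx[-1]) - cross(idx[0])
```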
  • Reference numeral 3201 denotes an example of a GUI window used in steps S3003 and S3006.
  • The image 3202 displays the image acquired in step S3001. The user can set the template image used for template matching by setting the rectangular region 3203 on the image 3202. The sliders 3204 to 3208 set the parameters that can be adjusted in step S3006.
  • The parameters are the threshold 3204 for the integrated value, the width 3205 of the band 2304 used when creating the array pattern detection image, the maximum frequency 3206 used when detecting peaks from the Fourier spectrum image 1201, the threshold 3207 used when detecting a plurality of peaks, and the weight coefficient 3208 used in the integration process.
  • Let C (0 ≤ C ≤ 1) be the value of the correlation coefficient image obtained by template matching at a given coordinate of the pattern image, A (0 ≤ A ≤ 1) the value of the array pattern detection image at the same coordinate, and t (0 ≤ t ≤ 1) the weight coefficient determined by the slider 3208. The integrated value S is then the weighted geometric mean of C and A expressed by (Equation 8) (a hedged sketch follows below).
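(Equation 8) is described only as a weighted geometric mean of C and A with weight t; the assignment of the exponents is not reproduced in the text. The sketch below assumes S = A^t · C^(1−t); the reverse convention is equally consistent with the description, so treat this as an illustration rather than the patent's formula.

```python
import numpy as np

def integrated_value(C: np.ndarray, A: np.ndarray, t: float) -> np.ndarray:
    """Weighted geometric mean of the template matching value C (0 <= C <= 1)
    and the array pattern detection value A (0 <= A <= 1) with weight
    t (0 <= t <= 1). Assumed form: S = A**t * C**(1 - t)."""
    C = np.clip(C, 0.0, 1.0)
    A = np.clip(A, 0.0, 1.0)
    return (A ** t) * (C ** (1.0 - t))
```

Under this assumed form, t close to 1 favors the array pattern detection result and t close to 0 favors template matching, which is one way of realizing the user choice described next.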
  • For example, the hole 3209 is judged by the array pattern detection to be a region where a hole exists, while in template matching it has a low correlation coefficient and is treated as a region with no hole.
  • Depending on whether the weight coefficient t is set close to 1 or close to 0, such a region is or is not selected as a cursor setting target, so the user can choose how such regions are treated.
  • In step S3008, described with reference to FIG. 30, the position of each cursor can be corrected by selecting the cursor 3212 and moving it with the mouse.
  • A cursor can be deleted by pressing the delete button 3214 while it is selected; pressing the add button 3213 creates one cursor at the center, which can then be moved to add a cursor at an arbitrary location.
  • In the display area 3215, intermediate images of the cursor detection process, such as the Fourier spectrum image 1201 described with reference to FIG. 12, the correlation coefficient image 2201 described with reference to FIG. 22, and the array pattern detection result 2301 described with reference to FIG. 23, can be displayed and confirmed by switching the tab 3216.
  • FIG. 32 shows the state in which a Fourier spectrum image is displayed in the display area 3215.
  • The frame line 3217, which represents the maximum frequency set by the slider 3206, indicates the region within which peaks can be detected.
  • The peaks to be detected can be confirmed by the cursors 3218 surrounding the peak points that are equal to or higher than the threshold.
  • The coordinate information of the cursors can be saved in the storage device 206 by pressing the save button 3220 on the GUI screen 3201 in step S3009 of the processing flow of FIG. 30.
  • In step S3301, the wafer 107 is moved to the region of the pattern to be imaged by controlling the stage 106 (corresponding to step S3001).
  • In step S3302, the wafer 107 is irradiated with the electron beam 101, and secondary electrons are detected by the detector 108.
  • In step S3303, the image data obtained from the detector are transferred to the memory 203 and the control terminal 114, and the acquired image is displayed on the GUI of the control terminal 114 (corresponding to step S3002).
  • In step S3304, the template image setting data are received by user input via the GUI and transferred to the memory 203 (corresponding to step S3003).
  • In step S3305, the calculation unit 202 executes the processing shown in FIG. 6 and detects the measurement cursors from the image data and the template image setting data in the memory 203 (corresponding to step S3004).
  • In step S3306, the detection result is transferred to the control terminal 114 and displayed on the GUI (corresponding to step S3005). If it is determined in step S3307 that the user will correct the parameters, the process advances to step S3310 to receive parameter correction data on the GUI, and then returns to step S3305.
  • In step S3308, correction data for the measurement cursor detection result are received via the GUI of the control terminal 114 and transferred to the memory 203 (corresponding to step S3008).
  • In step S3309, the coordinate data of the measurement cursors with the correction data applied are stored in the storage device 206 (corresponding to step S3009), and the processing ends.
  • In the second embodiment, recipe setting steps and measurement different from those of the first embodiment are performed.
  • By performing recipe setting only on the pattern image 301, measurement can also be performed on the pattern image 501, whose arrangement differs from that of the pattern image 301.
  • Since the coordinates of the measurement cursors set at recipe setting time cannot be reused, measurement cursors are set and measured each time a pattern is imaged during measurement. Therefore, the parameters 3204 to 3208 and the template image on the GUI of FIG. 32, adjusted at recipe setting time, are used at measurement time.
  • In step S3401, the template image represented by the region 303 set in step S3003 is stored as recipe data in the storage device 206. The template matching, array pattern detection, and integration processing parameters set in step S3007 are also stored as recipe data in the storage device 206, and the recipe setting is completed.
  • FIG. 35 explains a flow at the time of measurement in the second embodiment.
  • In step S3501, the recipe data saved in step S3401 are read from the storage device 206.
  • In step S3502, the wafer 107 is moved by controlling the stage 106 to the imaging region of the image to be measured.
  • In step S3503, the measurement region is imaged and a pattern image is acquired.
  • In step S3504, the measurement cursor detection process illustrated in FIG. 6 is performed on the acquired pattern image using the template image and parameters from the recipe data, and the measurement cursors are set.
  • In step S3505, measurement processing is performed for each cursor region.
  • In step S3506, if there is a region for which measurement has not been completed, the process returns to S3502; if measurement of all measurement target regions has been completed, the process proceeds to S3507.
  • In step S3507, statistics of the measurement results are calculated, the result data are output to the storage device 206, and the measurement ends.
  • In this way, by performing recipe setting for one measurement region, other regions on the same mask data that have the same unit measurement pattern in a different arrangement can be measured without setting a recipe for them.
  • Likewise, for wafers manufactured with the same materials and processes but with mask data different from the region for which the recipe was set, measurement regions with the same measurement pattern can be measured without setting a recipe. Moreover, by reducing the time spent on recipe setting, the burden on the user is reduced and the operating rate of the measuring apparatus is increased.
  • The third embodiment targets the pattern images 301 and 501, whose hole patterns have the same designed size and shape.
  • For the pattern image 301, recipe data are created in the same way as in the second embodiment.
  • For the pattern image 501, recipe setting as described in the first embodiment is then performed using those recipe data, which simplifies the processing steps of FIG. 30.
  • FIG. 36 shows the flow of processing in this embodiment.
  • First, a recipe setting step is performed for one measurement region (for example, the pattern image 301) in the same manner as the processing flow of the second embodiment described with reference to FIG. 34, and the template image and parameter data are stored.
  • Then, recipe setting is performed according to the processing flow of FIG. 36 for regions of patterns having the same unit measurement pattern in a different arrangement (for example, the region corresponding to the pattern image 501).
  • In step S3601, a wafer image (for example, the pattern image 501) is acquired.
  • In step S3602, the recipe data (template image and parameter data) created and stored for one measurement region (for example, the pattern image 301) in the same processing flow as in the second embodiment are read.
  • In step S3603, the measurement cursors are detected (corresponding to step S3004).
  • In step S3604, the measurement cursor detection result of S3603 and the results of the template matching 604 and the array pattern detection 603, which are intermediate steps, are output on the GUI (corresponding to step S3005).
  • In step S3605, the user can change the parameters of the array pattern detection 603, the template matching 604, and the integration process 607 using the GUI (corresponding to step S3006).
  • In step S3606, the measurement cursors are detected again using the adjusted parameters, and the display of the cursor detection result and the intermediate processing results is updated (corresponding to step S3007). If the user determines that further parameter correction is necessary, the process returns to step S3605 and the parameters are corrected again.
  • In step S3607, the user can add, delete, and correct the positions of the measurement cursors as necessary with respect to the measurement cursor detection result (corresponding to step S3008).
  • In step S3608, the imaging position and the relative coordinates of the measurement cursors with respect to the imaging position are stored as recipe data (corresponding to step S3009). Using this recipe data, the same pattern region as the image 301 on another chip or a different wafer can be measured. When the storage of the recipe data is completed in step S3608, this series of processing ends.
  • The same applies when, after a first region has been processed, a pattern having the same shape as that of the first region is formed in an arrangement different from that of the first region.
  • In that case, steps S3002 and S3003 are omitted, and the operations after S3003 are performed to create recipe data containing the length-measurement cursor coordinate data. As in the first embodiment, measurement as shown in FIG. 31 can then be performed using this recipe data.
  • Compared with the first embodiment, the step of designating the template region in step S3003 can be omitted, and because already-adjusted parameters are reused, the parameter adjustment in step S3605 requires only minimal correction.
  • As a result, the recipe setting time of the first embodiment is shortened, the burden on the user is reduced, and the operating rate of the measuring apparatus is increased.

Landscapes

  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Length-Measuring Devices Using Wave Or Particle Radiation (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

In order to reliably detect hole patterns and carry out pattern measurement even under the influence of noise, distortion, and surrounding patterns in an image of densely packed, regularly arranged patterns, patterns on a substrate are imaged; pattern arrangement detection processing detects the regular arrangement of the patterns within the image region; template matching detects the positions of the patterns within the image region; the position of each pattern is detected by integrating the results of the pattern arrangement detection and the template matching; measurement cursors are set; and pattern dimension measurement processing is carried out for each measurement cursor.

Description

Pattern dimension measurement method and apparatus

 The present invention relates to a method and apparatus for measuring the dimensions of a plurality of patterns formed on a semiconductor wafer, and more particularly to a pattern dimension measurement method and apparatus suitable for measuring the dimensions of a large number of fine patterns formed densely on a substrate.

 In the measurement of semiconductor patterns using a scanning electron microscope (SEM), the number of measurement points within a single field of view has increased in recent years owing to the miniaturization and densification of hole patterns. Shape variation has also increased, so the performance of the conventional pattern matching used to set pattern measurement locations automatically is insufficient, and the man-hours spent on manual setting of measurement cursors by the user have increased.

 Non-Patent Document 1 discloses a technique for improving the performance of template matching itself. Patent Document 1 discloses a method of creating a statistical template image to improve the accuracy of template matching. Patent Document 2 discloses a method of automatically generating a template by analyzing the repeatability of a pattern image and detecting regions having the same pattern.

 Non-Patent Document 2 discloses a method of matching characteristic parts of the pattern contained in a template image using a feature quantity extraction method called SIFT.

 Non-Patent Document 3 discloses a method of obtaining the dimensions and area of a hole pattern by ellipse fitting. Patent Document 3 describes measuring the distance between patterns using a threshold method.

 JP 2010-276487 A; JP 2011-70602 A; Japanese Patent No. 4262592
 When measuring the dimensions of patterns using an image of regularly arranged patterns, a measurement region (measurement cursor) must be set for each pattern before the measurement algorithm is executed. To set these measurement cursors automatically, a known method registers the pattern region to be measured as a template from an image acquired in advance and sets the measurement cursors automatically by template matching. However, for patterns with large shape deformation and instability, conventional template matching cannot deliver sufficient performance.

 The methods disclosed in Non-Patent Documents 1 and 2 and Patent Documents 1 and 2 detect patterns either by template matching or by detecting the regular arrangement of patterns in an image based on its repeatability. With template matching, the matching accuracy decreases in proportion to the instability of the pattern shape at the matching destination. The method of detecting a regular arrangement of patterns malfunctions when the regularity does not hold locally or when there is no regularity in the image as a whole.

 The present invention solves these problems of the prior art and provides a pattern dimension measurement method and apparatus that can accurately detect the positions of patterns and measure their dimensions for a large number of fine patterns formed densely on a substrate, even when the pattern shape is unstable, when the regularity of the pattern arrangement does not hold locally, or when there is no regularity as a whole.

 To this end, in the pattern dimension measurement method of the present invention, a plurality of patterns originally having the same shape formed on a sample are imaged to acquire images of the plurality of patterns; pattern information extracted from the acquired images by template matching is integrated with pattern information extracted using the periodicity of the pattern arrangement in those images; measurement cursors are set using the integration result; dimension measurement regions are set on the acquired images using the measurement cursors; and the image of the pattern within each dimension measurement region is processed to measure the dimension of that pattern.

 Likewise, the pattern dimension measurement apparatus of the present invention comprises: image acquisition means for imaging a plurality of patterns originally having the same shape formed on a sample and acquiring images of the plurality of patterns; measurement cursor setting means for extracting pattern information from the acquired images by template matching, extracting pattern information using the periodicity of the pattern arrangement in the images, integrating the two sets of pattern information, and setting measurement cursors using the integration result; dimension measurement region setting means for setting dimension measurement regions on the acquired images using the measurement cursors; and dimension measurement means for processing the image of the pattern within each dimension measurement region and measuring the dimension of that pattern.

 In the present invention, two techniques are used: template matching and detection of the regular arrangement of patterns. Because array pattern detection detects the pattern arrangement statistically from the periodicity of the entire image, it can detect patterns that template matching misses due to insufficient performance. By integrating the two results, patterns can be detected stably and measurement cursors can be set automatically.

 As a result, according to the present invention, even when the pattern shape is unstable, when the regularity of the pattern arrangement does not hold locally, or when there is no regularity as a whole, the position of each pattern can be detected accurately and its dimensions can be measured for a large number of fine patterns formed densely on a substrate.
 FIG. 1 is a block diagram showing the hardware configuration according to the first embodiment of the present invention. FIG. 2 is a block diagram showing the configuration of the image processing unit according to the first embodiment. FIGS. 3, 4, and 5 are images of measurement target patterns according to the first embodiment. FIG. 6 is a flow diagram of the measurement cursor detection algorithm according to the first embodiment. FIG. 7 is a pattern image explaining template matching according to the first embodiment. FIG. 8 is a graph showing the distribution of the correlation coefficient between the template image and the pattern image obtained by template matching. FIG. 9 is an image obtained by imaging an actual pattern with the SEM, explaining template matching. FIG. 10 is a graph showing the distribution of the correlation coefficient between the template image and the SEM image of the actual pattern. FIG. 11 is an image of a measurement target pattern. FIG. 12 is a spectrum image of the measurement target pattern of FIG. 11. FIG. 13 is an image of a measurement target pattern. FIG. 14 is a spectrum image of the measurement target pattern. FIG. 15 is an image of a measurement target pattern. FIG. 16 is a Fourier spectrum diagram of the measurement target pattern. FIGS. 17 and 18 are images of measurement target patterns. FIG. 19 is an image of a measurement target pattern in which part of the pattern arrangement is disturbed. FIG. 20 is an image of a measurement target pattern in which the end of a repetitive pattern is present in the image. FIG. 21 is an image of a measurement target pattern in which part of the pattern arrangement is disturbed.
 FIG. 22 is a correlation coefficient image obtained by performing template matching on a measurement target pattern. FIGS. 23 and 24 are images showing results of array pattern detection for measurement target patterns. FIGS. 25 and 26 are images of measurement target patterns. FIG. 27 shows an image of a measurement target pattern and the distribution of its autocorrelation function in the X and Y directions. FIG. 28 shows the positional relationship between the original image and the shifted image when calculating the autocorrelation function. FIG. 29 illustrates array pattern detection by the autocorrelation function along the two orthogonal arrangement directions of a measurement target pattern. FIG. 30 is a flow diagram showing the processing steps of the first embodiment. FIG. 31 is a flow diagram showing the flow of processing at the time of measurement in the first embodiment. FIG. 32 is a front view of the GUI according to the first embodiment. FIG. 33 is a flow diagram showing the operating procedure of the apparatus at the time of recipe setting in the first embodiment. FIG. 34 is a flow diagram showing the processing steps of the second embodiment. FIG. 35 is a flow diagram showing the processing procedure at the time of measurement in the second embodiment. FIG. 36 is a flow diagram showing the processing steps of the third embodiment.
 以下、本発明に係る第1の実施の形態について、全体構成を説明した後、各処理の内容を順次述べる。 Hereinafter, after describing the overall configuration of the first embodiment according to the present invention, the contents of each process will be sequentially described.
 図1に本発明に関わる微細パターン測定装置の全体構成図を示す。 
 本実施形態である測長SEM100は、計測用ウェーハ(試料)107を載置するステージ106、電子銃102より放出された電子ビーム101を制御する照射光学系、試料上から放出される2次電子を検出する検出器108、検出信号の信号処理系より構成される。照射光学系は、電子銃102、および、電子ビーム101の経路上にあるコンデンサレンズ103、偏向コイル104、対物レンズ105により構成される。電子ビーム101はこの光学系によりウェーハ107上の計測対象であるパターンがある所定の領域で集光される。
FIG. 1 shows an overall configuration diagram of a fine pattern measuring apparatus according to the present invention.
A length measurement SEM 100 according to this embodiment includes a stage 106 on which a measurement wafer (sample) 107 is placed, an irradiation optical system that controls an electron beam 101 emitted from an electron gun 102, and secondary electrons emitted from the sample. And a signal processing system for detection signals. The irradiation optical system includes an electron gun 102, a condenser lens 103, a deflection coil 104, and an objective lens 105 on the path of the electron beam 101. The electron beam 101 is condensed by this optical system in a predetermined area where the pattern to be measured on the wafer 107 is present.
The secondary electrons detected by the detector 108 are converted into a digital signal by the A/D converter 109. The converted digital signal is sent to the image processing unit 110, which reads out the digital signal stored in its memory as needed, performs image processing, and carries out pattern dimension measurement and the like. Reference numeral 111 denotes a stage controller, 112 an electron optical system control unit, 113 a control unit for the entire apparatus, and 114 a control terminal connected to the control unit.
The image processing unit 110, the overall control unit 113, and the control terminal 114 can be connected to a recording medium (not shown), and a program to be executed by the image processing unit 110 can be read from this recording medium and loaded into the image processing unit 110.
FIG. 2 shows the configuration of the image processing unit 110. The secondary electron signal converted into a digital signal by the A/D converter 109 is sent to the memory 203 via the data input I/F 205 and stored there so that it can be read out as image data. An image processing program is read from the memory 203 or from the above recording medium by the image processing control unit 201. Following the loaded program, the image processing control unit 201 controls the arithmetic unit 202, which processes the image data stored in the memory 203, or intermediate data obtained by processing that image data, and measures the pattern.
The pattern measurement result is sent to the overall control unit 113 via the input/output I/F 200 and displayed on the control terminal 114 shown in FIG. 1. Operation commands for the image processing unit 110 are input from the overall control unit 113 to the image processing control unit 201 via the input/output I/F 200. Data inside the image processing unit 110 is exchanged via the bus 204.
FIGS. 3, 4, and 5 show examples of pattern images to be detected. The pattern image 301 of FIG. 3 is a pattern in which holes 302 to be measured are arranged in a square lattice. The pattern image 401 of FIG. 4 is a pattern in which the hole arrangement of the pattern image 301 is rotated by 45 degrees. There are also patterns such as the pattern image 501 of FIG. 5, in which pairs of two holes are arranged regularly. Measurement is performed by detecting each individual hole pattern in the whole image, setting a hole region 303, and applying a length-measurement algorithm to the region 303. A region in which the length-measurement algorithm is executed, for example a region enclosed by a dotted line such as 303 or 304 in FIG. 3, is called a measurement cursor.
FIG. 6 outlines the method for detecting measurement cursors. The input image 601 is the image in which measurement cursors are to be set, and the input image 602 is a template image of the pattern to be measured. First, array pattern detection 603 analyzes the periodicity in the image and yields a result 605 in which the hole positions have been detected. In parallel, template matching 604 is performed on the image 601 using the template image 602, yielding a result 606 in which the hole positions have been detected.
Each of the two methods, array pattern detection and template matching, has patterns that it is poor at detecting. In the example of FIG. 6, when the hole positions are detected by array pattern detection, a region 609 in which no hole actually exists is erroneously detected as containing a hole. Template matching, on the other hand, correctly recognizes the region 609 as containing no hole, but it misses a partially broken hole pattern 611 and erroneously detects a region 610 spanning several holes as a region containing a hole pattern. The missed pattern 611 is a hole whose contour appears not to be closed in the image because of distortion or charging. This pattern 611 is detected as a hole region in the result 605 of array pattern detection. By integrating the two results in an integration process 607, a detection result 608 is finally obtained in which the length-measurement regions are set by measurement cursors without misses or false detections.
The template matching 604 of FIG. 6 is described in detail with reference to FIGS. 3 and 4. For the pattern image 301 of FIG. 3, the region 303 is used as the template image. Let N be the height of the template image, M its width, I(i, j) the pixel value of the pattern image 301 at coordinates (i, j) of the template window, and T(i, j) the pixel value of the template image 303. Template matching is then performed using the normalized correlation expressed by (Equation 1).
(Equation 1)
$$R_{NCC} = \frac{\sum_{j=0}^{N-1}\sum_{i=0}^{M-1} I(i,j)\,T(i,j)}{\sqrt{\sum_{j=0}^{N-1}\sum_{i=0}^{M-1} I(i,j)^{2}}\;\sqrt{\sum_{j=0}^{N-1}\sum_{i=0}^{M-1} T(i,j)^{2}}}$$
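As an illustrative sketch of how a correlation coefficient image of this kind can be computed, the following Python/NumPy function slides the template over the pattern image and evaluates the normalized cross-correlation at every valid position. The function name `ncc_map` and the exact normalization (for example, whether or not the means are subtracted) are assumptions and not taken from the original disclosure.

```python
import numpy as np

def ncc_map(image, template):
    """Correlation coefficient image: slide the template over the pattern image
    and store the normalized cross-correlation at each valid position
    (a sketch of (Equation 1); the exact normalization is assumed)."""
    N, M = template.shape                       # N: template height, M: template width
    H, W = image.shape
    t = template.astype(float)
    t_norm = np.sqrt((t * t).sum())
    out = np.zeros((H - N + 1, W - M + 1))
    for y in range(H - N + 1):
        for x in range(W - M + 1):
            win = image[y:y + N, x:x + M].astype(float)
            denom = np.sqrt((win * win).sum()) * t_norm
            out[y, x] = (win * t).sum() / denom if denom > 0 else 0.0
    return out
```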
In this case, the correlation coefficient R_NCC may have a high peak in the region 304 surrounded by hole patterns, which can cause false detections. Similarly, in the image 401 of FIG. 4, when 402 is used as the template image, the correlation coefficient has a high peak in the region 403 surrounded by hole patterns.
The image 701 of FIG. 7 is an enlarged view of a partial region of the image 401 and is assumed to be an ideal image unaffected by noise. The vertical axis 801 of the graph in FIG. 8 is the correlation coefficient obtained by the normalized cross-correlation of (Equation 1) while shifting the center of the template image 402 along the arrow 702, and the horizontal axis 806 represents the x coordinate along the arrow 702. The correlation coefficient shows peaks 802 and 804 at the points 703 and 705 where holes exist. The point 704 is surrounded by the edges indicated by 706 and has pixel components whose shape resembles a hole, so the correlation coefficient also has a peak 803 there. In the absence of noise, the correlation coefficient at the point 704 is sufficiently lower than the correlation coefficients at the points 703 and 705, so only the hole patterns can be detected by setting a threshold 805.
However, in an image obtained by imaging an actual pattern with an SEM, such as the image 901 of FIG. 9, noise, distortion caused by the imaging conditions, neighboring hole patterns, and the like affect the matching. As the graph of the correlation coefficient along the arrow 902 in FIG. 10 shows, the peak 1001 for the point 903 between holes can therefore become higher than the peak 1002 for the point 904 at a hole center. In this case, if the threshold 1003 is set so that the hole at the point 904 is detected, the point 903, where no hole exists, is falsely detected; if the threshold is instead set to 1004 so that the point 903 is not detected, the hole at 904 is missed. Therefore, in the processing procedure of FIG. 6, separately from the template matching 604, which is easily affected by noise and distortion, a periodicity analysis targeting the regularity of the pattern arrangement is performed as the statistically stable array pattern detection 603.
The array pattern detection 603 of FIG. 6 is described next. A discrete Fourier transform is applied to the image 1101 of FIG. 11, in which hole patterns are arranged in a lattice, to compute the spectrum image 1201 of FIG. 12. The spectrum image represents higher frequencies toward its periphery, and lattice-like peaks appear according to the periodicity present in the pattern image. The horizontal pattern interval 1102 in FIG. 11 corresponds to the peak point 1203 in the spectrum image of FIG. 12, and the vertical interval 1103 in FIG. 11 corresponds to the peak point 1204 in FIG. 12. These peaks can be detected as the first and second largest peak points excluding the central peak 1202.
In FIG. 12, let p be the frequency 1205 of the peak point 1203 in the x-axis direction and q be the frequency 1206 of the peak point 1204 in the y-axis direction, and in FIG. 11, let P be the horizontal interval 1102 of the pattern image and Q be its vertical interval 1103. Then P and Q can be calculated by (Equation 2) and (Equation 3), where N is the width of the pattern image and M is its height.
(Equation 2)
$$P = \frac{N}{p}$$

(Equation 3)
$$Q = \frac{M}{q}$$
In the image 1101 of FIG. 11, by placing a grid 1105 at the obtained intervals P and Q with the center position of the template region 1104 as the base point, the locations where the pattern appears periodically can be obtained as the intersections of the grid. The base point of the grid may also be obtained from the phase component of the peak frequency determined by the Fourier transform.
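A minimal Python/NumPy sketch of this pitch estimation is shown below, assuming an axis-aligned lattice as in FIG. 11 so that the two dominant spectrum peaks lie on the horizontal and vertical frequency axes; the function name and the peak-search details are assumptions.

```python
import numpy as np

def array_pitch_from_fft(pattern, max_freq=None):
    """Estimate the horizontal pitch P and vertical pitch Q of a periodic
    pattern from its Fourier spectrum (cf. Equations 2 and 3). Assumes an
    axis-aligned lattice; the peak search is a simplified sketch."""
    M, N = pattern.shape                         # M: image height, N: image width
    spec = np.abs(np.fft.fftshift(np.fft.fft2(pattern)))
    cy, cx = M // 2, N // 2
    spec[cy, cx] = 0.0                           # suppress the central (DC) peak
    if max_freq is not None:                     # optional limit on the peak frequency
        yy, xx = np.mgrid[0:M, 0:N]
        spec[np.hypot(yy - cy, xx - cx) > max_freq] = 0.0
    p = np.argmax(spec[cy, cx + 1:]) + 1         # frequency p of the strongest horizontal peak
    q = np.argmax(spec[cy + 1:, cx]) + 1         # frequency q of the strongest vertical peak
    P = N / p                                    # Equation 2
    Q = M / q                                    # Equation 3
    return P, Q
```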
FIGS. 13 and 14 show an example of array pattern detection when the hole patterns are arranged obliquely. The image 1401 of FIG. 14 is the Fourier spectrum image of the hole pattern image 1301 of FIG. 13, whose lattice arrangement is tilted. The grid 1305 on the pattern image 1301 of FIG. 13 corresponds to the peak point 1402 of the spectrum image 1401 of FIG. 14. This peak point 1402 is the first or second largest peak point excluding the center point 1403. The angle 1304 between the X axis and the line 1303 perpendicular to the grid 1305 in FIG. 13 is equal to the angle 1404 between the X axis and the straight line connecting the center point 1403 to the peak point 1402 in the spectrum image of FIG. 14.
When the coordinates of the peak point 1402 in FIG. 14 are (u, v), the angle 1404 between the X axis and the straight line connecting the center point 1403 to the peak point 1402 is obtained as θ in (Equation 4).
(Equation 4)
$$\theta = \tan^{-1}\!\left(\frac{v}{u}\right)$$
The period of the peak point 1402 is represented by the distance 1405 from the center point 1403 to the peak point 1402, and the grid interval 1302 can be obtained from it using (Equation 2).
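For the tilted lattice, the grid orientation and spacing can be recovered from the peak offset (u, v), as in the hedged sketch below; the function name and the use of the image width N for the pitch conversion are assumptions.

```python
import numpy as np

def tilted_grid_params(u, v, N):
    """Grid angle and spacing from a spectrum peak at offset (u, v) from the
    center (cf. Equation 4); N is the pattern image width. A sketch that
    assumes the relevant peak has already been located."""
    theta = np.degrees(np.arctan2(v, u))   # angle 1404 between the X axis and the peak direction
    radius = np.hypot(u, v)                # distance 1405 from the center to the peak
    pitch = N / radius                     # grid interval 1302, analogous to Equation 2
    return theta, pitch
```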
Next, an example of array pattern detection is shown for the case where pairs of two holes 1502 are arranged obliquely, as in the pattern image 1501 of FIG. 15. The Fourier spectrum image of the hole pattern image 1501 is shown as 1601 in FIG. 16. The arrangement of the pattern image 1501 can be decomposed into a pattern in which single holes lie on the grids 1701 and 1702 of FIG. 17 and a pattern in which the two-hole sets 1502 lie obliquely on the grids 1801 and 1802 of FIG. 18.
The grid 1701 of FIG. 17 gives rise to the peak 1602 in the spectrum image of FIG. 16, the grid 1702 to the peak 1603, the grid 1801 of FIG. 18 to the peak 1604, and the grid 1802 to the peak 1605. In FIG. 16 these four peaks are the four largest peaks excluding the center of the spectrum image. They can therefore be found by setting a threshold and detecting all peaks at or above it. In this case, to prevent detection of peaks other than 1602, 1603, 1604, and 1605 in FIG. 16, such as the peaks 1606 and 1607, the maximum frequency of the peaks to be detected may be fixed by a parameter.
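The multi-peak detection described above can be sketched as follows in Python/NumPy: all local maxima of the spectrum magnitude at or above a threshold and within a maximum frequency radius are returned, excluding the central peak. The function name and the 3×3 local-maximum criterion are assumptions.

```python
import numpy as np

def spectrum_peaks(spec_shifted, threshold, max_freq):
    """Detect spectrum peaks at or above a threshold within a maximum frequency
    radius, excluding the central (DC) peak, as used when several grids overlap
    (e.g. the two-hole sets of FIG. 15)."""
    M, N = spec_shifted.shape
    cy, cx = M // 2, N // 2
    mag = np.abs(spec_shifted).copy()
    mag[cy, cx] = 0.0                                   # exclude the central peak
    yy, xx = np.mgrid[0:M, 0:N]
    mag[np.hypot(yy - cy, xx - cx) > max_freq] = 0.0    # limit the peak frequency
    peaks = []
    for y in range(1, M - 1):
        for x in range(1, N - 1):
            v = mag[y, x]
            if v >= threshold and v == mag[y - 1:y + 2, x - 1:x + 2].max():
                peaks.append((x - cx, y - cy, v))       # offset (u, v) from the center
    return peaks
```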
Because the array pattern detection 603 detects patterns statistically from the arrangement of many hole patterns, it is less affected than the template matching 604 even if individual pattern shapes are degraded by noise or distortion. However, it cannot detect patterns correctly in images such as the pattern images 1901 and 2101 of FIGS. 19 and 21, in which the regularity of the arrangement is partially broken, or the image 2001 of FIG. 20, which contains the end of the repetition. Therefore, as shown in FIG. 6, the integration process 607 is performed so that the template matching 604 and the array pattern detection 603 compensate for each other's weaknesses.
Details of the integration process 607 in the processing flow of FIG. 6 are described with reference to FIGS. 22 and 23. The image 2201 of FIG. 22 is the correlation coefficient image obtained by performing template matching on the hole pattern image 301 of FIG. 3. The pixel value of a pixel 2204 in the correlation coefficient image 2201 is the correlation coefficient obtained when the center of the template image 303 is aligned with the pixel 2204, and corresponds to the peak height of the waveform shown in the graph of FIG. 8 or FIG. 10. As described with reference to FIGS. 7 and 8, the correlation coefficient image 2201 contains the peaks 2202 that occur at the hole centers and the peaks 2203 that occur in the regions between holes.
The image 2301 of FIG. 23 represents the result image of array pattern detection for the hole pattern image 301 of FIG. 3. The band-like region 2303 is the region obtained by widening the grid line 2302 found by the array detection by the width 2304. A similar band-like region 2308 is obtained for the grid line 2305. Each region 2306 where the grid bands intersect is a region where, according to the array pattern detection, a hole is likely to exist. In this example these regions are indeed the hole positions of the pattern image 301.
Consider, as an example in which the correlation coefficient between holes becomes higher than the correlation coefficient at a hole center because of noise, the case where the correlation coefficient 2202 at a hole center in FIG. 22 is 0.6 and the correlation coefficient 2203 between holes is 0.7. In the image 2301 of FIG. 23, a weight coefficient corresponding to the probability that a pattern exists at each position is set: 1.0 for the regions 2306 where the bands overlap and 0.6 elsewhere. The value (integrated value) obtained by multiplying, at each position, the correlation coefficient of the image 2201 and the weight coefficient of the image 2301 at the same coordinates is taken as the integration result.
The correlation coefficient 2202 at the hole center in FIG. 22 corresponds to the region 2306 of FIG. 23 where a hole is likely to exist, so the product is 0.6, whereas the correlation coefficient 2203 between holes in FIG. 22 corresponds to the region 2307 where a hole is unlikely to exist, so the product is 0.42. In the integration result, the integrated value at a hole position is therefore higher than the integrated value between holes, and by setting a threshold on the integrated value the holes can be detected without false detections.
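A minimal sketch of this integration step is given below: positions near a grid intersection receive the higher weight, all other positions the lower weight, and the product of the correlation coefficient and the weight is thresholded. The function name and the default numeric values are illustrative assumptions.

```python
import numpy as np

def integrate_results(corr_map, grid_xs, grid_ys, band_width,
                      inside_w=1.0, outside_w=0.6, threshold=0.5):
    """Multiply the template-matching correlation image by a weight map derived
    from the detected grid (bands of half-width band_width around each grid
    line) and threshold the integrated value to obtain cursor candidates."""
    H, W = corr_map.shape
    yy, xx = np.mgrid[0:H, 0:W]
    near_x = np.zeros((H, W), dtype=bool)
    near_y = np.zeros((H, W), dtype=bool)
    for gx in grid_xs:                        # vertical grid lines (x coordinates)
        near_x |= np.abs(xx - gx) <= band_width
    for gy in grid_ys:                        # horizontal grid lines (y coordinates)
        near_y |= np.abs(yy - gy) <= band_width
    weight = np.where(near_x & near_y, inside_w, outside_w)
    integrated = corr_map * weight            # integrated value at every pixel
    return integrated >= threshold            # candidate measurement-cursor positions
```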
In FIG. 24, the image 2401 represents the hole array detection result image for the hole pattern image 1501 of FIG. 15. For the grids 2402, 2403, 2404, and 2405 in the respective directions, band-like regions are defined with the widths 2406, 2407, 2408, and 2409, and the regions 2410 where all of them overlap can likewise be defined as the regions where holes exist.
Examples in which the integration process 607 is applied to images such as those of FIGS. 19, 20, and 21, for which the array pattern detection 603 may fail, are described next. The lines 1904, 2003, and 2104 in the respective pattern images are the grids obtained by array pattern detection.
The hole pattern image 1901 of FIG. 19 is a pattern image in which holes are arranged in a lattice but no hole exists in the region 1902. When template matching is performed on this image, the region 1902 has a larger margin to the neighboring holes than the region 1903, which is prone to false detection, so it is little affected by the matching and its correlation coefficient is close to 0.0. In the array pattern detection result, the region 1902 is an intersection of the grid 1904, so its weight coefficient is 1.0; the integrated value obtained by multiplying it by the correlation coefficient is nevertheless close to 0.0, and no false detection occurs. For the regions of FIG. 19 where holes do exist, the result is equivalent to that for the pattern image 301, so the integration process detects the patterns correctly.
The hole pattern image 2001 of FIG. 20 contains the end of a repeated pattern. When template matching is performed on this pattern image, there is no region surrounded by holes in the region 2002 beyond the end of the repetition, so the correlation coefficient there is close to 0.0. Consequently, within the region 2002 the integrated value is also close to 0.0 regardless of the array pattern detection result, and no false detection occurs. For the regions other than 2002, the result is the same as for the pattern image 301, so the integration process detects the patterns correctly.
The hole pattern image 2101 of FIG. 21 is a pattern image in which holes are arranged in a lattice and a hole 2102 unrelated to the repeated pattern also exists. Suppose the correlation coefficient of the hole 2102 obtained by template matching is 0.8. In the array pattern detection result, the hole 2102 does not lie on an intersection of the grid 2104, so its weight coefficient is 0.6 and the integrated value is 0.8 × 0.6 = 0.48, which is smaller than the template matching result alone. However, the holes 2103 that form the repeated pattern of the pattern image 2101 are spaced widely enough that another hole 2102 can fit between them, so the correlation coefficient does not become high anywhere other than at holes. The integrated value of the hole 2102 is therefore larger than that of regions where no hole exists, and the pattern can be detected correctly by thresholding the integrated value.
The measurement cursor detection algorithm of FIG. 6 can also be used for other pattern images. The image 2501 of FIG. 25 is a pattern image in which lines are arranged regularly. When template matching is performed with the region 2502 as the pattern region to be measured, there is a high probability that the region 2503, which has horizontal edge components similar to the region 2502, is falsely detected.
The image 2601 of FIG. 26 is a pattern image in which lines are arranged regularly in the horizontal direction and are partially connected to each other in the vertical direction. The region 2602 lies inside a pattern. When template matching is performed with line ends such as the region 2603 as the measurement target, there is a high probability of false detection in the region 2604, which has a similar shape. That is, for a pattern such as the image 2601 of FIG. 26, it is difficult to detect the pattern using template matching alone.
However, if the measurement cursor detection algorithm described with reference to FIG. 6 is used, the targets in these pattern images 2501 and 2601 can be detected without false detections. That is, template matching is performed on the image 2601 of FIG. 26 to create a correlation coefficient image such as the one described with reference to FIG. 22, and an array pattern detection result image such as the one described with reference to FIG. 23 is created as well. A weight coefficient is defined for each region of this array pattern detection result image, the integrated value is obtained by multiplying it by the correlation coefficient at the same coordinates of the correlation coefficient image, and the pattern can be detected correctly by thresholding the integrated value.
The template matching 604 of FIG. 6 may also detect patterns using feature point matching, which matches characteristic parts of the pattern contained in the template image. These characteristic parts can be extracted using the feature extraction method called SIFT described in Non-Patent Document 2.
To compute the correlation coefficient, the feature values at the coordinates in the matching target image that correspond to the feature points of the template image are calculated and compared with the feature values at the feature points of the template image. In the case of SIFT, a feature value is represented by a histogram over luminance gradient directions, so the matching between feature values is evaluated by computing the Kullback-Leibler distance D(n) expressed by (Equation 5).
(Equation 5)
$$D(n) = \sum_{i} P(n,i)\,\log\frac{P(n,i)}{Q(n,i)}$$
Here i is the bin number of the histogram, P(n, i) is the feature value histogram at the n-th feature point, and Q(n, i) is the feature value histogram at the coordinates of the matching destination corresponding to the n-th feature point.
Using a matching judgment function G(n) such that G(n) = 0 if this distance D(n) is larger than a threshold T and G(n) = 1 if it is smaller, the correlation coefficient is calculated by (Equation 6).
(Equation 6)
$$R = \frac{1}{N}\sum_{n=1}^{N} G(n)$$
Here N is the total number of extracted feature points.
When computing the correlation coefficient, feature point matching needs to process only the feature points rather than every pixel of the template image, so it can be performed faster than ordinary template matching such as normalized cross-correlation.
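The histogram comparison of (Equations 5 and 6) can be sketched as follows, assuming the gradient-orientation histograms of the template feature points and of the corresponding positions in the search image have already been extracted (for example with a SIFT implementation). The function name, the epsilon regularization, and the re-normalization of the histograms are assumptions.

```python
import numpy as np

def feature_match_score(template_hists, target_hists, dist_threshold):
    """Compare the histogram P(n, .) of each template feature point with the
    histogram Q(n, .) at the corresponding position in the search image using
    the Kullback-Leibler distance (Equation 5), and return the fraction of
    matched points (Equation 6)."""
    eps = 1e-12
    P = np.asarray(template_hists, dtype=float) + eps   # shape (num_points, bins)
    Q = np.asarray(target_hists, dtype=float) + eps
    P /= P.sum(axis=1, keepdims=True)
    Q /= Q.sum(axis=1, keepdims=True)
    D = np.sum(P * np.log(P / Q), axis=1)               # distance D(n) per feature point
    G = (D < dist_threshold).astype(float)              # matching judgment G(n)
    return G.mean()                                     # correlation coefficient of Equation 6
```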
The array pattern detection 603 may also use an autocorrelation function instead of the Fourier transform. The waveform 2702 of FIG. 27 represents the autocorrelation function in the x-axis direction of the pattern image 2701, in which holes are arranged in a lattice. The center 2703 of the graph corresponds to a shift of zero. The autocorrelation function R(t) is obtained by (Equation 7), where t is the shift amount 2802 of the image in the x-axis direction relative to the original image 2801 of FIG. 28, W is the width 2805 and H the height 2806 of the region 2804 where the shifted image 2803 and the original image 2801 overlap, and f(x, y) is the pixel value at coordinates (x, y) of the original image 2801.
(Equation 7)
$$R(t) = \frac{1}{WH}\sum_{y=0}^{H-1}\sum_{x=0}^{W-1} f(x,y)\,f(x+t,\,y)$$
A threshold 2704 is applied to the autocorrelation function 2702 obtained in this way, and the peak point 2705 other than the center 2703 is detected. The distance 2706 between the center 2703 and the peak point 2705 is the pattern arrangement interval in the x-axis direction. The autocorrelation function 2707 is also computed in the y-axis direction, and the pattern arrangement interval is obtained in the same way. For a pattern image such as the image 2901 of FIG. 29, in which the lattice arrangement is tilted, the shift directions are also tilted, to the direction of the arrow 2902 and the direction of the arrow 2904 perpendicular to it, and the autocorrelation functions 2903 and 2905 are obtained.
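A simplified Python/NumPy sketch of the x-direction part of this procedure is shown below; the normalization by the zero-shift energy and the local-maximum peak picking are assumptions and differ slightly from the exact form of (Equation 7).

```python
import numpy as np

def x_pitch_from_autocorrelation(img, peak_threshold):
    """Pattern pitch along x from a normalized autocorrelation: shift the image,
    correlate it with itself over the overlapping region, and take the first
    off-center peak above the threshold."""
    H, W = img.shape
    f = img.astype(float)
    energy = np.mean(f * f) + 1e-12
    R = np.empty(W // 2)
    for t in range(W // 2):                    # shift amount t along x
        a = f[:, :W - t]                       # overlapping region of the original image
        b = f[:, t:]                           # ...and of the shifted image
        R[t] = np.mean(a * b) / energy
    peaks = [t for t in range(1, W // 2 - 1)
             if R[t] > peak_threshold and R[t] >= R[t - 1] and R[t] >= R[t + 1]]
    return peaks[0] if peaks else None         # distance to the first peak = x pitch
```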
A method using the discrete Fourier transform analyzes the pattern image on a basis of sine waves. The autocorrelation function, by contrast, uses the pattern shape itself as the basis and analyzes the periodicity using more of the image signal components than a Fourier analysis does. For images with much noise or distortion, array pattern detection based on the autocorrelation function can therefore determine the array pattern more accurately than the method using the discrete Fourier transform.
FIG. 30 shows the flow of the measurement cursor setting operation (recipe setting) in pattern measurement using the measurement cursor detection algorithm of FIG. 6. The setting target is the pattern image 301 of FIG. 3.
First, in step S3001, the pattern region to be measured on the wafer is imaged and the image 301 of FIG. 3 is acquired. In step S3002, the acquired image 301 is displayed on the GUI. In step S3003, the region 303 to be used as the template image is designated on the image 301 via the GUI. In step S3004, using the image 301 and the template image 303, the measurement cursor detection algorithm of FIG. 6 is executed and the measurement cursors are detected. In step S3005, the measurement cursor detection result of S3004 and the results of the intermediate processes, the template matching 604 and the array pattern detection 603, are output on the GUI.
In step S3006, the user can change the parameters of the array pattern detection 603, the template matching 604, and the integration process 607 via the GUI. In step S3007, the measurement cursors are detected again with the adjusted parameters, and the displays of the cursor detection result and the intermediate processing results are updated. If the user judges that the parameters still need to be corrected, the flow returns to step S3006 and the parameters are adjusted again.
If the measurement cursors have been detected at the locations to be measured with essentially no problems, the flow proceeds to step S3008. In step S3008, the user can add or delete measurement cursors and correct their positions with respect to the detection result as necessary. In step S3009, the imaging position and the relative coordinates of the measurement cursors with respect to it are saved as recipe data. Using this recipe data, the same pattern region as in the image 301 on another chip or a different wafer is measured. When the saving of the recipe data is completed in step S3009, this series of processes ends.
The flow of processing at measurement time using the recipe data saved by executing the processing flow of FIG. 30 is described with reference to FIG. 31. In S3101, the recipe data saved in step S3009 is read from the storage device 206. Next, in S3102, the wafer 107 is moved by controlling the stage 106 to the same imaging region as the image from which the recipe data was created. In S3103, the measurement region is imaged at low magnification for alignment and then imaged at the measurement magnification to acquire a pattern image. Next, in S3104, measurement cursors are set on the acquired image using the cursor coordinates in the recipe data, and the measurement process is performed for each cursor region. In S3105, if there is a measurement region that has not yet been measured, the flow returns to S3102; if all measurement regions have been measured, the flow proceeds to S3106. In S3106, statistics of the measurement results are computed, the result data is output to the storage device 206, and the measurement ends.
The measurement process performed in S3104 is executed by the arithmetic unit 202. For hole patterns, ellipse fitting as described in Non-Patent Document 3 is performed within each measurement cursor, and the major axis, minor axis, and area are calculated. When the measurement target is the distance between patterns, such as between line ends, it is calculated by a threshold method as described in Patent Document 3.
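As an illustration of the hole measurement inside one cursor, the following is a generic algebraic least-squares conic fit that returns the center, the major and minor diameters, and the area of the fitted ellipse. It is only a sketch and not necessarily the ellipse fitting method of Non-Patent Document 3; it assumes that the extracted edge points really describe an ellipse.

```python
import numpy as np

def fit_ellipse(xs, ys):
    """Fit the conic a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0 to edge points by
    plain least squares and derive the ellipse center, diameters, and area."""
    x = np.asarray(xs, dtype=float)
    y = np.asarray(ys, dtype=float)
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    _, _, vt = np.linalg.svd(D)
    a, b, c, d, e, f = vt[-1]                      # coefficients of the best-fit conic
    x0, y0 = np.linalg.solve([[2 * a, b], [b, 2 * c]], [-d, -e])   # ellipse center
    f0 = a * x0 * x0 + b * x0 * y0 + c * y0 * y0 + d * x0 + e * y0 + f
    lam = np.linalg.eigvalsh([[a, b / 2], [b / 2, c]])             # quadratic-form eigenvalues
    semi_axes = np.sqrt(-f0 / lam)                 # semi-axis lengths (NaN if not an ellipse)
    major, minor = semi_axes.max(), semi_axes.min()
    return (x0, y0), 2 * major, 2 * minor, np.pi * major * minor
```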
The GUI used at recipe setting in the processing flow described with reference to FIG. 30 is explained with reference to FIG. 32. Reference numeral 3201 denotes an example of the GUI window used in steps S3003 and S3006. The image acquired in step S3001 is displayed at 3202. By setting a rectangular region 3203 on the image 3202, the user can set the template image used for template matching. The sliders 3204 to 3208 set the parameters that can be adjusted in step S3006.
The parameters include the threshold 3204 for the integrated value, the width 3205 of the bands 2304 used when creating the array pattern detection image, the maximum frequency 3206 used when detecting peaks in the Fourier spectrum image 1201, the threshold 3207 used when detecting multiple peaks, and the weight 3208 used when integrating the template matching result and the array pattern detection result.
Let C (0 ≤ C ≤ 1) be the value of the correlation coefficient image obtained by template matching at a given coordinate of the pattern image, A (0 ≤ A ≤ 1) the value of the array pattern detection image at the same coordinate, and t (0 ≤ t ≤ 1) the weight coefficient determined by the slider 3208. The value S of the integration result can then be calculated either as the weighted geometric mean expressed by the following (Equation 8)
(Equation 8)
$$S = C^{\,t}\,A^{\,1-t}$$
or as the weighted average expressed by the following (Equation 9).
(Equation 9)
$$S = t\,C + (1-t)\,A$$
This integrated value is used when the user selects the holes to be measured.
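A sketch of this combination is shown below; the exact placement of the weight t in (Equations 8 and 9) is assumed, chosen so that t close to 1 lets the template matching value dominate, which is consistent with the slider behavior described next.

```python
import numpy as np

def combine_scores(C, A, t, geometric=True):
    """Integrated value S from the template-matching value C and the array-
    pattern value A with weight t: weighted geometric mean (Equation 8) or
    weighted average (Equation 9)."""
    C = np.asarray(C, dtype=float)
    A = np.asarray(A, dtype=float)
    if geometric:
        return np.power(C, t) * np.power(A, 1.0 - t)   # t -> 1: template matching dominates
    return t * C + (1.0 - t) * A
```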
For example, when a defective pattern such as the hole 3209 exists in the pattern image 3202, template matching judges this hole to be a region with a low correlation coefficient where no hole exists, while array pattern detection judges it to be a region where a hole exists. If the user does not want a measurement cursor to be set for the hole 3209, the weight coefficient t is set close to 1 with the slider 3208; if the user wants a cursor to be set, t is set close to 0. In this way the user can choose what becomes a target for cursor setting.
When the parameters have been set and the execute button 3219 is pressed (clicked), the pattern detection process is performed and the result image 3210 with the measurement cursors drawn on it is displayed. A grid 3211 representing the pattern arrangement is also drawn, so the array pattern detection result can be checked. In step S3008 described with reference to FIG. 30, the position of an individual cursor 3212 can be corrected by selecting it and moving it with the mouse. A cursor can be deleted by pressing the delete button 3214 while it is selected, and pressing the add button 3213 creates one cursor at the center of the image, which can then be moved to add a cursor at an arbitrary location.
By switching the tab 3216, the display area 3215 can show intermediate images of the cursor detection process for inspection, such as the Fourier spectrum image 1201 described with reference to FIG. 12, the correlation coefficient image 2201 described with reference to FIG. 22, and the array pattern detection result image 2301 described with reference to FIG. 23. FIG. 32 shows the state in which the Fourier spectrum image is displayed in the display area 3215. In this Fourier spectrum image 3215, the region of detectable peaks can be confirmed by the frame 3217 representing the maximum frequency set by the slider 3206.
When the threshold set by the slider 3207 is changed, the peaks to be detected can be confirmed by the cursors 3218 surrounding the peak points at or above the threshold. When the parameter adjustment and cursor correction are finished, the coordinate information of the cursors can be saved in the storage device 206 by pressing the save button 3220 on the GUI screen 3201 in step S3009 of the processing flow of FIG. 30.
The operation flow of the apparatus in the recipe setting steps of FIG. 30 is described with reference to FIG. 33. To acquire an image of the wafer 107 (corresponding to step S3001), in S3301 the stage 106 is controlled to move the wafer 107 to the region of the pattern to be acquired. In S3302 the wafer 107 is irradiated with the electron beam 101 and secondary electrons are detected by the detector 108. In S3303 the image data obtained from the detector is transferred to the memory 203 and to the control terminal 114, and the acquired image is displayed on the GUI of the control terminal 114 (corresponding to step S3002).
In S3304, the setting data of the template image input by the user is received via the GUI and transferred to the memory 203 (corresponding to step S3003). In S3305, from the image data and the template image setting data in the memory 203, the arithmetic unit 202 executes the processing shown in FIG. 6 and detects the measurement cursors (corresponding to step S3004). In S3306, the detection result is transferred to the control terminal 114 and displayed on the GUI (corresponding to step S3005). In S3307, if the user corrects the detection parameters, the flow proceeds to S3310, the parameter correction data is received on the GUI, and the flow returns to the processing of S3305.
If there is no correction, the flow proceeds to S3308 (corresponding to steps S3006 and S3007). In S3308, correction data for the measurement cursor detection result is received via the GUI on the control terminal 114 and transferred to the memory 203 (corresponding to step S3008). In S3309, the coordinate data of the measurement cursors with the correction data applied is saved in the storage device 206 (corresponding to step S3009), and the processing ends.
This first embodiment reduces the burden of the user operation of setting multiple measurement cursors and, by reducing the time required for recipe setting, increases the operating rate of the measurement apparatus.
In the second embodiment, recipe setting steps and measurement different from those of the first embodiment are performed. In the second embodiment, for the pattern images 301 and 501, whose hole patterns have the same designed size and shape, recipe setting is performed only on the pattern image 301, and measurement can also be performed on the pattern image 501, whose arrangement differs from that of the pattern image 301. In this case the measurement cursor coordinates set at recipe setting cannot be used, so at measurement time measurement cursors are set and measurement is performed each time a pattern is imaged. For this purpose, the template image and the parameters 3204 to 3208 on the GUI of FIG. 32 that were adjusted at recipe setting are used at measurement time.
The recipe setting processing steps are described with reference to FIG. 34. First, as in the first embodiment, the processing shown in FIG. 30 is performed from step S3001 to S3007, and the flow then proceeds to step S3401. In step S3401, the template image represented by the region 303 set in step S3003 is saved as recipe data in the storage device 206. The parameters of the template matching, array pattern detection, and integration process set in step S3007 are also saved as recipe data in the storage device 206, and the recipe setting ends.
The flow at measurement time in the second embodiment is described with reference to FIG. 35. In S3501, the recipe data saved in step S3401 is read from the storage device 206. Next, in S3502, the wafer 107 is moved by controlling the stage 106 to the imaging region of the image to be measured. In S3503, the measurement region is imaged and a pattern image is acquired. Next, in S3504, the measurement cursor detection process of FIG. 6 is performed on the acquired pattern image using the template image and parameters contained in the recipe data, and measurement cursors are set. In S3505, the measurement process is performed for each cursor region. In S3506, if there is a region that has not yet been measured, the flow returns to S3502; if all measurement target regions have been measured, the flow proceeds to S3507. In S3507, statistics of the measurement results are computed, the result data is output to the storage device 206, and the measurement ends.
With this second embodiment, by performing recipe setting for only one measurement region, regions on the same mask data that have the same single measurement pattern but a different arrangement can be measured without additional recipe setting. In addition, even for a wafer with mask data different from that of the measurement region for which the recipe was set, if the wafer is manufactured with the same materials and process and the measurement pattern is the same, its measurement regions can be measured without recipe setting.

This reduction of recipe setting time lightens the user's burden and increases the operating rate of the measurement apparatus.
The third embodiment, like the second embodiment, targets the pattern images 301 and 501, whose hole patterns have the same designed size and shape. For example, using the recipe data created for the pattern image 301 with the same processing flow as in the second embodiment, recipe setting as described in the first embodiment is performed for the pattern image 501. This allows the processing steps of FIG. 30 to be simplified.
FIG. 36 shows the flow of processing in this embodiment. First, for one measurement region (for example, the pattern image 301), the recipe setting steps are performed in the same way as the processing flow described with reference to FIG. 34 in the second embodiment, and the template image and parameter data are saved. Next, for a region of a pattern that has the same single measurement pattern but a different arrangement (for example, a region corresponding to the pattern image 501), recipe setting is performed according to the processing flow shown in FIG. 36.
That is, in step S3601 (corresponding to step S3001 of FIG. 30), a wafer image (for example, the pattern image 501) is acquired. Next, in step S3602, the recipe data (template image and parameter data) previously created and saved for one measurement region (for example, the pattern image 301) with the same processing flow as in the second embodiment is read. Next, in step S3603, the measurement cursors are detected (corresponding to step S3004), and in step S3604 the measurement cursor detection result of S3603 and the results of the intermediate processes, the template matching 604 and the array pattern detection 603, are output on the GUI (corresponding to step S3005).
In step S3605, the user can change the parameters of the array pattern detection 603, the template matching 604, and the integration process 607 via the GUI (corresponding to step S3006). In step S3606, the measurement cursors are detected again with the adjusted parameters, and the displays of the cursor detection result and the intermediate processing results are updated (corresponding to step S3007). If the user judges that the parameters still need to be corrected, the flow returns to step S3605 and the parameters are adjusted again.
If the measurement cursors have been detected at the locations to be measured with essentially no problems, the flow proceeds to step S3607. In step S3607, the user can add or delete measurement cursors and correct their positions with respect to the detection result as necessary (corresponding to step S3008). In step S3608, the imaging position and the relative coordinates of the measurement cursors with respect to it are saved as recipe data (corresponding to step S3009). Using this recipe data, the same pattern region as in the image 301 on another chip or a different wafer is measured. When the saving of the recipe data is completed in step S3608, this series of processes ends.
According to this embodiment, for a sample on which patterns of the same shape are formed in different arrangements, when a second region, in which patterns of the same shape as in a first region are formed in an arrangement different from that of the first region, is measured after the first region has been processed, the recipe data of the measurement cursor coordinates can be created by omitting steps S3002 and S3003 and performing the operations from step S3004 onward. Using this recipe data, measurement as shown in FIG. 31 can be performed in the same way as in the first embodiment. The step of designating the template region in step S3003 of the first embodiment can thus be omitted, and because the already adjusted parameters are reused, the parameter adjustment in step S3605 also requires only minimal correction.
According to this embodiment, by using the recipe data of the second embodiment, the recipe setting time of the first embodiment can be reduced, the burden on the user can be lightened, and the operating rate of the measurement apparatus can be increased.
100: length-measurement SEM, 102: electron gun, 103: condenser lens, 104: deflection coil, 105: objective lens, 106: stage, 107: measurement wafer (sample), 108: detector, 109: A/D converter, 110: image processing unit, 111: stage controller, 112: electron optical system control unit, 113: control unit for the entire apparatus, 114: control terminal, 200: input/output I/F, 201: image processing control unit, 202: arithmetic unit, 203: memory, 204: bus, 205: data input I/F, 206: storage device.

Claims (12)

1. A pattern dimension measurement method comprising:
imaging a plurality of patterns which are formed on a sample and originally have the same shape, to acquire an image of the plurality of patterns;
setting measurement cursors using an integration result obtained by integrating pattern information extracted from the image of the plurality of patterns by a template matching method with pattern information extracted using information on the periodicity of the arrangement of the plurality of patterns in the image of the plurality of patterns;
setting dimension measurement regions on the acquired image of the plurality of patterns using the set measurement cursors; and
processing, among the image of the plurality of patterns, the image of a pattern present in a dimension measurement region set using the measurement cursors, to measure a dimension of the pattern.
  2. The pattern dimension measurement method according to claim 1, wherein the images of the plurality of patterns are images acquired by imaging with an SEM.
  3. The pattern dimension measurement method according to claim 1, wherein the pattern information extracted from the images of the plurality of patterns by the template matching method is information on correlation coefficients obtained by computing a normalized correlation between a template image extracted from the images of the plurality of patterns and individual patterns in the images of the plurality of patterns.
  4. The pattern dimension measurement method according to claim 1, wherein the information on the periodicity of the arrangement of the plurality of patterns in the images of the plurality of patterns is information obtained by using a Fourier spectrum image computed by applying a discrete Fourier transform to the images of the plurality of patterns.
  5. The pattern dimension measurement method according to claim 1, wherein the information on the periodicity of the arrangement of the plurality of patterns in the images of the plurality of patterns is information obtained from an autocorrelation function computed for the images of the plurality of patterns.
  6. The pattern dimension measurement method according to claim 1, wherein the integration result is information obtained by multiplying correlation coefficients, obtained by computing a normalized correlation between a template image extracted from the images of the plurality of patterns and individual patterns in the images of the plurality of patterns, by weighting factors corresponding to the probability that a pattern exists at each position, the probability being based on period information of the patterns calculated from a Fourier spectrum image obtained by applying a discrete Fourier transform to the images of the plurality of patterns.
  7. An apparatus for measuring a dimension of a pattern, comprising:
    image acquisition means for capturing images of a plurality of patterns which are formed on a sample and originally have an identical shape, to acquire images of the plurality of patterns;
    measurement cursor setting means for extracting pattern information from the images of the plurality of patterns acquired by the image acquisition means by a template matching method, extracting pattern information by using information on the periodicity of the arrangement of the plurality of patterns in the images of the plurality of patterns, integrating the pattern information extracted by the template matching method with the pattern information extracted by using the periodicity information to obtain an integration result, and setting a measurement cursor by using the obtained integration result;
    dimension measurement region setting means for setting a dimension measurement region on the images of the plurality of patterns acquired by the image acquisition means by using the measurement cursor set by the measurement cursor setting means; and
    dimension measurement means for processing, among the images of the plurality of patterns acquired by the image acquisition means, an image of a pattern present in the dimension measurement region set by using the measurement cursor set by the measurement cursor setting means, to measure a dimension of the pattern.
  8. The pattern dimension measurement apparatus according to claim 7, wherein the image acquisition means includes an SEM, and the images of the plurality of patterns acquired by the image acquisition means are images acquired by imaging with the SEM.
  9. The pattern dimension measurement apparatus according to claim 7, wherein the pattern information extracted by the measurement cursor setting means from the images of the plurality of patterns by the template matching method is information on correlation coefficients obtained by computing a normalized correlation between a template image extracted from the images of the plurality of patterns and individual patterns in the images of the plurality of patterns.
  10. The pattern dimension measurement apparatus according to claim 7, wherein the pattern information extracted by the measurement cursor setting means by using the information on the periodicity of the arrangement of the plurality of patterns in the images of the plurality of patterns is information obtained by using a Fourier spectrum image computed by applying a discrete Fourier transform to the images of the plurality of patterns.
  11. The pattern dimension measurement apparatus according to claim 7, wherein the pattern information extracted by the measurement cursor setting means by using the information on the periodicity of the arrangement of the plurality of patterns in the images of the plurality of patterns is information obtained from an autocorrelation function computed for the images of the plurality of patterns.
  12. The pattern dimension measurement apparatus according to claim 7, wherein the integration result integrated by the measurement cursor setting means is information obtained by multiplying correlation coefficients, obtained by computing a normalized correlation between a template image extracted from the images of the plurality of patterns and individual patterns in the images of the plurality of patterns, by weighting factors corresponding to the probability that a pattern exists at each position, the probability being based on period information of the patterns calculated from a Fourier spectrum image obtained by applying a discrete Fourier transform to the images of the plurality of patterns.
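Claims 4 and 5 (and their apparatus counterparts, claims 10 and 11) allow the periodicity information to be obtained either from a Fourier spectrum image or from an autocorrelation function. A minimal sketch of the autocorrelation route is shown below, using the Wiener-Khinchin relation (inverse FFT of the power spectrum); the peak-search heuristic and the assumption of a single dominant pitch per axis are illustrative assumptions, not the claimed procedure.

```python
# Hedged sketch of the autocorrelation route to the pattern pitch: the circular
# autocorrelation of the image is computed as the inverse FFT of its power
# spectrum, and the first off-origin peak along each axis is taken as the
# pattern period. The peak-search heuristic is an illustrative assumption.
import numpy as np

def estimate_period(image: np.ndarray) -> tuple:
    img = image - image.mean()
    power = np.abs(np.fft.fft2(img)) ** 2
    acf = np.real(np.fft.ifft2(power))          # circular autocorrelation, origin at (0, 0)
    acf /= acf[0, 0]                            # normalize so the zero-lag value is 1

    def first_peak(profile: np.ndarray) -> int:
        # First local maximum after the zero-lag peak along one axis.
        half = profile[: len(profile) // 2]
        for lag in range(2, len(half) - 1):
            if half[lag] > half[lag - 1] and half[lag] >= half[lag + 1]:
                return lag
        return 0                                # no periodicity found along this axis

    return first_peak(acf[:, 0]), first_peak(acf[0, :])   # (row period, column period) in pixels
```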
PCT/JP2014/063488 2013-07-23 2014-05-21 Pattern dimension measurement method and device WO2015011974A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-152858 2013-07-23
JP2013152858A JP2015021942A (en) 2013-07-23 2013-07-23 Pattern dimension measurement method and device thereof

Publications (1)

Publication Number Publication Date
WO2015011974A1 true WO2015011974A1 (en) 2015-01-29

Family

ID=52393031

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/063488 WO2015011974A1 (en) 2013-07-23 2014-05-21 Pattern dimension measurement method and device

Country Status (3)

Country Link
JP (1) JP2015021942A (en)
TW (1) TW201508697A (en)
WO (1) WO2015011974A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024100896A1 (en) * 2022-11-11 2024-05-16 株式会社日立ハイテク Pattern length measurement/defect inspection method, image data processing system, and computer-readable recording medium

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022092727A (en) * 2020-12-11 2022-06-23 株式会社日立ハイテク Computer system for observation device and method for processing
CN115690200B (en) * 2022-12-30 2024-03-08 北京慕柏科技有限公司 Method, device, equipment and storage medium for matching perforated aluminum sheet with aluminum template

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07220077A (en) * 1994-01-28 1995-08-18 Toshiba Corp Pattern image processing device and image processing method
JP2004272313A (en) * 2003-03-05 2004-09-30 Ricoh Co Ltd Image analysis apparatus
JP2007121147A (en) * 2005-10-28 2007-05-17 Hitachi High-Technologies Corp Pattern matching device, and semiconductor inspection system using it
JP2008234455A (en) * 2007-03-22 2008-10-02 Hitachi Ltd Template matching apparatus and method
JP2009086920A (en) * 2007-09-28 2009-04-23 Hitachi High-Technologies Corp Inspection device and inspection method
JP2010186614A (en) * 2009-02-12 2010-08-26 Jeol Ltd Detection method of periodical pattern, and photographing method of sample image

Also Published As

Publication number Publication date
TW201508697A (en) 2015-03-01
JP2015021942A (en) 2015-02-02

Similar Documents

Publication Publication Date Title
US8953868B2 (en) Defect inspection method and defect inspection apparatus
US6614923B1 (en) Pattern inspecting method and apparatus thereof, and pattern inspecting method on basis of electron beam images and apparatus thereof
US8207499B2 (en) Variable rate scanning in an electron microscope
US6476388B1 (en) Scanning electron microscope having magnification switching control
US7932493B2 (en) Method and system for observing a specimen using a scanning electron microscope
JP4919988B2 (en) Circuit pattern inspection apparatus and circuit pattern inspection method
US9019362B2 (en) Charged particle beam device and a method of improving image quality of the same
JP2013246062A (en) Pattern inspection device and pattern inspection method
US7807980B2 (en) Charged particle beam apparatus and methods for capturing images using the same
JP5414215B2 (en) Circuit pattern inspection apparatus and circuit pattern inspection method
JP5651428B2 (en) Pattern measuring method, pattern measuring apparatus, and program using the same
WO2015011974A1 (en) Pattern dimension measurement method and device
KR101808470B1 (en) Pattern measurement device and computer program
US8263935B2 (en) Charged particle beam apparatus
CN111079730B (en) Method for determining area of sample graph in interface graph and electronic equipment
KR101677822B1 (en) Scanning-electron-microscope image processing device and scanning method
WO2019110572A1 (en) Systems and methods for tuning and calibrating charged particle beam apparatus
US8552371B2 (en) Method for adjusting imaging magnification and charged particle beam apparatus
US9329034B2 (en) Pattern determination device and computer program
CN107154365A (en) Inspection method and system and the method that semiconductor devices is checked using it
US20230052350A1 (en) Defect inspecting system and defect inspecting method
JP2011191244A (en) Evaluation method of optical unit
JP2012019220A (en) Circuit pattern inspection device and circuit pattern inspection method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14829705

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14829705

Country of ref document: EP

Kind code of ref document: A1