WO2022009357A1 - パターンマッチング装置、パターン測定システム、パターンマッチングプログラム - Google Patents
- Publication number
- WO2022009357A1 (application PCT/JP2020/026773)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- correlation
- pattern matching
- computer system
- shift amount
Classifications
- G06V10/7515—Shifting the patterns to accommodate for positional errors
- G06T7/32—Determination of transform parameters for the alignment of images, i.e. image registration using correlation-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06V10/60—Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
- G06V10/761—Proximity, similarity or dissimilarity measures
- G06V10/774—Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
- H01J37/22—Optical or photographic arrangements associated with the tube
- H01J37/222—Image processing arrangements associated with the tube
- H01J37/28—Electron or ion microscopes; Electron or ion diffraction tubes with scanning beams
- G06T2207/10061—Microscopic image from scanning electron microscope
- G06T2207/20081—Training; Learning
- G06T2207/20084—Artificial neural networks [ANN]
- G06T2207/20221—Image fusion; Image merging
- H01J2237/2817—Pattern inspection
Definitions
- This disclosure relates to a technique for performing pattern matching between images.
- A device that measures and inspects patterns formed on a semiconductor wafer often uses template matching technology to align the field of view of the inspection device with a desired measurement or inspection position.
- Template matching is a process of finding, within a searched image, the region that best matches a pre-registered template image.
- Patent Document 1 describes an example of such template matching.
- Patent Document 2 describes a method of creating a template for template matching based on the design data of a semiconductor device. If a template can be created from design data, there is the advantage that no image needs to be acquired with the inspection device just to create the template.
- Non-Patent Document 1 describes a learnable model that takes two images as input and outputs the transformation parameters (for example, the matching shift amount) between them. Training with teacher data makes highly accurate model learning possible, with the advantage that the input images need not be processed manually.
- Japanese Patent No. 4218171 (corresponding US Pat. No. 6,627,888); Japanese Patent No. 4199939 (corresponding US Pat. No. 7,235,782)
- The difference in appearance between the template image and the searched image becomes large for reasons such as the following: (a) the imaging conditions of the inspection device at template registration differ greatly from the imaging conditions when the searched image is captured; (b) the quality of the semiconductor pattern imaged at template registration differs greatly from the quality of the pattern when the searched image is captured.
- Patent Document 1 does not disclose how the matching process should be handled when a discrepancy occurs between the template image and the image to be searched.
- Patent Document 2 discloses creating a template close to the pattern of an actual image by smoothing the graphic data generated from design data and rounding the corners of the pattern. However, this makes it difficult to deal with arbitrary discrepancies between the template image and the image to be searched.
- Non-Patent Document 1 makes it possible to train a highly accurate model in a short time while absorbing the discrepancy between the two images.
- However, a repeating pattern contains a plurality of identical shapes within close proximity (for example, within the field of view), so there are multiple candidates for the correct matching position and it is difficult to identify the true one.
- The present disclosure has been made to solve this problem, and proposes a pattern matching device that realizes a matching process with a learning function, even for semiconductor patterns that include repeating patterns.
- The pattern matching device includes a learner that estimates a first correlation image whose pixel values are numerical values representing the correlation between a first image and a second image. The pattern matching device calculates a second correlation image whose pixel values are numerical values representing the correlation between a derivative image generated from the first image and the first image, and the learner learns so as to reduce the difference between the first correlation image and the second correlation image.
- According to the pattern matching apparatus of this disclosure, a matching process with a learning function can be realized even for semiconductor patterns that include repeating patterns.
- FIG. 1 shows a configuration example of the pattern matching device 100 according to the first embodiment.
- FIG. 2 shows a configuration example of the correlation image calculation unit 1202.
- FIG. 3 is a block diagram showing a configuration example of the learning model 301 provided in the correlation image estimation unit 1201.
- FIG. 4 shows an example of the GUI in which a user inputs a matching shift amount.
- FIG. 5 is a flowchart explaining the operation of the pattern matching device 100.
- FIG. 6 shows a configuration example of a pattern measurement system including the pattern matching device 100 and an SEM 600. FIG. 7 shows another configuration example of the pattern measurement system of FIG. 6.
- FIG. 8 shows a configuration example of the pattern matching device 100 according to the second embodiment.
- FIG. 9 shows a configuration example of the learning data generation unit 8206. FIG. 10 is a flowchart explaining the operation of the pattern matching device 100 according to the second embodiment. FIG. 11 is a flowchart explaining the operation of the pattern matching device 100 according to the third embodiment.
- FIG. 1 shows a configuration example of the pattern matching device 100 according to the first embodiment of the present disclosure.
- the pattern matching device 100 can be configured as an arithmetic unit that executes a pattern matching process.
- the pattern matching device 100 can be configured by a storage medium 110, a pattern matching system 120, and an input device 130.
- FIG. 1 shows a configuration example in which pattern matching processing is executed particularly by an arithmetic processing unit.
- The template image is the SEM image 1104 acquired by the measuring device.
- The searched image is the design drawing 1102 obtained from the design data.
- the template image is not limited to the SEM image, and may be another type of image, for example, a design drawing.
- the image to be searched is not limited to the design drawing, and other types of images such as SEM images may be used.
- a scanning electron microscope is used as an example of the measuring device.
- the SEM is used, for example, to measure the dimensions of a pattern of a semiconductor device formed on a semiconductor wafer. A specific configuration example of the SEM will be described later with reference to FIG.
- the pattern matching system 120 is composed of one or more computer subsystems including one or more CPUs (Central Processing Units) and GPUs (Graphics Processing Units).
- the pattern matching system 120 comprises one or more components executed by the one or more computer subsystems.
- One or more computer systems can realize the processing described later by software using one or more processors, or can realize a part or all of the processing by hardware such as an electronic circuit.
- the pattern matching system 120 executes a pattern matching process between the design drawing 1102 stored in the storage medium 1101 and the SEM image 1104 stored in the storage medium 1103. As a result of the pattern matching process, the estimated matching shift amount 1301 is output.
- the estimated matching shift amount 1301 represents the position shift amount or the position difference between the design drawing 1102 and the SEM image 1104.
- The estimated matching shift amount 1301 can be represented by a pair of scalar values, for example (shift amount in the X direction, shift amount in the Y direction).
- the pattern matching system 120 includes a correlation image estimation unit 1201, a correlation image calculation unit 1202, an estimation error calculation unit 1203, an estimation parameter update unit 1204, and a matching shift amount calculation unit 1205.
- the pattern matching system 120 is configured to receive input of various information from the input device 130.
- the correlation image estimation unit 1201 estimates the estimated correlation image 1211 between the design drawing 1102 and the SEM image 1104.
- the matching shift amount calculation unit 1205 calculates the estimated matching shift amount 1301 using the estimated correlation image 1211.
- A correlation image is an image in which the correlation values between the template image and the searched image are arranged over the entire searched image. More specifically, an image of the same size as the template image is cut out from the searched image, and the correlation value between the cut-out image and the template image is obtained. Cut-out images are obtained by raster-scanning the searched image (a sliding window).
- The correlation image is then the image in which the obtained correlation values are arranged as pixel values, one for each displacement (slide) in the X direction and in the Y direction.
- The correlation value is calculated to be higher the better the template image and the cut-out image match (it may also be designed to be lower instead). The pixel value is therefore large (high luminance) at positions where the correlation between the searched image and the cut-out image is high, and small at positions where the correlation is low.
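The sliding-window procedure described above can be illustrated with a short Python sketch. This is not part of the original disclosure: small grayscale NumPy arrays and normalized cross-correlation as the correlation measure are assumptions here.

```python
import numpy as np

def correlation_image(search: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Raster-scan (slide) the template over the searched image and store the
    normalized cross-correlation value for each (Y, X) displacement."""
    th, tw = template.shape
    out = np.zeros((search.shape[0] - th + 1, search.shape[1] - tw + 1))
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    for dy in range(out.shape[0]):
        for dx in range(out.shape[1]):
            patch = search[dy:dy + th, dx:dx + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            out[dy, dx] = (p * t).sum() / denom if denom > 0 else 0.0
    return out

# Toy data: the template is an exact crop of the searched image, so the
# highest-luminance pixel of the correlation image marks the crop position.
pattern = np.array([[0.0, 1.0, 0.0], [1.0, 2.0, 1.0], [0.0, 1.0, 0.0]])
search = np.zeros((8, 8))
search[2:5, 3:6] = pattern
corr = correlation_image(search, pattern)
peak = np.unravel_index(np.argmax(corr), corr.shape)
print(peak)  # (2, 3)
```

In practice a vectorized routine would be used; the explicit double loop is kept here only to mirror the raster-scan description.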
- the correlation image estimation unit 1201 is configured by a learner that takes the design drawing 1102 and the SEM image 1104 as inputs and estimates the estimated correlation image 1211 between the two input images.
- The correlation image estimation unit 1201 also receives the updated estimation parameters 1212 as input, and estimates the estimated correlation image 1211 using this learning model.
- the estimation parameter 1212 is appropriately updated by the estimation parameter updating unit 1204 and supplied to the correlation image estimation unit 1201.
- the learner can be configured, for example, by a neural network structure described with reference to FIG. 3 described later.
- the correlation image calculation unit 1202 takes the design drawing 1102 and the input matching shift amount 1302 as inputs, and calculates the calculation correlation image 1213 using them.
- the input matching shift amount 1302 can be input from the input device 130.
- the calculated correlation image 1213 can be used as teacher data when the learner performs learning.
- The estimation error calculation unit 1203 calculates the estimation error 1214 between the estimated correlation image 1211 and the calculated correlation image 1213. Specifically, the differences between each pixel value of the estimated correlation image 1211 estimated by the correlation image estimation unit 1201 and each pixel value of the calculated correlation image 1213, which is the teacher data, are summed by an error function; the error is calculated, for example, as the mean square error or the mean absolute error. The error function is not limited to these, and any error function capable of calculating the difference between images may be used.
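As a concrete illustration of the two error functions named above, a minimal sketch (assuming NumPy arrays of equal size, not the disclosure's implementation) of the mean square error and mean absolute error between an estimated image and a teacher image:

```python
import numpy as np

def mse(estimated: np.ndarray, teacher: np.ndarray) -> float:
    """Mean square error between two same-size images."""
    return float(((estimated - teacher) ** 2).mean())

def mae(estimated: np.ndarray, teacher: np.ndarray) -> float:
    """Mean absolute error between two same-size images."""
    return float(np.abs(estimated - teacher).mean())

est = np.array([[0.0, 1.0], [1.0, 0.0]])
tea = np.array([[0.0, 1.0], [0.0, 0.0]])
print(mse(est, tea), mae(est, tea))  # 0.25 0.25 (one pixel differs by 1)
```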
- the estimation parameter update unit 1204 adjusts the parameters of the learner in the correlation image estimation unit 1201 so that the estimation error of each pixel value in the estimation correlation image 1211 becomes small, and supplies the parameters to the correlation image estimation unit 1201.
- the matching shift amount calculation unit 1205 calculates the estimated matching shift amount 1301 from the estimated correlation image 1211.
- the following procedure can be used.
- the highest pixel value (highest luminance pixel) of the estimated correlation image 1211 is specified.
- The position of the highest-luminance pixel is the position where the design drawing 1102 and the SEM image 1104 best match, and thus represents the matching shift amount.
- The pixel values of the estimated correlation image 1211 are set, for example, by calculating the correlation value at each shift position while shifting the coordinates, with the upper-left corner of the estimated correlation image 1211 as the origin.
- On the other hand, the design drawing 1102 or the SEM image 1104 may use the center position as the origin.
- If the origin positions differ between the estimated correlation image 1211 and the design drawing 1102, or between the estimated correlation image 1211 and the SEM image 1104, a correction that aligns the origin positions is required.
- The matching shift amount between the design drawing 1102 and the SEM image 1104 can then be obtained from the position of the highest-luminance pixel after the origin correction.
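The peak-finding and origin-correction steps above can be sketched as follows. The center-origin convention shown is an illustrative assumption; the actual correction depends on which origin each image uses.

```python
import numpy as np

def matching_shift(corr: np.ndarray, origin: str = "top-left") -> tuple:
    """Return (shift_x, shift_y) of the correlation peak.

    'top-left' leaves the peak coordinates as-is; 'center' applies an origin
    correction so that a match at the image center yields (0, 0)."""
    peak_y, peak_x = np.unravel_index(np.argmax(corr), corr.shape)
    if origin == "center":
        peak_x -= corr.shape[1] // 2
        peak_y -= corr.shape[0] // 2
    return int(peak_x), int(peak_y)

corr = np.zeros((5, 5))
corr[3, 1] = 1.0  # highest-luminance pixel at row 3, column 1
print(matching_shift(corr))            # (1, 3) with the top-left origin
print(matching_shift(corr, "center"))  # (-1, 1) after origin correction
```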
- The correlation image estimation unit 1201 (the trained model) estimates the correlation image, and the matching shift amount calculation unit 1205 estimates the matching shift amount from that correlation image. In this way, a learning model that finds the matching shift amount can be realized even for semiconductor patterns that include repeating patterns.
- The learner is configured to estimate (a) a vertical correlation image whose pixel values are correlation values indicating the degree to which the design drawing 1102 and the SEM image 1104 match in the vertical direction, and (b) a horizontal correlation image whose pixel values are correlation values indicating the degree to which they match in the horizontal direction.
- the pattern matching system 120 can generate the estimated correlation image 1211 by synthesizing the vertical correlation image and the horizontal correlation image.
- The separation of the pattern information is not limited to horizontal and vertical edges; edges in any direction that allows stable learning may be used.
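The description above does not fix how the two directional correlation images are synthesized. As one hypothetical choice, an element-wise product keeps a high value only at positions that match in both directions:

```python
import numpy as np

vert = np.array([[0.1, 0.1], [0.9, 0.9]])   # matches well along the bottom row
horiz = np.array([[0.1, 0.9], [0.1, 0.9]])  # matches well along the right column
combined = vert * horiz                     # illustrative synthesis, not the disclosure's
print(combined)  # only the bottom-right position stays high (0.81)
```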
- FIG. 2 shows a configuration example of the correlation image calculation unit 1202.
- Image 211 is an example of design drawing 1102
- image 212 is an example of SEM image 1104.
- the input matching shift amount 1302 is a matching shift amount between the image 211 and the image 212 input by the user.
- the image cutting unit 201 cuts out the image 202 from the image 211.
- the image 202 is an image obtained by cutting out a region having the same size as the image 212 from the image 211 at the position designated by the input matching shift amount 1302.
- the correlation calculation unit 203 calculates the correlation image 204 between the image 202 and the image 211.
- The correlation calculation unit 203 calculates the correlation image 204 using, for example, a method such as normalized cross-correlation.
- The image 211 and the image 202 may be preprocessed so that the resulting correlation image is easy to learn.
- the method for calculating the correlation image 204 is not limited to these methods, and any method may be used as long as the correlation value of the matching correct answer position is calculated to be the highest (or the lowest).
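Putting the FIG. 2 flow together as a sketch (NumPy arrays, a particular shift convention, and NCC are assumptions, not the patent's exact implementation): cut a region out of the design drawing at the input matching shift (image cutting unit 201), then correlate that cut-out against the design drawing (correlation calculation unit 203). The peak of the resulting calculated correlation image sits at the input shift, which is what makes it usable as teacher data.

```python
import numpy as np

def calculated_correlation_image(design, shift, size):
    """Teacher image: correlate a cut-out of `design` (taken at `shift`,
    with the given (height, width) `size`) against `design` itself."""
    sx, sy = shift                             # input matching shift amount (X, Y)
    h, w = size
    cutout = design[sy:sy + h, sx:sx + w]      # image cutting unit 201
    c = cutout - cutout.mean()
    c_norm = np.sqrt((c ** 2).sum())
    out = np.zeros((design.shape[0] - h + 1, design.shape[1] - w + 1))
    for dy in range(out.shape[0]):             # correlation calculation unit 203
        for dx in range(out.shape[1]):
            p = design[dy:dy + h, dx:dx + w]
            p = p - p.mean()
            denom = np.sqrt((p ** 2).sum()) * c_norm
            out[dy, dx] = (p * c).sum() / denom if denom > 0 else 0.0
    return out

design = np.zeros((8, 8))
design[2:5, 3:6] = np.array([[0.0, 1.0, 0.0], [1.0, 2.0, 1.0], [0.0, 1.0, 0.0]])
teacher = calculated_correlation_image(design, shift=(3, 2), size=(3, 3))
peak = np.unravel_index(np.argmax(teacher), teacher.shape)
print(peak)  # (2, 3): the peak row/column equal the input shift (Y=2, X=3)
```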
- FIG. 3 is a block diagram showing a configuration example of the learning model 301 included in the correlation image estimation unit 1201.
- the learning model 301 can be configured, for example, by a convolutional neural network.
- the learning model 301 adjusts parameters (such as connection weights and biases between neurons) so that the difference between the computational correlation image 1213 and the estimated correlation image 1211 is small.
- the learning process is carried out by this parameter adjustment. Learning can be performed, for example, by sequentially updating the parameters by the backpropagation method.
- The estimation parameter update unit 1204 calculates how the error between the output data (estimated correlation image 1211) and the teacher data (calculated correlation image 1213) changes with respect to each parameter (that is, the gradient).
- the estimation parameter update unit 1204 updates the parameters little by little according to the amount of change, and adjusts the parameters so as to obtain the optimum output.
- the learning model 301 is composed of an input layer 311 and an output layer 316, and a plurality of intermediate layers 312, 313, 314, and 315.
- a design drawing 1102 (searched image) and an SEM image 1104 (template image) to be input images are input to the input layer 311.
- The data in the layer is aggregated by convolution operations with predetermined coefficient filters and by image reduction.
- the intermediate layer 313 stores data in which the design drawing 1102 and the SEM image 1104 are aggregated.
- Correlation data between the aggregated data of the design drawing 1102 and the aggregated data of the SEM image 1104 is then calculated. This correlation data is stored in the intermediate layer 314.
- The data in the layer is then expanded by convolution operations with predetermined coefficient filters and by image enlargement.
- The data in the output layer 316 is the estimated correlation image 1211 between the design drawing 1102 and the SEM image 1104.
- the estimation error calculation unit 1203 calculates the error between the estimation correlation image 1211 and the calculation correlation image 1213.
- the estimation parameter update unit 1204 updates the parameters (weights and biases) of each layer by the error back propagation method using the error.
- A model that takes the two images to be matched as input and outputs a correlation image can thus be trained end to end (learning that directly learns the input/output relationship of the target task).
- With end-to-end learning, data that brings the template image and the searched image close to each other can be aggregated automatically in the intermediate layer 313.
- The intermediate layer 315 can automatically select, from the correlation data stored in the intermediate layer 314, the data necessary for generating the estimated correlation image 1211. Even if an excessive pattern (a pattern not shown in the design drawing but present in the SEM image) appears in the SEM image, there is the effect that the estimated correlation image 1211 is stably kept close to the calculated correlation image 1213.
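The FIG. 3 data flow can be sketched schematically in plain NumPy. The real learning model 301 is a convolutional network with trainable filters; the 2x2 average pooling, element-wise product, and nearest-neighbor upsampling below are illustrative stand-ins for the aggregation, correlation, and expansion stages.

```python
import numpy as np

def aggregate(img: np.ndarray) -> np.ndarray:
    """Aggregation stage (layers 312-313): reduce the image, here 2x2 mean pooling."""
    h, w = img.shape
    return img[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def correlate(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Correlation stage (layer 314): element-wise product as a minimal stand-in."""
    return a * b

def expand(img: np.ndarray) -> np.ndarray:
    """Expansion stage (layers 315-316): enlarge back to the input size."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

design = np.random.default_rng(0).random((8, 8))   # searched image
sem = np.random.default_rng(1).random((8, 8))      # template image
estimated = expand(correlate(aggregate(design), aggregate(sem)))
print(estimated.shape)  # (8, 8): same size as the inputs
```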
- a plurality of images may be input in each of the template image and the searched image.
- the multi-channel image is, for example, (a) an image captured by changing the imaging conditions for the same imaging target, (b) an image captured by a different detector as described later in FIG. 6, and (c) detection.
- the multi-channel image is not limited to these, and may be an image that easily absorbs the dissociation between the input images.
- Auxiliary information may also be input to the learning model via one channel of the input image so that the contrast of the estimated correlation image 1211 (the difference between the correlation value at the correct matching position and the correlation values at other positions) becomes large.
- Examples of the auxiliary information include weighted images that increase the luminance value of the region of interest for matching. By inputting such information, it is possible to emphasize the data in the region of interest, reduce the importance of the data in the other regions, and obtain a correlated image with higher contrast.
- The auxiliary information is not limited to these; any information that facilitates estimating a correlation image with high contrast may be used.
- FIG. 4 is a diagram showing an example of a GUI (Graphical User Interface) in which a user inputs a matching shift amount.
- the GUI 401 illustrated in FIG. 4 can be displayed on the input device 130 shown in FIG. 1, for example.
- the user can input the input matching shift amount 1302 required for learning by the correlation image estimation unit 1201 by the GUI 401.
- the GUI 401 has an image display area 402, an image transparency setting area 403, a matching shift amount input area 404, and a setting button 405.
- the image display area 402 displays the design drawing 421, the SEM image 422, and the cursor 423.
- the design drawing 421 and the SEM image 422 are displayed as overlays.
- the user can move the SEM image 422 to a position matching the design drawing 421 by the cursor 423.
- the amount of movement of the SEM image 422 corresponds to the amount of matching shift.
- the matching shift amount that changes by moving the cursor 423 is reflected in the matching shift amount setting frame 441 in real time.
- the user can also directly input the matching shift amount by the matching shift amount setting frame 441 in the matching shift amount input area 404.
- the input matching shift amount is reflected by the SEM image 422 moving relative to the design drawing 421.
- In the image transparency setting area 403, the user can enter the transparency (intensity) of the design drawing 421 and the SEM image 422 into the frame 431 so that the matching result can be confirmed easily.
- the matching shift amount is supplied to the correlation image calculation unit 1202 as the input matching shift amount 1302.
- Although a method of inputting the matching shift amount has been described with reference to FIG. 4, the input method is not limited to this; any method capable of inputting the matching shift amount may be used.
- FIG. 5 is a flowchart illustrating the operation of the pattern matching device 100. Each step of FIG. 5 will be described below.
- (Steps S501 to S502) The pattern matching system 120 acquires the learning data (design drawing 1102, SEM image 1104) stored in the storage medium (S501).
- Using the GUI 401 illustrated in FIG. 4, the user manually aligns the design drawing 1102 and the SEM image 1104 and inputs the matching shift amount between them (S502).
- the correlation image calculation unit 1202 receives the design drawing 1102 and the input matching shift amount 1302, and calculates the calculation correlation image 1213.
- the correlation image estimation unit 1201 receives the design drawing 1102 and the SEM image 1104, and generates the estimation correlation image 1211.
- the estimation error calculation unit 1203 uses an error function to calculate the difference between the estimation correlation image 1211 and the calculation correlation image 1213, that is, the estimation error 1214 of the correlation image estimation unit 1201.
- The estimation parameter update unit 1204 calculates the changes to the weights and biases of the neural network by back-propagating the estimation error 1214, and updates their values. Learning is performed by repeating the above estimation and back-propagation one or more times.
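The repeated estimate-then-backpropagate loop can be illustrated with a toy sketch in which the trainable parameters are, trivially, the output image itself (the real model is the neural network of FIG. 3; this stand-in only shows the update mechanics).

```python
import numpy as np

teacher = np.array([[0.0, 1.0], [1.0, 0.0]])   # calculated correlation image (teacher data)
params = np.zeros_like(teacher)                # estimation parameters
lr = 0.5                                       # learning rate

for _ in range(20):
    estimated = params                               # forward pass (estimation)
    grad = 2 * (estimated - teacher) / teacher.size  # gradient of the MSE error
    params = params - lr * grad                      # parameter update
print(float(((params - teacher) ** 2).mean()))  # the error shrinks toward 0
```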
- the correlation image estimation unit 1201 estimates the estimated correlation image 1211 using the trained model (S505).
- the matching shift amount calculation unit 1205 calculates the estimated matching shift amount 1301 from the estimated correlation image 1211 (S506).
- FIG. 6 shows a configuration example of a pattern measurement system including the pattern matching device 100 and an SEM 600.
- The SEM 600 measures, for example, the pattern dimensions of a semiconductor device formed on a semiconductor wafer 603.
- the arithmetic processing unit or computer system in the pattern measurement system can be configured as, for example, a control unit 614.
- The control unit 614 includes calculation means (for example, a CPU/GPU 616) and storage means (for example, a memory including an image memory 615). The storage means stores information; for example, a program related to the pattern matching process is stored there.
- When the CPU/GPU 616 executes this program, the pattern matching process shown in FIG. 1 is performed. That is, the control unit 614 functions as the pattern matching device 100. In other words, this program causes the computer system to function as the arithmetic processing device included in the pattern matching device 100 and to execute the pattern matching process shown in FIG. 1.
- The SEM 600 generates an electron beam from the electron gun 601.
- the deflector 604 and the objective lens 605 are controlled so that the electron beam is focused and emitted at an arbitrary position on the semiconductor wafer 603, which is a sample placed on the stage 602.
- Secondary electrons are emitted from the semiconductor wafer 603 irradiated with the electron beam and detected by the secondary electron detector 606.
- the detected secondary electrons are converted into a digital signal by the A / D converter 607.
- the image represented by the digital signal is stored in the image memory 615 in the control unit 614.
- This image is used, for example, as an SEM image 1104, and based on this image, the learning process shown in the pattern matching process shown in FIG. 1 is performed by the control unit 614 or the CPU / GPU 616.
- the setting process required for these processes and the display of the process result can be performed by the input device 130.
- the optical camera 611 may be used.
- The signal obtained by imaging the semiconductor wafer 603 with the optical camera 611 is converted into a digital signal by the A/D converter 612 (when the signal from the optical camera 611 is already a digital signal, conversion by the A/D converter 612 is unnecessary).
- the image represented by the digital signal is stored in the image memory 615 in the control unit 614, and the CPU / GPU 616 performs image processing according to the purpose.
- The SEM 600 may include a backscattered electron detector 608.
- When the backscattered electron detector 608 is provided, backscattered electrons emitted from the semiconductor wafer 603 are detected by the backscattered electron detector 608, and the detected signal is converted into a digital signal by the A/D converter 609 or 610.
- the image represented by the digital signal is stored in the image memory 615 in the control unit 614, and the CPU / GPU 616 performs image processing according to the purpose.
- a storage means 621 may be provided separately from the image memory 615.
- the control unit 614 may control the stage 602 via the stage controller 630, or may control the objective lens 605 or the like via the deflection control unit 631.
- The SEM 600 is shown as an example of an inspection device used together with the pattern matching device 100, but the devices that can be used together with the pattern matching device 100 are not limited to this. Any device (measuring device, inspection device, etc.) that acquires an image and performs pattern matching processing can be used together with the pattern matching device 100.
- FIG. 7 shows another configuration example of the pattern measurement system of FIG.
- the configuration example of FIG. 7 may be understood as another expression for the same configuration as that of FIG.
- The pattern measurement system includes the SEM main body 701, a control device 702 that controls the SEM main body 701, an arithmetic processing device 704 that executes the pattern matching process of FIG. 1, a design data storage medium 705 that stores design data, and an input device 130 for inputting information required by the arithmetic processing device 704.
- The arithmetic processing device 704 includes arithmetic means (for example, an arithmetic processing unit 707) and storage means (for example, a memory 708). The storage means stores information; for example, a program related to the pattern matching process is stored there.
- When the arithmetic processing unit 707 executes this program, the pattern matching process shown in FIG. 1 is performed. That is, the arithmetic processing device 704 functions as the pattern matching device 100. In other words, this program causes the computer system to function as the arithmetic processing device included in the pattern matching device 100 and to execute the pattern matching process shown in FIG. 1.
- The arithmetic processing unit 707 includes a recipe creation unit 711 that sets template conditions, a matching processing unit 712 that executes pattern matching processing based on the set template, and a pattern measurement unit 710 that executes measurement processing at the measurement position identified by the matching processing unit 712.
- An SEM image (corresponding to the SEM image 1104 in FIG. 1) is generated based on the detection signal.
- The SEM image is sent to the arithmetic processing device 704 as the searched image for the matching processing unit 712 and as a measurement signal for the pattern measurement unit 710.
- The control device 702 and the arithmetic processing device 704 are shown here as separate devices, but they may be integrated into a single device.
- The electron-based signal captured by the detector 703 is converted into a digital signal by the A/D converter built into the control device 702. Based on this digital signal, image processing according to the purpose is performed by the image processing hardware (CPU, GPU, ASIC, FPGA, etc.) built into the arithmetic processing device 704.
- the recipe creation unit 711 includes a cutting unit 713.
- the cutting unit 713 reads design data from the design data storage medium 705 and cuts out a part thereof. The portion cut out from the design data is determined based on pattern identification data such as coordinate information set from the input device 130.
- the recipe creation unit 711 creates pattern data to be used for matching based on the cut out design data (layout data).
- the pattern data created here can be used as the design data 104 in FIG.
- the processing in the matching processing unit 712 is as described with reference to FIG. Design data, recipe information, image information, measurement results, etc. are stored in the memory 708.
- the input device 130 also functions as an imaging recipe creating device and creates an imaging recipe.
- The imaging recipe represents measurement conditions and includes, for example, the coordinates of the electronic devices required for measurement and inspection, their pattern types, and the imaging conditions (optical conditions and stage movement conditions).
- The input device 130 may have a function of collating the input coordinate information and the information regarding the pattern type with the layer information of the design data or the identification information of the patterns, and reading the necessary information from the design data storage medium 705.
- the design data stored in the design data storage medium 705 can be expressed in any format, but can be expressed in, for example, the GDS format or the OASIS format.
- Appropriate software for displaying design data can display design data in its various formats, or handle it as graphic data.
- The graphic data may be line-segment image information representing the ideal shape of a pattern formed based on the design data, or line-segment image information that has been subjected to deformation processing, for example by exposure simulation, so as to approximate the actual pattern.
- A program for performing the process described with reference to FIG. 1 may be registered in a storage medium, and the program may be executed by a control processor that has an image memory and supplies the necessary signals to the scanning electron microscope.
- FIG. 8 shows a configuration example of the pattern matching device 100 according to the second embodiment of the present disclosure. Since the same components as those in the first embodiment (FIG. 1) are designated by the same reference numerals in FIG. 8, duplicate description will be omitted below.
- The pattern matching device 100 according to this embodiment does not require manual input of a matching shift amount or imaging of SEM images when generating training data, and the learner can be trained offline without depending on the scanning microscope.
- the pattern matching system 120 includes a learning data generation unit 8206 in addition to the configuration described with reference to FIG.
- the learning data generation unit 8206 is configured to generate a pseudo template image (pseudo SEM image 8216) necessary for learning and a generation matching shift amount 8215 (described later) from the image to be searched (design drawing 1102).
- the input of the learning data generation unit 8206 is not limited to the design drawing, and may be another type, for example, an SEM image.
- FIG. 9 shows a configuration example of the learning data generation unit 8206.
- Image 901 is an example of the design drawing 1102.
- The matching shift amount generation unit 904 generates a matching shift amount and outputs it as the generated matching shift amount 903.
- One example of the generation method is to randomly generate the shift amount within the possible range of the matching shift amount.
- However, the generation method is not limited to this; any method capable of generating a matching shift amount may be used.
- the generated matching shift amount 903 and the image 901 are input to the image cutting unit 905, and the image cutting unit 905 cuts out an image 906 having the same size as the template image.
- the pseudo SEM image generation unit 907 generates a pseudo SEM image 902 from the image 906.
- the image quality of the generated pseudo SEM image is adjusted by the image style 908 (contrast, noise, pattern deformation, etc.).
- The image style 908 may be input by the user or may be generated randomly.
- The pseudo SEM image generation unit 907 can be configured by, for example, a trained model that converts a design drawing into an image, a simulator that converts a design drawing into an SEM image, or the like.
- However, the pseudo SEM image generation unit 907 is not limited to these, and may be anything that can generate a pseudo image whose image quality can be adjusted.
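A minimal sketch of this training-data generation step follows, under the assumption that the style adjustment is a simple contrast-and-noise model; as noted above, unit 8206 may instead use a trained design-to-image model or a SEM simulator, so the names and the degradation model here are illustrative only.

```python
import numpy as np

def make_training_pair(design, tmpl_size, rng, contrast=1.0, noise_sigma=0.05):
    """Sketch of the learning-data generation unit 8206: draw a random
    matching shift within its possible range, cut a template-sized patch
    from the design image, and degrade it into a pseudo SEM image."""
    h, w = tmpl_size
    ty = rng.integers(0, design.shape[0] - h + 1)   # generated matching shift
    tx = rng.integers(0, design.shape[1] - w + 1)
    patch = design[ty:ty + h, tx:tx + w].astype(float)
    pseudo_sem = contrast * patch + rng.normal(0.0, noise_sigma, size=(h, w))
    return (ty, tx), pseudo_sem

rng = np.random.default_rng(1)
design = rng.random((64, 64))
shift, pseudo = make_training_pair(design, (16, 16), rng)
```

The returned pair (generated shift, pseudo SEM image) plays the role of the generated matching shift amount 8215 and the pseudo SEM image 8216.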
- Using the pseudo SEM image 8216 generated by the learning data generation unit 8206 and the design drawing 1102, the learning model of the correlation image estimation unit 1201 is trained as described in the first embodiment.
- the correlation image calculation unit 1202 generates the calculation correlation image 1213 from the generated matching shift amount 8215 and the design drawing 1102.
- After training is complete, the estimated correlation image 1211 between the design drawing 1102 to be matched and the SEM image 1104 is estimated.
- the matching shift amount calculation unit 1205 calculates the estimated matching shift amount 1301 from the estimated correlation image 1211.
- By using the learning data generation unit 8206 at the learning stage, training can be performed with only the design drawing 1102, and can therefore be performed offline. Furthermore, by training with style-adjustable pseudo SEM images, a highly versatile model can be learned.
- FIG. 10 is a flowchart illustrating the operation of the pattern matching device 100 according to the second embodiment. Each step of FIG. 10 will be described below.
- the pattern matching system 120 acquires the data (design drawing 1102) stored in the storage medium, and the learning data generation unit 8206 generates the learning data.
- the training data is composed of a pseudo SEM image 8216 and a generated matching shift amount 8215.
- the correlation image calculation unit 1202 receives the design drawing 1102 and the generated matching shift amount 8215, and calculates the calculation correlation image 1213.
- the correlation image estimation unit 1201 receives the design drawing 1102 and the pseudo SEM image 8216, and generates the estimation correlation image 1211.
- the estimation error calculation unit 1203 uses an error function to calculate the difference between the estimation correlation image 1211 and the calculation correlation image 1213, that is, the estimation error 1214 of the correlation image estimation unit 1201.
- the estimation parameter update unit 1204 calculates changes in the weight and bias of the neural network by back-propagating the estimation error 1214, and updates the values. Learning is performed by repeating the above estimation and back propagation at least once.
- the pattern matching system 120 acquires the design drawing 1102 and the SEM image 1104 (S1004).
- the acquired image is input to the correlation image estimation unit 1201.
- the correlation image estimation unit 1201 estimates the estimated correlation image 1211 between the input images (S1005).
- the matching shift amount calculation unit 1205 calculates the estimated matching shift amount 1301 from the estimated correlation image 1211 (S1006).
- In the pattern matching device 100 according to the second embodiment, the learning data generation unit 8206 automatically generates the learning data, which spares the user the work of inputting matching shift amounts and of acquiring images with the scanning microscope, and makes it possible to train the learner without depending on the scanning microscope.
- the pattern matching device 100 capable of outputting the matching shift amount of each layer in the matching of the design data having a plurality of layers and the SEM image will be described. Since the configuration of the pattern matching device 100 is the same as that of the first and second embodiments, the matters related to the matching shift amount of each layer and the like will be mainly described below.
- The imaged SEM image may have an inter-layer shift with respect to the design drawing.
- Overlay measurement measures this displacement between layers: for example, the matching shift amount of the upper-layer pattern and that of the lower-layer pattern are measured, and their difference is used as the inter-layer displacement.
- the design drawing 1102 is a design drawing including design information of each of the upper and lower layers.
- the SEM image 1104 is an SEM image corresponding to a part of the design drawing 1102. In the design drawing 1102, the lower layer pattern hidden in the upper layer pattern may be deleted in advance.
- the design drawing 1102 and the SEM image 1104 are input to the correlation image estimation unit 1201 to estimate the estimated correlation image 1211.
- the estimated correlation image 1211 holds the correlation image of the upper layer and the correlation image of the lower layer, for example, in the form of a multi-channel image.
- the correlation image calculation unit 1202 calculates the calculation correlation image 1213 which is the teacher data.
- the calculation correlation image 1213 holds the calculation correlation image of the upper layer and the calculation correlation image of the lower layer in the same manner as the estimation correlation image 1211.
- the calculation correlation image of the upper layer is calculated using the design drawing of the upper layer and the matching shift amount of the upper layer as described in the first embodiment.
- the matching shift amount of the upper layer is input from the input device 130 as described with reference to FIG.
- the calculation correlation image of the lower layer is calculated using the design drawing of the lower layer and the matching shift amount of the lower layer as described in the first embodiment.
- the matching shift amount of the lower layer is input from the input device 130 as described with reference to FIG. If the lower layer pattern below the upper layer pattern cannot be seen on the SEM image, the lower layer pattern hidden in the upper layer pattern may be deleted in advance on the design drawing.
- the learning model that brings the estimated correlation image 1211 closer to the calculated correlation image 1213 is learned by repeating the estimation and the back propagation one or more times.
- the estimated correlation image 1211 between the design drawing 1102 to be matched and the SEM image 1104 is estimated.
- the estimated correlation image 1211 includes an upper layer estimated correlation image and a lower layer estimated correlation image.
- the matching shift amount calculation unit 1205 calculates the estimated matching shift amount 1301 from the estimated correlation image 1211.
- the estimated matching shift amount 1301 includes an upper layer matching shift amount and a lower layer matching shift amount.
- the upper layer matching shift amount is calculated from the estimated upper layer correlation image.
- the lower layer matching shift amount is calculated from the estimated lower layer correlation image.
- As a result, in the subsequent measurement process, overlay measurement can be performed, or the measurement position can be adjusted with high accuracy.
- learning data may be generated using the learning data generation unit 8206 described in the second embodiment.
- the learning data generation unit 8206 generates a pseudo SEM image 8216 and a generation matching shift amount 8215 from the design drawing 1102.
- When the learning data generation unit 8206 generates the pseudo SEM image 8216, the upper-layer pattern and the lower-layer pattern are first shifted slightly relative to each other.
- A pseudo SEM image 8216 is then generated as described with reference to FIG. 9.
- the amount of deviation between the upper and lower layer patterns is added to the generated matching shift amount to obtain the upper layer matching shift amount and the lower layer matching shift amount.
- the correlation image calculation unit 1202 calculates the calculation correlation image 1213 from the generated matching shift amount 8215 and the design drawing 1102.
- In this case as well, training can be performed with only the design drawing 1102, and can therefore be performed offline. Furthermore, by training with the style-adjustable pseudo SEM image 8216, a highly versatile model can be learned.
- FIG. 11 is a flowchart illustrating the operation of the pattern matching device 100 according to the third embodiment. Each step of FIG. 11 will be described below.
- Step S1101 Part 1
- the pattern matching system 120 acquires learning data (design drawing 1102, SEM image 1104) stored in the storage medium.
- The user inputs the upper-layer matching shift amount and the lower-layer matching shift amount by manually matching the design drawing and the SEM image in the GUI 401 illustrated in FIG. 4. The design drawing, the SEM image, and the matching shift amounts are prepared as training data.
- Step S1101 Part 2
- When the learning data generation unit 8206 is used, the data (design drawing 1102) stored in the storage medium is acquired.
- The learning data generation unit 8206 generates a pseudo SEM image and a generated matching shift amount.
- The design drawing, the pseudo SEM image, and the generated matching shift amount are prepared as training data.
- the correlation image calculation unit 1202 receives the upper layer portion of the design drawing 1102 and the upper layer matching shift amount, and calculates the upper layer calculation correlation image (S1102).
- the correlation image calculation unit 1202 receives the lower layer portion of the design drawing 1102 and the lower layer matching shift amount, and calculates the lower layer calculation correlation image (S1103).
- The correlation image estimation unit 1201 receives the design drawing 1102 and the SEM image 1104 (the pseudo SEM image when the image generation unit is used), and generates an estimated correlation image 1211 including the upper-layer estimated correlation image and the lower-layer estimated correlation image.
- the estimation error calculation unit 1203 uses an error function to calculate the difference between the estimation correlation image 1211 and the calculation correlation image 1213, that is, the estimation error 1214 of the correlation image estimation unit 1201.
- the estimation parameter update unit 1204 calculates changes in the weight and bias of the neural network by back-propagating the estimation error 1214, and updates the values. Learning is performed by repeating the above estimation and back propagation at least once.
- Steps S1105 to S1106: After the learning is completed, the pattern matching system 120 acquires the design drawing 1102 and the SEM image 1104 (S1105). The acquired design drawing and SEM image are input to the trained model (S1106).
- the correlation image estimation unit 1201 estimates the upper layer estimation correlation image using the trained model (S1107).
- the matching shift amount calculation unit 1205 calculates the upper layer matching shift amount from the upper layer estimated correlation image (S1108).
- the correlation image estimation unit 1201 estimates the lower layer estimation correlation image using the trained model in the same manner as the upper layer pattern (S1109).
- the matching shift amount calculation unit 1205 calculates the lower layer matching shift amount from the lower layer estimated correlation image (S1110).
- the pattern matching system 120 can calculate the above-mentioned displacement between layers by calculating the difference (vector difference) between the calculated upper layer matching shift amount and the lower layer matching shift amount.
- the upper layer matching shift amount and the lower layer matching shift amount may be used as measurement positions in the subsequent measurement step without calculating the displacement between layers.
- By calculating the matching shift amount of each of the upper and lower layers, the pattern matching device 100 according to the third embodiment has the effect of enabling, in the subsequent measurement process, for example overlay measurement or high-accuracy adjustment of the measurement position.
- In this embodiment, the matching shift amount is calculated for each of the upper and lower layers.
- The matching shift amount can be calculated for each layer in the same manner, and the positional deviation between layers can be calculated by obtaining the vector difference between the respective matching shift amounts.
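The per-layer combination described above amounts to a vector difference between the two matching shift amounts; a trivial sketch (the (dy, dx) tuple convention is an assumption):

```python
def interlayer_overlay(upper_shift, lower_shift):
    """Overlay (inter-layer positional deviation) as the vector difference
    between the upper-layer and lower-layer matching shift amounts."""
    return (upper_shift[0] - lower_shift[0], upper_shift[1] - lower_shift[1])
```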
- The correlation image calculation unit 1202 obtains a correlation image (calculation correlation image 1213) between the design drawing 1102 and the image generated by cutting out the design drawing 1102. This is most desirable when accuracy as teacher data is required. On the other hand, teacher data of various accuracies may also be required. In such cases, instead of cutting out the design drawing 1102, or in combination with the cutout, the following correlation images may be obtained and used as training data. When used in combination, the following correlation images may be used as a part of the training data.
- A case where 1102 is an SEM image and 1104 is also an SEM image is conceivable.
- In that case, the SEM image 1104 is an SEM image acquired by a means different from that used for the SEM image 1102 (for example, an SEM image taken under imaging conditions different from those of the SEM image 1102).
- The correlation image between a design drawing at the same position as the SEM image 1102 and a design drawing of the same type at the same position as the SEM image 1104 is calculated as the calculation correlation image 1213.
- In this way, even when there are multiple types of SEM images, the criterion for calculating the calculation correlation image 1213 can be unified.
- 100: Pattern matching device, 110: Storage medium, 1102: Design drawing, 1104: SEM image, 120: Pattern matching system, 1201: Correlation image estimation unit, 1202: Correlation image calculation unit, 1204: Estimation parameter update unit, 1205: Matching shift amount calculation unit, 1206: Estimation error calculation unit, 130: Input device, 8206: Learning data generation unit
Description
FIG. 1 shows a configuration example of the pattern matching device 100 according to the first embodiment of the present disclosure. The pattern matching device 100 can be configured as an arithmetic device that executes pattern matching processing, and can be composed of a storage medium 110, a pattern matching system 120, and an input device 130.
The pixel having the highest value among the pixel values of the estimated correlation image 1211 (the brightest pixel) is identified. The position of the brightest pixel is the position at which the design drawing 1102 and the SEM image 1104 best match, and therefore represents the matching shift amount. However, the following origin correction is required. The pixel values of the estimated correlation image 1211 are set by computing the correlation value at each shift position while shifting the coordinates, with the origin at, for example, the upper-left corner of the estimated correlation image 1211. On the other hand, the design drawing 1102 or the SEM image 1104 may have its origin at the image center. When the origin positions differ between the estimated correlation image 1211 and the design drawing 1102, or between the estimated correlation image 1211 and the SEM image 1104, a correction that aligns the origins is required in this way. From the position of the brightest pixel after origin correction, the matching shift amount between the design drawing 1102 and the SEM image 1104 can be obtained.
Instead of the pixel having the highest luminance, all pixels whose values are equal to or greater than a threshold may be identified, and the one among them closest to a designated position in the image (for example, the image center) may be used to obtain the matching shift amount. When origin correction is necessary, it is carried out in the same manner as in the first method above.
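Both variants above (the brightest pixel, or the nearest above-threshold pixel), together with the origin correction, can be sketched as follows; the function signature is illustrative, not the device's actual interface.

```python
import numpy as np

def shift_from_correlation(corr, origin=(0, 0), threshold=None, ref=None):
    """Matching shift from an estimated correlation image: either the
    brightest pixel, or (if `threshold` is given) the above-threshold
    pixel nearest to a reference position such as the image center.
    `origin` is subtracted to reconcile differing coordinate origins."""
    if threshold is None:
        y, x = np.unravel_index(np.argmax(corr), corr.shape)
    else:
        ys, xs = np.nonzero(corr >= threshold)
        if ref is None:
            ref = (corr.shape[0] // 2, corr.shape[1] // 2)
        d2 = (ys - ref[0]) ** 2 + (xs - ref[1]) ** 2
        i = int(np.argmin(d2))
        y, x = ys[i], xs[i]
    return (int(y) - origin[0], int(x) - origin[1])
```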
The present disclosure is not limited to the embodiments described above and includes various modifications. For example, the above embodiments have been described in detail in order to explain the present disclosure clearly, and the disclosure is not necessarily limited to configurations having all of the described elements. Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of one embodiment can be added to the configuration of another embodiment. For part of the configuration of each embodiment, other configurations can be added, deleted, or substituted.
A correlation image between a design drawing of the same type as the design drawing 1102 at the same position as the SEM image 1104, and the design drawing 1102, may be calculated as the calculation correlation image 1213. In this case, the user is spared the work of inputting the matching shift amount.
When 1102 is an SEM image and 1104 is an SEM image (acquired by a means different from that used for the SEM image 1102), a correlation image between an SEM image of the same type as the SEM image 1104 at the same position as the SEM image 1102 (for example, a pseudo SEM image generated with an image generation tool or the like) and the SEM image 1104 may be calculated as the calculation correlation image 1213. In this case, the work of preparing a design drawing can be saved.
When 1102 is an SEM image and 1104 is an SEM image (acquired by a means different from that used for the SEM image 1102), a correlation image between an SEM image of the same type as the SEM image 1102 at the same position as the SEM image 1104 (for example, a pseudo SEM image generated with an image generation tool or the like) and the SEM image 1102 may be calculated as the calculation correlation image 1213. In this case as well, the work of preparing a design drawing can be saved.
When 1102 is an SEM image and 1104 is a design drawing, a correlation image between a design drawing of the same type as the design drawing 1104 at the same position as the SEM image 1102, and the design drawing 1104, may be calculated as the calculation correlation image 1213. In this case, even when there are multiple types of SEM images, there is the advantage that the criterion for calculating the calculation correlation image 1213 can be unified.
When 1102 is an SEM image and 1104 is a design drawing, a correlation image between an SEM image of the same type as the SEM image 1102 at the same position as the design drawing 1104, and the SEM image 1102, may be calculated as the calculation correlation image 1213. In this case, the work of preparing a design drawing can be saved.
Claims (16)
- A pattern matching device that performs pattern matching between images, comprising a computer system that performs pattern matching between a first image and a second image and thereby outputs a shift amount between the first image and the second image as a pattern matching result, wherein the computer system comprises a learner that receives the first image and the second image as inputs and estimates and outputs a first correlation image having, as pixel values, numerical values representing the correlation between the first image and the second image; the computer system calculates a second correlation image having, as pixel values, numerical values representing the correlation between the first image and a derivative image generated from the first image; the learner is configured to be able to perform training that reduces the difference between the first correlation image and the second correlation image; and the computer system calculates the shift amount between the first image and the second image on the basis of the first correlation image.
- The pattern matching device according to claim 1, wherein the computer system calculates the numerical value representing the correlation between the first image and the derivative image for each coordinate shift amount between the first image and the derivative image, and calculates, as the second correlation image, an image having each numerical value calculated for a coordinate shift amount as the pixel value of the pixel corresponding to that coordinate shift amount.
- The pattern matching device according to claim 1, wherein the computer system calculates the shift amount by calculating the amount by which those pixel values of the first correlation image whose correlation between the first image and the second image is equal to or greater than a reference value are displaced from the first image or the second image.
- The pattern matching device according to claim 2, wherein the computer system further comprises an interface that receives an input designating the coordinate shift amount, and the computer system calculates the second correlation image according to the coordinate shift amount designated by the input received by the interface.
- The pattern matching device according to claim 1, wherein the computer system inputs to the learner a plurality of first images acquired under respectively different conditions and outputs the first correlation image representing the correlation between the second image and each first image; or the computer system inputs to the learner a plurality of second images acquired under respectively different conditions and outputs the first correlation image representing the correlation between the first image and each second image; or the computer system inputs to the learner a plurality of first images acquired under respectively different conditions and a plurality of second images acquired under respectively different conditions, and outputs the first correlation image representing the correlation between each first image and each second image.
- The pattern matching device according to claim 1, wherein the computer system corrects at least one of the first image and the second image so that the luminance value of a designated partial region of the first correlation image becomes higher than the luminance values of the other regions of the first correlation image, and then inputs the corrected first image and second image to the learner.
- The pattern matching device according to claim 1, wherein the learner is configured to estimate, as the first correlation image, a first-direction correlation image having, as pixel values, first-direction correlation values representing the degree to which the first image and the second image are correlated in a first direction; the learner is configured to estimate, as the first correlation image, a second-direction correlation image having, as pixel values, second-direction correlation values representing the degree to which the first image and the second image are correlated in a second direction orthogonal to the first direction; and the computer system generates the first correlation image by synthesizing the first-direction correlation image and the second-direction correlation image.
- The pattern matching device according to claim 1, wherein the computer system further comprises a training data generation unit that generates the training data the learner uses to perform the training; the training data generation unit generates the derivative image by coordinate-shifting the first image and supplies the generated derivative image and the first image to the learner as the training data; and the computer system calculates the second correlation image using the coordinate shift amount the training data generation unit used when generating the derivative image, or using the derivative image the training data generation unit generated.
- The pattern matching device according to claim 8, wherein the training data generation unit supplies to the learner, as the training data, a deformed image generated by changing at least one of the image quality of the derivative image and a shape pattern included in the derivative image.
- The pattern matching device according to claim 1, wherein the computer system is configured to perform pattern matching on images of a sample having an upper layer and a lower layer; the first image includes a first upper-layer image of the sample and a first lower-layer image of the sample; the second image includes a second upper-layer image of the sample and a second lower-layer image of the sample; the learner receives the first upper-layer image or the first image and the second upper-layer image or the second image as inputs, and estimates and outputs a first upper-layer correlation image having, as pixel values, numerical values representing the correlation between the first upper-layer image or the first image and the second upper-layer image or the second image; the learner receives the first lower-layer image or the first image and the second lower-layer image or the second image as inputs, and estimates and outputs a first lower-layer correlation image having, as pixel values, numerical values representing the correlation between the first lower-layer image or the first image and the second lower-layer image or the second image; the computer system calculates, on the basis of the first upper-layer correlation image, an upper-layer shift amount between the first upper-layer image or the first image and the second upper-layer image or the second image; and the computer system calculates, on the basis of the first lower-layer correlation image, a lower-layer shift amount between the first lower-layer image or the first image and the second lower-layer image or the second image.
- The pattern matching device according to claim 10, wherein the computer system calculates and outputs, on the basis of the upper-layer shift amount and the lower-layer shift amount, an amount of misalignment between the upper layer of the sample and the lower layer of the sample.
- The pattern matching device according to claim 1, wherein the computer system calculates a third correlation image having, as pixel values, numerical values representing the correlation between the first image and a third image that is of the same type as the first image and is at a position corresponding to the second image; and the learner is further configured to be able to perform training that reduces the difference between the first correlation image and the third correlation image.
- The pattern matching device according to claim 1, wherein the computer system calculates a fourth correlation image having, as pixel values, numerical values representing the correlation between a fourth image, which is of the same type as the second image and is at a position corresponding to the first image, and the second image; and the learner is further configured to be able to perform training that reduces the difference between the first correlation image and the fourth correlation image.
- The pattern matching device according to claim 1, wherein the computer system calculates a fifth correlation image having, as pixel values, numerical values representing the correlation between a fifth image at a position corresponding to the first image and a sixth image that is of the same type as the fifth image and is at a position corresponding to the second image; and the learner is further configured to be able to perform training that reduces the difference between the first correlation image and the fifth correlation image.
- A pattern measurement system comprising: the pattern matching device according to claim 1; and a scanning electron microscope that acquires the first image and the second image by imaging a sample and supplies them to the pattern matching device.
- A pattern matching program that causes a computer system to perform pattern matching between images, the program causing the computer system to perform a step of performing pattern matching between a first image and a second image and thereby outputting a shift amount between the first image and the second image as a pattern matching result, wherein in the step of performing the pattern matching, the program causes the computer system to perform a step of supplying the first image and the second image to a learner, which receives the first image and the second image as inputs and estimates and outputs a first correlation image having, as pixel values, numerical values representing the correlation between the first image and the second image, and receiving the first correlation image; in the step of performing the pattern matching, the program causes the computer system to perform a step of calculating a second correlation image having, as pixel values, numerical values representing the correlation between the first image and a derivative image generated from the first image; the learner is configured to be able to perform training that reduces the difference between the first correlation image and the second correlation image; and in the step of performing the pattern matching, the program causes the computer system to perform a step of calculating the shift amount between the first image and the second image on the basis of the first correlation image.
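Claims 2 and 3 describe a classical correlation-map procedure: score every candidate coordinate shift between an image and its shifted derivative, store each score at the pixel corresponding to that shift, then read the shift amount off an above-threshold peak. A minimal NumPy sketch of that procedure follows; the function names, the normalized cross-correlation score, and the 0.5 reference value are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized images."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

def second_correlation_image(first_image, max_shift):
    """Claim 2 sketch: for every candidate coordinate shift (dy, dx), make a
    derivative image by shifting the first image, correlate it with the
    original, and store the score at the pixel for that shift."""
    size = 2 * max_shift + 1
    corr = np.zeros((size, size))
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            derivative = np.roll(np.roll(first_image, dy, axis=0), dx, axis=1)
            corr[dy + max_shift, dx + max_shift] = ncc(first_image, derivative)
    return corr

def shift_from_correlation_image(corr, threshold=0.5):
    """Claim 3 sketch: among the pixel values, take the peak whose correlation
    is at or above a reference value and read the shift off its displacement
    from the image center."""
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    if corr[peak] < threshold:
        return None  # no sufficiently correlated position found
    center = (corr.shape[0] // 2, corr.shape[1] // 2)
    return (peak[0] - center[0], peak[1] - center[1])
```

In the patented device the first correlation image is estimated by the learner rather than computed by this loop; the loop stands in for the explicitly computed second correlation image that the training target is built from.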
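Claim 7 synthesizes per-direction correlation images into a single correlation image, and claim 11 derives the layer-to-layer misalignment from the two per-layer shift amounts. The claims do not fix the synthesis operator or the overlay formula, so the per-pixel product and the shift difference below are illustrative assumptions:

```python
import numpy as np

def synthesize_direction_correlations(corr_first_dir, corr_second_dir):
    """Claim 7 sketch: merge a first-direction and a second-direction
    correlation image into one correlation image. A per-pixel product keeps
    only positions that score well in both orthogonal directions."""
    return corr_first_dir * corr_second_dir

def overlay_misalignment(upper_shift, lower_shift):
    """Claim 11 sketch: misalignment between the sample's upper and lower
    layers, taken as the difference of the per-layer (dy, dx) shift amounts."""
    return (upper_shift[0] - lower_shift[0], upper_shift[1] - lower_shift[1])
```

With the product synthesis, a position must be correlated in both the first and the orthogonal second direction to survive; the overlay difference is zero when both layers shifted by the same amount, i.e. when the layers are aligned with each other even if the whole field of view is displaced.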
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020227042756A KR20230007485A (ko) | 2020-07-09 | 2020-07-09 | Pattern matching device, pattern measurement system, pattern matching program |
PCT/JP2020/026773 WO2022009357A1 (ja) | 2020-07-09 | 2020-07-09 | Pattern matching device, pattern measurement system, pattern matching program |
JP2022534573A JP7332810B2 (ja) | 2020-07-09 | 2020-07-09 | Pattern matching device, pattern measurement system, pattern matching program |
US18/009,783 US20230298310A1 (en) | 2020-07-09 | 2020-07-09 | Pattern Matching Device, Pattern Measuring System, Pattern Matching Program |
CN202080101855.7A CN115699244A (zh) | 2020-07-09 | 2020-07-09 | Pattern matching device, pattern measurement system, pattern matching program |
TW112123099A TWI817922B (zh) | 2020-07-09 | 2021-06-28 | Pattern matching device, pattern measurement system, pattern matching program |
TW110123568A TWI809432B (zh) | 2020-07-09 | 2021-06-28 | Pattern matching device, pattern measurement system, pattern matching program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/026773 WO2022009357A1 (ja) | 2020-07-09 | 2020-07-09 | Pattern matching device, pattern measurement system, pattern matching program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022009357A1 true WO2022009357A1 (ja) | 2022-01-13 |
Family
ID=79552435
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/026773 WO2022009357A1 (ja) | 2020-07-09 | 2020-07-09 | Pattern matching device, pattern measurement system, pattern matching program |
Country Status (6)
Country | Link |
---|---|
US (1) | US20230298310A1 (ja) |
JP (1) | JP7332810B2 (ja) |
KR (1) | KR20230007485A (ja) |
CN (1) | CN115699244A (ja) |
TW (2) | TWI809432B (ja) |
WO (1) | WO2022009357A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001148016A (ja) * | 1999-11-22 | 2001-05-29 | Hitachi Ltd | Sample inspection device, sample display device, and sample display method |
JP2007086066A (ja) * | 2005-09-19 | 2007-04-05 | Fei Co | Method of adjusting the operational region of a tool component to a predetermined element |
JP2013229394A (ja) * | 2012-04-24 | 2013-11-07 | Hitachi High-Technologies Corp | Pattern matching method and device |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4218171B2 (ja) | 2000-02-29 | 2009-02-04 | 株式会社日立製作所 | Scanning electron microscope, matching method, and computer-readable recording medium storing program |
JP4199939B2 (ja) | 2001-04-27 | 2008-12-24 | 株式会社日立製作所 | Semiconductor inspection system |
CN101996398B (zh) * | 2009-08-12 | 2012-07-04 | 睿励科学仪器(上海)有限公司 | 用于晶圆对准的图像匹配方法及设备 |
US10115040B2 (en) * | 2016-09-14 | 2018-10-30 | Kla-Tencor Corporation | Convolutional neural network-based mode selection and defect classification for image fusion |
US11379970B2 (en) * | 2018-02-23 | 2022-07-05 | Asml Netherlands B.V. | Deep learning for semantic segmentation of pattern |
TWI689875B (zh) * | 2018-06-29 | 2020-04-01 | 由田新技股份有限公司 | 利用深度學習系統的自動光學檢測分類設備及其訓練設備 |
- 2020
  - 2020-07-09 KR KR1020227042756A patent/KR20230007485A/ko unknown
  - 2020-07-09 JP JP2022534573A patent/JP7332810B2/ja active Active
  - 2020-07-09 US US18/009,783 patent/US20230298310A1/en active Pending
  - 2020-07-09 CN CN202080101855.7A patent/CN115699244A/zh active Pending
  - 2020-07-09 WO PCT/JP2020/026773 patent/WO2022009357A1/ja active Application Filing
- 2021
  - 2021-06-28 TW TW110123568A patent/TWI809432B/zh active
  - 2021-06-28 TW TW112123099A patent/TWI817922B/zh active
Non-Patent Citations (1)
Title |
---|
SEIFI, M. ET AL.: "Fast Diffraction-Pattern Matching for Object Detection and Recognition in Digital Holograms", PUBLICATION IN THE CONFERENCE PROCEEDINGS OF EUSIPCO, vol. 15697347, 9 September 2013 (2013-09-09), pages 1 - 5, XP032593783 * |
Also Published As
Publication number | Publication date |
---|---|
JP7332810B2 (ja) | 2023-08-23 |
TW202341025A (zh) | 2023-10-16 |
US20230298310A1 (en) | 2023-09-21 |
JPWO2022009357A1 (ja) | 2022-01-13 |
TWI809432B (zh) | 2023-07-21 |
TW202203097A (zh) | 2022-01-16 |
KR20230007485A (ko) | 2023-01-12 |
TWI817922B (zh) | 2023-10-01 |
CN115699244A (zh) | 2023-02-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11227144B2 (en) | 2022-01-18 | Image processing device and method for detecting image of object to be detected from input data | |
TWI697849B (zh) | 2020-07-01 | Image processing system, storage medium, information acquisition system, and data generation system | |
TWI757585B (zh) | 2022-03-11 | Inspection device, inspection method, and inspection program | |
US11663713B2 (en) | 2023-05-30 | Image generation system | |
US20190026879A1 (en) | 2019-01-24 | Method of detecting defects in an object | |
US20220277434A1 (en) | 2022-09-01 | Measurement System, Method for Generating Learning Model to Be Used When Performing Image Measurement of Semiconductor Including Predetermined Structure, and Recording Medium for Storing Program for Causing Computer to Execute Processing for Generating Learning Model to Be Used When Performing Image Measurement of Semiconductor Including Predetermined Structure | |
TW201944298A (zh) | 2019-11-16 | Processing recipe generation device | |
CN115810133B (zh) | 2023-07-04 | Welding control method based on image processing and point cloud processing, and related device | |
US9110384B2 (en) | 2015-08-18 | Scanning electron microscope | |
JP7170605B2 (ja) | 2022-11-14 | Defect inspection device, defect inspection method, and program | |
JPWO2006073155A1 (ja) | 2008-08-07 | Apparatus for pattern defect inspection, method therefor, and computer-readable recording medium storing the program therefor | |
WO2022009357A1 (ja) | 2022-01-13 | Pattern matching device, pattern measurement system, pattern matching program | |
CN114078114A (zh) | 2022-02-22 | Method and system for generating calibration data for wafer analysis | |
US20230222764A1 (en) | 2023-07-13 | Image processing method, pattern inspection method, image processing system, and pattern inspection system | |
WO2020121739A1 (ja) | 2020-06-18 | Image matching method and computing system for executing image matching processing | |
WO2021260765A1 (ja) | 2021-12-30 | Dimension measuring device, semiconductor manufacturing device, and semiconductor device manufacturing system | |
US20230005157A1 (en) | 2023-01-05 | Pattern-edge detection method, pattern-edge detection apparatus, and storage medium storing program for causing a computer to perform pattern-edge detection | |
JP7273748B2 (ja) | 2023-05-15 | Inspection device, inspection method, and program | |
JP2013254332A (ja) | 2013-12-19 | Image processing method and image processing device | |
JP7398480B2 (ja) | 2023-12-14 | System for generating images, and non-transitory computer-readable medium | |
JP2020123064A (ja) | 2020-08-13 | Image matching determination method, image matching determination device, and computer-readable recording medium storing a program for causing a computer to execute the image matching determination method | |
KR102678481B1 (ko) | 2024-06-27 | Charged particle beam device | |
KR20220073640A (ko) | 2022-06-03 | Charged particle beam device | |
JPH05101166A (ja) | 1993-04-23 | Pattern matching device | |
US20230071668A1 (en) | 2023-03-09 | Pattern Matching Device, Pattern Measurement System, and Non-Transitory Computer-Readable Medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20944534; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2022534573; Country of ref document: JP; Kind code of ref document: A |
| ENP | Entry into the national phase | Ref document number: 20227042756; Country of ref document: KR; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 20944534; Country of ref document: EP; Kind code of ref document: A1 |