WO2005001593A2 - Reference pattern extraction method and apparatus, pattern matching method and apparatus, position detection method and apparatus, and exposure method and apparatus - Google Patents
- Publication number
- WO2005001593A2 (PCT/JP2004/008982)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pattern
- area
- unique
- reference pattern
- predetermined
Classifications
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03F—PHOTOMECHANICAL PRODUCTION OF TEXTURED OR PATTERNED SURFACES, e.g. FOR PRINTING, FOR PROCESSING OF SEMICONDUCTOR DEVICES; MATERIALS THEREFOR; ORIGINALS THEREFOR; APPARATUS SPECIALLY ADAPTED THEREFOR
- G03F9/00—Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically
- G03F9/70—Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically for microlithography
- G03F9/7092—Signal processing
- G03F9/7088—Alignment mark detection, e.g. TTR, TTL, off-axis detection, array detector, video detection
Definitions
- Reference pattern extraction method and device, pattern matching method and device, position detection method and device, and exposure method and device
- The present invention relates to an exposure method used in the lithography process for manufacturing electronic devices such as semiconductor devices, liquid crystal display devices, plasma display devices, and thin-film magnetic heads, and particularly to a method and an apparatus for detecting the position of an object to be detected, such as a wafer or a reticle in an exposure apparatus, and to an exposure method and apparatus that perform exposure based on the detected position.
- An exposure apparatus is used to repeatedly project and expose an image of a fine pattern formed on a mask or reticle (hereinafter collectively referred to as a reticle) onto a substrate such as a semiconductor wafer or a glass plate coated with a photosensitive agent.
- In the exposure apparatus, it is necessary to align the position of the substrate with the position of the projected pattern image with high accuracy.
- The patterns formed have become extremely fine as the degree of integration has improved in recent years; therefore, very high-precision alignment is required in order to manufacture semiconductor devices having the desired performance.
- Alignment in the exposure apparatus is performed by detecting, with an alignment sensor, an alignment mark (hereinafter sometimes referred to simply as a mark or a pattern) formed on the substrate or reticle, detecting the position of the substrate or the like, and controlling that position.
- Various methods are used to detect the position of the mark.
- Alignment sensors of the FIA (Field Image Alignment) type, which detect the position of a mark by image processing, have come into use.
- In such image processing, a template matching method is known in which a mark is detected by comparing (matching) the captured image with a reference pattern (template), prepared in advance, that corresponds to the mark signal (an n-dimensional pattern signal) (see, for example, Patent Document 1).
- Patent Document 1 JP 2001-210577 A
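The template matching operation described above can be illustrated with a minimal sketch: slide the reference pattern (template) over the captured image and keep the offset that maximizes a normalized cross-correlation score. This is an illustrative pure-Python sketch under assumed names (`ncc`, `match_template`), not the patent's implementation; a production system would use an optimized routine such as OpenCV's `cv2.matchTemplate`.

```python
def ncc(a, b):
    # Normalized cross-correlation of two equally sized 2-D patches.
    n = len(a) * len(a[0])
    ma = sum(map(sum, a)) / n
    mb = sum(map(sum, b)) / n
    num = sum((a[i][j] - ma) * (b[i][j] - mb)
              for i in range(len(a)) for j in range(len(a[0])))
    da = sum((a[i][j] - ma) ** 2
             for i in range(len(a)) for j in range(len(a[0])))
    db = sum((b[i][j] - mb) ** 2
             for i in range(len(b)) for j in range(len(b[0])))
    return num / ((da * db) ** 0.5 or 1.0)  # flat patches score 0

def match_template(image, tmpl):
    # Slide the template over the image; return the offset (row, col)
    # with the highest correlation score.
    th, tw = len(tmpl), len(tmpl[0])
    best = (-2.0, (0, 0))           # NCC is bounded by [-1, 1]
    for y in range(len(image) - th + 1):
        for x in range(len(image[0]) - tw + 1):
            patch = [row[x:x + tw] for row in image[y:y + th]]
            best = max(best, (ncc(patch, tmpl), (y, x)))
    return best[1]
```

The exhaustive search is quadratic in the image size, which is one reason methods such as SSDA, discussed later in this document, terminate poor candidate positions early.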
- Conventionally, a template is specified either by extracting a pattern from actual data, or from information on a known pattern to be used as a template, such as an alignment mark (the design data of the mark or its placement information on the wafer).
- However, a template cannot be set when actual data for creating one is unavailable or cannot be obtained, or when no pattern to be used as a template has been specified.
- A method in which the operator selects a template manually is also conceivable.
- The range of the substrate-surface image captured at the time of alignment (the observation field of view) is not always at a fixed position; it varies within a certain range owing to errors in the wafer loading operation and errors in pattern manufacture.
- Therefore, the pattern used as the template must be one that is always included in the observation field of view; if the pattern is not present in the field, it cannot be detected.
- Furthermore, the pattern used as a template must be a unique pattern that occurs only once within the maximum range that the observation field can occupy (hereinafter, the observation-field maximum range or input-data maximum range); if multiple identical patterns were detected, their positions could not be distinguished.
- As the variation of the observation field grows, the maximum range that can be the observation field increases and, conversely, the area that is always included in the observation field shrinks; in some cases there may be no region that is always included. In such a situation it is difficult or impossible to select a pattern to be used as a template, and template matching cannot be performed.
- An object of the present invention is to provide a reference pattern extraction method and apparatus capable of appropriately and efficiently extracting a reference pattern (template) effective for template matching. Specifically, it is an object to provide a reference pattern extraction method and apparatus capable of extracting an effective reference pattern from the observation-field maximum range without depending on a region always included in the observation field. It is a further object to provide a reference pattern extraction method and apparatus capable of generating a template with which template matching can be performed effectively, without requiring actual data for template creation.
- Another object of the present invention is to provide a pattern matching method capable of appropriately performing template matching using the extracted effective reference pattern (template) to detect a desired mark or the like. Specifically, even if the observation field varies to some extent, the method should be able to detect a desired pattern appropriately using a template extracted from the entire observation-field maximum range. A further object is to provide a pattern matching method capable of appropriately detecting a desired pattern using a template according to the present invention created without using actual data.
- Another object of the present invention is to provide a position detection method and apparatus capable of detecting a desired pattern used for positioning, and of detecting its position appropriately, by using such a template matching method according to the present invention.
- Another object of the present invention is to provide an exposure method and apparatus that detect the exposure position of a substrate or the like using such a position detection method according to the present invention, and appropriately perform exposure at a desired position on the substrate or the like.
- The reference pattern extraction method of the present invention extracts a reference pattern for a measurement area (VIEW_Area) of smaller area than, and arbitrarily arranged within, a predetermined area (OR_Area) on an object. It has a first step of obtaining pattern signal information within the predetermined area (Step S410), and a second step of extracting, from the pattern signal information obtained in the first step, at all positions that the measurement area (VIEW_Area) can take within the predetermined area (OR_Area), a plurality of unique patterns that can each be recognized as unique within the measurement area at that position and that differ from one another in uniqueness (Steps S430, S470, and S480; see FIGS. 11A-11C and FIG. 12).
- According to this reference pattern extraction method, in the second step, based on the pattern signal information obtained in the first step, a plurality of unique patterns are extracted that differ from one another in uniqueness within a predetermined range such as the maximum range the observation field can occupy (the predetermined area, OR_Area), and that can each be recognized as unique within a measurement area (VIEW_Area) arbitrarily arranged within that predetermined area.
- All of the extracted unique patterns are set as reference patterns (templates), irrespective of the range that the measurement area (VIEW_Area) can take. That is, a plurality of reference patterns are extracted from the maximum range of the observation field without depending on an area always included in the observation field.
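The two steps just described can be illustrated with a deliberately simplified 1-D model: the predetermined area (OR_Area) is a sequence of pattern labels, the measurement area (VIEW_Area) is a sliding window, and a label is kept as a template only if it occurs exactly once in every window placement that contains it. The representation and the uniqueness test are assumptions for illustration, not the patent's procedure.

```python
def extract_templates(or_area, view_w):
    # or_area: sequence of pattern labels covering the predetermined area.
    # view_w:  width of the measurement window (VIEW_Area).
    # A label qualifies as a template only if every window placement
    # that contains it contains it exactly once.
    good = set(or_area)
    for start in range(len(or_area) - view_w + 1):
        window = or_area[start:start + view_w]
        for label in set(window):
            if window.count(label) > 1:
                good.discard(label)   # ambiguous in this placement
    return good
```

For example, with OR_Area `"ABACDB"` and a window of width 4, `A` is rejected because one placement contains it twice, while `B`, `C`, and `D` survive as templates even though `B` occurs twice overall, since no single window contains both occurrences.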
- As a result, wherever within the predetermined area the observation field (VIEW_Area) is set during template matching or the like, at least one of the plurality of set unique patterns is contained in it; that unique pattern can be detected, and position measurement can be performed appropriately based on its position.
- Preferably, each unique pattern is extracted for each specific region (element) having an area smaller than the measurement area (VIEW_Area).
- Preferably, in the second step, information on pattern shape characteristics is used as an index of uniqueness in extracting the unique patterns.
- Preferably, in the second step, in addition to the information on pattern shape characteristics, at least one of the number of patterns having different pattern shape characteristics and their arrangement relationship is used as an index of uniqueness in extracting the unique patterns.
- Preferably, at least one of the number of patterns having the same pattern shape characteristic and their arrangement relationship is also used as an index of uniqueness in extracting the unique patterns.
- Preferably, design value information is used for the arrangement relationship.
- Preferably, the information on pattern shape characteristics is obtained by performing a correlation operation or an operation using the SSDA method on the pattern signal information in each specific region against the pattern signal information in the predetermined area (OR_Area).
- Preferably, the information on pattern shape characteristics is calculated using at least one of the following quantities in the pattern signal information of each specific region: SN ratio, edge amount, entropy amount, variance value, and moment amount.
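Of the quantities listed above, the edge amount, variance, and entropy can be sketched for a 1-D pattern signal as follows; the histogram bin count in the entropy estimate is an arbitrary assumption, and the SN ratio and moment quantities are omitted for brevity.

```python
from math import log2

def edge_amount(sig):
    # Sum of absolute gradients: large for signals with strong edges.
    return sum(abs(b - a) for a, b in zip(sig, sig[1:]))

def variance(sig):
    # Spread of the signal about its mean.
    m = sum(sig) / len(sig)
    return sum((v - m) ** 2 for v in sig) / len(sig)

def entropy(sig, bins=4):
    # Shannon entropy of a coarse intensity histogram (bins is arbitrary).
    lo, hi = min(sig), max(sig)
    if hi == lo:
        return 0.0                 # constant signal carries no information
    hist = [0] * bins
    for v in sig:
        hist[min(bins - 1, int((v - lo) / (hi - lo) * bins))] += 1
    n = len(sig)
    return -sum(c / n * log2(c / n) for c in hist if c)
```

A region scoring high on such quantities is a promising unique-pattern candidate, since featureless (flat) regions score zero on all three.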
- Preferably, the pattern signal information in the predetermined area is obtained using imaging means capable of capturing the predetermined area (OR_Area) at once.
- Alternatively, imaging means capable of imaging the inside of the measurement area (VIEW_Area) is used: while changing the position of the object relative to the imaging means, pattern signal information in the measurement area (VIEW_Area) is obtained at each of a plurality of positions, and the pattern signal information in the predetermined area (OR_Area) is obtained by combining the plurality of pieces of obtained pattern signal information.
- Preferably, the object is positioned with respect to the imaging means using design value information on the pattern formed on the object.
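The tile-and-combine acquisition described above (imaging the smaller VIEW_Area at several stage positions and joining the pieces into one OR_Area signal) can be sketched in 1-D, assuming the relative offsets of the tiles are known, e.g. from design value information; overlapping samples are simply overwritten by the later tile. The function name and data layout are hypothetical.

```python
def stitch(tiles):
    # tiles: list of (offset, samples) pairs for a 1-D pattern signal,
    # with offsets known from stage positioning / design data.
    width = max(off + len(t) for off, t in tiles)
    out = [None] * width          # None marks never-imaged samples
    for off, t in tiles:
        out[off:off + len(t)] = t  # later tiles overwrite overlaps
    return out
```

A real implementation would register the tiles in 2-D and blend or cross-check the overlap regions rather than overwrite them; the sketch only shows how a larger OR_Area signal emerges from several VIEW_Area captures.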
- The pattern matching method of the present invention performs a correlation operation on the pattern signal information in the measurement area (VIEW_Area) on the object using a reference pattern obtained by the above reference pattern extraction method.
- Preferably, the correlation operation is performed on the pattern signal information in the measurement area (VIEW_Area) by sequentially using all of the plurality of unique patterns.
- The reference pattern extraction device of the present invention extracts a reference pattern having a unique signal characteristic for a measurement area (VIEW_Area) of smaller area than, and arbitrarily arranged within, a predetermined area (OR_Area) on an object. It comprises pattern signal information acquisition means for obtaining pattern signal information within the predetermined area (OR_Area), and unique pattern extraction means for extracting, from the obtained pattern signal information, a plurality of unique patterns that differ from one another in uniqueness and that can each be recognized as unique within the measurement area (VIEW_Area) at all positions the measurement area can take within the predetermined area (OR_Area).
- The pattern matching device of the present invention comprises correlation operation processing means for performing a correlation operation on the pattern signal information in the measurement area (VIEW_Area) on the object, using all of the plurality of unique patterns obtained by the above reference pattern extraction device.
- Another reference pattern extraction method of the present invention is a method for extracting a reference pattern used to identify a predetermined pattern formed on an object, comprising: a first step of obtaining design value information relating to at least one of the shape of the pattern formed on the object and its arrangement information; a second step of converting the design value information into pattern signal information; a third step of extracting, from the pattern signal information, a unique pattern having a unique signal feature; and a fourth step of determining the reference pattern based on the unique pattern extracted in the third step.
- In this reference pattern extraction method, design value information on the pattern formed on the object is obtained in the first step; pattern signal data of the pattern formed on the object is generated from that design value information in the second step; a unique pattern is extracted from the pattern signal data in the third step; and this is set as the template (reference pattern) in the fourth step. The reference pattern is thus generated without using any pattern signal data captured from the actual object.
- In addition, a unique pattern, that is, a pattern that exists only once in the pattern detection area and whose position can be specified by detecting it, is automatically detected and used as the template. Therefore, even when actual pattern signal data cannot be obtained, or when no pattern to be used as a template has been specified, an effective template can be generated appropriately.
- Preferably, in the third step, the unique pattern is extracted by performing, for each piece of partial pattern signal information within the pattern signal information, a correlation operation against the whole pattern signal information, or an operation using the SSDA method.
- Preferably, the unique pattern is extracted while changing the size of the region from which the partial pattern signal information is taken.
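The SSDA (Sequential Similarity Detection Algorithm) mentioned above differs from a plain sum-of-absolute-differences search in that it abandons a candidate position as soon as the running error exceeds the best error found so far. A minimal 1-D sketch, with hypothetical names:

```python
def ssda_match(signal, tmpl):
    # Return the offset minimizing the sum of absolute differences,
    # abandoning a candidate early once its running error can no
    # longer beat the current best (the SSDA idea).
    best_err, best_pos = float("inf"), None
    for pos in range(len(signal) - len(tmpl) + 1):
        err = 0
        for s, t in zip(signal[pos:], tmpl):
            err += abs(s - t)
            if err >= best_err:      # early termination
                break
        else:
            best_err, best_pos = err, pos
    return best_pos
```

The early exit is what makes exhaustively testing every partial region, at several region sizes, computationally tolerable.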
- Preferably, the unique pattern having the largest feature difference with respect to the other patterns in the pattern signal information is determined as the reference pattern.
- Another reference pattern extraction device of the present invention is a device for extracting a reference pattern used to identify a predetermined pattern formed on an object, comprising: design value information acquisition means for obtaining design value information on at least one of the shape of the pattern formed on the object and its arrangement information; information conversion means for converting the design value information into pattern signal information; unique pattern extraction means for extracting a unique pattern having a unique signal characteristic; and reference pattern determination means for determining the reference pattern based on the extracted unique pattern.
- Another reference pattern extraction method of the present invention is a method for extracting a reference pattern used to identify a predetermined pattern formed on an object.
- In this reference pattern extraction method, the pattern formed on the object is first imaged in the first step to obtain pattern signal data. Design value information on the pattern formed on the object is then obtained in the second step, and a reference pattern is generated in the third step based on the design value information and the imaged pattern signal data. In other words, the search for a pattern signal to be used as the template and the detection of its area are performed based on design value information, and actual imaging data is used only at the point where the pattern signal actually used as the template is obtained. A pattern having a high correlation with the actual pattern can therefore be set as the template. Moreover, since this can proceed in parallel with processing based on design value information as appropriate, templates can be set efficiently while their performance is checked and confirmed.
- Preferably, the third step includes a step of converting the design value information into pattern signal information, and a step of extracting, from the converted pattern signal information, a unique pattern relating to the position of a portion having a unique signal characteristic.
- Another reference pattern extraction device of the present invention is a device for extracting a reference pattern used to identify a predetermined pattern formed on an object, comprising pattern signal information acquisition means for obtaining pattern signal information, and design value information acquisition means for obtaining design value information relating to at least one of the shape and the arrangement state of the pattern formed on the object.
- The pattern matching method of the present invention is a method for identifying a predetermined pattern formed in a predetermined area (OR_Area) on an object, in which a plurality of reference patterns that differ from one another in uniqueness are used.
- Preferably, the plurality of reference patterns differ from one another in pattern shape characteristics. More preferably, they differ from one another in at least one of the number of patterns having a specific shape and their arrangement relationship.
- The pattern matching device of the present invention is a device for identifying a predetermined pattern formed in a predetermined area (OR_Area) on an object, comprising: reference pattern preparation means for preparing a plurality of reference patterns that differ from one another in uniqueness; pattern signal information acquisition means for imaging the inside of the predetermined area (OR_Area) to obtain pattern signal information; and correlation operation processing means for sequentially performing correlation operation processing on the obtained pattern signal information.
- Another reference pattern extraction method of the present invention extracts a reference pattern used to identify a pattern formed on an object.
- Another reference pattern extraction device of the present invention is a device for extracting a reference pattern used to identify a pattern formed on an object, comprising: first information acquisition means that photoelectrically detects the object through an optical system having a first detection magnification to obtain first pattern signal information; predetermined area specifying means that, based on the first pattern signal information, specifies a predetermined area on the object presumed to contain a unique pattern having a unique pattern signal characteristic; second information acquisition means that photoelectrically detects the specified predetermined area through an optical system having a second detection magnification higher than the first to obtain second pattern signal information for that area; and reference pattern determination means that extracts the unique pattern based on the second pattern signal information and determines it as the reference pattern.
- Another reference pattern extraction method of the present invention extracts a reference pattern used to identify a pattern formed on an object: a unique pattern that is included in a specific area and has a unique pattern signal characteristic within the predetermined area is extracted, and the reference pattern is determined based on the extracted unique pattern; when no unique pattern can be extracted from the specific area, the predetermined area is re-set, and the first and second steps are performed on the re-set predetermined area to extract the unique pattern.
- Another reference pattern extraction device of the present invention is a device for extracting a reference pattern used to identify a pattern formed on an object, comprising: pattern signal information acquisition means for obtaining pattern signal information; reference pattern determination means that, based on the obtained pattern signal information, extracts a unique pattern that is present in a specific region (a region of smaller area than the predetermined area that is necessarily included in a measurement area arranged at an arbitrary position within the predetermined area) and has a unique pattern signal characteristic within the predetermined area, and determines the reference pattern based on the extracted unique pattern; and re-setting means that, when the unique pattern cannot be extracted from the specific area, re-sets the predetermined area at an arbitrary position based on the pattern signal information obtained by the pattern signal information acquisition means. The pattern signal information acquisition means then obtains the pattern signal information for the re-set predetermined area, and the reference pattern determination means determines the reference pattern.
- Another reference pattern extraction method of the present invention extracts a reference pattern used to identify a pattern formed on an object, and comprises: a first step of acquiring pattern signal information, extracting a unique pattern having a unique pattern signal characteristic in the predetermined area based on the acquired information, and determining the reference pattern based on the extracted unique pattern; and a second step of, when no unique pattern can be extracted from the predetermined area in the first step, re-setting the predetermined area to a nearby area in which a unique pattern having a unique pattern signal characteristic is likely to exist, and performing the first step again on the re-set area to determine the reference pattern.
- Another reference pattern extraction device of the present invention is a device for extracting a reference pattern used to identify a pattern formed on an object, comprising: means for acquiring pattern signal information; reference pattern determination means for extracting, based on the acquired pattern signal information, a unique pattern having a unique pattern signal characteristic in the predetermined area, and determining the reference pattern based on the extracted unique pattern; and predetermined area re-setting means for, when the unique pattern cannot be extracted, re-setting the predetermined area to a nearby area likely to include a unique pattern having a unique pattern signal characteristic. The reference pattern determination means then determines the reference pattern again for the re-set predetermined area.
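The re-setting behaviour described above can be sketched as a search that moves the window outward from its original position until a unique pattern is found; the neighbourhood ordering and the uniqueness test (a sample value occurring exactly once in the window) are illustrative assumptions, not the patent's criteria.

```python
def find_unique(signal, start, width, max_shift):
    # Try the window at `start`; if no sample value is unique inside it,
    # re-set the window to nearby offsets (0, -1, +1, -2, +2, ...) and
    # retry, up to max_shift. Returns (offset, unique_value) or None.
    for shift in sorted(range(-max_shift, max_shift + 1), key=abs):
        s = start + shift
        if s < 0 or s + width > len(signal):
            continue                      # window would leave the signal
        window = signal[s:s + width]
        uniques = [v for v in window if window.count(v) == 1]
        if uniques:
            return s, uniques[0]
    return None                           # no nearby window has a unique value
```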
- The pattern matching method of the present invention performs correlation operation processing on the pattern signal information in the measurement area (VIEW_Area) on the object using the reference pattern extracted by the above reference pattern extraction method.
- The pattern matching device of the present invention performs correlation operation processing on the pattern signal information in the measurement area (VIEW_Area) on the object using the reference pattern extracted by the above reference pattern extraction device.
- The position detection method of the present invention obtains position information of the unique pattern in the above measurement area using the above pattern matching method.
- The position detection device of the present invention includes position information detection means for obtaining position information of the unique pattern in the above measurement area (VIEW_Area) using the above pattern matching device.
- The exposure method of the present invention obtains, using the above position detection method, position information of the unique pattern formed on the substrate serving as the object on the moving coordinate system of the object; the substrate is aligned based on this position information, and a predetermined pattern is transferred and exposed onto the aligned substrate.
- The exposure apparatus of the present invention comprises: the above position detection device for obtaining position information of the unique pattern formed on the substrate serving as the object on the moving coordinate system of the object; positioning means for aligning the substrate based on the position information; and exposure means for transferring and exposing a predetermined pattern onto the aligned substrate.
- Another reference pattern extraction method of the present invention extracts a reference pattern having a unique signal characteristic on an object on which a pattern is formed, from a predetermined area wider than the measurement area: a first reference pattern having a unique signal characteristic is extracted from the patterns included in the measurement area when it is located at a first position in the predetermined area, and a second reference pattern having a unique signal characteristic is extracted from the patterns included in the measurement area when it is located at a second position, different from the first position, in the predetermined area, the measurement area at the second position including at least a part of a pattern different from those in the measurement area at the first position.
- Another pattern matching method of the present invention arranges a measurement area of predetermined size on an object on which a pattern is formed, and detects, from the patterns included in the measurement area, a specific pattern that matches a predetermined reference pattern; the specific pattern is detected from a predetermined area wider than the measurement area. A first reference pattern having a unique signal characteristic is extracted from the patterns included in the measurement area located at a first position in the predetermined area, and a second reference pattern having a unique signal characteristic is extracted from the patterns included in the measurement area located at a second position different from the first position, the measurement area at the second position including at least a part of a pattern different from those at the first position; the measurement area is then set on the object, and a pattern matching the first or second reference pattern is detected from the image of the object in the measurement area.
- Another position detection method of the present invention detects position information of an object on which a pattern is formed: a measurement area of predetermined size is arranged on the object; a first reference pattern having a unique signal characteristic is extracted from the patterns included in the measurement area located at a first position within a predetermined area wider than the measurement area, and a second reference pattern having a unique signal characteristic is extracted from the patterns included in the measurement area located at a second position in the predetermined area; the measurement area is set on the object, a specific pattern matching the first or second reference pattern is detected from the image of the object in the measurement area, and relative position information between the specific pattern and the measurement area is detected.
- Another reference pattern extraction device of the present invention extracts a reference pattern having a unique signal characteristic on an object on which a pattern is formed. It comprises unique pattern extraction means that extracts a first reference pattern having a unique signal characteristic from the patterns included in the measurement area when the measurement area is located at a first position, and extracts a second reference pattern having a unique signal characteristic from the patterns included in the measurement area when it is located at a second position, different from the first position, in the predetermined area, the measurement area at the second position including at least a part of a pattern different from those at the first position.
- Another pattern matching device of the present invention arranges a measurement area of predetermined size on an object on which a pattern is formed, and detects, from the patterns included in the measurement area, a specific pattern that matches a predetermined reference pattern; the specific pattern is detected from a predetermined area wider than the measurement area. The device comprises: unique pattern extraction means that extracts a first reference pattern having a unique signal characteristic from the patterns included in the measurement area located at a first position, and a second reference pattern having a unique signal characteristic from the patterns included in the measurement area located at a second position different from the first position, the measurement area at the second position including at least a part of a pattern different from those at the first position; setting means for setting the measurement area on the object; and detection means for detecting a pattern matching the first or second reference pattern from the image of the object in the measurement area.
- Another position detection device of the present invention arranges a measurement area of predetermined size on an object on which a pattern is formed, detects, from the patterns included in the measurement area, a specific pattern matching a predetermined reference pattern, and detects relative position information between the object and the measurement area; the reference pattern is extracted from a predetermined area wider than the measurement area.
- a reference pattern extraction method and apparatus capable of appropriately and efficiently extracting a reference pattern (template) effective for template matching.
- a reference pattern extraction method and a reference pattern extraction method capable of extracting an effective reference pattern from the maximum range of the observation visual field without depending on the area always included in the observation visual field.
- an apparatus for extracting a reference pattern that can generate a template capable of effective template matching, without the need for actually measured data for template creation.
- a pattern matching method capable of appropriately performing template matching using the effective reference pattern (template) extracted as described above and detecting a desired mark or the like. More specifically, even if the observation field varies to some degree, it is possible to provide a pattern matching method that can appropriately detect a desired pattern using a template extracted from the entire maximum range of the observation field. Further, it is possible to provide a pattern matching method capable of appropriately detecting a desired pattern by using a template according to the present invention created without using actually measured data.
- FIG. 1 is a view showing a configuration of an exposure apparatus according to an embodiment of the present invention.
- FIG. 2 is a diagram showing a distribution of optical information from a mark on a wafer on a pupil image plane of a TTL alignment system of the exposure apparatus shown in FIG. 1.
- FIG. 3 is a view showing a light receiving surface of a light receiving element of a TTL type alignment system of the exposure apparatus shown in FIG. 1.
- FIG. 4 is a cross-sectional view of a reference plate of an off-axis alignment optical system of the exposure apparatus shown in FIG. 1.
- FIG. 5 is a diagram showing a configuration of an FIA operation unit of an off-axis type alignment optical system of the exposure apparatus shown in FIG. 1.
- FIG. 6 is a flowchart showing a template creation method according to the first embodiment of the present invention.
- FIG. 7A is a first diagram for explaining a template creating method according to the first embodiment of the present invention.
- FIG. 7B is a second diagram for explaining the template creating method according to the first embodiment of the present invention.
- FIG. 8A is a third diagram for explaining the template creating method according to the first embodiment of the present invention.
- FIG. 8B is a fourth diagram for explaining the template creating method according to the first embodiment of the present invention.
- FIG. 9 is a flowchart showing the entire flow of the exposure processing according to the first embodiment of the present invention.
- FIG. 10 is a flowchart showing a template creating method according to a second embodiment of the present invention.
- FIG. 11A is a first diagram for explaining a template creating method according to the third embodiment of the present invention.
- FIG. 11B is a second diagram for explaining the template creating method according to the third embodiment of the present invention.
- FIG. 11C is a third diagram for explaining the template creating method according to the third embodiment of the present invention.
- FIG. 12 is a flowchart showing a template creating method according to the third embodiment of the present invention.
- FIG. 13A is a first diagram illustrating a process of selecting one element from a plurality of elements detected for the same pattern.
- FIG. 13B is a second diagram illustrating a process of selecting one element from a plurality of elements detected for the same pattern.
- FIG. 14 is a first diagram illustrating a process of configuring a template by a plurality of elements.
- FIG. 15 is a second diagram illustrating a process of forming a template by a plurality of elements.
- FIG. 16A is a third diagram illustrating a process of composing a template by a plurality of elements.
- FIG. 16B is a fourth diagram for describing the processing of composing a template by a plurality of elements.
- FIG. 16C is a fifth diagram for describing the processing of composing a template by a plurality of elements.
- FIG. 17 is a flowchart showing a flow of a search alignment process according to the third embodiment of the present invention.
- FIG. 18 is a flowchart showing a template creation method according to a fourth embodiment of the present invention.
- FIG. 19A is a first diagram for explaining a template creating method according to the fourth embodiment of the present invention.
- FIG. 19B is a second diagram for explaining the template creating method according to the fourth embodiment of the present invention.
- FIG. 20A is a third diagram illustrating a template creating method according to the fourth embodiment of the present invention.
- FIG. 20B is a fourth diagram for explaining the template creating method according to the fourth embodiment of the present invention.
- FIG. 21 is a fifth diagram for explaining the template creating method according to the fourth embodiment of the present invention.
- FIG. 22 is a sixth diagram illustrating a template creation method according to the fourth embodiment of the present invention.
- FIG. 23 is a flowchart showing a template creating method according to the fifth embodiment of the present invention.
- FIG. 24A is a first diagram for explaining a template creating method according to the fifth embodiment of the present invention.
- FIG. 24B is a second diagram for explaining the template creating method according to the fifth embodiment of the present invention.
- FIG. 25A is a third diagram illustrating a template creating method according to the fifth embodiment of the present invention.
- FIG. 25B is a fourth diagram for explaining the template creating method according to the fifth embodiment of the present invention.
- FIG. 26 is a fifth diagram for explaining the template creating method according to the fifth embodiment of the present invention.
- FIG. 27 is a flowchart showing a template creating method according to a sixth embodiment of the present invention.
- FIG. 28 is a diagram for explaining a template creating method according to the sixth embodiment of the present invention.
- FIG. 29 is a flowchart for explaining a device manufacturing method according to the present invention.
- A first embodiment of the present invention will be described with reference to FIGS. 1 and 2.
- an exposure apparatus having an off-axis alignment optical system for detecting a predetermined reference pattern on a wafer by image processing, and a template (reference) for performing alignment by template matching in the exposure apparatus.
- a method for creating a pattern, an alignment method using the template in the exposure apparatus, and the like will be described.
- FIG. 1 is a diagram showing a schematic configuration of an exposure apparatus 100 of the present embodiment.
- the XYZ orthogonal coordinate system shown in FIG. 1 is set, and the positional relationship and the like of each member will be described with reference to the XYZ orthogonal coordinate system.
- the X axis and the Z axis are set to be parallel to the paper surface, and the Y axis is set to a direction perpendicular to the paper surface.
- the XY plane is actually set as a plane parallel to the horizontal plane, and the Z axis is set vertically upward.
- exposure light EL emitted from an illumination optical system is applied to pattern area PA formed on reticle R via condenser lens 1 with a uniform illuminance distribution.
- as the exposure light EL, for example, g-line (436 nm), i-line (365 nm), KrF excimer laser light (248 nm), ArF excimer laser light (193 nm), F2 laser light (157 nm), or the like is used.
- Reticle R is held on reticle stage 2, and reticle stage 2 is supported so as to be able to move and minutely rotate in a two-dimensional plane on base 3.
- a main control system 15 for controlling the operation of the entire apparatus controls the operation of the reticle stage 2 via the driving device 4 on the base 3.
- the reticle R is positioned with respect to the optical axis AX of the projection lens PL by detecting a reticle alignment mark (not shown) formed around it with a reticle alignment system including a mirror 5, an objective lens 6, and a mark detection system 7.
- the exposure light EL that has passed through the pattern area PA of the reticle R enters, for example, a both-side (or one-side) telecentric projection lens PL, and is projected onto each shot area on the wafer (substrate) W.
- the projection lens PL is best corrected for aberration with respect to the wavelength of the exposure light EL, and the reticle R and the wafer W are conjugate with each other at that wavelength.
- the illumination light EL provides Köhler illumination, and a light source image is formed at the center of the pupil EP of the projection lens PL.
- the projection lens PL has a plurality of optical elements such as lenses.
- an optical material such as quartz or fluorite is used according to the wavelength of the exposure light EL.
- the wafer W is placed on the wafer stage 9 via the wafer holder 8.
- a reference mark 10 used for baseline measurement or the like is provided on the wafer stage 9 on the wafer holder 8.
- the wafer stage 9 is used to two-dimensionally position the wafer W in a plane perpendicular to the optical axis AX of the projection lens PL.
- in addition to the XY stage, the wafer stage 9 has a Z stage for positioning the wafer W in the direction (Z direction) parallel to the optical axis AX of the projection lens PL, a stage for slightly rotating the wafer W, and a stage for adjusting the inclination of the wafer W with respect to the XY plane by changing its angle with respect to the Z axis.
- An L-shaped movable mirror 11 is attached to one end of the upper surface of the wafer stage 9, and a laser interferometer 12 is arranged at a position facing the mirror surface of the movable mirror 11.
- the movable mirror 11 is composed of a plane mirror having a reflection surface perpendicular to the X axis and a plane mirror having a reflection surface perpendicular to the Y axis.
- the laser interferometer 12 irradiates the movable mirror 11 with laser beams along the X axis and the Y axis, and comprises two laser interferometers for the X axis and one for the Y axis.
- the X and Y coordinates of the wafer stage 9 are measured by these laser interferometers. Further, the rotation angle of the wafer stage 9 in the XY plane is measured from the difference between the measurement values of the two laser interferometers for the X axis.
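The rotation measurement from the pair of X-axis interferometers reduces to a small-angle computation: the difference of the two X readings divided by the spacing between the beams. A minimal sketch in Python (the function name, units, and spacing value are illustrative assumptions, not taken from the patent):

```python
import math

def stage_yaw(x1: float, x2: float, spacing: float) -> float:
    """Yaw angle (radians) of the stage from two X-axis interferometer
    readings whose beams are separated by `spacing` along Y.

    x1, x2  : measured X positions of the two beams (same units as spacing)
    spacing : distance between the two X-axis laser beams
    """
    return math.atan2(x1 - x2, spacing)

# Example: a 1 micrometre reading difference over a 100 mm beam spacing
theta = stage_yaw(0.000001, 0.0, 0.1)  # metres in, radians out
```

For the tiny angles involved in wafer-stage rotation, `atan2` is effectively the linear ratio `(x1 - x2) / spacing`.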
- the position measurement signal PDS indicating the X coordinate, the Y coordinate, and the rotation angle measured by the laser interferometer 12 is supplied to the stage controller 13.
- the stage controller 13 controls the position of the wafer stage 9 via the drive system 14 according to the position measurement signal PDS under the control of the main control system 15.
- the position measurement information PDS is output to the main control system 15.
- the main control system 15 outputs a control signal for controlling the position of the wafer stage 9 to the stage controller 13 while monitoring the supplied position measurement signal PDS.
- the position measurement signal PDS output from the laser interferometer 12 is also output to a laser step alignment (LSA) calculation unit 25 described later.
- the exposure apparatus 100 includes a TTL alignment optical system having as components a laser light source 16, a beam shaping optical system 17, a mirror 18, a lens system 19, a mirror 20, a beam splitter 21, an objective lens 22, a mirror 23, a light receiving element 24, an LSA calculation unit 25, and the projection lens PL.
- the laser light source 16 is, for example, a He-Ne laser, and emits a laser beam LB of red light (for example, a wavelength of 632.8 nm) that does not expose the photoresist applied on the wafer W.
- This laser beam LB passes through a beam shaping optical system 17 including a cylindrical lens and the like, and enters an objective lens 22 via a mirror 18, a lens system 19, a mirror 20, and a beam splitter 21.
- the laser beam LB transmitted through the objective lens 22 is reflected by a mirror 23 provided below the reticle R and obliquely to the XY plane, and is incident on the periphery of the field of view of the projection lens PL in parallel with the optical axis AX. Then, the wafer W is irradiated vertically through the center of the pupil EP of the projection lens PL.
- the laser beam LB is condensed as a slit-like spot light SP0 in a space in the optical path between the objective lens 22 and the projection lens PL by the function of the beam shaping optical system 17.
- the projection lens PL re-images the spot light SP0 on the wafer W as a spot SP.
- the mirror 23 is fixed so as to be outside the periphery of the pattern area PA of the reticle R and within the field of view of the projection lens PL. Therefore, the slit-shaped spot light SP formed on the wafer W is located outside the projected image of the pattern area PA.
- in order to detect a mark on the wafer W using the spot light SP, the wafer stage 9 is moved horizontally with respect to the spot light SP in the XY plane.
- specular reflected light, scattered light, diffracted light, etc. are generated from the mark, and the light amount changes depending on the relative position between the mark and the spot light SP.
- Such optical information travels backward along the transmission path of the laser beam LB, and reaches the light receiving element 24 via the projection lens PL, mirror 23, objective lens 22, and beam splitter 21.
- the light receiving surface of the light receiving element 24 is arranged on a pupil image plane substantially conjugate to the pupil EP of the projection lens PL, has an insensitive area for specularly reflected light from the mark, and receives only scattered light and diffracted light.
- FIG. 2 is a diagram showing a distribution of optical information from a mark on wafer W on pupil EP (or pupil image plane).
- above and below (Y-axis direction) the specularly reflected light D0 extending in a slit shape in the X-axis direction, the positive first-order diffracted light +D1 and second-order diffracted light +D2, and the negative first-order diffracted light -D1 and second-order diffracted light -D2 are arranged, and the scattered light Dr from the mark edge is located to the left and right (X-axis direction) of the specularly reflected light D0.
- the diffracted light ±D1 and ±D2 occurs only when the mark is a diffraction grating mark.
- the light receiving element 24 has four independent light receiving surfaces 24a, 24b, 24c, 24d in the pupil image plane, as shown in FIG. 3, arranged so that the light receiving surfaces 24a, 24b receive the scattered light ±Dr, and the light receiving surfaces 24c, 24d receive the diffracted light ±D1, ±D2.
- FIG. 3 is a view showing the light receiving surface of the light receiving element 24.
- since the numerical aperture (NA) on the wafer W side of the projection lens PL is large, the third-order diffracted light generated from the diffraction grating mark also passes through the pupil EP, and the light receiving surfaces 24c and 24d receive the third-order diffracted light as well.
- each photoelectric signal from the light receiving element 24 is input to the LSA calculation unit 25 together with the position measurement signal PDS output from the laser interferometer 12, and mark position information AP1 is created.
- the LSA calculation unit 25 samples and stores the photoelectric signal waveform from the light receiving element 24 when the wafer mark is scanned with respect to the spot light SP, based on the position measurement signal PDS, and analyzes the waveform.
- the mark position information AP1 is output as the coordinate position of the wafer stage 9 at which the center of the mark coincides with the center of the spot light SP.
- in the optical path of the TTL alignment optical system (16, 17, 18, ...) in FIG. 1, the solid line represents the imaging relationship with the wafer W, and the broken line represents the conjugate relationship with the pupil EP.
- exposure apparatus 100 includes an alignment optical system of an off-axis system (hereinafter, referred to as an alignment sensor) on the side of projection optical system PL.
- This alignment sensor performs signal processing (including image processing) on a signal (n-dimensional signal) captured near the alignment mark on the substrate surface, and detects mark position information.
- this is an alignment sensor of the FIA (Field Image Alignment) type.
- search alignment measurement and fine alignment measurement are performed by the alignment sensor.
- search alignment measurement is a process of detecting a plurality of search alignment marks formed on the wafer and detecting the rotation amount of the wafer with respect to the wafer holder and its displacement in the XY plane.
- a technique of using a preset reference pattern (template) and detecting a predetermined pattern corresponding to the template is used.
- fine alignment measurement (hereinafter sometimes simply referred to as "fine alignment") is a process of detecting an alignment mark for fine alignment formed corresponding to each shot area and positioning each exposure shot.
- as the image processing method for fine alignment, a method of extracting the edges of a mark and detecting its position (edge measurement method) is used.
- the image processing method is not limited to the method of the present embodiment, but may be any of the template matching method, the edge measurement method, and other image processing methods.
- the observation magnification at the time of search alignment and that at the time of fine alignment may be the same, or the magnification at the time of fine alignment may be set higher than that at the time of search alignment.
- the alignment sensor includes a halogen lamp 26 for emitting irradiation light for illuminating the wafer W, a condenser lens 27 for condensing illumination light emitted from the halogen lamp 26 to one end of an optical fiber 28, And an optical fiber 28 for guiding the illumination light.
- the halogen lamp 26 is used as the light source because the wavelength range of its illumination light, 500 to 800 nm, is a range to which the photoresist applied to the upper surface of the wafer W is not sensitive, and because its wide wavelength band reduces the influence of the wavelength dependence of the reflectance at the surface of the wafer W.
- the illumination light emitted from the optical fiber 28 passes through a filter 29 that cuts the photosensitive (short) wavelength region of the photoresist applied on the wafer W and the infrared wavelength region, and reaches the half mirror 31 via a lens system 30.
- the illumination light reflected by the half mirror 31 is reflected by the mirror 32 almost parallel to the X-axis direction, then enters the objective lens 33, is reflected by a prism (mirror) 34 fixed around the lower part of the lens barrel of the projection lens PL so as not to shield its field of view, and irradiates the wafer W vertically.
- an appropriate illumination field stop is provided at a position conjugate with the wafer W with respect to the objective lens 33, in the optical path from the exit end of the optical fiber 28 to the objective lens 33.
- the objective lens 33 is set to be telecentric, and an image of the exit end of the optical fiber 28 is formed on the surface 33a of its aperture stop (equivalent to the pupil), providing Köhler illumination.
- the optical axis of the objective lens 33 is set to be vertical on the wafer W, so that the mark position does not shift due to tilt of the optical axis when detecting a mark.
- the reflected light from the wafer W passes through the prism 34, the objective lens 33, the mirror 32, and the half mirror 31, and an image is formed on the index plate 36 by the lens system 35.
- the index plate 36 is arranged conjugate with the wafer W by the objective lens 33 and the lens system 35, and, as shown in FIG. 4, has linear index marks 36a, 36b, 36c, 36d extending in the X-axis direction and the Y-axis direction in a rectangular transparent window.
- FIG. 4 is a sectional view of the index plate 36.
- the image of the mark on the wafer W is formed in the transparent window 36e of the index plate 36, and the image of the mark and the index marks 36a, 36b, 36c, 36d are formed on the image sensor 40 via the relay systems 37, 39 and the mirror 38.
- as the image sensor 40 (photoelectric conversion means, photoelectric conversion element), a device such as a CCD that converts an image incident on its imaging surface into a photoelectric signal (image signal, image data, data, signal) is used.
- the signal (n-dimensional signal) output from the image sensor 40 is input to the FIA operation unit 41 together with the position measurement signal PDS from the laser interferometer 12.
- a two-dimensional image signal is obtained in the image sensor 40 and is input to the FIA operation unit 41 and used.
- the signals obtained by the two-dimensional CCD are integrated (projected) in the non-measurement direction and used as a one-dimensional projection signal for measurement in the measurement direction.
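The integration (projection) of the two-dimensional CCD signal into a one-dimensional measurement signal can be sketched as follows (a minimal illustration with NumPy; the function name and axis convention are assumptions, not part of the patent):

```python
import numpy as np

def project_signal(image: np.ndarray, measurement_axis: int = 0) -> np.ndarray:
    """Integrate a 2-D CCD image along the non-measurement direction,
    yielding a 1-D projection signal for the measurement direction."""
    non_measurement_axis = 1 - measurement_axis
    return image.sum(axis=non_measurement_axis)

frame = np.array([[1, 2, 3],
                  [4, 5, 6]])
# Measurement along columns (axis 1): sum over the rows
profile = project_signal(frame, measurement_axis=1)  # → array([5, 7, 9])
```

Summing over the non-measurement direction averages out noise while preserving edge positions along the measurement direction.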
- the format of the signal obtained by the image sensor 40 and the signal to be processed in the subsequent signal processing are not limited to the example of the present embodiment.
- two-dimensional image processing may be performed to use two-dimensional signals for measurement.
- it may be configured to obtain a three-dimensional image signal and perform three-dimensional image processing.
- the CCD signal may also be expanded into n dimensions (n is an integer, n ≥ 1) to generate, for example, an n-dimensional cosine component signal, an n-dimensional sine signal, or an nth-order frequency signal.
- the present invention is also applicable to a device that performs position measurement using the n-dimensional signal.
- the FIA operation unit 41 detects an alignment mark from the input image signal, and obtains the shift of the mark image of the alignment mark with respect to the index marks 36a to 36d. From the stop position of the wafer stage 9 represented by the position measurement signal PDS, it outputs mark center detection position information AP2 of the wafer stage 9 for the state in which the image of the mark formed on the wafer W is accurately positioned at the center of the index marks 36a to 36d.
- the FIA operation unit 41 performs the position detection of a predetermined alignment mark image and the detection of a deviation thereof during each of the search alignment and the fine alignment.
- the position of the mark and the deviation are detected using the template matching method at the time of search alignment, and using the edge detection processing method at the time of fine alignment.
- FIG. 5 is a block diagram showing the internal configuration of the FIA operation unit 41.
- the FIA operation unit 41 has an image signal (pattern signal) storage unit 50, a template data storage unit 52, a data processing unit 53, and a control unit 54.
- the image signal storage unit 50 stores an image signal (pattern signal) input from the image sensor 40.
- the template data storage unit 52 stores template data used in template matching processing performed at the time of search alignment, for example.
- the template data is reference pattern data for performing pattern matching with an image signal (pattern signal) stored in the image signal storage unit 50 to detect a mark on the wafer.
- the term "mark" used in this specification is a concept that includes, in addition to a mark pattern formed specifically for alignment, a pattern that is a part of a circuit or wiring and is set as a reference pattern.
- the template data may be created by a computer system or the like separate from the exposure apparatus 100 and stored in the template data storage unit 52, or may be created by the FIA operation unit 41 based on an image (pattern signal) captured by the alignment sensor and stored in the template data storage unit 52.
- the data processing unit 53 performs desired image processing (signal processing), such as template matching and edge detection processing, on the image signal (pattern signal) stored in the image signal (pattern signal) storage unit 50, and thereby detects marks, position information, and misalignment information.
- the data processing unit 53 performs matching between the image signal (pattern signal) stored in the image signal storage unit 50 and the template stored in the template data storage unit 52, and detects a mark in the image signal (pattern signal).
- the data processing unit 53 sequentially scans the field of view with a search area corresponding to the size of the pattern to be detected, and at each position compares the image signal (pattern signal) of that area with the template data. The similarity and correlation between the patterns (between the images (pattern signals)) are detected as evaluation values, and when the similarity is equal to or greater than a predetermined threshold, it is determined that a mark exists in that area, that is, that the mark image is included in the image (pattern signal) at that location.
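The scan-and-compare performed by the data processing unit 53 can be sketched as a correlation search with a threshold. This is a minimal Python illustration, not the patent's implementation: the normalized cross-correlation score, the function name, and the threshold value are assumptions (the patent's own evaluation values are its equations (1) and (2)):

```python
import numpy as np

def match_template(image: np.ndarray, template: np.ndarray, threshold: float = 0.8):
    """Scan `image` with `template` at every position and return the
    positions (y, x, score) whose normalized correlation exceeds `threshold`."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template.astype(float) - template.mean()
    hits = []
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            w = image[y:y + th, x:x + tw].astype(float)
            w = w - w.mean()
            denom = np.sqrt((w * w).sum() * (t * t).sum())
            score = (w * t).sum() / denom if denom > 0 else 0.0
            if score >= threshold:
                hits.append((y, x, score))
    return hits
```

A position where the score reaches the threshold is treated as containing the mark image, mirroring the similarity test described above.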
- the data processing unit 53 determines at which position in the field of view the mark is located. As a result, information AP2 on the mark center position of the wafer stage 9 when the mark image formed on the wafer W is accurately positioned at the center of the index marks 36a to 36d is obtained.
- the control unit 54 controls the operation of the entire FIA operation unit 41 so that the storage and reading of the image signal in the image signal storage unit 50, the storage and reading of the template data in the template data storage unit 52, and the matching, edge detection, and other processing in the data processing unit 53 described above are performed appropriately.
- FIG. 6 is a flowchart showing the template creation processing.
- the template data creation processing described below, shown in the flowchart of FIG. 6, is preferably performed by executing a program in an external computer device or the like separate from the exposure apparatus 100. However, the present invention is not limited to this, and the processing may be performed in the exposure apparatus 100, for example by the data processing unit 53 in the FIA operation unit 41.
- the template creation method of the present embodiment generates template data from design data without using image data (a pattern signal) obtained by imaging a wafer.
- the design value data (design value information) is converted into a two-dimensional image (pattern signal), and the data is binarized to generate binary image data (pattern signal) (step S110).
- Most of the design value data is represented by a combination of two colors, for example, white and black, so that binarization can be easily performed.
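Step S110 (converting the design value data to a two-dimensional image and binarizing it) might look like the following minimal sketch; the function name and the threshold value are illustrative assumptions:

```python
import numpy as np

def binarize(design_image: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Binarize a grey-scale rendering of the design value data.
    Design data is essentially two-tone, so a fixed threshold suffices."""
    return (design_image >= threshold).astype(np.uint8)

layout = np.array([[0, 255, 250],
                   [10, 0, 255]])
mask = binarize(layout)  # → [[0, 1, 1], [0, 0, 1]]
```

Because the rendered design data is already close to pure black and white, the choice of threshold is uncritical, which is exactly why the text notes that binarization "can be easily performed".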
- the range of the input data is set for the binarized design value data (step S120).
- the range to be imaged varies within a predetermined range due to errors caused by the wafer loading operation, errors caused by the pattern manufacturing process, and the like. Even with such variation, it is preferable to set as the range of the input data an area that is always included in the imaging range, because a reference pattern is then always included in a captured image (pattern signal). However, if processing for the case where the reference pattern is not included in the captured image is provided at the time of search alignment, the input data range can be expanded.
- the input data range may be set arbitrarily according to such circumstances. In the present embodiment, for example, an area AreaI as shown in FIG. 7A is set on the binary image (pattern signal) data.
- an area having the same size as a predetermined template size is set in the input data range and used as a temporary template (first partial image information) (step S130).
- an area AreaT (xi, yi) having the same size SizeT as the template is set in the input data range AreaI.
- (xi, yi) is the coordinate value of the reference point of the area AreaT (in the example of FIG. 7A, the coordinate value of the upper left corner of the area AreaT).
- the input data range AreaI is scanned with a window of the same size SizeT as the template (temporary template AreaT), and a correlation value between the image (pattern signal) of the area AreaS (xj, yj) within the window at each position (second partial image (pattern signal) information) and the image (pattern signal) of the temporary template AreaT (xi, yi) is computed (step S140). Regions where the correlation value shows a sufficiently high peak are then detected.
- the correlation search calculation is performed based on equation (1) or (2), where f(x) and g(x) are luminance values.
- the setting of such a temporary template (step S130) and the detection of the correlation value peaks (step S140) are performed for all areas (areas of the same size as the template) that can be set within the input data range, and the regions showing a correlation value peak for each area are detected (step S150).
- regions showing a correlation value peak with respect to the same temporary template can be regarded as the same image, that is, as regions constituted by the same pattern. Therefore, by detecting the regions showing correlation value peaks, the regions composed of the same pattern are detected for each pattern. For example, as shown in FIG. 8A, five areas composed of pattern A, two areas composed of pattern B, and one area composed of pattern C are detected in the area AreaI.
- a pattern that is actually used as a template is selected from the patterns (step S160).
- specifically, a pattern that exists solely (uniquely), producing no correlation peak other than at its own position, is detected.
- the image data of the area AreaT (xi, yi) represented by pattern C, which exists at only one location, is selected as the template.
- on the other hand, suppose that a pattern O having a shape similar to pattern C exists. Pattern O may or may not be detected as a correlation peak with respect to pattern C itself. If pattern O and pattern C are both included in the input data range AreaI, the difference between the correlation value of pattern C itself and the correlation value of pattern O when pattern C is used as a temporary template becomes small. In the case of FIG. 8B, therefore, a pattern without such a similar counterpart is selected as the template.
- template data to be used in template matching at the time of search alignment is created in this way.
- the created template data is stored in the template data storage unit 52 of the FIA operation unit 41 together with the position information of the area.
- In the above description, the image data obtained by converting the design data is binarized before use. However, when it is known in advance at the design stage that the pattern has a plurality of (N) levels in the height direction, the data may be converted into N values instead.
- In the above description, the correlation value is obtained based on Equation (1) or Equation (2); however, the SSDA (Sequential Similarity Detection Algorithm) method or the like may also be applied.
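The SSDA method mentioned here accumulates absolute luminance differences and abandons a candidate position as soon as the running error exceeds a threshold, which is what makes it cheaper than a full correlation. A minimal sketch (the abort-threshold handling is an assumption):

```python
def ssda_distance(f, g, abort_threshold):
    """Sequential Similarity Detection Algorithm: accumulate absolute
    luminance differences element by element and abandon the candidate
    as soon as the running sum exceeds `abort_threshold` (returning
    None for a rejected candidate)."""
    total = 0.0
    for a, b in zip(f, g):
        total += abs(float(a) - float(b))
        if total > abort_threshold:
            return None  # candidate rejected early
    return total
```

Candidates that survive with the smallest accumulated difference correspond to the correlation peaks of Equations (1) and (2).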
- In the above description, the template size SizeT is set to a predetermined size in advance, and the temporary templates are set and the correlation value search is performed with this size.
- However, this size (the size of the partial image information) may also be a variable parameter. That is, a size range or a set of size candidates is determined in advance, and the size of the template is sequentially changed within that range. Then, by scanning the input data range for each size and sequentially detecting temporary templates by the above-described method, the most suitable temporary template is finally used as the template.
- In this way, the template size can be determined automatically, and an appropriate template can be detected with the template size taken into consideration.
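The size sweep described above might be organized as below; `find_unique` is a stand-in for the per-size scan and is assumed to return a (uniqueness margin, position) pair, or None when no unique pattern exists at that size:

```python
def best_template_over_sizes(find_unique, sizes):
    """Try each allowed template size in turn and keep the candidate
    with the best uniqueness margin. `find_unique(size)` abstracts the
    per-size scan of the input data range described in the text."""
    best = None
    for size in sizes:
        result = find_unique(size)
        if result is None:
            continue  # no unique pattern at this size
        margin, pos = result
        if best is None or margin > best[0]:
            best = (margin, pos, size)
    return best
```

The returned triple carries the margin, the position, and the size that won, so the template size is chosen automatically.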
- First, the reticle R and the wafer W are transported onto the reticle stage 2 and the wafer stage 9, respectively, and are mounted and supported on each stage. At this time, the wafer W is positioned (pre-aligned) with respect to the wafer stage 9 using an orientation flat or notch formed on the wafer, and is then held on the wafer stage 9 via the wafer holder 8 (step S210).
- search alignment is performed by the alignment sensor to determine the amount of rotation and XY shift of the wafer W mounted on the wafer holder 8 with respect to the wafer holder 8 (step S220).
- In the search alignment, generally, reference patterns at a plurality of mutually distant locations on the wafer W are detected, and the rotation amount, XY shift, and the like of the wafer are obtained based on the positional relationship between the respective reference patterns.
- First, an image (field image, pattern signal) of a predetermined area including the reference pattern on the wafer W is captured based on the design position information of the reference pattern (step S221).
- the main control system 15 drives the wafer stage 9 via the stage controller 13 and the drive system 14 so that the reference pattern enters the field of view of the alignment sensor.
- the illumination light of the alignment sensor is illuminated on wafer W.
- Specifically, the illumination light emitted from the halogen lamp 26 is condensed onto one end of the optical fiber 28 by the condenser lens 27 and enters the optical fiber 28.
- the incident illumination light propagates through the optical fiber 28 and exits from the other end, passes through the filter 29, and reaches the half mirror 31 via the lens system 30.
- The illumination light reflected by the half mirror 31 is reflected by the mirror 32 almost horizontally in the X-axis direction, enters the objective lens 33, is then reflected by the prism 34, which is fixed near the lower part of the barrel of the projection lens PL so as not to obstruct the field of view, and is irradiated perpendicularly onto the wafer W.
- the reflected light from wafer W is imaged on index plate 36 by lens system 35 via prism 34, objective lens 33, mirror 32, and half mirror 31.
- the image of the mark on the wafer W and the index marks 36a, 36b, 36c, 36d form an image on the image sensor 40 via the relay systems 37, 39 and the mirror 38.
- the image data formed on the image sensor 40 is stored in the image signal storage unit 50 of the FIA operation unit 41 as an image in the visual field area.
- The data processing unit 53 of the FIA operation unit 41 scans the visual field image (pattern signal) stored in the image signal storage unit 50 with a window of the same size as the template stored in the template data storage unit 52, and performs matching with the template data (step S222).
- The evaluation value of the correlation between the image in the window set at each scan position and the template data is obtained by the above-described Equation (1) or Equation (2). When the evaluation value is larger than a predetermined threshold value, it is determined that the reference pattern of the template exists at that position, and the position information of the reference pattern is obtained.
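A sketch of this windowed matching step is shown below. It uses a brute-force scan with the normalized correlation as the evaluation value; the function names and the default threshold are assumptions, and production implementations typically use FFT-based correlation instead:

```python
from math import sqrt

def ncc(f, g):
    """Zero-mean normalized cross-correlation of two flat sequences."""
    mf, mg = sum(f) / len(f), sum(g) / len(g)
    df = [v - mf for v in f]
    dg = [v - mg for v in g]
    denom = sqrt(sum(v * v for v in df) * sum(v * v for v in dg))
    return sum(a * b for a, b in zip(df, dg)) / denom if denom else 0.0

def locate_reference(field, template, threshold=0.8):
    """Scan the field image with a window the size of the template and
    return the position with the highest evaluation value, or None if
    no evaluation value exceeds `threshold`."""
    H, W = len(field), len(field[0])
    h, w = len(template), len(template[0])
    flat_t = [v for row in template for v in row]
    best_pos, best_score = None, threshold
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            patch = [field[y + i][x + j] for i in range(h) for j in range(w)]
            score = ncc(patch, flat_t)
            if score > best_score:
                best_pos, best_score = (y, x), score
    return best_pos
```

Returning None corresponds to the case where the reference pattern is judged absent from the field of view.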
- This sequence of processing is repeatedly performed on a predetermined number of regions on the wafer W set in advance for the search alignment (steps S221 to S223).
- When the detection of the positions of the reference patterns in these regions is completed (step S223), a predetermined calculation is performed based on the positional relationship between the respective reference patterns, and the rotation amount, XY deviation, and the like of the wafer W are obtained (step S224).
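As a sketch of the "predetermined calculation" in step S224, the two-point case can be written as follows. The patent does not give the formula, so this is a minimal illustrative solution; production systems measure more points and fit by least squares:

```python
import math

def wafer_rotation_and_shift(design_pts, measured_pts):
    """Estimate the wafer rotation angle (radians) and XY shift from
    two reference-pattern positions, comparing measured coordinates
    with their design coordinates (minimal two-point solution)."""
    (dx0, dy0), (dx1, dy1) = design_pts
    (mx0, my0), (mx1, my1) = measured_pts
    ang_design = math.atan2(dy1 - dy0, dx1 - dx0)
    ang_meas = math.atan2(my1 - my0, mx1 - mx0)
    theta = ang_meas - ang_design
    # Shift of the first point after removing the rotation about the origin.
    c, s = math.cos(theta), math.sin(theta)
    rx0 = c * dx0 - s * dy0
    ry0 = s * dx0 + c * dy0
    return theta, (mx0 - rx0, my0 - ry0)
```

The angle comes from the direction of the vector between the two marks, and the shift from the residual displacement of the first mark.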
- a fine alignment for detecting the displacement of each exposure shot on the wafer W is performed by the alignment sensor (step S230).
- That is, the alignment marks for fine alignment formed corresponding to the exposure shots on the wafer W are detected, the positions of the alignment marks are obtained, and the rotation amount and displacement of each shot area are detected.
- In the present embodiment, each alignment mark is detected and its position information is obtained by performing signal processing by the edge measurement method on the detected waveform signal of the image (pattern signal).
- the edge measuring method is disclosed in, for example, Japanese Patent Application Laid-Open No. 4-165603, and the detailed description thereof is omitted.
- This fine alignment may be performed by detecting all of the alignment marks provided corresponding to each shot area on the wafer W, or by selecting several shot areas and detecting only the alignment marks corresponding to the selected shot areas.
- The latter is performed by statistical calculation processing (EGA processing).
- this image capture at the time of fine alignment may be performed by setting the magnification of the alignment sensor to a higher magnification than at the time of the search alignment described above.
- the alignment mark to be detected is usually a mark smaller than the reference pattern used at the time of the search alignment described above, that is, a mark with higher definition.
- Then, the data processing unit 53 performs processing such as EGA processing based on the search alignment result and the fine alignment result, and the position of each shot area is calculated (step S240).
- The main control system 15 aligns the reticle R with each shot area of the wafer W based on the baseline amount managed in advance and the calculated shot area positions. Then, the pattern image of the reticle R is accurately superimposed on the shot area, and exposure is performed (step S250).
- design data is converted into two-dimensional image data (pattern signal), a unique pattern is extracted from the obtained image data (pattern signal), and a template is extracted.
- the template is generated by specifying the creation area and extracting the image data (pattern signal) of the area. Therefore, even when actual data cannot be obtained, or when a mark to be used as a template is not formed, an effective template can be generated.
- template matching and search alignment can be appropriately performed using this template.
- This template can be automatically generated from the design data. Therefore, the load on the operator can be reduced. Further, since the template is generated from the design data, the template can be created before the wafer is manufactured, and the exposure process can be performed efficiently.
- In the first embodiment described above, a region for creating the reference pattern is specified from the design value data, and the template is generated from the image (pattern signal) data of that region in the two-dimensional image (pattern signal) generated by converting the design value data. This method is also applicable and effective when image (pattern signal) data of an actually manufactured wafer is available.
- a template generation method by a similar method when data of an actually manufactured wafer exists will be described as a second embodiment of the present invention.
- the configuration of the exposure apparatus and a series of exposure processing methods including template matching at the time of search alignment are the same as those in the first embodiment. Therefore, the description is omitted.
- the same reference numerals as those used in the first embodiment are used.
- FIG. 10 is a flowchart illustrating template creation processing according to the second embodiment of this invention.
- First, an image of the manufactured wafer is captured to obtain image (pattern signal) data (step S310).
- the design value data (design value information) is converted into a two-dimensional image (pattern signal) to generate two-dimensional image (pattern signal) data (step S320).
- an input data range is set on the two-dimensional image (pattern signal) data of the design values.
- After the input data range is set, an area having the same size as a predetermined template size is set in the input data range as a temporary template (step S340).
- Next, the entire area of the input data range is subjected to a correlation search calculation with the temporary template based on Equation (1) or Equation (2) described above, and positions where the correlation value exceeds a predetermined threshold value and forms a peak are detected (step S350).
- Temporary templates are set (step S340) and correlation value peaks are detected (step S350) sequentially for the entire region within the input data range, and the regions indicating correlation value peaks for each region are detected (step S360).
- When the patterns existing within the input data range have been detected by the processing up to step S360, a pattern that has no peak other than itself and exists singly (uniquely) is detected, and its position information (area information) is obtained (step S370). When there are a plurality of unique patterns, the pattern having the largest correlation value difference from the pattern with the next highest correlation value is detected, and its position information is obtained.
- the extracted template is stored in the template data storage unit 52 of the FIA operation unit 41 together with the position information of the area, and is used in template matching at the time of search alignment.
- Note that the series of processes up to extracting the template in step S380 need not be performed as one continuous sequence.
- For example, the process of imaging the wafer and the process of detecting the unique pattern area may be performed separately in advance.
- Each process may be performed using the exposure apparatus 100, or may be performed using separate apparatuses such as an external computer apparatus or an imaging apparatus.
- According to the template creating method of the present embodiment, an effective template can be generated automatically using captured image (pattern signal) data of an actual wafer. Therefore, the load on the operator can be reduced, and an effective template having a high correlation with the patterns formed on the actual wafer can be generated.
- In addition, since the detection of the region of the unique pattern to be used as the template is performed based on the design data, it can be performed in advance, and the template generation processing and the exposure processing can be performed efficiently.
- Incidentally, the range of the observation field of view imaged by the alignment sensor during the search alignment fluctuates due to errors in the wafer loading position, errors in the pattern formation position during wafer processing, and the like, and is therefore not a constant area.
- The reference pattern (template) used in template matching must be extracted from the range that is always included in the observation field of view; however, if the observation field fluctuates greatly, the area that is always included becomes very small or even nonexistent, making it impossible to extract the reference pattern.
- As shown in the figure, let the maximum range that can become the observation field be OR_Area, and let the common area that is always included in the observation field be AND_Area. If a unique pattern B exists in the common area AND_Area, it can be extracted as a template. However, if no unique pattern exists in the common area AND_Area, as shown in FIG. 11C, a template cannot be generated.
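The relationship between these two regions can be made concrete: if the observation field can be displaced by at most a maximum error in each axis, the common area shrinks and the maximum range grows by twice that error. This is a geometric sketch under that assumption, not a formula from the patent:

```python
def common_and_maximum_areas(field_w, field_h, max_err_x, max_err_y):
    """Given the sensor field size and the maximum placement error of
    the observation field in each axis, return the sizes of the common
    area AND_Area (always inside the field wherever it lands) and the
    maximum range OR_Area (everything the field can possibly cover)."""
    and_area = (max(field_w - 2 * max_err_x, 0),
                max(field_h - 2 * max_err_y, 0))
    or_area = (field_w + 2 * max_err_x, field_h + 2 * max_err_y)
    return and_area, or_area
```

When the placement error exceeds half the field size, the common area collapses to zero, which is exactly the situation of FIG. 11C where no template can be extracted from AND_Area.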
- In the following description, each partial pattern constituting the template is referred to as an element.
- the configuration of the exposure apparatus, the overall flow of the exposure processing, and the like are substantially the same as those of the above-described first and second embodiments, and a description thereof will be omitted. Further, when referring to the configuration of the exposure apparatus, the same reference numerals as those used in the first embodiment are used.
- FIG. 12 is a flowchart illustrating a template creation process according to the third embodiment of the present invention.
- image (pattern signal) data is fetched (step S410).
- The image (pattern signal) data may be generated by converting design value data (design value information) into a two-dimensional image (pattern signal), as in the first and second embodiments, or may be acquired by imaging an actually manufactured wafer.
- a range of input data is set for the captured image (pattern signal) data (step S420). As described above, the range imaged during the search alignment varies within a predetermined range.
- Here, the maximum range of the imageable area is referred to as the maximum field of view (maximum input data range) OR_Area.
- Next, the entire area of the maximum visual field range OR_Area is subjected to a correlation search calculation with a temporary template, and positions where the correlation value exceeds a predetermined threshold value and forms a peak are detected (step S440).
- The evaluation formula for the correlation search calculation may be any formula, including the correlation coefficient calculation shown in Equation (1) or (2) and the SSDA method.
- Next, a unique element or a somewhat unique element is selected (step S450).
- The element selection is performed by first checking whether the number of detected peaks is equal to or less than a predetermined threshold value TH_Peak. If the number of peaks is greater than the threshold value TH_Peak, the element is regarded as having low uniqueness and is not used as a template component.
- elements are narrowed down using features such as the SN ratio, the detected edge amount, entropy, variance, and moment.
- Here, the SN ratio is represented by the difference between the highest correlation value and the second highest correlation value (the difference between feature values). This is because the larger the difference, the more stable the feature pattern.
- The edge amount is represented by the number of detected edges or the number of CCD pixels that detected edges (the pixel area on the CCD occupied by edges).
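A sketch of how some of these ranking features might be computed for a candidate element follows. The exact definitions used by the patent are not given, so these are common textbook versions (variance, histogram entropy, and a simple neighbor-difference edge count):

```python
from collections import Counter
from math import log2

def element_features(patch):
    """Ranking features for a candidate element: luminance variance,
    histogram entropy, and a simple edge count (pairs of horizontally
    or vertically adjacent pixels that differ)."""
    values = [v for row in patch for v in row]
    n = len(values)
    mean = sum(values) / n
    variance = sum((v - mean) ** 2 for v in values) / n
    probs = [c / n for c in Counter(values).values()]
    entropy = -sum(p * log2(p) for p in probs)
    edges = sum(1 for row in patch for a, b in zip(row, row[1:]) if a != b)
    edges += sum(1 for r0, r1 in zip(patch, patch[1:])
                 for a, b in zip(r0, r1) if a != b)
    return {"variance": variance, "entropy": entropy, "edges": edges}
```

A flat patch scores zero on all three features and would be discarded, while a high-contrast element with many edges carries more positioning information.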
- For example, as shown in FIG. 13A, there may be a plurality of patterns, including a pattern that is shifted (protruding) with respect to the area.
- In such a case, the correlation coefficient and the evaluation value of the SSDA method show similarly high values for each pattern.
- By narrowing down with the features described above, the pattern containing the most positioning information can be selected. In this case, an element in which pattern B is appropriately cut out, as shown in FIG. 13B, is selected.
- Note that the narrowing down of elements based on features such as the SN ratio, detected edge amount, entropy, variance, and moment may be combined with the evaluation value calculation in the correlation search calculation processing in step S440.
- The processing of setting such a temporary template (step S430), detecting the correlation value peaks (step S440), and selecting a unique element (step S450) is performed for all areas of the element size ElementSize that can be set within the maximum visual field range OR_Area (step S460). As a result, relatively unique elements (patterns) within the maximum field of view OR_Area are detected.
- If either element B or element C is present in the common area AND_Area of the visual field, that element can be used as a template by the conventional method. However, if these elements B and C are not included in the common area AND_Area, as in the example shown in FIG. 14, both of the selected patterns B and C are registered as templates.
- For element A, which also shows correlation peaks detected in step S440, four identical instances exist in the maximum visual field range OR_Area, so even if one of them is detected, its position cannot be determined. Such an element is not selected as a unique element in step S450 and does not become a template.
- Such a pattern, of which a plurality of identical instances usually exists in the maximum visual field range OR_Area, cannot be used as a template by itself.
- However, by combining the two patterns B, a pattern that can be uniquely specified within the maximum visual field range OR_Area can be configured. Therefore, this element B is registered as a template in a format that also includes information on the positional relationship between these two patterns B.
- In this way, by creating the template in a format that also includes information on the positional relationship of each element, the position can be detected based on the template matching result of the elements. That is, these elements A, B, and C are registered as templates in a format that also includes such positional relationship information for each pattern.
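Matching a multi-element template of this kind amounts to checking that the detected element positions reproduce the stored relative offsets. The sketch below uses a hypothetical data layout (element name to offset from a template origin, and per-element detection lists) purely for illustration:

```python
def match_element_template(detections, template_offsets, tol=1):
    """Find a placement of the template origin such that every element
    is detected at its stored offset (within `tol` pixels); return the
    origin position in the field image, or None if no placement fits."""
    names = list(template_offsets)
    first = names[0]
    for y0, x0 in detections.get(first, []):
        oy, ox = template_offsets[first]
        origin = (y0 - oy, x0 - ox)
        ok = True
        for name in names[1:]:
            ey, ex = template_offsets[name]
            want = (origin[0] + ey, origin[1] + ex)
            if not any(abs(py - want[0]) <= tol and abs(px - want[1]) <= tol
                       for (py, px) in detections.get(name, [])):
                ok = False
                break
        if ok:
            return origin
    return None
```

Even when each element occurs several times on its own, only one placement of the origin makes all offsets agree, which is what restores uniqueness.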
- Consider the case where the observation field of view VIEW_Area is located at the upper left, as shown in FIG. 16B, and the case where it is located at the lower right, as shown in FIG. 16C, with respect to the maximum field of view OR_Area in FIG. 16A.
- The element B and the element C are arranged in the same positional relationship in either observation visual field.
- In addition, the position can be detected by using the presence or absence of the detection of element A and its positional relationship.
- That is, the combination of element B and element C, and the combination of element A, element B, and element C, are templated so as to substantially function as one template.
- In other words, relatively unique elements detected in the maximum field of view OR_Area are appropriately combined, and the template is configured so that wherever the observation field of view VIEW_Area is located within the maximum field of view OR_Area, the template is always included in the observation field of view VIEW_Area.
- The combination of elements is determined in consideration of the size of the maximum visual field range OR_Area, the size of the observation visual field VIEW_Area, the element size ElementSize, and the arrangement of the elements detected in step S440.
- For example, the threshold TH_Peak for the number of peaks is gradually increased from 1, and the elements used to construct the template are selected sequentially.
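The progressive relaxation of TH_Peak just described might look like this; the element names, peak counts, and the stopping rule are illustrative assumptions:

```python
def select_elements_by_peak_count(peak_counts, needed):
    """Raise the peak-count threshold TH_Peak step by step from 1,
    admitting progressively less unique elements, until `needed`
    elements have been collected for the template."""
    chosen = []
    th = 1
    max_th = max(peak_counts.values())
    while len(chosen) < needed and th <= max_th:
        for name, count in sorted(peak_counts.items()):
            if count <= th and name not in chosen:
                chosen.append(name)
                if len(chosen) == needed:
                    break
        th += 1
    return chosen
```

Elements admitted at a low threshold are the most unique ones, so they are naturally given priority in the resulting template.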
- Then, the elements constituting the template are registered in a format that also includes their positional information, thereby creating the template (step S480).
- At this time, the pattern (image data and shape data) of each element, the information on the mutual positional relationship between the elements, and the information on the uniqueness and the number of occurrences of each element, as well as other information, are stored.
- the data format of the template may be any format.
- In the search alignment, position measurement is performed in two or three predetermined areas on the wafer W in which a desired pattern is obtained, and the rotation amount, XY shift, and the like of the wafer W are obtained based on the positional relationship between the measured positions.
- an image (field-of-view image, pattern signal) of a position measurement position on the wafer W is fetched based on the design value data (step S521).
- The captured field-of-view image is scanned with a window having the same size as the element size ElementSize of the template, and collation (matching) with the element data is performed (step S522).
- the evaluation value of the correlation between the image (pattern signal) in the window set at the predetermined position scanned and the element data is obtained by the above-described equation (1) or (2).
- When the evaluation value is larger than a predetermined threshold value, it is determined that the pattern of the element exists at that position, and information identifying the element and its position are stored.
- the template according to the present embodiment includes a plurality of elements. Therefore, this matching process is sequentially performed for all the elements (step S523).
- Then, the relative positional relationship between the detected elements is obtained and compared with that stored in the template.
- The position of the visual field image (pattern signal) is detected by comparison with the information on the positional relationship between the elements (step S524). At this time, if the uniqueness information, the number information, and the like of each element are stored in the template, it is preferable to refer to such information and give priority to processing unique elements, that is, elements with little error that can detect the position more reliably. The details have already been given in the description of the template configuration method with reference to FIG. 12 and are omitted here.
- The capture of such an image (pattern signal) (step S521), the matching of each element (steps S522 and S523), and the detection of the position of the visual field image (pattern signal) based on the relative positional relationship of the detected elements (step S524) are repeated for a plurality of predetermined regions on the wafer W set in advance for the search alignment (step S525).
- When the position of each area has been detected (step S526), a predetermined calculation is performed based on the positional relationship of the areas, and the rotation amount, XY shift, and the like of the wafer W are obtained.
- As described above, in the template matching method of the present embodiment, relatively unique characteristic patterns are obtained from the entire range that the input data can take (the maximum field of view OR_Area), and a template is substantially generated from them. Therefore, even when the acquisition position of the input data (view image VIEW_Area) changes greatly, or when the input data does not contain much positioning information, a template can be generated and template matching is enabled.
- When performing the search alignment measurement, the range to be imaged varies within a predetermined range (OR_Area) due to errors caused by the wafer loading operation and errors caused by the pattern manufacturing process.
- In order to create a template (reference pattern) to be used for alignment under such conditions, it is necessary to extract, as the template, a pattern that is unique within the maximum possible range OR_Area and lies in the common area AND_Area that is included in the observation visual field even when the observation visual field varies. However, it is not always possible to detect, within the common visual field area AND_Area, a unique pattern that carries sufficient information for position measurement.
- In such a case, in the present embodiment, the target area for template creation is changed (the common area AND_Area and the maximum visual field range OR_Area are shifted).
- FIG. 18 is a flowchart showing the flow of template creation processing according to the fourth embodiment of the present invention.
- First, a target area for the unique pattern detection processing is set (step S601). Specifically, a target measurement area AreaVO on the wafer for pattern detection and position measurement by the search alignment measurement is set.
- Next, based on information such as the wafer loading error and the wafer manufacturing error, which are detected in advance, the common field of view AND_Area and the maximum field of view OR_Area obtained when the wafer is loaded into the exposure apparatus 100 and the area AreaVO is imaged by the alignment sensor are detected. Then, an arbitrary area in the common field of view AND_Area is set as the unique pattern search area AreaA, and an arbitrary area including the detected maximum field of view OR_Area is set as the uniqueness verification area AreaI. In the present embodiment, the unique pattern search area AreaA is set equal to the visual field common area AND_Area, and the uniqueness verification area AreaI is set equal to the maximum visual field range OR_Area.
- Next, the image data (pattern signal information) of the uniqueness verification area AreaI (the maximum visual field range OR_Area) is obtained (step S603).
- the image data may be obtained by converting design value data (design information) into an image signal, or may be obtained by imaging the surface of a manufactured wafer.
- a unique pattern is detected using the obtained image data.
- an area AreaT having the same size as a predetermined template is set in the unique pattern search area AreaA, and this is set as a temporary template (step S605).
- The entire area of the uniqueness verification area AreaI is searched with an area AreaS of the same size as the temporary template AreaT (the same size as the template), the correlation value between the image (pattern signal information) of the area AreaS at each position and the image (pattern signal information) of the temporary template AreaT is sequentially obtained based on the above Equation (1) or Equation (2), and regions where the correlation value is a sufficiently high peak are detected (step S607).
- Temporary templates AreaT are sequentially set for the entire area within the unique pattern search area AreaA, and when the correlation value peak areas for each temporary template have been detected (step S609), a unique pattern (area) is extracted from among them (step S611). That is, a temporary template that exists singly (uniquely), having no peak other than itself in the uniqueness verification area AreaI, is extracted from the temporary templates.
- If such a unique temporary template is extracted (step S613), the image data (image signal information) of the temporary template is extracted as a template (step S615), and the template creation process ends.
- The extracted template is stored in the template data storage unit 52 of the FIA operation unit 41 of the exposure apparatus 100 (see FIGS. 1 and 5), together with the position information of the target measurement area AreaVO set in step S601, and is used for template matching during the search alignment.
- When a plurality of unique temporary templates exists, the temporary template having the largest difference between its own correlation value and the second highest correlation value is selected as the template.
- On the other hand, if no unique temporary template is extracted from the temporary templates in step S611 (step S613), the unique pattern search area AreaA and the uniqueness verification area AreaI are reset (changed) (step S620).
- Specifically, a unique pattern is detected from the area (see FIG. 19A) that is inside the current uniqueness verification area AreaI (the maximum visual field range OR_Area) and outside the unique pattern search area AreaA (the common visual field area AND_Area).
- First, an area AreaY having the same size as the template is set in the uniqueness verification area AreaI, and this is set as a temporary unique pattern area (step S621).
- The entire area of the uniqueness verification area AreaI is searched with an area AreaZ of the same size as the temporary unique pattern area AreaY (the same size as the template) (FIG. 20A), the correlation value between the image (pattern signal information) of the area AreaZ at each position and the image (pattern signal information) of the temporary unique pattern area AreaY is sequentially obtained based on the above-described Equation (1) or (2), and areas where the correlation value has a sufficiently high peak are detected (step S623).
- The processing of setting a temporary unique pattern area (step S621) and detecting the correlation value peak areas in the uniqueness verification area AreaI for that temporary unique pattern (step S623) is performed as follows: of the temporary unique pattern areas AreaY that can be set in the uniqueness verification area AreaI, those whose entire range is included in the unique pattern search area AreaA (such as Y1 and Y2 in the figure) are excluded, and the processing is performed for all remaining temporary unique pattern areas AreaY (such as Y3) (step S625).
- From the temporary unique patterns and correlation value peak areas detected by the processing of steps S621 to S625, a unique pattern having no peak other than itself is extracted (step S627).
- If a unique pattern Q is detected in the uniqueness verification area AreaI, a new target imaging range AreaV is set so that the center of the unique pattern Q coincides with the center of the observation field of view (VIEW_Area), and the corresponding new visual field common area AND_Area_New and maximum field of view OR_Area_New are detected.
- After newly setting the visual field common area AND_Area_New (unique pattern search area AreaA_New) and the visual field maximum range OR_Area_New (uniqueness verification area AreaI_New), the process returns to step S603, and the template creation processing described above is repeated for these areas (steps S603 to S609).
- When a unique temporary template is extracted in step S611 by this processing (step S613), the image data (image signal information) of the temporary template is extracted as a template (step S615), and the template creation processing ends.
- In other words, the visual field common area AND_Area and the maximum field of view OR_Area are set as shown. The common field of view AND_Area is searched with an area AreaT of the same size as the template, each such area is made a temporary template, and whether each temporary template is unique within the maximum field of view OR_Area is verified. When a unique temporary template is detected as a result, that pattern is registered as the template as it is.
- When a unique pattern Q is detected by this processing, the target area AreaV of the search alignment measurement is reset so that this pattern Q is included in the common area of the field of view (AND_Area).
- Then, the visual field common area AND_Area_New and the visual field maximum range OR_Area_New are detected again. (Of course, the common field of view AND_Area_New is detected so as to include the pattern Q.)
- As described above, according to the template creating method of the present embodiment, when a unique pattern suitable for use as a template is not detected at the preset measurement position of the search alignment measurement, the detection of a unique pattern is repeated and continued while the unique pattern search area is changed (shifted) little by little. Therefore, there is a high possibility that an appropriate template can be created automatically, and the burden on the operator involved in template creation can be reduced.
- In addition, when changing the search area, a pattern that is likely to be a unique pattern is first searched for within the maximum range of the visual field, and the search area is changed so as to include the found pattern. Therefore, there is a high possibility that an appropriate template can be detected after the unique pattern search area is changed, and the template can be created more efficiently.
- a template can be created from both design data and image data from an actual wafer, so that convenience is improved.
- Moreover, the searchable area can be expanded. Therefore, even if the alignment accuracy and the wafer loading accuracy of the exposure apparatus are somewhat reduced, this can be compensated for by the search alignment, and as a result a higher-performance exposure apparatus can be provided.
- Note that the uniqueness criterion used in step S611 for extracting a unique pattern from which the actual template is created and the uniqueness criterion used in step S627 for changing the unique pattern search area need not be the same; a different criterion may be used for each, for example by relaxing the latter criterion relative to the former.
- The processing is terminated under an arbitrary condition, such as a limit set on the number of times the area is changed.
- Next, another method of resetting the template creation area and retrying template creation will be described with reference to FIGS. 23 to 26.
- FIG. 23 is a flowchart showing a template creation process according to the fifth embodiment of the present invention.
- First, an area for creating a template is set (step S651). Specifically, as in the fourth embodiment, the visual-field common area AND_Area and the visual-field maximum range OR_Area are detected for the target measurement area of search alignment measurement, in consideration of wafer loading errors and wafer manufacturing errors, and in this embodiment the maximum field of view OR_Area is set as the template creation area.
- Next, image data (pattern signal information) of the maximum field of view OR_Area is obtained (step S652).
- this image data may be generated from the design information, or may be obtained by imaging an actual pattern on a wafer.
- a template is extracted from the template creation area (step S653).
- Any extraction method may be used; for example, any of the first to third embodiments described above may be applied. In the present embodiment, the same method as in the first embodiment is used. That is, as described with reference to the figures of the first embodiment, an area AreaT having the same size as the template is set as a temporary template in the template creation area AreaI (the maximum field of view OR_Area), the search area (AreaI) is searched with a window AreaS of the same size as the temporary template AreaT, the correlation value between the image (pattern signal information) of the area AreaS at each position and the image (pattern signal information) of the temporary template AreaT is sequentially obtained, and areas where the correlation value is a sufficiently high peak are detected based on the above equation (1) or equation (2). Then, when correlation peak values have been detected for all the temporary templates AreaT set in the template creation area AreaI, a unique temporary template having no peak other than its own is extracted based on the correlation peak values.
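The sliding-window correlation and uniqueness test described above can be sketched in Python with NumPy. This is an illustrative reconstruction only: the brute-force search, the array names, and the 0.8 peak threshold are assumptions, and plain normalized cross-correlation stands in for the patent's equations (1) and (2).

```python
import numpy as np

def ncc_map(image, template):
    """Normalized cross-correlation of `template` at every valid offset
    in `image` (a simple stand-in for the patent's equations (1)/(2))."""
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()
    tn = np.sqrt((t * t).sum())
    out = np.empty((ih - th + 1, iw - tw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            w = image[y:y + th, x:x + tw]
            wc = w - w.mean()
            denom = np.sqrt((wc * wc).sum()) * tn
            out[y, x] = (wc * t).sum() / denom if denom > 0 else 0.0
    return out

def is_unique(image, template, peak_thresh=0.8):
    """A temporary template is 'unique' if, apart from its own self-match,
    no other position in the verification area exceeds the threshold."""
    corr = ncc_map(image, template)
    peaks = np.argwhere(corr > peak_thresh)
    return len(peaks) == 1
```

In practice a library routine (e.g. an FFT-based correlator) would replace the double loop; the point here is only the "exactly one strong peak" uniqueness criterion.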
- When one or more unique temporary templates are extracted in step S653 (step S655), the most unique temporary template is selected as the actual template, its image data (image signal information) is extracted as the template (step S657), and the template creation processing ends.
- If it is determined that no unique temporary template was extracted in step S653 (step S655), the setting of the template creation area AreaI is changed in the present embodiment as well.
- Specifically, from the image data (image signal information) of the periphery of the current template creation area AreaI, the direction in which the feature amount of the pattern is large, and in which a unique pattern is therefore highly likely to exist, is estimated (step S661), and the template creation area AreaI is shifted in that direction by a desired amount to form a new template creation area AreaI_New (step S663).
- For example, suppose the current template creation area AreaI (the maximum field of view OR_Area) is set as shown in FIG. 24A. Along the four sides of this maximum field of view OR_Area, a belt-shaped region of width L1 is defined inside the area, and eight areas E1 to E8 are thereby set at the periphery of the OR_Area as shown.
- As the pattern features, for example, the pattern amount in the binarized pattern (the pixel count of a predetermined pixel value), the pattern density, frequency component detection results, the amount of edge line segments in the edge-detected pattern, the number of edges, the edge density, the frequency component detection results of the edge line segments, and the like can be used.
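A minimal sketch of how such feature amounts might be computed for the peripheral areas, assuming a grayscale image held in a NumPy array. The particular measures combined (binarized pattern amount plus a gradient-based edge amount) and their equal weighting are illustrative choices, not taken from the patent.

```python
import numpy as np

def feature_amount(region, bin_thresh=0.5):
    """Illustrative feature amount combining two of the measures the text
    lists: amount of pattern in the binarized image plus total edge
    (gradient) response. The threshold and weighting are arbitrary."""
    binary = (region > bin_thresh).astype(float)
    pattern_amount = binary.sum()
    # simple finite-difference gradient as a stand-in for edge detection
    gy, gx = np.gradient(region.astype(float))
    edge_amount = np.abs(gx).sum() + np.abs(gy).sum()
    return pattern_amount + edge_amount

def richest_peripheral_area(image, areas):
    """`areas` maps a label (e.g. 'E1'..'E8') to a (y, x, h, w) rectangle
    in the peripheral belt; returns the label with the largest feature
    amount, i.e. the direction to shift the template creation area."""
    scores = {name: feature_amount(image[y:y + h, x:x + w])
              for name, (y, x, h, w) in areas.items()}
    return max(scores, key=scores.get)
```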
- Suppose that a pattern as shown in FIG. 24B is arranged in the template creation area AreaI shown in FIG. 24A. In FIG. 24B, patterns are arranged in the areas E1, E2, E3, E5 and E7. When, for example, frequency analysis and edge detection are performed on each area and the pattern features are extracted, the feature amount of the area E1 becomes the largest.
- In this case, the template creation area AreaI is shifted toward a predetermined region set in advance in association with that peripheral area, and a new template creation area AreaI_New is set. In the example shown in FIG. 24B, the template creation area AreaI is shifted by a predetermined amount in the direction of the area E1, toward the upper side of the drawing, and the new template creation area AreaI_New is set.
- The new template creation area AreaI_New may be set in any direction and at any distance from the original area AreaI. For example, it may be set so that it partially overlaps the original template creation area AreaI as shown in FIG. 25A, or so that it does not overlap at all as shown in FIG. 25B.
- When a new template creation area is set in this way, the process returns to step S652, the image data of the template creation area AreaI_New is obtained, and the template creation processing described above is repeated for this new template creation area AreaI_New (steps S653 to S655).
- When a unique temporary template is extracted in step S653 by such processing (step S655), the image data (image signal information) of the temporary template is extracted as a template (step S657), and the template creation processing ends.
- As described above, according to the template creation method of the present embodiment, even when a unique pattern suitable for use as a template is not detected in the preset template creation area (the maximum field of view OR_Area in the present embodiment), the template search position is shifted and reset, and template creation is attempted again. Therefore, the possibility of creating a template automatically is increased, and the burden on the operator involved in template creation can be reduced.
- Moreover, an area or direction in which a unique pattern is likely to be detected is estimated based on the pattern features of the periphery of the template creation area, and the template creation area is changed on that basis. Therefore, a situation in which the template creation area is reset in an area where no pattern exists can at least be avoided, and the template creation area can be reset in an area from which a template is highly likely to be extracted appropriately.
- a template can be created from both design data and image data from an actual wafer, so that convenience is improved.
- Further, since the template can be created while the search range (the maximum field of view OR_Area) is changed, the searchable area can be enlarged, and even if the alignment accuracy of the exposure apparatus and the wafer loading accuracy are reduced, this can be dealt with by the search alignment. As a result, a higher-performance exposure apparatus can be provided.
- In the present embodiment, the maximum field of view OR_Area was set as the template creation area in step S651, and image data of the same area as the maximum field of view OR_Area was obtained in step S652. However, it is also possible to acquire image data of an area wider than the maximum field of view OR_Area and use it for the later detection of pattern features. That is, as shown in FIG. 26, when acquiring image data in step S652, an image of the area AreaE, extended by the distance L2 in each of the four directions from the sides of the maximum field of view OR_Area, is acquired. Then, when a template cannot be extracted from the maximum field of view OR_Area, the image data (image signal information) of this extended part is used when detecting the pattern features around the maximum field of view OR_Area in order to shift the template creation area AreaI.
- Specifically, in the belt-shaped region sandwiched between a line defined at a distance L2 outside the four sides of the maximum field of view OR_Area and a line defined at a distance L3 inside them, the areas E1 to E8 are set as in FIGS. 24A and 24B, the feature amount of the pattern features of each area is detected, and the moving direction of the template creation area is determined on that basis.
- the template creation area can be reset more efficiently.
- The direction and distance by which the template creation area is shifted may be set arbitrarily. A plurality of reset positions may be prepared in advance, or, for example, the shift direction and shift distance may be determined according to the pattern features of the peripheral portion. For example, the center of gravity of a series of pattern features may be detected, and the template creation area may be shifted in the direction of that center of gravity.
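The center-of-gravity option mentioned above can be sketched as follows. This is a hypothetical illustration: the binarized image stands in for the pattern-feature map, and the fixed step size is an arbitrary choice.

```python
import numpy as np

def shift_toward_centroid(image, area_origin, area_size, step):
    """Shift the template creation area toward the center of gravity of
    the pattern features (here simply the binarized image). Returns the
    new (y, x) origin of the area; step size is arbitrary."""
    features = (image > 0).astype(float)
    ys, xs = np.nonzero(features)
    if len(ys) == 0:
        return area_origin  # no features anywhere: nothing to steer toward
    cy, cx = ys.mean(), xs.mean()          # feature center of gravity
    ay, ax = area_origin
    h, w = area_size
    center_y, center_x = ay + h / 2.0, ax + w / 2.0
    # unit direction from the area center toward the feature centroid
    dy, dx = cy - center_y, cx - center_x
    norm = max(np.hypot(dy, dx), 1e-9)
    return (ay + step * dy / norm, ax + step * dx / norm)
```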
- In the embodiments above, the template creation area is moved and reset when no suitable pattern is found. However, from the initial stage of template creation, an area containing abundant information suitable for template creation may instead be selected and set as the template creation area.
- Such a process of selecting an area suitable for template creation from a wide area around the target measurement area and creating a template using image data of that area will be described as a sixth embodiment of the present invention, with reference to the drawings.
- FIG. 27 is a flowchart showing the flow of template creation processing according to the sixth embodiment of the present invention.
- wide-area image data (image signal information) is acquired by a low-magnification (first detection magnification) imaging camera (step S701).
- the range in which image data is acquired includes the target measurement area (search measurement visual field area) during search alignment measurement and an area that includes a plurality of template creation areas.
- A medium magnification (second detection magnification), which is higher than the imaging magnification (first detection magnification) used in this step to acquire the wide-area image data and lower than the detection magnification set for fine alignment measurement, is fixedly determined in advance in consideration of the observation field of view (VIEW_Area) at the wafer and of wafer manufacturing errors. The observation field of view VIEW_Area is wide compared with the maximum field of view OR_Area obtained when imaging at the medium magnification during search alignment measurement, and the target measurement area of search alignment measurement, AreaV0, lies approximately at its center. In this step, image data of this area at the first detection magnification is obtained.
- In the present embodiment, the wafer surface is imaged by an imaging system having a low-magnification optical system provided exclusively for this purpose in order to acquire the wide-area image data. Specifically, an FIA type alignment sensor having a plurality of light receiving systems (low magnification, medium magnification, high magnification) is used. An alignment sensor of this kind is disclosed in, for example, JP-A-2002-257512.
- In this alignment sensor, although not shown in the figures, the objective lens (objective optical system) is shared, and two light receiving systems are provided: one with a relatively high magnification and a narrow field of view (high-magnification system) and one with a comparatively medium magnification and a wide field of view (medium-magnification system). The light beam reflected by the wafer surface is split by a beam splitter or the like and made incident on each light receiving system. A low-magnification light receiving system, including its sensor, is provided in the same manner. The low-magnification image data in step S701 is captured by, for example, such an alignment sensor provided in the exposure apparatus.
- Next, an area AreaY corresponding to the size of the unique pattern to be detected is set in the low-magnification image data area AreaX, and this is taken as a temporary unique pattern (step S703). The size of the unique pattern detected here is not limited to the size of the template to be finally generated, and may be set to an arbitrary size.
- The low-magnification image data (low-magnification image data area AreaX) is then searched with a window AreaZ having the same size as this area AreaY, the correlation value between the image (pattern signal information) in the area AreaZ at each position and the image (pattern signal information) of the temporary unique pattern AreaY is sequentially obtained based on the above equation (1) or (2), and regions where the correlation value is a sufficiently high peak are detected (step S705).
- When correlation peak values have been detected for all provisional unique patterns AreaY set in the low-magnification image data area AreaX (step S707), a truly unique provisional pattern, having no peak other than its own, is extracted from the correlation peak values (step S709). In the example shown in FIG. 28, it is assumed that a truly unique pattern is detected in, for example, the area AreaU.
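The final selection of "truly unique" provisional patterns from their correlation peak values can be sketched as follows. The table layout (mapping each candidate's offset to its list of peak values) and the 0.8 threshold are illustrative assumptions.

```python
def truly_unique_candidates(peak_table, thresh=0.8):
    """From a table mapping each provisional unique pattern (e.g. its
    top-left offset in AreaX) to its correlation peak values over the
    low-magnification area, keep only the patterns whose sole peak
    above the threshold is their own self-match."""
    unique = []
    for offset, peaks in peak_table.items():
        strong = [p for p in peaks if p > thresh]
        if len(strong) == 1:   # only the self-match survives
            unique.append(offset)
    return unique
```

A pattern with two strong peaks matches somewhere else in the area as well, so it is rejected even though it matches itself perfectly.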
- Since the purpose of the processing in steps S703 to S709 is only to roughly locate an area containing a unique pattern, there is no practical need to spend a long time on this processing.
- After the area AreaU containing the unique pattern has been detected in the low-magnification image data area AreaX, the area AreaI for acquiring the medium-magnification image data (image signal information) used to actually create the template is set (step S711).
- As shown in FIG. 28, the area AreaI for acquiring the medium-magnification image is an area of the same size as the maximum field of view OR_Area at the time of search alignment measurement (medium-magnification measurement), set so that its center coincides with the center of the area AreaU containing the unique pattern. At this time, the target visual field area AreaV at the time of search alignment measurement is also changed so that the visual-field maximum range OR_Area becomes this area.
- Next, the template creation area AreaI on the wafer is imaged through the medium-magnification optical system, and image data (image signal information) is obtained (step S713).
- Then, an area AreaT corresponding to the size of the template is set in the unique pattern search area AreaA (visual-field common area AND_Area) in the template creation area AreaI, and this is taken as a temporary template (step S715).
- Next, the template creation area AreaI is searched with a window AreaS having the same size as the area AreaT, the correlation value between the image (pattern signal information) of the area AreaS at each position and the image (pattern signal information) of the provisional template AreaT is sequentially obtained based on the above equation (1) or equation (2), and regions where the correlation value is a sufficiently high peak are detected (step S717).
- Then, when correlation peak values have been detected for all the temporary templates AreaT set in the unique pattern search area AreaA (step S719), a unique pattern having no peak other than its own is extracted from the correlation peak values (step S721), and the image data (image signal information) of that unique pattern is extracted as a template (step S723).
- As described above, according to the template creation method of the present embodiment, an area where a unique pattern is likely to exist is estimated using image data covering a sufficiently wide range around the intended template creation area, and the template creation processing is performed with that area as the template creation area. Therefore, there is a very high possibility that a unique pattern from which a template can be created is detected in the first area set, and the template creation processing can be performed efficiently.
- As a result, the frequency of processing such as resetting the template creation area and reacquiring image data because no unique pattern was detected can be reduced, and template creation, the operation of the various devices related to the exposure apparatus, and the exposure processing can all be performed efficiently.
- Further, since the template is created by detecting a unique pattern over a wide range, a pattern with strong uniqueness can be used as the template; in other words, high-quality templates with strong uniqueness and strong discrimination power can be created.
- Further, since the template can be created while the search range (the maximum field of view OR_Area) is changed, the searchable area can be enlarged, and even if the alignment accuracy of the exposure apparatus and the wafer loading accuracy are reduced, this can be dealt with by the search alignment. As a result, a higher-performance exposure apparatus can be provided.
- In the present embodiment, in steps S703 to S711, an area in which a unique pattern suitable for template creation is likely to exist was detected from the low-magnification image data area AreaX by detecting a unique pattern, and was set as the area for actually creating the template.
- the method of estimating a region where a unique pattern is likely to exist is not limited to this.
- For example, using the feature amounts of the pattern features that were detected in the fifth embodiment in order to move the visual-field maximum range OR_Area, that is, the pattern amount in the binarized pattern (the pixel count of a predetermined pixel value), the pattern density, frequency component detection results, the amount of edge line segments in the edge-detected pattern, the number of edges, the edge density, the frequency component detection results of the edge line segments, and the like, a region where a unique pattern is highly likely to exist may be estimated.
- Further, the template creation processing (reference data extraction processing) after the template creation area AreaI (visual-field maximum range OR_Area) has been set is not limited to the processing in steps S713 to S723 described above. For example, a template may be created by applying any of the processes of the first to third embodiments described above.
- In the present embodiment, the template is created based on actual image data on the wafer, but the template may also be created based on design data.
- In the present embodiment, one area AreaU having a unique pattern is detected from the low-magnification image data area AreaX, and the template creation area AreaI is set based on the detected area. However, a plurality of areas having unique patterns may be detected, and a template may be created from each area by setting each in turn as the template creation area. In that case, the detected areas may be ranked by uniqueness, and a template may be created for each area in that order.
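One way the ranking by uniqueness might look, assuming each candidate area carries the height of its best competing correlation peak (the data layout and this particular uniqueness score are illustrative, not specified by the patent):

```python
def rank_by_uniqueness(candidates):
    """Order candidate areas so that templates are created from the most
    unique area first. Each candidate is (area_id, best_competing_peak):
    the lower the best competing correlation peak, the larger the margin
    over the self-match (assumed 1.0), hence the more unique the area."""
    return sorted(candidates, key=lambda c: c[1])

candidates = [("AreaU1", 0.65), ("AreaU2", 0.30), ("AreaU3", 0.55)]
order = rank_by_uniqueness(candidates)  # AreaU2 first, then AreaU3, AreaU1
```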
- FIG. 29 is a flowchart showing the manufacturing process of an electronic device such as a semiconductor chip (an IC, an LSI, etc.), a liquid crystal panel, a CCD, a thin-film magnetic head, or a micromachine.
- In step S810, the function and performance of the electronic device, such as its circuit, are designed, and a pattern is designed to realize that function.
- In step S820, a reticle on which the designed circuit pattern is formed is manufactured.
- a wafer (silicon substrate) is manufactured using a material such as silicon (Step S830).
- In step S840, using the reticle manufactured in step S820 and the wafer manufactured in step S830, actual circuits and the like are formed on the wafer by lithography and similar technologies. Specifically, first, a thin film such as an insulating film, an electrode wiring film, or a semiconductor film is formed on the wafer surface (step S841), and then a photosensitive agent (resist) is applied to the entire surface of the thin film using a resist coating device (coater) (step S842).
- Next, the substrate after resist application is loaded on the wafer holder, the reticle manufactured in step S820 is loaded on the reticle stage, and the pattern formed on the reticle is reduced and transferred onto the wafer (step S843).
- each shot area of the wafer is sequentially aligned by the above-described alignment method according to the present invention, and a reticle pattern is sequentially transferred to each shot area.
- the wafer is unloaded from the wafer holder, and is developed using a developing device (developer) (Step S844). As a result, a resist image of the reticle pattern is formed on the wafer surface.
- Next, an etching process is performed on the developed wafer using an etching apparatus (step S845), and the resist remaining on the wafer surface is then removed using, for example, a plasma asher (step S846).
- the device is assembled next (step S850). Specifically, the wafer is diced and divided into individual chips, each chip is mounted on a lead frame or package, bonding is performed to connect electrodes, and packaging processing such as resin sealing is performed.
- In step S860, inspections such as an operation check test and a durability test of the manufactured device are performed, and the device is then shipped as a completed product.
- In the above embodiments, the generation of a template for performing search alignment of the wafer W was described as an example, but the present invention can also be applied, for example, to the generation of a template for positioning the reticle R and to the generation of a template for performing fine alignment.
- In the above embodiments, the present invention was applied to an off-axis type alignment sensor as an example, but the present invention can be applied to any device that detects a mark position by processing the image (pattern signal) of a mark captured by an image sensor.
- Further, the present invention is applicable to step-and-repeat type or step-and-scan type reduction projection exposure apparatuses, as well as to mirror projection type, proximity type, contact type, and other exposure apparatuses. The present invention can also be applied to an exposure apparatus that transfers a circuit pattern onto a glass substrate or a silicon wafer in order to produce a reticle. That is, the present invention is applicable irrespective of the exposure method and the application of the exposure apparatus.
- As the exposure light EL of the exposure apparatus 100 of the present embodiment, g-line or i-line light, or light emitted from a KrF excimer laser (248 nm), an ArF excimer laser (193 nm), or an F2 laser (157 nm) can be used. In an electron beam exposure apparatus, a thermionic emission type lanthanum hexaboride (LaB6) or tantalum (Ta) electron gun can be used.
- Alternatively, a harmonic may be used that is obtained by amplifying a single-wavelength laser in the infrared or visible range, oscillated from a DFB semiconductor laser or a fiber laser, with a fiber amplifier doped with erbium (or with both erbium and ytterbium), and converting it to an ultraviolet wavelength using a nonlinear optical crystal. As the single-wavelength oscillation laser, a ytterbium-doped fiber laser, for example, can be used.
- The exposure apparatus of the present embodiment described above can control the position of the substrate W with high accuracy and at high speed, and can perform exposure with high accuracy while improving throughput. The exposure apparatus is manufactured by assembling the components electrically, mechanically, and optically and then performing comprehensive adjustment (electrical adjustment, operation confirmation, etc.). It is desirable that the exposure apparatus be manufactured in a clean room in which the temperature, cleanliness, and the like are controlled.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005511046A JP4389871B2 (ja) | 2003-06-27 | 2004-06-25 | 基準パターン抽出方法とその装置、パターンマッチング方法とその装置、位置検出方法とその装置及び露光方法とその装置 |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003185392 | 2003-06-27 | ||
JP2003-185392 | 2003-06-27 | ||
JP2004150499 | 2004-05-20 | ||
JP2004-150499 | 2004-05-20 |
Publications (3)
Publication Number | Publication Date |
---|---|
WO2005001593A2 true WO2005001593A2 (ja) | 2005-01-06 |
WO2005001593A1 WO2005001593A1 (ja) | 2005-01-06 |
WO2005001593A3 WO2005001593A3 (ja) | 2005-05-19 |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7177009B2 (en) | 2004-10-01 | 2007-02-13 | Asml Netherlands B.V. | Position determination method and lithographic apparatus |
JP2008058182A (ja) * | 2006-08-31 | 2008-03-13 | Mitsutoyo Corp | 変位量検出可能性判定装置、その方法、および、変位検出装置 |
JP2009109413A (ja) * | 2007-10-31 | 2009-05-21 | Canon Inc | 位置検出器、露光装置及びデバイス製造方法 |
JP2009251667A (ja) * | 2008-04-01 | 2009-10-29 | Toyota Motor Corp | 画像検索装置 |
JP2010091491A (ja) * | 2008-10-10 | 2010-04-22 | Fujifilm Corp | 3次元形状計測用撮影装置および方法並びにプログラム |
JP2011524267A (ja) * | 2008-05-28 | 2011-09-01 | ペッパール ウント フュフス ゲゼルシャフト ミット ベシュレンクテル ハフツング | 印刷物を点検するための方法および機構とコンピュータプログラムとコンピュータプログラム製品 |
WO2012001862A1 (ja) * | 2010-06-29 | 2012-01-05 | 株式会社 日立ハイテクノロジーズ | パターンマッチング用テンプレートの作成方法、及び画像処理装置 |
WO2012070549A1 (ja) * | 2010-11-24 | 2012-05-31 | 株式会社日立ハイテクノロジーズ | 複数のアライメントパターン候補を用いたグローバルアライメント |
US8254682B2 (en) | 2006-04-20 | 2012-08-28 | Realtek Semiconductor Corp. | Pattern detecting method and related image processing apparatus |
US8379989B2 (en) | 2008-04-01 | 2013-02-19 | Toyota Jidosha Kabushiki Kaisha | Image search apparatus and image processing apparatus |
WO2014024655A1 (ja) * | 2012-08-09 | 2014-02-13 | コニカミノルタ株式会社 | 画像処理装置、画像処理方法および画像処理プログラム |
US8823852B2 (en) | 2012-05-24 | 2014-09-02 | Panasonic Intellectual Property Corporation Of America | Information communication method of obtaining information from a subject by demodulating data specified by a pattern of a bright line included in an obtained image |
JP5603513B1 (ja) * | 2012-12-27 | 2014-10-08 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 制御方法、情報通信装置およびプログラム |
US9608727B2 (en) | 2012-12-27 | 2017-03-28 | Panasonic Intellectual Property Corporation Of America | Switched pixel visible light transmitting method, apparatus and program |
US9608725B2 (en) | 2012-12-27 | 2017-03-28 | Panasonic Intellectual Property Corporation Of America | Information processing program, reception program, and information processing apparatus |
US9613596B2 (en) | 2012-12-27 | 2017-04-04 | Panasonic Intellectual Property Corporation Of America | Video display method using visible light communication image including stripe patterns having different pitches |
US9635278B2 (en) | 2012-12-27 | 2017-04-25 | Panasonic Intellectual Property Corporation Of America | Information communication method for obtaining information specified by striped pattern of bright lines |
US9641766B2 (en) | 2012-12-27 | 2017-05-02 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US9646568B2 (en) | 2012-12-27 | 2017-05-09 | Panasonic Intellectual Property Corporation Of America | Display method |
US9768869B2 (en) | 2012-12-27 | 2017-09-19 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US9918016B2 (en) | 2012-12-27 | 2018-03-13 | Panasonic Intellectual Property Corporation Of America | Information communication apparatus, method, and recording medium using switchable normal mode and visible light communication mode |
US10148354B2 (en) | 2012-12-27 | 2018-12-04 | Panasonic Intellectual Property Corporation Of America | Luminance change information communication method |
US10225014B2 (en) | 2012-12-27 | 2019-03-05 | Panasonic Intellectual Property Corporation Of America | Information communication method for obtaining information using ID list and bright line image |
JP2019049919A (ja) * | 2017-09-12 | 2019-03-28 | 大日本印刷株式会社 | テンプレート抽出装置、テンプレート抽出方法、およびプログラム |
US10303945B2 (en) | 2012-12-27 | 2019-05-28 | Panasonic Intellectual Property Corporation Of America | Display method and display apparatus |
US10523876B2 (en) | 2012-12-27 | 2019-12-31 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US10530486B2 (en) | 2012-12-27 | 2020-01-07 | Panasonic Intellectual Property Corporation Of America | Transmitting method, transmitting apparatus, and program |
US10951310B2 (en) | 2012-12-27 | 2021-03-16 | Panasonic Intellectual Property Corporation Of America | Communication method, communication device, and transmitter |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2000042640A1 (fr) * | 1999-01-18 | 2000-07-20 | Nikon Corporation | Procede et dispositif d'appariement de formes, procede et dispositif de determination de position, procede et dispositif d'alignement de position, procede et dispositif d'exposition, et dispositif et son procede de production |
JP2002062112A (ja) * | 2000-08-21 | 2002-02-28 | Sony Corp | 位置決め装置及び位置決め方法 |
WO2002029870A1 (fr) * | 2000-10-05 | 2002-04-11 | Nikon Corporation | Procede de determination des conditions d'exposition, procede d'exposition, dispositif de realisation dudit procede et support d'enregistrement |
Cited By (65)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7177009B2 (en) | 2004-10-01 | 2007-02-13 | Asml Netherlands B.V. | Position determination method and lithographic apparatus |
TWI386032B (zh) * | 2006-04-20 | 2013-02-11 | Realtek Semiconductor Corp | 型樣偵測方法與相關的影像處理裝置 |
US8254682B2 (en) | 2006-04-20 | 2012-08-28 | Realtek Semiconductor Corp. | Pattern detecting method and related image processing apparatus |
JP2008058182A (ja) * | 2006-08-31 | 2008-03-13 | Mitsutoyo Corp | 変位量検出可能性判定装置、その方法、および、変位検出装置 |
JP2009109413A (ja) * | 2007-10-31 | 2009-05-21 | Canon Inc | 位置検出器、露光装置及びデバイス製造方法 |
JP2009251667A (ja) * | 2008-04-01 | 2009-10-29 | Toyota Motor Corp | 画像検索装置 |
US8379989B2 (en) | 2008-04-01 | 2013-02-19 | Toyota Jidosha Kabushiki Kaisha | Image search apparatus and image processing apparatus |
JP2011524267A (ja) * | 2008-05-28 | 2011-09-01 | ペッパール ウント フュフス ゲゼルシャフト ミット ベシュレンクテル ハフツング | 印刷物を点検するための方法および機構とコンピュータプログラムとコンピュータプログラム製品 |
JP2010091491A (ja) * | 2008-10-10 | 2010-04-22 | Fujifilm Corp | 3次元形状計測用撮影装置および方法並びにプログラム |
WO2012001862A1 (ja) * | 2010-06-29 | 2012-01-05 | 株式会社 日立ハイテクノロジーズ | パターンマッチング用テンプレートの作成方法、及び画像処理装置 |
JP2012114202A (ja) * | 2010-11-24 | 2012-06-14 | Hitachi High-Technologies Corp | 画像撮像装置および画像撮像方法 |
US9057873B2 (en) | 2010-11-24 | 2015-06-16 | Hitachi High-Technologies Corporation | Global alignment using multiple alignment pattern candidates |
WO2012070549A1 (ja) * | 2010-11-24 | 2012-05-31 | Hitachi High-Technologies Corporation | Global alignment using multiple alignment pattern candidates |
US8823852B2 (en) | 2012-05-24 | 2014-09-02 | Panasonic Intellectual Property Corporation Of America | Information communication method of obtaining information from a subject by demodulating data specified by a pattern of a bright line included in an obtained image |
JP5602966B1 (ja) * | 2012-05-24 | 2014-10-08 | Panasonic Intellectual Property Corporation of America | Control method, information communication device, and program |
JP2014220787A (ja) * | 2012-05-24 | 2014-11-20 | Panasonic Intellectual Property Corporation of America | Control method, information communication device, and program |
WO2014024655A1 (ja) * | 2012-08-09 | 2014-02-13 | Konica Minolta, Inc. | Image processing apparatus, image processing method, and image processing program |
JPWO2014024655A1 (ja) * | 2012-08-09 | 2016-07-25 | Konica Minolta, Inc. | Image processing apparatus, image processing method, and image processing program |
US10218914B2 (en) | 2012-12-20 | 2019-02-26 | Panasonic Intellectual Property Corporation Of America | Information communication apparatus, method and recording medium using switchable normal mode and visible light communication mode |
US9859980B2 (en) | 2012-12-27 | 2018-01-02 | Panasonic Intellectual Property Corporation Of America | Information processing program, reception program, and information processing apparatus |
US10303945B2 (en) | 2012-12-27 | 2019-05-28 | Panasonic Intellectual Property Corporation Of America | Display method and display apparatus |
JP2015119458A (ja) * | 2012-12-27 | 2015-06-25 | Panasonic Intellectual Property Corporation of America | Control method, information communication device, and program |
JP5603523B1 (ja) * | 2012-12-27 | 2014-10-08 | Panasonic Intellectual Property Corporation of America | Control method, information communication device, and program |
US9608727B2 (en) | 2012-12-27 | 2017-03-28 | Panasonic Intellectual Property Corporation Of America | Switched pixel visible light transmitting method, apparatus and program |
US9608725B2 (en) | 2012-12-27 | 2017-03-28 | Panasonic Intellectual Property Corporation Of America | Information processing program, reception program, and information processing apparatus |
US9613596B2 (en) | 2012-12-27 | 2017-04-04 | Panasonic Intellectual Property Corporation Of America | Video display method using visible light communication image including stripe patterns having different pitches |
US9635278B2 (en) | 2012-12-27 | 2017-04-25 | Panasonic Intellectual Property Corporation Of America | Information communication method for obtaining information specified by striped pattern of bright lines |
US9641766B2 (en) | 2012-12-27 | 2017-05-02 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US9646568B2 (en) | 2012-12-27 | 2017-05-09 | Panasonic Intellectual Property Corporation Of America | Display method |
US9756255B2 (en) | 2012-12-27 | 2017-09-05 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US9768869B2 (en) | 2012-12-27 | 2017-09-19 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US9794489B2 (en) | 2012-12-27 | 2017-10-17 | Panasonic Intellectual Property Corporation Of America | Information communication method |
JP5603512B1 (ja) * | 2012-12-27 | 2014-10-08 | Panasonic Intellectual Property Corporation of America | Control method, information communication device, and program |
US9918016B2 (en) | 2012-12-27 | 2018-03-13 | Panasonic Intellectual Property Corporation Of America | Information communication apparatus, method, and recording medium using switchable normal mode and visible light communication mode |
US9998220B2 (en) | 2012-12-27 | 2018-06-12 | Panasonic Intellectual Property Corporation Of America | Transmitting method, transmitting apparatus, and program |
US10051194B2 (en) | 2012-12-27 | 2018-08-14 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US10148354B2 (en) | 2012-12-27 | 2018-12-04 | Panasonic Intellectual Property Corporation Of America | Luminance change information communication method |
US10165192B2 (en) | 2012-12-27 | 2018-12-25 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US10205887B2 (en) | 2012-12-27 | 2019-02-12 | Panasonic Intellectual Property Corporation Of America | Information communication method |
JP5603513B1 (ja) * | 2012-12-27 | 2014-10-08 | Panasonic Intellectual Property Corporation of America | Control method, information communication device, and program |
US10225014B2 (en) | 2012-12-27 | 2019-03-05 | Panasonic Intellectual Property Corporation Of America | Information communication method for obtaining information using ID list and bright line image |
US11659284B2 (en) | 2012-12-27 | 2023-05-23 | Panasonic Intellectual Property Corporation Of America | Information communication method |
JP2015119459A (ja) * | 2012-12-27 | 2015-06-25 | Panasonic Intellectual Property Corporation of America | Control method, information communication device, and program |
US10334177B2 (en) | 2012-12-27 | 2019-06-25 | Panasonic Intellectual Property Corporation Of America | Information communication apparatus, method, and recording medium using switchable normal mode and visible light communication mode |
US10354599B2 (en) | 2012-12-27 | 2019-07-16 | Panasonic Intellectual Property Corporation Of America | Display method |
US10361780B2 (en) | 2012-12-27 | 2019-07-23 | Panasonic Intellectual Property Corporation Of America | Information processing program, reception program, and information processing apparatus |
US10368005B2 (en) | 2012-12-27 | 2019-07-30 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US10368006B2 (en) | 2012-12-27 | 2019-07-30 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US10447390B2 (en) | 2012-12-27 | 2019-10-15 | Panasonic Intellectual Property Corporation Of America | Luminance change information communication method |
US10455161B2 (en) | 2012-12-27 | 2019-10-22 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US10516832B2 (en) | 2012-12-27 | 2019-12-24 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US10523876B2 (en) | 2012-12-27 | 2019-12-31 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US10521668B2 (en) | 2012-12-27 | 2019-12-31 | Panasonic Intellectual Property Corporation Of America | Display method and display apparatus |
US10530486B2 (en) | 2012-12-27 | 2020-01-07 | Panasonic Intellectual Property Corporation Of America | Transmitting method, transmitting apparatus, and program |
US10531010B2 (en) | 2012-12-27 | 2020-01-07 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US10531009B2 (en) | 2012-12-27 | 2020-01-07 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US10616496B2 (en) | 2012-12-27 | 2020-04-07 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US10638051B2 (en) | 2012-12-27 | 2020-04-28 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US10666871B2 (en) | 2012-12-27 | 2020-05-26 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US10742891B2 (en) | 2012-12-27 | 2020-08-11 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US10887528B2 (en) | 2012-12-27 | 2021-01-05 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US10951310B2 (en) | 2012-12-27 | 2021-03-16 | Panasonic Intellectual Property Corporation Of America | Communication method, communication device, and transmitter |
US11165967B2 (en) | 2012-12-27 | 2021-11-02 | Panasonic Intellectual Property Corporation Of America | Information communication method |
US11490025B2 (en) | 2012-12-27 | 2022-11-01 | Panasonic Intellectual Property Corporation Of America | Information communication method |
JP2019049919A (ja) * | 2017-09-12 | Dai Nippon Printing Co., Ltd. | Template extraction device, template extraction method, and program |
Also Published As
Publication number | Publication date |
---|---|
WO2005001593A3 (ja) | 2005-05-19 |
JP4389871B2 (ja) | 2009-12-24 |
JPWO2005001593A1 (ja) | 2007-09-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2005008753A1 (ja) | Template creating method and device, pattern detecting method, position detecting method and device, exposure method and device, device manufacturing method, and template creating program | |
US7643961B2 (en) | Position detecting device and position detecting method | |
JP3269343B2 (ja) | Best focus determining method and exposure condition determining method using the same | |
CN114008534A (zh) | Metrology method and associated metrology and lithographic apparatuses | |
US20180246420A1 (en) | A method and apparatus for determining at least one property of patterning device marker features | |
KR102189687B1 (ko) | Method and apparatus for determining the position of a target structure on a substrate, and method and apparatus for determining the position of a substrate | |
US8097473B2 (en) | Alignment method, exposure method, pattern forming method, and exposure apparatus | |
US7852477B2 (en) | Calculation method and apparatus of exposure condition, and exposure apparatus | |
EP2103995A2 (en) | Method for coarse wafer alignment in a lithographic apparatus | |
JP2005030963A (ja) | Position detecting method | |
KR100991574B1 (ko) | Position detector, position detecting method, exposure apparatus, and device manufacturing method | |
JP6608130B2 (ja) | Measuring apparatus, lithography apparatus, and article manufacturing method | |
JP2005011976A (ja) | Position detecting method | |
JP2006216796A (ja) | Method for creating reference pattern information, position measuring method, position measuring apparatus, exposure method, and exposure apparatus | |
JP4677183B2 (ja) | Position detecting apparatus and exposure apparatus | |
US10267749B2 (en) | Inspection method | |
WO2005001593A2 (ja) | Reference pattern extraction method and device, pattern matching method and device, position detection method and device, and exposure method and device | |
CN113966490A (zh) | Image forming apparatus | |
JP4470503B2 (ja) | Reference pattern determining method and device, position detecting method and device, and exposure method and device | |
US20040075099A1 (en) | Position detecting method and apparatus | |
US11927892B2 (en) | Alignment method and associated alignment and lithographic apparatuses | |
CN110770653B (en) | System and method for measuring alignment | |
JP2004281904A (ja) | Position measuring apparatus, exposure apparatus, and device manufacturing method | |
JP2004087562A (ja) | Position detecting method and apparatus, exposure method and apparatus, and device manufacturing method | |
CN110770653A (zh) | System and method for measuring alignment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2005511046 Country of ref document: JP |
122 | Ep: pct application non-entry in european phase |