WO2014017337A1 - Matching process device, matching process method, and inspection device employing same - Google Patents
- Publication number
- WO2014017337A1 (PCT/JP2013/069296)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- matching
- image
- feature
- feature amount
- template
- Prior art date
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03F—PHOTOMECHANICAL PRODUCTION OF TEXTURED OR PATTERNED SURFACES, e.g. FOR PRINTING, FOR PROCESSING OF SEMICONDUCTOR DEVICES; MATERIALS THEREFOR; ORIGINALS THEREFOR; APPARATUS SPECIALLY ADAPTED THEREFOR
- G03F7/00—Photomechanical, e.g. photolithographic, production of textured or patterned surfaces, e.g. printing surfaces; Materials therefor, e.g. comprising photoresists; Apparatus specially adapted therefor
- G03F7/70—Microphotolithographic exposure; Apparatus therefor
- G03F7/70483—Information management; Active and passive control; Testing; Wafer monitoring, e.g. pattern monitoring
- G03F7/70605—Workpiece metrology
- G03F7/70616—Monitoring the printed patterns
- G03F7/70625—Dimensions, e.g. line width, critical dimension [CD], profile, sidewall angle or edge roughness
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/95—Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
- G01N21/956—Inspecting patterns on the surface of objects
- G01N21/95607—Inspecting patterns on the surface of objects using a comparative method
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N23/00—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00
- G01N23/22—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by measuring secondary emission from the material
- G01N23/225—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by measuring secondary emission from the material using electron or ion
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03F—PHOTOMECHANICAL PRODUCTION OF TEXTURED OR PATTERNED SURFACES, e.g. FOR PRINTING, FOR PROCESSING OF SEMICONDUCTOR DEVICES; MATERIALS THEREFOR; ORIGINALS THEREFOR; APPARATUS SPECIALLY ADAPTED THEREFOR
- G03F7/00—Photomechanical, e.g. photolithographic, production of textured or patterned surfaces, e.g. printing surfaces; Materials therefor, e.g. comprising photoresists; Apparatus specially adapted therefor
- G03F7/70—Microphotolithographic exposure; Apparatus therefor
- G03F7/70483—Information management; Active and passive control; Testing; Wafer monitoring, e.g. pattern monitoring
- G03F7/70491—Information management, e.g. software; Active and passive control, e.g. details of controlling exposure processes or exposure tool monitoring processes
- G03F7/70508—Data handling in all parts of the microlithographic apparatus, e.g. handling pattern data for addressable masks or data transfer to or from different components within the exposure apparatus
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03F—PHOTOMECHANICAL PRODUCTION OF TEXTURED OR PATTERNED SURFACES, e.g. FOR PRINTING, FOR PROCESSING OF SEMICONDUCTOR DEVICES; MATERIALS THEREFOR; ORIGINALS THEREFOR; APPARATUS SPECIALLY ADAPTED THEREFOR
- G03F7/00—Photomechanical, e.g. photolithographic, production of textured or patterned surfaces, e.g. printing surfaces; Materials therefor, e.g. comprising photoresists; Apparatus specially adapted therefor
- G03F7/70—Microphotolithographic exposure; Apparatus therefor
- G03F7/70483—Information management; Active and passive control; Testing; Wafer monitoring, e.g. pattern monitoring
- G03F7/70605—Workpiece metrology
- G03F7/706835—Metrology information management or control
- G03F7/706839—Modelling, e.g. modelling scattering or solving inverse problems
- G03F7/706841—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/211—Selection of the most significant subset of features
- G06F18/2113—Selection of the most significant subset of features by ranking or filtering the set of features, e.g. using a measure of variance or of feature cross-correlation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/771—Feature selection, e.g. selecting representative features from a multi-dimensional feature space
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L22/00—Testing or measuring during manufacture or treatment; Reliability measurements, i.e. testing of parts without further processing to modify the parts as such; Structural arrangements therefor
- H01L22/10—Measuring as part of the manufacturing process
- H01L22/12—Measuring as part of the manufacturing process for structural parameters, e.g. thickness, line width, refractive index, temperature, warp, bond strength, defects, optical inspection, electrical measurement of structural dimensions, metallurgic measurement of diffusions
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L22/00—Testing or measuring during manufacture or treatment; Reliability measurements, i.e. testing of parts without further processing to modify the parts as such; Structural arrangements therefor
- H01L22/30—Structural arrangements specially adapted for testing or measuring during manufacture or treatment, or specially adapted for reliability measurements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10056—Microscopic image
- G06T2207/10061—Microscopic image from scanning electron microscope
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30148—Semiconductor; IC; Wafer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/695—Preprocessing, e.g. image segmentation
Definitions
- the present invention relates to a matching processing technique, more specifically to a pattern matching technique, and more particularly to a template matching technique in an inspection technique for the purpose of inspecting or measuring a pattern formed on a semiconductor wafer.
- In an apparatus for measuring and inspecting a pattern formed on a semiconductor wafer, a template matching technique (see Non-Patent Document 1 below) that performs matching using a template is used to align the field of view of the inspection apparatus with a desired measurement position.
- Patent Document 1 below describes an example of such a template matching method. Note that the template matching process is a process for finding an area most matching a pre-registered template image from the search target image.
- a specific example of an inspection apparatus using template matching is measurement of a pattern on a semiconductor wafer using a scanning electron microscope.
- The field of view of the apparatus is first moved to the approximate measurement position by moving the stage, but with the positioning accuracy of the stage alone, a large deviation often remains in an image captured at the high magnification of an electron microscope.
- In addition, the wafer is not always placed on the stage in the same orientation, and the coordinate system of the wafer on the stage (for example, the direction in which the chips are arranged) does not completely match the drive direction of the stage; this also causes a shift in an image taken at the high magnification of an electron microscope.
- Furthermore, an electron beam may be deflected by a minute amount (for example, several tens of μm or less) and irradiated onto a target position on the observation sample (this is sometimes referred to as "beam shift"). Even when this beam shift is performed, the accuracy of beam deflection control alone may leave a deviation between the irradiation position and the desired observation position. Template matching is performed to correct each of these deviations and to carry out measurement and inspection at the accurate position.
- alignment is performed in multiple stages: alignment with an optical camera with a lower magnification than the electron microscope image and alignment with the electron microscope image.
- alignment is performed using images of a plurality of chips (for example, chips on both the left and right sides of the wafer) that are separated from each other on the wafer.
- First, the same unique pattern in each chip or in its vicinity (a pattern at relatively the same position in each chip) is registered as a template; the pattern used for registration is often one that is also used for optical alignment on the wafer.
- Next, the stage is moved so that the pattern registered as the template is captured in each chip, and an image is acquired for each chip.
- Template matching is performed on the acquired images. From the resulting matching positions, the shift amount of the stage movement is calculated, and this shift amount is used as a correction value for the stage movement so that the coordinate system of the stage movement matches the coordinate system of the wafer.
- a unique pattern close to the measurement position is registered in advance as a template, and the relative coordinates of the measurement position viewed from the template are stored.
- Template matching is performed on the captured image, the matching position is determined, and the position shifted from it by the stored relative coordinates is taken as the measurement position.
- the visual field of the apparatus is moved to a desired measurement position.
- In some cases, however, the alignment pattern does not appear in the image captured by the electron microscope.
- a process of searching for an alignment pattern again around the imaging position (periphery search) or interrupting measurement and notifying the user that the alignment has failed (measurement interruption) may be performed.
- Whether matching has succeeded is judged by comparing the matching score with a preset reference value (hereinafter, this reference value is referred to as the score acceptance).
- the template matching method can be divided into an image-based method based on normalized correlation and a feature point-based method for comparing feature points extracted from images.
- In the image-based method, for example, an image having the same size as the template is cut out from the searched image, a correlation value between the cut-out image and the template is calculated, and this correlation value is obtained for each cut-out position in the searched image; for example, the position having a large correlation value is taken as the matching position (Non-Patent Document 1).
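- As a rough illustration of this image-based method (not part of the patent text), a minimal Python sketch of normalized-correlation matching over all cut-out positions might look as follows; the function names and the assumption of 2-D grayscale NumPy arrays are illustrative only.

```python
import numpy as np

def normalized_correlation(template, patch):
    """Zero-mean normalized correlation between two equal-sized grayscale images."""
    t = template.astype(float) - template.mean()
    p = patch.astype(float) - patch.mean()
    denom = np.sqrt((t * t).sum() * (p * p).sum())
    return float((t * p).sum() / denom) if denom > 0 else 0.0

def image_based_matching(search_image, template):
    """Cut out a template-sized image at every position of the searched image,
    compute the correlation value, and return the position with the largest value."""
    th, tw = template.shape
    sh, sw = search_image.shape
    best_score, best_pos = float("-inf"), (0, 0)
    for y in range(sh - th + 1):
        for x in range(sw - tw + 1):
            score = normalized_correlation(template, search_image[y:y + th, x:x + tw])
            if score > best_score:
                best_score, best_pos = score, (x, y)
    return best_pos, best_score
```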
- The latter, feature-point-based method extracts a plurality of feature points from each of the template and the searched image, finds similar feature points in both images (corresponding point matching), and projects the template region onto the searched image for each correspondence (taking differences in rotation, scale, and so on between the images into account); the position where the largest number of projected regions overlap is taken as the matching position (Non-Patent Document 2).
- Patent document 2001-243906 (corresponding US patent US627278)
- the matching may not be successful if there is a large difference in the appearance of the image between the template and the searched image.
- The appearance of the image can differ greatly between the template and the searched image when, for example, the imaging conditions of the inspection apparatus at template registration differ greatly from those when the searched image is captured, when the finished quality of the semiconductor pattern imaged at template registration differs greatly from that of the pattern in the searched image, or when the manufacturing process step of the semiconductor pattern differs between template registration and acquisition of the searched image. Besides the examples given here, various other factors can cause a large difference in appearance between the template and the searched image.
- To reduce such differences, a method of performing preprocessing such as smoothing and edge enhancement has also been proposed (Non-Patent Document 3).
- FIG. 4 is a diagram exemplifying correlation values (matching scores) at a matching correct answer position and a matching incorrect answer position for a plurality of images (sample IDs: 1 to 100) having different appearances.
- The correlation value differs from sample to sample, and with a single score acceptance (threshold value) 202 (or 203) it is difficult to determine the success or failure of matching. If the score acceptance (threshold value 1) 202 is used, samples in the section 205 that are at the matching correct answer position are erroneously determined to be at a matching incorrect answer position.
- Conversely, if the score acceptance (threshold value) 203 is used, samples at the matching incorrect answer position (the hatched area on the right side of 205) are erroneously determined to be at the matching correct answer position.
- A method using feature quantities such as SIFT (Scale-Invariant Feature Transform) has also been proposed (see Non-Patent Document 2). However, if the apparent divergence between the template and the searched image is large, the similarity of the feature vectors (feature descriptors) between the two deteriorates and matching does not succeed, so matching becomes unstable.
- An object of the present invention is to output an accurate matching position even when there is a large difference in the appearance of an image between a template and a searched image in template matching.
- In the present invention, not only is a feature amount extracted separately from each of the template and the searched image and compared (hereinafter referred to as an individual feature amount), but a feature amount obtained from the mutual relationship between the template and the searched image (hereinafter referred to as a mutual feature amount) is also used as information for determining the success or failure of matching.
- An inspection apparatus for performing template matching includes: a feature amount extraction processing unit that extracts, from a template, a feature amount determined by coordinates in the image; a feature amount extraction processing unit that extracts, from a searched image, a feature amount determined by coordinates in the image; a mutual feature amount calculation processing unit that calculates mutual feature amounts of the two images from the feature amount extracted from the template, the feature amount extracted from the searched image, and the relative position between the template and the searched image; and a template matching processing unit that performs matching between the template and the searched image using the plurality of mutual feature amounts.
- the score acceptance can always be 0 (or the median score) as a fixed value.
- an accurate matching position can be output even when there is a large difference in image appearance between the template and the searched image.
- FIG. 6 is a diagram illustrating correlation values (matching scores) at matching correct answer positions and matching incorrect answer positions for a plurality of images (sample IDs: 1 to 100) having different appearances. A block diagram showing a configuration example of the pattern matching processing of the present embodiment is also provided.
- FIG. 1 shows a scanning electron microscope (SEM) that is mainly used for pattern dimension measurement of a semiconductor device formed on a semiconductor wafer as an application example of an inspection apparatus according to an embodiment of the present invention.
- FIG. 5 is a diagram illustrating an example of a configuration of an apparatus when performing template matching using mask processing.
- An electron beam is generated from the electron gun 1.
- the deflector 4 and the objective lens 5 are controlled so that the electron beam is focused and irradiated at an arbitrary position on the semiconductor wafer 3, for example, a sample placed on the stage 2.
- Secondary electrons are emitted from the semiconductor wafer 3 irradiated with the electron beam and detected by the secondary electron detector 6.
- the detected secondary electrons are converted into a digital signal by the A / D converter 7, stored in the image memory 15 in the processing / control unit 14, and the CPU 16 performs image processing according to the purpose.
- the template matching process according to the present embodiment is performed by the processing / control unit 14, more specifically, the matching processing unit 16a.
- The display device 20 is used for processing settings and for displaying processing results, as described later. Further, for alignment using an optical camera with a lower magnification than the electron microscope, the optical camera 11 is used.
- A signal obtained by imaging the semiconductor wafer 3 with the camera 11 is also converted into a digital signal by the A/D converter 12 (when the signal from the optical camera is already digital, the A/D converter 12 is unnecessary), stored in the image memory 15 in the processing/control unit 14, and the CPU 16 performs image processing according to the purpose.
- the backscattered electrons emitted from the semiconductor wafer 3 are detected by the backscattered electron detector 8, and the detected backscattered electrons are detected by the A / D converter 9 or 10. It is converted into a digital signal, stored in the image memory 15 in the processing / control unit 14, and the CPU 16 performs image processing according to the purpose.
- In the present embodiment, a scanning electron microscope is shown as an example of an inspection apparatus. However, the applicable apparatus is not limited to this; the present embodiment can be applied to any inspection apparatus that acquires an image and performs template matching processing.
- FIG. 2 is a functional block diagram showing a configuration example of the matching processing unit in the inspection apparatus according to the present embodiment, and is a functional block diagram of the processing unit that performs processing corresponding to FIG.
- FIG. 3 is a functional block diagram illustrating an example of the entire configuration including a flow of template matching processing in the inspection apparatus according to the present embodiment, and is a diagram illustrating a configuration for performing learning processing. Note that the learning process and the matching process may be separate processes, or may have a partially common hardware configuration or software configuration, or a combination thereof.
- The matching processing unit 16a illustrated in FIG. 1 includes: a feature amount extraction unit 16a-1 that extracts feature amounts from, for example, two inputs; a mutual feature amount calculation unit 16a-2 that calculates, from a plurality of feature amounts including the first and second feature amounts, mutual feature amounts indicating the relationship between those feature amounts; a template matching determination unit 16a-3 that determines template matching based on the mutual feature amounts and the matching success/failure determination boundary surface, obtaining the distance (score) between the mutual feature amounts and the boundary surface in the feature amount space; a collation target determination unit 16a-4 that determines whether any collation target remains; a maximum score position selection unit 16a-5 that selects the position on the wafer having the maximum distance (score); a belonging class determination unit 16a-6 that determines whether that position belongs to the correct answer class or the incorrect answer class; a storage unit 16a-7 that stores the matched position (x, y) on the wafer, the matching score, and so on; and an output unit 16a-8 that outputs a display based on these stored values.
- a feature region extraction processing unit 16a-1a that extracts a feature amount from a template image acquired for learning and a learning processing unit 16a-2a described later are provided.
- the matching processing unit 16a may include all the elements (functional units) shown in FIG. 2 or may include only a part.
- FIG. 3 is a flowchart showing the flow of processing by the matching processing unit 16a in the inspection apparatus according to the present embodiment.
- The processing according to the present embodiment includes determination index value calculation processing X based on mutual feature amounts, matching correctness determination boundary surface (identification boundary surface) specification processing Y based on learning processing, and derivation of a determination result by template matching processing based on processing X and processing Y.
- The determination index value calculation process X based on mutual feature amounts uses a template 101 registered in advance and an image 102 cut out from the search target image acquired by the inspection apparatus (the cut-out image at a matching candidate position) to obtain the determination index value 109. Details of the process for determining whether the search target pattern exists in the searched image and of the means for obtaining the matching position will be described later. In the present embodiment, one of the purposes is to make pattern matching succeed even when the difference in appearance between the template 101 and the matching correct answer position of the searched image becomes large. As described later, a determination index value 109 for performing feature-amount-based matching is obtained using the mutual feature amount 108 calculated from both the template 101 and the searched image 102.
- This makes it possible to perform matching using feature amounts that are less susceptible to (or are selected so as not to be adversely affected by) differences in appearance, improving the robustness of template matching.
- The mutual feature amount 108 is obtained by the mutual feature amount calculation process 107 in the mutual feature amount calculation unit 16a-2, from the feature amount A 105 extracted from the template 101 by the feature amount A extraction process 103 in the feature amount extraction unit 16a-1 and the feature amount B 106 extracted from the image 102 cut out from the search target image (at the matching candidate position).
- The mutual feature amount calculation method will be described later. As a simple example, the template 101 and the image 102 cut out from the search target image can be used as they are as the feature amount A 105 and the feature amount B 106, and the normalized correlation value between the two images can be used as one of the mutual feature amounts.
- Alternatively, the covariance σXY, which is the average of the products of deviations from the mean of two sets of corresponding data x and y, may be obtained as a mutual feature amount.
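- As a hedged sketch of this simple case (illustrative only, not the patent's implementation), the covariance of corresponding pixel values between the template and the cut-out image could be computed as follows.

```python
import numpy as np

def covariance_feature(template, cutout):
    """Mutual feature amount: average of the products of deviations from the mean
    for the two sets of corresponding pixel values x (template) and y (cut-out image)."""
    x = template.astype(float).ravel()
    y = cutout.astype(float).ravel()
    return float(np.mean((x - x.mean()) * (y - y.mean())))
```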
- the obtained mutual feature value 108 is used as a part or all of the determination index value 109 used in the template matching determination unit 16a-3 (matching score calculation unit).
- the mutual feature quantity 108 is not limited to one, and a plurality of different kinds of feature quantities can be calculated and used.
- the feature quantity B106 extracted from the clipped image 102 from the searched image can be used as part of the determination index value 109 as an individual feature quantity as it is.
- This feature amount is not limited to one, and a plurality of different types of feature amounts can be calculated and used as individual feature amounts.
- The matching score determination processing unit may have a median value setting processing unit that sets the score at distance 0 as the median value of the score, and a result may be judged a matching incorrect answer if its score is equal to or less than this median value.
- the learning processing unit 16a-2a can use the image 101a-1 and the matching correct / incorrect information 102a-2 as the template 101a and the cut-out image (matching correct / incorrect information) 102a from the searched image.
- In the following process Y, the steps up to obtaining the determination index value 109a are the same as in process X, and can be executed by the same algorithm or the same hardware, or they may be carried out by a separate configuration.
- In process Y, a matching correct/incorrect determination boundary surface specification process 110 is performed from the determination index value 109a.
- Details of the matching correct/incorrect determination boundary surface specification process 110 will be described later; in this process, a boundary surface that separates matching success from failure in the determination index value space is specified. Since a plurality of determination index values are used, and since they include a determination index value 109a obtained from the mutual relationship between the template and the searched image, even in cases where matching success and failure cannot be separated by an image-based matching method that uses only a correlation value, the technique according to the present embodiment increases the possibility of obtaining a matching success/failure determination boundary surface that can separate them.
- In the template matching determination process 112 in the template matching determination unit 16a-3, the distance of the determination index value 109 from the matching determination boundary surface in the determination index value space is calculated as a matching determination index, and this distance is output as the determination result (matching score or the like) 113. An example of this distance calculation method will be described later.
- This makes it possible to obtain a matching determination index, such as a matching score, for the target whose matching score is to be calculated (the image 102 cut out from the searched image).
- FIG. 5 is a flowchart showing a flow of processing for performing search processing using the template matching processing described with reference to FIG.
- a part 300 surrounded by a broken line corresponds to the process described with reference to FIG. 3.
- The template matching determination process 112 is performed using the template 101, the image 102 cut out from the search target image, and the matching correctness determination boundary surface 111 obtained by the learning process, and based on this a matching determination result (matching score) 113 is calculated.
- the image 102 cut out by the image cutout process 302 of the area to be matched with the template is cut out from the search target image 301, checked against the template 101, and the determination result 113 is output through the template matching determination process 112.
- the verification target determination process 303 it is determined whether or not the determination result 113 is obtained at all the verification target positions of the searched image.
- If not, the cut-out position is changed by the cut-out position changing process 304, and an image is again cut out by the cut-out process 302 of the area to be collated with the template.
- When determination results have been obtained at all collation positions, a maximum score position selection process 305 is performed to obtain the position where the matching score is maximized.
- In the belonging class determination process 306, it is determined whether the matching position with the highest matching score is a position that can be regarded as a matching correct answer or a position that must be regarded as a matching incorrect answer.
- When the determination index value 109 at the matching position with the maximum matching score lies on the matching incorrect answer side of the matching correct/incorrect determination boundary surface (incorrect answer class, i.e. when the score is below the score acceptance), it is determined that the pattern to be searched for does not exist within the field of view of the searched image. In this case, processing such as searching for the alignment pattern around the imaging position, or interrupting measurement and notifying the user that the alignment has failed, is performed.
- When the determination index value 109 at the matching position with the maximum matching score belongs to the matching correct answer side of the matching correctness determination boundary surface (correct answer class, i.e. when the score is equal to or higher than the score acceptance), that collation position is output as the matching position 307.
- a matching score can also be output together with the matching correctness.
- As described above, template matching can be performed using the matching determination result, for example the matching score, calculated using the mutual feature amounts described with reference to FIG. 3.
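- The overall search flow of FIG. 5 can be sketched as follows (a simplified illustration, not the patent's code); 2-D grayscale NumPy arrays are assumed, score_fn stands in for the template matching determination process 112 and is assumed to return the signed distance from the matching correct/incorrect determination boundary surface, and 0 is used as the fixed score acceptance.

```python
def search_with_mutual_features(search_image, template, score_fn, acceptance=0.0):
    """Cut out every collation position, keep the maximum-score position, then
    decide the belonging class (correct/incorrect answer) against the acceptance."""
    th, tw = template.shape
    sh, sw = search_image.shape
    best_score, best_pos = float("-inf"), None
    for y in range(sh - th + 1):
        for x in range(sw - tw + 1):
            cutout = search_image[y:y + th, x:x + tw]
            score = score_fn(template, cutout)
            if score > best_score:
                best_score, best_pos = score, (x, y)
    success = best_score >= acceptance  # correct answer class if on or above the boundary
    return best_pos, best_score, success
```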
- FIG. 6 is a diagram illustrating the principle of the matching score calculation process 112 described in FIG.
- FIG. 6 (a) shows an example in which two determination index values of determination index value A and determination index value B are used.
- The matching score is calculated in the determination index value space spanned by the determination index values (shown in two dimensions in this example), using the coordinates of the point whose score is to be calculated (these coordinates in the determination index value space are determined from the respective determination index values 109) and the matching correctness determination boundary surface 111.
- the matching correctness determination boundary surface 111 is given as a result of the matching correctness determination boundary surface designation process 110 as described with reference to FIG.
- The distance to the matching correct/incorrect determination boundary surface 111 is indicated by the broken line portion 410.
- the Euclidean distance can be used as the distance.
- the distance to be used is not limited to the Euclidean distance, and any means that can calculate the distance from the matching correctness determination boundary surface 111 may be used.
- FIG. 6B is a diagram showing the relationship between the distance 410 from the matching correctness determination boundary surface 111 and the matching score 411.
- When the distance is 0, the matching score 411 is set to 0.
- the relationship between the distance and the score at that time can be linear as shown by a straight line 412 in FIG.
- The correctness of matching may be determined based on the score acceptance as described above. This score acceptance usually has to be determined by the user or the device designer, and the matching performance may differ depending on the setting. As described above, if the score is fixed at 0 for a distance of 0, no acceptance setting is necessary. Further, in a conventional matching method using, for example, normalized correlation, only one correlation value corresponds to the determination index, and that value itself becomes the score; in that case, when a single value is used as shown in FIG. 6(a), an appropriate score acceptance cannot be set if the matching correct answers and incorrect answers cannot be separated. According to the present embodiment, such a situation can be avoided.
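- A minimal sketch of using the signed distance from the boundary directly as the matching score (illustrative assumption: the decision boundary is a linear hyperplane w·x + b = 0 in the determination index value space):

```python
import numpy as np

def matching_score(index_values, w, b):
    """Signed Euclidean distance of the determination index value vector from the
    hyperplane w.x + b = 0; 0 on the boundary, positive on the correct-answer side."""
    x = np.asarray(index_values, dtype=float)
    w = np.asarray(w, dtype=float)
    return float((w @ x + b) / np.linalg.norm(w))
```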
- FIG. 7 is a diagram for explaining the specification processing 110 for the matching correct / incorrect determination boundary surface 111 described in FIG.
- The matching success/failure determination boundary surface 111 is set for the purpose of separating cases where the matching is correct (indicated by circles in FIG. 7(a)) from cases where the matching is incorrect (indicated by crosses in FIG. 7(a)) in the determination index value space. By doing so, in the belonging class determination process 306 described above, it becomes possible to determine on which side of the matching success/failure determination boundary surface 111 a matching result lies, that is, whether the matching result is at a matching correct answer position or at a matching incorrect answer position.
- The matching success/failure determination boundary surface 111 can be obtained by a method used in, for example, an SVM (Support Vector Machine).
- the matching success / failure determination boundary surface 111 corresponds to a separation hyperplane (also referred to as an identification surface) in the SVM.
- The matching success/failure determination boundary surface 111 is a separation hyperplane, and the broken line portion 501 inside and the broken line portion 502 outside the matching success/failure determination boundary surface 111 delimit what is called the margin.
- a point on the margin is called a support vector (there is at least one in each case of matching correct answer and case of matching incorrect answer).
- The separation hyperplane is set at the position where the Euclidean distance from the learning data closest to the other class (the support vectors) is maximized; that is, the margin from the extreme end of one class of cases to the other class is maximized (margin maximization).
- By this means, matching correct answer cases and matching incorrect answer cases can be separated in the feature amount space even when there are a plurality of determination index values. That is, the success or failure of matching can be determined using the matching success/failure determination boundary surface 111 obtained by this method as a reference.
- FIG. 7B is a diagram for explaining a configuration of processing for obtaining the matching success / failure determination boundary surface 111.
- a combination of a plurality of determination index values 109a and matching success / failures 102-2 described in FIG. 5 is used as one case, and data (learning data) 102a including a plurality of cases is prepared.
- the learning data 102a includes a matching correct answer case and a matching incorrect answer case.
- an SVM separation hyperplane is obtained using SVM (111), and the obtained separation hyperplane is set as a matching success / failure determination boundary surface 111.
- the matching success / failure determination boundary surface 111 can be obtained from a plurality of determination index values using the SVM.
- The method for obtaining the identification surface (separation hyperplane) is not limited to the SVM; any method may be used as long as it yields a matching success/failure determination boundary surface that separates matching correct answer cases from matching incorrect answer cases.
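- As one concrete (but non-authoritative) way to realize the boundary surface specification process 110, a linear SVM such as the one in scikit-learn could be trained on the learning cases; scikit-learn is an assumption of this sketch, not something the patent specifies.

```python
import numpy as np
from sklearn.svm import SVC

def learn_boundary(index_values, correct_incorrect_labels):
    """Fit a maximum-margin separating hyperplane to learning cases consisting of
    determination index value vectors and matching correct (1) / incorrect (0) labels."""
    clf = SVC(kernel="linear")
    clf.fit(np.asarray(index_values, dtype=float), np.asarray(correct_incorrect_labels))
    return clf

# clf.decision_function(x) then gives a value proportional to the signed distance
# from the separating hyperplane, which can serve as the matching score.
```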
- FIG. 8 is a diagram for explaining the determination index value calculation means described in FIGS. 3 and 5. As described with reference to FIG. 3, the determination index value 109 a is calculated from the template 101 and the cutout image 102 at the matching candidate position.
- The mutual feature amount (feature amount D 108) is calculated from the feature amount extracted from the template 101 by the feature amount extraction unit A 103 (here, the feature amount A 105) and the feature amount extracted from the cut-out image 102 at the matching candidate position by the feature amount extraction unit B 104 (here, the feature amount B 106); the method for calculating the mutual feature amount is described later.
- the feature amount C608 is calculated by the feature amount extraction processing C605 by using the clipped image 102 or the template 101 at the matching candidate position.
- a method of calculating the individual feature amount will be described with reference to FIG.
- the obtained feature amount D108 or feature amount C608 is set as the determination index value 109a.
- a plurality of types of feature quantity A105, feature quantity B106, feature quantity C608, and feature quantity D108 may be used.
- a plurality of types of determination index values 109a are also used. With this configuration, a plurality of types of mutual feature amounts and individual feature amounts can be obtained from the template 101 and the cutout image 102 at the matching candidate position, and the feature amounts can be used as the determination index value 109a.
- FIG. 9 is a diagram illustrating the feature amount described in FIG.
- The feature amounts are classified according to their properties and are referred to as the first type feature amount, the second type feature amount, and the third type feature amount.
- The first type feature amount is a feature determined from the target image, or from a partial region of it, regardless of the position (coordinates) in the image at which the feature amount is calculated. For example, the average pixel value and the pixel value variance of the entire image are first type feature amounts. This feature amount corresponds to an individual feature amount.
- The second type feature amount is a feature amount determined by a position (coordinates) in the image. For example, as shown in the figure, the calculated feature value Vi,j is associated with the coordinates (i, j) 1402 on the image (here, the upper left of the image is taken as the origin of the image coordinate system). The feature amount Vi,j can be expressed as a multidimensional vector whose elements are f1 to fn (n being the number of vector elements).
- the SIFT feature amount represents a feature by a vector determined for each certain coordinate (each feature point) on the image.
- the area around the feature point is divided into a plurality of small areas (16 areas), and a histogram is generated with bins of gradient directions (8 directions) of pixel values in each small area.
- a vector having a bin as one of vector elements (the number of elements is 128 (16 ⁇ 8)) is used as a feature amount.
- This feature amount also corresponds to the individual feature amount.
- the third type feature amount is a feature amount determined by the second type feature amount calculated from the template image, the second type feature amount calculated from the searched image, and the relative position between the two images (for example, the collation position of both images). It is.
- the mutual feature amount is a third type feature amount.
- the method for obtaining the third type feature using the second type feature (mutual feature amount calculation method) will be described in detail with reference to FIGS. 11 and 13.
- the feature amount is determined by the relative position 1416 with respect to the region (broken line portion) cut out from the searched image.
- FIG. 10 is a diagram for explaining a feature amount calculation area when calculating a feature amount determined by a position in an image in the second type feature amount described in FIG.
- FIG. 10A shows an example in which the feature amount is obtained from certain coordinates in the image.
- the pixel value, pixel value gradient information, and the like at this coordinate are used as feature amounts. Therefore, the feature amount is determined by the coordinates in the image.
- FIG. 10B is an example in which the feature amount is obtained from a certain rectangular area in the image. In this rectangular area, the pixel value average, the pixel value variance, the value of each bin of the pixel value histogram, or the value of each bin of the pixel value gradient direction histogram calculated by dividing the rectangular area into small areas are used as feature amounts.
- FIG. 10C shows an example in which the feature amount is obtained from a circular area in the image. Similar to the rectangular area in FIG. 10B, the pixel value average, pixel value variance, and each bin value of the pixel value histogram in this circular area, or the pixel value gradient direction calculated by dividing the circular area into small areas The value of each bin of the histogram is used as a feature amount. As a result, the feature around the target coordinate for calculating the feature amount can also be used, and matching can be made more robust by using this feature.
- FIG. 10D shows an example in which a feature amount in a region having an arbitrary shape in the image is obtained. Similar to the rectangular area and the circular area in FIGS. 10B and 10C, the feature amount may be calculated from the arbitrarily shaped area. As a result, the feature around the target coordinate for calculating the feature amount can also be used, and matching can be made more robust by using this feature.
- FIG. 11 is a diagram for explaining how the third type feature amount described above is obtained from the second type feature amounts. As described above, the third type feature amount is obtained based on the second type feature amount of the template image, the second type feature amount of the searched image, and the relative position between the two images.
- A region (broken line portion) 1610 having the same size as the template image 1601 is cut out from the searched image 1605 (cut-out position (X, Y) 1607), and the second type feature amount is calculated from the cut-out region.
- the second type feature amount is also calculated from the template image 1601.
- A mutual feature amount is then obtained from the mutual relationship between the second type feature amount calculated from the searched image 1605 and the second type feature amount calculated from the template image 1601. For example, the distance between the vectors representing the second type feature amounts of the two is used as the mutual feature amount.
- The distance measure is not limited as long as it can quantify the relationship between the two feature amounts; for example, the Euclidean distance, the Manhattan distance, or the Bhattacharyya distance may be used.
- The mutual feature amount is thus a feature amount determined by the relative position between the template image and the searched image (in this example, the image cut-out position (X, Y) 1607 in the searched image corresponds to the relative position).
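- A hedged sketch of this vector-distance mutual feature (illustrative only; the inputs are assumed to be the second type feature vectors computed at corresponding positions in the template and the cut-out region):

```python
import numpy as np

def mutual_feature_distance(v_template, v_search, metric="euclidean"):
    """Distance between the second type feature vectors of the template and the
    cut-out region of the searched image, used as a mutual (third type) feature."""
    a = np.asarray(v_template, dtype=float)
    b = np.asarray(v_search, dtype=float)
    if metric == "euclidean":
        return float(np.linalg.norm(a - b))
    if metric == "manhattan":
        return float(np.abs(a - b).sum())
    raise ValueError("unsupported metric: " + metric)
```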
- FIG. 11 (b) is a diagram for explaining a method for determining the relative positions of the template image and the image to be searched by a method different from the method of FIG. 11 (a).
- In this method, a voting-based approach is used as one way of estimating the position on the searched image that resembles the template image, and the resulting vote count is used as the third type feature amount.
- a second type feature quantity is calculated for each of the template image and the searched image (the calculation of the second type feature quantity for the searched image here is for the entire image area).
- A point 1631 that serves as a positional reference when obtaining the second type feature amounts in the template image is called the reference point (for example, if the upper left of the image is the origin O of the image coordinate system and the second type feature amounts are defined with respect to it, the origin O is the reference point).
- For each second type feature amount, the feature amount with the highest similarity is selected from the second type feature amounts of the other image, and the two are stored as a pair.
- For the position (coordinates) of the second type feature amount on the searched image side that is paired with a second type feature amount in the template image, the coordinates corresponding to the distance and direction of that feature point from the reference point in the template image are obtained, and a vote is cast at those coordinates as matching position candidate coordinates (one vote per pair).
- This voting process is performed for all pairs (or all pairs having a certain degree of similarity).
- The number of votes at the reference point, for example at the upper left coordinate of the cut-out region 1641, is then used as the third type feature amount.
- Instead of only the most similar pair, several of the most similar pairs for each feature point may be used (for example, the top three).
- the third type feature quantity can be calculated from the second type feature quantity.
- By using the third type feature amount, which is a mutual feature amount, it becomes possible to perform more robust matching.
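- A rough sketch of this voting-based mutual feature (illustrative assumptions: the template reference point is its image origin, `pairs` holds the paired feature point coordinates, and rotation and scale differences between the images are ignored):

```python
import numpy as np

def voting_feature(pairs, search_shape, cutout_origin):
    """Each pair ((tx, ty) in the template, (sx, sy) in the searched image) votes for
    the searched-image coordinate where the template reference point would fall;
    the vote count at the cut-out region's reference point is the third type feature."""
    votes = np.zeros(search_shape, dtype=int)
    for (tx, ty), (sx, sy) in pairs:
        vx, vy = int(round(sx - tx)), int(round(sy - ty))
        if 0 <= vy < search_shape[0] and 0 <= vx < search_shape[1]:
            votes[vy, vx] += 1
    ox, oy = cutout_origin
    return int(votes[oy, ox])
```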
- FIG. 12 is a diagram illustrating a specific example of the feature amount described in FIG.
- the same kind of feature quantity A105, feature quantity B106, and feature quantity C608 (individual feature quantity) described in FIG. 8 can be used.
- As the feature amount A 105, feature amount B 106, and feature amount C 608, for example, a feature amount 702 relating to the texture in an area based on specified coordinates in the image, or an edge feature amount 703 representing information on the structure of a pattern in the image, can be used.
- Examples of the feature amount related to the texture include those using a histogram feature amount 707, a contrast feature amount 708, and a co-occurrence matrix 709, which will be described later.
- This feature amount is a feature amount corresponding to the second type feature amount described with reference to FIG.
- the histogram feature 707 is a feature value including an average, variance, skewness, kurtosis, etc. obtained by analyzing a gradation value histogram in a region with reference to a specified coordinate in the image for each of the template and the searched image.
- the contrast feature amount 708 uses an average gradation value of an area designated by each of the template and the searched image as a feature amount.
- As the designated area, for example, an area in the image where a pattern (for example, a line pattern) exists, or an area where no pattern exists (a background area), is used.
- the contrast difference between a plurality of designated areas in the respective fields of view of the template and the searched image may be obtained (contrast feature in the image), and the value may be used as the feature amount.
- Feature point information 704 obtained using a technique such as SIFT (Non-Patent Document 2) can also be used.
- The edge feature quantity 703 includes a feature quantity such as HOG (Histogram of Oriented Gradients).
- the feature amount D (mutual feature amount 720) is calculated mutually from the feature amounts obtained from the feature amount A105 and the feature amount B106.
- For the histogram features, the degree of coincidence (for example, the difference between values obtained by analysing the histograms) or the correlation value of the histogram distribution shapes obtained from the template and the searched image may be used as the feature amount.
- For the contrast features, the degree of coincidence, for example the difference between the contrast feature values obtained from the template and the searched image, is used as the feature amount.
- a correlation value between the template and the searched image itself may be used as the feature amount.
- the image used at that time may be the input image itself, or an image in which preprocessing such as noise removal processing and edge enhancement processing has been performed in advance may be used.
- When the mutual feature amount is obtained from corner point information, the number of matching corner points obtained from the template and the searched image can be used.
- Alternatively, the number of votes in corresponding point matching may be used as the feature amount, as shown in Non-Patent Document 2. This feature amount corresponds to the third type feature amount described above.
- the plurality of individual feature values 701 and some or all of the mutual feature values as described above can be used in the template matching described with reference to FIGS.
- FIG. 13 is a diagram for explaining an example of the feature amount calculation means described in FIG. Any of the following feature amounts can be used for the feature amount A105, the feature amount B106, and the feature amount C608 described with reference to FIGS.
- FIG. 13A is a diagram for explaining the histogram feature.
- the histogram feature is a means that uses, as a feature, a distribution shape of a gradation value histogram in a specified region or a value obtained by analyzing the distribution. Histograms 804 and 805 are obtained from the template 801 and the image 803 cut out from the searched image 802, respectively.
- the distribution shape of the histogram can be used as it is. For example, it is characterized by a vector whose element is the frequency of each bin (divided section of the data range) of the histogram. Alternatively, some or all of the average, variance, skewness, and kurtosis calculated by analyzing the shape of the distribution may be used as the feature amount.
- a cumulative histogram may be used as the gradation value histogram.
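- As a concrete illustration of the histogram feature of FIG. 13A, the following is a minimal sketch in Python with NumPy/SciPy (all function and parameter names are assumptions, not part of the embodiment) that computes both the normalized bin-frequency vector and the average/variance/skewness/kurtosis summary of a designated region.

```python
import numpy as np
from scipy.stats import kurtosis, skew

def histogram_features(image, region, bins=32):
    """image: 2-D gray-scale array; region: (y0, y1, x0, x1) designated area."""
    y0, y1, x0, x1 = region
    patch = image[y0:y1, x0:x1].astype(np.float64)
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    hist = hist / hist.sum()                       # distribution shape as a vector
    values = patch.ravel()
    stats = np.array([values.mean(), values.var(), skew(values), kurtosis(values)])
    return hist, stats                             # shape vector and summary statistics
```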
- FIG. 13B is a diagram for explaining the contrast feature.
- the average value of the gradation values in the designated areas 814 and 815 is used as the feature amount.
- The feature is not limited to the average value; any value that represents the gradation values in the region, for example a variance, a maximum value, or a minimum value, may be used.
- FIG. 13C is a diagram for explaining a feature amount different from that in FIG. 13B regarding the contrast feature.
- The ratio of the average gradation values of the designated areas 822 and 823 (the contrast within the image) is obtained, and that value is used as the feature amount.
- Likewise, the ratio of the average gradation values of the designated areas 826 and 827 (the contrast within the image) is used as the feature amount.
- The average value is used here, but the present invention is not limited to this; any value that represents the gradation values in the area may be used.
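- The contrast features of FIGS. 13B and 13C can be sketched as follows (an assumed helper, not the embodiment itself): the mean gray level of a designated area, and the ratio of the means of two designated areas such as a pattern area and a background area.

```python
import numpy as np

def region_mean(image, region):
    """Mean gradation value of a designated area (y0, y1, x0, x1) -- FIG. 13B."""
    y0, y1, x0, x1 = region
    return float(image[y0:y1, x0:x1].mean())

def contrast_in_image(image, pattern_region, background_region, eps=1e-6):
    """Ratio of mean gradation values of two designated areas -- FIG. 13C."""
    return region_mean(image, pattern_region) / (region_mean(image, background_region) + eps)
```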
- the feature amounts acquired by the method illustrated in FIGS. 13A to 13C are individual feature amounts that can be used for the feature amount A105, the feature amount B106, and the feature amount C608 described above.
- the present invention is not limited to these feature amounts, and any value or vector that represents the features of an image cut out from the template and the searched image may be used.
- the mutual feature quantity which is the feature quantity D720 described with reference to FIGS. 8 and 12 can be obtained by comparing the individual feature quantities obtained from the template image and the image cut out from the searched image.
- the correlation value of the distribution shape of the histogram obtained from the template image and the image cut out from the searched image is used as the mutual feature amount.
- Alternatively, the difference or ratio of the averages (or variances, skewness, kurtosis, etc.) obtained by analyzing the histograms can be used as a mutual feature amount.
- For the contrast feature amounts obtained as in FIGS. 13B and 13C, the difference or ratio between the values obtained from the template image and the searched image can be used as the mutual feature amount.
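- A minimal sketch of such mutual feature amounts, assuming the individual features above have already been computed for the template and the cut-out image (all names are assumptions), returns the correlation of the two histogram shapes and the element-wise difference and ratio of their summary statistics.

```python
import numpy as np

def mutual_histogram_features(hist_t, hist_s, stats_t, stats_s, eps=1e-6):
    correlation = float(np.corrcoef(hist_t, hist_s)[0, 1])   # correlation of histogram shapes
    difference = stats_t - stats_s                            # e.g. differences of means, variances
    ratio = stats_t / (stats_s + eps)                         # e.g. ratios of means, variances
    return correlation, difference, ratio
```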
- the following can be used as a mutual feature amount.
- FIG. 13D is a diagram for explaining line profile characteristics.
- Pixels are added and averaged (projected) in a certain direction of the image to obtain a one-dimensional waveform. This is called a line profile.
- FIG. 13D shows an example in which each image is projected in the Y direction.
- Correlation values of line profiles 834 and 835 of the respective images can be obtained, and the correlation values can be used as mutual feature amounts. It should be noted that the range in which the line profile is correlated is not limited to using the entire line profile, and the correlation value of only a section cut out of a part of the line profile may be used.
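- A minimal sketch of the line-profile feature of FIG. 13D (helper names assumed): each image is averaged along the Y direction to give a one-dimensional waveform, and the correlation of the two waveforms over a common section is used as the mutual feature amount.

```python
import numpy as np

def line_profile_correlation(template, cutout):
    prof_t = template.astype(np.float64).mean(axis=0)   # project (average) along Y
    prof_s = cutout.astype(np.float64).mean(axis=0)
    n = min(len(prof_t), len(prof_s))                    # correlate a common section
    return float(np.corrcoef(prof_t[:n], prof_s[:n])[0, 1])
```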
- FIG. 13E is a diagram illustrating an example in which the correlation value of the image itself is a mutual feature amount.
- a correlation value between images is calculated in the template 841 and an image 843 cut out from the searched image 842, and the correlation value is used as a feature amount.
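- The image-correlation feature of FIG. 13E corresponds to a zero-mean normalized cross-correlation between the template and a same-sized cut-out image; a minimal sketch (assumed helper) is:

```python
import numpy as np

def zncc(template, cutout):
    t = template.astype(np.float64).ravel()
    s = cutout.astype(np.float64).ravel()
    t -= t.mean()
    s -= s.mean()
    denom = np.linalg.norm(t) * np.linalg.norm(s)
    return float(np.dot(t, s) / denom) if denom > 0 else 0.0
```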
- FIG. 13F is an example in which the corresponding point matching result of SIFT is a mutual feature amount.
- Corresponding point matching is performed on feature points (feature descriptors) extracted from the template 851 and the searched image 852; for example, the corresponding points 853 connected by arrows are corresponding pairs.
- For each corresponding feature point in the searched image 852, the coordinates, scale, and rotation amount are obtained (Non-Patent Document 2).
- Next, based on reference point coordinates (for example, the position of the white circle in the template 851),
- the position of the corresponding reference point in the searched image 852 is estimated from the coordinates, scale, and rotation amount obtained above by a generalized Hough transform, and
- voting (voting processing) is performed on the estimated position (Non-Patent Document 2).
- The feature amount can be the number of votes cast at the position (or its periphery) of the image cut out from the searched image.
- Instead of the number of votes, the voting density (the number of votes divided by the area of the surrounding region) may be used, taking the surrounding region into account.
- Alternatively, the correlation value between the SIFT feature of a corresponding point in the template and the SIFT feature of its corresponding point in the searched image may be used as the feature amount.
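- A minimal sketch of the corresponding-point feature of FIG. 13F, using OpenCV's SIFT as a stand-in for Non-Patent Document 2 (the ratio threshold and the use of a simple match count instead of a full generalized Hough vote are assumptions):

```python
import cv2

def sift_match_count(template, searched, ratio=0.75):
    sift = cv2.SIFT_create()
    kp_t, des_t = sift.detectAndCompute(template, None)
    kp_s, des_s = sift.detectAndCompute(searched, None)
    if des_t is None or des_s is None:
        return 0
    pairs = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_t, des_s, k=2)
    # Lowe's ratio test; the surviving matches play the role of votes here
    good = [p[0] for p in pairs if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return len(good)
```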
- FIG. 13G is an example in which the corresponding point matching result of corner points is used as a mutual feature quantity, instead of SIFT as in FIG. 13F.
- Corresponding point matching is performed using corners extracted from the template 861 and the image 863 cut out from the searched image 862 as feature points (for example, the points connected by arrows are corresponding points), and
- the coordinates, scale, and rotation amount of each corresponding feature point are obtained.
- Mutual feature quantities can be obtained by the above methods. Note that the method for calculating the mutual feature amount is not limited to the methods described here, and any feature amount (a scalar value, a vector, or the like) expressing the mutual relationship between the template and the searched image may be used.
- the template and the searched image are preprocessed to generate an image with reduced noise or enhanced features, and the above-described feature amounts (individual feature amounts and mutual feature amounts) are obtained for the generated images.
- Examples of the preprocessing include smoothing filtering, edge enhancement filtering, and binarization processing.
- the present invention is not limited to the processing described here, and any filter processing can be used as long as it can be used as preprocessing.
- A process combining a plurality of preprocessing steps may also be applied to the template or the searched image, and the above-described feature amounts (individual feature amounts and mutual feature amounts) can be obtained from the resulting image.
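- The preprocessing mentioned above can be sketched as follows (kernel sizes and thresholds are assumptions): smoothing, edge enhancement, and binarization applied before the individual and mutual feature amounts are computed.

```python
import cv2

def preprocess(image, mode="smooth"):
    if mode == "smooth":                 # smoothing filter
        return cv2.GaussianBlur(image, (5, 5), 0)
    if mode == "edge":                   # edge enhancement filter
        return cv2.convertScaleAbs(cv2.Laplacian(image, cv2.CV_16S, ksize=3))
    if mode == "binarize":               # binarization (Otsu threshold)
        _, binary = cv2.threshold(image, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        return binary
    return image
```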
- FIG. 14 is a diagram illustrating in detail the learning data when calculating the matching success / failure determination boundary surface 111 described in FIG. FIG. 14A shows an example of simple learning data.
- One template 1001, an image 1002 in which a matching correct answer position in the searched image is cut out, and an image 1003 in which a matching incorrect answer position is cut out are used as learning data.
- A plurality of cut-out images (correct positions and incorrect positions) may be used when several such images can be obtained from one searched image, or, when a plurality of images are used for learning, one template and cut-out images may be taken from each of the plurality of images used for learning.
- In this way, the matching success/failure determination boundary surface 111 that separates matching correct and incorrect positions in the determination index value space described in FIG. 7 is determined, and the possibility of improving generalization performance increases.
- FIG. 14B is a diagram illustrating an example in which a plurality of types of templates 1011 are used as learning data.
- By using a plurality of templates 1004, it is possible to obtain a matching success/failure determination boundary surface that is less dependent on a specific template pattern or appearance, that is, more versatile.
- FIG. 15 is a diagram for explaining a method of determining not only the matching success / failure boundary surface in FIG. 3 but also the feature amount extraction method in the feature amount extraction unit by learning.
- the feature amount extraction method is learned by a genetic algorithm (hereinafter referred to as GA) or genetic programming (hereinafter referred to as GP).
- Feature extraction is a combination of a plurality of image processes. Each image process may have a plurality of setting parameters. The combination of the image processing and the setting parameters of each processing are learned using GA or GP.
- a combination of image processing (including parameter setting) for calculating a determination index value is set as a chromosome (solution candidate).
- FIG. 15A shows the flow of processing when GA or GP is used for learning.
- FIG. 15B is a diagram showing in detail the evaluation unit in FIG.
- chromosome 1721 is a combination of processes for calculating determination index value 1728 (a plurality of determination index values are calculated from one chromosome).
- a plurality of chromosomes (solution candidates) 1721 are generated (for example, 100 individuals are generated).
- a matching success / failure determination boundary surface calculation process 1723 is performed.
- the matching success / failure determination boundary surface is calculated by the SVM described above.
- As the evaluation value 1724, for example, the distance (score) from the matching boundary surface to the support vectors in the SVM can be used.
- Learning end determination 1725 is performed based on whether or not the evaluation value satisfies a specified value (for example, whether or not the distance is larger than the specified value).
- The chromosome at the end of learning gives the feature amount extraction method (the combination of image processing and the parameter settings for each image processing), and the matching success/failure determination boundary surface is also determined. If learning has not ended, it continues, and the processing of the selection 1704, crossover 1705, and mutation 1706 shown in FIG. 15A is performed again (generation change).
- Although the distance is used as the evaluation value here, the evaluation value is not limited to the distance as long as it can judge the quality of the matching success/failure determination boundary surface.
- the method used for learning is not limited to GA (or GP) and SVM, and any method capable of such learning may be used.
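- A heavily simplified sketch of the learning loop of FIG. 15 follows (all names, the linear-kernel SVM, the margin proxy, and the placeholder generation-change step are assumptions): each chromosome encodes a combination of image-processing steps, its determination index values are computed on the learning data, an SVM boundary surface is fitted, and the SVM margin serves as the evaluation value.

```python
import random
import numpy as np
from sklearn.svm import SVC

def evaluate(chromosome, learning_pairs, labels, compute_index_values):
    # compute_index_values(chromosome, template, cutout) -> vector of determination index values
    X = np.array([compute_index_values(chromosome, t, s) for t, s in learning_pairs])
    svm = SVC(kernel="linear").fit(X, labels)
    margin = 1.0 / np.linalg.norm(svm.coef_)     # larger margin = better boundary surface
    return margin, svm

def learn(population, learning_pairs, labels, compute_index_values,
          target_margin=0.5, generations=100):
    best_margin, best_chrom, best_svm = -np.inf, None, None
    for _ in range(generations):
        scored = [(m, c, s) for c in population
                  for m, s in [evaluate(c, learning_pairs, labels, compute_index_values)]]
        scored.sort(key=lambda item: item[0], reverse=True)
        if scored[0][0] > best_margin:
            best_margin, best_chrom, best_svm = scored[0]
        if best_margin >= target_margin:          # learning-end determination
            break
        parents = [c for _, c, _ in scored[:max(1, len(scored) // 2)]]   # selection
        # crossover/mutation are stubbed out; a real GA/GP would recombine the parents
        population = parents + [list(random.choice(parents)) for _ in parents]
    return best_chrom, best_svm
```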
- FIG. 16 is a diagram illustrating an example of a GUI for realizing manual setting of the matching success / failure determination boundary surface 111.
- the matching success / failure determination boundary surface 111 can be obtained by using a technique such as SVM, for example.
- the user manually specifies the matching success / failure determination boundary surface 111.
- FIG. 16A is a diagram illustrating an example of user setting via the GUI. In the GUI shown in FIG. 16A, the determination index value space is displayed on a display device 20 such as an LCD screen.
- In this figure, a graph in which the two determination index values, determination index value A1103 and determination index value B1104, are plotted on the vertical and horizontal axes is taken as an example; when three or more determination index values are used, the display can be switched among the index values assigned to the axes.
- the boundary surface drawing button 1102 is selected with a mouse or the like, and acceptance of user input of the matching determination boundary surface 1101 is started. Next, the user manually draws the determination boundary surface 1101 in the determination index value space on the GUI by mouse input or the like.
- the matching determination boundary surface drawn by the user can be used as the matching success / failure determination boundary surface 111 described in FIG.
- GUI is not limited to the format as shown in FIG. 16, and any known technique can be used as long as the user can manually specify the matching success / failure determination boundary surface 111.
- FIGS. 16B and 16C are diagrams illustrating an example of a determination index value space displayed on the GUI.
- FIG. 16B shows an example in which the matching correctness determination boundary surface 111 is drawn with a straight line
- FIG. 16C shows an example in which the matching correctness determination boundary surface 111 is drawn with a curve.
- The plots displayed with the symbols ○ and × on the graph are determination index values from the learning data; one symbol is obtained from one set of a template and an image cut out from the searched image.
- The symbol ○ indicates a determination index value when the matching is correct,
- and × indicates a value when the matching is incorrect.
- The distribution of the symbols ○ and × can be displayed as reference data when the user manually designates the matching success/failure determination boundary in the space of the determination index values 1103 and 1104.
- Here, ○ and × are used as symbols, but the symbols are not limited to these, and any symbols can be used as long as correct and incorrect answers can be distinguished.
- FIG. 17 is a diagram illustrating an example of a GUI for confirming the stability of a matching result in a determination index value space spanned by determination index values.
- FIG. 17A is an example of a two-dimensional determination index value space.
- In the GUI, the determination index value space spanned by the determination index value A 1203 and the determination index value B 1204, the matching success/failure determination boundary surface 1201, and
- the position 1202 in the determination index value space of the matching result, obtained from the template and the image cut out from the searched image at the matching position, are displayed.
- With the GUI shown in this figure, it is possible to graphically confirm how far the matching result is from the matching success/failure determination boundary surface 1201. If the matching result is close to the boundary surface 1201, it can be confirmed that a slight change in the determination index values could flip the result between correct and incorrect, that is, the matching may be unstable.
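- The stability check behind this display can also be expressed numerically as the signed distance of the matching result from the boundary surface; a minimal sketch using a fitted scikit-learn SVM (the 0.1 threshold is an arbitrary assumption) is:

```python
import numpy as np

def matching_stability(svm, index_values, margin_threshold=0.1):
    """Signed distance from the success/failure boundary; values near zero are unstable."""
    score = float(svm.decision_function(np.atleast_2d(index_values))[0])
    return score, abs(score) < margin_threshold
```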
- FIG. 17B is a diagram showing another example of a GUI for confirming the stability of the matching result, as in FIG. 17A; here a determination index value space spanned by three determination index values can be checked graphically as a three-dimensional graph.
- the matching success / failure determination boundary surface 1205 and the matching result position 1202 can be displayed as in FIG.
- The matching success/failure determination boundary surface 1205 can also be displayed transparently so that the data around and inside the boundary surface can be confirmed.
- the position of the learning data, the matching success / failure determination boundary surface, and the matching result in the determination index value space can be confirmed from any viewpoint such as the front side or the back side by using the viewpoint movement buttons 1206, 1207, and the like.
- FIG. 17C is a GUI for confirming the stability of the matching result, as in FIGS. 17A and 17B.
- In this GUI, the determination index value space spanned by a plurality of determination index values is divided into pairs of two arbitrary index values.
- For each pair, the matching determination boundary surface is projected onto the two-dimensional space spanned by those two determination index values, and a display 1211 in which all or some of the resulting projections 1214 are arranged in correspondence with the pairs is shown.
- FIG. 17C shows an example in which there are four determination index values A, B, C, and D (1212, 1213).
- For example, the projection of the matching determination boundary in the determination index value space obtained from A and B is shown as the projected surface 1214,
- and such a projected surface can be seen for every pair of index values. As described above, it is possible to provide a GUI for confirming the stability of the matching result.
- The arrangement, size, and items of the display members in the GUI are not limited to those shown in the figure; any display method may be used as long as the relationship among the learning data, the matching success/failure determination boundary surface, and the matching result position in the determination index value space can be confirmed.
- Each component of the present invention can be arbitrarily selected, and an invention having a selected configuration is also included in the present invention.
- the present invention can be used for a pattern matching device.
Abstract
Description
Note that the details of the process for determining whether or not the pattern to be searched for exists in the searched image, and of the means for obtaining the matching position, will be described later with reference to FIG. 6. In the present embodiment, one purpose is, for example, to make pattern matching succeed even when the visual discrepancy between the template 101 and the matching correct position in the searched image 102 becomes large. Specifically, as described in the latter half of the explanation of FIG. 3, a determination index value 109 for performing feature-amount-based matching is obtained using a mutual feature amount 108 calculated from both the template 101 and the searched image 102. As a result, matching can be performed with feature amounts that are not easily affected by the difference in appearance between the template 101 and the searched image 102 (or matching that uses feature amounts in a way that is not easily affected by it), which was difficult to handle with feature-amount-based matching using individual feature amounts obtained only from the template 101 or only from the searched image 102, and the robustness of template matching can be improved.
The obtained mutual feature amount 108 is used as part or all of the determination index values 109 used in the template matching determination unit 16a-3 (matching score calculation unit). Note that the mutual feature amount 108 is not limited to one; a plurality of feature amounts of different types can be calculated and used. The feature amount B106 extracted from the image 102 cut out from the searched image can also be used as it is, as an individual feature amount, for part of the determination index values 109. This feature amount is likewise not limited to one; a plurality of feature amounts of different types can be calculated and used as individual feature amounts. The matching score determination processing unit may have a median setting processing unit that sets a distance of 0 as the median of the score, and may treat a result as a matching incorrect answer if the matching score is equal to or less than the median and as a matching correct answer if it is equal to or greater than the median.
ρ_XY = Cov(x, y) / ( V(x)^(1/2) · V(y)^(1/2) ), where Cov(x, y) is the covariance of x and y and V(·) is the variance (the correlation coefficient between x and y).
Claims (10)
- 1. A matching processing device that performs pattern matching on a searched image, comprising: a feature region extraction processing unit that extracts, from a template image acquired for learning, a region for extracting a feature amount determined by coordinates in the image; a feature amount extraction processing unit that extracts a feature amount determined by coordinates in the image from a searched image acquired for learning; a first mutual feature amount calculation processing unit that calculates a first mutual feature amount between the template image and the searched image from the feature amount extracted from the template image, the feature amount extracted from the searched image, and the relative position between the template image and the searched image; an identification boundary surface calculation unit that calculates an identification boundary surface dividing matching success from failure using a plurality of the first mutual feature amounts calculated from feature amounts having different values at the same coordinates in the image, extracted by a plurality of the feature amount extraction processing units having different feature amount extraction operations; a second mutual feature amount calculation processing unit that calculates a second mutual feature amount from a template image acquired from an inspection target and a searched image; and a template matching processing unit that performs matching between the template image to be inspected and the searched image using a plurality of the second mutual feature amounts calculated from feature amounts having different values at the same coordinates in the image and the identification boundary surface.
- 2. The matching processing device according to claim 1, wherein the template matching processing unit comprises a matching score calculation processing unit that calculates, as a matching score, the distance from the identification boundary surface in a feature amount space composed of a plurality of feature amounts.
- 3. The matching processing device according to claim 2, wherein the matching score calculation processing unit has a median setting processing unit that sets a distance of 0 as the median of the score, and a matching result is determined to be incorrect if the matching score is equal to or less than the median and correct if it is equal to or greater than the median.
- 4. The matching processing device according to claim 1, wherein the second mutual feature amount calculation unit uses, as a feature amount, at least one of: a normalized correlation value between the template and the searched image; the degree of coincidence of the coordinates of similar feature points between the feature point group extracted from the template and the feature points extracted from the searched image; and the degree of coincidence between the gradation value histogram obtained from the template and the gradation value histogram obtained from the searched image.
- 5. The matching processing device according to claim 1, further comprising a matching success/failure determination boundary surface designation processing unit that obtains the matching determination boundary surface using, as inputs, at least one of the first mutual feature amount, the feature amount extracted from the template, and the feature amount extracted from the searched image, together with matching success/failure results obtained in advance for templates and searched images.
- 6. The matching processing device according to claim 5, wherein the feature amount of the matching target and the identification boundary surface are displayed in a GUI, and processing is performed to obtain the identification boundary surface so as to maximize the margin according to the feature amount of the matching target, making it possible to determine whether the matching process is correct.
- 7. An inspection device that performs pattern matching using the matching processing device according to claim 1.
- 8. A matching processing method for performing pattern matching on a searched image, comprising: a feature region extraction step of extracting, from a template image acquired for learning, a region for extracting a feature amount; a feature amount extraction step of extracting a feature amount from a searched image acquired for learning; a first mutual feature amount calculation step of calculating a first mutual feature amount between the template image and the searched image from the feature amount extracted from the template image and the feature amount extracted from the searched image; an identification boundary surface calculation step of calculating an identification boundary surface dividing matching success from failure using a plurality of the first mutual feature amounts; a second mutual feature amount calculation step of calculating a second mutual feature amount from a template image acquired from an inspection target and a searched image; and a template matching step of performing matching between the template image to be inspected and the searched image using the second mutual feature amount and the identification boundary surface.
- 9. A program for causing a computer to execute the matching processing method according to claim 8.
- 10. A matching processing device that performs pattern matching on a searched image, comprising: a feature region extraction processing unit that extracts, from a template image, a region for extracting a feature amount determined by coordinates in the image; a feature amount extraction processing unit that extracts a feature amount determined by coordinates in the image from a searched image; a mutual feature amount calculation processing unit that calculates a mutual feature amount between the template image and the searched image from the feature amount extracted from the template image, the feature amount extracted from the searched image, and the relative position between the template image and the searched image; and a template matching processing unit that performs matching between the template image to be inspected and the searched image using a plurality of the mutual feature amounts calculated from feature amounts having different values at the same coordinates in the image, extracted by a plurality of the feature amount extraction processing units having different feature amount extraction operations, and a preset identification boundary surface dividing matching success from failure.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/417,425 US9619727B2 (en) | 2012-07-27 | 2013-07-16 | Matching process device, matching process method, and inspection device employing same |
KR1020157002223A KR101701069B1 (en) | 2012-07-27 | 2013-07-16 | Matching process device, matching process method, and inspection device employing same |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-167363 | 2012-07-27 | ||
JP2012167363A JP5941782B2 (en) | 2012-07-27 | 2012-07-27 | Matching processing apparatus, matching processing method, and inspection apparatus using the same |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2014017337A1 true WO2014017337A1 (en) | 2014-01-30 |
WO2014017337A8 WO2014017337A8 (en) | 2014-05-22 |
Family
ID=49997151
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/069296 WO2014017337A1 (en) | 2012-07-27 | 2013-07-16 | Matching process device, matching process method, and inspection device employing same |
Country Status (5)
Country | Link |
---|---|
US (1) | US9619727B2 (en) |
JP (1) | JP5941782B2 (en) |
KR (1) | KR101701069B1 (en) |
TW (1) | TWI579775B (en) |
WO (1) | WO2014017337A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3021280A1 (en) * | 2014-11-07 | 2016-05-18 | JEOL Ltd. | Image evaluation method and charged particle beam device |
CN109724988A (en) * | 2019-02-01 | 2019-05-07 | 佛山市南海区广工大数控装备协同创新研究院 | A kind of pcb board defect positioning method based on multi-template matching |
CN110969661A (en) * | 2018-09-30 | 2020-04-07 | 上海微电子装备(集团)股份有限公司 | Image processing device and method, position calibration system and method |
Families Citing this family (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6063315B2 (en) * | 2013-03-26 | 2017-01-18 | 富士フイルム株式会社 | Authenticity determination system, feature point registration apparatus and operation control method thereof, and collation determination apparatus and operation control method thereof |
JP6454533B2 (en) * | 2014-12-15 | 2019-01-16 | 株式会社日立ハイテクノロジーズ | Charged particle beam equipment |
US11669953B2 (en) | 2015-01-30 | 2023-06-06 | Hitachi High-Tech Corporation | Pattern matching device and computer program for pattern matching |
JP6511986B2 (en) | 2015-06-26 | 2019-05-15 | 富士通株式会社 | PROGRAM GENERATION DEVICE, PROGRAM GENERATION METHOD, AND GENERATION PROGRAM |
US10373379B2 (en) | 2015-08-20 | 2019-08-06 | Disney Enterprises, Inc. | Deformable-surface tracking based augmented reality image generation |
TWI578240B (en) * | 2015-12-01 | 2017-04-11 | 財團法人工業技術研究院 | Method for feature description and feature descriptor using the same |
JP2017211765A (en) * | 2016-05-24 | 2017-11-30 | アイシン精機株式会社 | Object recognition device |
US10417737B2 (en) * | 2017-06-21 | 2019-09-17 | International Business Machines Corporation | Machine learning model for automatic image registration quality assessment and correction |
US10424045B2 (en) * | 2017-06-21 | 2019-09-24 | International Business Machines Corporation | Machine learning model for automatic image registration quality assessment and correction |
KR102041310B1 (en) * | 2017-08-02 | 2019-11-07 | 세메스 주식회사 | Apparatus for treating a substrate and method for determining the state the pose of a substrate |
CN111417860B (en) * | 2017-11-27 | 2023-04-07 | 浜松光子学株式会社 | Analysis method, analysis device, analysis program, and storage medium storing analysis program |
JP6898211B2 (en) | 2017-11-27 | 2021-07-07 | 浜松ホトニクス株式会社 | A recording medium for recording an optical measurement method, an optical measurement device, an optical measurement program, and an optical measurement program. |
WO2019168310A1 (en) * | 2018-02-28 | 2019-09-06 | 서울대학교산학협력단 | Device for spatial normalization of medical image using deep learning and method therefor |
KR102219890B1 (en) * | 2018-02-28 | 2021-02-24 | 서울대학교산학협력단 | Apparatus for spatial normalization of medical image using deep learning and method thereof |
JP6844564B2 (en) * | 2018-03-14 | 2021-03-17 | オムロン株式会社 | Inspection system, identification system, and learning data generator |
TWI722562B (en) * | 2018-09-24 | 2021-03-21 | 荷蘭商Asml荷蘭公司 | Method for determining candidate patterns from set of patterns of a patterning process |
JP7395566B2 (en) * | 2019-04-02 | 2023-12-11 | 株式会社半導体エネルギー研究所 | Inspection method |
JP7298333B2 (en) * | 2019-06-25 | 2023-06-27 | オムロン株式会社 | Visual inspection management system, visual inspection management device, visual inspection management method and program |
US11244440B2 (en) * | 2019-08-30 | 2022-02-08 | Intel Corporation | Ranking of objects with noisy measurements |
CN110704559B (en) * | 2019-09-09 | 2021-04-16 | 武汉大学 | Multi-scale vector surface data matching method |
JP7449111B2 (en) * | 2020-02-18 | 2024-03-13 | キヤノン株式会社 | Inspection equipment, inspection method |
CN111695621B (en) * | 2020-06-09 | 2023-05-05 | 杭州印鸽科技有限公司 | Method for detecting matching of customized article and order based on deep learning |
CN114202578A (en) * | 2020-09-18 | 2022-03-18 | 长鑫存储技术有限公司 | Wafer alignment method and device |
JP7518741B2 (en) | 2020-11-27 | 2024-07-18 | 株式会社フジクラ | Inspection device, inspection method, and inspection program |
KR102510581B1 (en) * | 2022-09-06 | 2023-03-16 | 주식회사 포스로직 | Method for matching shape array and apparatus for using the method |
CN118314336B (en) * | 2024-06-11 | 2024-08-09 | 四川迪晟新达类脑智能技术有限公司 | Heterogeneous image target positioning method based on gradient direction |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05101186A (en) * | 1991-10-08 | 1993-04-23 | Sumitomo Cement Co Ltd | Optical pattern identifying method |
JPH09138785A (en) * | 1995-11-14 | 1997-05-27 | Mitsui Eng & Shipbuild Co Ltd | Pattern matching method and device |
JP2001014465A (en) * | 1999-06-29 | 2001-01-19 | Matsushita Electric Ind Co Ltd | Method and device for recognizing object |
JP2003076976A (en) * | 2001-08-31 | 2003-03-14 | Mitsui Eng & Shipbuild Co Ltd | Pattern matching method |
JP2006292615A (en) * | 2005-04-13 | 2006-10-26 | Sharp Corp | Visual examination apparatus, visual inspection method, program for making computer function as visual inspection apparatus, and recording medium |
JP2006293528A (en) * | 2005-04-07 | 2006-10-26 | Sharp Corp | Method and apparatus for selecting learning image, and method, apparatus, program and recording medium for creating image processing algorithm |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6647139B1 (en) | 1999-02-18 | 2003-11-11 | Matsushita Electric Industrial Co., Ltd. | Method of object recognition, apparatus of the same and recording medium therefor |
EP1049030A1 (en) * | 1999-04-28 | 2000-11-02 | SER Systeme AG Produkte und Anwendungen der Datenverarbeitung | Classification method and apparatus |
US6868175B1 (en) * | 1999-08-26 | 2005-03-15 | Nanogeometry Research | Pattern inspection apparatus, pattern inspection method, and recording medium |
JP4218171B2 (en) * | 2000-02-29 | 2009-02-04 | 株式会社日立製作所 | Scanning electron microscope, matching method, and computer-readable recording medium recording program |
JP4199939B2 (en) * | 2001-04-27 | 2008-12-24 | 株式会社日立製作所 | Semiconductor inspection system |
JP4901254B2 (en) * | 2006-03-22 | 2012-03-21 | 株式会社日立ハイテクノロジーズ | Pattern matching method and computer program for performing pattern matching |
US7525673B2 (en) * | 2006-07-10 | 2009-04-28 | Tokyo Electron Limited | Optimizing selected variables of an optical metrology system |
JP4814116B2 (en) * | 2007-01-29 | 2011-11-16 | 三菱重工業株式会社 | Mounting board appearance inspection method |
JP5460023B2 (en) * | 2008-10-16 | 2014-04-02 | 株式会社トプコン | Wafer pattern inspection method and apparatus |
US8194938B2 (en) | 2009-06-02 | 2012-06-05 | George Mason Intellectual Properties, Inc. | Face authentication using recognition-by-parts, boosting, and transduction |
JP5671928B2 (en) | 2010-10-12 | 2015-02-18 | ソニー株式会社 | Learning device, learning method, identification device, identification method, and program |
- 2012-07-27 JP JP2012167363A patent/JP5941782B2/en active Active
- 2013-06-06 TW TW102120127A patent/TWI579775B/en active
- 2013-07-16 US US14/417,425 patent/US9619727B2/en active Active
- 2013-07-16 WO PCT/JP2013/069296 patent/WO2014017337A1/en active Application Filing
- 2013-07-16 KR KR1020157002223A patent/KR101701069B1/en active IP Right Grant
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3021280A1 (en) * | 2014-11-07 | 2016-05-18 | JEOL Ltd. | Image evaluation method and charged particle beam device |
US9396905B2 (en) | 2014-11-07 | 2016-07-19 | Jeol Ltd. | Image evaluation method and charged particle beam device |
CN110969661A (en) * | 2018-09-30 | 2020-04-07 | 上海微电子装备(集团)股份有限公司 | Image processing device and method, position calibration system and method |
CN110969661B (en) * | 2018-09-30 | 2023-11-17 | 上海微电子装备(集团)股份有限公司 | Image processing device and method, and position calibration system and method |
CN109724988A (en) * | 2019-02-01 | 2019-05-07 | 佛山市南海区广工大数控装备协同创新研究院 | A kind of pcb board defect positioning method based on multi-template matching |
CN109724988B (en) * | 2019-02-01 | 2021-05-18 | 佛山市南海区广工大数控装备协同创新研究院 | PCB defect positioning method based on multi-template matching |
Also Published As
Publication number | Publication date |
---|---|
WO2014017337A8 (en) | 2014-05-22 |
KR20150036230A (en) | 2015-04-07 |
US20150199583A1 (en) | 2015-07-16 |
JP5941782B2 (en) | 2016-06-29 |
TWI579775B (en) | 2017-04-21 |
TW201415379A (en) | 2014-04-16 |
US9619727B2 (en) | 2017-04-11 |
JP2014026521A (en) | 2014-02-06 |
KR101701069B1 (en) | 2017-02-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5941782B2 (en) | Matching processing apparatus, matching processing method, and inspection apparatus using the same | |
US7889909B2 (en) | Pattern matching method and pattern matching program | |
US10318805B2 (en) | Pattern matching method and apparatus | |
US9141879B2 (en) | Pattern matching method, image processing device, and computer program | |
US20140016854A1 (en) | Pattern matching device and computer program | |
JP6872670B2 (en) | Dimension measuring device, dimensional measuring program and semiconductor manufacturing system | |
WO2018074110A1 (en) | Fingerprint processing device, fingerprint processing method, program, and fingerprint processing circuit | |
KR102435492B1 (en) | Image processing system and computer program for carrying out image process | |
CN111783770B (en) | Image correction method, device and computer readable storage medium | |
JP2006048322A (en) | Object image detecting device, face image detection program, and face image detection method | |
JP5651428B2 (en) | Pattern measuring method, pattern measuring apparatus, and program using the same | |
JP6713185B2 (en) | Inspection apparatus and inspection method using template matching | |
CN113168687A (en) | Image evaluation apparatus and method | |
CN110288040B (en) | Image similarity judging method and device based on topology verification | |
CN111898408B (en) | Quick face recognition method and device | |
JP7138137B2 (en) | INSPECTION DEVICE AND INSPECTION METHOD USING TEMPLATE MATCHING | |
US20230114432A1 (en) | Dimension measurement apparatus, semiconductor manufacturing apparatus, and semiconductor device manufacturing system | |
CN115546219B (en) | Detection plate type generation method, plate card defect detection method, device and product | |
JP5592414B2 (en) | Template evaluation apparatus, microscope apparatus, and program | |
JP5010627B2 (en) | Character recognition device and character recognition method | |
JP4775957B2 (en) | Face detection device | |
US20150178934A1 (en) | Information processing device, information processing method, and program | |
JP4525526B2 (en) | Pattern matching method and apparatus | |
US20230194253A1 (en) | Pattern Inspection/Measurement Device, and Pattern Inspection/Measurement Program | |
JP2013053986A (en) | Pattern inspection method and device thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13823104 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14417425 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 20157002223 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13823104 Country of ref document: EP Kind code of ref document: A1 |