WO2018180562A1 - Image processing system and computer program for performing image processing - Google Patents

Image processing system and computer program for performing image processing

Info

Publication number
WO2018180562A1
WO2018180562A1 PCT/JP2018/010307 JP2018010307W WO2018180562A1 WO 2018180562 A1 WO2018180562 A1 WO 2018180562A1 JP 2018010307 W JP2018010307 W JP 2018010307W WO 2018180562 A1 WO2018180562 A1 WO 2018180562A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
identified
identification
data
difference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2018/010307
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
篠田 伸一
康隆 豊田
崎村 茂寿
昌義 石川
新藤 博之
仁志 菅原
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi High Tech Corp
Original Assignee
Hitachi High Technologies Corp
Hitachi High Tech Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi High Technologies Corp, Hitachi High Tech Corp
Priority to CN202310270799.5A (publication CN116152507A)
Priority to CN201880014768.0A (publication CN110352431B)
Priority to KR1020197022591A (publication KR102336431B1)
Priority to US16/493,432 (publication US11176405B2)
Priority to KR1020217027367A (publication KR102435492B1)
Publication of WO2018180562A1
Priority to US17/503,438 (publication US11836906B2)
Legal status: Ceased

Classifications

    • G06V 10/30 Noise filtering (image preprocessing)
    • G06T 7/001 Industrial image inspection using an image reference approach
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06N 20/00 Machine learning
    • G06T 1/60 Memory management (general purpose image data processing)
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis
    • G06V 10/70 Image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/772 Determining representative reference patterns, e.g. averaging or distorting patterns; generating dictionaries
    • G06V 10/774 Generating sets of training patterns; bootstrap methods, e.g. bagging or boosting
    • G06V 10/98 Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; evaluation of the quality of the acquired patterns
    • G06T 2200/24 Indexing scheme involving graphical user interfaces [GUIs]
    • G06T 2207/10061 Microscopic image from scanning electron microscope
    • G06T 2207/20081 Training; learning
    • G06T 2207/20224 Image subtraction
    • G06T 2207/30148 Semiconductor; IC; wafer

Definitions

  • The present disclosure relates to an image processing system and a computer program that cause an image classifier, which performs image identification based on image information, to perform machine learning; more particularly, it relates to an image processing system and a computer program for training a classifier that identifies an image to be collated using a collation image.
  • As described in Non-patent Document 1, a deep learning device called a CNN (Convolutional Neural Network) has been developed and has attracted attention.
  • The CNN is a kind of machine learning device: a mechanism that determines what a target is by automatically extracting and learning image features. Because the system itself extracts the feature values that until now had to be chosen as important by hand, the key question becomes what kind of learning data to prepare.
  • Patent Documents 1 and 2 introduce techniques that make discrimination robust against noise by adding noise to the learning data before learning.
  • The discrimination performance of machine learning improves when a large amount of learning data is used, but preparing such data and training on it may take several weeks to several months.
  • Machine learning has also been introduced into classifiers (image processing devices) that identify, according to the purpose of inspection or measurement, images output by the scanning electron microscope (SEM) used for semiconductor inspection and measurement.
  • Such SEM images are subject to disturbances, for example those due to SEM-specific charging.
  • A true value is a correct discrimination result corresponding to an image.
  • Learning time is also required according to the scale of the learning data.
  • Learning with a large amount of learning data may hinder the operation of a production line and is therefore difficult. Accordingly, in what follows, the disturbance that causes failure is identified using images that actually failed identification and images that succeeded, learning data focused on that disturbance is created, and thus the amount of learning data is reduced and the learning time is shortened.
  • To achieve the above, an image processing system is proposed that includes: an arithmetic processing device that identifies an image using a collation image; a display device that displays the image; an input device for selecting a part of the image; a memory that stores the collation image data used for identifying the image; and a machine learning engine that machine-learns the collation image data required for image identification by the arithmetic processing device.
  • The machine learning engine searches the images stored in the memory that the arithmetic processing device identified successfully, using an image that the arithmetic processing device failed to identify, and generates corrected collation image data by adding, to the successfully identified image found by the search, information obtained based on a partial image of the failed image selected with the input device.
  • Also proposed is a computer-readable storage medium storing computer instructions executed by a processor, the computer instructions causing the processor to: identify an image to be collated using collation image data; search for collation image data that led to successful identification; and generate corrected collation image data by adding, to the collation image data that succeeded in the search, information obtained based on a partial selection of the image that failed to be identified.
  • Brief description of the drawings: examples of the image generation apparatus; the GUI screen of the image generation apparatus; the disturbance specifying unit; the contrast difference detection unit; the luminance value difference detection unit; the noise difference detection unit; the contrast difference determination unit; disturbance image generation; the contrast difference addition unit; the luminance value difference addition unit; the noise difference addition unit; the identification unit together with the image generation apparatus; and the disturbance specifying unit.
  • Described below are an image processing system that uses machine learning to update the collation image of a discriminator that identifies a collated image against a collation image, and a computer program that causes an arithmetic processing unit to execute the update. The system performs: a similar image search process that finds an image that succeeded in identification (hereinafter, a successful image) similar to an image that failed to be identified (hereinafter, a failed image); a disturbance specifying process that obtains difference information calculated by comparing image information of the failed image and of the successful image found by the similar image search unit; and a disturbance image generating process that creates an image based on the difference information calculated by the disturbance specifying process.
  • The embodiments described below relate to an image generation apparatus and an image generation method for reducing the amount of learning data and shortening the learning time in additional learning for semiconductor inspection that utilizes machine learning.
  • An example is shown in which learning image data is generated using image data that failed to be identified and image data that was successfully identified.
  • an apparatus and a measurement inspection system having a function of generating learning data in additional learning in semiconductor inspection utilizing machine learning will be described with reference to the drawings.
  • a charged particle beam apparatus is illustrated as an apparatus for forming an image, and an example using an SEM is described as one aspect thereof.
  • a focused ion beam (FIB) apparatus that scans a beam to form an image may be employed as the charged particle beam apparatus.
  • FIG. 21 is a schematic explanatory diagram of a measurement/inspection system in which a plurality of measurement or inspection devices are connected to a network.
  • The system mainly includes a CD-SEM 2401, which measures the pattern dimensions of semiconductor wafers, photomasks, and the like, and a defect inspection apparatus 2402, which irradiates a sample with an electron beam, acquires an image, and extracts defects by comparing the image with a pre-registered reference image; both are connected to the network.
  • Also connected to the network are a condition setting device 2403 for setting measurement positions and measurement conditions on the design data of the semiconductor device, a simulator 2404 for simulating pattern quality based on the design data of the semiconductor device and the manufacturing conditions of the semiconductor manufacturing equipment, and a storage medium 2405 storing design data in which the layout data and manufacturing conditions of semiconductor devices are registered.
  • the condition setting device 2403 is provided with a display device for displaying a GUI (Graphical User Interface) image as described later, and an input device for inputting necessary information.
  • the design data is expressed in, for example, the GDS format or the OASIS format, and is stored in a predetermined format.
  • the design data can be of any type as long as the software that displays the design data can display the format and can handle the data as graphic data.
  • The storage medium 2405 may be built into the control device of the measuring or inspection device, the condition setting device 2403, or the simulator 2404.
  • The CD-SEM 2401 and the defect inspection device 2402 each have their own control device, which performs the control that each device requires; these control devices may also be equipped with the simulator's functions and with functions for setting measurement conditions and the like.
  • In the SEM, an electron beam emitted from an electron source is focused by a plurality of lenses, and the focused electron beam is scanned one-dimensionally or two-dimensionally over a sample by a scanning deflector. Secondary electrons (SE) or backscattered electrons (BSE) emitted from the sample during the beam scan are detected by a detector and stored in a storage medium such as a frame memory in synchronization with the scanning of the deflector.
  • The image signals stored in the frame memory are integrated by an arithmetic device mounted in the control device. Scanning by the scanning deflector is possible for any size, position, and direction.
  • the above control and the like are performed by the control devices of each SEM, and images and signals obtained as a result of scanning with the electron beam are sent to the condition setting device 2403 via the communication line network.
  • Here, the control device that controls the SEM and the condition setting device 2403 are described as separate units; however, the present invention is not limited to this, and device control and measurement processing may be performed collectively by the condition setting device 2403, or SEM control and measurement processing may be performed together in each control device.
  • the condition setting device 2403 or the control device stores a program for executing a measurement process, and measurement or calculation is performed according to the program.
  • The condition setting device 2403 has a function of creating a program (recipe) for controlling the operation of the SEM based on semiconductor design data, and functions as a recipe setting unit. Specifically, it sets, on design data, pattern outline data, or simulated design data, the positions at which processing required by the SEM is performed, such as desired measurement points, autofocus, autostigma, and addressing points, and based on these settings it creates a program for automatically controlling the sample stage, deflector, and the like of the SEM.
  • FIG. 22 is a schematic configuration diagram of a scanning electron microscope.
  • An electron beam 2503 extracted from an electron source 2501 by an extraction electrode 2502 and accelerated by an acceleration electrode (not shown) is focused by a condenser lens 2504 which is a form of a focusing lens, and then is scanned on a sample 2509 by a scanning deflector 2505.
  • the electron beam 2503 is decelerated by a negative voltage applied to an electrode built in the sample stage 2508 and is focused by the lens action of the objective lens 2506 and irradiated onto the sample 2509.
  • Electrons 2510 such as secondary electrons and backscattered electrons are emitted from the irradiated portion.
  • the emitted electrons 2510 are accelerated in the direction of the electron source by the acceleration action based on the negative voltage applied to the sample, collide with the conversion electrode 2512, and generate secondary electrons 2511.
  • The secondary electrons 2511 emitted from the conversion electrode 2512 are captured by the detector 2513, and the output I of the detector 2513 changes with the amount of captured secondary electrons; the brightness of the display device changes accordingly.
  • An image of the scanning region is formed by synchronizing the deflection signal of the scanning deflector 2505 with the output I of the detector 2513.
  • the scanning electron microscope illustrated in FIG. 22 includes a deflector (not shown) that moves the scanning region of the electron beam.
  • In FIG. 22, an example is described in which electrons emitted from the sample are converted once by the conversion electrode and then detected, but the present invention is not limited to this configuration; for example, the detection surface of an electron multiplier tube or of a detector may be arranged on the trajectory of the accelerated electrons.
  • The control device 2514 controls each component of the scanning electron microscope, and has a function of forming an image based on the detected electrons as well as a function of measuring the width of a pattern formed on the sample based on the intensity distribution of the detected electrons, called a line profile.
  • The image generation apparatus includes a dedicated processor, or a general-purpose processor controlled by a program that generates the learning image data described later; in either case, the image generation device functions as a machine learning engine.
  • FIG. 1 is a diagram illustrating an example of an image generation apparatus that creates learning image data.
  • Learning image data 4 is created using image data 2 that failed to be identified (hereinafter, an identification failure image) and image data 3 that was successfully identified (hereinafter, an identification success image).
  • the similar image search unit 11 uses the identification failure image data 2 to search for image data in the identification success image data 3 similar to the identification failure image.
  • The disturbance specifying unit 12 compares the identification failure image data 2 with the identification success image data found by the similar image search unit 11, and determines which disturbance differs greatly between them, such as a decrease in contrast due to charging, uneven brightness, or noise. A disturbance with a large difference is identified by comparing contrast, luminance change, and noise between the identification failure image and the identification success image. The disturbance image generation unit 13 then adds the disturbance difference specified by the disturbance specifying unit 12 to the identification success image data, generates an image, and stores it as learning image data 4. Because the image was identified successfully, its true value was already created at identification time, and no new true-value creation work is needed.
  • In this way, the disturbance that occurred in the identification failure image is reflected in the identification success image, which is then learned by the CNN as learning image data, so that images like the one that failed can be identified after learning.
  • If disturbances were instead added indiscriminately, learning would also cover disturbances that do not actually occur, so the learning data would contain redundancy.
  • Here, the disturbance that actually occurred is specified by comparison with a successful image, and learning image data focused on that disturbance is created; the amount of learning data can therefore be reduced and the learning time shortened.
  • the identification failure image data is created by designating an image that is determined to be unsuccessful by visually confirming the identification result image.
  • the GUI screen is displayed on a display device of the condition setting device 2403, and necessary information is input by an input device such as a pointing device or a keyboard.
  • the identification result is displayed for each closed figure pattern.
  • In this example, the identification target is a rectangular pattern with rounded corners; if its contour line can be extracted and identified, the matching score with the collation image becomes high and the image is treated as an identification success.
  • An identification failure image is, for example, a region where a contour is extracted at a location where no contour line should appear (such as the thick rectangle a), a region where a contour line fails to appear (such as b), or a region combining both of these cases (such as c); in such regions the matching score with the collation image is lowered and identification fails.
  • The discriminator described later performs machine learning based on images that failed identification. It is therefore desirable to program the system to register as failure images those collated images whose score is insufficient for success but still reaches a certain level. For example, discrimination scores can be set so that a score Ths for success in matching > a range of scores Thf1 to Thf2 to be captured as failure images > a score Thof that should clearly be treated as failure.
  • A collated image whose score falls within the Thf1 to Thf2 range may then be registered as image data for machine learning and stored in a predetermined storage medium.
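  • As an illustration of the score bands just described, the following minimal Python sketch triages a matching score; the function name and the numeric values of Ths, Thf1, Thf2, and Thof are hypothetical, chosen only to satisfy the ordering Ths > Thf2 >= Thf1 > Thof.

      def triage_score(score, ths=0.80, thf2=0.75, thf1=0.40, thof=0.20):
          """Sort a matching score into success / learning candidate / clear failure."""
          if score >= ths:
              return "success"               # registered as an identification success image
          if thf1 <= score <= thf2:
              return "failure_for_learning"  # stored as machine-learning image data
          if score <= thof:
              return "clear_failure"         # clearly failed, not worth registering
          return "unclassified"              # falls between the configured bands

      # Example: a score of 0.6 lies in the Thf1-Thf2 band and is kept for learning.
      assert triage_score(0.6) == "failure_for_learning"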
  • The user designates, as rectangles, only the image areas that failed identification, so that all other areas not designated by a rectangle can be treated as successful image areas (identification success images).
  • Rectangular frames a, b, and c are displayed on the displayed identification result as the image areas that failed identification (identification failure images), and can be set while being checked visually.
  • By contrast, when true values are created for contour extraction/identification with machine learning, the true contour must be given while taking the white-band direction into account and checking peak positions, which is very time consuming.
  • The similar image search unit 11 searches for identification success image data similar to the identification failure image data. One approach is to search for an image with a high degree of similarity by matching processing using the normalized correlation of the images, as sketched below. Alternatively, as shown in FIG. 3, the design data 5 corresponding to the identification failure image can be used: by finding the design data corresponding to similar successful identifications, identification success image data similar to the identification failure image data can be retrieved. In addition, as shown in FIG. 2, since one image often contains a plurality of identification targets, a successful identification image similar to the unsuccessful one may also be searched for within a single identification result image.
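  • A minimal sketch of the normalized-correlation search mentioned above, assuming equal-size grayscale patches held as NumPy arrays (the function names are illustrative, not from the patent):

      import numpy as np

      def ncc(a, b):
          """Zero-mean normalized cross-correlation of two equal-size patches."""
          a = a.astype(np.float64) - a.mean()
          b = b.astype(np.float64) - b.mean()
          denom = np.sqrt((a * a).sum() * (b * b).sum())
          return float((a * b).sum() / denom) if denom else 0.0

      def find_most_similar(fail_img, success_imgs):
          """Return the identification success image most similar to the failure image."""
          scores = [ncc(fail_img, s) for s in success_imgs]
          best = int(np.argmax(scores))
          return success_imgs[best], scores[best]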
  • FIG. 23 is a diagram illustrating an example of learning image data including a success image and a partial failure image selected on the GUI screen.
  • The learning image data is stored in the memory by associating the success image data with the designated partial image data of the failure image.
  • Alternatively, a synthesized image obtained by combining the success image and the failed partial image may be stored, and collation processing or image identification may be performed using this synthesized image (corrected collation image); that is, the collation image may be generated by synthesizing the success image and the failed partial image.
  • FIG. 4 is a diagram illustrating an example of the disturbance specifying unit 12. The contrast is calculated from each of the identification failure image 2 and the identification success image 3, and the difference is obtained by the contrast difference detection 121.
  • the contrast difference determination unit 124 determines whether or not the contrast is largely different between the identification failure image 2 and the identification success image 3.
  • the luminance value difference detection unit 122 obtains the difference between the luminance values of the identification failure image 2 and the identification success image 3.
  • the luminance value difference determination unit 125 determines whether or not the luminance values are largely different between the identification failure image 2 and the identification success image 3.
  • the noise difference detection unit 123 obtains a difference in noise amount between the identification failure image 2 and the identification success image 3.
  • the noise difference determination unit 126 determines whether or not the noise is largely different between the identification failure image 2 and the identification success image 3.
  • FIG. 5 is a diagram for explaining an example of the contrast difference detection 121.
  • The smoothing unit 1211 smoothes the identification failure image 2 using a smoothing filter or the like to remove steep, noise-like components in the luminance values. Thereafter, the maximum value detection unit 1213 and the minimum value detection unit 1214 obtain the maximum value Lmax and the minimum value Lmin of the image's luminance values.
  • the contrast calculation unit calculates (Lmax ⁇ Lmin) / (Lmax + Lmin) using the maximum value Lmax and the minimum value Lmin. Similarly, the contrast calculation unit 1218 calculates the contrast for the successful identification image 3, and the contrast difference calculation unit 1219 obtains the difference between the contrasts.
  • the method of obtaining the contrast is not limited to this example, and the difference between the maximum value and the minimum value in the identification failure image 2 and the identification success image 3 may be used as the contrast. In any case, the contrast is obtained using the difference between the maximum value and the minimum value.
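  • The contrast computation described above can be sketched as follows; the use of a median filter for the smoothing step is an assumption, since the text says only "a smoothing filter or the like":

      import numpy as np
      from scipy.ndimage import median_filter

      def image_contrast(img, smooth_size=3):
          """Suppress spiky noise, then compute (Lmax - Lmin) / (Lmax + Lmin)."""
          s = median_filter(img.astype(np.float64), size=smooth_size)
          lmax, lmin = s.max(), s.min()
          return (lmax - lmin) / (lmax + lmin) if (lmax + lmin) else 0.0

      def contrast_difference(fail_img, success_img):
          """Difference of the two contrasts, as in the contrast difference calculation unit 1219."""
          return image_contrast(fail_img) - image_contrast(success_img)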
  • FIG. 6 is a diagram for explaining an example of the luminance value difference detection 122.
  • The smoothing units 1221 and 1222 sufficiently smooth the identification failure image 2 and the identification success image 3 using a smoothing filter or the like, so that only the gradual luminance variation of each image remains. Thereafter, the luminance value difference calculation unit 1223 calculates the difference between the luminance values of the two images.
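  • A minimal sketch of this luminance-unevenness detection, using a strong Gaussian blur as the "sufficient" smoothing (the filter type and sigma are assumptions):

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def luminance_difference_map(fail_img, success_img, sigma=15.0):
          """Keep only the gradual shading of each image, then take the per-pixel difference."""
          lf = gaussian_filter(fail_img.astype(np.float64), sigma)
          ls = gaussian_filter(success_img.astype(np.float64), sigma)
          return lf - ls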
  • FIG. 7 is a diagram for explaining an example of the noise difference detection 123.
  • The smoothing units 1231 and 1232 smooth the identification failure image 2 and the identification success image 3 using a smoothing filter or the like to remove noise components.
  • The noise calculation units 1233 and 1234 then obtain, for each of the identification failure image 2 and the identification success image 3, the luminance difference between the original image and the image that has passed through the smoothing unit, and calculate the variation in this difference value as the noise.
  • The noise difference calculation unit 1235 calculates the difference between the noise values obtained for the two images.
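  • The residual-based noise measure described above might be sketched like this, taking the standard deviation of the image-minus-smoothed difference as the "variation in the difference value" (filter type and sigma are again assumptions):

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def estimate_noise(img, sigma=2.0):
          """Noise = spread of the residual left after subtracting a smoothed copy."""
          img = img.astype(np.float64)
          residual = img - gaussian_filter(img, sigma)
          return float(residual.std())

      def noise_difference(fail_img, success_img):
          """Difference of the two noise levels, as in the noise difference calculation unit 1235."""
          return estimate_noise(fail_img) - estimate_noise(success_img)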
  • FIG. 8 is a diagram illustrating an example of the contrast difference determination unit 124.
  • a threshold value 1251 for determining whether or not the contrast difference is large is set in advance, and the comparison determination unit 1252 compares the contrast difference value output from the contrast difference detection 121 with the threshold value.
  • If the contrast difference value is greater than the threshold value 1251, a value of "1" is output; otherwise, "0" is output.
  • Similarly, the luminance value difference determination unit 125 and the noise difference determination unit 126 hold predetermined thresholds and compare the detected differences against them; when a difference is larger than its threshold, the difference value between the luminance values or between the noise levels is output, and otherwise "0" is output. In the latter case, it is conceivable to reset the corresponding subsequent image generation calculation and to output a signal that does not permit storage into the learning image data 4. If none of the differences in contrast, luminance value, and noise is large, the user is notified of this.
  • FIG. 9 is a diagram illustrating an example of the disturbance image generation unit 13. If the disturbance specifying unit 12 determines that the contrast difference is large, an image is generated by adding the contrast difference to the identification success image 3 in the contrast difference adding unit 131. If the disturbance specifying unit 12 determines that the difference in luminance value is large, an image is generated by adding the luminance value difference to the identification success image 3 in the luminance difference adding unit 132. If the disturbance specifying unit 12 determines that the noise difference is large, an image is generated by adding the noise difference to the identification success image 3 in the noise adding unit 133.
  • In the contrast difference adding unit 131, the subtractor 1312 obtains (maximum value − minimum value) of the identification failure image 2, and the subtractor 1313 obtains (maximum value − minimum value) of the identification success image 3.
  • The divider 1314 computes the ratio (maximum value − minimum value of the identification failure image 2) / (maximum value − minimum value of the identification success image 3).
  • The subtractor 1311 subtracts the minimum value of the identification success image from the identification success image itself, and the multiplier 1315 multiplies the result by the ratio output from the divider 1314, so that the contrast of the success image is matched to that of the failure image.
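  • In code, this subtractor/divider/multiplier chain amounts to rescaling the luminance range of the success image to that of the failure image; a minimal sketch under that reading:

      import numpy as np

      def match_contrast(success_img, fail_img):
          """out = (success - min_s) * (max_f - min_f) / (max_s - min_s)."""
          s = success_img.astype(np.float64)
          f = fail_img.astype(np.float64)
          range_s = s.max() - s.min()
          range_f = f.max() - f.min()
          if range_s == 0:
              return s  # flat image: nothing to rescale
          return (s - s.min()) * (range_f / range_s)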
  • the luminance value difference adding unit 132 adds the luminance value difference obtained by the disturbance specifying unit 12 to the identification success image 3 by the adder 1321 to generate an image. For example, luminance unevenness such that the background becomes black due to charging generated in the failed image can be added to the successful image.
  • FIG. 12 is a diagram for explaining an example of the noise difference adding unit 133.
  • The noise difference adding unit 133 adds, to the identification success image via the adder 1332, noise created by the noise adjustment unit 1331 based on the noise 123a of the identification failure image 2 and the noise 123b of the identification success image 3 obtained by the disturbance specifying unit 12, thereby generating an image.
  • The noise is added so as to account for the noise already contained in the identification success image 3. For example, when the noise 123a of the identification failure image 2 is modeled as normally distributed, only noise exceeding the level of the noise 123b of the identification success image 3 is added; alternatively, both noises are modeled as normal distributions and the difference of their magnitudes is added as noise. When the noise of the identification success image 3 is larger than that of the identification failure image 2, the added noise may be set to 0.
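  • A minimal sketch of noise addition that accounts for the noise already present in the success image; modeling both noises as zero-mean Gaussians and adding only the variance shortfall is one possible reading of the description, not the patent's prescribed formula:

      import numpy as np

      def add_noise_difference(success_img, sigma_fail, sigma_success, seed=0):
          """Add N(0, sigma_add) noise where sigma_add^2 = max(sigma_fail^2 - sigma_success^2, 0)."""
          extra_var = max(sigma_fail ** 2 - sigma_success ** 2, 0.0)
          out = success_img.astype(np.float64)
          if extra_var == 0.0:
              return out  # success image is already at least as noisy: add nothing
          rng = np.random.default_rng(seed)
          return out + rng.normal(0.0, np.sqrt(extra_var), size=out.shape)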
  • FIG. 14 is a diagram illustrating an example of an image generation apparatus including an identification unit.
  • The already-trained identification unit 9 identifies the images of the identification image data 10 and stores the identification results in the identification result image data 7.
  • The identification unit 9 is an arithmetic processing device that executes image processing for identifying an image using a collation image stored in advance in a memory. More specifically, it judges the similarity between the collated image and the collation image and retrieves the search target image according to the score; for example, an image with a score equal to or higher than a predetermined value is identified as a search target image. An image identified as a search target is stored in a predetermined storage medium as an identification success image, while an image whose score is below the predetermined value is stored in a predetermined storage medium as an identification failure image for subsequent machine learning.
  • The image data of the identification result image data 7 is displayed on the GUI by the failure area image instruction unit 8, as in the example described above.
  • The image area that the user designates via the failure area image instruction unit 8 as an identification failure area in the identification result image displayed on the GUI is stored in the identification failure image data 2.
  • the image generation unit 1 generates learning image data using the identification failure image data and the identification success image data.
  • the similar image / design data search unit 14 of the image generation apparatus 1 searches for a success image similar to the identification failure image data 2.
  • A success image highly similar to the identification failure image is searched for using design data or image data. The disturbance specifying unit 12 then compares the retrieved identification success image data with the identification failure image data to specify the disturbances (contrast, luminance change, noise) with a large difference. The disturbance image generation unit 13 then generates an image in which the specified disturbance is reflected in the identification success image 3, and stores it in the learning image data 4.
  • the discriminator 9 performs learning using the image data stored in the learning image data 4.
  • The disturbance specifying unit 12 also needs to have a function of detecting a luminance inversion disturbance.
  • the luminance inversion detection unit 127 detects luminance inversion using the images of the identification failure image 2 and the identification success image 3, and the luminance inversion determination unit 128 determines whether or not luminance inversion has occurred.
  • the image correlation calculation unit 1271 calculates the image correlation between the identification failure image 2 and the identification success image 3. If there is a reversal of luminance, the negative correlation becomes large.
  • The luminance inversion determination unit 128 compares this correlation with a preset threshold and determines that inversion has occurred when the negative correlation is larger than the threshold.
  • When the luminance inversion determination unit 128 determines that luminance inversion has occurred, an image is generated in which the luminance values of the identification success image are inverted.
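  • A minimal sketch of inversion detection via the sign of the normalized correlation; the threshold of -0.5 is a hypothetical choice:

      import numpy as np

      def luminance_inverted(fail_img, success_img, threshold=-0.5):
          """True when the two images are strongly anti-correlated (contrast inversion)."""
          a = fail_img.astype(np.float64) - fail_img.mean()
          b = success_img.astype(np.float64) - success_img.mean()
          denom = np.sqrt((a * a).sum() * (b * b).sum())
          corr = (a * b).sum() / denom if denom else 0.0
          return corr < threshold

      def invert_luminance(img_u8):
          """Invert the luminance values of an 8-bit image."""
          return 255 - np.asarray(img_u8, dtype=np.uint8)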
  • FIG. 18 shows an example of the image generation processing. In the image generation process S10, an image that failed identification is first selected in the identification failure image selection process S11: as described with reference to FIG. 2, an image region where identification failed is selected from the identification result image as the identification failure image.
  • Next, an identification success image similar to the identification failure image selected in the identification failure image selection process S11 is searched for among the image data that was successfully identified.
  • In the disturbance specifying process S13, disturbances such as contrast, luminance change, and noise are compared between the identification failure image and the identification success image, and the disturbances that differ greatly between the two are specified.
  • In the disturbance addition process S14, an image in which the disturbance specified in the disturbance specifying process S13 is reflected in the identification success image is generated, as in the end-to-end sketch below.
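  • Putting these steps together, the following end-to-end sketch reuses the helper functions from the earlier sketches; the thresholds, the mean-based test for luminance unevenness, and the exact step mapping are all assumptions made for illustration:

      import numpy as np

      def generate_learning_image(fail_img, success_imgs,
                                  c_th=0.10, l_th=10.0, n_th=2.0):
          """S10 sketch: search, specify large disturbances, reflect them in the success image."""
          # S11: the failure image is given; similar image search: find the closest success image
          success, _ = find_most_similar(fail_img, success_imgs)
          out = success.astype(np.float64)
          # S13/S14: contrast
          if abs(contrast_difference(fail_img, success)) > c_th:
              out = match_contrast(out, fail_img)
          # S13/S14: luminance unevenness
          lum = luminance_difference_map(fail_img, success)
          if float(np.abs(lum).mean()) > l_th:
              out = out + lum
          # S13/S14: noise
          sf, ss = estimate_noise(fail_img), estimate_noise(success)
          if sf - ss > n_th:
              out = add_noise_difference(out, sf, ss)
          return out  # stored as learning image data; the true value is reused from the success image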
  • FIG. 19 shows an example of the disturbance identification process.
  • In the disturbance specifying process S20, contrast is first calculated from the identification failure image and the identification success image in the contrast difference detection / determination process S21, the difference between them is obtained, and the difference value is compared with a specific threshold value. When the difference value is larger than the threshold, it is determined that the contrast disturbance differs greatly between the identification failure image and the identification success image.
  • In the luminance value difference detection / determination process S22, the difference in luminance value between the identification failure image and the identification success image is obtained, and the difference value is compared with a specific threshold value. When the difference value is larger than the threshold, it is determined that the disturbance of luminance unevenness differs greatly between the identification failure image and the identification success image.
  • In the noise difference detection / determination process S23, the difference between the noise levels of the identification failure image and the identification success image is obtained, and the difference value is compared with a specific threshold value; when it is larger, it is determined that the noise disturbance differs greatly.
  • the order is the contrast difference detection / determination process S21, the luminance value difference detection / determination process S22, and the noise difference detection / determination process S23, but they may be in any order.
  • FIG. 20 shows an example of the disturbance addition process. In the disturbance addition process S30, a contrast difference addition process S31, a luminance value difference addition process S32, and a noise difference addition process S33 are performed.
  • In the contrast difference addition process S31, when it is determined in the contrast difference detection / determination process S21 that the contrast disturbance differs greatly between the identification failure image and the identification success image, the contrast of the identification success image is matched to the contrast of the identification failure image; the specific contents are the same as those described above for the contrast difference detection and addition units.
  • In the luminance value difference addition process S32, when it is determined in the luminance value difference detection / determination process S22 that the disturbance of luminance unevenness differs greatly between the identification failure image and the identification success image, the identification failure image and the identification success image are each strongly smoothed to obtain a difference image between them, and an image obtained by adding the difference image to the identification success image is created.
  • In the noise difference addition process S33, when it is determined in the noise difference detection / determination process S23 that the noise disturbance differs greatly between the identification failure image and the identification success image, an image is created by adding the noise occurring in the identification failure image to the identification success image; this is the same as described with reference to FIG. 7 and FIG. 12.
  • the order is the contrast difference addition process S31, the luminance value difference addition process S32, and the noise difference addition process S33, but they may be in any order.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Quality & Reliability (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Image Analysis (AREA)
PCT/JP2018/010307 2017-03-27 2018-03-15 Image processing system and computer program for performing image processing Ceased WO2018180562A1 (ja)

Priority Applications (6)

Application Number Priority Date Filing Date Title
CN202310270799.5A CN116152507A (zh) 2017-03-27 2018-03-15 Image processing system and image processing method
CN201880014768.0A CN110352431B (zh) 2017-03-27 2018-03-15 Image processing system, computer-readable storage medium, and system
KR1020197022591A KR102336431B1 (ko) 2017-03-27 2018-03-15 Image processing system and computer program for performing image processing
US16/493,432 US11176405B2 (en) 2017-03-27 2018-03-15 Image processing system and computer program for performing image processing
KR1020217027367A KR102435492B1 (ko) 2017-03-27 2018-03-15 Image processing system and computer program for performing image processing
US17/503,438 US11836906B2 (en) 2017-03-27 2021-10-18 Image processing system and computer program for performing image processing

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-060351 2017-03-27
JP2017060351A JP6731370B2 (ja) 2017-03-27 Image processing system and computer program for performing image processing

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US16/493,432 A-371-Of-International US11176405B2 (en) 2017-03-27 2018-03-15 Image processing system and computer program for performing image processing
US17/503,438 Continuation US11836906B2 (en) 2017-03-27 2021-10-18 Image processing system and computer program for performing image processing

Publications (1)

Publication Number Publication Date
WO2018180562A1 (ja) 2018-10-04

Family

ID=63675752

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/010307 Ceased WO2018180562A1 (ja) 2017-03-27 2018-03-15 画像処理システム及び画像処理を行うためのコンピュータープログラム

Country Status (6)

Country Link
US (2) US11176405B2 (en)
JP (1) JP6731370B2 (ja)
KR (2) KR102336431B1 (ko)
CN (2) CN116152507A (zh)
TW (2) TWI697849B (zh)
WO (1) WO2018180562A1 (ja)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020091002A1 (ja) * 2018-10-31 2020-05-07 住友建機株式会社 Shovel and shovel assist system

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11675053B2 (en) 2018-06-15 2023-06-13 Innovusion, Inc. LiDAR systems and methods for focusing on ranges of interest
US12361779B2 (en) 2019-05-18 2025-07-15 Looplearn Pty Ltd Localised, loop-based self-learning for recognising individuals at locations
JP2021117548A (ja) * 2020-01-22 2021-08-10 富士通株式会社 Image processing device, image processing method, and image processing program
JP7288870B2 (ja) * 2020-02-05 2023-06-08 株式会社日立製作所 System for generating images
US12131103B2 (en) * 2020-03-30 2024-10-29 Kla Corporation Semiconductor fabrication process parameter determination using a generative adversarial network
JP7298016B2 (ja) * 2020-03-30 2023-06-26 株式会社日立ハイテク Diagnostic system
CN111832433B (zh) * 2020-06-24 2023-12-29 奇点微(上海)光电科技有限公司 Device for extracting object characteristics from an image and method of operating the same
JP7351812B2 (ja) * 2020-07-27 2023-09-27 株式会社日立製作所 Charged particle beam device and charging evaluation method
US20220101114A1 (en) * 2020-09-27 2022-03-31 Kla Corporation Interpretable deep learning-based defect detection and classification
CN113505700A (zh) * 2021-07-12 2021-10-15 北京字跳网络技术有限公司 Image processing method, apparatus, device, and storage medium
JP7743224B2 (ja) * 2021-07-30 2025-09-24 株式会社Screenホールディングス Image processing method and classification model construction method
JPWO2024111303A1 (ja) * 2022-11-22 2024-05-30
US20240273232A1 (en) * 2023-02-15 2024-08-15 BeeKeeperAI, Inc. Systems and methods for measuring data exfiltration vulnerability and dynamic differential privacy in a zero-trust computing environment
JP2025040724A (ja) * 2023-09-12 2025-03-25 日立Astemo株式会社 AI model learning system and AI model learning method

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0554195A 1991-08-29 1993-03-05 Matsushita Electric Ind Co Ltd Character recognition device
JPH0650738A 1992-07-31 1994-02-25 Toyota Motor Corp Image contour line detection device
JPH06333008A 1993-05-19 1994-12-02 Fujitsu General Ltd Input device for designating image contours
JP3434976B2 (ja) * 1996-06-28 2003-08-11 三菱電機株式会社 Image processing device
JP3449392B2 (ja) 1996-08-27 2003-09-22 日本電信電話株式会社 Discriminant function learning method
JP2001148012A (ja) * 1999-11-19 2001-05-29 Minolta Co Ltd Corresponding point search method and device
JP2005149455A (ja) * 2003-10-21 2005-06-09 Sharp Corp Image collation device, image collation method, image collation program, and computer-readable recording medium recording the program
JP5010207B2 (ja) * 2006-08-14 2012-08-29 株式会社日立ハイテクノロジーズ Pattern inspection apparatus and semiconductor inspection system
US8655939B2 (en) * 2007-01-05 2014-02-18 Digital Doors, Inc. Electromagnetic pulse (EMP) hardened information infrastructure with extractor, cloud dispersal, secure storage, content analysis and classification and method therefor
JP2008176517A (ja) * 2007-01-18 2008-07-31 Juki Corp Object recognition method and device
US20110106734A1 (en) 2009-04-24 2011-05-05 Terrance Boult System and apparatus for failure prediction and fusion in classification and recognition
KR101034117B1 (ko) * 2009-11-13 2011-05-13 성균관대학교산학협력단 Method and apparatus for object recognition using region-of-interest designation and contour images
US9152861B2 (en) * 2011-03-04 2015-10-06 Nec Corporation Individual product identification system, individual product identification method, and device and program used by same
JP2013084074A (ja) * 2011-10-07 2013-05-09 Sony Corp Information processing device, information processing server, information processing method, information extraction method, and program
WO2013099367A1 (ja) * 2011-12-27 2013-07-04 Necソフト株式会社 Image recognition device, image recognition method, corrector, program, and recording medium
US9053595B2 (en) * 2012-02-02 2015-06-09 Jared Grove Coin identification system and method using image processing
JP5808371B2 (ja) * 2013-08-28 2015-11-10 ヤフー株式会社 Image recognition device, image recognition method, and image recognition program
JP2015176272A (ja) * 2014-03-14 2015-10-05 オムロン株式会社 Image processing device, image processing method, and image processing program
US9518934B2 (en) 2014-11-04 2016-12-13 Kla-Tencor Corp. Wafer defect discovery
JP2016099668A (ja) 2014-11-18 2016-05-30 キヤノン株式会社 Learning method, learning device, image recognition method, image recognition device, and program
CN105844202A (zh) * 2015-01-12 2016-08-10 芋头科技(杭州)有限公司 Image recognition system and method
CN104598910A (zh) * 2015-01-16 2015-05-06 科大讯飞股份有限公司 Smart TV station logo recognition method and system based on a gradient-direction matching algorithm
JP6904249B2 (ja) 2015-03-19 2021-07-14 日本電気株式会社 Object detection device, object detection method, and program
CN104809442B (zh) * 2015-05-04 2017-11-17 北京信息科技大学 Intelligent recognition method for Dongba pictograph graphemes
US10477095B2 (en) * 2015-05-14 2019-11-12 Sri International Selecting optimal image from mobile device captures
US10659766B2 (en) * 2015-10-30 2020-05-19 Canon Kabushiki Kaisha Confidence generation apparatus, confidence generation method, and imaging apparatus
CN105469097A (zh) * 2015-11-18 2016-04-06 江苏省电力公司检修分公司 Neural-network-based substation feature extraction method
CN106446950B (zh) * 2016-09-27 2020-04-10 腾讯科技(深圳)有限公司 Image processing method and device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007052575A (ja) * 2005-08-17 2007-03-01 Konica Minolta Holdings Inc Metadata assignment device and metadata assignment method
JP2011214903A (ja) * 2010-03-31 2011-10-27 Denso It Laboratory Inc Appearance inspection device, generation device for an appearance inspection discriminator, generation method for an appearance inspection discriminator, and computer program for generating an appearance inspection discriminator

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020091002A1 (ja) * 2018-10-31 2020-05-07 住友建機株式会社 Shovel and shovel assist system
JPWO2020091002A1 (ja) * 2018-10-31 2021-09-24 住友建機株式会社 Shovel and shovel assist system
JP7472034B2 (ja) 2018-10-31 2024-04-22 住友建機株式会社 Shovel and shovel assist system
US12320096B2 (en) 2018-10-31 2025-06-03 Sumitomo Construction Machinery Co., Ltd. Shovel and shovel assist system

Also Published As

Publication number Publication date
TW201945982A (zh) 2019-12-01
US20220036116A1 (en) 2022-02-03
CN116152507A (zh) 2023-05-23
TWI697849B (zh) 2020-07-01
US11836906B2 (en) 2023-12-05
KR102435492B1 (ko) 2022-08-24
KR102336431B1 (ko) 2021-12-08
KR20210111335A (ko) 2021-09-10
CN110352431B (zh) 2023-07-18
TW201835815A (zh) 2018-10-01
JP6731370B2 (ja) 2020-07-29
JP2018163524A (ja) 2018-10-18
KR20190103283A (ko) 2019-09-04
CN110352431A (zh) 2019-10-18
US20200134355A1 (en) 2020-04-30
US11176405B2 (en) 2021-11-16

Similar Documents

Publication Publication Date Title
JP6731370B2 (ja) Image processing system and computer program for performing image processing
JP7144244B2 (ja) Pattern inspection system
CN113168687B (zh) Image evaluation apparatus and method
JP5639797B2 (ja) Pattern matching method, image processing apparatus, and computer program
JP5568277B2 (ja) Pattern matching method and pattern matching apparatus
JP5948138B2 (ja) Defect analysis support device, program executed by the defect analysis support device, and defect analysis system
JP6043735B2 (ja) Image evaluation device and pattern shape evaluation device
JP5313939B2 (ja) Pattern inspection method, pattern inspection program, and electronic device inspection system
JP4982544B2 (ja) Composite image forming method and image forming apparatus
JP2006066478A (ja) Pattern matching device and scanning electron microscope using the same
US20220012404A1 (en) Image matching method and arithmetic system for performing image matching process
JP7438311B2 (ja) Image processing system and image processing method
US20230071668A1 (en) Pattern Matching Device, Pattern Measurement System, and Non-Transitory Computer-Readable Medium
JP2010182423A (ja) Charged particle beam focus evaluation method and charged particle beam device
WO2025141839A1 (ja) Computer system, charged particle beam device, and learning model training method
JP5396496B2 (ja) Composite image forming method and image forming apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18774438

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20197022591

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18774438

Country of ref document: EP

Kind code of ref document: A1