WO2022172469A1 - Image inspection device, image inspection method, and trained model generation device - Google Patents


Info

Publication number
WO2022172469A1
Authority
WO
WIPO (PCT)
Application number
PCT/JP2021/009510
Other languages
English (en)
Japanese (ja)
Inventor
Yasuyuki Ikeda
Original Assignee
OMRON Corporation
Priority date
Filing date
Publication date
Application filed by OMRON Corporation
Publication of WO2022172469A1 publication Critical patent/WO2022172469A1/fr

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84: Systems specially adapted for particular applications
    • G01N 21/88: Investigating the presence of flaws or contamination
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis

Definitions

  • The present invention relates to an image inspection device, an image inspection method, and a trained model generation device.
  • Japanese Patent Laid-Open No. 2002-200000 (Patent Document 1) discloses an abnormality determination apparatus that performs abnormality determination based on input image data to be determined. The apparatus has processing means that generates reconstructed image data from the feature amount of the determination target image data using reconstruction parameters, and executes an abnormality determination process based on difference information between the generated reconstructed image data and the determination target image data.
  • When the image data to be determined includes image data of a plurality of channels, the abnormality determination device of Patent Document 1 generates reconstructed image data for each channel from the feature amount of the image data of that channel using the reconstruction parameters, and performs abnormality determination based on difference information between each piece of generated reconstructed image data and the image data of the corresponding channel of the determination target image data.
  • There is also a method in which a non-defective product image of a non-defective object is divided into small pieces and input to a model, and the model is trained to output the feature amounts of the non-defective product image; defects in an inspection image are then detected based on the feature vector that the trained model extracts from the inspection image in the feature space and the feature vectors of the non-defective product image.
  • In such a method, when a non-defective product image contains a special pattern, the feature vector of the non-defective product image including this special pattern is plotted at a position distant from the feature vectors of the other non-defective product images in the feature space. Conversely, for a pattern that is non-defective at a certain location but defective at another location, the feature vector of an inspection image that includes such a pattern at the other location is plotted near the feature vectors of the non-defective product images, and the defective product may be overlooked.
  • The present invention has been made in view of such circumstances, and one of its objects is to provide an image inspection device, an image inspection method, and a trained model generation device that can plot the feature vector of an inspection image containing a special pattern close to the feature vectors of non-defective product images, while plotting the feature vector of an inspection image of a defective product at a distance.
  • An image inspection apparatus according to one aspect of the present invention includes: an extraction unit that inputs inspection divided images, which are divided images of an object to be inspected, into each of a plurality of trained models trained to receive as input one or more non-defective divided images, which are divided images of a non-defective inspection object, and to output feature amounts, and that extracts a feature vector whose components are the feature amounts of each inspection divided image; an acquisition unit that acquires, for each feature vector, a defect degree indicating the degree of defect of the inspection divided image corresponding to that feature vector, based on the point indicated by the feature vector and the set formed by the plurality of feature vectors of the plurality of non-defective divided images in the feature space represented by the feature vectors; and an inspection unit that inspects the inspection object based on the plurality of defect degrees.
  • According to this aspect, the inspection divided images are input to each of the plurality of trained models, which have been trained to receive one or more non-defective divided images and to output feature amounts, and a feature vector is extracted for each inspection divided image. Since each trained model can learn a pattern specific to each non-defective divided image, in the feature space the point indicated by the feature vector of each inspection divided image is plotted near the set formed by the feature vectors of the corresponding non-defective divided images. Therefore, the point indicated by the feature vector of an inspection divided image containing a special pattern can be plotted near the set formed by the feature vectors of the plurality of non-defective divided images corresponding to that inspection divided image, so that a small defect degree is acquired and the product can be determined to be non-defective.
  • In addition, each trained model can learn different judgment criteria depending on the position or part of the non-defective product image of the inspection object. In the feature space, the points indicated by the feature vectors of the inspection divided images gather in different ranges or regions for each position of the inspection divided image. Therefore, the point indicated by the feature vector of an inspection divided image at another position that includes a pattern which is non-defective at a certain position but defective at that other position can be plotted far from the set formed by the feature vectors of the plurality of non-defective divided images at that other position, so that a large defect degree is acquired and overlooking of defective products is suppressed.
  • Furthermore, each defect degree is acquired based on the point indicated by the feature vector of an inspection divided image and the set formed by the plurality of feature vectors of the plurality of non-defective divided images in the feature space, and the inspection object is inspected based on these plurality of defect degrees.
  • In the above aspect, the acquisition unit may acquire the defect degree of the inspection divided image corresponding to a feature vector based on the distance, in the feature space, between the point indicated by the feature vector and the set. According to this aspect, the defect degree is obtained from the distance between the point indicated by the feature vector and the set formed by the plurality of feature vectors of the plurality of non-defective divided images, which makes it possible to easily express the degree of defect of the inspection divided image.
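The set-distance formulation above can be sketched minimally. The snippet below treats the defect degree as the nearest-neighbour Euclidean distance from the inspection tile's feature vector to the stored good-tile feature vectors; the specific metric and the function names are assumptions for illustration, since the text leaves the exact distance definition open.

```python
import math

def defect_degree(point, good_set):
    """Distance from an inspection tile's feature vector ('point') to the
    set of good-tile feature vectors ('good_set').  The set distance is
    taken here as the nearest-neighbour Euclidean distance, one plausible
    realization; the patent does not fix the metric."""
    return min(
        math.dist(point, g)  # Euclidean distance to one good feature vector
        for g in good_set
    )

# Good-tile feature vectors cluster around (1.0, 1.0) in a 2-D feature space.
good = [(1.0, 1.0), (1.1, 0.9), (0.9, 1.1)]
print(defect_degree((1.0, 1.05), good))  # small -> likely non-defective
print(defect_degree((4.0, 4.0), good))   # large -> likely defective
```

A vector close to the good-tile cluster yields a small defect degree; a far-away vector yields a large one, matching the behaviour described above.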
  • In the above aspect, the acquisition unit may acquire the defect degree of the inspection divided image corresponding to a feature vector based on the distance, in the feature space, between the point indicated by the feature vector and the point indicated by one of the plurality of feature vectors of the plurality of non-defective divided images. According to this aspect as well, the defect degree makes it possible to easily express the degree of defect of the inspection divided image.
  • the degree of defect may be a value indicating the degree of defect in the inspection divided image.
  • the inspection unit may generate a defect degree image based on a plurality of defect degrees, and inspect the inspection object based on the defect degree image.
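The generation of a defect degree image from the per-tile defect degrees, and an inspection over it, might look like the sketch below; the row-major tile layout and the any-tile-over-threshold decision rule are assumptions, as the text does not fix either.

```python
def defect_degree_image(degrees, rows, cols):
    """Arrange per-tile defect degrees (row-major list) into a 2-D map."""
    assert len(degrees) == rows * cols
    return [degrees[r * cols:(r + 1) * cols] for r in range(rows)]

def inspect(degree_map, threshold):
    """Judge the whole object defective if any tile exceeds the threshold.
    (One simple decision rule; the patent leaves the rule open.)"""
    return any(d > threshold for row in degree_map for d in row)

dmap = defect_degree_image([0.1, 0.2, 0.1, 3.5], rows=2, cols=2)
print(inspect(dmap, threshold=1.0))  # True: the 3.5 tile flags a defect
```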
  • In the above aspect, a learning unit may be further provided that learns a learning model for each of one or more non-defective divided images and generates the plurality of trained models. According to this aspect, a plurality of trained models can be obtained without a separate trained model generation device.
  • In the above aspect, the learning unit may learn the learning model using one or more data-augmented non-defective divided images as input, and generate the plurality of trained models. According to this aspect, the limited number of non-defective divided images can be increased, and each trained model can be made robust against variations in the non-defective divided images.
  • In the above aspect, the data augmentation may include applying at least one of translation, rotation, enlargement, reduction, horizontal flip, vertical flip, and filtering to the one or more non-defective divided images. This makes it possible to easily obtain variations of the non-defective divided images.
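A minimal sketch of a few of the listed augmentations, assuming images are represented as plain lists of pixel rows; the function names and the random-combination policy are illustrative, not taken from the document.

```python
import random

def hflip(img):
    """Horizontal flip of an image given as a list of pixel rows."""
    return [list(reversed(row)) for row in img]

def vflip(img):
    """Vertical flip."""
    return list(reversed(img))

def translate(img, dx, pad=0):
    """Shift each row right by dx pixels, padding with 'pad'."""
    return [[pad] * dx + row[:len(row) - dx] for row in img]

def augment(img):
    """Apply a random subset of the transformations listed above."""
    ops = random.sample([hflip, vflip, lambda t: translate(t, 1)], k=2)
    for op in ops:
        img = op(img)
    return img

tile = [[1, 2], [3, 4]]
print(hflip(tile))         # [[2, 1], [4, 3]]
print(vflip(tile))         # [[3, 4], [1, 2]]
print(translate(tile, 1))  # [[0, 1], [0, 3]]
```

Applying such transforms to each good tile multiplies the limited training set, which is the robustness benefit described above.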
  • The above aspect may further include a dividing unit that divides the inspection image of the inspection object into a plurality of inspection divided images. This makes it possible to easily obtain the inspection divided images.
  • The above aspect may further include an imaging unit that acquires an inspection image of the inspection object. This makes it possible to easily obtain the inspection image.
  • The above aspect may further include a storage unit that stores the plurality of trained models. This makes it possible to easily read out each trained model.
  • An image inspection method according to one aspect of the present invention includes: a step of inputting inspection divided images, which are divided images of an object to be inspected, into each of a plurality of trained models trained to receive as input one or more non-defective divided images, which are divided images of a non-defective inspection object, and to output feature amounts, and extracting feature vectors whose components are the feature amounts of the inspection divided images; a step of acquiring, for each feature vector, a defect degree indicating the degree of defect of the corresponding inspection divided image, based on the point indicated by the feature vector and the set formed by the plurality of feature vectors of the plurality of non-defective divided images in the feature space represented by the feature vectors; and a step of inspecting the inspection object based on the plurality of defect degrees.
  • According to this aspect, effects similar to those of the image inspection apparatus described above are obtained: the feature vector of an inspection divided image containing a special pattern is plotted near the set formed by the feature vectors of the corresponding non-defective divided images and a small defect degree is acquired, while the feature vector of an inspection divided image containing a pattern that is non-defective at one position but defective at another is plotted far from the set at that other position, so that a large defect degree is acquired and overlooking of defective products is suppressed.
  • A trained model generation device according to one aspect of the present invention includes a model generation unit that generates a plurality of trained models, each trained to receive as input one or more non-defective divided images, which are divided images of a non-defective inspection object, and to output feature amounts. According to this aspect, a plurality of trained models are generated that receive one or more non-defective divided images as input and output feature amounts.
  • Each trained model can learn a pattern specific to each non-defective divided image, so that in the feature space the point indicated by the feature vector of an inspection divided image containing a special pattern can be plotted near the set formed by the feature vectors of the corresponding non-defective divided images. In addition, each trained model can learn different judgment criteria depending on the position or part of the non-defective product image of the inspection object; in the feature space, the points indicated by the feature vectors of the inspection divided images gather in different ranges or regions for each position. Therefore, the point indicated by the feature vector of an inspection divided image at another position that includes a pattern which is non-defective at a certain position but defective at that other position can be plotted far from the set formed by the feature vectors of the plurality of non-defective divided images at that other position.
  • According to the present invention, it is possible to plot the feature vector of an inspection image containing a special pattern close to the feature vectors of the non-defective product images, and to plot the feature vector of an inspection image of a defective product far away.
  • FIG. 1 is a configuration diagram illustrating a schematic configuration of an image inspection system according to one embodiment.
  • FIG. 2 is a configuration diagram showing the physical configuration of a trained model generation device and an image inspection device in one embodiment.
  • FIG. 3 is a configuration diagram showing the configuration of functional blocks of a trained model generation device according to one embodiment.
  • FIG. 4 is a conceptual diagram for explaining generation of non-defective divided images by the learning image generation unit shown in FIG. 3.
  • FIG. 5 is a conceptual diagram for explaining a learning model used by the model generating unit shown in FIG. 3;
  • FIG. 6 is a conceptual diagram for explaining an example of generation of a plurality of trained models by the model generating unit shown in FIG. 3.
  • FIG. 7 is a conceptual diagram for explaining another example of generation of a plurality of trained models by the model generation unit shown in FIG. 3.
  • FIG. 8 is a configuration diagram showing the configuration of functional blocks of the image inspection apparatus according to one embodiment.
  • FIG. 9 is a conceptual diagram for explaining an example of a method of acquiring the degree of defect.
  • FIG. 10 is a conceptual diagram for explaining another example of the method of acquiring the degree of defect.
  • FIG. 11 is a conceptual diagram for explaining an example of processing by the dividing unit, the extracting unit, and the obtaining unit shown in FIG. 8.
  • FIG. 12 is a conceptual diagram for explaining another example of processing by the dividing unit, the extracting unit, and the acquiring unit shown in FIG. 8.
  • FIG. 13 is a diagram showing an example of an inspection image.
  • FIG. 14 is a conceptual diagram for explaining the processing by the inspection unit shown in FIG. 8.
  • FIG. 15 is a diagram showing an example of a defect degree image.
  • FIG. 16 is a flowchart for explaining an example of a trained model generation process performed by the trained model generation device according to one embodiment.
  • FIG. 17 is a flowchart for explaining an example of image inspection processing performed by the image inspection apparatus according to one embodiment.
  • FIG. 1 is a configuration diagram illustrating a schematic configuration of an image inspection system 1 according to one embodiment.
  • the image inspection system 1 includes an image inspection device 20 and illumination IL.
  • the image inspection device 20 is connected to the trained model generation device 10 via the communication network NW.
  • the illumination IL irradiates the light L onto the inspection object TA.
  • the image inspection device 20 captures the reflected light R and inspects the inspection object TA based on the image of the inspection object TA (hereinafter also referred to as the “inspection image”).
  • the trained model generating device 10 generates a trained model used by the image inspection device 20 for inspection.
  • FIG. 2 is a configuration diagram showing the physical configuration of the trained model generation device 10 and the image inspection device 20 in one embodiment.
  • The trained model generation device 10 and the image inspection device 20 each include a processor 31, a memory 32, a storage device 33, a communication device 34, an input device 35, and an output device 36. These components are connected to each other via a bus so that data can be sent and received.
  • the trained model generation device 10 and the image inspection device 20 are each composed of one computer, but the invention is not limited to this.
  • the trained model generation device 10 and the image inspection device 20 may each be realized by combining a plurality of computers.
  • The configuration shown in FIG. 2 is an example; the trained model generation device 10 and the image inspection device 20 may each have configurations other than these, or may lack some of these configurations.
  • The processor 31 is configured to control the operation of each part of the trained model generation device 10 and the image inspection device 20. The processor 31 is composed of one or more integrated circuits such as a CPU (Central Processing Unit), DSP (Digital Signal Processor), ASIC (Application Specific Integrated Circuit), PLD (Programmable Logic Device), FPGA (Field Programmable Gate Array), and/or SoC (System-on-a-Chip).
  • the memory 32 and the storage device 33 are configured to store programs, data, etc., respectively.
  • the memory 32 is composed of, for example, ROM (Read Only Memory), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM) and/or RAM (Random Access Memory).
  • the storage device 33 is composed of storage such as HDD (Hard Disk Drive), SSD (Solid State Drive) and/or eMMC (Embedded Multi Media Card).
  • the communication device 34 is configured to communicate via at least one of wired and wireless networks.
  • the communication device 34 includes, for example, a network card, a communication module, an interface for connecting to other devices, and the like.
  • the input device 35 is configured so that information can be input by a user's operation.
  • the input device 35 includes, for example, a keyboard, touch panel, mouse, pointing device, and/or microphone.
  • the output device 36 is configured to output information.
  • The output device 36 includes, for example, a display device such as a liquid crystal display (LCD), an EL (Electro Luminescence) display, or a plasma display, and/or a speaker.
  • FIG. 3 is a configuration diagram showing the configuration of functional blocks of the trained model generation device 10 according to one embodiment.
  • FIG. 4 is a conceptual diagram for explaining generation of non-defective divided images by the learning image generation unit 131 shown in FIG. 3.
  • FIG. 5 is a conceptual diagram for explaining the learning model 50 used by the model generation unit 135 shown in FIG. 3.
  • FIG. 6 is a conceptual diagram for explaining an example of generation of a plurality of trained models 55 by the model generation unit 135 shown in FIG. 3.
  • FIG. 7 is a conceptual diagram for explaining another example of generation of the plurality of trained models 55 by the model generation unit 135 shown in FIG. 3.
  • The trained model generation device 10 includes a communication unit 110, a storage unit 120, and a learning unit 130.
  • the communication unit 110 is configured to be able to transmit and receive various types of information.
  • the communication unit 110 transmits a learned model 55, which will be described later, to the image inspection apparatus 20, for example, via the communication network NW.
  • The communication unit 110 receives non-defective product images from other devices, for example via the communication network NW. That is, the communication unit 110 acquires non-defective product images, which are images of the inspection object TA determined to be non-defective. The plurality of non-defective product images acquired by the communication unit 110 (hereinafter, when they are not distinguished from each other, one of them is also referred to as the "non-defective product image 40") are written and stored in the storage unit 120.
  • the storage unit 120 is configured to store various types of information.
  • The storage unit 120 stores, for example, the non-defective product images 40, a plurality of learning images (hereinafter, when not distinguished, one of them is also referred to as the "learning image 124"), a plurality of trained models (hereinafter, when not distinguished, one of them is also referred to as the "trained model 55"), and a plurality of feature vectors (hereinafter, when not distinguished, one of them is also referred to as the "feature vector 129").
  • the learning unit 130 is for learning a learning model and generating a trained model.
  • the learning unit 130 includes, for example, a learning image generation unit 131 and a model generation unit 135 .
  • the learning image generation unit 131 is configured to generate the learning image 124 .
  • the learning image 124 is an image used to generate the trained model 55 .
  • the learning image 124 is, for example, an image obtained by dividing one non-defective product image into a plurality of images (hereinafter referred to as “non-defective product divided images”).
  • The learning image generation unit 131 reads one of the plurality of non-defective product images 40 stored in the storage unit 120, divides the non-defective product image 40 into a plurality of non-defective divided images, and generates the learning images 124. The generated learning images 124 are written and stored in the storage unit 120.
  • The learning image generation unit 131 generates, for example, 16 non-defective divided images by dividing the non-defective product image 40 into four vertically and four horizontally. The non-defective product image 40 only needs to be divided into two or more non-defective divided images; it may be divided into 2 to 15 non-defective divided images, or into 17 or more non-defective divided images.
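The 4x4 division described above can be sketched as follows, assuming an image represented as a list of pixel rows whose dimensions divide evenly; `split_tiles` is a hypothetical helper name, not taken from the document.

```python
def split_tiles(img, n_rows, n_cols):
    """Divide an image (list of pixel rows) into n_rows x n_cols tiles,
    returned in row-major order.  Assumes the dimensions divide evenly."""
    h, w = len(img), len(img[0])
    th, tw = h // n_rows, w // n_cols
    tiles = []
    for r in range(n_rows):
        for c in range(n_cols):
            tiles.append([row[c * tw:(c + 1) * tw]
                          for row in img[r * th:(r + 1) * th]])
    return tiles

# An 8x8 image split 4x4 yields 16 tiles of 2x2 pixels each.
img = [[r * 8 + c for c in range(8)] for r in range(8)]
tiles = split_tiles(img, 4, 4)
print(len(tiles), len(tiles[0]), len(tiles[0][0]))  # 16 2 2
```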
  • the learning image generation unit 131 may perform data augmentation on one or a plurality of non-defective divided images and generate the data augmented image as the learning image 124 .
  • Data augmentation includes applying at least one of translation, rotation, enlargement, reduction, horizontal flip, vertical flip, and filtering to one or more non-defective split images.
  • The filter used for filtering may be, for example, a filter that converts the brightness of the non-defective divided image, a smoothing filter that removes noise, or a filter that extracts edges; any filter can be used.
  • data augmentation is not limited to the methods described above.
  • Data augmentation may also perform a projective transformation on, or add random noise to, one or more non-defective divided images.
  • the model generating unit 135 is configured to learn a learning model for each of one or more non-defective divided images and generate a plurality of trained models 55 .
  • Each trained model 55 is, for example, a model that receives one or more non-defective divided images as input and outputs a feature amount.
  • each trained model 55 may be a model that receives as input one or a plurality of non-defective divided images subjected to data augmentation and outputs a feature amount.
  • each trained model 55 is assumed to be generated by inputting one or more non-defective divided images, unless otherwise specified.
  • the learning model 50 used by the model generation unit 135 to generate the learned model 55 includes a feature extraction unit 51 and a feature determination unit 52 .
  • For the feature extraction unit 51, for example, an autoencoder, a type of neural network and an unsupervised machine learning technique, is used. The autoencoder includes an input layer 511, an output layer 515, and an intermediate layer 513 positioned between the input layer 511 and the output layer 515.
  • the feature determination unit 52 uses, for example, a neural network.
  • The feature extraction unit 51 is not limited to using an autoencoder, and may use, for example, PCA (Principal Component Analysis). Further, the intermediate layer 513 is not limited to one layer, and may be two or more layers. Furthermore, the feature determination unit 52 is not limited to using a neural network; for example, a One-Class SVM (Support Vector Machine) may be used.
  • The autoencoder of the feature extraction unit 51 learns weights (also called coefficients) so that the difference between the input image and the output image is minimized. More specifically, each node, represented by a circle, has a unique weight on each edge, represented by a line, and the weighted values are input to the nodes of the next layer. By learning these weights, the difference between each pixel of the input image and the corresponding pixel of the output image is minimized. Through this learning, it becomes possible to extract a feature quantity, specifically the intermediate feature quantity, from the intermediate layer 513.
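The reconstruction-error training described above can be illustrated with a deliberately tiny linear autoencoder: a 2-D input with a 1-D bottleneck trained by plain gradient descent. This is only a sketch of the principle; the document's autoencoder is a multi-layer neural network trained on non-defective divided images, and all names, data, and hyperparameters here are assumptions.

```python
import random

# A tiny linear autoencoder: 2-D input -> 1-D bottleneck -> 2-D reconstruction.
random.seed(0)
w = [random.uniform(-1, 1) for _ in range(2)]  # encoder weights (input -> z)
v = [random.uniform(-1, 1) for _ in range(2)]  # decoder weights (z -> output)

# Toy samples lying roughly along one direction, so 1-D compression works.
data = [(0.5, 1.0), (1.0, 2.05), (0.25, 0.45), (0.75, 1.5)]

def forward(x):
    z = w[0] * x[0] + w[1] * x[1]       # intermediate feature quantity
    return z, (v[0] * z, v[1] * z)      # bottleneck value and reconstruction

def loss():
    # Sum of squared per-pixel differences between input and reconstruction.
    total = 0.0
    for x in data:
        _, xh = forward(x)
        total += (xh[0] - x[0]) ** 2 + (xh[1] - x[1]) ** 2
    return total

before = loss()
lr = 0.005
for _ in range(200):                    # plain stochastic gradient descent
    for x in data:
        z, xh = forward(x)
        e = (xh[0] - x[0], xh[1] - x[1])
        gv = (2 * e[0] * z, 2 * e[1] * z)           # dL/dv
        gz = 2 * e[0] * v[0] + 2 * e[1] * v[1]      # dL/dz
        gw = (gz * x[0], gz * x[1])                 # dL/dw
        v = [v[i] - lr * gv[i] for i in range(2)]
        w = [w[i] - lr * gw[i] for i in range(2)]
after = loss()
print(after < before)  # the input/output difference has been reduced
```

After training, the bottleneck value z plays the role of the intermediate feature quantity extracted from the intermediate layer.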
  • The feature determination unit 52 outputs a feature amount. A feature quantity is a variable that quantitatively expresses the features of the thing of interest. The output feature quantity output from the feature determination unit 52 is, for example, a quantity that characterizes a non-defective divided image; for instance, single-color histograms or red/green/blue (RGB) histograms can be used.
  • a feature vector is a vector representation of one or more feature values as components.
  • the number (number) of feature amounts represents the dimension (number of dimensions) of the feature vector.
  • A space represented by feature vectors, in other words a space spanned by feature vectors, is called a feature space, and a feature vector is represented as one point in the feature space.
  • The trained model 55 extracts a d-dimensional feature vector 129 from the input image, and the extracted feature vector 129 can be represented as a point in the d-dimensional feature space.
  • The neural network of the feature determination unit 52 learns weights (also referred to as coefficients) so that the distance between points in the feature space is reduced. More specifically, after the weights in the autoencoder of the feature extraction unit 51 are fixed, an input image is input to the input layer 511 and an intermediate feature amount is obtained from the intermediate layer 513. Then, the feature determination unit 52 learns the weights of its neural network so that, in the feature space represented by feature vectors whose components are the output feature amounts, the distances between the points indicated by the feature vectors corresponding to the input images become short. This learning enables the feature determination unit 52 to output the output feature amount. Note that the dimension of the output feature quantity and the dimension of the intermediate feature quantity may be the same or different.
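The objective the feature determination unit minimizes, shrinking the distances between the points of good tiles in feature space, can be summarized as a mean pairwise distance. The function below is an illustrative formulation of that quantity, not the document's actual loss.

```python
import math

def pairwise_distance_loss(vectors):
    """Mean Euclidean distance over all pairs of output feature vectors.
    The feature determination unit learns weights that shrink this value
    for the non-defective divided images (a sketch of the objective only)."""
    n = len(vectors)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    return sum(math.dist(vectors[i], vectors[j]) for i, j in pairs) / len(pairs)

tight = [(1.0, 1.0), (1.0, 1.1), (0.9, 1.0)]   # well-clustered good tiles
loose = [(0.0, 0.0), (2.0, 2.0), (4.0, 0.0)]   # scattered feature vectors
print(pairwise_distance_loss(tight) < pairwise_distance_loss(loose))  # True
```

Training that lowers this quantity pulls the good-tile points together, which is what later makes the set-distance defect degree discriminative.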
  • The input of the learning model 50, that is, the input image, is, for example, a non-defective divided image, and the output of the learning model 50, that is, the output feature amount, is the feature amount of that non-defective divided image.
  • For example, the upper-left non-defective divided images 400 of the respective non-defective product images 40 are input to the corresponding learning model 50A. This learning model 50A then learns its weights so that the points indicated by the feature vectors whose components are the feature quantities of the upper-left non-defective divided images 400 are plotted close to each other in the feature space. As a result, a trained model 55A that outputs the feature amount of the upper-left non-defective divided image 400 is generated.
  • In the same way, the model generation unit 135 learns the corresponding learning models 50A, 50B, 50C, ..., and generates the trained models 55A, 55B, 55C, ....
  • the generated learned models 55A, 55B, 55C, . . . are written and stored in the storage unit 120.
  • The model generation unit 135 extracts the feature vector 129 of each non-defective divided image 400, 402, 404, and so on.
  • a plurality of extracted feature vectors 129 are also written and stored in the storage unit 120 .
  • the number of extracted and stored feature vectors 129 is j ⁇ k, for example.
  • the dimension d of each of the plurality of feature vectors 129 may be the same or different.
  • the learning model 50 is not limited to using a non-defective product divided image at one position of the non-defective product image 40 as an input.
  • The division number m of the non-defective product image 40 (m is an integer of 2 or more) and the generation number n of the learning models 50 used for learning and of the generated trained models 55 (n is an integer of 2 or more) are not limited to being the same number.
  • the learning model 50 may be trained using a plurality of non-defective product divided images, for example, the non-defective product divided image 400 and the non-defective product divided image 402 as input images. In this case, a trained model 55 is generated that outputs the feature amounts of the non-defective product divided images of the two areas of the non-defective product divided image 400 and the non-defective product divided image 402 .
  • When non-defective divided images are similar to each other, the learning models 50 used for them may be integrated. That is, the mutually similar non-defective divided images 402' and 404' are input to the same learning model 50B, and as a result of learning, one trained model 55B' is generated. In this case, the generation number n of the trained models 55 is smaller than the division number m of the non-defective product image 40 (generation number n < division number m). Note that whether or not two non-defective divided images are similar may be determined, for example, based on whether or not their degree of matching is equal to or greater than a predetermined threshold value.
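As one way to realize the "degree of matching" test mentioned above, a sketch along the following lines could be used. The normalized cross-correlation score and the threshold value are illustrative assumptions; the embodiment does not specify which matching measure is used.

```python
import numpy as np

def matching_degree(tile_a, tile_b):
    # Hypothetical matching score: normalized cross-correlation in [-1, 1].
    a = (tile_a - tile_a.mean()) / (tile_a.std() + 1e-8)
    b = (tile_b - tile_b.mean()) / (tile_b.std() + 1e-8)
    return float((a * b).mean())

def are_similar(tile_a, tile_b, threshold=0.9):
    # Two non-defective divided images share one learning model when their
    # matching degree is at or above the predetermined threshold.
    return matching_degree(tile_a, tile_b) >= threshold
```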
  • Each trained model 55 can learn a pattern unique to its corresponding non-defective divided image, and in the feature space, the points indicated by the feature vectors of the inspection divided images are plotted near the sets formed by the feature vectors of their corresponding non-defective divided images. Therefore, in the feature space, the point indicated by the feature vector of an inspection divided image containing a special pattern can be plotted near the set formed by the feature vectors of the plurality of non-defective divided images corresponding to that inspection divided image.
  • Each trained model 55 can learn different judgment criteria depending on the position or part of the non-defective inspection object TA in the non-defective product image, so in the feature space, the points indicated by the feature vectors of the inspection divided images are collected in different ranges or regions for each position of the inspection divided image in the inspection image. Therefore, in the feature space, the point indicated by the feature vector of an inspection divided image at another position, containing a pattern that is non-defective at a certain position but defective at the other position, can be plotted far from the set formed by the feature vectors of the plurality of non-defective divided images at that other position.
  • the learning image generation unit 131 and the model generation unit 135 may be implemented by the processor 31 executing a program stored in the storage device 33.
  • When executing the program, the program may be stored in a storage medium.
  • the storage medium storing the program may be a non-transitory computer readable medium.
  • the non-temporary storage medium is not particularly limited, but may be, for example, a storage medium such as a USB memory or a CD-ROM (Compact Disc ROM).
  • FIG. 8 is a configuration diagram showing the configuration of functional blocks of the image inspection apparatus 20 in one embodiment.
  • FIG. 9 is a conceptual diagram for explaining an example of a method of acquiring the degree of defect.
  • FIG. 10 is a conceptual diagram for explaining another example of the method of acquiring the degree of defect.
  • FIG. 11 is a conceptual diagram for explaining an example of processing by the dividing unit 250, the extracting unit 260, and the obtaining unit 270 shown in FIG.
  • FIG. 12 is a conceptual diagram for explaining another example of processing by the dividing unit 250, the extracting unit 260, and the acquiring unit 270 shown in FIG.
  • FIG. 13 is a diagram showing an example of a non-defective product image 60.
  • FIG. 14 is a conceptual diagram for explaining processing by the inspection unit 280 shown in FIG.
  • FIG. 15 is a diagram showing an example of the defect degree image 70.
  • the image inspection apparatus 20 includes a communication unit 210, a storage unit 220, a learning unit 230, an imaging unit 240, a division unit 250, an extraction unit 260, an acquisition unit 270, and an inspection unit 280.
  • the communication unit 210 is configured to be able to transmit and receive various types of information.
  • the communication unit 210 receives, for example, the plurality of trained models 55 and the plurality of feature vectors 129 from the trained model generation device 10 via the communication network NW.
  • the communication unit 210 receives a plurality of learning images 124 from the trained model generation device 10 or another device, for example, via the communication network NW.
  • the plurality of learning images 124, the plurality of trained models 55, and the plurality of feature vectors 129 received are written and stored in the storage unit 220.
  • Note that the communication unit 210 may receive only one of the plurality of learning images 124, the plurality of trained models 55, and the plurality of feature vectors 129.
  • In this case, the learning unit 230 uses the plurality of learning images 124 to generate the plurality of trained models 55, and extracts the plurality of feature vectors 129 in the process of generating them.
  • the storage unit 220 is configured to store various types of information.
  • the storage unit 220 stores, for example, multiple learning images 124 , multiple trained models 55 , and multiple feature vectors 129 . By providing the storage unit 220 that stores a plurality of trained models 55 in this manner, each trained model can be easily read.
  • the learning unit 230 is configured to learn a learning model for each of one or more non-defective divided images and generate a plurality of trained models 55 .
  • Each trained model 55 is a model that receives one or more non-defective divided images as input and outputs a feature amount.
  • the generated learned models 55 are written and stored in the storage unit 220 . Note that the method of learning the learning model is the same as the description of the model generation unit 135 in the trained model generation device 10, so the description thereof will be omitted.
  • By providing the learning unit 230, which learns a learning model for each set of one or more non-defective divided images and generates the plurality of trained models 55, a plurality of trained models can be obtained even without the trained model generation device 10.
  • each of the plurality of learning images 124 stored in the storage unit 220 is a non-defective divided image
  • The learning unit 230 may generate the plurality of trained models 55 by training a learning model using one or a plurality of non-defective divided images subjected to data augmentation. As a result, the limited number of non-defective divided images can be increased, and each trained model 55 can be made robust against variations in the non-defective divided images.
  • data augmentation applies at least one of translation, rotation, enlargement, reduction, horizontal flip, vertical flip, and filtering to one or more good split images.
  • the filter used for filtering may be, for example, a filter that converts the brightness of the non-defective divided image, a smoothing filter that removes noise, or a filter that extracts edges, and any filter is used. be able to.
  • Since the data augmentation includes applying at least one of translation, rotation, enlargement, reduction, horizontal flip, vertical flip, and filtering to one or more non-defective divided images, variations of the non-defective divided images can be obtained easily.
  • data augmentation is not limited to the methods described above.
  • data augmentation may perform a projective transformation or add random noise to one or more good split images.
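A minimal sketch of the augmentation step described above, assuming the non-defective divided images are numpy arrays. Which transforms to include and the wrap-around handling of the translation are illustrative choices, not requirements of the embodiment.

```python
import numpy as np

def augment(tile):
    # Illustrative augmentation set: identity, horizontal flip, vertical flip,
    # and a 1-pixel translation. Rotation, scaling, projective transforms,
    # filtering, or random noise could be added the same way.
    return [
        tile,
        tile[:, ::-1],             # horizontal flip
        tile[::-1, :],             # vertical flip
        np.roll(tile, 1, axis=1),  # translation with wrap-around (an assumption;
                                   # edge padding could be used instead)
    ]
```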
  • the imaging unit 240 is for acquiring an inspection image of the inspection object TA.
  • the imaging unit 240 includes, for example, an imaging device such as a camera.
  • the imaging unit 240 of this embodiment receives the reflected light R from the inspection object TA and acquires an inspection image.
  • the imaging unit 240 then outputs the acquired inspection image to the dividing unit 250 .
  • the dividing unit 250 is configured to divide the inspection image of the inspection object TA into a plurality of divided inspection images. More specifically, the division unit 250 divides the inspection image by a method similar to the division of the non-defective product image in the trained model generation device 10 . Specifically, the dividing unit 250 vertically and horizontally divides the inspection image of the inspection object TA into four parts to generate 16 divided inspection images. The dividing unit 250 then outputs the generated inspection divided image to the extracting unit 260 . By dividing the inspection image of the inspection object TA into a plurality of inspection divided images in this way, the inspection divided images can be easily obtained.
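The 4-by-4 division described above can be sketched as follows, assuming the inspection image is a numpy array whose height and width are divisible by the division counts; the function name and tiling order are illustrative.

```python
import numpy as np

def split_into_tiles(image, rows=4, cols=4):
    # Divide the inspection image into rows x cols tiles in row-major order
    # (16 inspection divided images for the 4x4 split in the embodiment).
    h, w = image.shape[:2]
    th, tw = h // rows, w // cols
    return [image[r * th:(r + 1) * th, c * tw:(c + 1) * tw]
            for r in range(rows) for c in range(cols)]
```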
  • the extraction unit 260 is configured to input test divided images to a plurality of trained models 55, respectively, and extract feature vectors whose components are feature amounts for the test divided images. Specifically, the extracting unit 260 reads out the plurality of trained models 55 stored in the storage unit 220 and inputs corresponding inspection divided images to each of the plurality of trained models 55 . As described above, since the trained model 55 is trained to output the feature amount of the input image, the extracting unit 260 extracts one or more features for the input inspection divided image from the trained model 55. quantity can be obtained. As a result, a feature vector whose components are one or more feature amounts is extracted. Extraction section 260 then outputs the extracted feature vector to acquisition section 270 . Note that the feature vector of the inspection divided image extracted by the extraction unit 260 is the same d-dimensional vector as the feature vector 129 of the non-defective product divided image.
  • For each extracted feature vector, the acquisition unit 270 acquires a degree of defect indicating the degree of defect in the inspection divided image corresponding to the feature vector, based on the point indicated by the feature vector in the feature space represented by the feature vectors and the set formed by the plurality of feature vectors for the plurality of non-defective divided images.
  • the degree of defect may be expressed in a discrete manner, such as a level, step, class (grade), or layer (hierarchy), or in a continuous manner, such as a numerical value; any representation may be used.
  • the acquisition unit 270 reads out the plurality of feature vectors 129 stored in the storage unit 220, and plots the points indicated by each feature vector 129 in a d-dimensional feature space.
  • a set S1 indicated by a dashed line is formed by black circle points indicated by each feature vector 129.
  • This set is formed for each position of the non-defective product image 40, for example.
  • the acquisition unit 270 may form a set S1 in the feature space in advance based on the plurality of feature vectors 129, and write and store information about the set S1 in the storage unit 220.
  • the acquiring unit 270 plots points indicated by the feature vectors of the test divided images at positions corresponding to the set S1 among the plurality of feature vectors input from the extracting unit 260 in the feature space.
  • the feature vector of the inspection split image at the corresponding position is indicated by the white circle point P1.
  • the acquiring unit 270 acquires the defect degree of the inspection divided image based on the point P1 indicated by the feature vector of the inspection divided image and the set S1.
  • The acquisition unit 270 acquires the degree of defect of the inspection divided image corresponding to a feature vector based on the distance, in the feature space, between the point indicated by the feature vector and the set formed by the feature vectors of the non-defective divided images.
  • For example, the acquisition unit 270 acquires the degree of defect of a certain inspection divided image based on the distance between the point P1 indicated by its feature vector and the set S1 of the plurality of feature vectors of the plurality of non-defective divided images corresponding to that inspection divided image.
  • This defect degree is, for example, a value corresponding to the distance.
  • Since the degree of defect of the inspection divided image corresponding to the feature vector is acquired based on the distance, in the feature space, between the point P1 indicated by the feature vector and the set S1 formed by the plurality of feature vectors of the plurality of non-defective divided images, the degree of defect in the inspection divided image can be indicated easily.
  • acquisition section 270 calculates the distance between the point indicated by the feature vector and the point indicated by one of the plurality of feature vectors of the plurality of non-defective divided images in the feature space. Based on this, the defect degree of the inspection divided image corresponding to the feature vector may be obtained.
  • For example, the acquisition unit 270 acquires the degree of defect of an inspection divided image based on the distance between the point P2 indicated by its feature vector and a point included in the set S2 of the plurality of feature vectors of the plurality of non-defective divided images corresponding to that inspection divided image.
  • This defect degree is, for example, a value corresponding to the distance.
  • a point included in the set S2 is, for example, the closest point to the point P2 among the plurality of points included in the set S2.
  • the obtaining unit 270 can calculate the distance between each of the plurality of points included in the set S2 and the point P2, and determine the closest point to the point P2.
  • Since the degree of defect of the inspection divided image corresponding to the feature vector is acquired based on the distance, in the feature space, between the point P2 indicated by the feature vector and the point indicated by one of the plurality of feature vectors of the plurality of non-defective divided images, the degree of defect in the inspection divided image can be indicated easily.
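A minimal sketch of the distance-based degree of defect described above, using the nearest point of the set (as in the point P2 example). The function name and the use of Euclidean distance are illustrative assumptions; any distance to the set could be substituted.

```python
import numpy as np

def defect_degree(feat, good_feats):
    # Distance from the point indicated by `feat` to the set of points formed
    # by the feature vectors of the non-defective divided images at the same
    # position; here the nearest-neighbour distance stands in for the
    # "distance to the set".
    dists = np.linalg.norm(np.asarray(good_feats) - np.asarray(feat), axis=1)
    return float(dists.min())
```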
  • the acquisition unit 270 repeats the above procedure for each feature vector extracted by the extraction unit 260, and acquires the degree of defect of each of the plurality of divided inspection images. Then, the acquisition unit 270 outputs the plurality of defect degrees to the inspection unit 280 .
  • the inspection image of the inspection object TA acquired by the imaging unit 240 is divided into a plurality of inspection divided images 420, 422, 424, ....
  • the extraction unit 260 inputs each inspection divided image 420, 422, 424, . . . into the corresponding learned models 55A, 55B, 55C, .
  • Each trained model 55A, 55B, 55C, ... outputs the feature amounts of the corresponding inspection divided image, and these are extracted as the feature vectors 440, 442, 444, ....
  • When the trained model 55 has been generated by integrating a plurality of mutually similar non-defective divided images into one learning model 50, the extraction unit 260 inputs the inspection divided images at the positions corresponding to the plurality of similar non-defective divided images, for example, the inspection divided image 422 and the inspection divided image 424, to the corresponding trained model 55B', as shown in FIG.
  • the extraction unit 260 extracts a feature vector 442 corresponding to the test divided image 422 and a feature vector 444 corresponding to the test divided image 424 based on the output of the trained model 55B'.
  • In this way, by outputting the feature amounts of the inspection divided images 420, 422, 424, ..., the extraction unit 260 extracts the plurality of feature vectors 440, 442, 444, ....
  • The feature vector 442 is a vector whose components are the feature amounts of the inspection divided image 422, and the feature vector 444 is a vector whose components are the feature amounts of the inspection divided image 424. That is, if the feature amounts of the inspection divided image 422 and the inspection divided image 424 are the same, the feature vector 442 and the feature vector 444 are the same vector; if there is a difference between their feature amounts, the feature vector 442 and the feature vector 444 can be different vectors.
  • For example, the inspection image 60 of an inspection object TA that is known in advance to be non-defective is divided into six inspection divided images 600, 602, 604, 606, 608, and 610. Of these six inspection divided images, the inspection divided images 602, 606, and 608 contain mutually similar patterns. The inspection divided image 604 contains a special pattern different from the other inspection divided images.
  • In this case, the trained model can extract and output the feature amounts of the patterns of the inspection divided images 602, 606, and 608, but the feature amount of the special pattern of the inspection divided image 604 is not sufficiently extracted.
  • As a result, in the feature space, the point indicated by the feature vector of the inspection divided image 604 containing the special pattern may be plotted at a position away from the set formed by the feature vectors of the other inspection divided images 600, 602, 606, 608, and 610.
  • the defect degree of the inspection divided image 604 becomes large, and the inspection object TA, which is a non-defective product, may be determined as a defective product.
  • In contrast, the image inspection apparatus 20 of the present embodiment inputs the inspection divided images 600, 602, 604, 606, 608, and 610 to the plurality of trained models 55, each of which is trained to receive one or a plurality of non-defective divided images as input and output feature amounts, and extracts feature vectors for the inspection divided images 600, 602, 604, 606, 608, and 610.
  • As a result, in the feature space, the point indicated by the feature vector of the inspection divided image 604 containing the special pattern can be plotted near the set formed by the feature vectors of the plurality of non-defective divided images corresponding to the inspection divided image 604, a small degree of defect can be acquired, and the product can be determined to be non-defective.
  • Further, suppose that a single trained model is obtained using images containing a pattern that is non-defective at a certain position but defective at another position.
  • In that case, in the feature space, the points indicated by the feature vectors of inspection divided images containing such a pattern at the other position are plotted near the set formed by the feature vectors of the other inspection divided images.
  • As a result, the degree of defect of the inspection divided image containing such a pattern at the other position becomes small, the inspection object TA of an inspection image containing an inspection divided image that is actually defective is determined to be non-defective, and the defective product may be overlooked.
  • In contrast, in the present embodiment, each trained model 55 can learn different judgment criteria depending on the position or part of the non-defective inspection object TA in the non-defective product image.
  • In the feature space, the points indicated by the feature vectors of the inspection divided images are therefore collected in different ranges or regions for each position of the inspection divided image in the inspection image. Therefore, in the feature space, the point indicated by the feature vector of an inspection divided image at another position, containing a pattern that is non-defective at a certain position but defective at the other position, can be plotted far from the set formed by the feature vectors of the plurality of non-defective divided images at that other position, a large degree of defect can be acquired, and overlooking of defective products can be suppressed.
  • The inspection unit 280 is configured to inspect the inspection object TA based on the plurality of degrees of defect acquired by the acquisition unit 270. In this way, the degree of defect of each inspection divided image is acquired based on the point indicated by the feature vector of the inspection divided image in the feature space and the set formed by the plurality of feature vectors for the plurality of non-defective divided images, and the inspection object TA is inspected based on these plurality of degrees of defect, so the inspection accuracy for the inspection object TA can be improved.
  • the degree of defect described above is preferably a value indicating the degree of defect in the inspection divided image. This makes it possible to quantitatively indicate the degree of defects in the inspection divided image.
  • the inspection unit 280 is configured to generate a defect degree image based on a plurality of defect degrees and inspect the inspection object TA based on the defect degree image. As a result, it is possible to easily determine whether an object to be inspected is a non-defective product or a defective product, and it is possible to easily realize an inspection with high inspection accuracy.
  • For example, the inspection unit 280 visualizes the degrees of defect 460, 462, 464, ... acquired by the acquisition unit 270 to generate partial images 480, 482, 484, ... corresponding to the respective inspection divided images. The inspection unit 280 generates the defect degree image 48 by integrating the generated partial images 480, 482, 484, .... The vertical and horizontal sizes (numbers of pixels) of the defect degree image 48 may be the same as or different from those of the inspection image 42. Then, the inspection unit 280 determines whether or not the inspection object TA is non-defective based on the defect degree image 48.
  • the defect degree image 70 includes two defect partial images 700 and 702, for example.
  • a defective partial image 700 is a partial image with a relatively low defect degree
  • a defective partial image 702 is a partial image with a relatively high defect degree.
  • The inspection unit 280 determines that the inspection object TA is non-defective when the proportion of the defect partial images 700 and 702 in the defect degree image 70 is equal to or less than a predetermined threshold value, and determines that the inspection object TA is not non-defective, that is, is defective, when the proportion exceeds the predetermined threshold value.
  • the inspection unit 280 may detect defects in the inspection object TA based on the presence or absence of the defect partial images 700 and 702 included in the defect degree image 70 .
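The assembly of the defect degree image and the ratio-based pass/fail judgment described above can be sketched as follows. The tile layout, the per-pixel defect threshold, and the area-ratio threshold are illustrative assumptions; the embodiment leaves the concrete values unspecified.

```python
import numpy as np

def defect_degree_image(degrees, rows=4, cols=4, tile_shape=(2, 2)):
    # Visualize each tile's degree of defect as a constant-valued partial image
    # and stitch the partial images back into one defect degree map.
    img = np.zeros((rows * tile_shape[0], cols * tile_shape[1]))
    for i, d in enumerate(degrees):
        r, c = divmod(i, cols)
        img[r * tile_shape[0]:(r + 1) * tile_shape[0],
            c * tile_shape[1]:(c + 1) * tile_shape[1]] = d
    return img

def is_non_defective(defect_img, defect_thresh=0.5, area_ratio_thresh=0.1):
    # Pass/fail rule: the object is judged non-defective when the fraction of
    # pixels whose defect degree exceeds `defect_thresh` stays at or below
    # the area-ratio threshold.
    ratio = float((defect_img > defect_thresh).mean())
    return ratio <= area_ratio_thresh
```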
  • the learning unit 230 may be implemented by the processor 31 executing a program stored in the storage device 33.
  • the program may be stored in a storage medium.
  • the storage medium storing the program may be a computer-readable non-temporary storage medium.
  • the non-temporary storage medium is not particularly limited, but may be, for example, a storage medium such as a USB memory or a CD-ROM.
  • FIG. 16 is a flowchart for explaining an example of the trained model generation processing S100 performed by the trained model generation device 10 in one embodiment.
  • the communication unit 110 first acquires a plurality of non-defective product images 40 via the communication network NW (step S101).
  • the acquired non-defective product images 40 are stored in the storage unit 120 .
  • the learning image generating unit 131 reads out a plurality of non-defective product images 40 from the storage unit 120, and generates a plurality of learning images 124 based on the plurality of non-defective product images 40 (step S102).
  • the learning image 124 may be a non-defective divided image, or one or more non-defective divided images subjected to data augmentation.
  • the model generation unit 135 generates a plurality of trained models 55 that are trained to output feature amounts by inputting the plurality of learning images 124 generated in step S102. (Step S103). A plurality of generated trained models 55 are stored in the storage unit 120 .
  • the communication unit 110 transmits the multiple learned models 55 generated in step S103 and the multiple feature vectors 129 extracted in the process of generation to the image inspection apparatus 20 via the communication network NW. (step S104).
  • the image inspection device 20 can use a plurality of trained models generated by the trained model generation device 10 .
  • After step S104, the trained model generation device 10 ends the trained model generation processing S100.
  • FIG. 17 is a flowchart for explaining an example of image inspection processing S200 performed by the image inspection apparatus 20 according to one embodiment.
  • the communication unit 210 receives the plurality of trained models 55 and the plurality of feature vectors 129 from the trained model generation device 10, and stores the plurality of trained models 55 and the plurality of feature vectors in the storage unit 220. 129 are stored.
  • the imaging unit 240 acquires an inspection image of the inspection object TA (step S201).
  • the acquired inspection image is output to the dividing section 250 .
  • the dividing unit 250 divides the inspection image acquired in step S201 to generate a plurality of divided inspection images (step S202). A plurality of generated inspection divided images are output to the extraction unit 260 .
  • The extraction unit 260 reads out the plurality of trained models 55 stored in advance in the storage unit 220, inputs the plurality of inspection divided images generated in step S202 to the plurality of trained models 55, and extracts a plurality of feature vectors (step S203).
  • a plurality of generated feature vectors are output to the acquisition unit 270 .
  • The acquisition unit 270 reads out the plurality of feature vectors 129 stored in advance in the storage unit 220 and, for each of the plurality of feature vectors extracted in step S203, acquires the degrees of defect of the plurality of inspection divided images based on the point indicated by the feature vector in the feature space and the set formed by the feature vectors 129 of the non-defective divided images (step S204). The plurality of acquired degrees of defect are output to the inspection unit 280.
  • the inspection unit 280 generates a plurality of partial images from the respective values of the plurality of defect degrees obtained in step S204, and integrates the generated partial images to generate a defect degree image (step S205).
  • the inspection unit 280 inspects the inspection object TA based on the defect degree image generated in step S205 (step S206).
  • After step S206, the image inspection apparatus 20 ends the image inspection processing S200.
  • As described above, according to the image inspection apparatus 20 of the present embodiment, the inspection divided images 600, 602, 604, 606, 608, and 610 are input to the plurality of trained models 55, each trained to output feature amounts with one or more non-defective divided images as input, and feature vectors for the inspection divided images 600, 602, 604, 606, 608, and 610 are extracted.
  • As a result, in the feature space, the point indicated by the feature vector of the inspection divided image 604 containing the special pattern can be plotted near the set formed by the feature vectors of the plurality of non-defective divided images corresponding to the inspection divided image 604, a small degree of defect can be acquired, and the product can be determined to be non-defective.
  • Each trained model can learn different judgment criteria depending on the position or part of the non-defective inspection object TA in the non-defective product image, and in the feature space, the points indicated by the feature vectors of the inspection divided images are collected in different ranges or regions for each position of the inspection divided image in the inspection image.
  • Therefore, in the feature space, the point indicated by the feature vector of an inspection divided image at another position, containing a pattern that is non-defective at a certain position but defective at the other position, can be plotted far from the set formed by the feature vectors of the plurality of non-defective divided images at that other position, a large degree of defect can be acquired, and overlooking of defective products can be suppressed.
  • Each degree of defect is acquired based on the point indicated by the feature vector of the inspection divided image in the feature space and the set formed by the plurality of feature vectors for the plurality of non-defective divided images, and the inspection object TA is inspected based on these plurality of degrees of defect, so the inspection accuracy for the inspection object TA can be improved.
  • In addition, a plurality of trained models 55 are generated by training learning models that receive one or more non-defective divided images as input and output feature amounts. This allows each trained model 55 to learn a pattern specific to its non-defective divided image, and in the feature space, the points indicated by the feature vectors for the inspection divided images 600, 602, 604, 606, 608, and 610 are plotted near the sets formed by the feature vectors of their corresponding non-defective divided images. Therefore, in the feature space, the point indicated by the feature vector of the inspection divided image 604 containing the special pattern can be plotted near the set formed by the feature vectors of the plurality of non-defective divided images corresponding to the inspection divided image 604.
  • Each trained model 55 can learn different determination criteria depending on the position or part of the non-defective inspection object TA in the non-defective product image, so in the feature space, the points indicated by the feature vectors of the inspection divided images are collected in different ranges or regions for each position of the inspection divided image in the inspection image. Therefore, in the feature space, the point indicated by the feature vector of an inspection divided image at another position, containing a pattern that is non-defective at a certain position but defective at the other position, can be plotted far from the set formed by the feature vectors of the plurality of non-defective divided images at that other position.
  • An image inspection device (20) comprising: an extraction unit (260) for inputting inspection divided images, which are divided images of the inspection object (TA), to a plurality of trained models (55) each trained, with one or a plurality of non-defective divided images as input, to output feature amounts, and extracting feature vectors each having the feature amounts for the respective inspection divided images as components; an acquisition unit (270) for acquiring, for each of the extracted feature vectors, a degree of defect indicating the degree of defect of the inspection divided image corresponding to the feature vector, based on the point indicated by the feature vector in the feature space represented by the feature vectors and a set formed by a plurality of feature vectors (129) for a plurality of non-defective divided images; and an inspection unit (280) that inspects the inspection object (TA) based on the plurality of degrees of defect.
  • An image inspection method comprising: a step of inputting inspection divided images, which are divided images of the inspection object (TA), to a plurality of trained models each trained, with one or a plurality of non-defective divided images as input, to output feature amounts, and extracting feature vectors whose components are the feature amounts for the inspection divided images; a step of acquiring, for each of the extracted feature vectors, a degree of defect indicating the degree of defect in the inspection divided image corresponding to the feature vector, based on the point indicated by the feature vector in the feature space represented by the feature vectors and the set formed by the plurality of feature vectors for the plurality of non-defective divided images; and a step of inspecting the inspection object (TA) based on the plurality of degrees of defect.

Landscapes

  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Image Analysis (AREA)

Abstract

The present invention plots the feature vector of an inspection image that contains a special pattern close to the feature vector of an inspection image of a defective article and far from the feature vector of a non-defective article image. The image inspection device (20) comprises: an extraction unit (260) for inputting an inspection divided image, which is a divided image of a test object (TA), to each of a plurality of trained models (55) that have been trained, with one or a plurality of divided non-defective article images as input, so as to output a feature amount, and for extracting the feature vector of each inspection divided image; an acquisition unit (270) for acquiring, for each of the extracted feature vectors, the degree of defect of the inspection divided image corresponding to that feature vector, based on the point of the feature vector and a set formed by a plurality of feature vectors (129) of a plurality of non-defective article images; and an inspection unit (280) for inspecting the test object (TA) based on a plurality of defect degrees.
PCT/JP2021/009510 2021-02-12 2021-03-10 Image inspection device, image inspection method, and trained model generation device WO2022172469A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-020383 2021-02-12
JP2021020383A JP2022123217A (ja) 2021-02-12 2021-02-12 Image inspection device, image inspection method, and trained model generation device

Publications (1)

Publication Number Publication Date
WO2022172469A1 true WO2022172469A1 (fr) 2022-08-18

Family

ID=82837622

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/009510 WO2022172469A1 (fr) 2021-02-12 2021-03-10 Image inspection device, image inspection method, and trained model generation device

Country Status (2)

Country Link
JP (1) JP2022123217A (fr)
WO (1) WO2022172469A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6973207B1 (en) * 1999-11-30 2005-12-06 Cognex Technology And Investment Corporation Method and apparatus for inspecting distorted patterns
WO2019117065A1 (fr) * 2017-12-15 2019-06-20 オムロン株式会社 Dispositif de génération de données, procédé de génération de données, et programme de génération de données
JP2020052520A (ja) * 2018-09-25 2020-04-02 エヌ・ティ・ティ・コムウェア株式会社 判定装置、判定方法、およびプログラム
JP2020181532A (ja) * 2019-04-26 2020-11-05 富士通株式会社 画像判定装置及び画像判定方法


Also Published As

Publication number Publication date
JP2022123217A (ja) 2022-08-24

Similar Documents

Publication Publication Date Title
JP6792842B2 (ja) Appearance inspection device, conversion data generation device, and program
US10818000B2 (en) Iterative defect filtering process
WO2023077404A1 (fr) Defect detection method, apparatus and system
CN109871895B (zh) Circuit board defect detection method and device
WO2019117065A1 (fr) Data generation device, data generation method, and data generation program
KR101704325B1 (ko) Defect observation method and defect observation device
US11783469B2 (en) Method and system for scanning wafer
JP2017049974A (ja) Classifier generation device, quality determination method, and program
JP2017211259A (ja) Inspection device, inspection method, and program
JP2016115331A (ja) Classifier generation device, classifier generation method, quality determination device, quality determination method, and program
JP2006098152A (ja) Defect detection device and defect detection method
TW201512649A (zh) Chip image defect detection method, system, and computer program product
JP2011058926A (ja) Image inspection method and image inspection device
JP2020112483A (ja) Appearance inspection system, calculation model construction method, and calculation model construction program
US8606017B1 (en) Method for inspecting localized image and system thereof
WO2022172469A1 (fr) Image inspection device, image inspection method, and trained model generation device
CN112184717A (zh) Automated segmentation method for quality inspection
WO2022172475A1 (fr) Image inspection device and method, and trained model generation device
JP7465446B2 (ja) Image inspection device, image inspection method, and trained model generation device
US8300918B2 (en) Defect inspection apparatus, defect inspection program, recording medium storing defect inspection program, figure drawing apparatus and figure drawing system
KR20230036650A (ko) Image patch-based defect detection system and method
WO2022172470A1 (fr) Image inspection device, image inspection method, and trained model generation device
WO2022172468A1 (fr) Image inspection device, image inspection method, and trained model generation device
WO2021229901A1 (fr) Image inspection device, image inspection method, and pre-trained model generation device
JP7392166B2 (ja) Image generation device, image generation method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21925718

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21925718

Country of ref document: EP

Kind code of ref document: A1