WO2019186915A1 - Abnormality inspection device and associated method - Google Patents


Info

Publication number
WO2019186915A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
domain
abnormality
unit
determination
Prior art date
Application number
PCT/JP2018/013305
Other languages
English (en)
Japanese (ja)
Inventor
一之 宮澤
杉本 和夫
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to PCT/JP2018/013305 priority Critical patent/WO2019186915A1/fr
Priority to KR1020207027495A priority patent/KR102206698B1/ko
Priority to DE112018007171.5T priority patent/DE112018007171T5/de
Priority to JP2020504740A priority patent/JP6693684B2/ja
Priority to CN201880091679.6A priority patent/CN111902712B/zh
Priority to TW107139476A priority patent/TWI719357B/zh
Publication of WO2019186915A1 publication Critical patent/WO2019186915A1/fr
Priority to US17/031,004 priority patent/US11386549B2/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/001 Industrial image inspection using an image reference approach
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20224 Image subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30141 Printed circuit board [PCB]

Definitions

  • The present invention relates to an abnormality inspection apparatus and an abnormality inspection method for inspecting whether or not an abnormality has occurred in an object.
  • A technique in which a machine automatically inspects whether an object has an abnormality by analyzing an image of the object captured with a camera or the like is important for automating, or reducing the labor of, visual inspections such as those performed in the manufacturing process of industrial products.
  • For example, an inspection apparatus is disclosed in which a good-pixel range is set by calculating statistics of a characteristic for each pixel at the same coordinates across digital images individually acquired from a predetermined number of objects of the same specification, and which determines, for each pixel of a digital image acquired from an object to be inspected, whether the characteristic belongs to the set good-pixel range.
  • The present invention has been made to solve the above-described problems, and an object thereof is to provide an abnormality inspection apparatus and an abnormality inspection method capable of inspecting an object for an abnormality without being affected by differences in imaging conditions or by differences between images due to variations within the normal range.
  • The abnormality inspection apparatus includes: an image acquisition unit that acquires a determination target image obtained by photographing an object to be inspected; a learning result acquisition unit that acquires the result of machine learning of domain forward conversion and domain reverse conversion of an image, performed using normal images obtained by photographing the object in a normal state as teacher data; a determination target image analysis unit that, using the machine learning result acquired by the learning result acquisition unit, sequentially performs domain forward conversion and domain reverse conversion on the determination target image acquired by the image acquisition unit to acquire a domain-converted image; and a determination unit that compares the determination target image acquired by the image acquisition unit with the domain-converted image acquired by the determination target image analysis unit to determine whether or not an abnormality has occurred in the object photographed in the determination target image.
  • According to the present invention, it is possible to inspect an object for an abnormality without being affected by differences in imaging conditions or by differences between images due to variations within the normal range.
  • FIG. 10 is a diagram for describing the domain conversion performed on a determination target image by the determination target image analysis unit in the first embodiment.
  • In Embodiment 1, a diagram showing an example of the determination target image and the domain-converted image when an abnormality has occurred.
  • In Embodiment 1, a diagram for explaining the difference absolute value between the determination target image and the domain-converted image when the determination target image is a normal image, and a diagram for explaining the difference absolute value between the determination target image and the domain-converted image when an abnormality has occurred.
  • FIG. 10B shows an image in which a two-dimensional mask is superimposed on a determination target image.
  • FIGS. 11A and 11B are diagrams showing an example of the hardware configuration of the abnormality inspection apparatus according to Embodiment 1 of the invention.
  • FIG. 12 is a flowchart for explaining the operation of the abnormality inspection apparatus in the “learning mode” in the first embodiment.
  • FIG. 13 is a flowchart for explaining details of the domain conversion learning process performed by the normal image analysis unit in step ST1202 of FIG. 12 in the first embodiment.
  • FIG. 14 is a flowchart for explaining the operation of the abnormality inspection apparatus in the “inspection mode” in the first embodiment.
  • FIG. 15 is a flowchart for describing details of the post-domain-conversion image acquisition process performed by the determination target image analysis unit in step ST1402 of FIG. 14 in the first embodiment.
  • FIG. 15 is a flowchart illustrating details of the abnormality determination process performed by the determination unit in step ST1403 of FIG. 14 in the first embodiment.
  • FIG. 6 is a diagram for describing an example of a display screen image that the input/output device, having received information regarding the determination result from the input/output unit, displays on the display in the first embodiment. In Embodiment 1, a diagram for describing another example of a display screen image that the input/output device, having received information regarding the determination result from the input/output unit, displays on the display.
  • In Embodiment 1, a diagram for describing another example of a display screen image that the input/output device, having received information regarding the determination result from the input/output unit, displays on the display. A diagram showing a configuration example of the abnormality inspection apparatus according to Embodiment 2.
  • In Embodiment 2, a diagram showing an example of an image in which the abnormality analysis unit has superimposed a bounding box and the region-domain image.
  • FIG. 22 is a flowchart for explaining the operation of the abnormality inspection apparatus in the “inspection mode” in the second embodiment.
  • FIG. 24 is a flowchart for explaining details of the abnormality analysis process performed by the abnormality analysis unit in step ST2204 in FIG. 22 in the second embodiment.
  • FIG. 1 is a diagram illustrating a configuration example of an abnormality inspection apparatus 1 according to the first embodiment.
  • the abnormality inspection device 1 is connected to the camera 2 and the input / output device 4 via, for example, a network.
  • the abnormality inspection apparatus 1 acquires an image obtained by photographing the object 3 from the camera 2, determines whether an abnormality has occurred in the object, and outputs the determination result to the input / output device 4.
  • an image obtained by photographing the object 3 is referred to as a “photographed image”.
  • the input / output device 4 includes, for example, a display, a speaker, a keyboard, or a mouse.
  • In the first embodiment, the abnormality inspection apparatus 1 is configured to be connected to the camera 2 and the input/output device 4 via a network.
  • However, this is not a limitation; the abnormality inspection apparatus 1 may include the camera 2 and the input/output device 4.
  • the abnormality inspection apparatus 1 includes a control unit 10, an image acquisition unit 11, an image analysis unit 12, a storage unit 13, a determination unit 14, and an input / output unit 15.
  • the image analysis unit 12 includes a normal image analysis unit 121, a determination target image analysis unit 122, and a learning result acquisition unit 123.
  • the control unit 10 controls operations of the image acquisition unit 11, the image analysis unit 12, the determination unit 14, and the input / output unit 15.
  • the image acquisition unit 11 acquires a captured image from the camera 2 via a network.
  • The timing at which the image acquisition unit 11 receives a captured image from the camera 2 may be determined in advance, for example 30 times per second, or may be determined based on an instruction from the control unit 10.
  • The captured image acquired by the image acquisition unit 11 is assumed to be digitized, but this is not necessarily the case.
  • the image acquisition unit 11 outputs the acquired captured image to the image analysis unit 12.
  • the image analysis unit 12 performs processing corresponding to the two operation modes.
  • The abnormality inspection apparatus 1 has two operation modes.
  • the two operation modes performed by the abnormality inspection apparatus 1 are referred to as “learning mode” and “inspection mode”, respectively.
  • In the “learning mode”, the abnormality inspection apparatus 1 learns, by machine learning based on one or more captured images of the object 3 in a normal state containing no abnormality, how to perform domain conversion for the object 3 in the normal state. The domain conversion will be described later.
  • a captured image obtained by capturing the normal object 3 that does not include an abnormality is referred to as a “normal image”.
  • In the “inspection mode”, the abnormality inspection apparatus 1 determines, based on a captured image in which the object 3 is photographed, whether or not an abnormality has occurred in the object 3 captured in that image.
  • a captured image that is a target for determining whether or not an abnormality has occurred is referred to as a “determination target image”.
  • the “inspection mode” is performed after the operation of the “learning mode” is completed.
  • In the “learning mode”, it is assumed that the object 3 is in a normal state that does not include an abnormality. Accordingly, in the “learning mode”, all the captured images acquired by the image acquisition unit 11 are normal images.
  • The abnormality inspection apparatus 1 may learn the normal state by setting a plurality of different objects of the same type as the object 3 and acquiring from the camera 2 normal images obtained by capturing the plurality of different objects 3 in the normal state.
  • In the “learning mode”, in addition to the normal images, images of another domain corresponding to the normal images are acquired for machine learning. Details of the images of the other domain will be described later.
  • The normal image analysis unit 121 of the image analysis unit 12 performs the processing corresponding to the “learning mode”, and the determination target image analysis unit 122 and the learning result acquisition unit 123 of the image analysis unit 12 perform the processing corresponding to the “inspection mode”.
  • In the “learning mode”, the normal image analysis unit 121 of the image analysis unit 12 acquires a specified number (for example, 1000) of normal images from the image acquisition unit 11, and performs the domain conversion learning process based on the acquired normal images.
  • Domain conversion learning is machine learning that learns how to perform domain conversion when the object 3 is in a normal state.
  • the number of normal images used when the normal image analysis unit 121 performs domain conversion learning may be specified in advance or may be specified by the control unit 10.
  • For example, when the number of normal images is designated in advance, the normal image analysis unit 121 ends the acquisition of normal images when the number of normal images acquired from the image acquisition unit 11 reaches the designated number. When the number of normal images is based on an instruction from the control unit 10, the normal image analysis unit 121 continues to receive the normal images output from the image acquisition unit 11 until the control unit 10 outputs an image acquisition end instruction, and ends the acquisition of normal images when that instruction is output.
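The two termination conditions described above (a pre-designated count, or an end instruction issued via the control unit 10) can be sketched as follows. All names here are hypothetical, since the patent describes behavior rather than an API.

```python
# Sketch of the normal-image acquisition logic (hypothetical names).
# Acquisition ends either when a pre-designated count is reached or
# when an image acquisition end instruction arrives.
def acquire_normal_images(image_source, designated_count=None,
                          end_requested=lambda: False):
    """Collect normal images until the count or an end instruction is reached."""
    images = []
    for image in image_source:
        if designated_count is not None and len(images) >= designated_count:
            break  # the pre-designated number has been reached
        if end_requested():
            break  # image acquisition end instruction from the control unit
        images.append(image)
    return images

# Example: a designated count of 3 stops acquisition after three images.
collected = acquire_normal_images(iter(range(10)), designated_count=3)
```

Either stopping rule can be active; passing neither simply drains the image source.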
  • the control unit 10 receives an image acquisition end instruction from the user. Specifically, for example, the user inputs an image acquisition end instruction from the input / output device 4.
  • the control unit 10 receives the input image acquisition end instruction and outputs an image acquisition end instruction to the normal image analysis unit 121.
  • FIG. 2 is a diagram for explaining domain conversion of an image.
  • An image domain is a type of image, and image domain conversion is the conversion of an image of one domain into an image of a different domain.
  • FIG. 2 shows, as examples of domains, an image obtained by photographing a circuit board with a general camera (an image of the photographic domain; see 201 in FIG. 2) and an image of the region domain, in which the circuit board is segmented into a region for each component (see 202 in FIG. 2).
  • the captured image acquired from the camera 2 by the image acquisition unit 11 is an image of a photographic domain.
  • The region-domain image includes, for example, a substrate region (see 2021 in FIG. 2).
  • In domain conversion, after an image of the photographic domain is converted to obtain an image of the region domain, the region-domain image can be converted again to obtain an image of the photographic domain.
  • Domain forward conversion: domain conversion from a photographic-domain image to a region-domain image.
  • Domain reverse conversion: domain conversion from a region-domain image to a photographic-domain image.
  • FIG. 3 shows an example of an image of a domain other than the photo domain and the region domain.
  • the edge domain image shown in FIG. 3 (see 301 in FIG. 3) is an image obtained by extracting only edges from the photographic domain image.
  • the CAD (Computer Aided Design) domain image (see 302 in FIG. 3) shown in FIG. 3 is an image drawn by CAD software.
  • Non-Patent Document 1: Phillip Isola, Jun-Yan Zhu, Tinghui Zhou, and Alexei A. Efros, “Image-to-Image Translation with Conditional Adversarial Nets,” Conference on Computer Vision and Pattern Recognition (2017).
  • FIG. 4 is a diagram illustrating a state in which image domain conversion is performed using a neural network.
  • FIG. 4 shows, as an example, a state where image domain conversion is performed between an image in a photographic domain and an image in an area domain using a neural network.
  • FIG. 4 shows a state where image domain conversion is performed using a convolutional neural network that is known to exhibit a particularly high effect on image processing among neural networks.
  • A convolutional neural network is a kind of neural network that repeatedly convolves an input image with a large number of filters, changing the spatial dimensions or the number of channels, to finally obtain a desired result.
  • the horizontal length of the filter is indicated as W, the height as H, and the number of channels as C.
  • In the neural network used for the domain conversion shown in FIG. 4, convolution is first repeated so that the spatial dimensions are reduced while the number of channels is increased, and then the spatial dimensions are increased while the number of channels is reduced. Through this conversion, an output image whose spatial dimensions and number of channels match those of the input image is finally obtained.
  • the state of domain conversion shown in FIG. 4 is merely an example, and domain conversion is not limited to that performed using a neural network.
  • the neural network used is not limited to a convolutional neural network.
  • an arbitrary network such as a fully connected neural network or a neural network in which a convolutional neural network and a fully connected neural network are combined can be used.
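The hourglass-shaped dimension bookkeeping described above (spatial dimensions shrink while channels grow, then the reverse) can be illustrated with a small sketch. The concrete input size, the halving/doubling factors, and the depth are illustrative assumptions, not values from the patent.

```python
# Toy shape bookkeeping for an encoder-decoder (hourglass) convolutional
# network: the encoder halves the spatial dimensions and doubles the channel
# count at each step; the decoder reverses this, so the output shape matches
# the input shape, as required for image-to-image domain conversion.
def encoder_decoder_shapes(h, w, c, depth=3):
    shapes = [(h, w, c)]
    for _ in range(depth):           # encoder: downsample, widen channels
        h, w, c = h // 2, w // 2, c * 2
        shapes.append((h, w, c))
    for _ in range(depth):           # decoder: upsample, narrow channels
        h, w, c = h * 2, w * 2, c // 2
        shapes.append((h, w, c))
    return shapes

shapes = encoder_decoder_shapes(256, 256, 3)
```

The first and last entries of `shapes` are identical, mirroring the requirement that the converted image have the same dimensions and channel count as the input.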
  • The normal image analysis unit 121 acquires an image of another corresponding domain for each of the one or more normal images used for domain conversion learning, and causes a neural network to learn to perform domain forward conversion and domain reverse conversion. Since the normal images are photographic-domain images, the images of the other domain are images of a domain other than the photographic domain. In the example of FIG. 4, the images of the other domain are region-domain images.
  • The normal image analysis unit 121 uses the normal images and the images of the other domain as teacher data, and trains the neural network to perform domain forward conversion from a normal image, which is a photographic-domain image, to a region-domain image, and domain reverse conversion from the region-domain image back to the normal image.
  • Neural network learning means optimizing the weights at each edge of the network so that a desired result is obtained for the training data; in a convolutional neural network, this corresponds to optimizing the filter coefficients.
  • In the forward conversion, learning is performed so that the result of converting an input normal image is close to the corresponding region-domain image; in the reverse conversion, learning is performed so that the result of converting an input region-domain image is close to the corresponding normal image.
  • the neural network for forward conversion and the neural network for reverse conversion may have the same configuration or different configurations.
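The paired training of a forward converter and a reverse converter can be illustrated with a deliberately tiny numerical stand-in. Real embodiments train neural networks; here each "network" is a single scalar weight fitted by gradient descent, which only demonstrates the principle that weights are optimized so the forward output approximates the other-domain image and the reverse output approximates the normal image.

```python
import numpy as np

# Toy stand-in: the forward "network" is one weight w_f trained so that
# w_f * x is close to y (photo domain -> region domain), and the reverse
# "network" is a separate weight w_r trained so that w_r * y is close to x.
rng = np.random.default_rng(0)
x = rng.normal(size=100)     # stand-in for normal (photo-domain) images
y = 2.0 * x                  # stand-in for the corresponding region-domain images

w_f, w_r = 0.0, 0.0          # weights of the forward and reverse converters
lr = 0.05
for _ in range(200):         # gradient descent on the squared error
    w_f -= lr * 2 * np.mean((w_f * x - y) * x)
    w_r -= lr * 2 * np.mean((w_r * y - x) * y)

# After training, forward conversion followed by reverse conversion
# approximately reproduces the input, which is what the "inspection mode"
# relies on for normal images.
round_trip = w_r * (w_f * x)
```

As noted above, the forward and reverse converters need not share a configuration; here they are simply two independently optimized weights.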
  • the normal image analysis unit 121 can acquire an image of another corresponding domain by a filter process or the like on the image.
  • For example, when the image of the other corresponding domain is an edge image, the normal image analysis unit 121 can generate the image of the other domain, which is the conversion destination, by applying an edge detection filter to the normal image, which is the conversion source.
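As a concrete illustration of obtaining an edge-domain image by filter processing, the sketch below applies a simple horizontal-difference filter as a stand-in for a full edge-detection filter such as Sobel; the synthetic image and the filter choice are assumptions for illustration only.

```python
import numpy as np

# Derive an edge-domain image from a photo-domain image by filtering.
# A horizontal-difference filter stands in for a full edge detector.
def edge_image(img):
    """Absolute horizontal gradient; zero-padded to keep the input shape."""
    edges = np.zeros_like(img, dtype=float)
    edges[:, 1:] = np.abs(img[:, 1:].astype(float) - img[:, :-1].astype(float))
    return edges

# Synthetic photo-domain image: dark left half, bright right half.
img = np.zeros((4, 6), dtype=np.uint8)
img[:, 3:] = 255
edges = edge_image(img)
# The only nonzero responses lie on the vertical boundary at column 3.
```

Casting to float before subtracting avoids unsigned-integer wraparound, a common pitfall when differencing `uint8` images.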
  • the user may generate an image of another corresponding domain in advance, and the normal image analysis unit 121 may acquire an image of another domain generated in advance.
  • For example, when the image of the other corresponding domain is an image obtained by dividing the normal image into regions, the user can generate such an image by performing region division on the normal image using image editing software or the like.
  • the normal image analysis unit 121 causes the storage unit 13 to store information that defines the configuration of a neural network that performs domain forward conversion and domain reverse conversion as a learning result.
  • the information that defines the configuration of the neural network that is stored in the storage unit 13 by the image analysis unit 12 is information that is necessary and sufficient to reproduce the input / output of the neural network during learning. Specifically, such information includes information regarding the structure of the neural network, such as the number of hidden layers of the neural network or the number of units in each layer, and information on the weight obtained by learning.
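The stored "information defining the configuration of the neural network" (structural parameters plus learned weights) could, for instance, be serialized as structured data. The field names below are hypothetical, since the patent specifies only the kind of information, not a storage format.

```python
import json

# Sketch of storing a learning result: structure of the network (number of
# hidden layers, units per layer) plus the learned weights (toy values).
learning_result = {
    "structure": {"hidden_layers": 3, "units_per_layer": [64, 32, 64]},
    "weights": [[0.1, -0.2], [0.05, 0.3]],
}

serialized = json.dumps(learning_result)   # write into the storage unit
restored = json.loads(serialized)          # read back in "inspection mode"
```

Round-tripping through the storage unit must reproduce the network's input/output behavior, so structure and weights are kept together in one record.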
  • In the “inspection mode”, the determination target image analysis unit 122 of the image analysis unit 12 acquires a specified number (for example, one) of captured images from the image acquisition unit 11, and performs the post-domain-conversion image acquisition process based on each of the captured images.
  • An image acquired in order to determine whether an abnormality has occurred in the object 3 to be inspected in the “inspection mode” is referred to as a “determination target image”.
  • the determination target image is a captured image obtained by capturing the target object 3 to be inspected by the user.
  • The determination target image analysis unit 122 generates, for each determination target image, a domain-converted image that is used to determine whether or not an abnormality has occurred in the object 3 photographed in the determination target image. The domain-converted image will be described later.
  • the determination target image analysis unit 122 acquires a specified number of determination target images from the image acquisition unit 11.
  • the number of determination target images that the determination target image analysis unit 122 acquires for the post-domain conversion image acquisition process may be specified in advance or may be specified from the control unit 10.
  • A specific method by which the determination target image analysis unit 122 acquires the specified number of determination target images is the same as the method by which the normal image analysis unit 121 acquires the specified number of normal images in the “learning mode”.
  • the determination target image analysis unit 122 performs domain conversion on the determination target image using information that defines the configuration of the neural network stored in the storage unit 13.
  • the learning result acquisition unit 123 described later acquires information that defines the configuration of the neural network stored in the storage unit 13, and outputs the acquired information to the determination target image analysis unit 122.
  • the determination target image analysis unit 122 performs domain conversion on the determination target image using the information output from the learning result acquisition unit 123.
  • the information defining the configuration of the neural network acquired by the learning result acquisition unit 123 from the storage unit 13 is information stored by the normal image analysis unit 121 as a learning result in the “learning mode”.
  • FIG. 5 is a diagram for describing domain conversion performed on the determination target image by the determination target image analysis unit 122 in the first embodiment.
  • The determination target image analysis unit 122 first converts the determination target image by domain forward conversion, and acquires a region-domain image.
  • The region-domain image acquired by the determination target image analysis unit 122 by performing domain forward conversion on the determination target image in the “inspection mode” is also referred to as a “domain-forward-converted image”.
  • the image acquired by the image acquisition unit 11 from the camera 2 is a photographic domain image, so the determination target image is also a photographic domain image.
  • The determination target image analysis unit 122 then converts the domain-forward-converted image by domain reverse conversion, and acquires a photographic-domain image.
  • the determination target image analysis unit 122 performs domain conversion that converts an image of a photographic domain into an image of an area domain, as illustrated in FIG.
  • the determination target image analysis unit 122 can perform domain conversion that converts an image of a photographic domain into an image of an arbitrary domain.
  • The determination target image analysis unit 122 outputs, to the determination unit 14, the determination target image acquired from the image acquisition unit 11 and the photographic-domain image acquired by sequentially performing domain forward conversion and domain reverse conversion on the determination target image.
  • the image of the photographic domain acquired by the determination target image analysis unit 122 by sequentially performing the domain forward conversion and the domain reverse conversion on the determination target image in the “inspection mode” is referred to as “domain converted image”.
  • the determination target image analysis unit 122 uses a neural network for both domain forward conversion and domain reverse conversion, but this is only an example.
  • The determination target image analysis unit 122 does not necessarily need to use a neural network for both domain forward conversion and domain reverse conversion, and may use a neural network for only one of them.
  • the learning result acquisition unit 123 of the image analysis unit 12 acquires the learning result stored in the storage unit 13 in the “inspection mode”.
  • The learning result stored in the storage unit 13 is the information defining the configuration of the neural network that performs domain forward conversion and domain reverse conversion, which the normal image analysis unit 121 stored as a learning result in the “learning mode”.
  • Based on an instruction from the control unit 10, the image analysis unit 12 switches between operation in the “learning mode” by the normal image analysis unit 121 and operation in the “inspection mode” by the determination target image analysis unit 122 and the learning result acquisition unit 123.
  • the storage unit 13 stores information that defines the configuration of a neural network that performs domain forward conversion and domain reverse conversion, which is a learning result obtained when the normal image analysis unit 121 performs domain conversion learning in the “learning mode”.
  • In the first embodiment, the storage unit 13 is provided in the abnormality inspection apparatus 1; however, this is not a limitation, and the storage unit 13 may be provided outside the abnormality inspection apparatus 1 at a location that the abnormality inspection apparatus 1 can refer to.
  • The determination unit 14 acquires the determination target image and the domain-converted image output from the determination target image analysis unit 122, and performs an abnormality determination process that compares the acquired determination target image with the domain-converted image to determine whether an abnormality has occurred in the object 3 photographed in the determination target image.
  • the normal image analysis unit 121 causes the neural network to learn using a normal image and an image of another domain corresponding to the normal image. Therefore, in the “inspection mode”, when the determination target image is a normal image, the domain-converted image based on the determination target image substantially matches the determination target image acquired from the image acquisition unit 11 (see FIG. 5).
  • On the other hand, when an abnormality has occurred in the object 3, the determination target image differs, in the state of the photographed object 3, from the normal images used when the normal image analysis unit 121 trained the neural network. Therefore, when domain forward conversion is performed on the determination target image, the conversion is not performed correctly for the abnormal part of the object 3.
  • As a result, the finally obtained domain-converted image is restored as if it were a normal image. That is, a difference occurs between the determination target image and the domain-converted image at the abnormal part of the object 3 photographed in the determination target image (see FIG. 6).
  • the determination unit 14 determines whether an abnormality has occurred in the object 3 photographed in the determination target image, using the principle of the abnormality inspection using the domain conversion as described above.
  • the determination unit 14 can detect the presence / absence of an abnormality of the object 3 photographed in the determination target image and the abnormal part by comparing the determination target image and the domain-converted image.
• the determination unit 14 calculates the absolute difference in pixel values between the determination target image acquired from the determination target image analysis unit 122 and the domain-converted image. Specifically, the determination unit 14 calculates, for each pair of pixels at corresponding positions among the pixels constituting the determination target image and the domain-converted image, the absolute difference between their pixel values.
• here, the pixel value is, for example, brightness (a luminance value).
• based on the calculated absolute difference values between the determination target image and the domain-converted image, the determination unit 14 generates an image in which the determination target image is represented by the difference absolute values.
• hereinafter, an image in which each pixel of the determination target image is indicated by the absolute difference between the determination target image and the domain-converted image is referred to as a "difference image".
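As an illustrative sketch (not part of the original disclosure), the difference image described above can be computed as follows, assuming both images are given as same-sized 2-D arrays of luminance values; the function name is chosen here for illustration only:

```python
# Sketch of the per-pixel absolute-difference computation between the
# determination target image and the domain-converted image.
import numpy as np

def difference_image(target: np.ndarray, converted: np.ndarray) -> np.ndarray:
    """Per-pixel absolute difference of luminance between two images."""
    # int16 avoids wrap-around when subtracting uint8 luminance values
    return np.abs(target.astype(np.int16) - converted.astype(np.int16)).astype(np.uint8)

target = np.array([[10, 200], [30, 40]], dtype=np.uint8)
converted = np.array([[12, 100], [30, 35]], dtype=np.uint8)
diff = difference_image(target, converted)  # [[2, 100], [0, 5]]
```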
• the determination unit 14 performs threshold processing on the difference image, setting the value of a pixel whose difference absolute value is less than the threshold to 0 and the value of a pixel whose difference absolute value is equal to or greater than the threshold to 1. Note that this is merely an example; the determination unit 14 may instead set the value of a pixel whose difference absolute value is less than the threshold to 1 and the value of a pixel whose difference absolute value is equal to or greater than the threshold to 0, or may use values other than 1 and 0 for the two cases.
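The threshold processing just described admits a minimal sketch (illustrative Python; the 0/1 convention matches the first example in the text):

```python
# Binarize a difference image: pixels below the threshold become 0,
# pixels at or above it become 1 (other conventions work equally well).
import numpy as np

def binarize(diff: np.ndarray, threshold: int) -> np.ndarray:
    return (diff >= threshold).astype(np.uint8)

diff = np.array([[2, 100], [0, 50]], dtype=np.uint8)
binary = binarize(diff, 50)  # [[0, 1], [0, 1]]
```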
• the determination unit 14 may determine the threshold used in the threshold processing on the difference image by any appropriate method. For example, the determination unit 14 determines the threshold based on an instruction from the control unit 10. Specifically, for example, the user inputs a threshold from the input / output device 4, and the control unit 10 receives the input threshold. The determination unit 14 then adopts the threshold received by the control unit 10 as the threshold used in the threshold processing. Alternatively, a user or the like may store a threshold in the storage unit 13 in advance, and the determination unit 14 may perform the threshold processing using the threshold stored in the storage unit 13. The determination unit 14 may also determine the threshold adaptively according to the distribution of difference values or the like.
• a method for adaptively determining a threshold is disclosed, for example, in Non-Patent Document 2 below: Nobuyuki Otsu, "A Threshold Selection Method from Gray-Level Histograms," IEEE Trans. on Systems, Man, and Cybernetics, 1979.
• in the method of Non-Patent Document 2, for a given candidate threshold, the set of pixels whose pixel values are equal to or higher than the threshold is taken as class 1, and the set of the remaining pixels as class 2.
• the inter-class variance and the intra-class variance are then obtained from the class 1 and class 2 pixel values, and the threshold is determined so that the degree of separation calculated from these variances is maximized.
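The adaptive threshold selection of Non-Patent Document 2 (Otsu's method) can be sketched as below. This is an illustrative implementation, not the patent's code; it maximizes the between-class variance, which is equivalent to maximizing the degree of separation because the total variance is fixed:

```python
# Otsu threshold selection from a gray-level histogram (illustrative sketch).
import numpy as np

def otsu_threshold(image: np.ndarray) -> int:
    hist = np.bincount(image.ravel(), minlength=256).astype(np.float64)
    total = hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w2 = hist[:t].sum()   # class 2: pixels below the candidate threshold
        w1 = total - w2       # class 1: pixels at or above it
        if w1 == 0 or w2 == 0:
            continue
        mu2 = (hist[:t] * np.arange(t)).sum() / w2
        mu1 = (hist[t:] * np.arange(t, 256)).sum() / w1
        var_between = w1 * w2 * (mu1 - mu2) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Two well-separated pixel populations: the threshold falls between them.
img = np.array([10] * 50 + [200] * 50, dtype=np.uint8)
t = otsu_threshold(img)
```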
• as a result of the threshold processing, the determination unit 14 divides the difference image into a region formed by the set of pixels whose difference absolute value is less than the threshold and a region formed by the set of pixels whose difference absolute value is equal to or greater than the threshold, and obtains a rectangle circumscribing the latter region (hereinafter referred to as a "bounding box"). The determination unit 14 then determines a location where a bounding box exists as an area in which an abnormality has occurred in the object 3.
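A sketch of obtaining the circumscribing rectangle from the binarized difference image follows (illustrative Python; for simplicity a single connected above-threshold region is assumed, whereas a connected-component labeling step would yield one box per region):

```python
# Circumscribing rectangle (bounding box) of the above-threshold region.
import numpy as np

def bounding_box(binary: np.ndarray):
    """Return (top, left, height, width) of the above-threshold region, or None."""
    rows, cols = np.nonzero(binary)
    if rows.size == 0:
        return None
    top, left = rows.min(), cols.min()
    height = rows.max() - top + 1
    width = cols.max() - left + 1
    return int(top), int(left), int(height), int(width)

binary = np.zeros((6, 6), dtype=np.uint8)
binary[2:4, 1:5] = 1          # a small above-threshold region
box = bounding_box(binary)    # (2, 1, 2, 4)
```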
• in this way, the determination unit 14 performs threshold processing on the difference image generated by comparing the determination target image output from the determination target image analysis unit 122 with the domain-converted image, and determines whether or not an abnormality has occurred in the object 3 photographed in the determination target image.
  • FIG. 9 is a diagram for explaining an example of a result obtained by the determination unit 14 performing threshold processing on the difference image in the first embodiment.
• an area indicated by 901 in FIG. 9 is an area that the determination unit 14 determined to be less than the threshold in the difference image,
• and an area indicated by 902 in FIG. 9 is an area that the determination unit 14 determined to be equal to or greater than the threshold in the difference image.
  • a rectangle denoted by reference numeral 903 in FIG. 9 is a bounding box. In the determination target image, the position where the bounding box exists is the position where an abnormality has occurred in the object 3 being photographed.
• the determination unit 14 outputs, to the input / output unit 15, information related to the result of determining, as described above, whether or not an abnormality has occurred in the target object 3 photographed in the determination target image.
  • the information related to the determination result includes at least information specifying whether an abnormality has occurred, such as abnormality or no abnormality, and information on the determination target image.
• when the determination unit 14 determines that an abnormality has occurred, it includes information related to the bounding box in the information related to the determination result and outputs it to the input / output unit 15.
  • the information related to the bounding box is information such as the coordinates of the upper left point on the image indicating the bounding box, the vertical width of the bounding box, or the horizontal width of the bounding box.
  • the determination unit 14 may output the difference image to the input / output unit 15.
• the determination unit 14 may ignore a bounding box that does not satisfy a predetermined condition on its position or size (hereinafter referred to as a "bounding box condition") and refrain from determining that an abnormality has occurred for such a box.
• in that case, the determination unit 14 includes information indicating that there is no abnormality in the information regarding the determination result output to the input / output unit 15, and does not include information regarding the bounding box. In this way, it is possible to prevent erroneous detection of an abnormality outside the target area of the difference image for which the presence of an abnormality is to be determined, or erroneous detection of an abnormality caused by noise.
• the determination unit 14 may determine the bounding box condition by any appropriate method. For example, the determination unit 14 can determine the bounding box condition based on an instruction from the control unit 10. Specifically, the user inputs the bounding box condition from the input / output device 4, and the control unit 10 receives the input condition and notifies the determination unit 14.
• alternatively, a user or the like may store the bounding box conditions in the storage unit 13 in advance, and the determination unit 14 may judge, based on the conditions stored in the storage unit 13, whether a bounding box indicates an abnormality.
• the bounding box conditions stored in the storage unit 13 are, for example, the following:
  - the vertical width of the bounding box is 10 pixels or more
  - the horizontal width of the bounding box is 10 pixels or more
  - a two-dimensional mask for limiting the target area
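The example conditions above can be sketched as a single check (illustrative Python; the helper name and the (top, left, height, width) box format are assumptions, not from the original disclosure):

```python
# Check a bounding box against example conditions: minimum vertical and
# horizontal size, plus a 2-D mask limiting the target area.
import numpy as np

def satisfies_conditions(box, mask: np.ndarray,
                         min_h: int = 10, min_w: int = 10) -> bool:
    top, left, height, width = box
    if height < min_h or width < min_w:
        return False
    # the whole box must lie inside the target area of the 2-D mask
    return bool(mask[top:top + height, left:left + width].all())

mask = np.zeros((100, 100), dtype=bool)
mask[20:80, 20:80] = True                               # target area
ok = satisfies_conditions((30, 30, 12, 15), mask)       # inside, large enough
too_small = satisfies_conditions((30, 30, 5, 15), mask) # vertical width < 10
outside = satisfies_conditions((0, 0, 12, 15), mask)    # outside the target area
```

A box failing any condition would be ignored, as described above.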
  • FIG. 10 is a diagram for explaining an example of the two-dimensional mask stored in the storage unit 13 as the bounding condition in the first embodiment.
  • FIG. 10B shows an image of a two-dimensional mask
  • FIG. 10A shows an image obtained by superimposing the two-dimensional mask shown in FIG. 10B on the determination target image.
  • an area indicated by 1001 in FIG. 10B indicates the target area.
• the determination unit 14 ignores a bounding box detected in the region outside the target area (indicated by 1002 in FIG. 10B) in the difference image generated based on the determination target image.
  • the input / output unit 15 transmits information regarding the determination result output from the determination unit 14 to the input / output device 4.
  • the input / output device 4 receives the information transmitted from the input / output unit 15 and displays the received information on a display, for example.
  • the user checks the display of the input / output device 4 and inspects whether the object 3 is in an abnormal state.
  • the input / output device 4 is a PC (Personal Computer) having a display, for example.
  • An example of the screen when information related to the determination result is displayed on the display in the input / output device 4 will be described with reference to the drawings in the description of operations described later.
  • the input / output unit 15 transmits information about the determination result to the input / output device 4, but this is only an example, and the input / output unit 15 stores the information about the determination result. For example, it may be transmitted to an external control device or the like.
  • FIG. 11A and 11B are diagrams illustrating an example of a hardware configuration of the abnormality inspection apparatus 1 according to the first embodiment.
• the functions of the control unit 10, the image acquisition unit 11, the image analysis unit 12, and the determination unit 14 are realized by the processing circuit 1101. That is, the abnormality inspection apparatus 1 includes the processing circuit 1101 for determining, based on the acquired image, whether or not an abnormality has occurred in the object 3 photographed in the image, and for performing control to output the determination result.
  • the processing circuit 1101 may be dedicated hardware as shown in FIG. 11A or may be a CPU (Central Processing Unit) 1105 that executes a program stored in the memory 1106 as shown in FIG. 11B.
• when the processing circuit 1101 is dedicated hardware, the processing circuit 1101 is, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
• when the processing circuit is the CPU 1105, the functions of the control unit 10, the image acquisition unit 11, the image analysis unit 12, and the determination unit 14 are realized by software, firmware, or a combination of software and firmware. That is, the control unit 10, the image acquisition unit 11, the image analysis unit 12, and the determination unit 14 are realized by a processing circuit such as the CPU 1105, which executes programs stored in an HDD (Hard Disk Drive) 1102 or the memory 1106, or a system LSI (Large Scale Integration). It can also be said that the programs stored in the HDD 1102, the memory 1106, and the like cause a computer to execute the procedures and methods of the control unit 10, the image acquisition unit 11, the image analysis unit 12, and the determination unit 14.
• the memory 1106 is, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), or a magnetic disk, a flexible disk, an optical disk, a compact disk, a mini disk, a DVD (Digital Versatile Disc), or the like.
• note that some of the functions of the control unit 10, the image acquisition unit 11, the image analysis unit 12, and the determination unit 14 may be realized by dedicated hardware and the rest by software or firmware.
• for example, the function of the control unit 10 can be realized by the processing circuit 1101 as dedicated hardware, while the functions of the image acquisition unit 11, the image analysis unit 12, and the determination unit 14 can be realized by a processing circuit reading out and executing programs stored in the memory 1106.
• the storage unit 13 is configured by, for example, the HDD 1102. This is merely an example; the storage unit 13 may also be configured by a DVD, the memory 1106, or the like.
• that is, the storage unit 13 can be composed of various recording media.
  • the abnormality inspection apparatus 1 includes an input interface device 1103 and an output interface device 1104 that communicate with a device such as the camera 2 or the input / output device 4.
  • the input / output unit 15 includes an input interface device 1103 and an output interface device 1104.
  • the operation of the abnormality inspection apparatus 1 according to Embodiment 1 will be described.
  • the operation of the abnormality inspection apparatus 1 in the “learning mode” and the operation of the abnormality inspection apparatus 1 in the “inspection mode” will be described separately.
  • the operation of the abnormality inspection apparatus 1 in the “learning mode” will be described.
  • FIG. 12 is a flowchart for explaining the operation of the abnormality inspection apparatus 1 in the “learning mode” in the first embodiment.
  • the image acquisition unit 11 acquires a captured image from the camera 2 via the network (step ST1201). As described above, all the captured images acquired by the image acquisition unit 11 in the “learning mode” are “normal images”. Note that the processing in step ST1201 is performed by the control unit 10 receiving the activation signal and the setting signal from the input / output device 4 and controlling the image acquisition unit 11 to acquire a captured image. Specifically, for example, the user activates the abnormality inspection apparatus 1 in the input / output device 4 and sets the “learning mode”.
  • the input / output device 4 transmits an activation signal for the abnormality inspection device 1 and a setting signal for setting the abnormality inspection device 1 to the “learning mode” to the abnormality inspection device 1 based on an instruction from the user.
  • the control unit 10 receives an activation signal and a setting signal input from the user.
  • the image acquisition unit 11 outputs the acquired normal image to the normal image analysis unit 121 of the image analysis unit 12.
  • the normal image analysis unit 121 acquires the designated number of normal images from the image acquisition unit 11, and performs domain conversion learning processing based on the acquired normal images (step ST1202).
  • the process of step ST1202 is performed by the control unit 10 instructing the image analysis unit 12 to operate in the “learning mode”.
  • FIG. 13 is a flowchart for explaining the details of the domain conversion learning process performed in step ST1202 of FIG. 12 by the normal image analysis unit 121 in the first embodiment.
  • the normal image analysis unit 121 acquires the normal image output from the image acquisition unit 11 in step ST1201 in FIG. 12 (step ST1301)
  • the normal image analysis unit 121 performs image acquisition end determination (step ST1302). Specifically, the normal image analysis unit 121 determines whether or not a predetermined number of normal images have been acquired for use in performing domain conversion learning.
• if the normal image analysis unit 121 determines that the predetermined number of normal images has not been acquired ("NO" in step ST1303), it returns to step ST1301 and continues acquiring the normal images output from the image acquisition unit 11. If the normal image analysis unit 121 determines that the predetermined number of normal images has been acquired ("YES" in step ST1303), it ends the acquisition of normal images and proceeds to step ST1304.
• the normal image analysis unit 121 acquires a corresponding image of another domain for each of the normal images acquired from the image acquisition unit 11 in step ST1301, and causes the neural network to learn domain forward conversion and domain reverse conversion (step ST1304).
• the normal image analysis unit 121 stores, in the storage unit 13, information that defines the configuration of the neural network that performs domain forward conversion and domain reverse conversion (step ST1305). Specifically, the normal image analysis unit 121 stores the information defining the configuration of the neural network in the storage unit 13 via the control unit 10, and the control unit 10 instructs the storage unit 13 to store that information.
• as described above, the abnormality inspection apparatus 1 trains, based on the normal images acquired from the camera 2, the neural network to be used later in the "inspection mode", and stores the information that defines the configuration of the learned neural network.
  • FIG. 14 is a flowchart for explaining the operation of the abnormality inspection apparatus 1 in the “inspection mode” in the first embodiment.
  • the image acquisition unit 11 acquires a determination target image from the camera 2 via the network (step ST1401). Note that the process of step ST1401 is performed by the control unit 10 receiving the activation signal and the setting signal from the input / output device 4 and controlling the image acquisition unit 11 to acquire a captured image. Specifically, for example, the user activates the abnormality inspection apparatus 1 in the input / output device 4 and sets it to the “inspection mode”.
• the input / output device 4 transmits, based on an instruction from the user, an activation signal for the abnormality inspection device 1 and a setting signal for setting the abnormality inspection device 1 to the "inspection mode" to the abnormality inspection device 1.
  • the control unit 10 receives an activation signal and a setting signal input from the user.
  • the image acquisition unit 11 outputs the acquired determination target image to the determination target image analysis unit 122 of the image analysis unit 12.
  • the determination target image analysis unit 122 acquires a specified number of determination target images from the image acquisition unit 11, and performs domain conversion image acquisition processing based on the acquired determination target images (step ST1402).
  • the process of step ST1402 is performed by the control unit 10 instructing the determination target image analysis unit 122 to operate in the “inspection mode”.
• FIG. 15 is a flowchart illustrating details of the domain-converted image acquisition process performed by the determination target image analysis unit 122 in step ST1402 of FIG. 14 in the first embodiment.
  • the determination target image analysis unit 122 acquires the determination target image output from the image acquisition unit 11 in step ST1401 (step ST1501)
  • the determination target image analysis unit 122 performs image acquisition end determination (step ST1502). Specifically, the determination target image analysis unit 122 determines whether or not the specified number of determination target images for determining abnormality has been acquired.
• if the determination target image analysis unit 122 determines that the specified number of determination target images has not been acquired ("NO" in step ST1503), it returns to step ST1501 and continues acquiring the determination target images output from the image acquisition unit 11. If the determination target image analysis unit 122 determines that the specified number of determination target images has been acquired ("YES" in step ST1503), it ends the acquisition of determination target images and proceeds to step ST1504.
  • the learning result acquisition unit 123 acquires information that defines the configuration of the neural network stored in the storage unit 13 (step ST1504).
  • the learning result acquisition unit 123 outputs the acquired information to the determination target image analysis unit 122.
  • the determination target image analysis unit 122 performs domain conversion of the determination target image using the information acquired by the learning result acquisition unit 123 in step ST1504 (step ST1505). Specifically, the determination target image analysis unit 122 sequentially performs domain forward conversion and domain reverse conversion on the determination target image to obtain a domain converted image. Then, the determination target image analysis unit 122 outputs the determination target image acquired from the image acquisition unit 11 in step ST1501 and the domain-converted image acquired in step ST1505 to the determination unit 14.
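The sequential conversion of step ST1505 can be sketched minimally as below (illustrative Python; the two learned networks are assumed to be available as callables and are stubbed here with trivial placeholders, whereas real networks would be restored from the stored configuration information):

```python
# Sequentially apply the learned domain forward and reverse conversions
# to obtain the domain-converted image.
import numpy as np

def acquire_domain_converted(image: np.ndarray, forward_net, reverse_net) -> np.ndarray:
    """Domain forward conversion followed by domain reverse conversion."""
    return reverse_net(forward_net(image))

# Placeholder stand-ins: forward doubles the values, reverse halves them,
# so for a "normal" image the round trip reproduces the input.
forward_net = lambda x: x.astype(np.float32) * 2.0
reverse_net = lambda y: (y / 2.0).astype(np.uint8)

image = np.array([[10, 20], [30, 40]], dtype=np.uint8)
converted = acquire_domain_converted(image, forward_net, reverse_net)
```

For a normal determination target image the round trip substantially reproduces the input, which is the property the subsequent comparison relies on.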
• the determination unit 14 acquires the determination target image and the domain-converted image output from the determination target image analysis unit 122 in step ST1402, compares the acquired determination target image with the domain-converted image, and performs the abnormality determination process (step ST1403).
  • FIG. 16 is a flowchart illustrating details of the abnormality determination process performed by determination unit 14 in step ST1403 of FIG. 14 in the first embodiment.
  • the determination unit 14 calculates an absolute difference between the determination target image acquired from the determination target image analysis unit 122 and the domain-converted image in step ST1402 of FIG. 14, and generates a difference image (step ST1601).
  • the determination unit 14 performs threshold processing on the difference image generated by comparing the determination target image and the domain-converted image, and determines whether or not an abnormality has occurred in the target object 3 photographed in the determination target image. Determination is made (step ST1602).
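The two steps ST1601 and ST1602 can be sketched end to end (illustrative Python; function and variable names are assumptions, and the decision here simply reports an abnormality when any above-threshold pixel remains):

```python
# End-to-end sketch of the abnormality determination: difference image,
# threshold processing, and a yes/no decision.
import numpy as np

def judge_abnormality(target: np.ndarray, converted: np.ndarray, threshold: int) -> bool:
    diff = np.abs(target.astype(np.int16) - converted.astype(np.int16))  # ST1601
    binary = diff >= threshold                                           # ST1602
    return bool(binary.any())  # True: abnormality detected

normal = np.full((4, 4), 100, dtype=np.uint8)
restored = normal.copy()       # domain conversion restores a normal-looking image
abnormal = normal.copy()
abnormal[1, 2] = 255           # a defect the conversion "repairs" back to normal

no_defect = judge_abnormality(normal, restored, threshold=30)   # False
defect = judge_abnormality(abnormal, restored, threshold=30)    # True
```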
  • the determination unit 14 outputs, to the input / output unit 15, information related to a determination result that determines whether or not an abnormality has occurred in the target object 3 photographed in the determination target image.
  • the input / output unit 15 transmits information regarding the determination result output from the determination unit 14 in step ST1403 to the input / output device 4 (step ST1404).
  • the control unit 10 gives an operation instruction to the input / output unit 15 after the abnormality determination process by the determination unit 14.
  • the input / output device 4 receives the information transmitted from the input / output unit 15 and displays the received information on, for example, a display.
• FIGS. 17 to 19 are diagrams for explaining examples of display screen images that the input / output device 4 displays on the display after receiving the information related to the determination result from the input / output unit 15 in the first embodiment.
• FIG. 17 shows an example of a display screen image displayed by the input / output device 4 when the determination unit 14 has transmitted, via the input / output unit 15, a result of determining that no abnormality has occurred in the object 3 photographed in the determination target image.
  • the input / output device 4 simply displays the determination target image on the display as it is, and displays a message (see 1701 in FIG. 17) for notifying that there is no abnormality on the display.
  • “OK” is displayed as an example of a message for notifying that there is no abnormality. This is only an example, and the input / output device 4 only needs to display an appropriate message that can notify the user that there is no abnormality.
• FIG. 18 shows an example of a display screen image displayed by the input / output device 4 when the determination unit 14 has transmitted, via the input / output unit 15, a result of determining that one abnormality has occurred in the target object 3 photographed in the determination target image.
  • the input / output device 4 superimposes and displays an abnormal part on the determination target image (see 1801 in FIG. 18), and a message for notifying that an abnormality has occurred (see 1802 in FIG. 18). ) On the display.
  • the input / output device 4 displays “NG” as a message for notifying that an abnormality has occurred.
  • the input / output device 4 superimposes the abnormal part on the determination target image and displays a message (see 1803 in FIG. 18) for calling attention to the abnormal part on the display.
  • the input / output device 4 displays “CHECK!” As a message for calling attention to an abnormal part.
  • the input / output device 4 may identify the abnormal part from the information in the bounding box included in the information related to the determination result transmitted from the input / output unit 15.
  • the letters “NG” are displayed as an example of a message for notifying that an abnormality has occurred, but this is only an example.
  • the input / output device 4 only needs to display an appropriate message that can notify the user that an abnormality has occurred.
  • the characters “CHECK!” Are displayed as an example of a message for calling attention to an abnormal part, but this is only an example.
  • the input / output device 4 only needs to display an appropriate message that can prompt the user to pay attention to the abnormal part.
• FIG. 19 shows another example of a display screen image displayed by the input / output device 4 when the determination unit 14 has transmitted, via the input / output unit 15, a result of determining that one abnormality has occurred in the target object 3 photographed in the determination target image.
• as shown in FIG. 19, instead of directly displaying the abnormal portion as in FIG. 18, the input / output device 4 may display on the display a composite image obtained by combining the difference image with the determination target image.
  • the input / output device 4 displays the determination target image on the left side and the above-described composite image on the right side.
• in the composite image, a portion of the determination target image where the difference from the normal state is large is displayed conspicuously (see 1901 in FIG. 19). Therefore, the user can easily grasp the locations that require attention when performing an abnormality inspection.
  • the input / output device 4 may generate a composite image by an appropriate method.
• for example, the input / output device 4 may generate a composite image in which portions with a small difference appear dark and portions with a large difference appear bright (see FIG. 19), or may color the composite image so that portions with a small difference appear blue and portions appear increasingly red as the difference grows.
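One of the colorings just mentioned can be sketched as below (illustrative Python; a simple linear blue-to-red mapping is assumed here, not a rendering prescribed by the original disclosure):

```python
# Map a difference image to an RGB visualization: small differences appear
# blue, large differences appear red.
import numpy as np

def colorize_difference(diff: np.ndarray) -> np.ndarray:
    """Blue = small difference, red = large difference."""
    norm = diff.astype(np.float32) / 255.0
    rgb = np.zeros(diff.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = (norm * 255).astype(np.uint8)          # red grows with difference
    rgb[..., 2] = ((1.0 - norm) * 255).astype(np.uint8)  # blue fades with difference
    return rgb

diff = np.array([[0, 255]], dtype=np.uint8)
overlay = colorize_difference(diff)  # pure blue pixel, then pure red pixel
```

Such an overlay could then be alpha-blended with the determination target image to form the composite image.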
  • the image of the display screen as shown in FIGS. 17 to 19 is merely an example, and the input / output device 4 may display other display screens.
  • the input / output device 4 may display a combination of display screens as shown in FIGS.
• the input / output device 4 may also output the information on the determination result received from the input / output unit 15 by, for example, voice or music.
  • the abnormality inspection apparatus 1 performs domain conversion on the determination target image acquired from the camera 2 using information that has been learned in advance and that defines the configuration of the neural network. Then, the image after domain conversion is acquired. Then, the abnormality inspection apparatus 1 performs threshold processing on the difference image generated by comparing the determination target image with the domain-converted image, and determines whether an abnormality has occurred in the determination target image.
• it is conceivable to devise the photographing system so that the illumination conditions and the positional relationship between the camera 2 and the object 3 are kept constant.
• however, such a photographing system requires, for example, a light shielding plate covering the periphery of the camera 2 or the object 3, or a jig for positioning the camera 2 or the object 3 with high accuracy, which raises the problem of increased introduction costs.
  • the image analysis unit 12 acquires a domain-converted image by directly using the determination target image using a neural network that has been learned in advance. Then, the determination unit 14 determines whether an abnormality has occurred in the determination target image by comparing the determination target image with the domain-converted image. Therefore, it is possible to make a highly accurate abnormality determination that is not affected by the difference in the imaging conditions or the difference between the images due to the variation within the normal range as described above. Moreover, since the abnormality inspection apparatus 1 does not require a reliable fixing of the object and the camera or a highly accurate positioning device, an increase in introduction cost can be suppressed.
• as described above, the abnormality inspection apparatus 1 according to the first embodiment includes: the image acquisition unit 11 that acquires a determination target image obtained by photographing the object 3 to be inspected; the learning result acquisition unit 123 that acquires the result of machine learning of domain forward conversion and domain reverse conversion of images, performed using, as teacher data, normal images obtained by photographing the object 3 in the normal state; the determination target image analysis unit 122 that sequentially performs domain forward conversion and domain reverse conversion on the determination target image acquired by the image acquisition unit 11, using the machine learning result acquired by the learning result acquisition unit 123, to acquire a domain-converted image; and the determination unit 14 that compares the determination target image acquired by the image acquisition unit 11 with the domain-converted image acquired by the determination target image analysis unit 122 and determines whether or not an abnormality has occurred in the object 3 photographed in the determination target image.
• thereby, the user or the like does not need to define in detail what state constitutes an abnormality in the determination target image, and the inspection can be applied universally to any abnormality.
  • machine learning can be performed using a plurality of images in which the positional relationship between the object and the camera is not constant. When such machine learning is performed, an arbitrary image can be targeted in the inspection mode, and there is an effect that it is not necessary to reliably fix the object and the camera or to perform high-precision positioning.
• Embodiment 2. In the first embodiment, the abnormality inspection apparatus 1 learns the configuration of the neural network based on normal images in advance and, using the learning result, determines whether or not an abnormality has occurred in the object 3 photographed in the determination target image.
• in the second embodiment, an embodiment is described in which the abnormality inspection apparatus 1a further analyzes an abnormality occurring in the object 3 photographed in the determination target image.
  • FIG. 20 is a diagram illustrating a configuration example of the abnormality inspection apparatus 1a according to the second embodiment.
• the abnormality inspection apparatus 1a shown in FIG. 20 differs from the abnormality inspection apparatus 1 of the first embodiment in that an abnormality analysis unit 16 is added. Since the configuration of the abnormality inspection apparatus 1a other than the abnormality analysis unit 16 is the same as that of the abnormality inspection apparatus 1 of the first embodiment, the same components are denoted by the same reference numerals and redundant description is omitted.
  • in Embodiment 2, the determination target image analysis unit 122 of the image analysis unit 12 outputs to the determination unit 14, in addition to the determination target image and the domain-converted image, a domain forward conversion image (see FIGS. 5 and 6) obtained by domain forward conversion of the determination target image.
  • the determination unit 14 acquires the determination target image, the domain forward conversion image, and the domain-converted image output from the determination target image analysis unit 122.
  • as in Embodiment 1, the determination unit 14 performs the abnormality determination process by comparing the determination target image with the domain-converted image.
  • the determination unit 14 outputs information on the determination result to the abnormality analysis unit 16.
  • the determination unit 14 also outputs the domain forward conversion image acquired from the determination target image analysis unit 122 to the abnormality analysis unit 16 together with the information on the determination result.
  • a description overlapping with Embodiment 1 is omitted.
  • the abnormality analysis unit 16 performs an abnormality analysis process based on the information on the determination result output from the determination unit 14 and the domain forward conversion image, and outputs information on the abnormality analysis result to the input/output unit 15. Specifically, when the determination unit 14 outputs information on a determination result indicating that an abnormality has occurred in the target object 3 photographed in the determination target image, the abnormality analysis unit 16 superimposes the bounding box on the domain forward conversion image to analyze in which region of the determination target image the abnormality has occurred. Note that the information on the bounding box is included in the information on the determination result output from the determination unit 14.
  • for example, the abnormality analysis unit 16 superimposes the bounding box on the domain forward conversion image (see FIG. 21). In FIG. 21, 211 is an image in the region domain, 2101 is a substrate region, 2102 is a wiring region, 2103 is an IC region, and 2104 is the bounding box. In the example shown in FIG. 21, an abnormality has occurred in the wiring region.
  • therefore, the abnormality analysis unit 16 analyzes that an abnormality has occurred in the wiring, and outputs information indicating that there is an abnormality in the wiring to the input/output unit 15 as information on the abnormality analysis result. At this time, the abnormality analysis unit 16 outputs the information on the determination result acquired from the determination unit 14 to the input/output unit 15 together with the information on the abnormality analysis result.
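The region analysis described above (superimposing the bounding box on the domain forward conversion image and reading off which region it covers) can be sketched as follows. The label values, region names, and majority-vote rule are illustrative assumptions, not taken from the patent:

```python
import numpy as np

# Hypothetical mapping from region-domain label values to region names,
# mirroring the substrate / wiring / IC regions of FIG. 21.
REGION_NAMES = {0: "substrate", 1: "wiring", 2: "IC"}

def analyze_abnormal_region(domain_forward_image, bounding_box):
    """Superimpose the bounding box on the domain forward conversion
    image (a per-pixel region-label map) and report which region the
    abnormality falls in, by majority vote over the boxed pixels."""
    r0, r1, c0, c1 = bounding_box
    patch = domain_forward_image[r0:r1 + 1, c0:c1 + 1]
    labels, counts = np.unique(patch, return_counts=True)
    return REGION_NAMES[int(labels[np.argmax(counts)])]

# Toy label map: substrate everywhere, a wiring stripe in columns 3-4.
label_map = np.zeros((8, 8), dtype=np.uint8)
label_map[:, 3:5] = 1
region = analyze_abnormal_region(label_map, (2, 4, 3, 4))
```

Because the domain forward conversion image is already a semantic map, locating the bounding box within it directly yields the component name (here, "wiring") that downstream reporting can use.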
  • when the determination unit 14 outputs information on a determination result indicating that no abnormality has occurred in the target object 3 photographed in the determination target image, the abnormality analysis unit 16 outputs the information on the determination result to the input/output unit 15 as it is, without performing the abnormality analysis.
  • the abnormality analysis unit 16 may read arbitrary information stored in advance in the storage unit 13 based on the information that an abnormality has occurred in the wiring region, and output the read information to the input/output unit 15 as information on the abnormality analysis result. The arbitrary information read from the storage unit 13 by the abnormality analysis unit 16 is, for example, an error code corresponding to the abnormality that has occurred, or a manual for dealing with that abnormality.
  • in the abnormality inspection apparatus 1a, the “learning mode” and the “inspection mode” are performed as in the abnormality inspection apparatus 1 according to Embodiment 1.
  • the specific operation in the “learning mode” of the abnormality inspection apparatus 1a is the same as the specific operation in the “learning mode” of the abnormality inspection apparatus 1 described in Embodiment 1 with reference to FIGS., so the overlapping description is omitted.
  • FIG. 22 is a flowchart for explaining the operation of the abnormality inspection apparatus 1a in the “inspection mode” in the second embodiment.
  • the image acquisition unit 11 acquires a determination target image from the camera 2 via the network (step ST2201).
  • the specific operation of the image acquisition unit 11 in step ST2201 is the same as the specific operation of the image acquisition unit 11 in step ST1401 of FIG. 14 described in the first embodiment.
  • the image acquisition unit 11 outputs the acquired determination target image to the determination target image analysis unit 122 of the image analysis unit 12.
  • the determination target image analysis unit 122 acquires a specified number of determination target images from the image acquisition unit 11, and performs the domain conversion image acquisition process based on the acquired determination target images (step ST2202). In step ST2202, the determination target image analysis unit 122 performs the operation described with reference to FIG. 15 in Embodiment 1. However, in step ST2202, the determination target image analysis unit 122 outputs the domain forward conversion image to the determination unit 14 in addition to the determination target image and the domain-converted image.
  • the determination unit 14 acquires the determination target image, the domain-converted image, and the domain forward conversion image output from the determination target image analysis unit 122 in step ST2202, and performs the abnormality determination process by comparing the acquired determination target image with the domain-converted image (step ST2203). In step ST2203, the determination unit 14 performs the abnormality determination process as described with reference to FIG. 16 in Embodiment 1. The determination unit 14 outputs the domain forward conversion image acquired from the determination target image analysis unit 122 in step ST2202 to the abnormality analysis unit 16 together with information on the determination result.
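A minimal sketch of an abnormality determination of this kind, comparing the determination target image with the domain-converted (reconstructed) image: the pixel-difference criterion, both thresholds, and the bounding-box computation are illustrative assumptions, since the comparison here is described only by reference to FIG. 16:

```python
import numpy as np

def abnormality_determination(target, reconstructed,
                              diff_threshold=30, area_threshold=4):
    """Compare the determination target image with the domain-converted
    image. Pixels whose absolute difference exceeds diff_threshold are
    candidate abnormal pixels; an abnormality is reported only when at
    least area_threshold such pixels exist.

    Returns (is_abnormal, bounding_box), where bounding_box is
    (row_min, row_max, col_min, col_max) or None."""
    diff = np.abs(target.astype(np.int16) - reconstructed.astype(np.int16))
    mask = diff > diff_threshold
    if mask.sum() < area_threshold:
        return False, None
    rows = np.where(mask.any(axis=1))[0]
    cols = np.where(mask.any(axis=0))[0]
    return True, (int(rows[0]), int(rows[-1]), int(cols[0]), int(cols[-1]))

target = np.zeros((8, 8), dtype=np.uint8)
target[2:5, 3:6] = 200            # a bright defect absent from the reconstruction
recon = np.zeros((8, 8), dtype=np.uint8)
is_abnormal, box = abnormality_determination(target, recon)
```

The returned bounding box plays the role of the bounding-box information that the determination unit 14 passes on for the subsequent abnormality analysis.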
  • the abnormality analysis unit 16 performs the abnormality analysis process based on the information on the determination result and the domain forward conversion image output from the determination unit 14 in step ST2203 (step ST2204). In the “inspection mode”, the control unit 10 instructs the abnormality analysis unit 16 to operate after the abnormality determination process by the determination unit 14.
  • FIG. 23 is a flowchart for explaining the details of the abnormality analysis processing performed in step ST2204 in FIG. 22 by the abnormality analysis unit 16 in the second embodiment.
  • the abnormality analysis unit 16 determines whether the determination unit 14 has output information on a determination result indicating that an abnormality has occurred in the target object 3 photographed in the determination target image (step ST2301).
  • when the abnormality analysis unit 16 determines that the determination unit 14 has output information on a determination result indicating that no abnormality has occurred in the target object 3 photographed in the determination target image (“NO” in step ST2301), the process shown in FIG. 23 ends, and the operation proceeds to step ST2205 in FIG. 22. At this time, the abnormality analysis unit 16 outputs the information on the determination result output from the determination unit 14 to the input/output unit 15 as it is.
  • when it is determined in step ST2301 that the determination unit 14 has output information on a determination result indicating that an abnormality has occurred in the target object 3 photographed in the determination target image (“YES” in step ST2301), the abnormality analysis unit 16 superimposes the bounding box on the domain forward conversion image and analyzes in which region of the determination target image the abnormality has occurred (step ST2302). The abnormality analysis unit 16 then outputs information on the abnormality analysis result to the input/output unit 15 together with the information on the determination result output from the determination unit 14.
  • the input/output unit 15 transmits the information on the determination result output from the abnormality analysis unit 16 in step ST2204 to the input/output device 4 (step ST2205).
  • the input/output unit 15 transmits the information on the abnormality analysis result to the input/output device 4 together with the information on the determination result.
  • the input/output device 4 receives the information transmitted from the input/output unit 15 and displays the received information on, for example, a display. For example, when information on the abnormality analysis result is output from the input/output unit 15, the input/output device 4 displays on the display the location in the target object 3 where the abnormality has occurred, an error code, or information about the countermeasure manual.
  • as described above, the abnormality inspection apparatus 1a can, as in Embodiment 1, inspect the target object for abnormalities without being affected by differences in imaging conditions or by differences between images due to variations within the normal range.
  • further, in the abnormality inspection apparatus 1a, the determination target image analysis unit 122 performs, on the determination target image acquired by the image acquisition unit 11, domain forward conversion using the result of machine learning in which normal images were used as teacher data, and outputs the acquired domain forward conversion image to the determination unit 14. The abnormality inspection apparatus 1a further includes the abnormality analysis unit 16, which performs an abnormality analysis using the domain forward conversion image when the determination unit 14 determines that an abnormality has occurred in the target object 3 captured in the determination target image acquired by the image acquisition unit 11. Therefore, when it is determined that an abnormality has occurred in the determination target image, the abnormality can be analyzed using the domain forward conversion image, and information on the classification of the abnormality or on how to deal with it can be provided.
  • in Embodiments 1 and 2 described above, the normal image analysis unit 121 performs machine learning using normal images in the “learning mode” and stores the results of the machine learning; the learning result acquisition unit 123 acquires the machine learning results stored by the normal image analysis unit 121, and the determination target image analysis unit 122 uses the results of the machine learning to acquire domain conversion images. However, the present invention is not limited to this; it is only necessary that machine learning using normal images be performed in advance before the abnormality inspection apparatuses 1 and 1a execute the “inspection mode”. For example, the abnormality inspection apparatuses 1 and 1a may be configured without the normal image analysis unit 121, and the learning result acquisition unit 123 may acquire machine learning results stored in advance.
  • in the present invention, within the scope of the invention, the embodiments can be freely combined, any component of each embodiment can be modified, or any component of each embodiment can be omitted.
  • the abnormality inspection apparatus according to the present invention is configured to be able to inspect an object for abnormalities without being affected by differences in imaging conditions or by differences between images due to variations within the normal range.
  • therefore, the present invention can be applied to an abnormality inspection apparatus that inspects whether or not there is an abnormality based on an image obtained by photographing the target object.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Signal Processing (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Image Analysis (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Image Processing (AREA)

Abstract

In the present invention, an image acquisition unit acquires a determination target image in which a target object to be inspected is captured; a learning result acquisition unit acquires the results of machine learning of domain forward conversion of an image or domain reverse conversion of the image, where said machine learning uses, as teacher data, a normal image in which the target object is captured in a normal state; a determination target image analysis unit uses the machine learning results to sequentially perform domain forward conversion and domain reverse conversion on the determination target image, and acquires a domain-converted image; and a determination unit compares the determination target image with the domain-converted image and determines whether or not an abnormality has occurred in the target object captured in the determination target image.
PCT/JP2018/013305 2018-03-29 2018-03-29 Dispositif d'inspection d'anomalies et procédé associé WO2019186915A1 (fr)

Priority Applications (7)

Application Number Priority Date Filing Date Title
PCT/JP2018/013305 WO2019186915A1 (fr) 2018-03-29 2018-03-29 Dispositif d'inspection d'anomalies et procédé associé
KR1020207027495A KR102206698B1 (ko) 2018-03-29 2018-03-29 이상 검사 장치 및 이상 검사 방법
DE112018007171.5T DE112018007171T5 (de) 2018-03-29 2018-03-29 Abnormalitätsinspektionsvorrichtung und abnormalitätsinspektionsverfahren
JP2020504740A JP6693684B2 (ja) 2018-03-29 2018-03-29 異常検査装置および異常検査方法
CN201880091679.6A CN111902712B (zh) 2018-03-29 2018-03-29 异常检查装置及异常检查方法
TW107139476A TWI719357B (zh) 2018-03-29 2018-11-07 異常檢查裝置以及異常檢查方法
US17/031,004 US11386549B2 (en) 2018-03-29 2020-09-24 Abnormality inspection device and abnormality inspection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/013305 WO2019186915A1 (fr) 2018-03-29 2018-03-29 Dispositif d'inspection d'anomalies et procédé associé

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/031,004 Continuation US11386549B2 (en) 2018-03-29 2020-09-24 Abnormality inspection device and abnormality inspection method

Publications (1)

Publication Number Publication Date
WO2019186915A1 true WO2019186915A1 (fr) 2019-10-03

Family

ID=68059606

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/013305 WO2019186915A1 (fr) 2018-03-29 2018-03-29 Dispositif d'inspection d'anomalies et procédé associé

Country Status (7)

Country Link
US (1) US11386549B2 (fr)
JP (1) JP6693684B2 (fr)
KR (1) KR102206698B1 (fr)
CN (1) CN111902712B (fr)
DE (1) DE112018007171T5 (fr)
TW (1) TWI719357B (fr)
WO (1) WO2019186915A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021220437A1 (fr) * 2020-04-28 2021-11-04 三菱電機株式会社 Dispositif d'inspection d'apparence et procédé d'inspection d'apparence
JPWO2021220336A1 (fr) * 2020-04-27 2021-11-04
WO2021229901A1 (fr) * 2020-05-15 2021-11-18 オムロン株式会社 Dispositif d'inspection d'images, procédé d'inspection d'images et dispositif de génération de modèle pré-appris
WO2022153564A1 (fr) * 2021-01-14 2022-07-21 オムロン株式会社 Dispositif d'inspection de composant

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11272097B2 (en) * 2020-07-30 2022-03-08 Steven Brian Demers Aesthetic learning methods and apparatus for automating image capture device controls
TWI779836B (zh) * 2021-09-15 2022-10-01 英業達股份有限公司 基於影像處理的鍵盤檔案驗證方法
KR102656162B1 (ko) * 2021-12-16 2024-04-11 의료법인 성광의료재단 인공지능을 이용한 라디오믹스 기반의 골건강 상태 평가 장치 및 방법
JP2023141721A (ja) * 2022-03-24 2023-10-05 株式会社Screenホールディングス 検査システム、教師データ生成装置、教師データ生成方法およびプログラム
CN114825636A (zh) * 2022-05-26 2022-07-29 深圳博浩远科技有限公司 一种光伏逆变器健康状态监控与告警系统及方法

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06281592A (ja) * 1993-03-29 1994-10-07 Sumitomo Metal Ind Ltd 表面検査方法
US7295695B1 (en) * 2002-03-19 2007-11-13 Kla-Tencor Technologies Corporation Defect detection via multiscale wavelets-based algorithms
WO2011004534A1 (fr) * 2009-07-09 2011-01-13 株式会社 日立ハイテクノロジーズ Procédé de classification des défauts d'un semi-conducteur, appareil de classification des défauts d'un semi-conducteur et programme de classification des défauts d'un semi-conducteur
US20110082650A1 (en) * 2009-10-07 2011-04-07 Iyun Leu Method for utilizing fabrication defect of an article
JP2016021004A (ja) * 2014-07-15 2016-02-04 株式会社ニューフレアテクノロジー マスク検査装置及びマスク検査方法
JP6241576B1 (ja) * 2016-12-06 2017-12-06 三菱電機株式会社 検査装置及び検査方法

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4146971B2 (ja) * 1999-07-07 2008-09-10 アピックヤマダ株式会社 リードフレーム検査装置
EP2012273B1 (fr) * 2006-04-26 2019-05-29 Mitsubishi Electric Corporation Dispositif de detection d'objet et ce meme dispositif pour ascenseur
JP4369961B2 (ja) * 2007-03-23 2009-11-25 株式会社日立製作所 異常検知装置及び異常検知プログラム
JP2009250777A (ja) * 2008-04-07 2009-10-29 Fuji Electric Holdings Co Ltd 表面検査装置および表面検査方法
JP2010055194A (ja) * 2008-08-26 2010-03-11 Sony Corp 画像処理装置および方法、学習装置および方法、並びにプログラム
KR101057525B1 (ko) * 2009-07-09 2011-08-17 삼성에스디아이 주식회사 배터리 팩
JP2012098181A (ja) * 2010-11-02 2012-05-24 Sumitomo Electric Ind Ltd 検出装置及び検出方法
JP5804834B2 (ja) 2011-08-03 2015-11-04 株式会社オービット 検査プログラム、同プログラムを格納した記録媒体及び検査装置
US9076233B2 (en) * 2012-02-03 2015-07-07 Seiko Epson Corporation Image processing device and electronic apparatus using the same
JP5661833B2 (ja) * 2013-02-28 2015-01-28 ファナック株式会社 線状パターンを含む対象物の外観検査装置及び外観検査方法
JP6303352B2 (ja) * 2013-09-18 2018-04-04 株式会社デンソーウェーブ 外観検査システム
JP6333871B2 (ja) * 2016-02-25 2018-05-30 ファナック株式会社 入力画像から検出した対象物を表示する画像処理装置
JP6281592B2 (ja) 2016-04-06 2018-02-21 大日本印刷株式会社 レプリカテンプレートの製造方法
JP2017215277A (ja) * 2016-06-02 2017-12-07 住友化学株式会社 欠陥検査システム、フィルム製造装置及び欠陥検査方法
JP6702064B2 (ja) * 2016-06-07 2020-05-27 株式会社デンソー 外観異常検査装置、方法、及びプログラム
JP2020053771A (ja) * 2018-09-25 2020-04-02 オリンパス株式会社 画像処理装置、撮像装置
CN109829904B (zh) * 2019-01-29 2022-01-14 京东方科技集团股份有限公司 检测屏幕上灰尘的方法、装置、电子设备、可读存储介质
CN111260545B (zh) * 2020-01-20 2023-06-20 北京百度网讯科技有限公司 生成图像的方法和装置

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06281592A (ja) * 1993-03-29 1994-10-07 Sumitomo Metal Ind Ltd 表面検査方法
US7295695B1 (en) * 2002-03-19 2007-11-13 Kla-Tencor Technologies Corporation Defect detection via multiscale wavelets-based algorithms
WO2011004534A1 (fr) * 2009-07-09 2011-01-13 株式会社 日立ハイテクノロジーズ Procédé de classification des défauts d'un semi-conducteur, appareil de classification des défauts d'un semi-conducteur et programme de classification des défauts d'un semi-conducteur
US20110082650A1 (en) * 2009-10-07 2011-04-07 Iyun Leu Method for utilizing fabrication defect of an article
JP2016021004A (ja) * 2014-07-15 2016-02-04 株式会社ニューフレアテクノロジー マスク検査装置及びマスク検査方法
JP6241576B1 (ja) * 2016-12-06 2017-12-06 三菱電機株式会社 検査装置及び検査方法

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2021220336A1 (fr) * 2020-04-27 2021-11-04
WO2021220336A1 (fr) * 2020-04-27 2021-11-04 三菱電機株式会社 Dispositif et procédé d'inspection d'image
JP7101918B2 (ja) 2020-04-27 2022-07-15 三菱電機株式会社 画像検査装置および画像検査方法
WO2021220437A1 (fr) * 2020-04-28 2021-11-04 三菱電機株式会社 Dispositif d'inspection d'apparence et procédé d'inspection d'apparence
JPWO2021220437A1 (fr) * 2020-04-28 2021-11-04
JP7270842B2 (ja) 2020-04-28 2023-05-10 三菱電機株式会社 外観検査装置および外観検査方法
WO2021229901A1 (fr) * 2020-05-15 2021-11-18 オムロン株式会社 Dispositif d'inspection d'images, procédé d'inspection d'images et dispositif de génération de modèle pré-appris
JP2021179902A (ja) * 2020-05-15 2021-11-18 オムロン株式会社 画像検査装置、画像検査方法及び学習済みモデル生成装置
JP7505256B2 (ja) 2020-05-15 2024-06-25 オムロン株式会社 画像検査装置、画像検査方法及び学習済みモデル生成装置
WO2022153564A1 (fr) * 2021-01-14 2022-07-21 オムロン株式会社 Dispositif d'inspection de composant

Also Published As

Publication number Publication date
JP6693684B2 (ja) 2020-05-13
KR20200116532A (ko) 2020-10-12
DE112018007171T5 (de) 2021-01-28
JPWO2019186915A1 (ja) 2020-05-28
CN111902712B (zh) 2023-07-25
TW201942567A (zh) 2019-11-01
US11386549B2 (en) 2022-07-12
TWI719357B (zh) 2021-02-21
US20210012476A1 (en) 2021-01-14
KR102206698B1 (ko) 2021-01-22
CN111902712A (zh) 2020-11-06

Similar Documents

Publication Publication Date Title
WO2019186915A1 (fr) Dispositif d'inspection d'anomalies et procédé associé
JP2012032370A (ja) 欠陥検出方法、欠陥検出装置、学習方法、プログラム、及び記録媒体
Kıraç et al. VISOR: A fast image processing pipeline with scaling and translation invariance for test oracle automation of visual output systems
JP5242248B2 (ja) 欠陥検出装置、欠陥検出方法、欠陥検出プログラム、及び、記録媒体
JP2022038390A (ja) 推論装置、方法、プログラムおよび学習装置
US11416978B2 (en) Image processing apparatus, control method and non-transitory computer-readable recording medium therefor
JP2021165888A (ja) 情報処理装置、情報処理装置の情報処理方法およびプログラム
WO2021220437A1 (fr) Dispositif d'inspection d'apparence et procédé d'inspection d'apparence
JP2006258713A (ja) シミ欠陥検出方法及び装置
JP2014206528A (ja) パネル検査方法及びその装置
JP2008014842A (ja) シミ欠陥検出方法及び装置
KR20210056974A (ko) 멀티카메라를 이용한 물체 인식 방법 및 장치
US11039096B2 (en) Image processing device, image processing method and storage medium
WO2020080250A1 (fr) Dispositif, procédé et programme de traitement d'images
US20220414826A1 (en) Image processing apparatus, image processing method, and medium
US11830177B2 (en) Image processing apparatus, control method and non-transitory computer-readable recording medium therefor
JP2008171142A (ja) シミ欠陥検出方法及び装置
JP6948547B2 (ja) プログラム、情報処理システム、及び情報処理方法
US11508143B2 (en) Automated salience assessment of pixel anomalies
KR20090091578A (ko) 한 대의 카메라를 이용한 최소 오차를 갖는 레이저 빔의위치 검출 방법 및 장치
JP6161256B2 (ja) 画像分析支援装置及び方法
JP2020003837A (ja) 識別装置および識別方法
US20240078720A1 (en) Image processing apparatus, image processing method, and non-transitory computer readable storage medium
JP2023008416A (ja) 異常検知システムおよび異常検知方法
CN112906704A (zh) 用于跨域目标检测的方法和装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18911424

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020504740

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20207027495

Country of ref document: KR

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 18911424

Country of ref document: EP

Kind code of ref document: A1