WO2022254747A1 - Visual inspection device, visual inspection method, learning device, and inference device - Google Patents

Visual inspection device, visual inspection method, learning device, and inference device

Info

Publication number
WO2022254747A1
Authority
WO
WIPO (PCT)
Prior art keywords
inspected
image
filter
filter image
foreign matter
Prior art date
Application number
PCT/JP2021/043248
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
幸博 徳
悠一郎 森田
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to JP2023525357A (granted as JP7483135B2)
Publication of WO2022254747A1

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Definitions

  • The present disclosure relates to a visual inspection device, a visual inspection method, a learning device, and an inference device.
  • The inspection apparatus of Patent Document 1 irradiates the surface of the work W with A-color irradiation light LA1, B-color irradiation light LB1, and C-color irradiation light LC1.
  • The reflected light is split to produce an A-color image, a B-color image, and a C-color image simultaneously.
  • In the A-color image, the difference in brightness between the portion corresponding to a defect of the work W and its surroundings is smaller than in the other images.
  • The inspection apparatus of Patent Document 1 therefore compares two of these images and extracts only the defects of the work W as differences between them.
  • However, Patent Document 1 cannot determine whether the foreign matter to be detected is embedded in the surface of the object to be inspected or merely adheres to that surface.
  • an object of the present disclosure is to provide a visual inspection apparatus, a visual inspection method, a learning apparatus, and an inference apparatus that can detect only metallic foreign matter embedded in the surface of an object to be inspected.
  • A visual inspection apparatus of the present disclosure includes: a first camera that has an ND filter and photographs the surface of an object to be inspected to generate an ND filter image; a second camera that has a PL filter and photographs the surface of the object to generate a PL filter image; and a control device that inspects for the presence or absence of metallic foreign matter embedded in the surface of the object based on a difference image between the ND filter image and the PL filter image.
  • The appearance inspection method of the present disclosure comprises: a step in which a first camera having an ND filter photographs the surface of an object to be inspected to generate an ND filter image; a step in which a second camera having a PL filter photographs the surface of the object to generate a PL filter image; and a step of inspecting for the presence or absence of metallic foreign matter embedded in the surface of the object based on a difference image between the ND filter image and the PL filter image.
  • The learning device of the present disclosure includes a data acquisition unit that acquires learning data including a non-defective ND filter image obtained by photographing the surface of a non-defective product (an object to be inspected with no metallic foreign matter embedded in its surface), and a model generation unit that uses the learning data to generate a trained model for reconstructing a non-defective ND filter image from a non-defective ND filter image.
  • The inference device of the present disclosure includes a data acquisition unit that acquires an ND filter image of the object to be inspected obtained by photographing its surface, and an inference unit that, using a trained model that reconstructs a non-defective ND filter image from a non-defective ND filter image (where a non-defective product is an object with no metallic foreign matter embedded in its surface), reconstructs the ND filter image of the object to be inspected from the ND filter image acquired by the data acquisition unit.
  • The learning device of the present disclosure includes a data acquisition unit that acquires, as learning data, an ND filter image of the object to be inspected obtained by photographing its surface together with identification data representing whether or not a metallic foreign matter is embedded in that surface, and a model generation unit that uses the learning data to generate a trained model that identifies, from such an ND filter image, whether or not an embedded metallic foreign matter exists.
  • The inference device of the present disclosure includes a data acquisition unit that acquires an ND filter image of the object to be inspected obtained by photographing its surface, and an inference unit that, using a trained model for inferring from such an ND filter image whether or not a metallic foreign matter is embedded in the surface, infers from the ND filter image acquired by the data acquisition unit whether or not a metallic foreign matter exists in the surface of the object.
  • the appearance inspection apparatus of the present disclosure inspects for the presence or absence of metallic foreign matter embedded in the surface of the object to be inspected based on the difference image between the ND filter image and the PL filter image. Thereby, the visual inspection apparatus of the present disclosure can detect only metallic foreign matter embedded in the surface of the object to be inspected.
  • FIG. 1 is a diagram showing the configuration of a visual inspection apparatus according to Embodiment 1.
  • FIG. 2 is a flowchart showing the procedure of inspection processing by the inspection apparatus according to Embodiment 1.
  • FIG. 3 is a diagram for explaining difference processing between an ND filter image and a PL filter image according to Embodiment 1.
  • FIG. 4 is a diagram showing the configuration of a visual inspection apparatus according to Embodiment 2.
  • FIG. 5 is a diagram showing the structure of a neural network.
  • FIG. 6 is a flowchart of learning processing of the learning device 101 of Embodiment 2.
  • FIG. 7 is a flowchart of inference processing of the inference device 105 of Embodiment 2.
  • FIG. 8 is a diagram showing the configuration of a visual inspection apparatus according to Embodiment 3.
  • FIG. 9 is a flowchart of learning processing of the learning device 101A of Embodiment 3.
  • FIG. 10 is a flowchart of inference processing of the inference device 105A of Embodiment 3.
  • FIG. 11 is a diagram showing the configuration of a visual inspection apparatus according to Embodiment 4.
  • FIG. 12 is a flowchart showing the procedure of inspection processing by an inspection apparatus according to Embodiment 4.
  • FIG. 13 is a diagram for explaining an inspection range.
  • FIG. 14 is a diagram showing the hardware configuration of the control devices 20, 200, and 200A.
  • FIG. 1 is a diagram showing the configuration of the visual inspection apparatus according to Embodiment 1.
  • the appearance inspection device inspects foreign matter on the surface of the object 8 to be inspected.
  • The object 8 to be inspected is not particularly limited as long as light is specularly reflected at its surface. In specular reflection, the angle of incidence and the angle of reflection of the irradiated light are equal; diffuse light, by contrast, is reflected in all directions.
  • the object 8 to be inspected is, for example, a plastic molded product whose surface is coated with epoxy resin.
  • the surface (surface to be inspected) of the object to be inspected 8 may be flat or curved.
  • Foreign substances that can be detected by the visual inspection apparatus are metallic foreign matter and floating foreign matter.
  • A metallic foreign matter is a flat-shaped object embedded in the surface of the object 8 to be inspected.
  • A floating foreign matter is an object with irregularities adhering to the surface of the object 8 to be inspected.
  • Metallic contaminants include, for example, copper and aluminum.
  • Floating contaminants include, for example, dust and fiber contaminants.
  • The visual inspection apparatus includes an illumination unit 61, a beam splitter 4, a first camera 1, a second camera 51, a motor 6, a drive shaft 7, a jig 9, a power supply device 10, a polarizing plate 11, and a control device 20.
  • The illumination unit 61 adjusts the brightness when photographing the object 8 to be inspected.
  • The illumination unit 61 irradiates the object 8 to be inspected with irradiation light L1 having only a specific polarization angle component.
  • The illumination unit 61 includes the illumination device 5 and the polarizing plate 11.
  • The illumination device 5 emits light.
  • The polarizing plate 11 transmits only the light having a specific polarization angle component out of the light emitted from the illumination device 5.
  • Thus, the object 8 to be inspected is irradiated with the irradiation light L1 having only a specific polarization angle component.
  • When the irradiation light L1 is specularly reflected by the surface of the object 8, specularly reflected light LA1 is generated and is incident on the beam splitter 4.
  • the beam splitter 4 is a cubic beam splitter composed of two rectangular prisms.
  • the beam splitter 4 is arranged in a direction in which the irradiation light L1 is specularly reflected.
  • the beam splitter 4 splits the incident light into two lights with a defined splitting ratio.
  • The reflected light LA1 incident on the beam splitter 4 is split into reflected light LA2 and reflected light LA3.
  • The reflected light LA2 is incident on the first camera 1.
  • The reflected light LA3 is incident on the second camera 51.
  • the first camera 1 and the second camera 51 are monochrome cameras for photographing the surface of the inspected object 8.
  • the first camera 1 and the second camera 51 photograph the surface of the inspected object 8 simultaneously.
  • The first camera 1 photographs the surface of the inspected object 8 from the specular reflection direction of the irradiation light L1.
  • The first camera 1 receives the reflected light LA2 and photographs the object 8 to generate an image of the inspected object 8.
  • The first camera 1 includes a lens 2 and an ND (Neutral Density) filter 3 attached to the lens 2.
  • the ND filter 3 is also called a "neutral density filter", and reduces the amount of light by a certain amount over the entire predetermined wavelength band.
  • An image generated by light passing through the ND filter 3 is called an ND filter image.
  • Light passing through the ND filter 3 includes light reflected from flat metallic foreign matter embedded in the surface of the inspected object 8 and light reflected from floating foreign matter.
  • the lens 2 converges the light that has passed through the ND filter 3 and forms an image at one point.
  • the first camera 1 having the ND filter 3 photographs the surface of the inspected object 8 and generates an ND filter image.
  • the ND filtered image is produced by the light reflected from the embedded flat metallic foreign matter present on the surface of the inspected object 8 and the light reflected from the airborne foreign matter.
  • The second camera 51 photographs the surface of the inspected object 8 from a direction perpendicular to the specular reflection direction of the irradiation light L1.
  • The second camera 51 receives the reflected light LA3 and photographs the object 8 to generate an image of the inspected object 8.
  • The second camera 51 includes a lens 52 and a PL (Polarized Light) filter 53 attached to the lens 52.
  • the PL filter 53 is also called a "polarizing filter” and is a lens filter using a polarizing film.
  • a polarizing filter has a structure in which a polarizing film is sandwiched between two sheets of glass, and has a rotating frame structure for rotating the orientation of the polarizing film.
  • The polarizing film of the PL filter 53 is mounted so that its orientation is perpendicular to the polarization direction of the light reflected from a metallic foreign matter embedded in the surface of the object 8 to be inspected.
  • the lens 52 converges the light that has passed through the PL filter 53 and forms an image at one point.
  • An image generated by light passing through the PL filter 53 is called a PL filter image.
  • When light is specularly reflected by a flat metal surface, the polarization characteristics of the incident light are preserved in the reflected light. It is known that reflected light from dust and fiber contaminants, which have minute scattering particles, is unpolarized. Since the polarizing film of the PL filter 53 is perpendicular to the polarization direction of the light reflected from metallic foreign matter, a flat metallic foreign matter embedded in the surface of the inspected object 8 appears dark. On the other hand, a floating foreign matter with irregularities on the surface of the object 8 appears bright. The light transmitted through the PL filter 53 therefore includes only the light reflected from floating foreign matter. As described above, the second camera 51 having the PL filter 53 photographs the surface of the inspected object 8 and generates a PL filter image, which is produced only by the light reflected from floating foreign matter.
  • In this way, both the metallic foreign matter and the floating foreign matter appear bright in the ND filter image, while only the floating foreign matter appears bright in the PL filter image. Therefore, only the metallic foreign matter can be inspected using the difference image between the ND filter image and the PL filter image.
  • Since the ND filter image and the PL filter image can be generated simultaneously, the time required for inspection can be shortened compared to generating the images separately.
  • the motor 6 converts electrical energy into mechanical energy.
  • the motor 6 outputs rotational motion.
  • The drive shaft 7 transmits the rotational driving force of the motor 6 to the inspected object 8.
  • The jig 9 holds the object 8 to be inspected.
  • The power supply device 10 supplies a power supply voltage to the illumination device 5.
  • the control device 20 has an arithmetic function.
  • the control device 20 can be configured by, for example, a personal computer, a microcomputer board, or an FPGA (Field Programmable Gate Array) board.
  • The control device 20 includes a camera control unit 21, an inspection processing unit 22, an illumination control unit 23, and a motor control unit 24.
  • the camera control unit 21 controls the first camera 1 and the second camera 51.
  • The camera control unit 21 sends a trigger signal to the first camera 1 and the second camera 51 to cause them to photograph the inspected object 8.
  • The inspection processing unit 22 performs image conversion, deformation, feature extraction, and the like on the ND filter image generated by the first camera 1 and the PL filter image generated by the second camera 51.
  • The inspection processing unit 22 inspects for the presence or absence of metallic foreign matter embedded in the surface of the object 8 based on the difference image between the ND filter image and the PL filter image.
  • Specifically, the inspection processing unit 22 inspects for the presence or absence of embedded metallic foreign matter based on the area of each region of connected pixels whose values in the difference image are equal to or greater than the binarization threshold.
  • The lighting control unit 23 controls the illumination unit 61.
  • The lighting control unit 23 is connected to the illumination device 5 and the power supply device 10.
  • The lighting control unit 23 not only turns the illumination device 5 on and off but also controls the intensity of its light.
  • the motor control unit 24 controls the motor 6.
  • The motor control unit 24 transmits position and speed commands to the motor 6 and receives a positioning completion signal.
  • the motor control unit 24 controls the rotation angle and rotation speed of the motor 6 using pulse signals.
  • the motor 6 may be a common stepping motor or servomotor.
  • the motor control section 24 outputs a pulse signal.
  • One ON/OFF cycle of the pulse signal is defined as one pulse, and the drive shaft 7 rotates by one step angle each time one pulse is output.
  • FIG. 2 is a flowchart representing the procedure of inspection processing by the inspection apparatus according to Embodiment 1.
  • In step A1, the lighting control unit 23 turns on the illumination device 5.
  • In step A2, the camera control unit 21 sends imaging instructions to the first camera 1 and the second camera 51. As a result, an ND filter image and a PL filter image of the surface of the object 8 are generated.
  • In step A3, the inspection processing unit 22 generates a difference image between the ND filter image and the PL filter image by subtracting the PL filter image from the ND filter image. Specifically, the inspection processing unit 22 uses the value obtained by subtracting each pixel value of the PL filter image from the corresponding pixel value of the ND filter image as the pixel value of the difference image.
  • In step A4, the inspection processing unit 22 generates a binarized image by binarizing the pixel values of the difference image using a predetermined binarization threshold. If the value of a pixel in the difference image is equal to or greater than the binarization threshold, the inspection processing unit 22 sets that pixel to "1"; otherwise it sets the pixel to "0".
  • In step A5, the inspection processing unit 22 labels the binarized image.
  • Labeling is a process of grouping connected pixels with the value "1" into regions by assigning them the same label; it is a well-known technique in image processing.
  • In step A6, the inspection processing unit 22 measures the area of each labeled region of connected pixels. If there is a region whose area is equal to or greater than the threshold, the process proceeds to step A8; if there is no such region, the process proceeds to step A7.
  • In step A7, the inspection processing unit 22 determines that there is no metallic foreign matter embedded in the surface of the object 8 to be inspected.
  • In step A8, the inspection processing unit 22 determines that there is a metallic foreign matter embedded in the surface of the object 8 to be inspected.
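  • The difference, binarization, labeling, and area-measurement steps A3 to A8 above can be sketched in plain Python as follows. This is an illustrative sketch only: the function name `inspect`, the 4-connected labeling, and all threshold values are assumptions for the demo, not the implementation of this disclosure.

```python
from collections import deque

def inspect(nd_image, pl_image, bin_threshold, area_threshold):
    h, w = len(nd_image), len(nd_image[0])
    # Step A3: difference image (ND minus PL, clamped at 0).
    diff = [[max(nd_image[y][x] - pl_image[y][x], 0) for x in range(w)]
            for y in range(h)]
    # Step A4: binarize with the binarization threshold.
    binary = [[1 if diff[y][x] >= bin_threshold else 0 for x in range(w)]
              for y in range(h)]
    # Steps A5 and A6: label 4-connected regions of "1" pixels and measure areas.
    seen = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if binary[y][x] == 1 and not seen[y][x]:
                area, queue = 0, deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    area += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] == 1 and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                # Step A8: a large enough region means an embedded metallic foreign matter.
                if area >= area_threshold:
                    return True
    # Step A7: no region reached the area threshold.
    return False
```

  • A floating foreign matter appears bright in both images, so it cancels out in the difference and never contributes a labeled region; only a metallic foreign matter, bright in the ND filter image but dark in the PL filter image, can trigger the area check.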
  • FIG. 3 is a diagram for explaining difference processing between an ND filter image and a PL filter image according to Embodiment 1.
  • In the ND filter image, both the metallic foreign matter FW1 and the floating foreign matter FW2 are extracted.
  • In the PL filter image 301, only the floating foreign matter FW2 is extracted.
  • By subtracting the PL filter image 301 from the ND filter image, a difference image 302 is generated, in which only the metallic foreign matter FW1 is extracted.
  • FIG. 4 is a diagram showing the configuration of a visual inspection apparatus according to Embodiment 2.
  • the appearance inspection apparatus of the second embodiment includes a control device 200 that differs from the control device 20 of the first embodiment.
  • the control device 200 includes a learning device 101, a learned model storage unit 109, an inference device 105, and an inspection unit 108.
  • Learning device 101 includes data acquisition unit 102 and model generation unit 103 .
  • The data acquisition unit 102 acquires, as learning data, a non-defective ND filter image obtained by photographing the surface of an object to be inspected having no metallic foreign matter embedded in its surface (hereinafter, a non-defective product).
  • This ND filter image can be generated by the first camera 1 as in the first embodiment.
  • The model generation unit 103 uses the learning data to generate a trained model 104 that reconstructs a non-defective ND filter image from a non-defective ND filter image (non-defective product learning).
  • As the learning algorithm used by the model generation unit 103, a known algorithm such as supervised learning, unsupervised learning, or reinforcement learning can be used.
  • Here, a case where a neural network is applied will be described.
  • the model generation unit 103 performs non-defective product learning by so-called unsupervised learning, for example, according to the neural network model.
  • Here, unsupervised learning is a method in which only non-defective ND filter images, without paired result (label) data, are given to the learning device 101 so that it learns the features of those images, and a trained model that reconstructs a non-defective ND filter image from a non-defective ND filter image is generated.
  • In supervised learning, by contrast, sets of input and result (label) data are given to a learning device, which learns the features in the learning data and infers, for a given input, the result (label) with the highest score from among multiple candidates.
  • FIG. 5 is a diagram showing the configuration of a neural network.
  • a neural network is composed of an input layer consisting of a plurality of neurons, an intermediate layer (hidden layer) consisting of a plurality of neurons, and an output layer consisting of a plurality of neurons.
  • the intermediate layer may be one layer, or two or more layers.
  • The neural network learns the features of non-defective ND filter images by so-called unsupervised learning using the non-defective ND filter images acquired by the data acquisition unit 102. That is, a non-defective ND filter image is input to the input layer, and the weights W1 and W2 are adjusted so that the reconstructed image output from the output layer approaches the input image.
  • The learning in the model generation unit 103 may also be performed by deep learning, such as a convolutional neural network, which learns to extract the feature amounts themselves.
  • An autoencoder can be used as the learning algorithm in the model generation unit 103.
  • The autoencoder includes an encoder that extracts feature amounts from a non-defective ND filter image given as input data, and a decoder that reconstructs a non-defective ND filter image from those feature amounts.
  • the encoder consists of the input and hidden layers of the neural network, and the decoder consists of the hidden and output layers of the neural network.
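  • As a rough illustration of this encoder/decoder reconstruction learning (a minimal sketch under toy assumptions, not the model of this disclosure), the following trains a tiny linear encoder and decoder by gradient descent so that the reconstruction of one flattened "non-defective ND filter image" approaches the input; the dimensions, learning rate, and random data are arbitrary.

```python
import random

random.seed(0)
d, k = 8, 3                        # pixels per image, feature dimension (toy sizes)
x = [random.random() for _ in range(d)]          # stand-in for one good image
W1 = [[random.gauss(0, 0.5) for _ in range(d)] for _ in range(k)]  # encoder weights
W2 = [[random.gauss(0, 0.5) for _ in range(k)] for _ in range(d)]  # decoder weights

def reconstruct():
    z = [sum(W1[j][i] * x[i] for i in range(d)) for j in range(k)]   # features
    x_hat = [sum(W2[i][j] * z[j] for j in range(k)) for i in range(d)]
    return z, x_hat

def loss():
    _, x_hat = reconstruct()
    return sum((a - b) ** 2 for a, b in zip(x_hat, x)) / d

initial = loss()
lr = 0.05
for _ in range(2000):              # gradient descent on the reconstruction MSE
    z, x_hat = reconstruct()
    g = [2.0 * (x_hat[i] - x[i]) / d for i in range(d)]              # dL/dx_hat
    back = [sum(W2[i][j] * g[i] for i in range(d)) for j in range(k)]
    for i in range(d):             # decoder update: dL/dW2[i][j] = g[i] * z[j]
        for j in range(k):
            W2[i][j] -= lr * g[i] * z[j]
    for j in range(k):             # encoder update: dL/dW1[j][i] = back[j] * x[i]
        for i in range(d):
            W1[j][i] -= lr * back[j] * x[i]
final = loss()
```

  • After training, the reconstruction error for the learned non-defective image is far smaller than at initialization, which is exactly the property the inference stage exploits: images unlike the training data reconstruct poorly.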
  • machine learning may be performed according to other known methods such as genetic programming, functional logic programming, or support vector machines.
  • the model generation unit 103 generates and outputs a learned model by executing the above learning.
  • the learned model storage unit 109 stores the learned model 104 output from the model generation unit 103.
  • FIG. 6 is a flowchart relating to learning processing of the learning device 101 according to the second embodiment.
  • In step a1, an ND filter image of a non-defective product, obtained by photographing the surface of an object to be inspected having no metallic foreign matter embedded in its surface (hereinafter, a non-defective product), is acquired as learning data.
  • Instead of the non-defective ND filter image itself, an image representing the feature amounts of the object 8, obtained by applying image processing to the non-defective ND filter image, may be input as the non-defective image.
  • For this image processing, a known algorithm such as luminance binarization or normalization can be used.
  • In step a2, the model generation unit 103 uses the learning data acquired by the data acquisition unit 102 to generate, by so-called unsupervised learning, a trained model that reconstructs a non-defective ND filter image from a non-defective ND filter image.
  • In step a3, the learned model storage unit 109 stores the learned model 104 generated by the model generation unit 103.
  • the inference device 105 includes a data acquisition unit 106 and an inference unit 107.
  • the data acquisition unit 106 acquires an ND filter image of the object 8 to be inspected obtained by photographing the surface of the object 8 to be inspected.
  • This ND filter image can be generated by the first camera 1 as in the first embodiment.
  • The inference unit 107 uses the trained model 104, stored in the learned model storage unit 109, that reconstructs a non-defective ND filter image from a non-defective ND filter image, to reconstruct an ND filter image of the inspected object 8 from the ND filter image acquired by the data acquisition unit 106. That is, by inputting the ND filter image of the inspected object 8 acquired by the data acquisition unit 106 into this trained model, a reconstructed image of the ND filter image of the inspected object 8 can be generated.
  • If the inspected object 8 is a non-defective product, the reconstructed image generated by the inference unit 107 is close to the ND filter image of the object acquired by the data acquisition unit 106 and input to the autoencoder; the reconstruction is performed properly. If the inspected object 8 is not a non-defective product, the reconstructed image will not be close to the input ND filter image; the image is not properly reconstructed. This is because the trained model has been trained to reconstruct non-defective ND filter images from non-defective ND filter images.
  • The inspection unit 108 shown in FIG. 4 compares the reconstructed image of the object 8 output from the inference unit 107 with the ND filter image of the object 8 input to the inference unit 107, and inspects whether or not there is a metallic foreign matter embedded in the surface of the object 8.
  • For this comparison, a known algorithm such as the Z-score or pattern matching can be used.
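  • A minimal sketch of such a Z-score comparison, assuming flattened grayscale images (the function name and the threshold value are hypothetical, not the patent's algorithm): pixels whose reconstruction residual deviates strongly from the residual distribution indicate a region the good-product model could not reproduce.

```python
from statistics import mean, pstdev

def has_metal_foreign_matter(nd_image, reconstructed, z_threshold=3.0):
    # Per-pixel residual between the input ND filter image and its reconstruction.
    residuals = [a - b for a, b in zip(nd_image, reconstructed)]
    mu, sigma = mean(residuals), pstdev(residuals)
    if sigma == 0:                      # identical images: nothing anomalous
        return False
    # A pixel with an extreme Z-score was not reproduced by the good-product model.
    z_scores = [(r - mu) / sigma for r in residuals]
    return any(abs(z) > z_threshold for z in z_scores)
```

  • In practice the decision would also aggregate neighboring anomalous pixels, much like the labeling and area check of Embodiment 1; this sketch only shows the per-pixel Z-score idea.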
  • FIG. 7 is a flowchart of the inference processing of the inference device 105 according to the second embodiment.
  • the data acquisition unit 106 acquires an ND filter image of the object 8 to be inspected.
  • the inference unit 107 inputs the ND filter image of the inspection object 8 acquired by the data acquisition unit 106 to the learned model 104 stored in the learned model storage unit 109 to obtain a reconstructed image.
  • the inference unit 107 outputs the obtained reconstructed image to the inspection unit 108.
  • the inspection unit 108 compares the ND filter image of the inspection object 8 acquired by the data acquisition unit 106 with the reconstructed image.
  • If the inspection unit 108 determines, by pattern matching or the like, that the two images are similar, it can determine that there is no metallic foreign matter embedded in the surface of the object 8 to be inspected.
  • If the inspection unit 108 determines, by pattern matching or the like, that the two images are not similar, it can determine that there is a metallic foreign matter embedded in the surface of the object 8 to be inspected.
  • Although the case where unsupervised learning is applied to the learning algorithm used by the model generation unit 103 has been described, the present disclosure is not limited to this.
  • Reinforcement learning, supervised learning, or semi-supervised learning can also be applied as the learning algorithm.
  • The model generation unit 103 may also perform non-defective product learning using, as learning data, ND filter images of non-defective inspected objects 8 of a plurality of resin molded products having similar shapes. A non-defective inspected object 8 of a resin molded product from which learning data is collected may be added to the target models midway, or removed from them. Furthermore, a trained model obtained by performing non-defective product learning with ND filter images of a non-defective inspected object 8 of one resin molded product may be trained again with ND filter images of a non-defective inspected object 8 of another resin molded product.
  • In this way, the quality of the inspected object 8, that is, the presence or absence of metallic foreign matter embedded in its surface, can be determined.
  • FIG. 8 is a diagram showing the configuration of a visual inspection apparatus according to Embodiment 3.
  • a visual inspection apparatus includes a control device 200A that is different from the control device 200 according to the second embodiment.
  • the control device 200A includes a learning device 101A, an inference device 105A, a learned model storage unit 109A, and a display unit 120 instead of the learning device 101, the inference device 105, the learned model storage unit 109, and the inspection unit 108.
  • the learning device 101A includes a data acquisition unit 102A and a model generation unit 103A.
  • The data acquisition unit 102A acquires, as learning data, an ND filter image of the object 8 to be inspected obtained by photographing its surface, together with identification data representing whether or not there is a metallic foreign matter embedded in the surface of the object 8.
  • The model generation unit 103A uses the learning data to generate a trained model 104A that identifies, from an ND filter image of the inspected object 8 obtained by photographing its surface, whether or not there is a metallic foreign matter embedded in the surface.
  • a known algorithm such as supervised learning, unsupervised learning, or reinforcement learning can be used as the learning algorithm used by the model generation unit 103A.
  • a case where a neural network is applied will be described.
  • the model generation unit 103A learns whether or not there is a metallic foreign substance embedded in the surface of the inspection object 8 by so-called supervised learning according to, for example, a neural network model.
  • supervised learning refers to a technique in which input and result (label) data sets are given to the learning device 101A to learn features in the learning data, and the result is inferred from the input.
  • the neural network learns, by so-called supervised learning, whether or not a metallic foreign substance is embedded in the surface of the object 8 to be inspected, using learning data that pairs the ND filter image of the inspected object 8 acquired by the data acquisition unit 102A with identification data representing whether or not such a foreign substance is embedded in the surface.
  • the neural network learns by adjusting the weights W1 and W2 so that the ND filter image is input to the input layer and the result output from the output layer approaches the identification data (correct answer).
  • the model generation unit 103A generates and outputs the learned model 104A by executing the above learning.
  • the learned model storage unit 109A stores the learned model 104A output from the model generation unit 103A.
  • FIG. 9 is a flowchart relating to learning processing of the learning device 101A according to the third embodiment.
  • in step c1, the data acquisition unit 102A acquires learning data including combinations of an ND filter image obtained by photographing the object 8 to be inspected and identification data (correct answers) indicating whether or not a metallic foreign matter is embedded in the surface of the object 8 to be inspected.
  • in step c2, the model generation unit 103A performs so-called supervised learning based on the learning data acquired by the data acquisition unit 102A, and generates a learned model 104A that identifies, from the ND filter image of the inspection object 8, whether or not a metallic foreign matter is embedded in its surface.
  • in step c3, the learned model storage unit 109A stores the learned model 104A generated by the model generation unit 103A.
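The learning flow of steps c1 to c3 can be illustrated with a minimal sketch. The tiny two-layer network below (weights W1 and W2, mirroring the description above) is trained so that its output for an ND filter image approaches the identification data (correct answer). Every name, size, and the toy image data here are illustrative assumptions, not taken from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(images, labels, hidden=8, lr=1.0, epochs=3000):
    """images: (N, P) flattened ND filter images in [0, 1]; labels: (N,) 0/1."""
    n, p = images.shape
    w1 = rng.normal(0.0, 0.5, (p, hidden))   # input -> hidden weights (W1)
    w2 = rng.normal(0.0, 0.5, (hidden, 1))   # hidden -> output weights (W2)
    y = labels.reshape(-1, 1).astype(float)
    for _ in range(epochs):
        h = sigmoid(images @ w1)             # hidden activations
        out = sigmoid(h @ w2)                # predicted identification data
        # adjust W1 and W2 so the output approaches the correct answer
        d_out = (out - y) / n                # cross-entropy gradient at output
        d_h = (d_out @ w2.T) * h * (1.0 - h)
        w2 -= lr * (h.T @ d_out)
        w1 -= lr * (images.T @ d_h)
    return w1, w2                            # the "learned model"

def infer(w1, w2, image):
    """1 = metallic foreign matter embedded, 0 = none."""
    return int(sigmoid(sigmoid(image @ w1) @ w2)[0] > 0.5)

# Toy learning data: "foreign matter" images have one bright pixel.
clean = rng.uniform(0.0, 0.2, (20, 9))
defect = clean.copy()
defect[:, 4] = 1.0
x = np.vstack([clean, defect])
y = np.array([0] * 20 + [1] * 20)
w1, w2 = train(x, y)
```

After training, `infer` plays the role of querying the stored model with a new ND filter image.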
  • the inference device 105A in FIG. 8 includes a data acquisition unit 106A and an inference unit 107A.
  • the data acquisition unit 106A acquires an ND filter image of the object 8 to be inspected obtained by photographing the surface of the object 8 to be inspected.
  • the inference unit 107A uses the learned model 104A stored in the learned model storage unit 109A to infer, from the ND filter image of the inspection object 8 acquired by the data acquisition unit 106A, whether or not a metallic foreign matter is embedded in the surface of the inspection object 8. That is, by inputting the ND filter image acquired by the data acquisition unit 106A into the learned model, identification data identifying whether or not a metallic foreign matter is embedded in the surface of the object 8 to be inspected can be output.
  • FIG. 10 is a flowchart of the inference processing of the inference device 105A of the third embodiment.
  • in step d1, the data acquisition unit 106A acquires an ND filter image obtained by photographing the surface of the object 8 to be inspected.
  • in step d2, the inference unit 107A inputs the ND filter image acquired by the data acquisition unit 106A into the learned model 104A stored in the learned model storage unit 109A, and obtains identification data identifying whether or not a metallic foreign matter is embedded in the surface of the object 8 to be inspected.
  • the inference unit 107A outputs the obtained identification data to the display unit 120.
  • the display unit 120 can display information indicating whether or not there is a metal foreign substance embedded in the surface of the object 8 to be inspected.
  • in this way, the quality of the object 8 to be inspected, that is, the presence or absence of metallic foreign matter embedded in its surface, can be determined simply by inputting the ND filter image of the object into the inference device 105A.
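As a rough illustration of the inference flow in steps d1 and d2, the sketch below treats the learned model as an opaque callable that maps an ND filter image to identification data. The function names and the trivial brightness rule standing in for the learned model 104A are assumptions for illustration only, not the patent's implementation.

```python
def acquire_nd_filter_image():
    """Stand-in for the data acquisition unit 106A (step d1)."""
    # A bright spot models specular reflection from embedded metal.
    return [[10, 10, 10], [10, 250, 10], [10, 10, 10]]

def learned_model(image):
    """Stand-in for learned model 104A: here, a trivial brightness rule."""
    return any(pixel > 200 for row in image for pixel in row)

def run_inference():
    image = acquire_nd_filter_image()            # step d1
    has_foreign_matter = learned_model(image)    # step d2
    # The identification data would then be sent to display unit 120.
    return ("metallic foreign matter embedded" if has_foreign_matter
            else "no foreign matter")
```

Calling `run_inference()` on the toy image above reports the embedded foreign matter.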
  • FIG. 11 is a diagram showing the configuration of a visual inspection apparatus according to Embodiment 4.
  • the visual inspection apparatus of the fourth embodiment differs from that of the first embodiment in that it includes an illumination section 61A instead of the illumination section 61.
  • the lighting unit 61A adjusts the brightness when photographing the object 8 to be inspected.
  • the illumination unit 61A irradiates the inspection object 8 with illumination light L1 and illumination light L2 having only specific polarization angle components.
  • the lighting section 61A includes the lighting device 5 and the polarizing plate 11 in the same manner as the lighting section 61.
  • the illumination section 61A further includes an illumination device 54 and a polarizing plate 55.
  • the illumination device 54 emits light.
  • the polarizing plate 55 passes only the irradiation light L2 having a specific polarization angle component out of the light emitted from the illumination device 54. The irradiation light L2 having only the specific polarization angle component is projected onto the object 8 to be inspected.
  • diffusely reflected light LB1 is generated at the object 8 to be inspected and is incident on the beam splitter 4.
  • the beam splitter 4 is a cubic beam splitter composed of two rectangular prisms.
  • the beam splitter 4 is arranged in the direction in which the reflected light LB1 is incident.
  • the beam splitter 4 splits the incident light into two beams at a defined splitting ratio.
  • Reflected light LB1 incident on the beam splitter 4 is split into reflected light LB2 and reflected light LB3.
  • Reflected light LB2 is incident on the first camera 1.
  • Reflected light LB3 is incident on the second camera 51.
  • the first camera 1 is arranged at a position different from that of the first embodiment.
  • the first camera 1 photographs the surface of the object 8 to be inspected from the normal direction of the object 8 to be inspected.
  • the positional relationship between the first camera 1 and the second camera 51 is the same as in the first embodiment. Therefore, due to the movement of the first camera 1, the second camera 51 is also arranged at an offset position.
  • FIG. 12 is a flow chart showing the procedure of inspection processing by the inspection apparatus according to the fourth embodiment.
  • processing for dynamically setting the inspection range is added to the content shown in FIG. 2 of the first embodiment.
  • in step B1, the lighting control unit 23 turns on the lighting device 5 and the lighting device 54.
  • in step B2, the camera control unit 21 sends imaging instructions to the first camera 1 and the second camera 51.
  • as a result, an ND filter image and a PL filter image of the surface of the object 8 to be inspected are generated.
  • FIG. 13 is a diagram for explaining the inspection range.
  • the ND filter image 400 includes a captured image of the inspected object 401.
  • on the object 401, there are ranges in which the light emitted from the illumination device 5 and the illumination device 54 is specularly reflected; rectangles 402 and 403 indicate these specular reflection ranges.
  • the illuminances of the illumination devices 5 and 54 are set in advance so that the specular reflection ranges reach the first camera 1 at the luminance value of 255 (maximum value).
  • in step B3, the inspection processing unit 22 detects the ranges where the luminance value is 255, that is, the rectangles 402 and 403, and sets the gap between the rectangles 402 and 403 as the inspection range. If a metallic foreign matter were embedded in a specular reflection range, its luminance value would also be 255 and it could not be detected; excluding the specular reflection ranges from the inspection range avoids this problem.
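One way step B3 could be realized is sketched below: positions saturated at luminance 255 mark the specular rectangles, and the widest gap between them becomes the inspection range. This is an assumed implementation for a single scan line, not the patent's; the function name and return convention are illustrative.

```python
def inspection_range(row):
    """row: one scan line of the ND filter image (list of luminance values).
    Returns (start, end) column indices of the widest gap between
    saturated (255) positions, i.e. the dynamically set inspection range."""
    saturated = [i for i, v in enumerate(row) if v == 255]
    if not saturated:
        return (0, len(row) - 1)  # no specular reflection: inspect everything
    best = (0, 0)
    # Scan consecutive saturated columns and keep the widest gap between them.
    for a, b in zip(saturated, saturated[1:]):
        if b - a > best[1] - best[0]:
            best = (a + 1, b - 1)
    return best

row = [255, 255, 40, 60, 80, 50, 255, 255]  # two specular runs at the edges
print(inspection_range(row))
```

Here the two saturated runs play the role of rectangles 402 and 403, and the returned span is the gap between them.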
  • in step B4, the inspection processing unit 22 generates a difference image by subtracting the PL filter image from the ND filter image. The light transmitted through the ND filter 3 includes the specularly and diffusely reflected light from the metallic foreign matter and the diffusely reflected light from floating foreign matter, so the metallic foreign matter and the floating foreign matter are brightly emphasized in the ND filter image. The metallic foreign matter and floating foreign matter also appear bright in the PL filter image; however, since the light transmitted through the PL filter 53 includes only the diffusely reflected light from the metallic foreign matter, the metallic foreign matter is darker in the PL filter image than in the ND filter image.
  • the inspection processing unit 22 uses a value obtained by subtracting the pixel value of the PL filter image from the pixel value of the ND filter image as the pixel value of the difference image.
  • in step B5, the inspection processing unit 22 generates a binarized image by binarizing the pixel values of the difference image using a predetermined binarization threshold: if a pixel value of the difference image is equal to or greater than the threshold, the inspection processing unit 22 sets that pixel to "1"; otherwise, it sets the pixel to "0".
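Steps B4 and B5 can be sketched as a pixel-level operation (an assumed implementation; the array layout and threshold value are illustrative). Embedded metal reflects both specularly and diffusely, so it stays bright in the ND image but dark in the PL image, and therefore survives the subtraction while ordinary surface texture cancels out.

```python
def difference_image(nd, pl):
    """Step B4: subtract the PL filter image from the ND filter image,
    clamping at zero so pixel values stay non-negative."""
    return [[max(n - p, 0) for n, p in zip(nd_row, pl_row)]
            for nd_row, pl_row in zip(nd, pl)]

def binarize(img, threshold):
    """Step B5: pixel -> 1 if >= threshold, else 0."""
    return [[1 if v >= threshold else 0 for v in row] for row in img]

nd = [[200, 30], [180, 25]]   # metal pixels bright (specular + diffuse)
pl = [[60, 28], [55, 24]]     # metal pixels darker (diffuse only)
diff = difference_image(nd, pl)
binary = binarize(diff, 100)
print(binary)
```

In this toy example the left column (the "metal" pixels) remains after subtraction and binarization, while the background cancels.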
  • in step B6, the inspection processing unit 22 labels the binarized image.
  • labeling is a process of grouping connected pixels into regions by assigning the same label to connected pixels with value "1", and is a well-known technique in image processing.
  • in step B7, the inspection processing unit 22 measures the area of each region formed by connected pixels labeled "1". If any region has an area equal to or greater than the threshold, the process proceeds to step B9; if no region has an area equal to or greater than the threshold, the process proceeds to step B8.
  • in step B8, the inspection processing unit 22 determines that no metallic foreign matter is embedded in the surface of the object 8 to be inspected.
  • in step B9, the inspection processing unit 22 determines that a metallic foreign matter is embedded in the surface of the object 8 to be inspected.
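Steps B6 through B9 can be sketched with a simple 4-connected labeling pass over the binarized image followed by the area judgment. The breadth-first flood fill and the data layout below are assumptions for illustration; the patent only specifies labeling and an area threshold, not a particular algorithm.

```python
from collections import deque

def label_regions(binary):
    """Step B6/B7: label 4-connected regions of "1" pixels and return
    a dict mapping each region label to its area in pixels."""
    h, w = len(binary), len(binary[0])
    labels = [[0] * w for _ in range(h)]
    areas, next_label = {}, 1
    for y in range(h):
        for x in range(w):
            if binary[y][x] == 1 and labels[y][x] == 0:
                # Breadth-first flood fill of one connected region.
                queue, area = deque([(y, x)]), 0
                labels[y][x] = next_label
                while queue:
                    cy, cx = queue.popleft()
                    area += 1
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny][nx] == 1
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = next_label
                            queue.append((ny, nx))
                areas[next_label] = area
                next_label += 1
    return areas

def has_embedded_metal(binary, area_threshold):
    """Steps B8/B9: foreign matter present iff some region is big enough."""
    return any(a >= area_threshold for a in label_regions(binary).values())

binary = [[0, 1, 1, 0],
          [0, 1, 1, 0],
          [0, 0, 0, 1]]  # one 4-pixel region and one 1-pixel speck
print(has_embedded_metal(binary, 3))
```

The area threshold filters out single-pixel noise (the speck) while the larger region triggers the step B9 judgment.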
  • the illumination device 5 and the illumination device 54 are bar-shaped illumination devices, but they may be ring-shaped illumination devices.
  • two illumination devices 5 and 54 are installed in order to widen the inspection range, but it is also possible to install only one bar-shaped illumination device.
  • FIG. 14 is a diagram showing the hardware configuration of control devices 20, 200 and 200A.
  • the operations of the control devices 20, 200, and 200A can be implemented by dedicated digital circuit hardware or by software.
  • when implemented by software, the control devices 20, 200, and 200A include, for example, a processor 1001 and a memory 1002 as shown in FIG. 14, and the processor 1001 can execute a program stored in the memory 1002.

PCT/JP2021/043248 2021-06-03 2021-11-25 外観検査装置、外観検査方法、学習装置および推論装置 WO2022254747A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023525357A JP7483135B2 (ja) 2021-06-03 2021-11-25 外観検査装置、および外観検査方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021093691 2021-06-03
JP2021-093691 2021-06-03

Publications (1)

Publication Number Publication Date
WO2022254747A1 true WO2022254747A1 (ja) 2022-12-08

Family

ID=84324051

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/043248 WO2022254747A1 (ja) 2021-06-03 2021-11-25 外観検査装置、外観検査方法、学習装置および推論装置

Country Status (2)

Country Link
JP (1) JP7483135B2 (zh)
WO (1) WO2022254747A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10221268A (ja) * 1997-02-05 1998-08-21 Advantest Corp ウェーハの表面状態検出方法および装置
JP2001013261A (ja) * 1999-06-30 2001-01-19 Mitsubishi Heavy Ind Ltd 異物検出方法及びその装置
JP2016105052A (ja) * 2014-12-01 2016-06-09 東レエンジニアリング株式会社 基板検査装置
JP2019060780A (ja) * 2017-09-27 2019-04-18 ファナック株式会社 検査装置及び検査システム
WO2020031984A1 (ja) * 2018-08-08 2020-02-13 Blue Tag株式会社 部品の検査方法及び検査システム
JP2020193890A (ja) * 2019-05-29 2020-12-03 ヴィスコ・テクノロジーズ株式会社 外観検査装置


Also Published As

Publication number Publication date
JPWO2022254747A1 (zh) 2022-12-08
JP7483135B2 (ja) 2024-05-14


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21944239

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023525357

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21944239

Country of ref document: EP

Kind code of ref document: A1