WO2022254747A1 - Appearance inspection device, appearance inspection method, learning device, and inference device - Google Patents


Info

Publication number: WO2022254747A1
Authority: WO (WIPO (PCT))
Application number: PCT/JP2021/043248
Other languages: French (fr), Japanese (ja)
Prior art keywords: inspected, image, filter, filter image, foreign matter
Inventors: 幸博 徳, 悠一郎 森田
Original assignee: Mitsubishi Electric Corporation (三菱電機株式会社)
Application filed by Mitsubishi Electric Corporation
Priority to JP2023525357A (JP7483135B2)
Publication of WO2022254747A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84: Systems specially adapted for particular applications
    • G01N 21/88: Investigating the presence of flaws or contamination
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis

Description

  • The present disclosure relates to an appearance inspection device, an appearance inspection method, a learning device, and an inference device.
  • The inspection apparatus of Patent Literature 1 irradiates the surface of the workpiece W with A-color irradiation light LA1, B-color irradiation light LB1, and C-color irradiation light LC1.
  • The reflected light is split so as to produce an A-color image, a B-color image, and a C-color image simultaneously.
  • In the A-color image, the difference in brightness between the portion corresponding to a defect of the workpiece W and its surroundings is small.
  • The inspection apparatus of Patent Literature 1 compares two of these images and extracts only the defects of the workpiece W as differences between the two images.
  • However, Patent Literature 1 cannot determine whether the foreign matter to be detected is embedded in the surface of the object to be inspected or merely adheres to that surface.
  • Therefore, an object of the present disclosure is to provide a visual inspection apparatus, a visual inspection method, a learning apparatus, and an inference apparatus that can detect only metallic foreign matter embedded in the surface of an object to be inspected.
  • A visual inspection apparatus of the present disclosure includes: a first camera that has an ND filter and photographs the surface of an object to be inspected to generate an ND filter image; a second camera that has a PL filter and photographs the surface of the object to be inspected to generate a PL filter image; and a control device that inspects the presence or absence of metallic foreign matter embedded in the surface of the object to be inspected based on a difference image between the ND filter image and the PL filter image.
  • The appearance inspection method of the present disclosure comprises the steps of: photographing, by a first camera having an ND filter, the surface of an object to be inspected to generate an ND filter image; photographing, by a second camera having a PL filter, the surface of the object to be inspected to generate a PL filter image; and inspecting the presence or absence of metallic foreign matter embedded in the surface of the object to be inspected based on a difference image between the ND filter image and the PL filter image.
  • The learning device of the present disclosure includes: a data acquisition unit that, regarding an object to be inspected having no metallic foreign matter embedded in its surface as a non-defective product, acquires learning data including a non-defective ND filter image obtained by photographing the surface of the non-defective product; and a model generation unit that uses the learning data to generate a trained model for reconstructing a non-defective ND filter image from a non-defective ND filter image.
  • The inference device of the present disclosure includes: a data acquisition unit that acquires an ND filter image of the object to be inspected obtained by photographing its surface; and an inference unit that, regarding an object to be inspected having no metallic foreign matter embedded in its surface as a non-defective product, uses a trained model that reconstructs a non-defective ND filter image from a non-defective ND filter image obtained by photographing the surface of a non-defective product, to reconstruct an ND filter image of the object to be inspected from the ND filter image acquired by the data acquisition unit.
  • Another learning device of the present disclosure includes: a data acquisition unit that acquires learning data including an ND filter image of the object to be inspected obtained by photographing its surface and identification data representing whether or not a metallic foreign matter embedded in the surface of the object to be inspected exists; and a model generation unit that uses the learning data to generate a trained model that identifies, from an ND filter image of the object to be inspected, whether or not a metallic foreign matter embedded in the surface exists.
  • Another inference device of the present disclosure includes: a data acquisition unit that acquires an ND filter image of the object to be inspected obtained by imaging its surface; and an inference unit that uses a trained model for inferring, from an ND filter image of the object to be inspected, whether or not a metallic foreign matter embedded in the surface exists, to infer, from the ND filter image acquired by the data acquisition unit, whether or not a metallic foreign matter exists in the surface of the object to be inspected.
  • The appearance inspection apparatus of the present disclosure inspects the presence or absence of metallic foreign matter embedded in the surface of the object to be inspected based on the difference image between the ND filter image and the PL filter image. Thereby, the visual inspection apparatus of the present disclosure can detect only metallic foreign matter embedded in the surface of the object to be inspected.
  • FIG. 1 is a diagram showing the configuration of an appearance inspection apparatus according to Embodiment 1.
  • FIG. 2 is a flowchart showing the procedure of inspection processing by the inspection apparatus according to Embodiment 1.
  • FIG. 3 is a diagram for explaining difference processing between an ND filter image and a PL filter image according to Embodiment 1.
  • FIG. 4 is a diagram showing the configuration of a visual inspection apparatus according to Embodiment 2.
  • FIG. 5 is a diagram showing the structure of a neural network.
  • FIG. 6 is a flowchart of learning processing of the learning device 101 of Embodiment 2.
  • FIG. 7 is a flowchart of inference processing of the inference device 105 of Embodiment 2.
  • FIG. 8 is a diagram showing the configuration of a visual inspection apparatus according to Embodiment 3.
  • FIG. 9 is a flowchart of learning processing of the learning device 101A of Embodiment 3.
  • FIG. 10 is a flowchart of inference processing of the inference device 105A of Embodiment 3.
  • FIG. 11 is a diagram showing the configuration of a visual inspection apparatus according to Embodiment 4.
  • FIG. 12 is a flowchart showing the procedure of inspection processing by an inspection apparatus according to Embodiment 4.
  • FIG. 13 is a diagram for explaining an inspection range.
  • FIG. 14 is a diagram showing a hardware configuration of the control devices 20, 200, and 200A.
  • FIG. 1 is a diagram showing the configuration of the visual inspection apparatus according to Embodiment 1.
  • the appearance inspection device inspects foreign matter on the surface of the object 8 to be inspected.
  • The object 8 to be inspected is not particularly limited as long as light is specularly reflected on its surface. In specular reflection, the angle of incidence and the angle of reflection of the irradiated light are equal; diffusely reflected light, by contrast, scatters in all directions.
  • the object 8 to be inspected is, for example, a plastic molded product whose surface is coated with epoxy resin.
  • the surface (surface to be inspected) of the object to be inspected 8 may be flat or curved.
  • Foreign substances that can be detected by the visual inspection apparatus are metallic foreign matter and floating foreign matter.
  • The metallic foreign matter is a flat-shaped object embedded in the surface of the object 8 to be inspected.
  • A floating foreign matter is an object having irregularities that adheres to the surface of the object 8 to be inspected.
  • Examples of metallic foreign matter include copper and aluminum.
  • Examples of floating foreign matter include dust and fiber contaminants.
  • The visual inspection apparatus includes an illumination unit 61, a beam splitter 4, a first camera 1, a second camera 51, a motor 6, a drive shaft 7, a jig 9, a power supply device 10, a polarizing plate 11, and a control device 20.
  • the lighting unit 61 adjusts the brightness when photographing the object 8 to be inspected.
  • the illumination unit 61 irradiates the object 8 to be inspected with illumination light L1 having only a specific polarization angle component.
  • the lighting section 61 includes the lighting device 5 and the polarizing plate 11 .
  • the illumination device 5 emits light.
  • The polarizing plate 11 transmits only the light having a specific polarization angle component out of the light emitted from the illumination device 5.
  • Thus, the inspected object 8 is irradiated with the irradiation light L1 having only a specific polarization angle component.
  • When the irradiation light L1 is specularly reflected by the surface of the inspected object 8, specularly reflected light LA1 is generated and enters the beam splitter 4.
  • the beam splitter 4 is a cubic beam splitter composed of two rectangular prisms.
  • the beam splitter 4 is arranged in a direction in which the irradiation light L1 is specularly reflected.
  • the beam splitter 4 splits the incident light into two lights with a defined splitting ratio.
  • Reflected light LA1 incident on beam splitter 4 is split into reflected light LA2 and reflected light LA3.
  • Reflected light LA2 is incident on first camera 1 .
  • Reflected light LA3 is incident on second camera 51 .
  • the first camera 1 and the second camera 51 are monochrome cameras for photographing the surface of the inspected object 8.
  • the first camera 1 and the second camera 51 photograph the surface of the inspected object 8 simultaneously.
  • the first camera 1 photographs the surface of the inspected object 8 from the specular reflection direction of the irradiation light L1.
  • the first camera 1 receives the reflected light LA2 and photographs the inspected object 8 to generate an image of the inspected object 8 .
  • The first camera 1 includes a lens 2 and an ND (Neutral Density) filter 3 attached to the lens 2.
  • the ND filter 3 is also called a "neutral density filter", and reduces the amount of light by a certain amount over the entire predetermined wavelength band.
  • An image generated by light passing through the ND filter 3 is called an ND filter image.
  • Light passing through the ND filter 3 includes light reflected from flat metallic foreign matter embedded in the surface of the inspected object 8 and light reflected from floating foreign matter.
  • the lens 2 converges the light that has passed through the ND filter 3 and forms an image at one point.
  • the first camera 1 having the ND filter 3 photographs the surface of the inspected object 8 and generates an ND filter image.
  • The ND filter image is produced both by the light reflected from the flat metallic foreign matter embedded in the surface of the inspected object 8 and by the light reflected from the floating foreign matter.
  • the second camera 51 photographs the surface of the inspected object 8 from a direction perpendicular to the specular reflection direction of the irradiation light L1.
  • the second camera 51 receives the reflected light LA3 and photographs the inspected object 8 to generate an image of the inspected object 8 .
  • The second camera 51 includes a lens 52 and a PL (Polarized Light) filter 53 attached to the lens 52.
  • the PL filter 53 is also called a "polarizing filter” and is a lens filter using a polarizing film.
  • a polarizing filter has a structure in which a polarizing film is sandwiched between two sheets of glass, and has a rotating frame structure for rotating the orientation of the polarizing film.
  • the polarization film of the PL filter 53 is mounted so that the direction of the polarization film is perpendicular to the polarization direction of the reflected light from the metal foreign matter embedded in the surface of the object 8 to be inspected.
  • the lens 52 converges the light that has passed through the PL filter 53 and forms an image at one point.
  • An image generated by light passing through the PL filter 53 is called a PL filter image.
  • In specular reflection from a flat metal surface, the polarization characteristics of the incident light are preserved in the reflected light. By contrast, reflected light from dust and fiber contaminants, which consist of minute scattering particles, is known to be unpolarized. Since the polarization direction of the light condensed by the lens 52 is perpendicular to the polarization of the light reflected from metallic foreign matter, a flat metallic foreign matter embedded in the surface of the inspected object 8 appears dark, whereas a floating foreign matter with irregularities on the surface of the object 8 appears bright. The light transmitted through the PL filter 53 therefore includes only light reflected from floating foreign matter. As described above, the second camera 51 having the PL filter 53 photographs the surface of the inspected object 8 and generates a PL filter image, which is produced only by the light reflected from the floating foreign matter.
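The brightness contrast described above follows from Malus's law: an ideal linear polarizer transmits a fraction cos²θ of polarized light whose polarization makes angle θ with the transmission axis, and about half of unpolarized light. The sketch below is an illustrative calculation under those idealized assumptions, not part of the disclosure; the intensity values are arbitrary.

```python
import math

def transmitted_intensity(i0, theta_deg=None):
    """Intensity after an ideal linear polarizer (Malus's law).

    theta_deg: angle between the light's polarization direction and the
    filter's transmission axis; None means the light is unpolarized, in
    which case an ideal polarizer passes half the intensity.
    """
    if theta_deg is None:          # unpolarized (dust, fiber contaminants)
        return i0 / 2.0
    return i0 * math.cos(math.radians(theta_deg)) ** 2

# Reflection from embedded flat metal stays polarized; the PL filter's
# axis is mounted perpendicular (90 deg) to it, so the metal goes dark.
metal = transmitted_intensity(100.0, theta_deg=90.0)
# Reflection from floating foreign matter is unpolarized, so about half
# of it still passes, and the floating matter appears bright.
floating = transmitted_intensity(100.0, theta_deg=None)
print(metal, floating)   # metal is (numerically) ~0, floating = 50.0
```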
  • As described above, both the metallic foreign matter and the floating foreign matter are brightly emphasized in the ND filter image, whereas only the floating foreign matter is brightly emphasized in the PL filter image. Therefore, only metallic foreign matter can be inspected from the difference image between the ND filter image and the PL filter image.
  • the ND filter image and the PL filter image can be generated simultaneously, so the time required for inspection can be shortened compared to the case where the images are generated separately.
  • the motor 6 converts electrical energy into mechanical energy.
  • the motor 6 outputs rotational motion.
  • the drive shaft 7 transmits the rotational driving force of the motor 6 to the inspected object 8 .
  • the jig 9 mounts the object 8 to be inspected.
  • the power supply device 10 supplies power supply voltage to the lighting device 5 .
  • the control device 20 has an arithmetic function.
  • the control device 20 can be configured by, for example, a personal computer, a microcomputer board, or an FPGA (Field Programmable Gate Array) board.
  • the control device 20 includes a camera control section 21 , an inspection processing section 22 , an illumination control section 23 and a motor control section 24 .
  • the camera control unit 21 controls the first camera 1 and the second camera 51.
  • the camera control unit 21 sends a trigger signal to the first camera 1 and the second camera 51 to cause the first camera 1 and the second camera 51 to image the inspected object 8 .
  • the inspection processing unit 22 performs image conversion, deformation, feature amount extraction, etc. on the ND filter image generated by the first camera 1 and the PL filter image generated by the second camera 51 .
  • the inspection processing unit 22 inspects the presence or absence of metallic foreign matter embedded in the surface of the object 8 to be inspected based on the difference image between the ND filter image and the PL filter image.
  • The inspection processing unit 22 inspects the presence or absence of metallic foreign matter embedded in the surface of the object 8 to be inspected based on the area of each region of connected pixels whose values in the difference image are equal to or greater than the binarization threshold.
  • the lighting control unit 23 controls the lighting unit 61 .
  • the lighting control unit 23 is connected to the lighting device 5 and the power supply device 10 .
  • the lighting control unit 23 not only turns on and off the lighting device 5 but also controls the intensity of light from the lighting device 5 .
  • the motor control unit 24 controls the motor 6.
  • The motor control unit 24 transmits position and speed commands to the motor 6 and receives a positioning completion signal.
  • the motor control unit 24 controls the rotation angle and rotation speed of the motor 6 using pulse signals.
  • the motor 6 may be a common stepping motor or servomotor.
  • the motor control section 24 outputs a pulse signal.
  • One ON/OFF cycle of the pulse signal is defined as one pulse, and the drive shaft 7 rotates by one step angle each time one pulse is output.
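As a minimal sketch of this pulse-to-rotation relationship (the 1.8-degree step angle, i.e. 200 steps per revolution, is an assumed value typical of stepping motors, not taken from the disclosure):

```python
def rotation_deg(pulses, step_angle_deg=1.8):
    """Total rotation of the drive shaft: one step angle per pulse.

    step_angle_deg is an assumed value; 1.8 deg (200 steps/rev) is a
    common full-step angle for stepping motors.
    """
    return pulses * step_angle_deg

def pulses_for(angle_deg, step_angle_deg=1.8):
    """Pulses the motor control unit must output for a desired rotation."""
    return round(angle_deg / step_angle_deg)

print(rotation_deg(100))      # 180.0 degrees
print(pulses_for(360.0))      # 200 pulses for one full revolution
```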
  • FIG. 2 is a flowchart representing the procedure of inspection processing by the inspection apparatus according to Embodiment 1.
  • In step A1, the lighting control unit 23 turns on the lighting device 5.
  • In step A2, the camera control unit 21 sends imaging instructions to the first camera 1 and the second camera 51. As a result, an ND filter image and a PL filter image of the surface of the object 8 to be inspected are generated.
  • In step A3, the inspection processing unit 22 generates a difference image between the ND filter image and the PL filter image by subtracting the PL filter image from the ND filter image. Specifically, the inspection processing unit 22 uses the value obtained by subtracting the pixel value of the PL filter image from the pixel value of the ND filter image as the pixel value of the difference image.
  • In step A4, the inspection processing unit 22 generates a binarized image by binarizing the pixel values of the difference image using a predetermined binarization threshold. If the value of a pixel in the difference image is equal to or greater than the binarization threshold, the inspection processing unit 22 sets the value of that pixel to "1"; otherwise, it sets the value of that pixel to "0".
  • In step A5, the inspection processing unit 22 labels the binarized image.
  • Labeling is a process of grouping connected pixels having the value "1" into regions by assigning them the same label, and is a well-known technique in image processing.
  • In step A6, the inspection processing unit 22 measures the area of each labeled region. If there is a region whose area is equal to or greater than the threshold, the process proceeds to step A8; otherwise, the process proceeds to step A7.
  • In step A7, the inspection processing unit 22 determines that there is no metallic foreign matter embedded in the surface of the object 8 to be inspected.
  • In step A8, the inspection processing unit 22 determines that there is a metallic foreign matter embedded in the surface of the object 8 to be inspected.
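Steps A3 through A8 can be sketched as follows. This is an illustration, not the patent's implementation: `inspect_surface`, the toy images, and both thresholds are hypothetical, and the simple 4-connected flood-fill labeling stands in for whatever labeling method the inspection processing unit 22 actually uses.

```python
import numpy as np

def inspect_surface(nd_img, pl_img, bin_thresh, area_thresh):
    """Sketch of steps A3-A8: difference, binarization, labeling, area check.

    nd_img/pl_img: 2-D uint8 arrays (ND and PL filter images).
    Returns True if a metallic foreign matter is judged to be present.
    """
    # Step A3: difference image (clip so subtraction cannot go negative).
    diff = np.clip(nd_img.astype(int) - pl_img.astype(int), 0, 255)
    # Step A4: binarize -- pixels at or above the threshold become 1.
    binary = (diff >= bin_thresh).astype(np.uint8)
    # Step A5: label 4-connected regions of 1-pixels (simple flood fill).
    labels = np.zeros_like(binary, dtype=int)
    next_label = 0
    h, w = binary.shape
    for y in range(h):
        for x in range(w):
            if binary[y, x] and not labels[y, x]:
                next_label += 1
                stack = [(y, x)]
                while stack:
                    cy, cx = stack.pop()
                    if 0 <= cy < h and 0 <= cx < w and binary[cy, cx] and not labels[cy, cx]:
                        labels[cy, cx] = next_label
                        stack += [(cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)]
    # Steps A6-A8: any region with area >= area_thresh means metal is present.
    areas = [int(np.sum(labels == k)) for k in range(1, next_label + 1)]
    return any(a >= area_thresh for a in areas)

# Toy 5x5 example: a bright 2x2 metal spot appears only in the ND image;
# a floating speck appears in both images and cancels in the difference.
nd = np.zeros((5, 5), np.uint8); nd[1:3, 1:3] = 200; nd[4, 4] = 180
pl = np.zeros((5, 5), np.uint8); pl[4, 4] = 180
print(inspect_surface(nd, pl, bin_thresh=128, area_thresh=3))  # True (area 4 >= 3)
```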
  • FIG. 3 is a diagram for explaining difference processing between an ND filter image and a PL filter image according to Embodiment 1.
  • In the ND filter image, both the metallic foreign matter FW1 and the floating foreign matter FW2 are extracted.
  • In the PL filter image 301, only the floating foreign matter FW2 is extracted.
  • By subtracting the PL filter image 301 from the ND filter image, a difference image 302 is generated. In the difference image 302, only the metallic foreign matter FW1 is extracted.
  • FIG. 4 is a diagram showing the configuration of a visual inspection apparatus according to Embodiment 2.
  • the appearance inspection apparatus of the second embodiment includes a control device 200 that differs from the control device 20 of the first embodiment.
  • the control device 200 includes a learning device 101, a learned model storage unit 109, an inference device 105, and an inspection unit 108.
  • Learning device 101 includes data acquisition unit 102 and model generation unit 103 .
  • the data acquisition unit 102 acquires, as learning data, an ND filter image of a non-defective product obtained by photographing the surface of an object to be inspected (hereinafter referred to as a non-defective product) having no metallic foreign matter embedded in the surface.
  • This ND filter image can be generated by the first camera 1 as in the first embodiment.
  • the model generation unit 103 uses the learning data to generate a trained model 104 that reconstructs a good ND filter image from a good ND filter image (good product learning).
  • As the learning algorithm used by the model generation unit 103, a known algorithm such as supervised learning, unsupervised learning, or reinforcement learning can be used.
  • a case where a neural network is applied will be described.
  • the model generation unit 103 performs non-defective product learning by so-called unsupervised learning, for example, according to the neural network model.
  • Unsupervised learning is a method in which only non-defective ND filter images are given to the learning device 101, without attaching result (label) data to the inputs, so that the features of the non-defective ND filter images are learned. In this way, a trained model that reconstructs a non-defective ND filter image from a non-defective ND filter image is generated.
  • Supervised learning, by contrast, is a method in which sets of input and result (label) data are given to a learning device, which learns the features in those learning data and infers, from an input, the result (label) with the highest score among multiple results (labels).
  • FIG. 5 is a diagram showing the configuration of a neural network.
  • a neural network is composed of an input layer consisting of a plurality of neurons, an intermediate layer (hidden layer) consisting of a plurality of neurons, and an output layer consisting of a plurality of neurons.
  • the intermediate layer may be one layer, or two or more layers.
  • The neural network learns the features of non-defective ND filter images by so-called unsupervised learning using the non-defective ND filter images acquired by the data acquisition unit 102. That is, it learns by inputting a non-defective ND filter image to the input layer and adjusting the weights W1 and W2 so that the reconstructed image output from the output layer approaches the non-defective ND filter image input to the input layer.
  • The learning algorithm used in the model generation unit 103 may also be deep learning, such as a convolutional neural network, which learns so that the feature quantities themselves can be extracted.
  • An autoencoder can be used as a learning algorithm used in the model generation unit 103 .
  • the autoencoder includes an encoder that extracts a feature amount from a good ND filter image as input data, and a decoder that reconstructs a good ND filter image from the feature amount.
  • the encoder consists of the input and hidden layers of the neural network, and the decoder consists of the hidden and output layers of the neural network.
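A minimal autoencoder of this shape can be sketched in NumPy. Everything here is illustrative: the layer sizes, learning rate, and tanh activation are assumptions, and the random patches merely stand in for non-defective ND filter images; only the structure (encoder weights W1, decoder weights W2, training that drives the output layer's reconstruction toward the input) mirrors the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny autoencoder: encoder = input->hidden (weights W1), decoder =
# hidden->output (weights W2), trained so the reconstruction at the
# output layer approaches the good-product image fed to the input layer.
n_in, n_hid = 16, 4          # e.g. flattened 4x4 image patches (assumed sizes)
W1 = rng.normal(0, 0.1, (n_in, n_hid))
W2 = rng.normal(0, 0.1, (n_hid, n_in))

def forward(x):
    h = np.tanh(x @ W1)       # encoder: extract feature amount
    return h, h @ W2          # decoder: reconstruct image from features

def train_step(x, lr=0.05):
    """One gradient step on the squared reconstruction error."""
    global W1, W2
    h, y = forward(x)
    err = y - x                               # reconstruction error
    gW2 = h.T @ err
    gW1 = x.T @ ((err @ W2.T) * (1 - h**2))   # backprop through tanh
    W1 -= lr * gW1
    W2 -= lr * gW2
    return float((err**2).mean())

good = rng.uniform(0, 1, (8, n_in))    # stand-in "good product" patches
losses = [train_step(good) for _ in range(300)]
print(losses[0] > losses[-1])           # error shrinks as W1, W2 adjust
```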
  • machine learning may be performed according to other known methods such as genetic programming, functional logic programming, or support vector machines.
  • the model generation unit 103 generates and outputs a learned model by executing the above learning.
  • the learned model storage unit 109 stores the learned model 104 output from the model generation unit 103.
  • FIG. 6 is a flowchart relating to learning processing of the learning device 101 according to the second embodiment.
  • In step a1, the data acquisition unit 102 acquires, as learning data, a non-defective ND filter image obtained by photographing the surface of an object to be inspected having no metallic foreign matter embedded in the surface (hereinafter referred to as a non-defective product).
  • Instead of a non-defective ND filter image itself, an image representing the feature amount of the object 8 to be inspected, obtained by performing image processing on a non-defective ND filter image, may be input as the non-defective product image.
  • As the image processing, a known algorithm such as luminance binarization or normalization can be used.
  • In step a2, the model generation unit 103 uses the learning data acquired by the data acquisition unit 102 to generate, by so-called unsupervised learning, a trained model that reconstructs a non-defective ND filter image from a non-defective ND filter image.
  • In step a3, the learned model storage unit 109 stores the learned model 104 generated by the model generation unit 103.
  • the inference device 105 includes a data acquisition unit 106 and an inference unit 107.
  • the data acquisition unit 106 acquires an ND filter image of the object 8 to be inspected obtained by photographing the surface of the object 8 to be inspected.
  • This ND filter image can be generated by the first camera 1 as in the first embodiment.
  • The inference unit 107 uses the trained model 104 stored in the learned model storage unit 109, which reconstructs a non-defective ND filter image from a non-defective ND filter image, to reconstruct an ND filter image of the inspected object 8 from the ND filter image of the inspected object 8 acquired by the data acquisition unit 106. That is, by inputting the ND filter image of the inspected object 8 acquired by the data acquisition unit 106 into this trained model, a reconstructed image of the ND filter image of the inspected object 8 can be generated.
  • If the inspected object 8 is a non-defective product, the reconstructed image generated by the inference unit 107 is close to the ND filter image (non-defective product image) of the inspected object acquired by the data acquisition unit 106 and input to the autoencoder, and the reconstruction is done properly. If the inspected object 8 is not a non-defective product, the reconstructed image generated by the inference unit 107 is not close to the ND filter image of the inspected object acquired by the data acquisition unit 106 and input to the autoencoder, and the reconstruction is not done properly. This is because the trained model has been trained to reconstruct non-defective ND filter images from non-defective ND filter images.
  • The inspection unit 108 shown in FIG. 4 compares the reconstructed image of the object 8 to be inspected output from the inference unit 107 with the ND filter image of the object 8 to be inspected input to the inference unit 107, and thereby inspects whether or not a metallic foreign matter embedded in the surface of the object 8 exists.
  • As the comparison algorithm, a known algorithm such as Z-score or pattern matching can be used.
  • FIG. 7 is a flowchart of the inference processing of the inference device 105 according to the second embodiment.
  • the data acquisition unit 106 acquires an ND filter image of the object 8 to be inspected.
  • the inference unit 107 inputs the ND filter image of the inspection object 8 acquired by the data acquisition unit 106 to the learned model 104 stored in the learned model storage unit 109 to obtain a reconstructed image.
  • the inference unit 107 outputs the obtained reconstructed image to the inspection unit 108.
  • the inspection unit 108 compares the ND filter image of the inspection object 8 acquired by the data acquisition unit 106 with the reconstructed image.
  • When the inspection unit 108 determines by pattern matching or the like that the two images are similar, it can determine that there is no metallic foreign matter embedded in the surface of the object 8 to be inspected.
  • When the inspection unit 108 determines by pattern matching or the like that the two images are not similar, it can determine that there is a metallic foreign matter embedded in the surface of the object 8 to be inspected.
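One way the comparison could use the Z-score algorithm named above is to score the mean reconstruction error against error statistics collected from non-defective products. The sketch below is an assumption-laden illustration, not the patent's method: the function names, the toy images, the decision threshold, and the good-product error statistics are all hypothetical.

```python
import numpy as np

def anomaly_score(nd_img, recon_img, mean_err, std_err):
    """Z-score of the mean reconstruction error.

    mean_err/std_err: error statistics assumed to have been measured on
    reconstructions of non-defective products beforehand.
    """
    err = float(np.abs(nd_img - recon_img).mean())
    return (err - mean_err) / std_err

def has_embedded_metal(nd_img, recon_img, mean_err, std_err, z_thresh=3.0):
    """A good product reconstructs well (low score); an object with an
    embedded metallic foreign matter reconstructs poorly (high score)."""
    return anomaly_score(nd_img, recon_img, mean_err, std_err) > z_thresh

# Toy example with assumed good-product error statistics (mean 0.02, std 0.01).
good_recon = np.full((4, 4), 0.50)
good_img   = np.full((4, 4), 0.52)     # reconstructs almost perfectly
bad_img    = good_img.copy()
bad_img[1, 1] = 1.0                    # bright pixels the model cannot
bad_img[2, 2] = 1.0                    # reconstruct: embedded metal
print(has_embedded_metal(good_img, good_recon, 0.02, 0.01))  # False
print(has_embedded_metal(bad_img,  good_recon, 0.02, 0.01))  # True
```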
  • Although the case where unsupervised learning is applied as the learning algorithm used by the model generation unit 103 has been described, the present disclosure is not limited to this; reinforcement learning, supervised learning, or semi-supervised learning can also be applied.
  • The model generation unit 103 may perform non-defective product learning using, as learning data, ND filter images of non-defective inspected objects 8 of a plurality of resin molded products having similar shapes. A non-defective inspected object 8 of a resin molded product from which learning data is collected may be added to, or removed from, the learning targets along the way. In addition, a trained model obtained by performing non-defective product learning with an ND filter image of a non-defective inspected object 8 of one resin molded product may be used to perform non-defective product learning again with an ND filter image of a non-defective inspected object 8 of another resin molded product.
  • As described above, according to Embodiment 2, the quality of the object 8 to be inspected, that is, the presence or absence of metallic foreign matter embedded in the surface, can be determined.
  • FIG. 8 is a diagram showing the configuration of a visual inspection apparatus according to Embodiment 3.
  • The visual inspection apparatus of Embodiment 3 includes a control device 200A that differs from the control device 200 of Embodiment 2.
  • the control device 200A includes a learning device 101A, an inference device 105A, a learned model storage unit 109A, and a display unit 120 instead of the learning device 101, the inference device 105, the learned model storage unit 109, and the inspection unit 108.
  • the learning device 101A includes a data acquisition unit 102A and a model generation unit 103A.
  • The data acquisition unit 102A acquires, as learning data, an ND filter image of the object 8 to be inspected obtained by photographing the surface of the object 8 and identification data representing whether or not a metallic foreign matter embedded in the surface of the object 8 exists.
  • Using the learning data, the model generation unit 103A generates a trained model 104A that identifies, from an ND filter image of the object 8 to be inspected obtained by photographing its surface, whether or not a metallic foreign matter embedded in the surface exists.
  • As the learning algorithm used by the model generation unit 103A, a known algorithm such as supervised learning, unsupervised learning, or reinforcement learning can be used.
  • a case where a neural network is applied will be described.
  • the model generation unit 103A learns whether or not there is a metallic foreign substance embedded in the surface of the inspection object 8 by so-called supervised learning according to, for example, a neural network model.
  • supervised learning refers to a technique in which data sets of inputs and results (labels) are given to the learning device 101A so that it learns the features in the learning data and infers the result from an input.
  • using learning data that combines the ND filter image of the inspected object 8 acquired by the data acquisition unit 102A with identification data representing whether or not a metallic foreign substance is embedded in the surface of the inspected object 8, the neural network learns, by so-called supervised learning, whether or not a metallic foreign substance is embedded in the surface of the object 8 to be inspected.
  • the neural network learns by adjusting the weights W1 and W2 so that, when the ND filter image is input to the input layer, the result output from the output layer approaches the identification data (correct answer).
  • the model generation unit 103A generates and outputs the learned model 104A by executing the above learning.
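The supervised learning described above can be sketched as follows. This is a minimal illustration, not the disclosure's actual model: the network size, the bias terms, the toy "ND filter images" (flattened to four pixels), the labels, and the learning rate are all assumptions; only the idea of adjusting the weights W1 and W2 so that the output approaches the identification data follows the text.

```python
import numpy as np

# Toy supervised learning sketch (assumed data and hyperparameters).
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

X = np.array([[0.9, 0.8, 0.1, 0.2],   # bright patch -> foreign matter (1)
              [0.1, 0.2, 0.1, 0.1],   # dark         -> none           (0)
              [0.8, 0.9, 0.2, 0.1],
              [0.2, 0.1, 0.2, 0.2]])
y = np.array([[1.0], [0.0], [1.0], [0.0]])   # identification data (labels)

W1 = np.linspace(-0.5, 0.5, 12).reshape(4, 3)  # input -> hidden weights
b1 = np.zeros(3)
W2 = np.linspace(-0.5, 0.5, 3).reshape(3, 1)   # hidden -> output weights
b2 = np.zeros(1)

lr = 0.1
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)                 # forward pass
    out = sigmoid(h @ W2 + b2)
    g_out = out - y                          # sigmoid + cross-entropy gradient
    g_h = (g_out @ W2.T) * h * (1 - h)       # backpropagate to hidden layer
    W2 -= lr * h.T @ g_out                   # adjust W2 toward the labels
    b2 -= lr * g_out.sum(axis=0)
    W1 -= lr * X.T @ g_h                     # adjust W1 toward the labels
    b1 -= lr * g_h.sum(axis=0)

pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(pred.ravel())   # learned identification for the training inputs
```

After training, the network's output for each training input matches its identification data, which is the "learned model" stored for later inference.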
  • the learned model storage unit 109A stores the learned model 104A output from the model generation unit 103A.
  • FIG. 9 is a flowchart relating to learning processing of the learning device 101A according to the third embodiment.
  • the data acquisition unit 102A acquires learning data including combinations of an ND filter image of the object 8 to be inspected, obtained by photographing it, and identification data (correct answers) representing whether or not a metallic foreign matter is embedded in the surface of the object 8 to be inspected.
  • in step c2, the model generation unit 103A performs so-called supervised learning based on the learning data acquired by the data acquisition unit 102A and generates a trained model 104A that identifies, from the ND filter image of the inspection object 8, whether or not a metallic foreign matter is embedded in its surface.
  • the learned model storage unit 109A stores the learned model 104A generated by the model generation unit 103A.
  • the inference device 105A in FIG. 8 includes a data acquisition unit 106A and an inference unit 107A.
  • the data acquisition unit 106A acquires an ND filter image of the object 8 to be inspected obtained by photographing the surface of the object 8 to be inspected.
  • the inference unit 107A uses the learned model 104A stored in the learned model storage unit 109A to infer, from the ND filter image of the inspection object 8 acquired by the data acquisition unit 106A, whether or not metallic foreign matter is embedded in the surface of the inspection object 8. That is, by inputting the ND filter image of the object 8 to be inspected acquired by the data acquisition unit 106A into this trained model, it is possible to output identification data that identifies whether or not metallic foreign matter embedded in the surface of the object 8 to be inspected is present.
  • FIG. 10 is a flowchart of the inference processing of the inference device 105A of the third embodiment.
  • the data acquisition unit 106A acquires an ND filter image of the object 8 to be inspected by photographing the surface of the object 8 to be inspected.
  • the inference unit 107A inputs the ND filter image of the inspected object 8 acquired by the data acquisition unit 106A into the learned model 104A stored in the learned model storage unit 109A, thereby obtaining identification data that identifies whether or not a metallic foreign object is embedded in the surface of the inspected object 8.
  • the inference unit 107A outputs the obtained identification data to the display unit 120.
  • the display unit 120 can display information indicating whether or not there is a metal foreign substance embedded in the surface of the object 8 to be inspected.
  • the quality of the object 8 to be inspected, that is, the presence or absence of metallic foreign matter embedded in its surface, can be determined simply by inputting the ND filter image of the object to be inspected into the inference device 105A.
  • FIG. 11 is a diagram showing the configuration of a visual inspection apparatus according to Embodiment 4.
  • the visual inspection apparatus of the fourth embodiment differs from the visual inspection apparatus of the first embodiment in that the visual inspection apparatus of the fourth embodiment includes an illumination section 61A instead of the illumination section 61.
  • the lighting unit 61A adjusts the brightness when photographing the object 8 to be inspected.
  • the illumination unit 61A irradiates the inspection object 8 with illumination light L1 and illumination light L2 having only specific polarization angle components.
  • the lighting section 61A includes the lighting device 5 and the polarizing plate 11 in the same manner as the lighting section 61.
  • the illumination section 61A further includes an illumination device 54 and a polarizing plate 55.
  • the illumination device 54 emits light.
  • the polarizing plate 55 passes only the irradiation light L2, having only a specific polarization angle component, out of the light emitted from the illumination device 54. The irradiation light L2 having only a specific polarization angle component is projected onto the object 8 to be inspected.
  • when the irradiation light L2 is incident on the object 8 to be inspected, diffusely reflected light LB1 is generated and is incident on the beam splitter 4.
  • the beam splitter 4 is a cube beam splitter composed of two right-angle prisms.
  • the beam splitter 4 is arranged in the direction in which the reflected light LB1 is incident.
  • the beam splitter 4 splits the incident light into two lights with a defined splitting ratio.
  • Reflected light LB1 incident on the beam splitter 4 is split into reflected light LB2 and reflected light LB3.
  • Reflected light LB2 is incident on the first camera 1.
  • Reflected light LB3 is incident on the second camera 51.
  • the first camera 1 is arranged at a position different from that of the first embodiment.
  • the first camera 1 photographs the surface of the object 8 to be inspected from the normal direction of the object 8 to be inspected.
  • the positional relationship between the first camera 1 and the second camera 51 is the same as in the first embodiment. Therefore, due to the movement of the first camera 1, the second camera 51 is also arranged at an offset position.
  • FIG. 12 is a flow chart showing the procedure of inspection processing by the inspection apparatus according to the fourth embodiment.
  • processing for dynamically setting the inspection range is added to the content shown in FIG. 2 of the first embodiment.
  • the lighting control unit 23 turns on the lighting device 5 and the lighting device 54.
  • in step B2, the camera control unit 21 sends imaging instructions to the first camera 1 and the second camera 51.
  • an ND filter image and a PL filter image of the surface of the object 8 to be inspected are generated.
  • FIG. 13 is a diagram for explaining the inspection range.
  • the ND filter image 400 includes a captured image of the inspected object 401.
  • the object 401 to be inspected has a range in which the light emitted from the illumination device 5 and the illumination device 54 is specularly reflected. Rectangles 402 and 403 indicate the specular reflection range.
  • the illuminances of the illumination devices 5 and 54 are set in advance so that the specular reflection ranges enter the first camera 1 with a luminance value of 255 (the maximum value).
  • the inspection processing unit 22 detects the ranges where the luminance value is 255, that is, the rectangles 402 and 403, and sets the gap between the rectangles 402 and 403 as the inspection range. If a metallic foreign matter were embedded within a specular reflection range, its luminance value would also be 255 and it could not be detected; by excluding the specular reflection ranges from the inspection range, this problem is avoided.
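The dynamic inspection-range setting described above can be sketched as follows. The helper name and the boolean-mask representation are assumptions for this sketch; only the rule that pixels with the luminance value 255 belong to the specular reflection ranges and are excluded follows the text.

```python
import numpy as np

# Sketch of the inspection-range setting: pixels whose luminance value is 255
# belong to the specular reflection ranges (rectangles 402 and 403); only the
# gap between them is inspected.
def inspection_mask(nd_image):
    """True where a pixel may be inspected (outside the saturated ranges)."""
    return nd_image < 255

# One image row: two saturated specular ranges with an inspectable gap between.
row = np.array([255, 255, 40, 55, 42, 255, 255], dtype=np.uint8)
print(inspection_mask(row))   # True only for the middle (gap) pixels
```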
  • in step B4, the inspection processing unit 22 generates a difference image between the ND filter image and the PL filter image by subtracting the PL filter image from the ND filter image. Since the light transmitted through the ND filter 3 includes the specularly reflected light and diffusely reflected light from the metallic foreign matter and the diffusely reflected light from the floating foreign matter, the metallic foreign matter and the floating foreign matter are brightly emphasized in the ND filter image. In the PL filter image, the metallic foreign matter and the floating foreign matter are also brightly emphasized. However, since the light transmitted through the PL filter 53 includes only the diffusely reflected light from the metallic foreign matter, the brightness of the metallic foreign matter in the PL filter image is darker than its brightness in the ND filter image.
  • the inspection processing unit 22 uses a value obtained by subtracting the pixel value of the PL filter image from the pixel value of the ND filter image as the pixel value of the difference image.
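The subtraction above can be sketched as follows. The pixel values are illustrative assumptions, as is clipping negative results to 0; only the rule of subtracting the PL filter image's pixel value from the ND filter image's pixel value follows the text.

```python
import numpy as np

# Difference-image sketch: a metallic foreign matter is bright in the ND image
# but dark in the PL image, so it survives the subtraction; the background is
# equally bright in both and cancels out. Negative values are clipped to 0.
nd = np.array([[200, 30], [180, 30]], dtype=np.int16)  # metal in column 0
pl = np.array([[ 60, 30], [ 50, 30]], dtype=np.int16)  # metal darker here
diff = np.clip(nd - pl, 0, 255).astype(np.uint8)
print(diff)   # large values remain only where the metal is
```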
  • in step B5, the inspection processing unit 22 generates a binarized image by binarizing the pixel values of the difference image using a predetermined binarization threshold. If the value of a pixel in the difference image is equal to or greater than the binarization threshold, the inspection processing unit 22 sets the value of that pixel to "1"; otherwise, it sets the value of that pixel to "0".
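The binarization rule can be sketched as follows; the threshold value 100 and the sample difference image are assumed examples, not values from the disclosure.

```python
import numpy as np

# Binarization sketch: pixels at or above the binarization threshold become 1,
# all others become 0. THRESHOLD = 100 is an assumed example value.
diff = np.array([[140, 0], [130, 20]], dtype=np.uint8)
THRESHOLD = 100
binary = (diff >= THRESHOLD).astype(np.uint8)
print(binary)   # 1 where the difference is strong, else 0
```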
  • the inspection processing unit 22 labels the binarized image.
  • labeling is a process of grouping a plurality of regions by assigning the same label to connected pixels having the value "1", and is a well-known technique in image processing.
  • in step B7, the inspection processing unit 22 measures the area of each region formed by a plurality of connected pixels labeled "1". If there is a region whose area is equal to or greater than the threshold, the process proceeds to step B9; if there is no such region, the process proceeds to step B8.
  • the inspection processing unit 22 determines that there is no metallic foreign matter embedded in the surface of the object 8 to be inspected.
  • the inspection processing unit 22 determines that there is a metal foreign substance embedded in the surface of the object 8 to be inspected.
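The labeling, area measurement, and determination steps above can be sketched as follows. The 4-connectivity flood-fill helper and the area threshold value are illustrative assumptions (a real system would typically call a library labeling routine); only the logic "same label for connected '1' pixels, judge metal present if any region's area reaches the threshold" follows the text.

```python
import numpy as np

# Labeling and area-based determination sketch (assumed helper and threshold).
def label_regions(binary):
    """Assign the same label to 4-connected pixels with value 1."""
    labels = np.zeros_like(binary, dtype=int)
    current = 0
    h, w = binary.shape
    for i in range(h):
        for j in range(w):
            if binary[i, j] == 1 and labels[i, j] == 0:
                current += 1
                stack = [(i, j)]
                while stack:               # flood fill one connected region
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and binary[y, x] == 1 and labels[y, x] == 0:
                        labels[y, x] = current
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, current

binary = np.array([[1, 1, 0, 0],
                   [0, 1, 0, 1],
                   [0, 0, 0, 1]])
labels, n = label_regions(binary)
areas = [int((labels == k).sum()) for k in range(1, n + 1)]
AREA_THRESHOLD = 3                        # assumed example value
embedded_metal = any(a >= AREA_THRESHOLD for a in areas)
print(areas, embedded_metal)              # region areas and the determination
```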
  • the illumination device 5 and the illumination device 54 are bar-shaped illumination devices, but they may be ring-shaped illumination devices.
  • two illumination devices 5 and 54 are installed in order to widen the inspection range, but it is also possible to install only one bar-shaped illumination device.
  • FIG. 14 is a diagram showing the hardware configuration of control devices 20, 200 and 200A.
  • the control devices 20, 200, and 200A can implement the corresponding operations in dedicated digital circuit hardware or in software.
  • the control devices 20, 200, and 200A include, for example, a processor 1001 and a memory 1002 as shown in FIG. 14, and the processor 1001 can execute a program stored in the memory 1002.


Abstract

This appearance inspection device comprises: a first camera (1) that has an ND filter (3), images the surface of an object (8) to be inspected, and generates an ND filter image; a second camera (51) that has a PL filter (53), images the surface of the object (8) to be inspected, and generates a PL filter image; and a control device (20) that, on the basis of a differential image of the ND filter image and the PL filter image, inspects for the presence of metallic foreign matter embedded in the surface of the object (8) to be inspected. The control device (20) inspects for the presence of metallic foreign matter embedded in the surface of the object (8) to be inspected on the basis of the size of the surface area of a region comprising connected pixels that have values of no less than a binarization threshold in the differential image.

Description

Appearance inspection device, appearance inspection method, learning device, and inference device
 The present disclosure relates to a visual inspection device, a visual inspection method, a learning device, and an inference device.
 If irregularities, scratches, dirt, or foreign matter are present on the surface of an industrial product, they cause the product's performance to deteriorate, so such defective products must be removed by inspection. In recent years, the growing variety and mass production of products have increased the burden on workers and the inspection time, so attempts have been made to automate the inspection process.
 For example, the inspection apparatus of Patent Document 1 irradiates the surface of a workpiece W with A-color irradiation light L1 and LA1, B-color irradiation light LB1, and C-color irradiation light LC1, splits the reflected light from the workpiece W, and simultaneously generates an A-color image, a B-color image, and a C-color image. Compared with the B-color image (or the C-color image), the A-color image shows a smaller difference in brightness between the portion corresponding to a defect of the workpiece W and its surroundings. The inspection apparatus of Patent Document 1 compares two of the images and extracts only the defects of the workpiece W as differences between the two images.
JP 2008-64715 A
 However, the inspection apparatus described in Patent Document 1 cannot determine whether the foreign matter to be detected is embedded in the surface of the object to be inspected or merely adheres to that surface.
 Therefore, an object of the present disclosure is to provide a visual inspection apparatus, a visual inspection method, a learning device, and an inference device capable of detecting only metallic foreign matter embedded in the surface of an object to be inspected.
 The visual inspection apparatus of the present disclosure includes: a first camera that has an ND filter and photographs the surface of an object to be inspected to generate an ND filter image; a second camera that has a PL filter and photographs the surface of the object to be inspected to generate a PL filter image; and a control device that inspects for the presence or absence of metallic foreign matter embedded in the surface of the object to be inspected based on a difference image between the ND filter image and the PL filter image.
 The visual inspection method of the present disclosure includes the steps of: a first camera having an ND filter photographing the surface of an object to be inspected to generate an ND filter image; a second camera having a PL filter photographing the surface of the object to be inspected to generate a PL filter image; and a control device inspecting for the presence or absence of metallic foreign matter embedded in the surface of the object to be inspected based on a difference image between the ND filter image and the PL filter image.
 A learning device of the present disclosure includes: a data acquisition unit that, regarding an object to be inspected with no metallic foreign matter embedded in its surface as a non-defective product, acquires learning data including an ND filter image of the non-defective product obtained by photographing its surface; and a model generation unit that uses the learning data to generate a trained model that reconstructs a non-defective ND filter image from a non-defective ND filter image.
 An inference device of the present disclosure includes: a data acquisition unit that acquires an ND filter image of an object to be inspected obtained by photographing its surface; and an inference unit that uses a trained model, which reconstructs a non-defective ND filter image from a non-defective ND filter image obtained by photographing the surface of a non-defective product (an object to be inspected with no metallic foreign matter embedded in its surface), to reconstruct an ND filter image of the object to be inspected from the ND filter image acquired by the data acquisition unit.
 Another learning device of the present disclosure includes: a data acquisition unit that acquires learning data including an ND filter image of the object to be inspected obtained by photographing its surface and identification data representing whether or not a metallic foreign matter is embedded in the surface of the object to be inspected; and a model generation unit that uses the learning data to generate a trained model that identifies, from the ND filter image of the object to be inspected, whether or not a metallic foreign matter is embedded in its surface.
 Another inference device of the present disclosure includes: a data acquisition unit that acquires an ND filter image of the object to be inspected obtained by photographing its surface; and an inference unit that uses a trained model for inferring, from the ND filter image of the object to be inspected, whether or not a metallic foreign matter is embedded in the surface of the object to be inspected, to infer from the ND filter image acquired by the data acquisition unit whether or not a metallic foreign matter exists on the surface of the object to be inspected.
 The visual inspection apparatus of the present disclosure inspects for the presence or absence of metallic foreign matter embedded in the surface of the object to be inspected based on the difference image between the ND filter image and the PL filter image. This allows the visual inspection apparatus of the present disclosure to detect only metallic foreign matter embedded in the surface of the object to be inspected.
 FIG. 1 is a diagram showing the configuration of the visual inspection apparatus of Embodiment 1.
 FIG. 2 is a flowchart showing the procedure of inspection processing by the inspection apparatus according to Embodiment 1.
 FIG. 3 is a diagram for explaining the difference processing between the ND filter image and the PL filter image according to Embodiment 1.
 FIG. 4 is a diagram showing the configuration of the visual inspection apparatus of Embodiment 2.
 FIG. 5 is a diagram showing the configuration of a neural network.
 FIG. 6 is a flowchart of the learning processing of the learning device 101 of Embodiment 2.
 FIG. 7 is a flowchart of the inference processing of the inference device 105 of Embodiment 2.
 FIG. 8 is a diagram showing the configuration of the visual inspection apparatus of Embodiment 3.
 FIG. 9 is a flowchart of the learning processing of the learning device 101A of Embodiment 3.
 FIG. 10 is a flowchart of the inference processing of the inference device 105A of Embodiment 3.
 FIG. 11 is a diagram showing the configuration of the visual inspection apparatus of Embodiment 4.
 FIG. 12 is a flowchart showing the procedure of inspection processing by the inspection apparatus according to Embodiment 4.
 FIG. 13 is a diagram for explaining the inspection range.
 FIG. 14 is a diagram showing the hardware configuration of the control devices 20, 200, and 200A.
 Embodiments will be described below with reference to the drawings. The dimensions of the components of the illustrated embodiments have been changed as appropriate to aid understanding of the embodiments. Identical or corresponding components are given the same reference numerals.
 Embodiment 1.
 FIG. 1 is a diagram showing the configuration of the visual inspection apparatus of Embodiment 1.
 The appearance inspection apparatus inspects for foreign matter on the surface of the object 8 to be inspected. The object 8 to be inspected is not particularly limited as long as it is an object whose surface produces specular reflection of light. In specular reflection, the angle of incidence of the irradiated light equals the angle of reflection. Diffuse light is light that is reflected by being scattered in all directions.
 The object 8 to be inspected is, for example, a molded plastic product whose surface is coated with epoxy resin. The surface to be inspected of the object 8 may be flat or curved.
 The foreign matter that the appearance inspection apparatus can detect comprises metallic foreign matter and floating foreign matter. A metallic foreign matter is an object with a flat shape embedded in the surface of the object 8 to be inspected. A floating foreign matter is an object with irregularities adhering to the surface of the object 8 to be inspected. Metallic foreign matter includes, for example, copper and aluminum. Floating foreign matter includes, for example, dust and fibrous foreign matter.
 The visual inspection apparatus includes an illumination unit 61, a beam splitter 4, a first camera 1, a second camera 51, a motor 6, a drive shaft 7, a jig 9, a power supply device 10, a polarizing plate 11, and a control device 20.
 The illumination unit 61 adjusts the brightness when photographing the object 8 to be inspected. The illumination unit 61 irradiates the object 8 to be inspected with irradiation light L1 having only a specific polarization angle component. The illumination unit 61 includes the illumination device 5 and the polarizing plate 11. The illumination device 5 emits light. The polarizing plate 11 passes only the light having a specific polarization angle component out of the light emitted from the illumination device 5. The irradiation light L1, having only the specific polarization angle component, is projected onto the object 8 to be inspected. When the irradiation light L1 is incident on the object 8 to be inspected, specularly reflected light LA1 is generated and is incident on the beam splitter 4.
 The beam splitter 4 is a cube beam splitter composed of two right-angle prisms. The beam splitter 4 is arranged in the direction in which the irradiation light L1 is specularly reflected. The beam splitter 4 splits the incident light into two beams at a predetermined splitting ratio. The reflected light LA1 incident on the beam splitter 4 is split into reflected light LA2 and reflected light LA3. The reflected light LA2 is incident on the first camera 1. The reflected light LA3 is incident on the second camera 51.
 The first camera 1 and the second camera 51 are monochrome cameras for photographing the surface of the object 8 to be inspected. The first camera 1 and the second camera 51 photograph the surface of the object 8 to be inspected simultaneously.
 The first camera 1 photographs the surface of the object 8 to be inspected from the direction of specular reflection of the irradiation light L1. The first camera 1 receives the reflected light LA2, photographs the object 8 to be inspected, and generates an image of the object 8 to be inspected. The first camera 1 includes a lens 2 and an ND (Neutral Density) filter 3 mounted on the lens 2.
 The ND filter 3, also called a neutral density filter, reduces the amount of light by a fixed amount over the whole of a predetermined wavelength band. An image generated by the light that has passed through the ND filter 3 is called an ND filter image. The light passing through the ND filter 3 includes light reflected from flat metallic foreign matter embedded in the surface of the object 8 to be inspected and light reflected from floating foreign matter. The lens 2 condenses the light that has passed through the ND filter 3 and forms an image at one point. In this way, the first camera 1 having the ND filter 3 photographs the surface of the object 8 to be inspected and generates an ND filter image. The ND filter image is generated by the light reflected from embedded flat metallic foreign matter present on the surface of the object 8 to be inspected and the light reflected from floating foreign matter.
 The second camera 51 photographs the surface of the object 8 to be inspected from the direction perpendicular to the direction of specular reflection of the irradiation light L1. The second camera 51 receives the reflected light LA3, photographs the object 8 to be inspected, and generates an image of the object 8 to be inspected. The second camera 51 includes a lens 52 and a PL (Polarized Light) filter 53 mounted on the lens 52.
 The PL filter 53, also called a polarizing filter, is a lens filter using a polarizing film. A polarizing filter has a structure in which a polarizing film is sandwiched between two sheets of glass, and has a rotating frame structure for rotating the orientation of the polarizing film. The PL filter 53 is mounted so that the direction of its polarizing film is perpendicular to the polarization direction of the light reflected from metallic foreign matter embedded in the surface of the object 8 to be inspected. The lens 52 condenses the light that has passed through the PL filter 53 and forms an image at one point. An image generated by the light that has passed through the PL filter 53 is called a PL filter image.
 In general, a glossy surface such as metal preserves the polarization characteristics of the incident light in the reflected light. It is known that light reflected from dust and fibrous foreign matter, which contain minute scattering particles, becomes unpolarized. Since the polarization direction of the light condensed by the lens 52 is perpendicular to that of the light reflected from metallic foreign matter, a flat metallic foreign matter embedded in the surface of the object 8 to be inspected appears dark. In contrast, a floating foreign matter with irregularities on the surface of the object 8 to be inspected appears bright. The light transmitted through the PL filter 53 includes only the light reflected from floating foreign matter. In this way, the second camera 51 having the PL filter 53 photographs the surface of the object 8 to be inspected and generates a PL filter image. The PL filter image is generated only by the light reflected from floating foreign matter.
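The polarization behavior described above can be illustrated with Malus's law, a standard optics relation not stated in the disclosure: a polarizing filter transmits a fraction cos²θ of linearly polarized light whose polarization is at angle θ to the filter axis.

```python
import math

# Malus's law sketch: transmitted intensity = I0 * cos^2(theta). With the PL
# filter's polarizing film perpendicular (theta = 90 degrees) to the
# polarization of light reflected by embedded metal, that component is blocked
# (the metal appears dark), while unpolarized light from floating foreign
# matter partially passes (it appears bright).
def transmitted(i0, theta_deg):
    return i0 * math.cos(math.radians(theta_deg)) ** 2

print(transmitted(1.0, 90))   # crossed polarization: essentially zero
print(transmitted(1.0, 0))    # aligned polarization: full intensity
```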
 When metallic foreign matter and floating foreign matter are present on the surface of the object 8 to be inspected, the metallic foreign matter and the floating foreign matter are brightly emphasized in the ND filter image, while only the floating foreign matter is brightly emphasized in the PL filter image. Therefore, only the metallic foreign matter can be inspected from the difference image between the ND filter image and the PL filter image. In this embodiment, the ND filter image and the PL filter image can be generated simultaneously, so the time required for inspection can be shortened compared with generating the images separately.
 モータ6は、電気エネルギーを力学的エネルギーに変換する。本実施形態においては、モータ6に直流のパルス電圧が印加されると、モータ6は、回転運動を出力する。 The motor 6 converts electrical energy into mechanical energy. In this embodiment, when a DC pulse voltage is applied to the motor 6, the motor 6 outputs rotational motion.
 駆動軸7は、モータ6の回転駆動力を被検査物体8に伝える。 The drive shaft 7 transmits the rotational driving force of the motor 6 to the inspected object 8.
 治具9は、被検査物体8を載置する。 The jig 9 mounts the object 8 to be inspected.
 電源装置10は、照明装置5に電源電圧を供給する。 The power supply device 10 supplies a power supply voltage to the lighting device 5.
 制御装置20は、演算機能を有する。制御装置20は、例えば、パーソナルコンピュータ、マイコンボード、またはFPGA(Field Programmable Gate Array)ボードなどにより構成することができる。制御装置20は、カメラ制御部21、検査処理部22、照明制御部23、およびモータ制御部24を備える。 The control device 20 has an arithmetic function. The control device 20 can be configured by, for example, a personal computer, a microcomputer board, or an FPGA (Field Programmable Gate Array) board. The control device 20 includes a camera control unit 21, an inspection processing unit 22, an illumination control unit 23, and a motor control unit 24.
 カメラ制御部21は、第1のカメラ1および第2のカメラ51を制御する。カメラ制御部21は、第1のカメラ1と第2のカメラ51とに対してトリガ信号を送ることにより、第1のカメラ1と第2のカメラ51に被検査物体8を撮像させる。 The camera control unit 21 controls the first camera 1 and the second camera 51. The camera control unit 21 sends a trigger signal to the first camera 1 and the second camera 51 to cause the first camera 1 and the second camera 51 to image the inspected object 8 .
 検査処理部22は、第1のカメラ1によって生成されたNDフィルタ画像、および第2のカメラ51によって生成されたPLフィルタ画像に対して画像変換、変形、および特徴量抽出などを行なう。検査処理部22は、NDフィルタ画像とPLフィルタ画像との差分画像に基づいて、被検査物体8の表面に埋め込まれた金属異物の有無を検査する。検査処理部22は、差分画像における2値化しきい値以上の値を有する連結された画素からなる領域の面積の大きさに基づいて、被検査物体8の表面に埋め込まれた金属異物の有無を検査する。 The inspection processing unit 22 performs image conversion, deformation, feature extraction, and the like on the ND filter image generated by the first camera 1 and the PL filter image generated by the second camera 51. The inspection processing unit 22 inspects for metallic foreign matter embedded in the surface of the inspected object 8 based on the difference image between the ND filter image and the PL filter image. Specifically, it inspects for such metallic foreign matter based on the area of each region of connected pixels whose values in the difference image are equal to or greater than the binarization threshold.
 照明制御部23は、照明部61を制御する。照明制御部23は、照明装置5と電源装置10とに接続される。照明制御部23は、照明装置5の点灯および消灯を行なうだけでなく、照明装置5の光の強度を制御する。 The lighting control unit 23 controls the lighting unit 61 . The lighting control unit 23 is connected to the lighting device 5 and the power supply device 10 . The lighting control unit 23 not only turns on and off the lighting device 5 but also controls the intensity of light from the lighting device 5 .
 モータ制御部24は、モータ6を制御する。モータ制御部24は、モータ6に対して位置、および速度を伝達し、位置決め完了信号を受信する。モータ制御部24は、パルス信号によってモータ6の回転角度と回転速度とを制御する。モータ6は、一般的なステッピングモータまたはサーボモータでよい。モータ制御部24は、パルス信号を出力する。パルス信号のON/OFFのサイクルを1パルスとし、1パルスが出力されるごとに駆動軸7が1ステップ角度だけ回転する。 The motor control unit 24 controls the motor 6. The motor control unit 24 transmits the position and speed to the motor 6 and receives a positioning completion signal. The motor control unit 24 controls the rotation angle and rotation speed of the motor 6 with pulse signals. The motor 6 may be a common stepping motor or servomotor. The motor control unit 24 outputs a pulse signal. One ON/OFF cycle of the pulse signal constitutes one pulse, and the drive shaft 7 rotates by one step angle each time one pulse is output.
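As a rough illustration of the pulse-to-rotation relationship described above, the following Python sketch computes how many pulses the motor control unit 24 would need to output for a given rotation of the drive shaft 7. The 1.8-degree step angle is an assumed value (a common figure for stepping motors) and is not taken from this description.

```python
# Assumed step angle: 1.8 degrees per pulse is a common value for
# stepping motors; the actual step angle is not given in this document.
STEP_ANGLE_DEG = 1.8

def pulses_for_rotation(target_deg, step_angle_deg=STEP_ANGLE_DEG):
    """Number of pulses to output so that the drive shaft rotates by
    target_deg, given that one pulse rotates it by one step angle."""
    return round(target_deg / step_angle_deg)

full_turn = pulses_for_rotation(360.0)  # pulses for one full revolution
quarter_turn = pulses_for_rotation(90.0)
```

By synchronizing camera triggers with this pulse count, one image per step angle can be captured over a full revolution of the inspected object.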
 カメラ制御部21から出力されるトリガ信号と、照明制御部23から出力される点灯および消灯信号と、モータ制御部24から出力されるパルス信号とを同期させることによって、1ステップ角度毎に画像が生成される。 By synchronizing the trigger signal output from the camera control unit 21, the turn-on/turn-off signal output from the illumination control unit 23, and the pulse signal output from the motor control unit 24, an image is generated at each step angle.
 図2は、実施の形態1に係る検査装置による検査処理の手順を表わすフローチャートである。 FIG. 2 is a flowchart representing the procedure of inspection processing by the inspection apparatus according to Embodiment 1.
 ステップA1において、照明制御部23は、照明装置5を点灯させる。 In step A1, the lighting control unit 23 turns on the lighting device 5.
 ステップA2において、カメラ制御部21は、第1のカメラ1と第2のカメラ51とに撮像指示を送る。これにより、被検査物体8の表面において、NDフィルタ画像とPLフィルタ画像とが生成される。 In step A2, the camera control unit 21 sends imaging instructions to the first camera 1 and the second camera 51. As a result, an ND filter image and a PL filter image of the surface of the inspected object 8 are generated.
 ステップA3において、検査処理部22は、NDフィルタ画像からPLフィルタ画像を減算することによって、NDフィルタ画像とPLフィルタ画像との差分画像を生成する。具体的には、検査処理部22は、NDフィルタ画像の画素値からPLフィルタ画像の画素値を減算した値を差分画像の画素値とする。 In step A3, the inspection processing unit 22 generates a difference image between the ND filter image and the PL filter image by subtracting the PL filter image from the ND filter image. Specifically, the inspection processing unit 22 uses a value obtained by subtracting the pixel value of the PL filter image from the pixel value of the ND filter image as the pixel value of the difference image.
 ステップA4において、検査処理部22は、予め定められた2値化しきい値を用いて、差分画像の画素値を2値化することによって、2値化画像を生成する。検査処理部22は、差分画像の画素の値が2値化しきい値以上の場合には、その画素の値を「1」とし、差分画像の画素の値が2値化しきい値未満の場合には、その画素の値を「0」とする。 In step A4, the inspection processing unit 22 generates a binarized image by binarizing the pixel values of the difference image using a predetermined binarization threshold. If the value of a pixel in the difference image is equal to or greater than the binarization threshold, the inspection processing unit 22 sets the value of that pixel to "1"; if the value is less than the binarization threshold, it sets the value of that pixel to "0".
 ステップA5において、検査処理部22は、2値化画像をラベリングする。ここでラベリングとは、連結している画素に同じラベル「1」を付加することによって、複数の領域をグループとして分類する処理であり、画像処理において公知の技術である。 At step A5, the inspection processing unit 22 labels the binarized image. Here, labeling is a process of classifying a plurality of regions as a group by adding the same label "1" to connected pixels, and is a well-known technique in image processing.
 ステップA6において、検査処理部22は、「1」のラベルが付加された複数の画素によって形成される領域毎に、その領域の面積を計測する。面積が閾値以上の領域が存在するときには、処理がステップA8に進む。面積が閾値以上の領域が存在しないときには、処理がステップA7に進む。 In step A6, the inspection processing unit 22 measures the area of each region formed by a plurality of pixels labeled "1". If there is a region whose area is equal to or greater than the threshold, the process proceeds to step A8. If there is no region with an area equal to or greater than the threshold, the process proceeds to step A7.
 ステップA7において、検査処理部22は、被検査物体8の表面に埋め込まれた金属異物が存在しないと判定する。 At step A7, the inspection processing unit 22 determines that there is no metal foreign matter embedded in the surface of the object 8 to be inspected.
 ステップA8において、検査処理部22は、被検査物体8の表面に埋め込まれた金属異物が存在すると判定する。 At step A8, the inspection processing unit 22 determines that there is a metal foreign substance embedded in the surface of the object 8 to be inspected.
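Steps A3 through A8 above can be sketched in Python as follows. The 4-connectivity flood-fill labeling, the clamping of negative differences to zero, and the sample thresholds are illustrative assumptions, not details fixed by this description.

```python
from collections import deque

def detect_embedded_metal(nd_img, pl_img, bin_threshold, area_threshold):
    """Sketch of steps A3-A8: difference image -> binarization ->
    labeling of connected pixels -> area check.
    nd_img and pl_img are 2-D lists of pixel values of identical size."""
    h, w = len(nd_img), len(nd_img[0])
    # Step A3: difference image (negative values clamped to 0 here;
    # the text only says the PL pixel value is subtracted from the ND one)
    diff = [[max(nd_img[y][x] - pl_img[y][x], 0) for x in range(w)] for y in range(h)]
    # Step A4: binarization against the binarization threshold
    binary = [[1 if diff[y][x] >= bin_threshold else 0 for x in range(w)] for y in range(h)]
    # Steps A5-A6: label connected "1" pixels (4-connectivity flood fill)
    # and measure the area of each labeled region
    seen = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if binary[y][x] == 1 and not seen[y][x]:
                area, queue = 0, deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    area += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] == 1 and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if area >= area_threshold:
                    return True   # step A8: embedded metallic foreign matter present
    return False                  # step A7: no embedded metallic foreign matter
```

In this sketch a floating foreign matter that appears bright in both images cancels out in the difference, so only regions unique to the ND filter image can exceed the area threshold.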
 図3は、実施の形態1によるNDフィルタ画像とPLフィルタ画像との差分処理を説明するための図である。 FIG. 3 is a diagram for explaining difference processing between an ND filter image and a PL filter image according to Embodiment 1.
 NDフィルタ画像300では、金属異物FW1と浮遊異物FW2とが抽出されている。PLフィルタ画像301では、浮遊異物FW2が抽出されている。 In the ND filter image 300, the metallic foreign matter FW1 and the floating foreign matter FW2 are extracted. Floating foreign matter FW2 is extracted from the PL filtered image 301 .
 NDフィルタ画像300からPLフィルタ画像301を減算すると、差分画像302が生成される。差分画像302では、金属異物FW1のみが抽出される。 By subtracting the PL filtered image 301 from the ND filtered image 300, a difference image 302 is generated. In the difference image 302, only the metallic foreign matter FW1 is extracted.
 以上のように、本実施の形態によれば、NDフィルタ画像とPLフィルタ画像との差分画像を用いることによって、被検査物体の表面に埋め込まれた金属異物のみを検出することができる。 As described above, according to the present embodiment, by using the differential image between the ND filter image and the PL filter image, it is possible to detect only metallic foreign matter embedded in the surface of the object to be inspected.
 実施の形態2. Embodiment 2.
 図4は、実施の形態2の外観検査装置の構成を表わす図である。 FIG. 4 is a diagram showing the configuration of a visual inspection apparatus according to Embodiment 2.
 実施の形態2の外観検査装置は、実施の形態1の制御装置20と相違する制御装置200を備える。 The visual inspection apparatus of the second embodiment includes a control device 200 that differs from the control device 20 of the first embodiment.
 制御装置200は、学習装置101と、学習済モデル記憶部109と、推論装置105と、検査部108とを備える。 The control device 200 includes a learning device 101, a learned model storage unit 109, an inference device 105, and an inspection unit 108.
 学習装置101は、データ取得部102と、モデル生成部103とを備える。 The learning device 101 includes a data acquisition unit 102 and a model generation unit 103.
 データ取得部102は、表面に埋め込まれた金属異物が存在しない被検査物体(以下、良品)の表面を撮影して得られる良品のNDフィルタ画像を学習用データとして取得する。このNDフィルタ画像は、実施の形態1と同様に、第1のカメラ1によって生成することができる。 The data acquisition unit 102 acquires, as learning data, a non-defective ND filter image obtained by photographing the surface of an inspected object having no metallic foreign matter embedded in its surface (hereinafter, a non-defective product). This ND filter image can be generated by the first camera 1 as in the first embodiment.
 モデル生成部103は、学習用データを用いて、良品のNDフィルタ画像から良品のNDフィルタ画像を再構成する学習済モデル104を生成する(良品学習)。 The model generation unit 103 uses the learning data to generate a trained model 104 that reconstructs a good ND filter image from a good ND filter image (good product learning).
 モデル生成部103が用いる学習アルゴリズムとして、教師あり学習、教師なし学習、または強化学習等の公知のアルゴリズムを用いることができる。一例として、ニューラルネットワークを適用した場合について説明する。 As a learning algorithm used by the model generation unit 103, a known algorithm such as supervised learning, unsupervised learning, or reinforcement learning can be used. As an example, a case where a neural network is applied will be described.
 モデル生成部103は、例えば、ニューラルネットワークモデルに従って、いわゆる教師なし学習により、良品学習を実行する。ここで、教師なし学習とは、入力に対して結果(ラベル)のデータ組みを紐づけず、良品のNDフィルタ画像だけを学習装置101に与え、良品のNDフィルタ画像の特徴を学習して、良品のNDフィルタ画像から良品のNDフィルタ画像を再構成する学習済みモデルを生成する手法を言う。なお、教師あり学習とは、入力と結果(ラベル)のデータの組を学習装置に与えることで、それらの学習用データにある特徴を学習し、入力から複数の結果(ラベル)の中から、スコアの高い結果(ラベル)を推論する手法をいう。 The model generation unit 103 executes non-defective product learning by so-called unsupervised learning, for example according to a neural network model. Here, unsupervised learning refers to a technique in which only non-defective ND filter images, without any paired result (label) data, are given to the learning device 101, which learns the features of the non-defective ND filter images and generates a trained model that reconstructs a non-defective ND filter image from a non-defective ND filter image. Supervised learning, by contrast, refers to a technique in which pairs of input and result (label) data are given to a learning device, which learns the features in that learning data and infers, from an input, the result (label) with the highest score among multiple results (labels).
 図5は、ニューラルネットワークの構成を表わす図である。 FIG. 5 is a diagram showing the configuration of a neural network.
 ニューラルネットワークは、複数のニューロンからなる入力層、複数のニューロンからなる中間層(隠れ層)、及び複数のニューロンからなる出力層によって構成される。中間層は、1層、又は2層以上でもよい。 A neural network is composed of an input layer consisting of a plurality of neurons, an intermediate layer (hidden layer) consisting of a plurality of neurons, and an output layer consisting of a plurality of neurons. There may be one intermediate layer, or two or more.
 例えば、図5に示すような3層のニューラルネットワークであれば、複数の入力が入力層(X1~X3)に入力されると、その値に重みW1(w11~w16)を乗算して中間層(Y1~Y2)に入力され、その結果にさらに重みW2(w21~w26)を乗算して出力層(Z1~Z3)から出力される。この出力結果は、重みW1とW2との値によって変わる。 For example, in the three-layer neural network shown in FIG. 5, when a plurality of inputs are given to the input layer (X1 to X3), those values are multiplied by the weights W1 (w11 to w16) and fed to the intermediate layer (Y1 to Y2); the results are further multiplied by the weights W2 (w21 to w26) and output from the output layer (Z1 to Z3). This output result varies depending on the values of the weights W1 and W2.
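A minimal Python sketch of this forward computation is shown below. The weight values are placeholders chosen only for illustration, and the sigmoid activation is an assumption (the document does not specify an activation function).

```python
import math

# Placeholder weights: w11-w16 between the input layer and the
# intermediate layer, and w21-w26 between the intermediate layer and
# the output layer. Real values would be obtained by learning.
W1 = [[0.1, 0.4],  # weights from X1 to Y1, Y2
      [0.2, 0.5],  # weights from X2 to Y1, Y2
      [0.3, 0.6]]  # weights from X3 to Y1, Y2
W2 = [[0.7, 0.9, 0.2],  # weights from Y1 to Z1, Z2, Z3
      [0.8, 0.1, 0.3]]  # weights from Y2 to Z1, Z2, Z3

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def forward(x):
    """Forward pass: inputs X1-X3 -> hidden Y1-Y2 -> outputs Z1-Z3."""
    y = [sigmoid(sum(x[i] * W1[i][j] for i in range(3))) for j in range(2)]
    z = [sigmoid(sum(y[j] * W2[j][k] for j in range(2))) for k in range(3)]
    return z
```

Changing any entry of W1 or W2 changes the three output values, which is what the training procedure exploits when it adjusts the weights.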
 本願において、ニューラルネットワークは、データ取得部102によって取得された良品のNDフィルタ画像に従って、いわゆる教師なし学習により、良品のNDフィルタ画像の特徴を学習する。すなわち、入力層に良品のNDフィルタ画像を入力して出力層から出力される再構成画像が、入力層に入力された良品のNDフィルタ画像に近づくように重みW1とW2とを調整することで学習する。 In the present application, the neural network learns the features of non-defective ND filter images by so-called unsupervised learning, using the non-defective ND filter images acquired by the data acquisition unit 102. That is, it learns by inputting a non-defective ND filter image to the input layer and adjusting the weights W1 and W2 so that the reconstructed image output from the output layer approaches the non-defective ND filter image given to the input layer.
 モデル生成部103に用いられる学習アルゴリズムとしては、特徴量そのものを抽出できるように学習する、深層学習(Deep Learning)、例えば畳み込み処理(Convolutional Neural Network)に従って実行してもよい。モデル生成部103に用いられる学習アルゴリズムとして、オートエンコーダを用いることができる。オートエンコーダは、入力データである良品のNDフィルタ画像から特徴量を抽出するエンコーダと、特徴量から良品のNDフィルタ画像を再構成するデコーダとを備える。エンコーダは、ニューラルネットワークの入力層と中間層とによって構成され、デコーダは、ニューラルネットワークの中間層と出力層とによって構成される。さらに、他の公知の方法、例えば遺伝的プログラミング、機能論理プログラミング、またはサポートベクターマシンなどに従って機械学習を実行してもよい。 The learning algorithm used in the model generation unit 103 may be executed according to deep learning, which learns so as to extract the features themselves, for example convolutional processing (convolutional neural network). An autoencoder can be used as the learning algorithm of the model generation unit 103. The autoencoder includes an encoder that extracts features from a non-defective ND filter image given as input data, and a decoder that reconstructs a non-defective ND filter image from those features. The encoder consists of the input layer and intermediate layer of the neural network, and the decoder consists of the intermediate layer and output layer. Furthermore, machine learning may be performed according to other known methods such as genetic programming, functional logic programming, or support vector machines.
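The encoder/decoder structure described above can be sketched as follows. The weight matrices ENC and DEC are illustrative placeholders standing in for trained weights, and the sigmoid activation is again an assumption.

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

# Placeholder weights: encoder (input -> features) and decoder
# (features -> reconstruction); real values come from training.
ENC = [[0.5, -0.2], [0.1, 0.8], [-0.3, 0.4]]   # 3 inputs -> 2 features
DEC = [[0.9, 0.2, -0.1], [-0.4, 0.6, 0.7]]     # 2 features -> 3 outputs

def encode(x):
    """Extract a low-dimensional feature vector from the input."""
    return [sigmoid(sum(x[i] * ENC[i][j] for i in range(3))) for j in range(2)]

def decode(h):
    """Reconstruct the input from the feature vector."""
    return [sigmoid(sum(h[j] * DEC[j][k] for j in range(2))) for k in range(3)]

def reconstruction_error(x):
    """Mean squared error between the input and its reconstruction;
    after training on non-defective images, a non-defective input
    should yield a small error."""
    r = decode(encode(x))
    return sum((a - b) ** 2 for a, b in zip(x, r)) / len(x)
```

The reconstruction error of a new image is what later distinguishes a non-defective object (small error) from a defective one (large error).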
 モデル生成部103は、以上のような学習を実行することで学習済モデルを生成し、出力する。 The model generation unit 103 generates and outputs a learned model by executing the above learning.
 学習済モデル記憶部109は、モデル生成部103から出力された学習済モデル104を記憶する。 The learned model storage unit 109 stores the learned model 104 output from the model generation unit 103.
 次に、学習装置101による学習手順について説明する。図6は、実施の形態2の学習装置101の学習処理に関するフローチャートである。 Next, the learning procedure by the learning device 101 will be described. FIG. 6 is a flowchart relating to learning processing of the learning device 101 according to the second embodiment.
 ステップa1において、表面に埋め込まれた金属異物が存在しない被検査物体(以下、良品)の表面を撮影して得られる良品のNDフィルタ画像を学習用データとして取得する。また、前処理として、良品のNDフィルタ画像に画像処理を施すことによって得られた被検査物体8の特徴量を表わす画像を良品画像として入力するものとしてもよい。前処理として実施する画像処理は、一例として公知のアルゴリズム、例えば輝度二値化または正規化などを用いることができる。 In step a1, a non-defective ND filter image obtained by photographing the surface of an inspected object having no metallic foreign matter embedded in its surface (hereinafter, a non-defective product) is acquired as learning data. Alternatively, as preprocessing, an image representing the features of the inspected object 8 obtained by applying image processing to the non-defective ND filter image may be input as the non-defective image. The image processing performed as preprocessing can use a known algorithm such as luminance binarization or normalization.
 ステップa2において、モデル生成部103は、データ取得部102によって取得された学習用データを用いて、いわゆる教師なし学習により、良品のNDフィルタ画像から良品のNDフィルタ画像を再構成する学習済モデル104を生成する。 In step a2, the model generation unit 103 uses the learning data acquired by the data acquisition unit 102 to generate, by so-called unsupervised learning, the trained model 104 that reconstructs a non-defective ND filter image from a non-defective ND filter image.
 ステップa3において、学習済モデル記憶部109は、モデル生成部103が生成した学習済モデル104を記憶する。 In step a3, the learned model storage unit 109 stores the learned model 104 generated by the model generation unit 103.
 図4に示すように、推論装置105は、データ取得部106と、推論部107とを備える。 As shown in FIG. 4, the inference device 105 includes a data acquisition unit 106 and an inference unit 107.
 データ取得部106は、被検査物体8の表面を撮影して得られる被検査物体8のNDフィルタ画像を取得する。このNDフィルタ画像は、実施の形態1と同様に、第1のカメラ1によって生成することができる。 The data acquisition unit 106 acquires an ND filter image of the object 8 to be inspected obtained by photographing the surface of the object 8 to be inspected. This ND filter image can be generated by the first camera 1 as in the first embodiment.
 推論部107は、学習済モデル記憶部109に記憶されている良品のNDフィルタ画像から良品のNDフィルタ画像を再構成する学習済モデル104を用いて、データ取得部106で取得した被検査物体8のNDフィルタ画像から被検査物体8のNDフィルタ画像を再構成する。すなわち、この学習済モデルにデータ取得部106で取得した被検査物体8のNDフィルタ画像を入力することで、被検査物体8のNDフィルタ画像の再構成画像を生成することができる。 The inference unit 107 uses the trained model 104, stored in the trained model storage unit 109, that reconstructs a non-defective ND filter image from a non-defective ND filter image, to reconstruct an ND filter image of the inspected object 8 from the ND filter image of the inspected object 8 acquired by the data acquisition unit 106. That is, by inputting the ND filter image of the inspected object 8 acquired by the data acquisition unit 106 into this trained model, a reconstructed image of the ND filter image of the inspected object 8 can be generated.
 被検査物体8が良品の場合には、推論部107によって生成された再構成画像は、データ取得部106で取得し、オートエンコーダに入力された被検査物体のNDフィルタ画像(良品画像)に近い画像となり、再構成が適切に行われる。被検査物体8が良品でない場合には、推論部107によって生成された再構成画像は、データ取得部106で取得し、オートエンコーダに入力された被検査物体のNDフィルタ画像に近い画像とはならず、再構成が適切に行われない。なぜなら、学習済みモデルは、良品のNDフィルタ画像から良品のNDフィルタ画像を再構成するように学習されたものだからである。 If the inspected object 8 is a non-defective product, the reconstructed image generated by the inference unit 107 is close to the ND filter image of the inspected object (a non-defective image) acquired by the data acquisition unit 106 and input to the autoencoder, and the reconstruction is performed properly. If the inspected object 8 is not a non-defective product, the reconstructed image generated by the inference unit 107 is not close to the ND filter image of the inspected object acquired by the data acquisition unit 106 and input to the autoencoder, and the reconstruction is not performed properly. This is because the trained model has been trained to reconstruct non-defective ND filter images from non-defective ND filter images.
 図4に示す検査部108は、推論部107から出力される被検査物体8の再構成画像と、推論部107に入力した被検査物体8のNDフィルタ画像とを比較することによって、被検査物体8の表面に埋め込まれた金属異物が存在するか否かを検査する。比較手法の一例として、例えばZscoreまたはパターンマッチングなどの公知のアルゴリズムを用いることができる。 The inspection unit 108 shown in FIG. 4 inspects whether or not a metallic foreign matter is embedded in the surface of the inspected object 8 by comparing the reconstructed image of the inspected object 8 output from the inference unit 107 with the ND filter image of the inspected object 8 input to the inference unit 107. As an example of the comparison method, a known algorithm such as the Z-score or pattern matching can be used.
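One way the Z-score comparison mentioned above could be realized is sketched below. Flagging a pixel whose reconstruction error deviates by three or more standard deviations is an assumed criterion, not one fixed by this description.

```python
import statistics

def is_metal_embedded(nd_pixels, reconstructed_pixels, z_threshold=3.0):
    """Compare the ND filter image with its reconstruction: compute the
    per-pixel reconstruction error, then flag the object if any pixel's
    error is a statistical outlier (|z-score| >= z_threshold).
    Both arguments are flat lists of pixel values of equal length;
    z_threshold=3.0 is an assumed value."""
    errors = [a - b for a, b in zip(nd_pixels, reconstructed_pixels)]
    mu = statistics.mean(errors)
    sigma = statistics.pstdev(errors)
    if sigma == 0:
        return False  # identical error everywhere -> no localized anomaly
    return any(abs((e - mu) / sigma) >= z_threshold for e in errors)
```

A localized defect produces a few pixels with large reconstruction error, which stand out as outliers against the near-zero errors of the well-reconstructed background.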
 次に、推論装置105による推論手順について説明する。図7は、実施の形態2の推論装置105の推論処理に関するフローチャートである。 Next, the inference procedure by the inference device 105 will be described. FIG. 7 is a flowchart of the inference processing of the inference device 105 according to the second embodiment.
 ステップb1において、データ取得部106は、被検査物体8のNDフィルタ画像を取得する。 At step b1, the data acquisition unit 106 acquires an ND filter image of the object 8 to be inspected.
 ステップb2において、推論部107は、学習済モデル記憶部109に記憶された学習済モデル104に、データ取得部106が取得した被検査物体8のNDフィルタ画像を入力し、再構成画像を得る。 At step b2, the inference unit 107 inputs the ND filter image of the inspection object 8 acquired by the data acquisition unit 106 to the learned model 104 stored in the learned model storage unit 109 to obtain a reconstructed image.
 ステップb3において、推論部107は、得られた再構成画像を検査部108に出力する。 At step b3, the inference unit 107 outputs the obtained reconstructed image to the inspection unit 108.
 ステップb4において、検査部108は、データ取得部106が取得した被検査物体8のNDフィルタ画像と再構成画像とを比較する。検査部108は、パターンマッチングなどによって、2つの画像が類似していると判断したときには、被検査物体8の表面に埋め込まれた金属異物が存在しないと判定することができる。検査部108は、パターンマッチングなどによって、2つの画像が類似していないと判断したときに、被検査物体8の表面に埋め込まれた金属異物が存在すると判定することができる。 At step b4, the inspection unit 108 compares the ND filter image of the inspection object 8 acquired by the data acquisition unit 106 with the reconstructed image. When the inspection unit 108 determines that the two images are similar by pattern matching or the like, it can determine that there is no metal foreign matter embedded in the surface of the object 8 to be inspected. When the inspection unit 108 determines that the two images are not similar by pattern matching or the like, it can determine that there is a metallic foreign substance embedded in the surface of the object 8 to be inspected.
 なお、本実施の形態では、モデル生成部103が用いる学習アルゴリズムに教師なし学習を適用した場合について説明したが、これに限られるものではない。学習アルゴリズムについては、教師なし学習以外にも、強化学習、教師あり学習、または半教師あり学習等を適用することも可能である。 In addition, in the present embodiment, the case where unsupervised learning is applied to the learning algorithm used by the model generating unit 103 has been described, but the present invention is not limited to this. In addition to unsupervised learning, reinforcement learning, supervised learning, or semi-supervised learning can also be applied as learning algorithms.
 モデル生成部103は、形状が類似した複数の樹脂成型品の良品の被検査物体8のNDフィルタ画像を学習用データとして用いて、良品学習を実行してもよい。学習用データを収集する樹脂成型品の良品の被検査物体8を途中で対象のモデルとして追加すること、および対象のモデルから除去することも可能である。さらに、ある樹脂成型品の良品の被検査物体8のNDフィルタ画像を用いて良品学習を実行して得られる学習済みモデルに、さらに別の樹脂成型品の良品の被検査物体8のNDフィルタ画像を用いて良品学習を再度実行するようにしてもよい。 The model generation unit 103 may execute non-defective product learning using, as learning data, ND filter images of non-defective inspected objects 8 of a plurality of resin molded products having similar shapes. It is also possible to add a resin molded product whose non-defective inspected objects 8 supply learning data as a target model midway, or to remove it from the target models. Furthermore, a trained model obtained by executing non-defective product learning with ND filter images of non-defective inspected objects 8 of one resin molded product may be trained again using ND filter images of non-defective inspected objects 8 of another resin molded product.
 以上のように、本実施の形態によれば、被検査物体のNDフィルタ画像を推論装置105に入力するだけで、被検査物体8の良否、すなわち表面に埋め込まれた金属異物の有無を判定することができる。 As described above, according to the present embodiment, the quality of the inspected object 8, that is, the presence or absence of metallic foreign matter embedded in its surface, can be determined simply by inputting the ND filter image of the inspected object into the inference device 105.
 実施の形態3. Embodiment 3.
 図8は、実施の形態3の外観検査装置の構成を表わす図である。 FIG. 8 is a diagram showing the configuration of a visual inspection apparatus according to Embodiment 3.
 実施の形態3の外観検査装置は、実施の形態2の制御装置200と相違する制御装置200Aを備える。 A visual inspection apparatus according to the third embodiment includes a control device 200A that is different from the control device 200 according to the second embodiment.
 制御装置200Aは、学習装置101、推論装置105、学習済モデル記憶部109、および検査部108に代えて、学習装置101A、推論装置105A、学習済モデル記憶部109A、および表示部120を備える。 The control device 200A includes a learning device 101A, an inference device 105A, a learned model storage unit 109A, and a display unit 120 instead of the learning device 101, the inference device 105, the learned model storage unit 109, and the inspection unit 108.
 学習装置101Aは、データ取得部102Aと、モデル生成部103Aとを備える。 The learning device 101A includes a data acquisition unit 102A and a model generation unit 103A.
 データ取得部102Aは、被検査物体8の表面を撮影して得られる被検査物体8のNDフィルタ画像と、被検査物体8の表面に埋め込まれた金属異物が存在するか否かを表わす識別データとを学習用データとして取得する。 The data acquisition unit 102A acquires, as learning data, an ND filter image of the inspected object 8 obtained by photographing its surface and identification data indicating whether or not a metallic foreign matter is embedded in the surface of the inspected object 8.
 モデル生成部103Aは、学習用データを用いて、被検査物体8の表面を撮影して得られる被検査物体8のNDフィルタ画像から被検査物体8の表面に埋め込まれた金属異物が存在するか否かを識別する学習済モデル104Aを生成する。 Using the learning data, the model generation unit 103A generates a trained model 104A that identifies, from an ND filter image of the inspected object 8 obtained by photographing its surface, whether or not a metallic foreign matter is embedded in the surface of the inspected object 8.
 モデル生成部103Aが用いる学習アルゴリズムは、教師あり学習、教師なし学習、または強化学習等の公知のアルゴリズムを用いることができる。一例として、ニューラルネットワークを適用した場合について説明する。 A known algorithm such as supervised learning, unsupervised learning, or reinforcement learning can be used as the learning algorithm used by the model generation unit 103A. As an example, a case where a neural network is applied will be described.
 モデル生成部103Aは、例えば、ニューラルネットワークモデルに従って、いわゆる教師あり学習により、被検査物体8の表面に埋め込まれた金属異物が存在するか否かを学習する。ここで、教師あり学習とは、入力と結果(ラベル)のデータの組を学習装置101Aに与えることで、それらの学習用データにある特徴を学習し、入力から結果を推論する手法をいう。 The model generation unit 103A learns whether or not there is a metallic foreign substance embedded in the surface of the inspection object 8 by so-called supervised learning according to, for example, a neural network model. Here, supervised learning refers to a technique in which input and result (label) data sets are given to the learning device 101A to learn features in the learning data, and the result is inferred from the input.
 本願において、ニューラルネットワークは、データ取得部102Aによって取得される被検査物体8のNDフィルタ画像と、被検査物体8の表面に埋め込まれた金属異物が存在するか否かを表わす識別データとを含む学習用データを用いて、いわゆる教師あり学習により、被検査物体8の表面に埋め込まれた金属異物が存在するか否かを学習する。 In the present application, the neural network includes an ND-filtered image of the inspected object 8 acquired by the data acquisition unit 102A, and identification data representing whether or not there is a metal foreign substance embedded in the surface of the inspected object 8. Using the learning data, it is learned whether or not there is a metal foreign substance embedded in the surface of the object 8 to be inspected by so-called supervised learning.
 すなわち、ニューラルネットワークは、入力層にNDフィルタ画像を入力して出力層から出力された結果が、識別データ(正解)に近づくように重みW1とW2とを調整することで学習する。 That is, the neural network learns by adjusting the weights W1 and W2 so that the ND filter image is input to the input layer and the result output from the output layer approaches the identification data (correct answer).
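As a toy illustration of adjusting weights so that the output approaches the identification data (correct answer), the following sketch trains a single sigmoid unit by gradient descent. The two-dimensional feature vectors and labels are made up for illustration; a real model would take ND filter images as input and have many more parameters.

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def train(samples, labels, lr=0.5, epochs=2000):
    """Adjust weights w and bias b so that the sigmoid output for each
    sample approaches its label (1 = foreign matter present, 0 = absent)."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(samples, labels):
            z = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            g = z - t  # gradient of cross-entropy loss w.r.t. pre-activation
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    return 1 if sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5 else 0

# Made-up feature vectors: the first two stand for defective objects,
# the last two for non-defective ones.
samples = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]]
labels = [1, 1, 0, 0]
w, b = train(samples, labels)
```

After training, `predict(w, b, x)` returns the inferred identification result for a new feature vector `x`.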
 モデル生成部103Aは、以上のような学習を実行することで学習済モデル104Aを生成し、出力する。 The model generation unit 103A generates and outputs the learned model 104A by executing the above learning.
 学習済モデル記憶部109Aは、モデル生成部103Aから出力された学習済モデル104Aを記憶する。 The learned model storage unit 109A stores the learned model 104A output from the model generation unit 103A.
 次に、学習装置101Aによる学習手順について説明する。図9は、実施の形態3の学習装置101Aの学習処理に関するフローチャートである。 Next, the learning procedure by the learning device 101A will be described. FIG. 9 is a flowchart relating to learning processing of the learning device 101A according to the third embodiment.
 ステップc1において、データ取得部102Aは、被検査物体8を撮影して得られる被検査物体8のNDフィルタ画像と、被検査物体8の表面に埋め込まれた金属異物が存在するか否かを表わす識別データ(正解)との組み合わせを含む学習用データを取得する。 In step c1, the data acquisition unit 102A acquires learning data including combinations of an ND filter image of the inspected object 8 obtained by photographing the inspected object 8 and identification data (correct answers) indicating whether or not a metallic foreign matter is embedded in the surface of the inspected object 8.
 ステップc2において、モデル生成部103Aは、データ取得部102Aで取得した学習用データに基づいて、いわゆる教師あり学習により、被検査物体8のNDフィルタ画像から被検査物体8の表面に埋め込まれた金属異物が存在するか否かを識別する学習済モデル104Aを生成する。 In step c2, the model generation unit 103A generates, by so-called supervised learning based on the learning data acquired by the data acquisition unit 102A, the trained model 104A that identifies from the ND filter image of the inspected object 8 whether or not a metallic foreign matter is embedded in the surface of the inspected object 8.
 ステップc3において、学習済モデル記憶部109Aは、モデル生成部103Aが生成した学習済モデル104Aを記憶する。 At step c3, the learned model storage unit 109A stores the learned model 104A generated by the model generation unit 103A.
 図8の推論装置105Aは、データ取得部106Aと、推論部107Aとを備える。 The inference device 105A in FIG. 8 includes a data acquisition unit 106A and an inference unit 107A.
 データ取得部106Aは、被検査物体8の表面を撮影して得られる被検査物体8のNDフィルタ画像を取得する。 The data acquisition unit 106A acquires an ND filter image of the inspected object 8 obtained by photographing its surface.
 推論部107Aは、学習済モデル記憶部109Aに記憶されている学習済モデル104Aを利用して、データ取得部106Aで取得した被検査物体8のNDフィルタ画像から被検査物体8の表面に金属異物が存在するか否かを推論する。すなわち、この学習済モデルにデータ取得部106Aで取得した被検査物体8のNDフィルタ画像を入力することで、このNDフィルタ画像から推論される被検査物体8の表面に埋め込まれた金属異物が存在するか否かを識別する識別データを出力することができる。 The inference unit 107A uses the trained model 104A stored in the trained model storage unit 109A to infer, from the ND filter image of the inspected object 8 acquired by the data acquisition unit 106A, whether or not a metallic foreign matter exists in the surface of the inspected object 8. That is, by inputting the ND filter image of the inspected object 8 acquired by the data acquisition unit 106A into this trained model, identification data inferred from the ND filter image, indicating whether or not a metallic foreign matter is embedded in the surface of the inspected object 8, can be output.
 次に、推論装置105Aによる推論手順について説明する。図10は、実施の形態3の推論装置105Aの推論処理に関するフローチャートである。 Next, the inference procedure by the inference device 105A will be explained. FIG. 10 is a flowchart of the inference processing of the inference device 105A of the third embodiment.
 ステップd1において、データ取得部106Aは、被検査物体8の表面を撮影することによって得られる被検査物体8のNDフィルタ画像を取得する。 At step d1, the data acquisition unit 106A acquires an ND filter image of the object 8 to be inspected by photographing the surface of the object 8 to be inspected.
 ステップd2において、推論部107Aは、学習済モデル記憶部109Aに記憶された学習済モデル104Aにデータ取得部106Aで取得した被検査物体8のNDフィルタ画像を入力し、被検査物体8の表面に埋め込まれた金属異物が存在するか否かを識別する識別データを得る。 In step d2, the inference unit 107A inputs the ND filter image of the inspected object 8 acquired by the data acquisition unit 106A into the trained model 104A stored in the trained model storage unit 109A, and obtains identification data indicating whether or not a metallic foreign matter is embedded in the surface of the inspected object 8.
 ステップd3において、推論部107Aは、得られた識別データを表示部120に出力する。 At step d3, the inference unit 107A outputs the obtained identification data to the display unit 120.
 ステップd4において、表示部120は、被検査物体8の表面に埋め込まれた金属異物が存在するか否かを表わす情報を表示することができる。 At step d4, the display unit 120 can display information indicating whether or not there is a metal foreign substance embedded in the surface of the object 8 to be inspected.
 以上のように、本実施の形態によれば、被検査物体のNDフィルタ画像を推論装置105Aに入力するだけで、被検査物体8の良否、すなわち表面に埋め込まれた金属異物の有無を判定することができる。 As described above, according to the present embodiment, the quality of the inspected object 8, that is, the presence or absence of metallic foreign matter embedded in its surface, can be determined simply by inputting the ND filter image of the inspected object into the inference device 105A.
 実施の形態4. Embodiment 4.
 図11は、実施の形態4の外観検査装置の構成を表わす図である。 FIG. 11 is a diagram showing the configuration of a visual inspection apparatus according to Embodiment 4.
 実施の形態4の外観検査装置が、実施の形態1の外観検査装置と相違する点は、実施の形態4の外観検査装置が、照明部61に代えて照明部61Aを備える点である。 The visual inspection apparatus of the fourth embodiment differs from the visual inspection apparatus of the first embodiment in that the visual inspection apparatus of the fourth embodiment includes an illumination section 61A instead of the illumination section 61.
 照明部61Aは、被検査物体8を撮影するときの明るさを調整する。照明部61Aは、被検査物体8に対して、特定の偏光角度成分のみを有する照明光L1および照射光L2を照射する。照明部61Aは、照明部61と同様に、照明装置5と偏光板11とを備える。照明部61Aは、さらに、照明装置54と偏光板55とを備える。照明装置54は、光を照射する。偏光板55は、照明装置54から照射される光の成分のうち、特定の偏光角度成分だけを有する照射光L2だけを出射する。特定の偏光角度成分だけを有する照射光L2が被検査物体8に照射される。 The illumination unit 61A adjusts the brightness when photographing the inspected object 8. The illumination unit 61A irradiates the inspected object 8 with illumination light L1 and irradiation light L2 having only specific polarization angle components. Like the illumination unit 61, the illumination unit 61A includes the illumination device 5 and the polarizing plate 11. The illumination unit 61A further includes an illumination device 54 and a polarizing plate 55. The illumination device 54 emits light. Of the light components emitted from the illumination device 54, the polarizing plate 55 passes only the irradiation light L2 having a specific polarization angle component. The irradiation light L2 having only the specific polarization angle component is irradiated onto the inspected object 8.
 照射光L1と照射光L2とが被検査物体8に入射すると、拡散反射の反射光LB1が発生し、ビームスプリッタ4に入射する。 When the irradiation light L1 and the irradiation light L2 are incident on the object 8 to be inspected, reflected light LB1 of diffuse reflection is generated and is incident on the beam splitter 4 .
 ビームスプリッタ4は、2個の直角プリズムによって構成されたキューブ型ビームスプリッタである。ビームスプリッタ4は、反射光LB1が入射する方向に配置される。ビームスプリッタ4は、入射光を定められた分割比で2つの光に分割する。ビームスプリッタ4に入射した反射光LB1は、反射光LB2と反射光LB3とに分岐される。反射光LB2は、第1のカメラ1に入射される。反射光LB3は、第2のカメラ51に入射される。 The beam splitter 4 is a cubic beam splitter composed of two rectangular prisms. The beam splitter 4 is arranged in the direction in which the reflected light LB1 is incident. The beam splitter 4 splits the incident light into two lights with a defined splitting ratio. Reflected light LB1 incident on beam splitter 4 is split into reflected light LB2 and reflected light LB3. Reflected light LB2 is incident on first camera 1 . Reflected light LB3 is incident on second camera 51 .
The first camera 1 is arranged at a position different from that in Embodiment 1: it photographs the surface of the object to be inspected 8 from the normal direction of the object. The positional relationship between the first camera 1 and the second camera 51 is the same as in Embodiment 1, so with the first camera 1 moved, the second camera 51 is likewise arranged at an offset position.
FIG. 12 is a flowchart showing the procedure of the inspection processing by the inspection apparatus according to Embodiment 4.
This flowchart adds, to the processing shown in FIG. 2 of Embodiment 1, a step that dynamically sets the inspection range.
In step B1, the illumination control unit 23 turns on the illumination device 5 and the illumination device 54.
In step B2, the camera control unit 21 sends imaging instructions to the first camera 1 and the second camera 51. As a result, an ND filter image and a PL filter image of the surface of the object to be inspected 8 are generated.
In step B3, the inspection range is set.
FIG. 13 is a diagram for explaining the inspection range. The ND filter image 400 includes a captured image of the object to be inspected 401. On the object 401 there are areas where the light emitted from the illumination devices 5 and 54 is specularly reflected; these areas are indicated by rectangles 402 and 403. The illuminances of the illumination devices 5 and 54 are set in advance so that the specularly reflecting areas reach the first camera 1 at a luminance value of 255 (the maximum). The inspection processing unit 22 detects the areas whose luminance value is 255, that is, the rectangles 402 and 403, and sets the gap between them as the inspection range. If a metallic foreign matter were embedded within a specularly reflecting area, it would also have a luminance value of 255 and could not be detected; excluding the specularly reflecting areas from the inspection range resolves this problem.
If a flat metallic foreign matter is embedded in the surface within the area between the rectangles 402 and 403, the foreign matter generates specularly reflected light and diffusely reflected light simultaneously. A floating foreign matter, by contrast, generates only diffusely reflected light.
In step B4, the inspection processing unit 22 generates a difference image by subtracting the PL filter image from the ND filter image. The light transmitted through the ND filter 3 includes the specularly and diffusely reflected light from metallic foreign matter as well as the diffusely reflected light from floating foreign matter, so both kinds of foreign matter appear bright in the ND filter image. Both also appear bright in the PL filter image, but since the light transmitted through the PL filter 53 includes only the diffusely reflected light from metallic foreign matter, the metallic foreign matter is darker in the PL filter image than in the ND filter image. Therefore, only metallic foreign matter can be inspected from the difference image between the ND filter image and the PL filter image. Specifically, the inspection processing unit 22 sets each pixel value of the difference image to the pixel value of the ND filter image minus the corresponding pixel value of the PL filter image.
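A minimal sketch of the step-B4 subtraction, assuming 8-bit images held as NumPy arrays; clipping negative results to 0 is an assumption here, since the patent only states that the PL pixel values are subtracted from the ND pixel values:

```python
import numpy as np

def difference_image(nd, pl):
    # Embedded metal: bright in ND (specular + diffuse) but dim in PL
    # (diffuse only), so it survives the subtraction. Floating debris is
    # similarly bright in both images and largely cancels out.
    wide = nd.astype(np.int16) - pl.astype(np.int16)  # avoid uint8 wraparound
    return np.clip(wide, 0, 255).astype(np.uint8)
```

Widening to `int16` before subtracting avoids the modular wraparound that plain `uint8` arithmetic would produce where the PL pixel is brighter than the ND pixel.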
In step B5, the inspection processing unit 22 generates a binarized image by binarizing the pixel values of the difference image using a predetermined binarization threshold: a pixel whose value in the difference image is equal to or greater than the threshold is set to "1", and a pixel whose value is less than the threshold is set to "0".
In step B6, the inspection processing unit 22 labels the binarized image. Labeling here is the process of classifying a plurality of regions into groups by assigning the same label "1" to connected pixels, and is a well-known technique in image processing.
In step B7, the inspection processing unit 22 measures the area of each region formed by the connected pixels labeled "1". If any region has an area equal to or greater than a threshold, the processing proceeds to step B9; otherwise, the processing proceeds to step B8.
In step B8, the inspection processing unit 22 determines that no metallic foreign matter is embedded in the surface of the object to be inspected 8.
In step B9, the inspection processing unit 22 determines that metallic foreign matter is embedded in the surface of the object to be inspected 8.
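Steps B5 through B9 can be sketched together as follows. The threshold values, the pure-Python flood fill, and the choice of 4-connectivity are illustrative assumptions; the patent specifies neither the connectivity nor the concrete thresholds:

```python
import numpy as np

def has_embedded_metal(diff, bin_thresh=40, area_thresh=5):
    """Binarize the difference image (step B5), label connected regions
    (step B6), and measure each region's area (step B7). Returns True when
    any region reaches area_thresh (step B9), else False (step B8)."""
    binary = diff >= bin_thresh            # step B5: binarized image
    visited = np.zeros_like(binary, dtype=bool)
    h, w = binary.shape
    for y in range(h):
        for x in range(w):
            if binary[y, x] and not visited[y, x]:
                # Flood-fill one labeled region (4-connectivity) and
                # count its pixels as the region's area.
                stack, area = [(y, x)], 0
                visited[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    area += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                if area >= area_thresh:
                    return True            # step B9: embedded metal present
    return False                           # step B8: no embedded metal
```

In practice a library labeler (for example `scipy.ndimage.label`) would replace the hand-rolled flood fill; the loop above only makes the region-growing explicit.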
In the present embodiment, the illumination devices 5 and 54 are described as bar-shaped illumination devices, but ring-shaped illumination devices may be used instead. Two illumination devices, 5 and 54, are installed here to widen the inspection range, but it is also possible to install only a single bar-shaped illumination device.
As described above, according to the present embodiment, using the difference image between the ND filter image and the PL filter image makes it possible to detect only metallic foreign matter embedded in the surface of the object to be inspected.
FIG. 14 is a diagram showing the hardware configuration of the control devices 20, 200, and 200A.
The operations of the control devices 20, 200, and 200A can be implemented in digital circuit hardware or in software. When the functions of the control devices 20, 200, and 200A are realized in software, each control device can, for example, as shown in FIG. 14, include a processor 1001 and a memory 1002 connected by a bus 1003, with the processor 1001 executing a program stored in the memory 1002.
The embodiments disclosed herein should be considered illustrative in all respects and not restrictive. The scope of the present disclosure is indicated by the claims rather than by the above description, and is intended to include all modifications within the meaning and scope equivalent to the claims.
1 first camera, 2, 52 lens, 3 ND filter, 4 beam splitter, 5, 54 illumination device, 6 motor, 7 drive shaft, 8 object to be inspected, 9 jig, 10 power supply device, 11, 55 polarizing plate, 20, 200, 200A control device, 21 camera control unit, 22 inspection processing unit, 23 illumination control unit, 24 motor control unit, 51 second camera, 53 PL filter, 61, 61A illumination unit, 101, 101A learning device, 102, 102A, 106, 106A data acquisition unit, 103, 103A model generation unit, 104, 104A trained model, 105, 105A inference device, 107, 107A inference unit, 108 inspection unit, 109, 109A trained-model storage unit, 120 display unit, 1001 processor, 1002 memory, 1003 bus.

Claims (12)

1.  A visual inspection apparatus comprising:
    a first camera that has an ND filter and photographs the surface of an object to be inspected to generate an ND filter image;
    a second camera that has a PL filter and photographs the surface of the object to be inspected to generate a PL filter image; and
    a control device that inspects for the presence or absence of metallic foreign matter embedded in the surface of the object to be inspected based on a difference image between the ND filter image and the PL filter image,
    wherein the control device inspects for the presence or absence of the metallic foreign matter embedded in the surface of the object to be inspected based on the size of the area of each region of connected pixels having values equal to or greater than a binarization threshold in the difference image.
2.  The visual inspection apparatus according to claim 1, further comprising an illumination unit that irradiates the surface of the object to be inspected with irradiation light having only a specific polarization angle component.
3.  The visual inspection apparatus according to claim 1 or 2, further comprising a beam splitter that is arranged in the direction in which the irradiation light is specularly reflected and splits the reflected light of the irradiation light into two,
    wherein the first camera photographs the surface of the object to be inspected from the specular reflection direction, and
    the second camera photographs the surface of the object to be inspected from a direction perpendicular to the specular reflection direction.
4.  The visual inspection apparatus according to any one of claims 1 to 3, wherein the first camera and the second camera photograph the surface of the object to be inspected simultaneously.
5.  The visual inspection apparatus according to any one of claims 1 to 4, wherein the control device inspects for the presence or absence of metallic foreign matter embedded in the surface of the object to be inspected using, as the inspection range, the portion of the object to be inspected excluding the specularly reflecting areas.
6.  A visual inspection method comprising the steps of:
    a first camera having an ND filter photographing the surface of an object to be inspected to generate an ND filter image;
    a second camera having a PL filter photographing the surface of the object to be inspected to generate a PL filter image; and
    a control device inspecting for the presence or absence of metallic foreign matter embedded in the surface of the object to be inspected based on a difference image between the ND filter image and the PL filter image.
7.  The visual inspection method according to claim 6, wherein the inspecting step includes inspecting for the presence or absence of metallic foreign matter embedded in the surface of the object to be inspected using, as the inspection range, the portion of the object to be inspected excluding the specularly reflecting areas.
8.  A learning device comprising:
    a data acquisition unit that acquires learning data including an ND filter image of a non-defective product, the non-defective product being an object to be inspected having no metallic foreign matter embedded in its surface, the ND filter image being obtained by photographing the surface of the non-defective product; and
    a model generation unit that uses the learning data to generate a trained model that reconstructs the ND filter image of the non-defective product from the ND filter image of the non-defective product.
9.  An inference device comprising:
    a data acquisition unit that acquires an ND filter image of an object to be inspected obtained by photographing the surface of the object to be inspected; and
    an inference unit that reconstructs an ND filter image of the object to be inspected from the ND filter image acquired by the data acquisition unit, using a trained model that reconstructs an ND filter image of a non-defective product from an ND filter image of the non-defective product obtained by photographing the surface of the non-defective product, the non-defective product being an object to be inspected having no metallic foreign matter embedded in its surface.
10.  A visual inspection apparatus comprising:
    a camera that has an ND filter and generates an ND filter image of the object to be inspected;
    the inference device according to claim 9; and
    an inspection unit that inspects whether metallic foreign matter is embedded in the surface of the object to be inspected by comparing the ND filter image of the object to be inspected acquired by the data acquisition unit with the reconstructed ND filter image of the object to be inspected.
11.  A learning device comprising:
    a data acquisition unit that acquires learning data including an ND filter image of an object to be inspected obtained by photographing the surface of the object to be inspected, and identification data indicating whether metallic foreign matter is embedded in the surface of the object to be inspected; and
    a model generation unit that uses the learning data to generate a trained model that identifies, from an ND filter image of the object to be inspected obtained by photographing its surface, whether metallic foreign matter is embedded in the surface of the object to be inspected.
12.  An inference device comprising:
    a data acquisition unit that acquires an ND filter image of an object to be inspected obtained by photographing the surface of the object to be inspected; and
    an inference unit that infers whether metallic foreign matter exists on the surface of the object to be inspected from the ND filter image acquired by the data acquisition unit, using a trained model for inferring, from an ND filter image of the object to be inspected obtained by photographing its surface, whether metallic foreign matter is embedded in the surface of the object to be inspected.
PCT/JP2021/043248 2021-06-03 2021-11-25 Appearance inspection device, appearance inspection method, learning device, and inference device WO2022254747A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023525357A JP7483135B2 (en) 2021-06-03 2021-11-25 Visual inspection device and visual inspection method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021093691 2021-06-03
JP2021-093691 2021-06-03

Publications (1)

Publication Number Publication Date
WO2022254747A1 true WO2022254747A1 (en) 2022-12-08

Family

ID=84324051

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/043248 WO2022254747A1 (en) 2021-06-03 2021-11-25 Appearance inspection device, appearance inspection method, learning device, and inference device

Country Status (2)

Country Link
JP (1) JP7483135B2 (en)
WO (1) WO2022254747A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10221268A (en) * 1997-02-05 1998-08-21 Advantest Corp Method and device for detecting surface state of wafer
JP2001013261A (en) * 1999-06-30 2001-01-19 Mitsubishi Heavy Ind Ltd Contamination detection method and device
JP2016105052A (en) * 2014-12-01 2016-06-09 東レエンジニアリング株式会社 Substrate inspection device
JP2019060780A (en) * 2017-09-27 2019-04-18 ファナック株式会社 Inspection device and inspection system
WO2020031984A1 (en) * 2018-08-08 2020-02-13 Blue Tag株式会社 Component inspection method and inspection system
JP2020193890A (en) * 2019-05-29 2020-12-03 ヴィスコ・テクノロジーズ株式会社 Visual inspection apparatus

Also Published As

Publication number Publication date
JPWO2022254747A1 (en) 2022-12-08
JP7483135B2 (en) 2024-05-14


Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21944239; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2023525357; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21944239; Country of ref document: EP; Kind code of ref document: A1)