WO2018235266A1 - Inspection device, inspection method, and inspection program - Google Patents


Info

Publication number
WO2018235266A1
Authority
WO
WIPO (PCT)
Prior art keywords
abnormality
partial
determined
partial image
contain
Prior art date
Application number
PCT/JP2017/023213
Other languages
English (en)
Japanese (ja)
Inventor
宏季 遠野
Original Assignee
株式会社Rist
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Rist
Priority to PCT/JP2017/023213 (WO2018235266A1)
Priority to JP2018515903A (JP6387477B1)
Publication of WO2018235266A1

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis

Definitions

  • the present invention relates to an inspection apparatus, an inspection method, and an inspection program.
  • an image of an inspection object flowing in a production line is photographed to detect the presence or absence of an abnormality regarding the inspection object.
  • the abnormality of the inspection object is, for example, a scratch or dirt attached to the inspection object, or a foreign substance mixed in the inspection object.
  • a known inspection apparatus includes a first processing unit that classifies a test target signal as normal or non-normal using a first learned neural network, and a second processing unit that, using a second learned neural network, extracts from a test target signal classified as non-normal a partial signal containing the non-normal area and classifies the type of abnormality in that partial signal.
  • in such an apparatus, the first processing unit detects the presence or absence of an abnormality in the inspection object, and the second processing unit classifies the type of abnormality.
  • the detection accuracy therefore depends on the accuracy with which the first processing unit separates normal from non-normal signals; when the possible abnormalities are varied, the classification accuracy of the first processing unit is not sufficiently high, so the detection accuracy of the presence or absence of an abnormality is not necessarily high either.
  • an object of the present invention is to provide an inspection apparatus, an inspection method, and an inspection program capable of accurately detecting the presence or absence of an abnormality related to an inspection object even if there are various possible abnormalities.
  • An inspection apparatus includes a first division unit that divides an image of an inspection object into a plurality of first partial images; a second division unit that divides the image into a plurality of second partial images different from the plurality of first partial images; a first classification unit that classifies the plurality of first partial images into first partial images determined to contain an abnormality and first partial images determined not to contain an abnormality; a second classification unit that classifies the plurality of second partial images into second partial images determined to contain an abnormality and second partial images determined not to contain an abnormality; and a determination unit that determines the presence or absence of an abnormality in the inspection object based on the overlap between a first partial image determined to contain an abnormality and a second partial image determined to contain an abnormality.
  • because the image is divided into a plurality of first partial images and a plurality of second partial images different from them, and each set is classified independently, erroneous determinations are reduced, and the presence or absence of an abnormality in the inspection object can be detected with high accuracy even when the possible abnormalities are varied.
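The two-scale overlap determination summarized above can be sketched in a few lines (a minimal illustration under assumed 4x4 and 5x5 grids; `classify_first` and `classify_second` are hypothetical stand-ins for the learned classifiers):

```python
def split(w, h, nx, ny):
    """Divide a w x h image into an nx x ny grid of rectangles (x0, y0, x1, y1)."""
    tw, th = w // nx, h // ny
    return [(i * tw, j * th, (i + 1) * tw, (j + 1) * th)
            for j in range(ny) for i in range(nx)]

def overlaps(a, b):
    """True when two rectangles share a non-empty area."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def detect(w, h, classify_first, classify_second):
    """Report an abnormality only where abnormal tiles from both scales overlap."""
    first = [r for r in split(w, h, 4, 4) if classify_first(r)]
    second = [r for r in split(w, h, 5, 5) if classify_second(r)]
    return [(a, b) for a in first for b in second if overlaps(a, b)]
```

A tile flagged at only one scale produces no overlapping pair, which is exactly the mechanism by which an erroneous determination at a single scale is suppressed.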
  • when a first partial image determined to contain an abnormality and a second partial image determined to contain an abnormality overlap each other, the determination unit may determine that an abnormality exists in the overlapping portion.
  • when a first partial image determined to contain an abnormality overlaps only second partial images determined not to contain an abnormality, or a second partial image determined to contain an abnormality overlaps only first partial images determined not to contain an abnormality, the determination unit may determine that no abnormality exists in the portion of the partial image determined to contain the abnormality.
  • in this way, an erroneous determination that an abnormality exists is prevented when only one of the two classifications flags it.
  • the apparatus may further include a display unit that displays the image of the inspection object with the first partial images determined to contain an abnormality distinguishable from those determined not to, and with the second partial images determined to contain an abnormality distinguishable from those determined not to.
  • the display unit may also display the first partial images determined to contain an abnormality distinguishably from the second partial images determined to contain an abnormality, so that differences between the classification results of the first classification unit and the second classification unit can be easily grasped.
  • each of the plurality of second partial images may be smaller than each of the plurality of first partial images.
  • by determining whether an abnormality is contained both in the plurality of first partial images and in the smaller plurality of second partial images, the presence or absence of an abnormality can be determined accurately even when the possible abnormalities are varied.
  • At least two of the plurality of second partial images may overlap with any one of the plurality of first partial images.
  • even when the sizes and shapes of abnormalities vary, the presence or absence of an abnormality can then be determined with high accuracy.
  • the first division unit may divide the image such that the plurality of first partial images do not overlap each other, and the second division unit may divide the image such that the plurality of second partial images do not overlap each other.
  • the first classification unit may include a first learned model trained with first partial images containing an abnormality and first partial images not containing an abnormality as learning data, and the second classification unit may include a second learned model trained with second partial images containing an abnormality and second partial images not containing an abnormality as learning data.
  • the first classification unit and the second classification unit can then classify the presence or absence of an abnormality from different viewpoints.
  • An inspection method causes a computer to execute: dividing an image of an inspection object into a plurality of first partial images; dividing the image into a plurality of second partial images different from the plurality of first partial images; classifying the plurality of first partial images into first partial images determined to contain an abnormality and first partial images determined not to contain an abnormality; classifying the plurality of second partial images into second partial images determined to contain an abnormality and second partial images determined not to contain an abnormality; and determining the presence or absence of an abnormality in the inspection object based on the overlap between a first partial image determined to contain an abnormality and a second partial image determined to contain an abnormality.
  • because the image is divided into a plurality of first partial images and a plurality of second partial images different from them, and each set is classified independently, erroneous determinations are reduced, and the presence or absence of an abnormality in the inspection object can be detected with high accuracy even when the possible abnormalities are varied.
  • An inspection program causes a computer to function as: a first division unit that divides an image of an inspection object into a plurality of first partial images; a second division unit that divides the image into a plurality of second partial images different from the plurality of first partial images; a first classification unit that classifies the plurality of first partial images into first partial images determined to contain an abnormality and first partial images determined not to contain an abnormality; a second classification unit that classifies the plurality of second partial images into second partial images determined to contain an abnormality and second partial images determined not to contain an abnormality; and a determination unit that determines the presence or absence of an abnormality in the inspection object based on the overlap between a first partial image determined to contain an abnormality and a second partial image determined to contain an abnormality.
  • because the image is divided into a plurality of first partial images and a plurality of second partial images different from them, and each set is classified independently, erroneous determinations are reduced, and the presence or absence of an abnormality in the inspection object can be detected with high accuracy even when the possible abnormalities are varied.
  • according to the present invention, there are provided an inspection apparatus, an inspection method, and an inspection program that can accurately detect the presence or absence of an abnormality in an inspection object even if there are various possible abnormalities.
  • FIG. 1 is a diagram showing a network configuration of an inspection apparatus 10 according to an embodiment of the present invention.
  • the inspection apparatus 10 is a dedicated or general-purpose computer, and may be, for example, a general-purpose computer on which an inspection program according to the present embodiment is installed.
  • the inspection apparatus 10 is connected to the learning image database DB and the camera 30 via the communication network N.
  • the communication network N is a wired or wireless communication network, and may be, for example, a LAN (Local Area Network) or the Internet.
  • the inspection apparatus 10 acquires an image of the inspection object 100 captured by the camera 30 via the communication network N, and determines whether the inspection object 100 has an abnormality based on a learned model.
  • the learned model is, for example, a learned neural network.
  • the inspection apparatus 10 learns a model using an image stored in the learning image database DB.
  • the camera 30 captures an image of the inspection object 100. All or part of the image captured by the camera 30 is stored in the learning image database DB.
  • the camera 30 may be a general digital camera that captures visible light, a camera that captures infrared light or X-rays, or a device such as an electron microscope that images the inspection object 100 by means other than light.
  • FIG. 2 is a diagram showing a physical configuration of the inspection apparatus 10 according to the embodiment of the present invention.
  • the inspection device 10 includes a central processing unit (CPU) 10a corresponding to a hardware processor, a random access memory (RAM) 10b corresponding to a memory, a read only memory (ROM) 10c corresponding to a memory, a communication interface 10d, an input unit 10e, and a display unit 10f. These components are mutually connected so as to be able to transmit and receive data via a bus.
  • the CPU 10a is a control unit that performs control related to the execution of a program stored in the RAM 10b or the ROM 10c, and performs calculation and processing of data.
  • the CPU 10a is an arithmetic device that executes a program (inspection program) related to inspection of an inspection object.
  • the CPU 10a receives various input data from the input unit 10e and the communication interface 10d, and displays the calculation result of the input data on the display unit 10f or stores it in the RAM 10b or the ROM 10c.
  • the RAM 10b is a storage unit capable of rewriting data, and is formed of, for example, a semiconductor storage element.
  • the RAM 10b stores programs and data, such as applications, executed by the CPU 10a.
  • the ROM 10c is a read-only storage unit, and is configured of, for example, a semiconductor storage element.
  • the ROM 10c stores, for example, programs and data such as firmware.
  • the communication interface 10d is an interface that connects the inspection device 10 to the communication network N.
  • the input unit 10e receives input of data from the user, and includes, for example, a keyboard, a mouse, and a touch panel.
  • the display unit 10f visually displays the calculation result by the CPU 10a, and is configured of, for example, an LCD (Liquid Crystal Display).
  • the inspection program may be stored in a computer-readable storage medium such as the RAM 10b or the ROM 10c and provided, or may be provided via the communication network N connected through the communication interface 10d.
  • the CPU 10a executes an inspection program to realize various functions described with reference to the following drawings. Note that these physical configurations are exemplifications and may not necessarily be independent configurations.
  • the inspection apparatus 10 may include an LSI (Large-Scale Integration) in which the CPU 10a, the RAM 10b, and the ROM 10c are integrated.
  • the inspection apparatus 10 may also include an arithmetic circuit such as a graphics processing unit (GPU), a field-programmable gate array (FPGA), or an application specific integrated circuit (ASIC).
  • FIG. 3 is a functional block diagram of the inspection apparatus 10 according to the embodiment of the present invention.
  • the inspection apparatus 10 includes an acquisition unit 11, a division unit 12, a classification unit 13, a determination unit 14, and a learning unit 16.
  • the acquisition unit 11 acquires an image of the inspection object 100 from the camera 30 or the learning image database DB via the communication network N.
  • the dividing unit 12 includes a first dividing unit 12a, a second dividing unit 12b, and a third dividing unit 12c.
  • the first division unit 12a divides the image of the inspection object 100 into a plurality of first partial images.
  • the second division unit 12b divides the image of the inspection object 100 into a plurality of second partial images different from the plurality of first partial images.
  • the third division unit 12c divides the image of the inspection object 100 into a plurality of third partial images different from any of the plurality of first partial images and the plurality of second partial images.
  • Each of the plurality of second partial images may be smaller than each of the plurality of first partial images.
  • each of the plurality of third partial images may be smaller than each of the plurality of second partial images.
  • at least two of the plurality of second partial images may be superimposed on any one of the plurality of first partial images.
  • at least two of the plurality of third partial images may overlap with any one of the plurality of second partial images.
  • the first division unit 12a may divide the image of the inspection object 100 so that the plurality of first partial images do not overlap each other. However, the first division unit 12a may also divide the image such that all or some of the plurality of first partial images overlap each other. The same applies to the second division unit 12b and the third division unit 12c.
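Both division policies above can be expressed with one stride parameter (an illustrative parameterization, not one the patent prescribes): a stride equal to the window size reproduces the non-overlapping division, while a smaller stride makes neighbouring partial images overlap partly.

```python
def tile(w, h, size, stride):
    """Lay size x size windows over a w x h image with the given stride.

    Returns rectangles (x0, y0, x1, y1); stride == size gives disjoint tiles,
    stride < size gives partial images that overlap their neighbours.
    """
    return [(x, y, x + size, y + size)
            for y in range(0, h - size + 1, stride)
            for x in range(0, w - size + 1, stride)]
```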
  • the dividing unit 12 divides the image of the inspection object 100 into rectangular partial images.
  • the long side and the short side of the first partial image may be longer than the long side and the short side of the second partial image, respectively, and the long side and the short side of the second partial image may be longer than the long side and the short side of the third partial image, respectively.
  • the dividing unit 12 may divide the image of the inspection object 100 into a partial image having a shape other than a rectangle, for example, a shape such as a polygon or an ellipse.
  • the dividing unit 12 according to the present embodiment divides the image of the inspection object 100 into three types of partial images, but it may divide the image into at least two types of partial images, or into four or more types of partial images.
  • the classification unit 13 includes a first classification unit 13a, a second classification unit 13b, and a third classification unit 13c.
  • the first classification unit 13a classifies the plurality of first partial images into a first partial image determined to contain an abnormality and a first partial image determined to contain no abnormality.
  • the second classification unit 13b classifies the plurality of second partial images into a second partial image determined to contain an abnormality and a second partial image determined to contain no abnormality.
  • the third classification unit 13c classifies the plurality of third partial images into a third partial image determined to contain an abnormality and a third partial image determined to contain no abnormality.
  • the first classification unit 13a may include a first learned model trained with first partial images containing an abnormality and first partial images not containing an abnormality as learning data.
  • the second classification unit 13b may include a second learned model in which the second partial image including the abnormality and the second partial image not including the abnormality are learned as the learning data.
  • the third classification unit 13c may include a third learned model in which a third partial image including an abnormality and a third partial image not including an abnormality are learned as learning data.
  • the first learned model, the second learned model, and the third learned model may be obtained by training a common model with different sets of learning data, or by training different models with different sets of learning data.
  • the first learned model, the second learned model, and the third learned model may be, for example, a learned neural network, but may be any model such as a support vector machine.
  • the classification unit 13 need not include a learned model, and may instead classify whether a partial image contains an abnormality by a predetermined method. For example, the classification unit 13 may classify partial images containing an abnormality and partial images not containing an abnormality by calculating an evaluation value from the partial image and comparing the evaluation value with a threshold.
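A minimal sketch of such a threshold-based classification (the evaluation value used here, the fraction of dark pixels in the partial image, is a hypothetical choice; the patent leaves the evaluation value and threshold unspecified):

```python
def classify_by_threshold(pixels, dark_level=64, ratio_threshold=0.05):
    """Flag a partial image (a flat list of grayscale values) as abnormal
    when the fraction of pixels darker than dark_level reaches the threshold."""
    dark = sum(1 for p in pixels if p < dark_level)
    return dark / len(pixels) >= ratio_threshold
```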
  • the learning unit 16 acquires first, second, and third partial images for learning from the learning image database DB, and causes the first classification unit 13a, the second classification unit 13b, and the third classification unit 13c to learn from them. Here, learning means improving the classification accuracy of the classification unit 13 by inputting learning data into it and correcting its configuration based on the correctness of the output result.
  • specifically, the learning unit 16 inputs the first, second, and third partial images for learning into the first classification unit 13a, the second classification unit 13b, and the third classification unit 13c, respectively, and optimizes the weighting of the units included in each neural network so that partial images containing an abnormality and partial images not containing an abnormality are classified correctly, that is, so as to improve an evaluation value of the classification.
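The weight-correction step can be illustrated with a toy single-feature logistic classifier (a deliberately simplified stand-in for the neural networks of the classification units 13a to 13c; the feature, learning rate, and epoch count are assumptions for illustration):

```python
import math

def train(samples, epochs=200, lr=0.5):
    """samples: (feature, label) pairs, label 1 when the partial image
    contains an abnormality. Returns the corrected (weight, bias)."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in samples:
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # current prediction
            w += lr * (y - p) * x  # correct the weighting based on the error
            b += lr * (y - p)
    return w, b

def predict(w, b, x):
    """True when the classifier judges the partial image abnormal."""
    return w * x + b > 0.0
```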
  • the determination unit 14 determines the presence or absence of an abnormality in the inspection object 100 based on the overlap between a first partial image determined to contain an abnormality and a second partial image determined to contain an abnormality. The determination unit 14 may base this determination on the overlap of at least two types of partial images, or on the overlap among a first partial image, a second partial image, and a third partial image each determined to contain an abnormality.
  • when partial images of at least two types that are each determined to contain an abnormality are superimposed, the determination unit 14 may determine that an abnormality exists in the overlapping portion; in particular, it may do so when a first partial image, a second partial image, and a third partial image each determined to contain an abnormality all overlap.
  • when a first partial image determined to contain an abnormality overlaps only second partial images determined not to contain an abnormality, or a second partial image determined to contain an abnormality overlaps only first partial images determined not to contain an abnormality, the determination unit 14 may determine that no abnormality exists in the portion of the partial image determined to contain the abnormality.
  • in other words, when, of at least two types of superimposed partial images, one type is determined to contain an abnormality and the other type is determined not to, the determination unit 14 may determine that no abnormality exists in the part determined to contain it.
  • the display unit 10f displays the image of the inspection object 100 so that the first partial images determined to contain an abnormality are distinguishable from those determined not to, and the second partial images determined to contain an abnormality are distinguishable from those determined not to. The display unit 10f may likewise display the third partial images determined to contain an abnormality distinguishably from those determined not to. Furthermore, the display unit 10f may display the first, second, and third partial images determined to contain an abnormality distinguishably from one another.
  • FIG. 4 is a view showing an image P of the inspection object 100 acquired by the inspection apparatus 10 according to the embodiment of the present invention.
  • the image P is an image obtained by photographing the inspection object 100 from a predetermined direction.
  • the image P shows that the inspection object 100 has a scratch S, and shows the case where the inspection object 100 has an abnormality.
  • here, the case where the presence or absence of an abnormality in the inspection object 100 is determined based on a single image P will be described, but the determination may also be made based on a plurality of images of the inspection object 100.
  • FIG. 5 is a view showing a first partial image P1 divided by the inspection apparatus 10 according to the embodiment of the present invention.
  • the first division unit 12a divides the image P into sixteen first partial images P1.
  • the first partial images P1 are rectangular images and do not overlap each other. By dividing the image P so that the plurality of first partial images P1 do not overlap with each other, it is determined whether or not there is an abnormality in the inspection object 100 without excessively increasing the number of the plurality of first partial images P1. Thus, the processing load of the first classification unit 13a can be reduced.
  • the first classification unit 13a classifies the sixteen first partial images P1 into first partial images determined to contain an abnormality and first partial images determined not to contain an abnormality. In this example, the first classification unit 13a determines that one of the sixteen first partial images P1 contains an abnormality and that the other fifteen do not.
  • the display unit 10f distinguishably displays the first partial image determined to contain an abnormality and the first partial images determined to contain no abnormality.
  • the display unit 10f emphasizes the first partial image P1a determined to contain an abnormality with a first type of hatching, and indicates the first partial images P1 determined not to contain an abnormality with broken lines. This makes it possible to see visually in which part of the inspection object 100 an abnormality is determined to be contained, so the classification result by the first classification unit 13a can be easily grasped.
  • the manner in which the display unit 10f distinguishably displays the first partial images determined to contain an abnormality and those determined not to is arbitrary; for example, the hatching of the first partial image P1a determined to contain an abnormality may be rendered semi-transparent so that the flaw S remains visible.
  • FIG. 6 is a view showing a second partial image P2 divided by the inspection device 10 according to the embodiment of the present invention.
  • the second division unit 12b divides the image P into 25 second partial images P2.
  • the second partial images P2 are rectangular images and do not overlap each other. By dividing the image P so that the plurality of second partial images P2 do not overlap with each other, it is possible to determine the presence / absence of abnormality of the inspection object 100 without excessively increasing the number of the plurality of second partial images P2. Thus, the processing load of the second classification unit 13b can be reduced.
  • the second partial image P2 is smaller than the first partial image P1, and at least two of the second partial images P2 overlap any one of the plurality of first partial images P1. By determining whether an abnormality is contained both in a first partial image P1 and in the at least two second partial images P2 superimposed on it, the presence or absence of an abnormality can be determined with high accuracy even when the sizes and shapes of abnormalities are varied.
  • the second classification unit 13b classifies the twenty-five second partial images P2 into second partial images determined to contain an abnormality and second partial images determined not to contain an abnormality. In this example, the second classification unit 13b determines that three of the twenty-five second partial images P2 contain an abnormality and that the other twenty-two do not.
  • the display unit 10f distinguishably displays the second partial images determined to contain an abnormality and the second partial images determined to contain no abnormality.
  • the display unit 10f emphasizes the second partial images P2a, P2b, and P2c determined to contain an abnormality with a second type of hatching, and indicates the second partial images P2 determined not to contain an abnormality with broken lines. This makes it possible to see visually in which part of the inspection object 100 an abnormality is determined to be contained, so the classification result by the second classification unit 13b can be easily grasped.
  • the display unit 10f distinguishably displays the first partial image P1a determined to contain an abnormality and the second partial images P2a, P2b, and P2c determined to contain an abnormality: the first partial image P1a is shown with the first type of hatching, while the second partial images P2a, P2b, and P2c are shown with the second type of hatching.
  • the second partial images P2b and P2c determined by the second classification unit 13b to contain an abnormality do not actually contain one. Even in such a case, as described in detail later, the inspection apparatus 10 does not determine that an abnormality exists in the portions of the second partial images P2b and P2c, so the presence or absence of an abnormality can be determined correctly.
  • FIG. 7 is a view showing a third partial image P3 divided by the inspection device 10 according to the embodiment of the present invention.
  • the third division unit 12c divides the image P into 64 third partial images P3.
  • The third partial images P3 are rectangular images and do not overlap each other. By dividing the image P so that the plurality of third partial images P3 do not overlap each other, the presence or absence of an abnormality in the inspection object 100 can be determined without excessively increasing the number of third partial images P3, which reduces the processing load of the third classification unit 13c.
  • The third partial images P3 are smaller than the first partial images P1 and the second partial images P2, and at least two of the third partial images P3 overlap with any one of the plurality of first partial images P1 and with any one of the plurality of second partial images P2.
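  • The grid divisions described above (25 overlapping second partial images, 64 non-overlapping third partial images) can be sketched as a sliding-window split. The image size, tile sizes, and strides below are illustrative assumptions, not values taken from the specification:

```python
import numpy as np

def divide(image, tile, stride):
    """Divide a 2-D image into square tiles of side `tile`,
    sliding by `stride` pixels; stride < tile gives overlapping
    tiles, stride == tile gives non-overlapping tiles."""
    h, w = image.shape[:2]
    tiles = []
    for y in range(0, h - tile + 1, stride):
        for x in range(0, w - tile + 1, stride):
            tiles.append(((y, x), image[y:y + tile, x:x + tile]))
    return tiles

image = np.zeros((80, 80))
# Non-overlapping third partial images: an 8x8 grid of 10-pixel tiles.
p3 = divide(image, tile=10, stride=10)   # 64 tiles
# Overlapping second partial images: 20-pixel tiles with a 15-pixel stride.
p2 = divide(image, tile=20, stride=15)   # 5x5 = 25 tiles
```

  • With a stride smaller than the tile size, neighbouring tiles share pixels, so a defect near a tile boundary still falls wholly inside some tile at that scale.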
  • The third classification unit 13c classifies the 64 third partial images P3 into third partial images determined to contain an abnormality and third partial images determined to contain no abnormality. In this example, the third classification unit 13c determines that four of the 64 third partial images P3 contain an abnormality and that the remaining 60 third partial images contain no abnormality.
  • The display unit 10f distinguishably displays the third partial images determined to contain an abnormality and the third partial images determined to contain no abnormality.
  • Specifically, the display unit 10f highlights the third partial images P3a, P3b, P3c, and P3d determined to contain an abnormality with the third type of hatching, and indicates the third partial images P3 determined to contain no abnormality with broken lines. This makes it possible to visually recognize which portions of the inspection object 100 are determined to contain an abnormality, so the classification result of the third classification unit 13c can be easily grasped.
  • The display unit 10f distinguishably displays the first partial image P1a determined to contain an abnormality, the second partial images P2a, P2b, and P2c determined to contain an abnormality, and the third partial images P3a, P3b, P3c, and P3d determined to contain an abnormality.
  • Specifically, the display unit 10f indicates the first partial image P1a determined to contain an abnormality with the first type of hatching, the second partial images P2a, P2b, and P2c determined to contain an abnormality with the second type of hatching, and the third partial images P3a, P3b, P3c, and P3d determined to contain an abnormality with the third type of hatching.
  • Here, the third partial image P3d determined by the third classification unit 13c to contain an abnormality does not actually contain one. Even in such a case, as will be described in detail later, the inspection apparatus 10 does not determine that an abnormality exists in the portion of the third partial image P3d determined to contain an abnormality, and can therefore correctly determine the presence or absence of an abnormality.
  • FIG. 8 is a view showing an image P of the inspection object 100 displayed on the display unit 10f of the inspection apparatus 10 according to the embodiment of the present invention.
  • The display unit 10f distinguishably displays the first partial image P1a determined to contain an abnormality, the second partial images P2a, P2b, and P2c determined to contain an abnormality, and the third partial images P3a, P3b, P3c, and P3d determined to contain an abnormality.
  • Because the classification result of the first classification unit 13a, the classification result of the second classification unit 13b, and the classification result of the third classification unit 13c are shown in a visually distinguishable manner, the differences among the classification results of the first classification unit 13a, the second classification unit 13b, and the third classification unit 13c can be easily grasped.
  • The determination unit 14 of the inspection apparatus 10 determines whether there is an abnormality in the inspection object 100 based on the overlap among the first partial image P1a determined to contain an abnormality, the second partial images P2a, P2b, and P2c determined to contain an abnormality, and the third partial images P3a, P3b, P3c, and P3d determined to contain an abnormality.
  • By dividing the image P into a plurality of first partial images P1, a plurality of second partial images P2, and a plurality of third partial images P3, and determining the presence or absence of an abnormality in the inspection object based on the overlap of the partial images determined to contain an abnormality, the number of erroneous determinations is kept small. Therefore, even when various abnormalities may occur, the presence or absence of an abnormality in the inspection object can be detected accurately.
  • The determination unit 14 determines that an abnormality exists in the overlapping portion R. In this way, by determining that an abnormality exists in the portion where multiple types of partial images determined to contain an abnormality overlap, not only the presence or absence of an abnormality in the inspection object but also the portion where the abnormality exists can be identified accurately.
  • On the other hand, the determination unit 14 determines that the second partial images P2b and P2c determined to contain an abnormality and the third partial image P3d determined to contain an abnormality do not contain one, because they do not overlap with partial images of the other types determined to contain an abnormality.
  • When only one of the overlapping first, second, and third partial images is determined to contain an abnormality and the partial images of the other types are determined to contain none, it is determined that no abnormality exists in the portion of the first, second, or third partial image determined to contain the abnormality. Erroneous determination of the existence of an abnormality is thereby prevented.
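  • The veto logic of the determination unit 14 can be sketched with boolean masks: an abnormality is reported only where flagged tiles of all types intersect, and isolated flags (like P2b, P2c, and P3d above) are dismissed. The tile coordinates below are hypothetical, chosen only to mirror that situation:

```python
import numpy as np

def abnormal_mask(shape, regions):
    """Rasterize tiles flagged as abnormal into a boolean mask.
    `regions` is a list of (y, x, height, width) tile rectangles."""
    m = np.zeros(shape, dtype=bool)
    for y, x, h, w in regions:
        m[y:y + h, x:x + w] = True
    return m

shape = (80, 80)
# Hypothetical flagged tiles for the three classification units.
m1 = abnormal_mask(shape, [(0, 0, 40, 40)])                      # first
m2 = abnormal_mask(shape, [(30, 30, 20, 20), (0, 60, 20, 20)])   # second
m3 = abnormal_mask(shape, [(30, 30, 10, 10), (70, 0, 10, 10)])   # third

overlap = m1 & m2 & m3           # abnormality only where all three agree
has_abnormality = overlap.any()  # True: one tile is flagged at every scale
```

  • The isolated flags at (0, 60) and (70, 0) drop out of `overlap`, which is exactly the erroneous-determination suppression described above.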
  • Because each of the plurality of second partial images P2 is smaller than each of the plurality of first partial images P1, and each of the plurality of third partial images P3 is smaller than each of the plurality of second partial images P2, the presence or absence of an abnormality can be determined with high accuracy even when abnormalities have various sizes and shapes.
  • FIG. 9 is a flowchart of a first process performed by the inspection apparatus 10 according to the embodiment of the present invention.
  • The first process divides the image P of the inspection object 100 into first partial images P1, second partial images P2, and third partial images P3, determines the presence or absence of an abnormality for each, and determines the presence or absence of an abnormality in the inspection object 100 based on the overlap of the partial images determined to contain an abnormality.
  • The inspection apparatus 10 first acquires an image P of the inspection object 100 (S10). Then, the first division unit 12a divides the image P into a plurality of first partial images P1 (S11), the second division unit 12b divides the image P into a plurality of second partial images P2 (S12), and the third division unit 12c divides the image P into a plurality of third partial images P3 (S13).
  • Next, the first classification unit 13a classifies the plurality of first partial images P1 into first partial images determined to contain an abnormality and first partial images determined to contain no abnormality (S14). Similarly, the second classification unit 13b classifies the plurality of second partial images P2 into second partial images determined to contain an abnormality and second partial images determined to contain no abnormality (S15), and the third classification unit 13c classifies the plurality of third partial images P3 into third partial images determined to contain an abnormality and third partial images determined to contain no abnormality (S16).
  • Next, the display unit 10f distinguishably displays the first partial images determined to contain an abnormality, the second partial images determined to contain an abnormality, and the third partial images determined to contain an abnormality (S17).
  • Next, the determination unit 14 determines whether the first partial image determined to contain an abnormality, the second partial image determined to contain an abnormality, and the third partial image determined to contain an abnormality overlap (S18). When it is determined that these three types of partial images overlap (S18: Yes), the determination unit 14 determines that an abnormality exists in the overlapping portion (S19). On the other hand, when it is determined that these three types of partial images do not overlap (S18: No), that is, when at least one type of partial image is determined to contain no abnormality, the determination unit 14 determines that no abnormality exists in the portions of the partial images determined to contain an abnormality (S20). The first process then ends.
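  • The flow S10 to S20 can be condensed into a sketch. Here `classify` is a toy brightness test standing in for the three learned classification units, and the (tile, stride) pairs are illustrative assumptions:

```python
import numpy as np

def first_process(image, classify, specs):
    """Sketch of steps S10-S20: divide `image` by each (tile, stride)
    spec, classify every tile with `classify(tile_pixels) -> bool`,
    and report an abnormality only where flagged tiles of all
    divisions overlap."""
    h, w = image.shape[:2]
    agree = np.ones((h, w), dtype=bool)
    for tile, stride in specs:                               # S11-S13
        mask = np.zeros((h, w), dtype=bool)
        for y in range(0, h - tile + 1, stride):
            for x in range(0, w - tile + 1, stride):
                if classify(image[y:y + tile, x:x + tile]):  # S14-S16
                    mask[y:y + tile, x:x + tile] = True
        agree &= mask                                        # S18
    return agree.any(), agree                                # S19 / S20

# Hypothetical defect: a small bright patch on a dark background.
img = np.zeros((80, 80))
img[32:38, 32:38] = 1.0
found, region = first_process(img, lambda t: t.mean() > 0.01,
                              [(40, 20), (20, 15), (10, 10)])
```

  • Only pixels inside flagged tiles at every scale survive in `region`, so an isolated flag at one scale never produces an abnormality verdict on its own.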
  • FIG. 10 is a flowchart of a second process performed by the inspection apparatus 10 according to the embodiment of the present invention.
  • The second process is executed prior to the first process, and is a process of training the first classification unit 13a, the second classification unit 13b, and the third classification unit 13c to classify partial images containing an abnormality and partial images containing no abnormality.
  • First, the learning unit 16 of the inspection apparatus 10 acquires first partial images for learning from the learning image database DB (S30). Similarly, the learning unit 16 acquires second partial images for learning (S31) and third partial images for learning (S32) from the learning image database DB.
  • The first partial images for learning include first partial images containing an abnormality and first partial images containing no abnormality, and each first partial image is tagged with the presence or absence of an abnormality. Likewise, the second partial images for learning include second partial images containing an abnormality and second partial images containing no abnormality, each tagged with the presence or absence of an abnormality, and the third partial images for learning include third partial images containing an abnormality and third partial images containing no abnormality, each tagged with the presence or absence of an abnormality.
  • Next, the learning unit 16 trains the first classification unit 13a using the first partial images for learning (S33), trains the second classification unit 13b using the second partial images for learning (S34), and trains the third classification unit 13c using the third partial images for learning (S35). Specifically, when the first classification unit 13a, the second classification unit 13b, and the third classification unit 13c each include a neural network, the learning unit 16 inputs the first, second, and third partial images for learning into the first classification unit 13a, the second classification unit 13b, and the third classification unit 13c, respectively, evaluates whether partial images containing an abnormality and partial images containing no abnormality are correctly classified, and optimizes the weights of the units included in each neural network so as to improve the evaluation value.
  • The second process then ends, whereupon the first classification unit 13a includes the first learned model, the second classification unit 13b includes the second learned model, and the third classification unit 13c includes the third learned model.
  • The first learned model, the second learned model, and the third learned model may be models of different configurations, trained using different learning data (the first partial images for learning, the second partial images for learning, and the third partial images for learning, respectively).
  • For example, the first classification unit 13a may include a first learned neural network, the second classification unit 13b may include a second learned neural network whose configuration differs from that of the first learned neural network, and the third classification unit 13c may include a third learned neural network whose configuration differs from both the first learned neural network and the second learned neural network.
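  • As a sketch of steps S33 to S35, each classification unit can be trained independently on its own tagged tiles. Logistic regression on flattened tile pixels is used below as a stand-in assumption for the patent's neural networks, and the synthetic bright-versus-dark tile data is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_classifier(tiles, labels, steps=500, lr=0.5):
    """Train one tile classifier by gradient descent on log-loss.
    Logistic regression on the flattened tile stands in for the
    neural networks of the classification units."""
    X = np.array([t.ravel() for t in tiles])
    y = np.array(labels, dtype=float)
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        z = np.clip(X @ w + b, -30, 30)
        p = 1.0 / (1.0 + np.exp(-z))   # predicted abnormality probability
        g = p - y                      # gradient of the log-loss
        w -= lr * X.T @ g / len(y)
        b -= lr * g.mean()
    return lambda t: 1.0 / (1.0 + np.exp(
        -np.clip(t.ravel() @ w + b, -30, 30))) > 0.5

# Tagged learning tiles: "abnormal" tiles are brighter on average.
normal   = [rng.normal(0.0, 0.1, (10, 10)) for _ in range(50)]
abnormal = [rng.normal(0.8, 0.1, (10, 10)) for _ in range(50)]
classify3 = train_classifier(normal + abnormal, [0] * 50 + [1] * 50)
```

  • Repeating `train_classifier` with the tile sets of the other two sizes would yield the three independent learned models the text describes, each fitted only to its own scale of partial image.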

Landscapes

  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Analytical Chemistry (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

An object of the present invention is to provide an inspection device, an inspection method, and an inspection program capable of accurately detecting whether there is an abnormality in an inspection object even when various abnormalities may occur. The inspection device 10 comprises: a first division unit for dividing an image of an inspection object 100 into a plurality of first partial images; a second division unit for dividing the image into a plurality of second partial images different from the first partial images; a first classification unit for classifying the plurality of first partial images into first partial images determined to contain an abnormality and first partial images determined to contain no abnormality; a second classification unit for classifying the plurality of second partial images into second partial images determined to contain an abnormality and second partial images determined to contain no abnormality; and a determination unit for determining whether there is an abnormality in the inspection object 100 based on an overlap between the first partial images determined to contain an abnormality and the second partial images determined to contain an abnormality.
PCT/JP2017/023213 2017-06-23 2017-06-23 Dispositif d'inspection, procédé d'inspection et programme d'inspection WO2018235266A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2017/023213 WO2018235266A1 (fr) 2017-06-23 2017-06-23 Dispositif d'inspection, procédé d'inspection et programme d'inspection
JP2018515903A JP6387477B1 (ja) 2017-06-23 2017-06-23 検査装置、検査方法及び検査プログラム

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/023213 WO2018235266A1 (fr) 2017-06-23 2017-06-23 Dispositif d'inspection, procédé d'inspection et programme d'inspection

Publications (1)

Publication Number Publication Date
WO2018235266A1 true WO2018235266A1 (fr) 2018-12-27

Family

ID=63444288

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/023213 WO2018235266A1 (fr) 2017-06-23 2017-06-23 Dispositif d'inspection, procédé d'inspection et programme d'inspection

Country Status (2)

Country Link
JP (1) JP6387477B1 (fr)
WO (1) WO2018235266A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020042001A (ja) * 2018-09-07 2020-03-19 株式会社フジクラ 評価装置、評価方法、評価プログラム、及び検査装置
JP2020125618A (ja) * 2019-02-04 2020-08-20 東京電力ホールディングス株式会社 情報処理方法、プログラム、取水制御システム及び学習済みモデルの生成方法
WO2021010430A1 (fr) * 2019-07-16 2021-01-21 株式会社テクムズ Procédé de détection de défaut d'article à inspecter, dispositif associé, et programme informatique associé
WO2022085253A1 (fr) * 2020-10-23 2022-04-28 株式会社荏原製作所 Dispositif de détermination de surface usinée, programme de détermination de surface usinée, procédé de détermination de surface usinée et système d'usinage
WO2022215382A1 (fr) * 2021-04-05 2022-10-13 パナソニックIpマネジメント株式会社 Dispositif d'inspection, procédé d'inspection et programme
US12013351B2 (en) 2019-09-24 2024-06-18 Ishida Co., Ltd. Inspection device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4173441A (en) * 1977-03-28 1979-11-06 E. I. Du Pont De Nemours And Company Web inspection system and method therefor
JPH11111796A (ja) * 1997-10-02 1999-04-23 Mitsubishi Electric Corp 不良解析方法及びその装置
JP2010266430A (ja) * 2009-04-15 2010-11-25 Jfe Steel Corp 鋼板表面欠陥検査方法および装置
JP2011058926A (ja) * 2009-09-09 2011-03-24 Ricoh Elemex Corp 画像検査方法及び画像検査装置
JP2011122985A (ja) * 2009-12-11 2011-06-23 Ricoh Co Ltd 印刷物検査装置、印刷物検査方法、プログラムおよび記憶媒体
JP2012026982A (ja) * 2010-07-27 2012-02-09 Panasonic Electric Works Sunx Co Ltd 検査装置
JP2015026287A (ja) * 2013-07-26 2015-02-05 新電元工業株式会社 はんだ付け検査装置、はんだ付け検査方法および電子部品
JP2016126769A (ja) * 2014-12-26 2016-07-11 古河機械金属株式会社 検査結果出力装置、検査結果出力方法、及び、プログラム

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6126450B2 (ja) * 2013-04-25 2017-05-10 株式会社ブリヂストン 検査装置


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020042001A (ja) * 2018-09-07 2020-03-19 株式会社フジクラ 評価装置、評価方法、評価プログラム、及び検査装置
JP7236292B2 (ja) 2018-09-07 2023-03-09 株式会社フジクラ 評価装置、評価方法、評価プログラム、及び検査装置
JP2020125618A (ja) * 2019-02-04 2020-08-20 東京電力ホールディングス株式会社 情報処理方法、プログラム、取水制御システム及び学習済みモデルの生成方法
JP7293687B2 (ja) 2019-02-04 2023-06-20 東京電力ホールディングス株式会社 情報処理方法、プログラム、取水制御システム及び学習済みモデルの生成方法
WO2021010430A1 (fr) * 2019-07-16 2021-01-21 株式会社テクムズ Procédé de détection de défaut d'article à inspecter, dispositif associé, et programme informatique associé
US12013351B2 (en) 2019-09-24 2024-06-18 Ishida Co., Ltd. Inspection device
WO2022085253A1 (fr) * 2020-10-23 2022-04-28 株式会社荏原製作所 Dispositif de détermination de surface usinée, programme de détermination de surface usinée, procédé de détermination de surface usinée et système d'usinage
JP2022069275A (ja) * 2020-10-23 2022-05-11 株式会社荏原製作所 加工面判定装置、加工面判定プログラム、加工面判定方法、及び、加工システム
JP7450517B2 (ja) 2020-10-23 2024-03-15 株式会社荏原製作所 加工面判定装置、加工面判定プログラム、加工面判定方法、及び、加工システム
WO2022215382A1 (fr) * 2021-04-05 2022-10-13 パナソニックIpマネジメント株式会社 Dispositif d'inspection, procédé d'inspection et programme

Also Published As

Publication number Publication date
JP6387477B1 (ja) 2018-09-05
JPWO2018235266A1 (ja) 2019-06-27

Similar Documents

Publication Publication Date Title
WO2018235266A1 (fr) Dispositif d'inspection, procédé d'inspection et programme d'inspection
EP3502966B1 (fr) Appareil de génération de données, procédé de génération de données et programme de génération de données
US10885618B2 (en) Inspection apparatus, data generation apparatus, data generation method, and data generation program
JP6837848B2 (ja) 診断装置
JP6707920B2 (ja) 画像処理装置、画像処理方法、およびプログラム
CN109946303A (zh) 检查装置和方法
JP2013224833A (ja) 外観検査装置、外観検査方法及びコンピュータプログラム
JP2015041164A (ja) 画像処理装置、画像処理方法およびプログラム
CN114066857A (zh) 红外图像质量评价方法、装置、电子设备及可读存储介质
JP5914732B2 (ja) 画像検証方法、画像検証装置、およびプログラム
JP6882772B2 (ja) 検査装置、検査方法及び検査プログラム
US11481673B2 (en) Signal analysis device, signal analysis method, and signal analysis program
US20220237895A1 (en) Class determination system, class determination method, and class determination program
CN113016023B (zh) 信息处理方法以及计算机可读的非暂时性记录介质
KR20180061528A (ko) Lab 컬러 모델을 이용한 이물질 검사 시스템 및 방법
US20210049396A1 (en) Optical quality control
US20060093203A1 (en) Attribute threshold evaluation scheme
JP6838651B2 (ja) 画像処理装置、画像処理方法及びプログラム
JPWO2021130978A1 (ja) 動作分析システムおよび動作分析プログラム
JP6897100B2 (ja) 判定装置、判定方法、および判定プログラム
US20230080978A1 (en) Machine learning method and information processing apparatus for machine learning
JP7498890B2 (ja) 画像検査装置
JP7469100B2 (ja) 検出装置及びプログラム
JP7067321B2 (ja) 検査結果提示装置、検査結果提示方法及び検査結果提示プログラム
JP2020071716A (ja) 異常判定方法、特徴量算出方法、外観検査装置

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2018515903

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17914697

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17914697

Country of ref document: EP

Kind code of ref document: A1