CN115298539A - Appearance inspection system and computer program - Google Patents
- Publication number: CN115298539A (application CN202080098729.0A)
- Authority: CN (China)
- Legal status: Pending (assumed; the status is not a legal conclusion)
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Abstract
Provided is a technique capable of suppressing overdetection while also suppressing missed (undetected) defects. Provided is an appearance inspection system and the like that determines whether an object to be inspected is acceptable using an image of the object. The appearance inspection system includes: a storage device that stores non-defective images, a reference table in which a plurality of threshold values are described, and an offset table in which a plurality of offset values are described; and an arithmetic circuit. Each of the plurality of thresholds is a criterion for determining that the image of the object included in the corresponding partial region indicates a defect. The arithmetic circuit executes processing for determining whether the image feature amount of a partial image is classified into an over-detection category. When the image feature amount of the partial image is classified into the over-detection category, the arithmetic circuit further changes the threshold value of the reference table at the position of that partial region using the offset value, and determines whether the image of the object to be inspected includes a defective image using the changed reference table.
Description
Technical Field
The present disclosure relates to a visual inspection system and a computer program.
Background
Conventionally, the following appearance inspection is performed: the image obtained by imaging the object to be inspected is compared with an image of a reference article (reference image), thereby determining whether or not the object to be inspected has a defect, that is, whether or not the object to be inspected is a non-defective article.
In conventional appearance inspection apparatuses, the determination criterion for the pass/fail judgment is set manually in the apparatus after sufficient verification. When inspecting objects whose shape and surface pattern change frequently, the determination criterion must be changed just as frequently, which makes efficient operation difficult.
For example, Japanese Laid-Open Patent Publication No. 2013-224833 discloses a technique in which a determination criterion for the non-defective judgment is defined for each specific pixel and is set automatically using the average and standard deviation of luminance. According to this technique, frequent manual changes of the determination criterion can be omitted, and the appearance inspection can be operated more efficiently.
Documents of the prior art
Patent document
Patent document 1: Japanese Laid-Open Patent Publication No. 2013-224833
Disclosure of Invention
Problems to be solved by the invention
However, in Japanese Laid-Open Patent Publication No. 2013-224833, suppressing overdetection relaxes the determination criterion for portions where the variation in pixel values is large, so defects may go undetected.
The present disclosure provides an appearance inspection system and a computer program capable of suppressing overdetection while suppressing omission of defects.
Means for solving the problems
An exemplary appearance inspection system of the present disclosure determines whether an inspection object is acceptable using an image of the inspection object. The appearance inspection system includes: a storage device that stores a plurality of images obtained by imaging a plurality of articles determined to be non-defective, a reference table in which a plurality of threshold values are described, and an offset table in which a plurality of offset values are described; an interface device that receives image data of the inspection object and attribute data indicating an attribute relating to a manufacturing condition; and an arithmetic circuit. The image of the inspection object and the plurality of images each include a plurality of partial regions. Each of the plurality of threshold values of the reference table and each of the plurality of offset values of the offset table is set in correspondence with the plurality of partial regions. The plurality of thresholds are criteria for determining that the image of the inspection object included in each of the corresponding partial regions indicates a defect. The plurality of offset values include a 1st offset value that changes a part of the plurality of threshold values and a 2nd offset value that does not change the remaining part. The arithmetic circuit executes the following processing: (a) performing predetermined defect detection processing on the image data and, when the image of the inspection object includes at least one defect image, extracting the inspection object as a defect candidate; and (b) cutting out a partial image from the image of the inspection object extracted as the defect candidate, the position of the cut-out partial image being the position of the partial region corresponding to the 1st offset value in the offset table.
In addition, the arithmetic circuit executes the following processing: (c) determining whether the image feature amount of the cut-out partial image is classified into an over-detection category into which the image feature amounts of the partial images of the plurality of non-defective images, at the position of the partial region corresponding to the 1st offset value, are classified at a ratio equal to or higher than a predetermined value; (d) when the image feature amount of the cut-out partial image is classified into the over-detection category, changing the threshold value of the reference table at the position of that partial region using the 1st offset value, thereby raising the criterion for determining that the image of the inspection object indicates a defect; and (e) determining, using the reference table with the changed threshold value, whether the image of the inspection object includes at least one defect image.
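The five processes (a) through (e) can be sketched as a single pipeline. The following is a minimal illustration, not the patent's implementation: it assumes one feature amount per partial region, tables stored as same-shaped arrays, and a caller-supplied over-detection-category test (all names are hypothetical):

```python
import numpy as np

def inspect(region_values, reference_table, offset_table, in_over_detection_category):
    """Sketch of processes (a)-(e). `region_values` holds one image feature
    amount per partial region; both tables are arrays of the same shape."""
    # (a) Predetermined defect detection: a partial region is a defect
    #     candidate when its feature amount exceeds that region's threshold.
    if not (region_values > reference_table).any():
        return False  # no defect image at all: not extracted as a candidate

    thresholds = reference_table.copy()
    # (b) "Cut out" partial images at the positions where the offset table
    #     holds a non-zero 1st offset value (zero entries are 2nd offsets).
    for idx in zip(*np.nonzero(offset_table)):
        # (c)/(d) If the partial image is classified into the over-detection
        # category, raise the threshold at that position by the 1st offset.
        if in_over_detection_category(idx, region_values[idx]):
            thresholds[idx] += offset_table[idx]

    # (e) Re-judge using the reference table with the changed thresholds.
    return bool((region_values > thresholds).any())
```

With toy 2x2 tables, a region that exceeds its original threshold by less than the 1st offset is re-judged as non-defective once the over-detection category applies:

```python
ref = np.full((2, 2), 100.0)
off = np.array([[20.0, 0.0], [0.0, 0.0]])     # one 1st offset at (0, 0)
img = np.array([[110.0, 90.0], [90.0, 90.0]])
inspect(img, ref, off, lambda idx, v: True)   # -> False (threshold raised to 120)
inspect(img, ref, off, lambda idx, v: False)  # -> True  (threshold unchanged)
```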
An exemplary computer program of the present disclosure is executed by an arithmetic circuit of an appearance inspection system that determines whether an inspection object is acceptable using an image of the inspection object. The appearance inspection system includes: a storage device that stores a plurality of images obtained by imaging a plurality of articles determined to be non-defective, a reference table in which a plurality of threshold values are described, and an offset table in which a plurality of offset values are described; an interface device that receives image data of the inspection object; and the arithmetic circuit. The image of the inspection object and the plurality of images each include a plurality of partial regions, and each of the plurality of threshold values of the reference table and each of the plurality of offset values of the offset table is set in correspondence with the plurality of partial regions. The plurality of thresholds are criteria for determining that the image of the inspection object included in each of the corresponding partial regions indicates a defect. The plurality of offset values include a 1st offset value that changes a part of the plurality of threshold values and a 2nd offset value that does not change the remaining part. The computer program causes the arithmetic circuit to execute the following processing: (a) performing predetermined defect detection processing on the image data and, when the image of the inspection object includes at least one defect image, extracting the inspection object as a defect candidate; and (b) cutting out a partial image from the image of the inspection object extracted as the defect candidate, the position of the cut-out partial image being the position of the partial region corresponding to the 1st offset value in the offset table.
Further, the computer program causes the arithmetic circuit to execute the following processing: (c) determining whether the image feature amount of the cut-out partial image is classified into an over-detection category into which the image feature amounts of the partial images of the plurality of non-defective images, at the position of the partial region corresponding to the 1st offset value, are classified at a ratio equal to or higher than a predetermined value; (d) when the image feature amount of the cut-out partial image is classified into the over-detection category, changing the threshold value of the reference table at the position of that partial region using the 1st offset value, thereby raising the criterion for determining that the inspection object indicates a defect; and (e) determining, using the reference table with the changed threshold value, whether the image of the inspection object includes at least one defect image.
Effects of the invention
According to the exemplary embodiments of the present disclosure, overdetection can be suppressed while missed defects are kept to a minimum.
Drawings
Fig. 1 is a diagram illustrating a configuration example of an appearance inspection system 1000 of the present disclosure including an appearance inspection apparatus 100.
Fig. 2 is a diagram mainly schematically showing a configuration example of the appearance inspection apparatus 100.
Fig. 3 is a diagram showing an example of the database 14Z in the storage device 14.
Fig. 4 is a diagram schematically showing an example of image data obtained by imaging the workpiece 70.
Fig. 5 is a diagram showing an example of an image of a workpiece 70 divided into a plurality of partial areas 72.
Fig. 6 is a flowchart showing a procedure of the appearance inspection process.
Fig. 7 is a diagram showing an example of the result of the defect detection processing when a damage is present in a certain partial region 72 b.
Fig. 8 is a diagram schematically showing the offset table 14c (upper stage) and the position of the partial image cut out from the image 72 of the workpiece 70 (lower stage).
Fig. 9 is a diagram for explaining the processing of step S8 in fig. 6.
Fig. 10 is a diagram for explaining step S10 in fig. 6.
Fig. 11 is a flowchart showing a specific procedure of the appearance inspection process.
Fig. 12 is a flowchart showing a procedure of the database registration processing.
Fig. 13 is a flowchart showing a procedure of a similar tendency confirmation process.
Fig. 14 is a diagram for specifically explaining steps S56 and S58 of fig. 13.
Fig. 15 is a flowchart showing a procedure of the generation/update processing of the offset table.
Fig. 16 is a flowchart showing a procedure of the update process of the database 14Z.
Fig. 17 is a flowchart showing a procedure of application processing of the offset table.
Fig. 18 is a diagram showing the configuration of an appearance inspection system 1100 according to a modification.
Detailed Description
Hereinafter, embodiments of the appearance inspection system according to the present disclosure will be described with reference to the drawings. In this specification, unnecessarily detailed descriptions may be omitted; for example, detailed descriptions of well-known matters and repeated descriptions of substantially identical configurations may be omitted. This is to avoid unnecessary redundancy and to make the following description easier for those skilled in the art to understand. The drawings and the following description are provided so that those skilled in the art can fully understand the present disclosure, and are not intended to limit the subject matter recited in the claims. In the following description, the same or similar components are denoted by the same reference numerals.
1. Structure of appearance inspection system
Fig. 1 is a diagram illustrating a configuration example of an appearance inspection system 1000 of the present disclosure including an appearance inspection apparatus 100. Fig. 2 is a diagram mainly schematically showing a configuration example of the appearance inspection apparatus 100.
In the illustrated example, the appearance inspection system 1000 includes the imaging device 30, the appearance inspection apparatus 100, the input device 120, and the monitor 130. The appearance inspection apparatus 100, the input device 120, and the monitor 130 can be implemented by a general-purpose digital computer system, for example a PC, and the imaging device 30 may be a digital camera. The imaging device 30 is connected to the appearance inspection apparatus 100 so that they can communicate with each other via a wired communication cable or a wireless communication line (not shown). The monitor 130 is connected to the appearance inspection apparatus 100 through a wired communication cable (not shown) so that they can communicate with each other.
The imaging device 30 images an object to be inspected, and transmits image data acquired by the imaging to the appearance inspection device 100. The appearance inspection apparatus 100 performs appearance inspection of the object to be inspected based on the image data received from the imaging apparatus 30 according to a process described later.
A process of appearance inspection of an object to be inspected performed in an appearance inspection system will be described with reference to fig. 1. The object to be inspected is various products, parts, and other various articles to be inspected for appearance. Hereinafter, the object to be inspected may be referred to as a "workpiece".
The workpiece 70 is placed on the transfer table 62, and is fixed to the transfer table 62 by a holding mechanism. The workpiece 70 is, for example, a finished product of a hard disk drive, a housing of a hard disk drive in which a motor is incorporated, or the like.
The transfer table 62 can be moved in the horizontal direction by the conveyance table 64 with the workpiece 70 placed thereon. The imaging device 30 is supported by the support member 60 above the conveyance table 64 so as to include the conveyance table 64 in the field of view. The imaging device 30 captures an image of the workpiece 70 in the field of view. The workpiece 70 may also be held by a robot arm and placed in the imaging position. In addition, a light source (not shown) may be used to irradiate the workpiece 70 during imaging.
The image data acquired by the imaging is transmitted from the imaging device 30 to the appearance inspection apparatus 100. An example of the size of an image obtained by one imaging operation is 1600 x 1200 pixels (horizontal x vertical); another example is 800 x 600 pixels.
Next, fig. 2 is referred to.
The appearance inspection apparatus 100 includes an arithmetic circuit 10, a memory 12, a storage device 14, a communication circuit 16, and an image processing circuit 18. The components are connected to each other by a bus 20 so as to be able to communicate with each other. The appearance inspection apparatus 100 includes an interface device 22a for communicating with the imaging device 30, an interface device 22b for communicating with the input device 120, and an interface device 22c for outputting video data to the monitor 130. An example of the interface device 22a is a video input terminal or an ethernet terminal. An example of the interface device 22b is a USB terminal. An example of the interface device 22c is an HDMI (registered trademark) terminal. Instead of these examples, other terminals may be used as the interface devices 22a to 22c. The interface devices 22a to 22c may be wireless communication circuits for performing wireless communication. As such a wireless communication circuit, for example, a wireless communication circuit conforming to the Wi-Fi (registered trademark) standard is known which performs wireless communication using frequencies such as 2.4GHz/5.2GHz/5.3GHz/5.6 GHz.
The arithmetic circuit 10 may be, for example, an integrated circuit (IC) chip such as a central processing unit (CPU) or a digital signal processor. The memory 12 is a recording medium that stores a computer program 12p for controlling the operation of the arithmetic circuit 10 and in which a learned convolutional neural network 12n, described later, is constructed. The entity of the learned convolutional neural network 12n is a set of parameters assigned to the artificial neurons constituting the input layer, the intermediate layers, and the output layer. The parameters include the weights applied to the inputs of each artificial neuron and the threshold value against which the weighted sum of the inputs (the sum of the products of the inputs and the weights) is compared. Note that processing by the arithmetic circuit 10 is actually required for the neural network 12n to perform operations and output results. Alternatively, hardware (an application-specific integrated circuit (ASIC), a GPGPU, or a programmable logic device) may be used that integrates the parameters constituting the neural network 12n on the memory 12 with the operation functions of the arithmetic circuit 10 related to the neural network 12n.
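As a minimal illustration of these parameters, weights on each input and a threshold compared against the weighted sum, one artificial neuron might be sketched as follows (the step activation is a simplification for illustration; real CNN layers use other activation functions):

```python
import numpy as np

def artificial_neuron(inputs, weights, threshold):
    """One artificial neuron: the weighted sum of the inputs (the sum of the
    products of inputs and weights) is compared against the threshold."""
    weighted_sum = float(np.dot(inputs, weights))
    return 1.0 if weighted_sum >= threshold else 0.0  # fires above threshold
```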
The memory 12 need not be a single recording medium, but may be a collection of a plurality of recording media. The memory 12 may include, for example, a semiconductor volatile memory such as a RAM and a semiconductor nonvolatile memory such as a flash ROM. At least a part of the memory 12 may be a detachable recording medium.
The storage device 14 stores a database as a collection of various data. Fig. 3 shows an example of the database 14Z in the storage device 14. Specifically, the storage device 14 stores a plurality of images 14a obtained by imaging a plurality of articles determined as non-defective articles, a reference table 14b in which a plurality of threshold values are described, and an offset table 14c in which a plurality of offset values are described. The storage device 14 may store attribute data 14d indicating an attribute related to a manufacturing condition of the test object. The contents and the like of the stored data will be described in detail later.
Reference is again made to fig. 2.
The imaging device 30 is a device that outputs an image signal for generating data of an image of an object to be inspected. The image signal is transmitted to the arithmetic circuit 10 by wire or wirelessly. A typical example of the imaging device 30 is a camera having an area sensor such as a CMOS image sensor or a CCD image sensor in which a plurality of photodiodes are arranged in a matrix. The imaging device 30 generates data of a color image or a monochrome image of the object. Various cameras for visual inspection can be used for the imaging device 30.
Fig. 4 schematically shows an example of image data obtained by imaging the workpiece 70. The image data is a frame image containing an image of the workpiece 70. The exemplary image of the present embodiment is an array of 256-gray-scale pixel values ("brightness" or "gray scale" values) reflecting the unevenness and the pattern on the surface of the workpiece 70. The pixel values are sometimes referred to as luminance values or densities. When the frame image is a color image, pixel values can be defined, for example, by the three colors red, green, and blue, each expressed in 256 gray levels. In this specification, gradation values, luminance values, and the like may be collectively referred to as the "image feature amount". However, the "image feature amount" may include not only features related to pixel values but also luminance-histogram statistics that evaluate basic properties of an image, features of a gray-level co-occurrence matrix (GLCM) that indicate the degree of contrast between neighboring pixels or the regularity of pixels, and other local features.
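A GLCM-based feature amount of the kind mentioned above might be computed as follows; a minimal numpy sketch in which the choice of horizontal neighbor pairs at distance 1 and the "contrast" statistic are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def glcm_contrast(img, levels):
    """Build a tiny gray-level co-occurrence matrix over horizontal neighbor
    pairs (distance 1) and return its 'contrast' statistic, which grows with
    the degree of contrast between neighboring pixels."""
    glcm = np.zeros((levels, levels))
    for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        glcm[a, b] += 1                        # count co-occurring gray levels
    glcm /= glcm.sum()                         # normalize to probabilities
    i, j = np.indices((levels, levels))
    return float(((i - j) ** 2 * glcm).sum())  # weights large gray-level jumps
```

A flat region yields a contrast of 0, while a checkerboard of gray levels 0 and 1 yields 1.0.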
In one frame image, a part of the transfer table 62 can be included as a background 70B in addition to the workpiece 70. In the example of fig. 4, the region 52 surrounded by the line 52L is the "inspection region". A frame image may also contain a plurality of inspection regions 52.
The position of the workpiece 70 in the frame image is aligned with a predetermined position within the frame image. In this alignment, a plurality of reference points of the workpiece 70 are matched with a plurality of reference points in the field of view of the imaging device 30. In the 1st stage of the alignment, the physical arrangement of the workpiece 70 with respect to the optical axis of the lens of the imaging device 30 is adjusted. In the 2nd stage, the pixel positions (coordinates) of the captured image are adjusted; this involves translation, rotation, magnification, and/or reduction of the image using image processing techniques. As a result of the alignment, the inspection region 52 of each workpiece 70 always coincides with the region enclosed by the line 52L, so the image of the inspection region 52 of each workpiece 70 can be processed in units of partial regions. The alignment may be performed mainly by the arithmetic circuit 10 or the image processing circuit 18, but may also be performed by rotating or translating the imaging device 30 relative to the conveyance table 64 by a mechanism (not shown) that changes the posture of the imaging device 30 in accordance with a command from the arithmetic circuit 10 or the image processing circuit 18.
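The 2nd-stage coordinate adjustment might look like the following translation-only sketch; rotation and scaling are omitted for brevity, and the function names are hypothetical:

```python
import numpy as np

def estimate_translation(ref_points, img_points):
    """2nd-stage alignment, translation only: estimate the shift that maps
    the reference points detected on the workpiece image onto the fixed
    reference points in the imaging device's field of view."""
    ref = np.asarray(ref_points, dtype=float)
    img = np.asarray(img_points, dtype=float)
    return (ref - img).mean(axis=0)  # average offset over all point pairs

def apply_translation(points, shift):
    """Shift pixel coordinates; resampling of the image itself would
    follow in a real implementation."""
    return np.asarray(points, dtype=float) + shift
```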
The monitor 130 is a device that displays the results of the determinations performed by the appearance inspection apparatus 100 and the like. The monitor 130 can also display images acquired by the imaging device 30.
The input device 120 receives input from the user, including designation of a selection region, and supplies it to the arithmetic circuit 10. Examples of the input device 120 include a touch panel, a mouse, and a keyboard. The monitor 130 and the input device 120 need not always be connected to the arithmetic circuit 10 by wire; they may be connected wirelessly, or by wire only when necessary, via a communication interface. The monitor 130 and the input device 120 may also be a terminal device or a smartphone carried by the user.
2. Outline of appearance inspection
The visual inspection system 1000 of the present disclosure determines whether or not a defect exists in an object to be inspected using an image. The visual inspection system 1000 of the exemplary embodiment uses a neural network to determine whether or not a defective image is present in an image of an object to be inspected. The presence or absence of the defect is determined for each partial region when the article image is divided into a plurality of partial regions.
Fig. 5 shows an example of an image of a workpiece 70 divided into a plurality of partial regions 72. The plurality of partial regions 72 form a lattice; fig. 5 highlights one partial region 72a among them. The lattice of partial regions 72 is set slightly larger than the image of the workpiece 70 so that it entirely contains the inspection region 52 of the workpiece 70.
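Dividing the inspection image into such a lattice can be sketched as follows; the region counts and the assumption that the image dimensions divide evenly are illustrative:

```python
import numpy as np

def split_into_partial_regions(image, rows, cols):
    """Divide an inspection image into a rows x cols lattice of partial
    regions. Assumes the image dimensions are divisible by rows/cols."""
    h, w = image.shape
    rh, rw = h // rows, w // cols
    return [image[r * rh:(r + 1) * rh, c * rw:(c + 1) * rw]
            for r in range(rows) for c in range(cols)]
```

Each element of the returned list is the pixel block for one partial region, which can then be fed to the neural network region by region.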
The appearance inspection system 1000 uses, for example, the pixel values of all pixels constituting the partial region 72a as inputs to the neural network and obtains outputs from it. The present embodiment uses the convolutional neural network 12n, trained in advance with images of various defects such as "dirt", "foreign matter", and "damage" and the names of those defects as teacher signals. In this specification, the type of defect is referred to as a "category". When the pixel values of all pixels of the partial region 72a are input and a defect is determined to exist, the neural network outputs the name of the category to which the defect is regarded as belonging.
In the present embodiment, a value indicating the degree of the discrepancy between the image input to the neural network and the standard image determined to be normal is output together with the output of the class name of the defect. In this specification, the output value may be referred to as a "probability value" indicating a high probability of being a defect.
Various methods of calculating the probability value are conceivable. Consider, as an example, a "damage" defect on the surface of a hard disk drive housing. When the damage penetrates the coating on the aluminum plate, the luminance values of the damaged image are higher than those of an undamaged image, because the reflectivity of the aluminum plate is higher than that of the coating. When an image of a partial region containing damage is input to the convolutional neural network, an image feature amount related to the luminance values of that partial region can be obtained from an intermediate layer of the convolutional neural network. Each extracted image feature amount is treated as a hash value of the input partial region's image. Likewise, using any of the plurality of images 14a stored in the storage device 14 (obtained by imaging articles determined to be non-defective), the image feature amount related to the luminance values of the non-defective image's partial region at the position corresponding to the region determined to be defective can be extracted from the intermediate layer. When the hash value of the input partial region's image is compared with that of the standard image of the same partial region, the closer the hash values, the more similar the images; the farther apart, the less similar. Therefore, by assigning a value closer to 100 as the difference between the two hash values increases and a value closer to 0 as it decreases, the above-mentioned "probability value" can be obtained for each partial region determined to contain a defect.
The probability value of the image of a partial region determined to be defective can, for example, be adjusted to a value of at least 50%. The technique of using image feature amounts extracted by a CNN as hash values is known, and a detailed description is therefore omitted.
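The mapping just described, where a larger hash difference yields a value nearer 100 and a smaller one nearer 0, floored at 50 for regions already judged defective, might be sketched as follows (the scale constant `max_distance` and the linear mapping are assumptions, not specified by the patent):

```python
def probability_value(input_hash, good_hash, max_distance, floor=50.0):
    """Map the distance between the feature hash of the input partial image
    and that of the non-defective (standard) image onto a 0-100 scale, then
    floor it at 50 for regions already judged to contain a defect."""
    distance = abs(input_hash - good_hash)
    score = 100.0 * min(distance, max_distance) / max_distance
    return max(score, floor)
```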
The class name and the probability value of the defect output from the neural network are defect candidate detection results based on the learning results of the neural network.
However, even when a certain article is determined to contain a defect, the article may actually contain no defect. This is the case, for example, when dirt adheres to the surface of the article but to so small a degree that there is substantially no problem in functionality or appearance. In conventional appearance inspection, a skilled inspector visually inspects the article and determines that such dirt is not a defect.
Even when the appearance inspection system 1000 according to the exemplary embodiment temporarily determines that an article includes a defect, it changes the criterion for determining a defect based on the relationship with a plurality of articles already determined to be non-defective, and determines the presence or absence of the defect anew. In the present specification, the case where an article that should be determined to be non-defective is nevertheless determined to have a defect is referred to as "over-detection". When over-detection occurs, the yield of the article is lowered, the article has to be reworked or the like, and the cost of the article increases. In order to suppress such over-detection, the criterion for determining a defect is changed based on experience accumulated so far, so as to enlarge the range of what is determined to be non-defective.
In order to suppress over-detection, a method of relaxing the criterion for determining a defect from the beginning is also conceivable. However, if over-detection is suppressed excessively, an article that should originally be determined to be defective is determined to be non-defective, that is, the defect goes undetected, and the purpose of the appearance inspection cannot be achieved.
The appearance inspection system 1000 of the exemplary embodiment therefore first determines whether or not the article is a non-defective product based on a criterion that clearly identifies non-defective products, and only when the article is determined not to be non-defective, that is, when a defect is judged to exist, relaxes the criterion for determining the defect based on the relationship with the plurality of articles determined to be non-defective up to that point.
The specific processing is outlined below.
When the appearance inspection system 1000 performs appearance inspection, the following data is stored in the storage device 14 in advance.
(a) A plurality of images obtained by respectively imaging a plurality of articles determined to be non-defective products
(b) Reference table in which a plurality of threshold values are described
(c) Offset table in which a plurality of offset values are described
The above (a) will be explained.
The articles determined to be non-defective products are not limited to articles initially determined as such. They also include inspection objects once determined to be defective and later determined to be non-defective, for example by an inspector's visual judgment, and inspection objects determined to be non-defective as a result of the processing of the present disclosure described later. By including the latter two, the range of image features determined to be non-defective can be expanded. The "plurality of images" may include only images of one article, or may include images of a plurality of articles. The image of each article determined to be non-defective is used to change the criterion for determining whether or not a defect exists.
The above (b) will be explained.
The "plurality of thresholds" are thresholds corresponding to the respective partial regions when the image is divided into a plurality of partial regions. Each threshold is used as the reference for determining whether each partial region includes a defective image when the presence or absence of a defect is determined anew for an inspection object once determined to be defective. In the present embodiment, the initial value of each of the "plurality of thresholds" is, for example, "0.5". As a threshold approaches 1, only a significant defect or the like is determined to be a "defect"; that is, the closer the threshold is to 1, the harder it becomes to determine that a defect exists. Each of the plurality of thresholds can be updated to a higher value by the corresponding offset value among the "plurality of offset values" described below.
The partial region may be a region formed of a plurality of continuous pixels, or may be a region formed of one pixel, that is, a region defined by the size of one pixel. In the case of expressing an image of an article in color, one pixel may be an area defined by the size of three sub-pixels of red, green, and blue, for example, which are disposed adjacent to each other.
The above (c) will be explained.
The "plurality of offset values" are used to update a plurality of threshold values described in the reference table. When the offset value of a certain partial area is calculated by the processing of the present embodiment, the threshold value of the partial area is updated based on the offset value so as to make the criterion for determining that the partial area is a defect higher. For example, a value obtained by adding the offset value to the current threshold value is updated as a new threshold value. This can further improve the criterion for determining that the defect is present.
The "plurality of offset values" are not limited to offset values used for updating thresholds, and may include offset values that do not update the threshold, that is, offset values that maintain the current value. In the description of the embodiment below, a mode is exemplified in which both offset values that update the threshold (offset values of type 1) and offset values that maintain the current value (offset values of type 2) are provided.
When offset values that maintain the current value are also used, the calculation is performed for all partial regions, so the program for the calculation processing can be simplified. For example, when the offset value is added to the current threshold of each partial region, the following arithmetic processing may be programmed: the offset value that does not update the threshold is set to 0, the offset value that updates the threshold is set to a value greater than 0, and the offset value is added to the current threshold for all partial regions. On the other hand, if the offset value were added only to the thresholds of specific partial regions, an operation to specify which partial regions to update would be required, which can complicate the programming of the arithmetic processing.
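The arithmetic just described reduces to a single elementwise addition over same-shaped matrices. This is an illustrative sketch; the function name and the clamping of the result to the range 0 to 1 are assumptions not stated in the source.

```python
import numpy as np

def apply_offset_table(reference, offsets):
    # Type 2 offsets are 0 and leave the threshold unchanged; type 1
    # offsets are greater than 0 and raise the criterion. Adding over
    # ALL partial regions avoids per-region branching.
    updated = np.asarray(reference, dtype=float) + np.asarray(offsets, dtype=float)
    return np.clip(updated, 0.0, 1.0)  # keep thresholds in [0, 1] (assumption)

reference = np.full((3, 3), 0.5)   # initial thresholds
offsets = np.zeros((3, 3))         # type 2 offsets everywhere ...
offsets[1, 1] = 0.3                # ... except one type 1 offset
updated = apply_offset_table(reference, offsets)
```

Only the region with the type 1 offset has its threshold raised (here to 0.8); all others keep their current value.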
With respect to the article image that is temporarily determined to include a defective image, the appearance inspection system 1000 determines again whether or not each partial region of the article image includes a defective image, using a threshold value that further increases the criterion for determining that the image is a defect.
When it is determined that no partial region in the article image includes a defective image, the appearance inspection system 1000 registers the article image as one of the "plurality of images" shown in (a) above. On the other hand, when it is determined that some partial region in the article image still includes a defective image, the appearance inspection system 1000 determines that the article in the article image includes a defect.
According to the above processing, it is possible to suppress excessive detection while preventing non-detection.
In the present embodiment, the "defect" described above is classified into a plurality of types, and the presence or absence of a defect is determined for each type. Examples of the types include "dirt", "foreign matter", "damage", "burr", "notch", "deformation", "color irregularity", "unevenness", and "blur". One or more of the above types of "defect" may be present in the image of an inspection object. On the other hand, an image of a non-defective product (the above (a)) may contain a feature that was once determined to be a "defect" of one or more of the above types. Therefore, the reference table (b) and the offset table (c) may be provided for each type of defect and used when determining the presence or absence of a defect for each type. In the embodiment described later, one type corresponding to a defect is illustrated and described; describing all types would be redundant, so their description is omitted.
The image processing apparatus 100 that has acquired the image data performs the aforementioned processing to perform the appearance inspection of the workpiece 70. The content of this processing will be described in further detail below.
3. Visual inspection process
3.1 Explanation of idea of appearance inspection processing
Fig. 6 is a flowchart showing a procedure of the appearance inspection process.
In step S2, the arithmetic circuit 10 performs predetermined defect detection processing on the image data, and extracts the inspection object as a defect candidate when the image of the inspection object includes at least one defect image. The "predetermined defect detection processing" referred to here is the defect detection processing using the above-described convolutional neural network. When at least one defect image is found in the image of an inspection object by the defect detection processing, the inspection object is extracted as a defect candidate. Fig. 7 shows an example of the result of the defect detection processing when there is a damage in a certain partial region 72b. In this example, the defect type output from the convolutional neural network is "damage", and the "probability value" indicating how likely the region is to be a defect is 0.75. As a result, the workpiece 70 shown in fig. 7 is extracted as a defect candidate. In fig. 7, only one partial region is determined to have a defect, but a plurality of defects may be present. In that case, defect types other than "damage" may also be assigned. In this specification, a partial region determined to have a defect is sometimes referred to as a "defect candidate region".
Reference is again made to fig. 6. In step S4 and subsequent steps, the following processing is performed: for an inspection object temporarily determined to contain a defect, the criterion for determining a defect is changed based on the relationship with the plurality of articles determined to be non-defective, and the presence or absence of the defect is determined anew.
In the following processing, the above-described "reference table" and "offset table" prepared in advance are used. In the present embodiment, the "offset table" includes the offset values of type 1 in which a part of the plurality of threshold values is changed and the offset values of type 2 in which the remaining part is not changed. The offset values are used to change or maintain the threshold value of the criterion for determining whether the defect is set for each partial area. The method of creating the offset table will be described later.
In step S4, the arithmetic circuit 10 cuts out a partial image from the image of the inspection object extracted as the defect candidate. The position of the partial image to be cut out is the position of the partial area corresponding to the offset value of type 1 in the offset table 14c.
Fig. 8 is a diagram schematically showing the offset table 14c (upper stage) and the position of the partial image cut out from the image 72 of the workpiece 70 (lower stage). The number of rows and columns of the offset table 14c arranged in a grid pattern corresponds to the number of rows and columns that divide a plurality of partial regions. This allows the positions of the elements of the offset table 14c to be associated with the positions of the partial regions set in the image 72.
Referring to the upper stage of fig. 8, offset table 14c includes offset values of type 1 (values other than 0) and offset values of type 2 (values of 0). The images of the partial areas 72b, 72c, 72d, and 72e corresponding to the respective positions to which the offset values of type 1 are assigned are cut out by the processing of step S4.
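The cutting operation of step S4 can be sketched as follows, assuming the image divides evenly into a grid with the same number of rows and columns as the offset table; the function name and grid handling are illustrative assumptions.

```python
import numpy as np

def cut_partial_images(image, offset_table):
    # One grid cell per offset-table element; only regions with a
    # type 1 (non-zero) offset value are cut out.
    rows, cols = offset_table.shape
    h, w = image.shape[0] // rows, image.shape[1] // cols
    cut = {}
    for r, c in zip(*np.nonzero(offset_table)):
        cut[(int(r), int(c))] = image[r * h:(r + 1) * h, c * w:(c + 1) * w]
    return cut

# example: a 6x6 image divided into a 3x3 grid of 2x2 partial regions
offsets = np.zeros((3, 3))
offsets[0, 1] = 0.3   # type 1 offsets mark the regions to cut out
offsets[2, 2] = 0.2
parts = cut_partial_images(np.arange(36).reshape(6, 6), offsets)
```

Regions whose offset value is of type 2 (zero) are simply skipped, mirroring the selection shown in the lower stage of fig. 8.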
Reference is again made to fig. 6.
In step S6, the arithmetic circuit 10 determines whether or not the image feature amount of the cut-out partial image is classified into the over-detection category. The "over-detection category" is the category into which the image feature amounts of the partial images, cut from the plurality of non-defective images at the position of the partial region corresponding to a type 1 offset value, are classified at a ratio equal to or higher than a predetermined ratio.
The "over-detection category" is explained in a more easily understood manner. First, a plurality of images determined to be non-defective are prepared. These images include images in which over-detection occurred in a previous appearance inspection process. From these images, the partial regions in which over-detection occurred (hereinafter referred to as "over-detection partial regions") are cut out. When the defect detection processing of step S2 is performed on an over-detection partial region, one or more image feature amounts related to the input partial-region image can be obtained from the intermediate layer of the convolutional neural network. When predetermined clustering processing is performed on the obtained image feature amounts, they can be divided into one or more clusters. Of these, the cluster having the largest number of elements can be said to contain the image feature amounts that tended to be over-detected in past images. The image feature amounts may differ depending on the type of defect for which over-detection occurred (hereinafter referred to as the "over-detection type"). The processing of step S6 determines whether or not the partial image cut out in step S4 has an image feature amount belonging to the over-detection category.
In addition, in the case where the number of image feature quantities extracted from the intermediate layer of the convolutional neural network is one, clustering may be performed on a one-dimensional space according to the size of the image feature quantities. When the number of image feature amounts is P (an integer of 2 or more), clustering can be performed on the P-dimensional space according to the size of the image feature amounts.
In step S8, when the image feature amount of the cut partial image is classified into the excessive detection type, the arithmetic circuit 10 changes the threshold value of the reference table at the position of the partial region using the offset value of type 1. This further improves the criterion for determining that the inspection object is a defective product.
The classification of the image feature amount of the cut-out partial image into the overdetection type can be said to mean that the image feature amount tends to be easily overdetected. The processing in step S8 is significant in that the threshold value of the reference table at the position of the partial region is updated so as to be high, thereby suppressing such excessive detection.
Fig. 9 is a diagram for explaining the processing of step S8 in fig. 6. Here, assume a case where a defect exists in the partial region 72b (fig. 7), and that in step S6 the image feature amount of the partial image cut out from the partial region 72b is classified into the over-detection category. For convenience of explanation, the only partial image whose image feature amount is classified into the over-detection category is that of the partial region 72b (fig. 7).
The arithmetic circuit 10 adds the threshold value and the offset value at each pair of positions of the reference table 14b and the offset table 14c having the same row and column. In the example of fig. 9, in the reference table 14b, the value at the position 72-1 corresponding to the partial region 72b is "0.5". In the offset table 14c, the type 1 offset value at the position 72-2 corresponding to the partial region 72b is "0.3". In the offset table 14c, the type 2 offset value "0.0" is set at each position corresponding to a partial region other than the partial region 72b.
When the threshold value and the offset value at the position corresponding to the partial region 72b are added, the result is "0.8". "0.8" is written at the position 72-3 of the updated reference table 14b-2 obtained by the addition. Since it is assumed that the only partial image whose image feature amount is classified into the over-detection category is the partial region 72b (fig. 7), the difference between the reference tables 14b and 14b-2 before and after the update is only the threshold at the position 72-3. However, when a plurality of image feature amounts are classified into the over-detection category, the thresholds at the positions of other partial regions may also be updated.
Reference is again made to fig. 6.
In step S10, the arithmetic circuit 10 determines whether or not the image of the inspection object includes at least one defective image, using the reference table 14b-2 with the updated threshold. Specifically, for the partial region 72b (fig. 7) of the image determined in step S2 to have a defect, the obtained probability value is compared with the threshold of the reference table 14b-2. If, as a result of the comparison, the probability value is equal to or greater than the threshold, the arithmetic circuit 10 determines that the defect still exists. If the probability value is smaller than the threshold, the arithmetic circuit 10 determines that no defect is present.
Fig. 10 is a diagram for explaining step S10 in fig. 6. The arithmetic circuit 10 compares a probability value "0.75" that the image at the position of the partial region 72b contains a defect with a threshold value "0.8" of the reference table 14b-2 corresponding to the position. Since the probability value is smaller than the threshold value, the arithmetic circuit 10 determines that the partial region 72b of the image which is temporarily determined to include the defect in step S2 does not have the defect.
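The comparison of step S10 reduces to a single inequality; the function name below is illustrative, and the values are those of fig. 10.

```python
def still_defective(probability, threshold):
    # A defect candidate remains a defect only when its probability
    # value reaches the (possibly raised) threshold.
    return probability >= threshold

# fig. 10: probability 0.75 vs. the updated threshold 0.8
verdict = still_defective(0.75, 0.8)   # the over-detection is suppressed
```

With the original threshold of 0.5 the same region would still have been judged defective, which is exactly the behavior the offset update is meant to change.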
As described above, even when it is once determined that a defect is present, the threshold serving as the determination criterion is updated only when the image is similar to past non-defective results in which over-detection occurred. As a result, it is possible to prevent a defect from going undetected while suppressing over-detection of defects.
3.2 detailed description of appearance inspection processing
A more specific example of the appearance inspection process will be described. In the following description, "attribute data" of the inspection object is introduced. The "attribute data" of an inspection object is data indicating attributes relating to the manufacturing conditions of that inspection object. As shown in the attribute data 14d in fig. 3, the attributes relating to the manufacturing conditions include, for example, the mold used when the inspection object was manufactured, the production line, the manufacturing plant, and the manufacturing date. The attribute data may be at least one selected from the group consisting of the above-described mold, production line, manufacturing plant, manufacturing date, and the like. The attribute data may be expressed by, for example, a character string indicating a name, a number, a symbol, a numerical sequence, or a combination thereof. The attribute data is prepared for each inspection object. However, when the manufacturing conditions of the inspection objects are the same, one piece of attribute data may serve as the attribute data for each of them.
The reason for adopting the attribute data will be explained. It is empirically known that when a defect is confirmed in a certain inspection object, the same defect is also confirmed in other inspection objects manufactured with the same mold, on the same production line, and/or at the same manufacturing plant. Therefore, when it is determined that a defect exists in an inspection object and the presence or absence of the defect is then determined anew, the same reference table and offset table are used for a plurality of inspection objects having the same attributes. This can reduce variation in the results of the renewed defect determination.
As described above, after the attribute to be used is determined, the set of the reference table and the offset table is prepared according to the type of the attribute. A set of the reference table and the offset table may be prepared for each defect category.
Fig. 11 is a flowchart showing a specific procedure of the appearance inspection process. This flow is executed by the arithmetic circuit 10 using the convolutional neural network 12n in the memory 12 and various data in the storage device 14.
First, in step S20, the arithmetic circuit 10 acquires image data of the inspection object. Various image data acquisition methods are conceivable. For example, the imaging device 30 captures an image of the inspection object, acquires image data, and transmits the image data to the interface device 22a. The arithmetic circuit 10 then receives the image data from the interface device 22a. Alternatively, the interface device 22a may receive the image data of the inspection object from the imaging device 30, and after the storage device 14 stores the image data, the arithmetic circuit 10 may acquire the image data from the storage device 14. In the latter example, the imaging device 30 need not be included as part of the appearance inspection system 1000.
In step S22, the arithmetic circuit 10 acquires attribute data of the object to be inspected.
In step S24, the arithmetic circuit 10 performs a defect detection process using the convolutional neural network 12n. Then, in a case where a defect candidate region is detected within the image of the object, the object is extracted as a defect candidate. At this time, the defect type and the probability value are output for each defect candidate region.
In step S26, the arithmetic circuit 10 executes offset table application processing. The offset table application process corresponds to the process shown in fig. 9. By the offset table application processing, the reference table 14b is updated by the offset table 14c, and a new reference table 14b-2 is acquired. The offset table application process will be described in detail later.
In step S28, the arithmetic circuit 10 determines whether or not each defect candidate region actually has a defect, using the probability value output for each defect candidate region in step S24 and the thresholds of the updated reference table 14b-2. More specifically, the arithmetic circuit 10 compares the probability value at the position of the defect candidate region with the threshold of the reference table 14b-2 corresponding to that position. If the probability value is equal to or greater than the threshold, it is determined that the inspection object still has a defect, that is, the inspection object is a defective product, and the process ends. On the other hand, if the probability value is smaller than the threshold, the arithmetic circuit 10 determines that no defect is present, and the process proceeds to step S30.
In step S30, the arithmetic circuit 10 executes the update processing of the database 14Z. The update processing of the database 14Z is the following processing: the image, attributes, and detection result of the inspection object once determined to be a defect candidate are registered in the storage device 14 as those of a non-defective product, and the offset table is updated or deleted depending on whether or not the image feature amounts show a similar tendency. The update processing of the database 14Z will be described specifically later with reference to fig. 16.
The appearance inspection process of fig. 11 is ended in the above manner.
Further, although the processing of the arithmetic circuit 10 ends after the inspection object is determined to be a defective product in step S28, an inspector may thereafter determine by visual observation, for example, whether or not the inspection object actually has a defect.
3.3 database registration processing (including similar tendency confirmation processing and offset table creation/update processing)
Next, a process of registering the offset table in the database 14Z of the storage device 14 will be described with reference to fig. 12. The processing of fig. 12 is a processing of generating an offset table as a premise for executing step S30 of fig. 11. In the present embodiment, the process of updating the database 14Z is also performed, but the update process will be described later with reference to fig. 16.
Fig. 12 is a flowchart showing a procedure of the database registration processing. The flow is executed by the arithmetic circuit 10.
In step S40, the arithmetic circuit 10 newly registers the non-defective image, the attributes, and the detection result in the storage device 14, thereby generating the database 14Z. The "non-defective image" is an image of an article determined to be a non-defective product. As already described, the articles determined to be "non-defective" are not limited to articles initially determined as such; they also include inspection objects once determined to be defective and later determined to be non-defective, and inspection objects determined to be non-defective as a result of the processing of the present disclosure described later. The "detection result" includes the detected range, the defect type, and the probability of a defect. When the detected range is rectangular, it can be expressed, for example, by the coordinates of its upper-left corner and lower-right corner. In addition, even for a non-defective product, the over-detection state differs from sample to sample, and is therefore registered as a separate detection result.
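A sketch of one registered detection result, assuming a rectangular range expressed by its corner coordinates as described; all field names are illustrative assumptions, not the actual schema of the database 14Z.

```python
from dataclasses import dataclass

@dataclass
class DetectionResult:
    top_left: tuple       # (x, y) of the upper-left corner of the range
    bottom_right: tuple   # (x, y) of the lower-right corner
    defect_type: str      # e.g. "damage"
    probability: float    # probability value output by the network
    over_detected: bool   # over-detection state, kept per sample

record = DetectionResult((10, 20), (42, 55), "damage", 0.75, True)
```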
In step S42, the arithmetic circuit 10 executes the similar tendency confirmation processing. The "similar tendency confirmation processing" refers to the following processing: non-defective images having the same attributes are clustered based on their image feature amounts, and when there is a cluster containing non-defective images at a predetermined ratio or more, the cluster classifier that produced the clustering is stored. Obtaining such a cluster classifier means that the convolutional neural network 12n has performed over-detection on images having a specific attribute and a specific image feature amount. The similar tendency confirmation processing will be described later with reference to figs. 13 and 14.
In step S44, the arithmetic circuit 10 determines whether or not there is a similar tendency based on the result of the similar tendency confirmation processing performed in step S42. If there is a similar tendency, the process proceeds to step S46; if there is no similar tendency, no offset table is generated and the process ends.
In step S46, the arithmetic circuit 10 generates an offset table and registers it in the database 14Z. When the excessive detection is performed, the offset table is generated, and then, when it is determined that an image having the same attribute and image feature amount is a defect, the criterion (threshold value) for determining the defect can be changed. The offset table generation process will be described later with reference to fig. 15.
Through the above processing, when the predetermined condition is satisfied, the processing of registering the offset table in the database 14Z is completed. Through the above-described processing, as shown in fig. 3, the offset table 14c is registered in the database 14Z.
Next, the specific contents of the similar tendency confirmation processing will be described with reference to figs. 13 and 14. The similar tendency confirmation processing is the following processing: an offset table is generated when the non-defective image newly registered in step S40 of fig. 12 shows a similar tendency in relation to the non-defective images already registered in the database 14Z. Since the similar tendency confirmation processing is performed per attribute of the inspection object, the target attribute is determined in advance when the processing is executed.
Fig. 13 is a flowchart showing a procedure of a similar tendency confirmation process. The flow is executed by the arithmetic circuit 10.
In step S50, the arithmetic circuit 10 extracts from the database 14Z the n newest non-defective images (n is an integer of 2 or more) among the non-defective images having the target attribute.
In step S52, the arithmetic circuit 10 cuts out the over-detection regions of the one or more non-defective images in which over-detection occurred among the extracted n non-defective images.
In step S54, the arithmetic circuit 10 extracts a predetermined image feature amount. The image feature amount to be extracted is determined in advance. The image feature may be, for example, a feature extracted by the convolutional neural network 12n, or may be a GLCM feature. The number of extracted image feature amounts is arbitrary.
In step S56, the arithmetic circuit 10 performs clustering processing using non-defective images of the same attribute including the non-defective image that has been excessively detected. For example, the K-means method can be used as the clustering process. The K-means method is a well-known technique, and thus, a detailed description thereof will be omitted. Known clustering techniques other than the K-means method may also be used.
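For reference, a minimal K-means sketch on an (N, P) feature matrix. It stands in for the well-known technique the text cites and is not the system's actual implementation; the deterministic center initialization and fixed iteration count are illustrative choices.

```python
import numpy as np

def kmeans(features, k, iters=50):
    features = np.asarray(features, dtype=float)
    # deterministic initialization: spread initial centers over the samples
    centers = features[np.linspace(0, len(features) - 1, k).astype(int)].copy()
    for _ in range(iters):
        # assign every sample to its nearest center
        dists = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each center to the mean of its assigned samples
        for j in range(k):
            if np.any(labels == j):
                centers[j] = features[labels == j].mean(axis=0)
    return labels, centers

# two well-separated groups of two-dimensional image feature vectors
labels, centers = kmeans([[0, 0], [0.1, 0], [0, 0.1],
                          [5, 5], [5.1, 5], [5, 5.1]], k=2)
```

In practice a library implementation would be used; the point is only that each non-defective image ends up labeled with the cluster its feature amounts fall into.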
In step S58, the arithmetic circuit 10 determines whether or not the number of samples belonging to the cluster containing the over-detected non-defective image is equal to or greater than m% (m is a real number from 0 to 100, for example 60) of the total number of samples. If the number of samples belonging to the cluster is equal to or greater than m% of the total, there is a tendency similar to past over-detection, and the process proceeds to step S60. If the ratio is less than m%, the over-detection is inferred to be incidental, with no similar tendency, and the process ends.
In step S60, the arithmetic circuit 10 stores the cluster classifier in the storage device 14.
Fig. 14 is a diagram for specifically explaining steps S56 and S58 of fig. 13. Each non-defective image is plotted using two image feature amounts. For example, the horizontal axis represents an image feature amount related to luminance, and the vertical axis represents a GLCM feature amount related to contrast. As a result of clustering by a predetermined clustering technique, the images are classified into three clusters A, B, and C as shown in fig. 14. Of these, attention is focused on cluster A.
Cluster A includes the point s, obtained by plotting the image feature amount of the image determined to be defective by the convolutional neural network 12n, and a plurality of points a, obtained by plotting the respective image feature amounts of the non-defective image group registered in the database 14Z.
Of the total of n images, k images, including the point s, have their image feature amounts plotted in cluster A. The arithmetic circuit 10 calculates Q = (k/n) × 100, the ratio at which the non-defective images are classified into cluster A. Then, in step S58 of fig. 13, the arithmetic circuit 10 determines whether or not Q ≥ m. When Q is equal to or greater than m, the image corresponding to the point s can be estimated to be over-detected, like the non-defective images for which over-detection previously occurred. Therefore, in step S60 of fig. 13, the cluster classifier used for the clustering is stored and later used for determining newly captured images of objects to be inspected.
Next, specific contents of the offset table generation/update process will be described with reference to fig. 15. The generation/update processing of the offset table is the processing executed in step S46 of fig. 12.
Fig. 15 is a flowchart showing a procedure of the generation/update processing of the offset table. The flow is executed by the arithmetic circuit 10.
In step S70, the arithmetic circuit 10 extracts the n most recent non-defective images having the attribute of the object to be inspected from the database 14Z. This process is the same as step S50 of fig. 13, although "n" need not be the same value.
In step S72, the arithmetic circuit 10 generates an average image of the extracted n non-defective images. The pixel value at coordinates (x, y) of the average image is the average of the pixel values at coordinates (x, y) of the n extracted non-defective images.
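The pixel-wise averaging of step S72 is a single array operation; a minimal sketch, assuming same-sized grayscale images held as NumPy arrays:

```python
import numpy as np

def average_image(images):
    """Step S72: pixel-wise mean of n same-sized non-defective images.
    The value at (x, y) of the result is the mean of the values at
    (x, y) of the n inputs."""
    stack = np.stack([np.asarray(im, dtype=np.float64) for im in images])
    return stack.mean(axis=0)
```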
In step S74, the arithmetic circuit 10 inputs the average image to the convolutional neural network 12n and generates an offset table from the output result. More specifically, the average image can be regarded as a typical non-defective image. When the average image is input to the convolutional neural network 12n and the defect detection process is performed, defect candidate regions can be detected within the average image. This is because the convolutional neural network 12n also detects defect candidate regions for non-defective images whose image feature amounts belong to cluster A of fig. 14. The detection result may contain one or more defect candidate regions, and the defect type and a probability value may be output for each defect candidate region. The arithmetic circuit 10 generates an offset table in which offset values (type 1 offset values) corresponding to the magnitude of the probability value are described for the one or more partial regions. The generated offset table is, for example, the same as the offset table 14c of fig. 8.
In step S76, the arithmetic circuit 10 extracts the defect candidate regions for each defect type. This process prepares an offset table for each defect type. When an offset table includes a defect candidate region, it can be said that the convolutional neural network 12n over-detects that defect type. Therefore, a defect candidate region may also be referred to as an "over-detection region".
In step S78, the arithmetic circuit 10 sets the values of the regions other than the over-detection regions to 0 in the offset table for each defect type. These zeros are the type 2 offset values of the offset table.
In step S80, the arithmetic circuit 10 saves the offset table for each defect type in the database. As a result, an offset table for each defect type is held in the database 14Z. In fig. 3, an offset table 14c for one defect type (e.g., defect type: damage) is registered in the database 14Z.
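Steps S74 to S78 can be condensed into one array operation. This is a sketch under assumptions not in the text: the network's per-pixel probability map for one defect type, a boolean over-detection mask derived from the detected defect candidate regions, and a probability-to-offset scaling factor are all hypothetical stand-ins.

```python
import numpy as np

def build_offset_table(prob_map, over_detection_mask, scale=0.5):
    """Type 1 offset values proportional to the detection probability
    inside the over-detection regions (step S74); type 2 offset values
    of 0 everywhere else (step S78). `scale` is an assumed mapping from
    a probability value to a threshold offset."""
    prob_map = np.asarray(prob_map, dtype=float)
    return np.where(over_detection_mask, scale * prob_map, 0.0)
```

One such table would be built per defect type and saved, mirroring step S80.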
Even when an offset table already exists, a new offset table is generated and the existing one is overwritten whenever a non-defective image is newly registered. In that case, the process of fig. 15 may be referred to as the update process of the offset table.
3.4 Database update process
Next, the update process of the database 14Z will be described with reference to fig. 16. Steps S40, S42, and S44 are the same as those in fig. 12, and therefore, description thereof is omitted.
Fig. 16 is a flowchart showing a procedure of the update process of the database 14Z.
If it is determined in step S44 that a similar tendency exists, the process proceeds to step S96; if it is determined that no similar tendency exists, the process proceeds to step S98.
In step S96, the arithmetic circuit 10 updates the offset table. In step S98, on the other hand, the arithmetic circuit 10 deletes the offset table. The offset table is deleted to suppress the risk of defects going undetected: when an offset is set, the risk of missed detection increases slightly. Therefore, when it is determined that no similar tendency exists, the offset table is deleted on the assumption that there is no over-detection tendency, thereby suppressing the risk of missed detection.
3.5 Application process of the offset table
Next, specific contents of the application process of the offset table will be described with reference to fig. 17. The application processing of the offset table is processing performed in step S26 in fig. 11.
Fig. 17 is a flowchart showing a procedure of application processing of the offset table. The flow is executed by the arithmetic circuit 10.
In step S100, the arithmetic circuit 10 loads an offset table of the same attribute as that of the object from the storage device. The attribute of the test object is determined based on the attribute data acquired in step S22 in fig. 11.
In step S102, the arithmetic circuit 10 cuts out a partial region of the image of the object to be inspected. The position of the cut partial region corresponds, for example, to a position at which a type 1 offset value (a non-zero value) is described in the offset table 14c (fig. 8).
In step S104, the arithmetic circuit 10 inputs the image feature amount of the cut partial region to the cluster classifier stored in step S60 of fig. 13.
In step S106, the arithmetic circuit 10 determines whether or not the image feature amount of the partial region is classified into the cluster containing the over-detection, which is cluster A in the example of fig. 14. If the image feature amount is classified into that cluster, the process proceeds to step S108; otherwise, the process ends.
In step S108, the arithmetic circuit 10 adds the type 1 offset value to the determination criterion (threshold value) of the cut partial region. This corresponds to the process described with reference to fig. 9.
Through the above processing, the threshold value of the reference table is updated to a larger value. This widens the range judged non-defective and suppresses over-detection.
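Steps S100 to S108 followed by the final judgement can be sketched as below. The per-pixel probability map, the callable cluster check, and the simple "probability above threshold means defect" rule are assumptions for illustration, not the patented implementation:

```python
import numpy as np

def judge_with_offset(prob_map, reference_table, offset_table,
                      in_overdetection_cluster):
    """Raise the thresholds by the type 1 offsets only when the cut
    region's features fall in the over-detection cluster (steps
    S102-S108), then re-judge the image with the updated thresholds."""
    prob_map = np.asarray(prob_map, dtype=float)
    thresholds = np.asarray(reference_table, dtype=float).copy()
    offset_table = np.asarray(offset_table, dtype=float)
    region = offset_table > 0  # positions holding type 1 offset values
    if region.any() and in_overdetection_cluster(prob_map[region]):
        thresholds[region] += offset_table[region]  # widen the good range
    return bool(np.any(prob_map > thresholds))  # True -> defect remains
```

When the cluster check succeeds, a borderline detection inside the offset region no longer exceeds the raised threshold, which is exactly the over-detection suppression described above.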
4. Modification example
The image of the object to be inspected may be acquired by the appearance inspection apparatus 100 by imaging one or more articles in real time with an imaging device (camera). Alternatively, the captured image may first be stored in a mass storage device and later read out and acquired by the appearance inspection apparatus 100. The latter example is explained below.
Fig. 18 shows the configuration of an appearance inspection system 1100 according to a modification. The appearance inspection system 1100 includes the appearance inspection apparatus 100 and a secondary storage device 310 connected to a communication network 300. The communication network 300 is, for example, a wide area network (WAN) such as the internet, or a local area network (LAN) laid within a site such as a company. The imaging device 30 is also connected to the communication network 300, but the imaging device 30 is not an essential component of the appearance inspection system 1100 at the time the appearance inspection apparatus 100 performs the inspection.
The secondary storage 310 is a so-called cloud storage. In the present modification, the secondary storage device 310 receives and stores image data of an object to be inspected, which is captured and transmitted by the imaging device 30, via the communication network 300.
The interface device 22d of the appearance inspection apparatus 100 receives image data of the object to be inspected from the secondary storage device 310 via the communication network 300. The arithmetic circuit 10 acquires the image data received by the interface device 22d, and stores the image data in the storage device 14, for example, to execute the above-described processing.
The appearance inspection system and the computer program of the present disclosure can be suitably applied to appearance inspection of an article or a component in a manufacturing site such as a factory.
Description of the reference symbols
1000: an appearance inspection system; 10: an arithmetic circuit; 12: a memory; 14: a storage device; 16: a communication circuit; 18: an image processing circuit; 22a to 22c: an interface device; 30: a camera device; 120: an input device; 130: a monitor.
Claims (22)
1. An appearance inspection system for determining whether an object to be inspected is acceptable or not by using an image of the object, the appearance inspection system comprising:
a storage device that stores a plurality of images obtained by imaging a plurality of articles determined as non-defective articles, a reference table in which a plurality of threshold values are described, and an offset table in which a plurality of offset values are described;
an interface device that receives image data of an object to be inspected; and
an arithmetic circuit,
the image of the object under examination and the plurality of images each contain a plurality of partial regions,
each of the plurality of threshold values of the reference table and each of the plurality of offset values of the offset table are set in correspondence with the plurality of partial areas,
the plurality of thresholds are criteria for determining that the image of the inspection object included in each of the plurality of corresponding partial regions indicates a defect,
the plurality of offset values include a 1 st offset value that varies a part of the plurality of threshold values and a 2 nd offset value that does not vary the remaining part,
the arithmetic circuit performs the following processing:
a process (a) of performing a predetermined defect detection process on the image data, and extracting the object to be inspected as a defect candidate when an image of at least one defect is included in the image of the object to be inspected;
a process (b) of cutting out a partial image from the image of the inspection object extracted as the defect candidate, wherein a position of the cut-out partial image is a position of a partial region corresponding to the 1 st offset value in the offset table;
a process (c) of determining whether or not the image feature amount of the cut partial image is classified into an over-detection category into which the image feature amounts of the corresponding partial images of the plurality of images are classified at a ratio equal to or greater than a predetermined ratio, at the position of the partial region corresponding to the 1st offset value;
a process (d) of, when the image feature amount of the cut partial image is classified into the over-detection category, changing the threshold value of the reference table at the position of the partial region using the 1st offset value, thereby raising the reference for determining that the image of the inspection object indicates a defect; and
and (e) determining whether or not an image of at least one defect is included in the image of the object to be inspected, using the reference table in which the threshold value is changed.
2. The visual inspection system of claim 1,
the interface device also receives attribute data representing an attribute relating to a manufacturing condition of the inspection object,
the storage means stores the reference table and the offset table for each attribute classified in advance,
the process (b) includes the following processes:
a process (b 1) in which the arithmetic circuit reads out, from the offset tables of a plurality of types stored in the storage device, an identical-attribute offset table having an attribute identical to that of the object to be inspected, based on the attribute data of the object to be inspected; and
and a process (b 2) of cutting out a partial image from the image of the inspection object extracted as the defect candidate at the position of the partial region corresponding to the 1st offset value included in the read-out same-attribute offset table.
3. The visual inspection system of claim 1 or 2,
the attribute relating to the manufacturing condition is at least one selected from the group consisting of a mold, a production line, and a manufacturing factory used when manufacturing the inspection object.
4. The visual inspection system of any one of claims 1 to 3,
the arithmetic circuit detects an image of the at least one defect using a 1 st image feature amount of the image data in the defect detection process performed by the process (a),
when the image feature amount used in the processing (c) is set as a 2 nd image feature amount,
the 1 st image feature amount and the 2 nd image feature amount are the same.
5. The visual inspection system of any one of claims 1 to 3,
the image feature amount in the processing (c) is an image feature amount related to a gray level co-occurrence matrix (GLCM) of an image.
6. The visual inspection system of any one of claims 1 to 5,
the defect detection processing in the processing (a) is processing for detecting a plurality of kinds of defects,
the processing (a) includes extracting the defect candidate according to the type of the at least one detected defect.
7. The visual inspection system of claim 6,
the storage means stores the reference table and the offset table for each type of the defect,
the process (b) includes the following processes:
a process (b 3) in which the arithmetic circuit reads out a same type offset table corresponding to the type of the at least one defect from the offset tables of the plurality of types stored in the storage device; and
and a process (b 4) of cutting out a partial image from the image of the inspection object extracted as the defect candidate at the position of the partial region corresponding to the 1st offset value included in the read-out same-type offset table.
8. The visual inspection system of any one of claims 1 to 7,
the plurality of partial regions are regions each represented by one pixel,
each of the plurality of threshold values of the reference table and each of the plurality of offset values of the offset table are set to correspond to each pixel.
9. The visual inspection system of any one of claims 1 to 8,
the appearance inspection system further includes the following process (f): when it is determined in the process (e) that the image of the inspection object does not include the image of the at least one defect, additionally storing the image of the inspection object in the storage device as one of the plurality of images obtained by imaging the plurality of articles determined as non-defective articles.
10. The visual inspection system of claim 9,
the appearance inspection system further includes the following process (g): after the processing (f), rewriting the offset table stored in the storage device into an offset table in which the 2 nd offset value is changed.
11. The visual inspection system of claim 10,
the processing (f) includes the following processes:
a process (f 1) of determining whether or not there is a similar over-detection result; and
a process (f 2) of rewriting the 2nd offset value of the offset table when the determination result indicates that there is a similar over-detection result,
the process (f 1) is a process of determining that there is a similar over-detection result when, at the position of the partial region corresponding to the 1st offset value, the image feature amounts of the corresponding partial images of the plurality of images are classified into the over-detection category at a ratio equal to or greater than the predetermined ratio.
12. The visual inspection system of claim 11,
the process (f 1) is a process of determining that there is a similar over-detection result when, at the position of the partial region corresponding to the 1st offset value, the image feature amounts are classified into the same cluster at a ratio equal to or greater than the predetermined ratio.
13. The visual inspection system of claim 12,
the appearance inspection system further has a category classifier used in a predetermined clustering process,
the process (c) is a process of determining whether or not the image feature amount of the cut partial image is classified into the over-detection category by the category classifier.
14. The visual inspection system of claim 11,
the process (f) includes the following process (f 3): when the determination result indicates that there is no similar over-detection result, discarding the offset table in which the 2nd offset value is changed.
15. The visual inspection system of any one of claims 10 to 14,
the interface device also receives attribute data representing an attribute relating to a manufacturing condition of the inspection object,
the storage device stores the plurality of images obtained by imaging the plurality of articles determined as non-defective articles, the reference table, and the offset table in association with attributes classified in advance,
the process (g) includes the following processes:
a process (g 1) of selecting, from the plurality of images stored by the storage means, an image group having the same attribute as the attribute data received by the interface means;
a process (g 2) for generating at least one reference image from the image group;
a process (g 3) of performing the predetermined defect detection process on the image data of the at least one reference image and, when it is determined that the at least one reference image includes one or more images of defects, generating an over-detection table in which a region indicating the position of the image of each defect is associated with a value indicating the degree of occurrence of over-detection;
a process (g 4) of describing, in the over-detection table, a 1st value indicating the degree of occurrence of the over-detection at the position of the image of each defect, and a 2nd value indicating that the over-detection does not occur at positions other than the position of the image of each defect; and
a process (g 5) of rewriting the offset table stored in the storage device with an offset table in which the 2nd offset value is changed, using the over-detection table.
16. The visual inspection system of any one of claims 1 to 15,
the visual inspection system also has a camera device,
the imaging device captures the object to be inspected, acquires the image data, and outputs the image data.
17. The visual inspection system of claim 16,
the camera device transmits the image data of the object to be inspected to the interface device,
the arithmetic circuit acquires the image data received by the interface device.
18. The visual inspection system of claim 16,
the interface device receives image data of the object from the imaging device,
the storage device saves the received image data,
the arithmetic circuit acquires image data of the object from the storage device.
19. The visual inspection system of claim 16,
the visual inspection system also has a secondary storage device connected to the communication network,
the image pickup device transmits image data of the object to be inspected to the secondary storage device via the communication network,
the secondary storage device saves the received image data,
the interface device receives image data of the inspection object from the secondary storage device via the communication network,
the arithmetic circuit acquires the image data received by the interface device.
20. The visual inspection system of any one of claims 1 to 19,
the predetermined defect detection processing in the processing (a) is processing for determining whether or not the image of the inspection object includes the image of the at least one defect using a learned neural network,
the neural network is constructed by a machine learning process in which an image of each of a plurality of articles including a non-defective article and a defective article and non-defective data indicating whether each article is a non-defective article or a defective article are used as teacher data.
21. The visual inspection system of claim 20,
the neural network is constructed by a machine learning process in which an image of each of a plurality of articles including non-defective articles and defective articles, non-defective data indicating whether each article is a non-defective article or a defective article, and defect type data indicating the type of defect of the defective article are used as teacher data.
22. A computer program executed by an arithmetic circuit of an appearance inspection system for judging whether an object to be inspected is acceptable or not by using an image of the object to be inspected,
the appearance inspection system comprises:
a storage device that stores a plurality of images obtained by imaging a plurality of articles determined as non-defective articles, a reference table in which a plurality of threshold values are described, and an offset table in which a plurality of offset values are described;
an interface device that receives image data of an object to be inspected; and
an arithmetic circuit,
the image of the object under examination and the plurality of images each contain a plurality of partial regions,
each of the plurality of threshold values of the reference table and each of the plurality of offset values of the offset table are set in correspondence with the plurality of partial areas,
the plurality of thresholds are criteria for determining that the image of the inspection object included in each of the plurality of corresponding partial regions indicates a defect,
the plurality of offset values include a 1 st offset value that varies a part of the plurality of threshold values and a 2 nd offset value that does not vary the remaining part,
the computer program causes the arithmetic circuit to execute:
a process (a) of performing a predetermined defect detection process on the image data, and extracting the object to be inspected as a defect candidate when an image of at least one defect is included in the image of the object to be inspected;
a process (b) of cutting out a partial image from the image of the inspection object extracted as the defect candidate, wherein a position of the cut-out partial image is a position of a partial region corresponding to the 1 st offset value in the offset table;
a process (c) of determining whether or not the image feature amount of the cut partial image is classified into an over-detection category into which the image feature amounts of the corresponding partial images of the plurality of images are classified at a ratio equal to or greater than a predetermined ratio, at the position of the partial region corresponding to the 1st offset value;
a process (d) of, when the image feature amount of the cut partial image is classified into the over-detection category, changing the threshold value of the reference table at the position of the partial region using the 1st offset value, thereby raising the reference for determining that the image of the inspection object indicates a defect; and
and (e) determining whether or not an image of at least one defect is included in the image of the object to be inspected, using the reference table in which the threshold value is changed.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020-052573 | 2020-03-24 | ||
JP2020052573 | 2020-03-24 | ||
PCT/JP2020/039729 WO2021192376A1 (en) | 2020-03-24 | 2020-10-22 | Visual inspection system and computer program |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115298539A true CN115298539A (en) | 2022-11-04 |
Family
ID=77891669
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202080098729.0A Pending CN115298539A (en) | 2020-03-24 | 2020-10-22 | Appearance inspection system and computer program |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN115298539A (en) |
WO (1) | WO2021192376A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230230217A1 (en) * | 2022-01-19 | 2023-07-20 | Hon Hai Precision Industry Co., Ltd. | Image detection method, computing device, and storage medium |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2023007260A (en) * | 2021-07-01 | 2023-01-18 | 株式会社日立製作所 | Computer and visual inspection method |
WO2023095490A1 (en) * | 2021-11-26 | 2023-06-01 | 新興窯業株式会社 | Program, information processing device, information processing method, method for generating learning model, and imaging system |
JP7554487B2 (en) * | 2022-02-08 | 2024-09-20 | 慶應義塾 | DEFECT ESTIMATION DEVICE, DEFECT ESTIMATION METHOD, AND PROGRAM |
JP7297354B1 (en) * | 2023-02-07 | 2023-06-26 | 株式会社シュヴァルベル | Image processing system and image processing method |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001126064A (en) * | 1999-10-26 | 2001-05-11 | Nachi Fujikoshi Corp | Method for detecting defect on metallic surface |
JP2008004863A (en) * | 2006-06-26 | 2008-01-10 | Hitachi High-Technologies Corp | Appearance inspection method and device therefor |
JP2013224833A (en) * | 2012-04-20 | 2013-10-31 | Keyence Corp | Visual inspection device, visual inspection method and computer program |
CN108732182A (en) * | 2017-04-21 | 2018-11-02 | 欧姆龙株式会社 | Sheet material check device and inspection system |
JP2019132720A (en) * | 2018-01-31 | 2019-08-08 | 日本特殊陶業株式会社 | Visual inspection device and visual inspection method |
WO2019194064A1 (en) * | 2018-04-02 | 2019-10-10 | 日本電産株式会社 | Image processing device, image processing method, appearance inspection system, and appearance inspection method |
CN110390661A (en) * | 2018-04-20 | 2019-10-29 | 欧姆龙株式会社 | It checks management system, check managing device and checks management method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019176614A1 (en) * | 2018-03-16 | 2019-09-19 | 日本電産株式会社 | Image processing device, image processing method, and computer program |
- 2020-10-22 WO PCT/JP2020/039729 patent/WO2021192376A1/en active Application Filing
- 2020-10-22 CN CN202080098729.0A patent/CN115298539A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2021192376A1 (en) | 2021-09-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7004145B2 (en) | Defect inspection equipment, defect inspection methods, and their programs | |
CN115298539A (en) | Appearance inspection system and computer program | |
JP7015001B2 (en) | Defect inspection equipment, defect inspection methods, and their programs | |
JP6869490B2 (en) | Defect inspection equipment, defect inspection methods, and their programs | |
JP5953842B2 (en) | Image inspection method and inspection area setting method | |
JP6794737B2 (en) | Information processing equipment, information processing methods, programs and inspection systems | |
JP5075111B2 (en) | Image classification reference updating method, program, and image classification apparatus | |
US9183450B2 (en) | Inspection apparatus | |
JP5956814B2 (en) | Appearance inspection apparatus, appearance inspection method, and computer program | |
CN113096119B (en) | Method and device for classifying wafer defects, electronic equipment and storage medium | |
CN110596120A (en) | Glass boundary defect detection method, device, terminal and storage medium | |
JP2010230452A (en) | Method for inspecting defect and defect inspection system | |
WO2019194064A1 (en) | Image processing device, image processing method, appearance inspection system, and appearance inspection method | |
KR20210091189A (en) | Optimization of setup steps in automated visual inspection processes | |
CN112200790B (en) | Cloth defect detection method, device and medium | |
CN117495856B (en) | Wafer surface detection method, device, equipment and medium based on deep learning | |
JPWO2020071234A1 (en) | Image processing equipment, image processing methods, visual inspection systems and computer programs | |
JP2001188906A (en) | Method and device for automatic image calssification | |
CN116434066A (en) | Deep learning-based soybean pod seed test method, system and device | |
JP7380332B2 (en) | Image processing device, control method and program for the image processing device | |
CN114387232A (en) | Wafer center positioning, wafer gap positioning and wafer positioning calibration method | |
JP2004239870A (en) | Spatial filter, method and program for generating the same, and method and apparatus for inspecting screen defect | |
JPS6385432A (en) | Inspecting method for linear defect | |
CN110288662A (en) | Display detection method and system | |
CN112567229B (en) | Defect inspection device, defect inspection method, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||