CN116804637A - Inspection system, teacher data generation device, teacher data generation method, and storage medium


Info

Publication number: CN116804637A
Application number: CN202310101400.0A
Authority: CN (China)
Prior art keywords: defect, inspection, image, unit, teacher data
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 海津正博, 盐见顺一, 杉山胜彦, 泷本达也
Current Assignee: Screen Holdings Co Ltd
Original Assignee: Screen Holdings Co Ltd
Application filed by Screen Holdings Co Ltd

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G01N 21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N 2021/8854 Grading and classifying of flaws
    • G01N 2021/888 Marking defects
    • G01N 2021/8883 Scan or image signal processing involving the calculation of gauges, generating models
    • G01N 2021/8887 Scan or image signal processing based on image processing techniques
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biophysics (AREA)
  • Chemical & Material Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Quality & Reliability (AREA)
  • Signal Processing (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biochemistry (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The inspection system (1) comprises: an inspection unit (20) that detects defects by inspecting, without using machine learning, an image obtained by photographing an object; a classification unit (52) that has a previously generated learned model and classifies the defect type of a defect by inputting an image representing the defect into the learned model; and a classification necessity determining unit (53) that determines, based on the defect-related information acquired or used by the inspection unit (20) when the defect was detected, whether or not the defect detected by the inspection unit (20) needs to be classified by the classification unit (52). This makes it possible both to avoid serious misclassification by the learned model and to shorten the time required for the classification process.

Description

Inspection system, teacher data generation device, teacher data generation method, and storage medium
Technical Field
The present invention relates to a technique for inspecting an object.
Background
Conventionally, inspection systems that capture images of objects such as printed circuit boards to detect defects have been used. The inspection system of Japanese Patent Application Laid-Open No. 2021-177154 includes: a primary inspection unit that performs defect determination, without using machine learning, based on an image obtained by capturing an object; and a secondary inspection unit that separates actual defective products from over-detected products by applying a machine learning model to the images of objects determined to be defective by the primary inspection unit. This suppresses a decrease in productivity caused by the occurrence of over-detected products.
As described above, when a learned model (machine learning model) is used to classify defects as true defects or false defects, learning must be performed in advance using a plurality of pieces of teacher data to generate the learned model. In that case, an operator generates the teacher data by determining, for defect images prepared in advance, whether each defect is a true defect or a false defect (i.e., by labeling), and marking that result on the defect image.
However, since a defect on a printed circuit board can greatly affect the operation, performance, and the like of the printed circuit board depending on its position, state, and the like, classification errors by the learned model (here, misclassifying a true defect as a false defect) are sometimes unacceptable. To avoid serious misclassification by the learned model, learning using a large number of pieces of teacher data is conceivable, but in that case the operator's marking work takes a long time and the time required for learning also becomes long. Moreover, misclassification by the learned model cannot be avoided completely. Further, if all detected defects are input to the learned model, the time required for the classification process becomes long.
Disclosure of Invention
The invention is directed to an inspection system, and aims to avoid serious misclassification in a learned model, shorten the time required for classification processing, and shorten the time required for marking operation and learning in the generation of teacher data.
The inspection system of the present invention includes: an inspection unit that detects defects by inspecting, without using machine learning, an image obtained by photographing an object; a classification unit that has a learned model generated in advance and classifies the defect type of a defect by inputting an image representing the defect into the learned model; and a classification necessity determining unit that determines, based on defect-related information acquired or used by the inspection unit when a defect was detected, whether or not the defect detected by the inspection unit needs to be classified by the classification unit.
According to the present invention, it is possible both to avoid serious misclassification by the learned model and to shorten the time required for the classification process.
Preferably, the classification unit classifies the defect detected by the inspection unit as a true defect or a false defect, the inspection unit acquires the defect type of the defect when detecting the defect, the defect type is included in the defect-related information, and the classification necessity determining unit determines that a defect of a specific defect type does not need to be classified by the classification unit.
Preferably, the inspection unit includes: a first inspection processing unit that detects a defect by a first inspection process and acquires a defect image including the defect; and a second inspection processing unit that detects a defect by a second inspection process different from the first inspection process and acquires a defect image including the defect. The first inspection processing unit may acquire more detailed positional information of a defect within a defect image than the second inspection processing unit, the positional information acquired by the first inspection processing unit may be included in the defect-related information, the classification necessity determining unit may determine that a defect detected by the first inspection processing unit needs to be classified by the classification unit and that a defect detected by the second inspection processing unit does not, and the classification unit may input into the learned model an image obtained by cutting out the region of the defect from the defect image based on the positional information of the defect.
Preferably, one of a plurality of inspection sensitivities is set for each position of the object, information indicating the inspection sensitivity used by the inspection unit when detecting a defect is included in the defect-related information, and the classification necessity determining unit determines that a defect detected at a specific inspection sensitivity does not need to be classified by the classification unit.
The present invention also provides a teacher data generation device that generates teacher data. The teacher data generation device of the present invention includes: an image receiving unit that receives, from an inspection unit that inspects an image obtained by capturing an object without using machine learning, a defect image including a defect and defect-related information acquired or used when the defect was detected; an image necessity determining unit that determines, based on the defect-related information, whether to use the defect image for teacher data; a display control unit that displays a defect image to be used for teacher data on a display; a determination result receiving unit that receives an input, by an operator, of a determination result of the defect type for the defect image displayed on the display; and a teacher data generation unit that generates teacher data by labeling the defect image with the determination result. According to the present invention, the time required for the marking work and learning in the generation of teacher data can be shortened.
Preferably, the determination result receiving unit receives an input, by an operator, of a determination result of true defect or false defect, the defect type of the defect acquired when the inspection unit detected the defect is included in the defect-related information, and the image necessity determining unit determines not to use a defect image including a defect of a specific defect type for teacher data.
Preferably, the inspection unit includes: a first inspection processing unit that detects a defect by a first inspection process and acquires a defect image including the defect; and a second inspection processing unit that detects a defect by a second inspection process different from the first inspection process and acquires a defect image including the defect. The first inspection processing unit may acquire more detailed positional information of a defect within a defect image than the second inspection processing unit, the positional information acquired by the first inspection processing unit may be included in the defect-related information, the image necessity determining unit may determine to use for teacher data a defect image in which a defect was detected by the first inspection processing unit and not to use a defect image in which a defect was detected by the second inspection processing unit, and the teacher data generation unit may generate teacher data including an image obtained by cutting out the region of the defect from the defect image based on the positional information of the defect.
Preferably, one of a plurality of inspection sensitivities is set for each position of the object, information indicating the inspection sensitivity used by the inspection unit when detecting a defect is included in the defect-related information, and the image necessity determining unit determines not to use a defect image including a defect detected at a specific inspection sensitivity for teacher data.
The invention also provides a teacher data generation method for generating teacher data. The teacher data generation method of the present invention includes: a) A step of receiving a defect image including a defect and defect related information acquired or used when detecting the defect from an inspection unit for inspecting an image obtained by photographing an object without using machine learning; b) A step of determining whether to use the defect image for teacher data based on the defect association information; c) A step of displaying a defect image for teacher data on a display; d) A step of receiving an input of a determination result of a defect type by an operator with respect to the defect image displayed on the display; and e) labeling the determination result on the defect image to generate teacher data.
The present invention is also directed to a storage medium storing a program for causing a computer to generate teacher data. The processing executed by the computer according to the program of the present invention includes: a) A step of receiving, from an inspection unit that inspects an image obtained by photographing an object without using machine learning, a defect image including a defect and defect-related information acquired or used when the defect was detected; b) A step of determining, based on the defect-related information, whether to use the defect image for teacher data; c) A step of displaying a defect image to be used for teacher data on a display; d) A step of receiving an input, by an operator, of a determination result of the defect type for the defect image displayed on the display; and e) A step of generating teacher data by labeling the defect image with the determination result.
The above objects and other objects, features, aspects and advantages will be clarified by the following detailed description of the present invention with reference to the accompanying drawings.
Drawings
Fig. 1 is a diagram showing a structure of an inspection system.
Fig. 2 is a diagram showing the structure of a computer.
Fig. 3 is a diagram showing the structure of the teacher data generation device.
Fig. 4 is a diagram for explaining the first inspection process.
Fig. 5 is a diagram showing defect types that can be detected by the first inspection process.
Fig. 6 is a diagram for explaining other examples of the first inspection process.
Fig. 7 is a diagram showing a binary divided image and a main image.
Fig. 8 is a diagram for explaining the second inspection process.
Fig. 9 is a diagram showing a flow of processing for generating teacher data.
Fig. 10 is a diagram showing a flow of a process of inspecting a printed circuit board.
Fig. 11 is a diagram showing a printed circuit board.
Fig. 12 is a diagram showing a part of a printed circuit board.
Description of the reference numerals
1: Inspection system
3: Computer
4: Teacher data generation device
9: Printed circuit board
20: Inspection unit
21: First inspection processing unit
22: Second inspection processing unit
35: Display
41: Image receiving unit
42: Image necessity determining unit
43: Display control unit
44: Determination result receiving unit
45: Teacher data generation unit
52: Classification unit
53: Classification necessity determining unit
69: Defect
71: Captured image
521: Classifier
811: Program
S11 to S15, S21 to S25: Steps
Detailed Description
(first embodiment)
Fig. 1 is a diagram showing a structure of an inspection system 1 according to a first embodiment of the present invention. The inspection system 1 inspects a printed circuit board as an object. The inspection system 1 includes an inspection device 2, a computer 3, and a defect confirmation device 11. In fig. 1, the functional structure implemented by the computer 3 is enclosed by a rectangle of broken line.
The inspection device 2 includes an imaging unit and a moving mechanism, which are not shown. The imaging unit photographs the printed circuit board, and the moving mechanism moves the printed circuit board relative to the imaging unit. The inspection device 2 further includes an inspection unit 20. The inspection unit 20 is implemented by, for example, a computer and/or a circuit. The inspection unit 20 includes a first inspection processing unit 21 and a second inspection processing unit 22. The first inspection processing unit 21 and the second inspection processing unit 22 perform mutually different inspection processes on the captured image output from the imaging unit and detect defects from the captured image. When a defect is detected by the first inspection processing unit 21 or the second inspection processing unit 22, a defect image of a region including the defect is output to the computer 3. The printed circuit board inspected by the inspection device 2 is carried into the defect confirmation device 11. The defect confirmation device 11 photographs the defective region of the printed circuit board based on information input from the computer 3, displays it on a display, and allows an operator to confirm the defect.
Fig. 2 is a diagram showing the structure of the computer 3. The computer 3 has a structure of a general computer system including a CPU31, a ROM32, a RAM33, a fixed disk 34, a display 35, an input unit 36, a reading device 37, a communication unit 38, a GPU39, and a bus 30. The CPU31 performs various arithmetic processing. The GPU39 performs various arithmetic processing related to image processing. The ROM32 stores a basic program. The RAM33 and the fixed disk 34 store various information. The display 35 displays various information such as an image. The input unit 36 has a keyboard 36a and a mouse 36b for receiving input from an operator. The reading device 37 reads information from a computer-readable storage medium 81 such as an optical disk, a magnetic disk, a magneto-optical disk, or a memory card. The communication unit 38 transmits and receives signals to and from other structures of the inspection system 1 and an external device. The bus 30 is a signal circuit that connects the CPU31, the GPU39, the ROM32, the RAM33, the fixed disk 34, the display 35, the input unit 36, the reading device 37, and the communication unit 38.
In the computer 3, the program 811 is read out from the storage medium 81 as a program product via the reading device 37 in advance and stored in the fixed disk 34. The program 811 may be stored in the fixed disk 34 via a network. The CPU31 and the GPU39 execute arithmetic processing while using the RAM33 and the fixed disk 34 according to the program 811. The CPU31 and the GPU39 function as an arithmetic unit in the computer 3. Other configurations may be employed in addition to the CPU31 and the GPU39, which function as an arithmetic unit.
In the inspection system 1, the computer 3 executes arithmetic processing or the like in accordance with the program 811, thereby realizing a functional configuration surrounded by a broken line in fig. 1. That is, the CPU31, the GPU39, the ROM32, the RAM33, the fixed disk 34, and their peripheral structures of the computer 3 realize the teacher data generation device 4, the learning unit 51, the classification unit 52, and the classification necessity determination unit 53. All or part of these functions may be implemented by a dedicated circuit, and these functions may also be implemented by a separate program. In addition, these functions can be realized by a plurality of computers.
The classification unit 52 includes a classifier 521, and the classifier 521 classifies a defect as a true defect or a false defect (also referred to as a false alarm) when an image representing the defect is input to it. The classification necessity determining unit 53 determines whether the defect represented by a defect image input from the inspection unit 20 needs to be classified by the classification unit 52. The learning unit 51 performs learning using a plurality of pieces of teacher data, described later, and generates the classifier 521 as a learned model. The teacher data generation device 4 generates the teacher data for the learning unit 51.
Fig. 3 is a diagram showing the structure of the teacher data generation device 4. The teacher data generation device 4 includes: an image receiving unit 41, an image necessity determining unit 42, a display control unit 43, a determination result receiving unit 44, and a teacher data generating unit 45. The image receiving unit 41 is connected to the inspection unit 20, and receives input of a defect image or the like from the inspection unit 20. The image necessity determining section 42 determines whether or not each defective image is to be used for teacher data. The display control unit 43 is connected to the display 35, and displays a defect image or the like on the display 35. The determination result receiving unit 44 is connected to the input unit 36, and receives an input from an operator via the input unit 36. The teacher data generation unit 45 marks the defective image and generates teacher data.
Here, the process by which the inspection unit 20 detects defects will be described. As described above, the inspection device 2 acquires a captured image, and the first inspection processing unit 21 and the second inspection processing unit 22 of the inspection unit 20 perform mutually different inspection processes on the captured image. In the inspection by the inspection unit 20, defects are detected without using machine learning, for example on a rule basis. The threshold values and the like used for the inspection may, however, be determined by machine learning. Here, the captured image is assumed to be a grayscale image, but it may be a color image.
Fig. 4 is a diagram for explaining the first inspection process performed by the first inspection processing section 21, and shows a part of the captured image. In fig. 4, a wiring region 61 formed of a metal such as copper and a region 62 (hereinafter referred to as "background region 62") as a substrate surface of a printed circuit board or the like are shown. For example, the wiring region 61 and the background region 62 can be distinguished by a prescribed threshold value.
In one example of the first inspection process, the line width is measured. For example, a plurality of inspection positions are set in advance on the wiring region 61 based on design data or the like, and the pixels located at those inspection positions are specified as target pixels in the captured image. An inspection position is, for example, a position approximately at the center of the width of the wiring region 61 (i.e., the width in the direction perpendicular to the longitudinal direction of the wiring region 61). Next, 16 lines passing through the target pixel are set at equal angular intervals, and the length over which each line overlaps the wiring region 61 (i.e., the length between the positions where the line crosses the two edges of the wiring region 61) is obtained. The minimum of these lengths over the 16 lines is obtained as the measured distance. The measured distance is compared with a predetermined lower limit distance and upper limit distance. When the measured distance is smaller than the lower limit distance or larger than the upper limit distance, the presence of a defect at the inspection position is detected. In the example of fig. 4, the measured distance indicated by the arrow labeled with reference numeral A1 is smaller than the lower limit distance, so a defect 69 of the wiring region 61 is detected. A code sketch of this check is shown below.
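The line-width check can be expressed compactly in code. The following Python sketch is only illustrative: the array name `binary` (True marks the wiring region), the sub-pixel ray walking, and the limit values are assumptions not taken from the patent; the 16 lines at equal angular intervals follow the example in the text.

```python
import numpy as np

def ray_length(binary, y, x, dy, dx, max_len=200.0):
    """Distance walked from (y, x) along (dy, dx) while staying on the wiring region."""
    h, w = binary.shape
    t = 0.0
    while t < max_len:
        cy, cx = int(round(y + t * dy)), int(round(x + t * dx))
        if not (0 <= cy < h and 0 <= cx < w) or not binary[cy, cx]:
            break
        t += 1.0
    return t

def measured_distance(binary, y, x, n_lines=16):
    """Minimum overlap length of the wiring region over n_lines lines through (y, x)."""
    best = float("inf")
    for k in range(n_lines):
        theta = np.pi * k / n_lines                  # each line covers theta and theta + pi
        dy, dx = np.sin(theta), np.cos(theta)
        length = (ray_length(binary, y, x, dy, dx)
                  + ray_length(binary, y, x, -dy, -dx) - 1.0)  # target pixel counted once
        best = min(best, length)
    return best

def check_inspection_position(binary, y, x, lower_limit, upper_limit):
    """Return a defect candidate label for one inspection position, or None if no defect."""
    d = measured_distance(binary, y, x)
    if d < lower_limit:
        return "below lower limit (line-thinning / notch candidate)"
    if d > upper_limit:
        return "above upper limit (line-thickening / protrusion candidate)"
    return None
```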
By obtaining the measured distance at each inspection position in this way, the various types of defects shown in fig. 5 can be detected. In the left example of fig. 5, the measured distance is smaller than the lower limit distance at a plurality of adjacent inspection positions, and a line-thinning defect 69 is detected. In the center example of fig. 5, the measured distance is larger than the upper limit distance at a plurality of adjacent inspection positions, and a line-thickening defect 69 (thickening of a pad portion) is detected. In the right example of fig. 5, a notch defect, for which the measured distance is smaller than the lower limit distance at one inspection position, and a protrusion defect, for which the measured distance is larger than the upper limit distance at one inspection position, are each detected as a defect 69.
As described above, the first inspection processing unit 21 can acquire the positional information of a defect in detail (in more detail than the second inspection processing unit 22, which performs the second inspection process described later) for each inspection position. When the first inspection processing unit 21 detects a defect 69, a defect image of a predetermined size including the defect 69 is cut out from the captured image. Further, positional information indicating the position of the defect 69 in the defect image (hereinafter also referred to as "defect position information") and the defect type of the defect 69 are acquired, and defect-related information including both is generated. The defect image and the defect-related information are associated with each other and output to the computer 3. The defect-related information also includes the position on the printed circuit board represented by the defect image (the same applies hereinafter).
In this processing example, the first inspection processing unit 21 also performs another first inspection process for detecting open (break) defects and short-circuit defects of the wiring region 61. In this other first inspection process, the connectivity of the wiring region 61 is confirmed in the captured image 71 shown on the right side of fig. 6. For example, the connectivity of the wiring region 61 can be confirmed by specifying, for each inspection position (or for positions different from the inspection positions of the above-described process), the other inspection positions that are continuous with it via the wiring region 61. As shown on the left side of fig. 6, the first inspection processing unit 21 prepares a main image 70 (for example, a binary image generated from design data) representing the same area as the captured image 71, and confirms the connectivity of the wiring region 61 (the connection relationship between inspection positions) in the main image 70 in the same manner as described above. If the connectivity of the wiring region 61 in the captured image 71 differs from the connectivity of the wiring region 61 in the main image 70, the differing portion is detected as a defect.
In the example of fig. 6, connectivity that does not exist in the main image 70 (a connection between inspection positions) arises in the captured image 71 (see arrow A2), so the portion where that connectivity arises is detected as a short-circuit defect, namely a defect 69. The first inspection processing unit 21 cuts out a defect image of a predetermined size including the defect 69 from the captured image 71. Further, defect-related information including the defect position information in the defect image and the defect type of the defect 69 is generated. The defect image and the defect-related information are then associated with each other and output to the computer 3. Conversely, when connectivity that exists in the main image is absent in the captured image, the portion where the connectivity is missing is detected as an open defect. A defect image of a predetermined size including that defect and defect-related information including the defect position information and defect type are then output to the computer 3.
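As a rough illustration of this connectivity comparison, the Python sketch below labels connected components of the wiring region in the captured image and in the main image and compares which inspection positions end up connected. The array and variable names are assumptions, and scipy's connected-component labelling stands in for whatever tracing method the first inspection processing unit actually uses.

```python
import numpy as np
from scipy.ndimage import label

def connected_pairs(binary_wiring, inspection_points):
    """Set of index pairs (i, j) of inspection points joined through the wiring region."""
    labels, _ = label(binary_wiring)               # 0 = background, >0 = component id
    ids = [labels[y, x] for y, x in inspection_points]
    return {(i, j)
            for i in range(len(ids)) for j in range(i + 1, len(ids))
            if ids[i] != 0 and ids[i] == ids[j]}

def open_short_check(captured_wiring, main_wiring, inspection_points):
    """Compare connectivity in the captured image with connectivity in the main image."""
    cap = connected_pairs(captured_wiring, inspection_points)
    ref = connected_pairs(main_wiring, inspection_points)
    return {
        "short_candidates": cap - ref,   # connected in the captured image only
        "open_candidates": ref - cap,    # connected in the main image but broken when captured
    }
```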
Next, an example of the second inspection process performed by the second inspection processing unit 22 will be described. In the second inspection process, the multi-gradation captured image is divided into a plurality of images of a predetermined size (hereinafter referred to as "divided images"). Then, an image obtained by binarizing each divided image with a predetermined threshold (hereinafter referred to as a "binary divided image") is compared with the corresponding region of the binarized main image. In fig. 7, a binary divided image 72 is shown on the right side, and the region of the main image 70 corresponding to that divided image is shown on the left side. A matrix is used to compare the binary divided image 72 with the main image 70.
Fig. 8 is an image showing the exclusive OR of each pixel of the binary divided image with the corresponding pixel of the binary main image; the pixels drawn with parallel diagonal lines in fig. 8 indicate pixels whose values differ between the two images (hereinafter referred to as "different pixels"). In fig. 8, for convenience of explanation, a binary divided image and a main image different from those of fig. 7 are used. Conceptually, in the comparison of the binary divided image with the main image, the matrix M1 is scanned over the image of fig. 8 in the row and column directions, and the presence of a defect is detected when the proportion of different pixels among the pixels included in the matrix M1 exceeds an allowable value. In the example of fig. 8, the matrix M1 has a size of 4×4 pixels, and the allowable value is 75%. Accordingly, the proportion of different pixels does not exceed the allowable value at the position of the matrix M1 indicated by the thin broken line in fig. 8, but does exceed it at the position indicated by the thick broken line. The presence of a defect is thereby detected in this divided image. The size of the matrix M1 may be changed as appropriate to 2×2 pixels, 3×3 pixels, or the like, and the allowable value may also be changed.
In practice, for the binary divided image 72 of fig. 7, the proportion of pixels whose values differ from the corresponding pixels of the main image 70 among the pixels included in the matrix M1 placed at each position is obtained. If that proportion exceeds the allowable value, the presence of a defect in the divided image is detected. In the example of fig. 7, the pixels whose values differ from the corresponding pixels of the main image 70 (hereinafter also referred to as "different pixels", as in fig. 8) are surrounded by thick solid lines in the binary divided image 72. In this processing example using the matrix M1, isolated different pixels in the binary divided image 72 do not affect defect detection (they are effectively ignored), whereas a cluster of a certain number of different pixels is readily detected as a defect (see the group of different pixels near the center of the binary divided image 72).
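The matrix-based comparison lends itself to a short sketch. The Python code below XORs the binary divided image with the corresponding region of the main image and slides a small window over the result; the 4×4 matrix and 75% allowable value mirror the example in the text, while the function and variable names are illustrative assumptions.

```python
import numpy as np

def detect_defect_in_divided_image(binary_divided, binary_main,
                                   matrix_size=4, allowable=0.75):
    """Second inspection process (sketch): True if any matrix position exceeds the allowable value."""
    assert binary_divided.shape == binary_main.shape
    diff = np.logical_xor(binary_divided, binary_main)       # True where the two images disagree
    h, w = diff.shape
    for top in range(h - matrix_size + 1):
        for left in range(w - matrix_size + 1):
            window = diff[top:top + matrix_size, left:left + matrix_size]
            if window.mean() > allowable:                     # proportion of different pixels in M1
                return True                                   # defect detected in this divided image
    return False
```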
When the presence of a defect is detected in the binary divided image 72, the corresponding multi-gradation divided image is output as a defect image to the computer 3. In the second inspection processing unit 22, the detailed position of the defect in the divided image (defect image) is not acquired, and the defect type is not acquired. Therefore, unlike the first inspection processing section 21, defect-related information that does not include defect position information and defect type is output to the computer 3. As described above, the defect-related information contains the position on the printed circuit board shown by the defect image. The defect-related information may include information indicating that it is detected by the second inspection process.
Fig. 9 is a diagram showing a flow of processing of generating teacher data by the teacher data generation device 4. First, the image receiving unit 41 in fig. 3 receives the defect image and the defect-related information from the inspection unit 20 (step S11). In this processing example, a plurality of defect images are acquired in advance from a plurality of captured images of a plurality of printed circuit boards by the inspection unit 20, and the plurality of defect images and defect-related information corresponding to the plurality of defect images are received by the image receiving unit 41. The defect-related information of the plurality of defect images may be contained in one list in a state associated with the plurality of defect images, respectively. As described above, the defect-related information of the defect detected by the first inspection processing section 21 includes the defect position information and the defect type. On the other hand, the defect-related information of the defect detected by the second inspection processing section 22 does not include defect position information and defect type.
Next, the image necessity determining unit 42 determines, based on the defect-related information, whether or not each defect image is to be used for teacher data (step S12). In this processing example, defect images of defects of a specific defect type are not used for teacher data. One example of specific defect types is open defects and short defects. Defect images of defects detected by the second inspection processing unit 22, i.e., defect images whose defect-related information includes neither defect position information nor defect type, are also not used for teacher data. The remaining defect images are determined to be defect images to be used for teacher data. The reason why defect images of specific defect types such as open defects and short defects and defect images of defects detected by the second inspection processing unit 22 are not used for teacher data will be described later.
The display control unit 43 displays a defect image to be used for teacher data on the display 35 (step S13). The image displayed on the display 35 may be all or part of the defect image; that is, the display control unit 43 displays at least part of the defect image on the display 35. In one example, thumbnails of a plurality of defect images are displayed arranged in a window on the display 35, and when the operator selects one thumbnail via the input unit 36, at least part of the corresponding defect image (hereinafter referred to as the "selected defect image") is displayed on the display 35. The defect image to be displayed on the display 35 may be selected by various known methods.
The determination result receiving unit 44 receives, from the operator, an input of a determination result of true defect or false defect for the selected defect image displayed on the display 35 (step S14). In one example, a button representing "true defect" and a button representing "false defect" are provided in a window on the display 35 together with the selected defect image. The operator checks the selected defect image and, by selecting one of the buttons via the input unit 36, inputs a determination result indicating whether the defect shown in the selected defect image is a true defect or a false defect. The input of the determination result is received by the determination result receiving unit 44. The determination result may be input by various known methods.
The teacher data generation unit 45 generates teacher data by labeling the defect image with the determination result (step S15). The teacher data is data that includes the defect image and the operator's determination result for that defect image. As described above, the defect-related information of a defect image used for teacher data includes defect position information, and an image obtained by cutting out the defect region from the defect image based on the defect position information (such a cut-out image is hereinafter also treated as a defect image) is preferably included in the teacher data. This prevents features of unnecessary regions other than the defect region from being used in the learning described later, and improves the classification accuracy of the classifier 521. In practice, the operator inputs determination results for a plurality of defect images, and a plurality of pieces of teacher data are generated. The teacher data generation process is thus completed, and a plurality of pieces of teacher data (a learning data set) are obtained.
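A minimal sketch of this cropping-and-labeling step is shown below, assuming the defect position information is available as a bounding box and that a small margin is added around it; the field names, margin, and label values are illustrative, not from the patent.

```python
import numpy as np

def make_teacher_sample(defect_image, defect_info, operator_label, margin=8):
    """Return one teacher-data record: the cropped defect region plus the operator's label."""
    y0, x0, y1, x1 = defect_info["bbox"]              # defect position information in the image
    h, w = defect_image.shape[:2]
    crop = defect_image[max(y0 - margin, 0):min(y1 + margin, h),
                        max(x0 - margin, 0):min(x1 + margin, w)]
    return {"image": crop, "label": operator_label}   # operator_label: "true" or "false"
```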
When the plurality of pieces of teacher data have been generated, the learning unit 51 of fig. 1 performs machine learning so that, for the defect images in the plurality of pieces of teacher data, the output of the classifier substantially matches the determination results (true defect or false defect) indicated by the teacher data, and thereby generates the classifier. The classifier is a learned model that classifies a defect shown in an image as a true defect or a false defect, and generating the classifier determines the values of the parameters included in the classifier and the structure of the classifier. The machine learning is performed, for example, by deep learning using a neural network, but known methods other than deep learning may also be used. The classifier (in practice, the parameter values and information indicating the structure of the classifier) is transferred to and installed in the classification unit 52.
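As a rough, non-authoritative illustration of such learning, the PyTorch sketch below trains a small convolutional network to separate true defects from false defects on 64×64 grayscale crops. The network, image size, optimizer, label convention, and epoch count are all assumptions chosen for brevity; the patent only states that deep learning with a neural network is one possible method.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def build_classifier():
    """Tiny CNN for 64x64 grayscale defect crops, two classes: false defect / true defect."""
    return nn.Sequential(
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(32 * 16 * 16, 2),
    )

def train(images, labels, epochs=10):
    """images: float tensor (N, 1, 64, 64); labels: long tensor (N,), 0 = false, 1 = true."""
    model = build_classifier()
    loader = DataLoader(TensorDataset(images, labels), batch_size=32, shuffle=True)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(model(x), y)    # push outputs toward the operator's labels
            loss.backward()
            opt.step()
    return model
```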
Fig. 10 is a diagram showing a flow of processing of inspecting a printed circuit board by the inspection system 1. When the processing of fig. 10 is performed, the classifier 521 as a learned model is generated in advance by the above-described processing. In the inspection of the printed circuit board, a plurality of photographed images indicating a plurality of positions of the printed circuit board are acquired in the inspection device 2, and the inspection unit 20 inspects whether or not there is a defect in the plurality of photographed images. When a defect is detected (step S21), a defect image including the defect and defect-related information are output to the classification necessity determining section 53. As described above, the defect-related information of the defect detected by the first inspection processing unit 21 includes defect position information (i.e., position information of the defect in the defect image) and defect type. On the other hand, the defect-related information of the defect detected by the second inspection processing section 22 does not include defect position information and defect type.
Next, the classification necessity determining unit 53 determines, based on the defect-related information, whether or not the defect shown in the defect image needs to be classified by the classification unit 52. In this processing example, it is determined that defects of a specific defect type do not need to be classified by the classification unit 52 (step S22). One example of specific defect types is open defects and short defects. Defects detected by the second inspection processing unit 22, i.e., defects whose defect-related information includes neither defect position information nor defect type, are also determined not to need classification by the classification unit 52. The reason why defects of specific defect types such as open defects and short defects and defects detected by the second inspection processing unit 22 are determined not to need classification will be described later.
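The decision of step S22 is essentially a rule lookup on the defect-related information. A possible sketch in Python follows; the dictionary key and the set of "specific" defect types mirror the example in the text and are not a prescribed format.

```python
# Defect types whose misclassification is not acceptable; they bypass the classifier.
SPECIFIC_DEFECT_TYPES = {"open", "short"}

def needs_classification(defect_related_info):
    """True if the defect should go to the learned classifier, False for operator confirmation."""
    defect_type = defect_related_info.get("defect_type")   # None for second-inspection-process defects
    if defect_type is None:                                 # no defect position info / defect type
        return False
    if defect_type in SPECIFIC_DEFECT_TYPES:                # open / short: always confirmed by operator
        return False
    return True                                             # classify with the learned model
```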
The defect image and the defect-related information of a defect determined not to need classification are output to the defect confirmation device 11. As described above, the defect-related information includes the position on the printed circuit board represented by the defect image (i.e., the position of the defect image on the printed circuit board). The defect confirmation device 11 photographs the region of the defect image on the printed circuit board with reference to the defect-related information and displays it on the display. The operator confirms the defect included in the displayed image and determines whether it is a true defect or a false defect (step S23). The defect image may also be displayed on the display together with the image captured by the defect confirmation device 11 (the same applies hereinafter).
On the other hand, the classification necessity determining unit 53 determines that a defect that was detected by the first inspection processing unit 21 and is not of a specific defect type needs to be classified by the classification unit 52 (step S22). The defect image and the defect-related information of a defect determined to need classification are output to the classification unit 52. The classification unit 52 classifies the defect shown in the defect image as a true defect or a false defect by inputting the defect image to the classifier 521 (step S24). Preferably, the classification unit 52 acquires an image obtained by cutting out the defect region from the defect image based on the defect position information, and inputs that image to the classifier 521. This prevents features of unnecessary regions other than the defect region from being used in the classification processing by the classifier 521, making it possible to classify the defect as a true defect or a false defect with higher accuracy.
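Continuing the earlier sketches, classifying one defect could look as follows. The preprocessing to a (1, 1, 64, 64) tensor and the 0 = false / 1 = true label convention are assumptions carried over from the training sketch above, not details taken from the patent.

```python
import torch
import torch.nn.functional as F

def classify_defect(model, crop_tensor):
    """crop_tensor: float tensor (1, 1, 64, 64) holding the cut-out defect region.
    Returns ("true" or "false", confidence) under the 0=false / 1=true convention used above."""
    model.eval()
    with torch.no_grad():
        probs = F.softmax(model(crop_tensor), dim=1)[0]
    is_true = bool(probs[1] > probs[0])
    return ("true" if is_true else "false"), float(probs.max())
```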
When the classification unit 52 classifies the defect as a true defect (step S25), the defect image and the defect-related information of that defect are output to the defect confirmation device 11. The defect confirmation device 11 photographs the region of the defect image on the printed circuit board and displays it on the display. The operator confirms the defect included in the displayed image and determines whether it is a true defect or a false defect (step S23). When the classification unit 52 classifies the defect as a false defect (step S25), the defect image and the defect-related information of that defect are not output to the defect confirmation device 11, and the processing of that defect ends. In this way, confirmation by the operator can be omitted for defects classified as false defects by the classification unit 52, which reduces the operator man-hours required for confirming defects.
Here, the reason why defects of specific defect types such as open defects and short defects and defects detected by the second inspection processing unit 22 are not classified by the classification unit 52 will be described. Since defects of specific defect types such as open defects and short defects greatly affect the operation and performance of the printed circuit board, classification errors by the classifier 521 (here, misclassifying a true defect as a false defect) are often not allowed. Therefore, in order to avoid serious misclassification by the classifier 521, it is preferable that defects of the specific defect types are not classified by the classifier 521 but are instead confirmed by the operator using the defect confirmation device 11, who determines whether each defect is a true defect or a false defect. Since defects of specific defect types such as open defects and short defects are thus not classified by the classifier 521, it is preferable not to use defect images of the specific defect types as teacher data.
Further, since the defect-related information of the defect detected by the second inspection processing unit 22 does not include defect position information, the classification unit 52 cannot obtain an image obtained by cutting out a defective region from the defective image. In this case, if the defective image is directly input to the classifier 521, the characteristics of the unnecessary region other than the defective region may be used for the classification process. As a result, the classification accuracy in the classifier 521 is lowered for the defects detected by the second inspection processing unit 22, and erroneous classification is likely to occur. Therefore, it is preferable that the defects detected by the second inspection processing unit 22 are not classified by the classifier 521, but are confirmed by the operator at the defect confirmation device 11, and whether the defects are true defects or false defects is finally determined. In this way, since the defects detected by the second inspection processing unit 22 are not classified by the classifier 521, it is preferable that the defect image of the defects detected by the second inspection processing unit 22 is not used as teacher data.
As described above, the inspection system 1 of fig. 1 includes the inspection unit 20, which detects defects by inspecting, without using machine learning, an image obtained by photographing a printed circuit board, and the classification unit 52, which classifies the defect type of a defect (true defect or false defect in the above-described process) by inputting an image representing the defect into the classifier 521. The classification necessity determining unit 53 determines whether or not a defect detected by the inspection unit 20 needs to be classified by the classification unit 52, based on the defect-related information acquired by the inspection unit 20 when the defect was detected. In the inspection system 1, defects unsuitable for classification by the classifier 521 can therefore easily be excluded from the classification targets, which both avoids serious misclassification by the classifier 521 and shortens the time required for the classification process.
Preferably, the classification unit 52 classifies the defect detected by the inspection unit 20 as a true defect or a false defect. The inspection unit 20 acquires the defect type of the defect (here, a defect type other than the true defect/false defect distinction) when detecting the defect, and includes the defect type in the defect-related information. The classification necessity determining unit 53 determines that defects of a specific defect type do not need to be classified by the classification unit 52. This makes it easy to prevent misclassification of defects of defect types for which misclassification is not allowed (i.e., important defects). When the specific defect types include open defects and short defects, misclassification of open defects and short defects can easily be prevented.
Preferably, the inspection unit 20 includes: the first inspection processing unit 21, which detects a defect by the first inspection process and acquires a defect image including the defect; and the second inspection processing unit 22, which detects a defect by the second inspection process different from the first inspection process and acquires a defect image including the defect. The first inspection processing unit 21 can acquire more detailed positional information of the defect within the defect image than the second inspection processing unit 22, and the defect-related information includes the positional information acquired by the first inspection processing unit 21. The classification necessity determining unit 53 determines that defects detected by the first inspection processing unit 21 need to be classified by the classification unit 52 and that defects detected by the second inspection processing unit 22 do not. The classification unit 52 inputs into the classifier 521 an image obtained by cutting out the defect region from the defect image based on the positional information of the defect. With such a configuration, defects for which the detailed position (defect position information) is not acquired and for which the classification accuracy of the classifier 521 would therefore be low can easily be excluded from the classification targets, so that reduction of misclassification by the classifier 521 and shortening of the time required for the classification process can be achieved more reliably.
In the teacher data generation device 4 of fig. 3, a defect image including a defect and defect-related information acquired when the defect is detected are input from the inspection unit 20, and received by the image receiving unit 41. In the image necessity determining section 42, it is determined whether or not to use the defective image for teacher data based on the defect association information. The display control unit 43 displays a defect image for teacher data on the display 35, and the determination result receiving unit 44 receives an input of a determination result of the type of defect (true defect or false defect in the above-described processing) of the displayed defect image from the operator. Then, the teacher data generating unit 45 marks the determination result on the defective image, and generates teacher data. In the teacher data generation device 4, images unsuitable for teacher data can be easily excluded, and the time required for the marking work and learning can be shortened.
Preferably, the determination result receiving unit 44 receives an input of a determination result of true defect or false defect from the operator. The inspection unit 20 includes the defect type of the defect, acquired when the defect was detected, in the defect-related information. The image necessity determining unit 42 determines that defect images including defects of a specific defect type are not used for teacher data. This allows the classifier 521 to be generated on the premise that defects of the specific defect type will not be classified by it. When the specific defect types include open defects and short defects, a preferable classifier 521 can be generated on the premise that open defects and short defects are not classified.
The inspection unit 20 preferably includes the first inspection processing unit 21 and the second inspection processing unit 22. The first inspection processing unit 21 can acquire more detailed positional information of the defect within the defect image than the second inspection processing unit 22, and the defect-related information includes the positional information acquired by the first inspection processing unit 21. The image necessity determining unit 42 determines that defect images of defects detected by the first inspection processing unit 21 are used for teacher data and that defect images of defects detected by the second inspection processing unit 22 are not. Further, the teacher data generation unit 45 generates teacher data including an image obtained by cutting out the defect region from the defect image based on the positional information of the defect. With such a configuration, a preferable classifier 521 can be generated on the premise that defects whose detailed position within the defect image is not acquired (defects with low classification accuracy) are not classified.
(second embodiment)
Next, processing of the inspection system 1 according to a second embodiment of the present invention will be described. Fig. 11 is a diagram showing the whole printed circuit board 9. The printed circuit board 9 during manufacture includes a waste circuit board region 92, which is a portion removed from the final product. In fig. 11, the waste circuit board region 92 is hatched with parallel oblique lines. Fig. 12 is an enlarged view of the portion B1 surrounded by a broken line on the printed circuit board 9 of fig. 11. In fig. 12, the waste circuit board region 92 is surrounded by a thick dotted line.
As shown in fig. 12, the printed circuit board 9 contains a region 91 in which small plated areas are closely arranged or a fine wiring pattern is provided (the region surrounded by a thin broken line in fig. 12). Since a defect in the region 91 greatly affects the operation of the printed circuit board 9, the inspection unit 20 of this processing example sets, for the region 91, a first inspection sensitivity that is stricter than that of the other regions. Hereinafter, the region 91 is referred to as the "first sensitivity setting region 91". On the other hand, since a defect in the above-described waste circuit board region 92 has little influence on the operation of the printed circuit board 9, a second inspection sensitivity that is more lenient than that of the other regions is set for the waste circuit board region 92. Hereinafter, the waste circuit board region 92 is referred to as the "second sensitivity setting region 92". For the region 93 other than the first sensitivity setting region 91 and the second sensitivity setting region 92, an intermediate third inspection sensitivity is set. Hereinafter, the region 93 is referred to as the "third sensitivity setting region 93". In this way, one of a plurality of inspection sensitivities is set at each position of the printed circuit board 9.
In the inspection process of the inspection unit 20, defects are detected according to the inspection sensitivity. For example, in the second inspection process described with reference to Fig. 8, the allowable value compared with the proportion of differing pixels within the matrix M1 varies with the inspection sensitivity. Specifically, by referring to design data (CAM data or the like), it is determined which of the first sensitivity setting region 91, the second sensitivity setting region 92, and the third sensitivity setting region 93 contains the position indicated by each divided image obtained by dividing the captured image, and the allowable value to be used is obtained. For the first sensitivity setting region 91, an allowable value smaller than that of the other regions is obtained, and for the second sensitivity setting region 92, an allowable value larger than that of the other regions is obtained. The proportion of differing pixels in the matrix M1 is then compared with the allowable value, and if the proportion exceeds the allowable value, the presence of a defect is detected.
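The following minimal sketch illustrates the sensitivity-dependent comparison (the allowable values, variable names, and mask representation are assumptions for illustration only):

import numpy as np

# Hypothetical sketch of the sensitivity-dependent comparison: the proportion of
# differing pixels inside the matrix M1 is compared with a per-region allowable value.
ALLOWABLE_VALUE = {               # assumed values; smaller = stricter detection
    "first_sensitivity": 0.02,
    "third_sensitivity": 0.05,
    "second_sensitivity": 0.15,
}

def defect_detected(diff_mask: np.ndarray, region: str) -> bool:
    """diff_mask is a boolean matrix (M1) marking pixels that differ from the reference."""
    ratio = float(np.count_nonzero(diff_mask)) / diff_mask.size
    return ratio > ALLOWABLE_VALUE[region]

m1 = np.zeros((16, 16), dtype=bool)
m1[:4, :4] = True                                   # 16 of 256 pixels differ (6.25 %)
print(defect_detected(m1, "first_sensitivity"))     # True: exceeds the strict allowable value
print(defect_detected(m1, "second_sensitivity"))    # False: within the lenient allowable value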
When the presence of a defect is detected, a defect image (a multi-gradation divided image) containing the defect and the defect-related information are output to the computer 3. In this case, the defect-related information includes inspection sensitivity information indicating the inspection sensitivity used for detecting the defect. For example, the inspection sensitivity information indicates one of the first, second, and third inspection sensitivities, or one of the first sensitivity setting region 91, the second sensitivity setting region 92, and the third sensitivity setting region 93. The first inspection processing unit 21 likewise detects defects according to the inspection sensitivity and outputs, to the computer 3, a defect image including the defect and defect-related information including the inspection sensitivity information. Various methods may be used to detect defects, and the method of setting the inspection sensitivity may be changed as appropriate according to the defect detection method.
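As a sketch of what the defect-related information accompanying each defect image might look like (all field names and the data-class layout are assumptions, not taken from the patent text):

from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical sketch of the defect-related information record output together with a defect image.
@dataclass
class DefectRelatedInfo:
    inspection_sensitivity: str                            # "first", "second", or "third"
    defect_type: Optional[str] = None                      # e.g. "open", "short", if acquired
    position: Optional[Tuple[int, int, int, int]] = None   # bounding box, if acquired
    source_unit: str = "second_inspection"                 # which inspection processing unit detected it

info = DefectRelatedInfo(inspection_sensitivity="first", defect_type="open",
                         position=(100, 120, 140, 160), source_unit="first_inspection")
print(info.inspection_sensitivity)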
In the generation of teacher data by the teacher data generation device 4, the image receiving unit 41 receives the defect image and the defect-related information from the inspection unit 20 (Fig. 9: step S11). As described above, the defect-related information contains inspection sensitivity information. The image necessity determining unit 42 determines, based on the defect-related information, whether each defect image is to be used for teacher data (step S12). In this processing example, defect images containing defects detected at a specific inspection sensitivity are not used for teacher data. One example of the specific inspection sensitivity is the first inspection sensitivity set for the first sensitivity setting region 91. Defect images containing defects detected at the second and third inspection sensitivities are determined to be used for teacher data. The reason why defect images of defects detected at the specific inspection sensitivity are not used for teacher data will be described later.
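A minimal sketch of this selection step (the function name, sensitivity labels, and data layout are assumptions) could look like:

# Hypothetical sketch of step S12 in this processing example: defect images whose defects
# were detected at the specific (first) inspection sensitivity are excluded from the
# teacher data; second- and third-sensitivity images are kept.
SPECIFIC_SENSITIVITIES = {"first"}

def select_for_teacher_data(received):
    """received: iterable of (defect_image, defect_related_info) pairs."""
    return [(img, info) for img, info in received
            if info["inspection_sensitivity"] not in SPECIFIC_SENSITIVITIES]

pairs = [("img_a", {"inspection_sensitivity": "first"}),
         ("img_b", {"inspection_sensitivity": "second"}),
         ("img_c", {"inspection_sensitivity": "third"})]
print([img for img, _ in select_for_teacher_data(pairs)])  # ['img_b', 'img_c']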
After the defect images for teacher data are displayed on the display 35 (step S13), the operator inputs, for each defect image, a determination result of a true defect or a false defect, and the input is accepted (step S14). Teacher data is then generated by labeling the defect image with the determination result (step S15). As in the above processing example, the classifier 521 is generated using a plurality of pieces of teacher data.
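The labeling loop could be sketched as follows (the callback, label strings, and sample layout are assumptions; the actual operator input is given through the display):

# Hypothetical sketch of steps S13 to S15: each selected defect image is shown to the
# operator, the true/false-defect judgment is collected, and a labelled sample is stored.
def generate_teacher_data(selected_pairs, ask_operator):
    """ask_operator(image) is a callback returning 'true_defect' or 'false_defect'."""
    teacher_data = []
    for image, info in selected_pairs:
        label = ask_operator(image)                                          # step S14
        teacher_data.append({"image": image, "label": label, "info": info})  # step S15
    return teacher_data

samples = generate_teacher_data(
    [("img_b", {"inspection_sensitivity": "second"})],
    ask_operator=lambda image: "false_defect")   # stand-in for the interactive judgment
print(samples[0]["label"])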
In the inspection of the printed circuit board 9 by the inspection system 1, when a defect is detected by the inspection unit 20 (Fig. 10: step S21), a defect image including the defect and the defect-related information are output to the classification necessity determining unit 53. As described above, the defect-related information includes inspection sensitivity information. Next, the classification necessity determining unit 53 determines, based on the defect-related information, whether the defect shown in the defect image needs to be classified by the classification unit 52. In this processing example, it is determined that classification by the classification unit 52 is not necessary for defects detected at the specific inspection sensitivity (step S22). One example of the specific inspection sensitivity is the first inspection sensitivity set for the first sensitivity setting region 91. The reason why classification by the classification unit 52 is determined to be unnecessary for defects detected at the specific inspection sensitivity will be described later.
The defect image and the defect-related information of a defect determined not to require classification are output to the defect verification device 11. In the defect verification device 11, the region on the printed circuit board 9 corresponding to the defect image is photographed and displayed on the display. The operator checks the defect included in the displayed image and determines whether it is a true defect or a false defect (step S23).
On the other hand, defects detected at an inspection sensitivity other than the specific inspection sensitivity are determined to require classification by the classification unit 52 (step S22). The defect image and the defect-related information of such a defect are output to the classification unit 52. In the classification unit 52, the defect image is input to the classifier 521, and the defect shown in the defect image is classified as a true defect or a false defect (step S24). When the classification unit 52 classifies the defect as a true defect (step S25), the defect image and the defect-related information of the defect are output to the defect verification device 11, and the operator makes the final determination as to whether the defect is a true defect or a false defect (step S23). When the classification unit 52 classifies the defect as a false defect (step S25), the defect image and the defect-related information are not output to the defect verification device 11, and the processing of the defect ends.
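The run-time routing in steps S21 to S25 could be sketched as follows (the function, queue, and label strings are assumptions used only to make the control flow concrete):

# Hypothetical sketch of the run-time flow: defects detected at the specific inspection
# sensitivity bypass the classifier 521 and go straight to the defect verification device;
# the rest are classified, and only true defects are forwarded for operator confirmation.
SPECIFIC_SENSITIVITIES = {"first"}

def route_defect(defect_image, info, classifier, verification_queue):
    if info["inspection_sensitivity"] in SPECIFIC_SENSITIVITIES:
        verification_queue.append((defect_image, info))    # operator makes the judgment
        return "sent_to_verification"
    label = classifier(defect_image)                        # 'true_defect' or 'false_defect'
    if label == "true_defect":
        verification_queue.append((defect_image, info))     # final confirmation by operator
        return "classified_true"
    return "classified_false"                               # processing of this defect ends

queue = []
print(route_defect("img", {"inspection_sensitivity": "first"}, lambda i: "true_defect", queue))
print(route_defect("img", {"inspection_sensitivity": "second"}, lambda i: "false_defect", queue))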
Here, the reason why classification by the classification unit 52 is unnecessary for defects detected at the specific inspection sensitivity will be described. As described above, a defect in the first sensitivity setting region 91 has a large influence on the operation of the printed circuit board 9, so classification errors by the classifier 521 (in this case, misclassifying a true defect as a false defect) may not be tolerable. Therefore, for defects detected in the first sensitivity setting region 91, that is, defects detected at the specific inspection sensitivity, it is preferable that the operator checks the defect with the defect verification device 11 and makes the final determination of a true defect or a false defect, so that serious misclassification by the classifier 521 is avoided. Since defects detected at the specific inspection sensitivity are thus not classified by the classifier 521, it is preferable not to use defect images of such defects as teacher data.
As described above, in the present processing example of the inspection system 1, one of a plurality of inspection sensitivities is set for each position of the printed circuit board 9, and the defect-related information includes information indicating the inspection sensitivity used by the inspection unit 20 when detecting the defect. The classification necessity determining unit 53 determines that classification by the classification unit 52 is not necessary for defects detected at the specific inspection sensitivity. This easily prevents misclassification of defects in a region of high inspection sensitivity where misclassification is not tolerable. The image necessity determining unit 42 of the teacher data generation device 4 determines that defect images containing defects detected at the specific inspection sensitivity are not used for teacher data. A preferable classifier 521 can thus be generated on the premise that defects in the region of high inspection sensitivity are not classified.
In the above processing example, defects detected at the second inspection sensitivity and defects detected at the third inspection sensitivity are classified by the same classifier 521, but a classifier may instead be generated for each inspection sensitivity. For example, a plurality of pieces of teacher data for defects detected at the second inspection sensitivity are generated, and machine learning is performed using them to generate a classifier for the second inspection sensitivity; a classifier for the third inspection sensitivity is generated in the same way. In the inspection of the printed circuit board by the inspection system 1, defects detected at the second inspection sensitivity are determined to require classification by the classification unit 52 and are classified as true defects or false defects by the classifier for the second inspection sensitivity; defects detected at the third inspection sensitivity are likewise classified by the classifier for the third inspection sensitivity. In the inspection system 1, a classifier (classifying defects as true or false) may also be generated for each defect type acquired by the inspection unit 20.
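A sketch of this per-sensitivity variant (the dictionary of stand-in models and the sensitivity labels are assumptions; each entry would in practice be a separately trained learned model):

# Hypothetical sketch: one classifier per inspection sensitivity, selected at classification time.
classifiers = {
    "second": lambda image: "false_defect",   # stand-in for the second-sensitivity model
    "third":  lambda image: "true_defect",    # stand-in for the third-sensitivity model
}

def classify(defect_image, info):
    model = classifiers[info["inspection_sensitivity"]]
    return model(defect_image)

print(classify("img", {"inspection_sensitivity": "third"}))  # true_defect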
Various modifications can be made to the above-described inspection system 1, teacher data generation device 4, and teacher data generation method.
Since printed circuit boards differ in line/space, material, process, and the like depending on the model, the classification necessity determining unit 53 may determine whether a defect requires classification by the classification unit 52 based on the model of the printed circuit board in which the inspection unit 20 detected the defect. Whether classification by the classification unit 52 is necessary may also be determined based on the type of process performed on the printed circuit board before the defect was detected. For example, when a defect is detected, the inspection unit 20 acquires an identification number indicating the model of the printed circuit board and/or the type of process and includes it in the defect-related information. The classification necessity determining unit 53 stores a table indicating, for each model and/or identification number, whether classification is necessary, and determines whether the defect requires classification by the classification unit 52 by looking up the model and/or identification number included in the defect-related information in that table. The same applies to the image necessity determining unit 42 of the teacher data generation device 4.
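The table lookup described above could be sketched as follows (the table contents, key format, and default behaviour are assumptions):

# Hypothetical sketch of the modification above: a table keyed by board model and process
# identification number tells whether classification by the classification unit is needed;
# unknown combinations default to requiring classification.
NEED_CLASSIFICATION = {
    ("model_A", "proc_01"): True,
    ("model_A", "proc_02"): False,
    ("model_B", "proc_01"): True,
}

def classification_needed(defect_related_info: dict) -> bool:
    key = (defect_related_info.get("board_model"), defect_related_info.get("process_id"))
    return NEED_CLASSIFICATION.get(key, True)

print(classification_needed({"board_model": "model_A", "process_id": "proc_02"}))  # False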
The defect types that do not need to be classified by the classification unit 52 are not limited to open defects and short defects; for example, other defect types, such as solder resist peeling on the external circuit board, may be included in the specific defect types that do not need to be classified. The same applies to the determination of defect images that are not used for teacher data.
The classification necessity determining unit 53 of the first embodiment need not determine that both defects of the specific defect type and defects detected by the second inspection processing unit 22 do not require classification by the classification unit 52; it may determine that only one of the two does not require classification. The same applies to the image necessity determining unit 42 of the teacher data generation device 4.
The inspection unit 20 does not necessarily need to acquire the defect type of a defect. Also, only one of the first inspection processing unit 21 and the second inspection processing unit 22 may be provided. The first inspection process in the first inspection processing unit 21 may be any other process capable of acquiring defect position information, and the second inspection process in the second inspection processing unit 22 may be a process other than the one described above.
In the first and second embodiments described above, the operator inputs the determination result of a true defect or a false defect for the defect image in step S14 of Fig. 9, but a determination result of a defect type other than a true defect or a false defect (for example, foreign matter adhesion or film peeling) may also be input. That is, the determination result receiving unit 44 receives, from the operator, an input of the determination result of the defect type (including the determination result of a true defect or a false defect) for the defect image displayed on the display 35. Likewise, the classification unit 52 may classify defects into defect types other than true defects and false defects.
In the inspection system 1, the function of the classification necessity determining unit 53 may be provided in the inspection apparatus 2. The functions of the image receiving unit 41 and the image necessity determining unit 42 of the teacher data generation device 4 may also be provided in the inspection apparatus 2.
The inspection object of the inspection unit 20 is not limited to a printed circuit board and may be another circuit board such as a semiconductor circuit board or a glass circuit board. The inspection unit 20 may also detect defects in objects other than circuit boards, such as mechanical components. The inspection system 1 and the teacher data generation device 4 can be used for the inspection of various objects.
The structures of the above embodiments and modifications may be appropriately combined as long as they do not contradict each other.
Although the invention has been described and illustrated in detail, the foregoing description is in all respects illustrative and not restrictive. Accordingly, numerous modifications and variations can be made without departing from the scope of the present invention.

Claims (10)

1. An inspection system comprising:
an inspection unit that detects a defect by inspecting, without using machine learning, an image obtained by photographing an object;
a classification unit that has a learned model generated in advance and classifies a defect type of a defect by inputting an image showing the defect into the learned model; and
a classification necessity determining unit that determines, based on the defect-related information acquired or used by the inspection unit when the defect is detected, whether the defect detected by the inspection unit needs to be classified by the classification unit.
2. The inspection system according to claim 1, wherein
the classification unit classifies the defect detected by the inspection unit as a true defect or a false defect,
the inspection unit acquires a defect type of the defect when detecting the defect, the defect type being included in the defect-related information, and
the classification necessity determining unit determines that classification by the classification unit is not necessary for a defect of a specific defect type.
3. The inspection system according to claim 1 or 2, wherein
the inspection unit includes:
a first inspection processing unit that detects a defect by a first inspection process and acquires a defect image including the defect; and
a second inspection processing unit that detects a defect by a second inspection process different from the first inspection process and acquires a defect image including the defect,
the first inspection processing unit is capable of acquiring position information of a defect in a defect image that is more detailed than that acquired by the second inspection processing unit, the defect-related information including the position information acquired by the first inspection processing unit,
the classification necessity determining unit determines that the defect detected by the first inspection processing unit needs to be classified by the classification unit and that the defect detected by the second inspection processing unit does not need to be classified by the classification unit, and
the classification unit inputs, to the learned model, an image obtained by cutting out a region of the defect from the defect image based on the position information of the defect.
4. The inspection system according to claim 1 or 2, wherein
one of a plurality of inspection sensitivities is set for each position of the object,
information indicating the inspection sensitivity used by the inspection unit when detecting a defect is included in the defect-related information, and
the classification necessity determining unit determines that classification by the classification unit is not required for a defect detected at a specific inspection sensitivity.
5. A teacher data generation device for generating teacher data, comprising:
an image receiving unit that receives, from an inspection unit that inspects an image obtained by photographing an object without using machine learning, a defect image including a defect and defect-related information acquired or used when the defect was detected;
an image necessity determining unit that determines, based on the defect-related information, whether to use the defect image for teacher data;
a display control unit that displays the defect image for teacher data on a display;
a determination result receiving unit that receives an input, by an operator, of a determination result of a defect type for the defect image displayed on the display; and
a teacher data generation unit that generates teacher data by labeling the defect image with the determination result.
6. The teacher data generation device according to claim 5, characterized in that,
the determination result receiving unit receives an input, by an operator, of a determination result of a true defect or a false defect,
the defect type of the defect obtained when the inspection unit detects the defect is included in the defect-related information, and
the image necessity determining unit determines not to use, for teacher data, a defect image including a defect of a specific defect type.
7. The teacher data generation device according to claim 5 or 6, characterized in that,
the inspection unit includes:
a first inspection processing unit that detects a defect by a first inspection process and acquires a defect image including the defect; and
a second inspection processing unit that detects a defect by a second inspection process different from the first inspection process and acquires a defect image including the defect,
the first inspection processing unit is capable of acquiring position information of a defect in a defect image that is more detailed than that acquired by the second inspection processing unit, the defect-related information including the position information acquired by the first inspection processing unit,
the image necessity determining unit determines to use the defect image of the defect detected by the first inspection processing unit for teacher data and not to use the defect image of the defect detected by the second inspection processing unit for teacher data, and
the teacher data generation unit generates teacher data including an image obtained by cutting out a region of the defect from the defect image based on the position information of the defect.
8. The teacher data generation device according to claim 5 or 6, characterized in that,
one of a plurality of inspection sensitivities is set for each position of the object,
information indicating the inspection sensitivity used by the inspection unit when detecting a defect is included in the defect-related information, and
the image necessity determining unit determines not to use, for teacher data, a defect image including a defect detected at a specific inspection sensitivity.
9. A teacher data generation method for generating teacher data, comprising:
a) a step of receiving, from an inspection unit that inspects an image obtained by photographing an object without using machine learning, a defect image including a defect and defect-related information acquired or used when the defect was detected;
b) a step of determining, based on the defect-related information, whether to use the defect image for teacher data;
c) a step of displaying the defect image for teacher data on a display;
d) a step of receiving an input, by an operator, of a determination result of a defect type for the defect image displayed on the display; and
e) a step of generating teacher data by labeling the defect image with the determination result.
10. A storage medium storing a program for causing a computer to generate teacher data, wherein the program, when executed by the computer, causes the computer to execute:
a) a step of receiving, from an inspection unit that inspects an image obtained by photographing an object without using machine learning, a defect image including a defect and defect-related information acquired or used when the defect was detected;
b) a step of determining, based on the defect-related information, whether to use the defect image for teacher data;
c) a step of displaying the defect image for teacher data on a display;
d) a step of receiving an input, by an operator, of a determination result of a defect type for the defect image displayed on the display; and
e) a step of generating teacher data by labeling the defect image with the determination result.
CN202310101400.0A 2022-03-24 2023-02-09 Inspection system, teacher data generation device, teacher data generation method, and storage medium Pending CN116804637A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-048188 2022-03-24
JP2022048188A JP2023141721A (en) 2022-03-24 2022-03-24 Inspection system, teacher data generation device, teacher data generation method and program

Publications (1)

Publication Number Publication Date
CN116804637A (en) 2023-09-26

Family

ID=88078658

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310101400.0A Pending CN116804637A (en) 2022-03-24 2023-02-09 Inspection system, teacher data generation device, teacher data generation method, and storage medium

Country Status (3)

Country Link
JP (1) JP2023141721A (en)
KR (1) KR20230138869A (en)
CN (1) CN116804637A (en)

Also Published As

Publication number Publication date
KR20230138869A (en) 2023-10-05
TW202338745A (en) 2023-10-01
JP2023141721A (en) 2023-10-05

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination