CN115375608A - Detection method and device, detection equipment and storage medium - Google Patents

Detection method and device, detection equipment and storage medium

Info

Publication number
CN115375608A
CN115375608A (application CN202110557047.8A)
Authority
CN
China
Prior art keywords
image
detected
template
connected domain
area
Prior art date
Legal status
Pending
Application number
CN202110557047.8A
Other languages
Chinese (zh)
Inventor
陈鲁
肖遥
佟异
张鹏斌
张嵩
Current Assignee
Shenzhen Zhongke Feice Technology Co Ltd
Original Assignee
Shenzhen Zhongke Feice Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Zhongke Feice Technology Co Ltd
Priority application: CN202110557047.8A
Publication: CN115375608A

Classifications

    • G06T 7/0004 Industrial image inspection (under G06T 7/00 Image analysis; G06T 7/0002 Inspection of images, e.g. flaw detection)
    • G06T 7/11 Region-based segmentation
    • G06T 7/187 Segmentation involving region growing, region merging or connected component labelling
    • G06T 2207/10004 Still image; Photographic image
    • G06T 2207/20021 Dividing image into blocks, subimages or windows
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30164 Workpiece; Machine component

Abstract

The application provides a detection method, a detection device, detection equipment, and a non-volatile computer-readable storage medium. The detection method comprises: matching an image to be detected with a preset template image to obtain a first image area; identifying connected domains of the first image area to generate connected domain images; acquiring, in the image to be detected, a second image area corresponding to each connected domain image; and comparing the second image area with the connected domain image to detect defects of the image to be detected. By first matching the image to be detected as a whole and then precisely matching each part to be detected within it, the matching effect between the image to be detected and the template image is improved; each part to be detected is then compared with its corresponding connected domain image to determine its defects, completing the defect detection of the image to be detected with a better detection result.

Description

Detection method and device, detection equipment and storage medium
Technical Field
The present application relates to the field of detection technologies, and in particular, to a detection method, a detection apparatus, a detection device, and a non-volatile computer-readable storage medium.
Background
When a workpiece is inspected to determine its defects, template matching is used: the image of the workpiece's region to be inspected is matched with the corresponding region of a standard template, completing an overall match of the region to be inspected. However, when an offset part exists within the region to be inspected, the match is no longer accurate, and the detection effect of template matching is poor.
Disclosure of Invention
The application provides a detection method, a detection device and a non-volatile computer readable storage medium.
The detection method comprises the steps of matching an image to be detected with a preset template image to obtain a first image area; identifying a connected domain of the first image region to generate a connected domain image; acquiring a second image area corresponding to the connected domain image in the image to be detected; and comparing the second image area with the connected domain image to detect the defects of the image to be detected.
The detection device comprises a matching module, an identification module, an acquisition module and a comparison module. The matching module is used for matching the image to be detected with a preset template image so as to acquire a first image area; the identification module is used for identifying the connected domain of the first image area so as to generate a connected domain image; the acquisition module is used for acquiring a second image area corresponding to the connected domain image in the image to be detected; and the comparison module is used for comparing the second image area with the connected domain image so as to detect the defects of the image to be detected.
The detection device of the embodiment of the application comprises a processor. The processor is used for matching the image to be detected with a preset template image to acquire a first image area; identifying a connected component of the first image region to generate a connected component image; acquiring a second image area corresponding to the connected domain image in the image to be detected; and comparing the second image area with the connected domain image to detect the defects of the image to be detected.
The non-volatile computer-readable storage medium of the embodiments of the application stores a computer program which, when executed by one or more processors, causes the processors to perform the detection method. The detection method comprises the steps of matching an image to be detected with a preset template image to obtain a first image area; identifying a connected domain of the first image region to generate a connected domain image; acquiring a second image area corresponding to the connected domain image in the image to be detected; and comparing the second image area with the connected domain image to detect the defects of the image to be detected.
According to the detection method, the detection device, the detection equipment, and the non-volatile computer-readable storage medium of the embodiments of the present application, a first image area matching the image to be detected as a whole is obtained by matching the image to be detected with a template image. The connected domains in the first image area are then identified to obtain one or more connected domain images, and each connected domain image is matched with the image to be detected again, so that a second image area corresponding to it is found in the image to be detected. This completes the precise matching of each part to be detected (corresponding to a second image area) in the image to be detected and improves the matching effect between the image to be detected and the template image. Each part to be detected is then compared with its corresponding connected domain image to determine its defects, completing the defect detection of the image to be detected with a better detection result.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from them without creative effort.
FIG. 1 is a schematic flow chart of a detection method according to certain embodiments of the present application;
FIG. 2 is a block schematic diagram of a detection device according to certain embodiments of the present application;
FIG. 3 is a schematic plan view of a detection apparatus according to certain embodiments of the present application;
FIG. 4 is a schematic diagram of an image under test according to some embodiments of the present application;
FIG. 5 is a schematic illustration of a template image of some embodiments of the present application;
FIG. 6 is a schematic flow chart of a detection method according to certain embodiments of the present application;
FIG. 7 is a schematic illustration of a second template according to certain embodiments of the present application;
FIGS. 8-9 are schematic flow charts of detection methods according to certain embodiments of the present application;
FIGS. 10-12 are schematic illustrations of the detection method of certain embodiments of the present application;
FIGS. 13a, 13b and 13c are schematic illustrations of the detection method of certain embodiments of the present application;
FIG. 14 is a schematic flow chart of a detection method according to certain embodiments of the present application;
FIG. 15a, FIG. 15b, FIG. 16a, FIG. 16b, FIG. 17a and FIG. 17b are schematic illustrations of the detection method of certain embodiments of the present application; and
FIG. 18 is a schematic diagram of a connection between a processor and a computer-readable storage medium according to some embodiments of the present application.
Detailed Description
Embodiments of the present application will be further described below with reference to the accompanying drawings. The same or similar reference numbers in the drawings identify the same or similar elements or elements having the same or similar functionality throughout. In addition, the embodiments of the present application described below in conjunction with the accompanying drawings are exemplary and are only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the present application.
Referring to fig. 1 to 3, the detection method of the embodiments of the present application includes the following steps:
011: matching an image to be detected with a preset template image to obtain a first image area;
012: identifying a connected component of the first image region to generate a connected component image;
013: acquiring a second image area corresponding to the connected domain image in the image to be detected;
014: and comparing the second image area with the connected domain image to detect the defects of the image to be detected.
The detection device 10 of the embodiment of the present application includes a matching module 11, an identifying module 12, an obtaining module 13, and a comparing module 14. The matching module 11 is configured to match the image to be detected with a preset template image to obtain a first image area; the identification module 12 is configured to identify a connected component of the first image region to generate a connected component image; the obtaining module 13 is configured to obtain a second image area corresponding to the connected domain image in the image to be detected; the comparison module 14 is configured to compare the second image region with the connected domain image to detect a defect of the image to be detected. That is, step 011 can be implemented by the matching module 11, step 012 can be executed by the identification module 12, step 013 can be executed by the obtaining module 13, and step 014 can be executed by the comparison module 14.
The detection apparatus 100 of the present embodiment includes a processor 20. The processor 20 is configured to match the image to be detected with a preset template image to obtain a first image region; identifying a connected component of the first image region to generate a connected component image; acquiring a second image area corresponding to the connected domain image in the image to be detected; and comparing the second image area with the connected domain image to detect the defects of the image to be detected. That is, step 011, step 012, step 013, and step 014 may be performed by processor 20.
Specifically, the detection equipment 100 may be a measuring machine. It is understood that the specific form of the detection equipment 100 is not limited to a measuring machine; it may be any equipment capable of inspecting the object 200 to be measured.
The detection apparatus 100 includes a processor 20, a motion platform 30, and a sensor 40. Both the processor 20 and the sensor 40 may be located on the motion platform 30. The motion platform 30 can be used to carry the object 200, and the motion platform 30 moves to drive the object 200 to move, so that the sensor 40 collects information of the object 200 (for example, collects an image of an area to be measured of the object 200).
For example, the motion platform 30 includes an XY motion platform 31 and a Z motion platform 32. The object 200 to be measured is disposed on the XY motion platform 31, and the sensor 40 is disposed on the Z motion platform 32. The XY motion platform 31 moves the object 200 along the horizontal plane, thereby changing the relative position of the object 200 and the sensor 40 in that plane, while the Z motion platform 32 moves the sensor 40 perpendicular to the horizontal plane. Through the cooperation of the XY motion platform 31 and the Z motion platform 32, the three-dimensional position of the sensor 40 relative to the object 200 (i.e., their relative positions both in and perpendicular to the horizontal plane) can be adjusted. Alternatively, the assignment may be reversed, with the XY motion platform 31 carrying the sensor 40 and the Z motion platform 32 carrying the object 200 to be measured.
It is understood that the motion platform 30 is not limited to the above structure, and only needs to be able to change the three-dimensional position of the sensor 40 relative to the object 200.
The sensor 40 may be a visible light camera, a depth camera, a distance sensor, or the like. In the present embodiment, the sensor 40 is a visible light camera.
The sensor 40 can acquire an image of the object 200 to be measured. For example, where the object 200 comprises a plurality of regions to be measured, the sensor 40 captures an image of at least part of the object 200 each time, and, through cooperation with the motion platform 30, images of all regions of the object 200 are acquired.
Then, when detecting the image to be detected, the processor 20 first obtains the preset template image matched with the image to be detected. For example, the preset template image may be determined according to parameter information (such as type and model) of the object 200 to be measured, and the object 200 may be a wafer, a display panel, or the like.
The processor 20 then matches the image to be detected with the preset template image. Matching means finding, within the preset template image, a region that is substantially identical to the image to be detected. The template image can be traversed with a traversal frame of the same size as the image to be detected; at each traversal position, the similarity between the image region inside the frame and the image to be detected is calculated. The similarity can be determined from the pixel-value differences at corresponding positions of the two images, for example their sum, mean, or variance. After the whole template image has been traversed, a set of similarity scores is obtained; the image region with the highest similarity is taken to match the image to be detected and constitutes the first image area.
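The traversal-and-similarity procedure described above can be sketched in NumPy. This is a minimal illustration under stated assumptions, not the patent's exact implementation: the function name `match_template` is invented here, and the sum of squared differences is just one of the similarity measures the text mentions (sum, mean, or variance of pixel differences).

```python
import numpy as np

def match_template(image: np.ndarray, template: np.ndarray) -> tuple:
    """Slide a frame the size of `image` over the larger `template` and
    return the top-left corner (row, col) of the best-matching region,
    scored by the sum of squared pixel-value differences (lower = closer)."""
    ih, iw = image.shape
    th, tw = template.shape
    probe = image.astype(np.int64)
    best_score, best_pos = None, (0, 0)
    for r in range(th - ih + 1):
        for c in range(tw - iw + 1):
            window = template[r:r + ih, c:c + iw].astype(np.int64)
            score = int(np.sum((window - probe) ** 2))
            if best_score is None or score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# A bright 3x3 patch embedded in an 8x8 template is recovered exactly.
template = np.zeros((8, 8), dtype=np.uint8)
template[2:5, 3:6] = 200
print(match_template(template[2:5, 3:6].copy(), template))  # (2, 3)
```

In practice an exhaustive double loop like this is slow; library routines (e.g. normalized cross-correlation) compute the same score map far more efficiently, but the principle is the one stated above.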
It can be understood that, referring to fig. 4, the shooting precision of the sensor 40 cannot guarantee that exactly one minimal repeating unit Q is captured each time, such that the image P1 to be measured covers the minimal repeating unit Q exactly; instead, the field of view of the sensor 40 may span a channel H and capture portions of several minimal repeating units Q at once.
In other embodiments, the object 200 to be measured carries a positioning mark O for matching. The processor 20 first identifies the positioning mark O in both the image P1 to be measured and the template image, and then determines the first image area by matching the positioning marks O. For example, the positioning mark O may be the center of a characteristic circle on the workpiece; the circle centers identified in the image P1 to be measured and in the template image are brought into alignment, thereby determining the first image area.
Referring to fig. 5, the preset template image P2 may include images of a plurality of standard minimal repeating units Q. Since the field of view of the sensor 40 covers one minimal repeating unit Q, the sensor 40 can at most capture portions of 4 minimal repeating units Q simultaneously when spanning a channel. The preset template image P2 therefore includes standard images of 4 minimal repeating units Q arranged in a 2 × 2 matrix, which ensures that, no matter which part of the object 200 to be measured the sensor 40 captures, a matching first image area A1 can be found in the preset template image P2, such as the first image area A1 matched in the template image P2 for the image P1 to be measured shown in fig. 4.
After the matching between the image P1 to be measured and the template image P2 is completed to obtain the first image area A1, the overall matching between the image P1 to be measured and the template image P2 is completed. The processor 20 may then identify connected components in the first image area A1 to generate one or more connected component images.
A connected domain is an area formed of pixels whose values exceed a preset threshold. It can be understood that the color of the circuit on a wafer differs markedly from that of the wafer substrate: the substrate is close to black, while the circuit is not black but red, yellow, etc. The image to be detected collected by the sensor 40 may be a grayscale image in which the circuit portion is close to white, so pixels belonging to the circuit can be identified by setting a preset threshold (such as 100 or 150), and adjacent circuit pixels are grouped into the same connected domain. One or more connected domains are thus identified; these are the areas occupied by the circuits requiring defect detection. A connected domain image can then be generated from the pixels of each connected domain, or from the image area covered by the connected domain's circumscribed rectangle. Of course, the connected domains in the template image may instead be labeled in advance, so that they can be determined simply by identifying the labeled regions.
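The thresholding-and-grouping step can be illustrated with a small self-contained sketch. The helper name `connected_domains` and the 4-neighbour breadth-first grouping are illustrative choices; the patent only requires that adjacent above-threshold pixels be merged into one connected domain.

```python
import numpy as np
from collections import deque

def connected_domains(gray: np.ndarray, thresh: int = 100) -> list:
    """Threshold a grayscale image (circuit pixels are near white) and
    group edge-adjacent above-threshold pixels into connected domains.
    Returns one boolean mask per connected domain."""
    fg = gray > thresh
    h, w = gray.shape
    labels = np.zeros((h, w), dtype=int)
    masks, current = [], 0
    for i in range(h):
        for j in range(w):
            if fg[i, j] and labels[i, j] == 0:
                current += 1
                labels[i, j] = current
                queue = deque([(i, j)])
                while queue:  # flood-fill this domain
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        v, u = y + dy, x + dx
                        if 0 <= v < h and 0 <= u < w and fg[v, u] and labels[v, u] == 0:
                            labels[v, u] = current
                            queue.append((v, u))
                masks.append(labels == current)
    return masks

# Two separate bright blocks yield two connected domains.
img = np.zeros((6, 6), dtype=np.uint8)
img[0:2, 0:2] = 255
img[4:6, 4:6] = 255
print(len(connected_domains(img)))  # 2
```

Production code would typically use an optimized labelling routine (e.g. `scipy.ndimage.label` or `cv2.connectedComponents`), but the grouping rule is the same.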
After the connected domain images in the first image area are obtained, each connected domain image can be matched with the image to be detected again, so as to find, in the image to be detected, a second image area matched with that connected domain image. The matching method may be similar to that used between the image to be detected and the preset template image, and is not repeated here. In this way, each circuit region in the image to be detected is accurately matched with its corresponding connected domain image, improving the matching accuracy.
Finally, the processor 20 compares each second image area with its connected domain image; based on the differences between them, defect detection of each second image area can be realized. For example, a difference image may be generated from the pixel-value differences at corresponding positions of the second image area and the connected domain image, and the differences between the two determined from the difference image, so as to determine the defects of each second image area.
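As a sketch of this comparison step, the difference image can be thresholded to flag defect pixels. The tolerance value and the function name `defect_pixels` are assumptions for illustration; the patent leaves the exact difference criterion open.

```python
import numpy as np

def defect_pixels(second_area: np.ndarray, domain_img: np.ndarray, tol: int = 30) -> np.ndarray:
    """Build the difference image between a second image area and its
    connected domain image, and flag pixels whose absolute difference
    exceeds `tol` as candidate defects."""
    # Widen dtype before subtracting so uint8 values cannot wrap around.
    diff = np.abs(second_area.astype(np.int16) - domain_img.astype(np.int16))
    return diff > tol

# One deviating pixel in an otherwise identical area is flagged.
reference = np.full((4, 4), 200, dtype=np.uint8)
measured = reference.copy()
measured[1, 2] = 90
flags = defect_pixels(measured, reference)
print(int(flags.sum()))  # 1
```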
According to the detection method, the detection device 10, and the detection equipment 100 of the embodiments of the present application, a first image area matching the image to be detected as a whole is obtained by matching the image to be detected with the template image. The connected domains in the first image area are then identified to obtain one or more connected domain images, and each connected domain image is matched with the image to be detected again, so that a second image area corresponding to it is found in the image to be detected. This completes the precise matching of each part to be detected (corresponding to a second image area) in the image to be detected and improves the matching effect between the image to be detected and the template image. Each part to be detected is then compared with its corresponding connected domain image to determine its defects, completing the defect detection of the image to be detected with a better detection result.
Referring to fig. 2, 3 and 6, in some embodiments, the template image includes a first template and a second template, the first template is an image of a minimum repeating unit of the device under test 200, and the second template is a circuit image of the minimum repeating unit, step 011 includes:
0111: matching the image to be detected with the first template to obtain a target image area;
0112: and acquiring an image area corresponding to the target image area in the second template as a first image area based on the mapping relation between the first template and the second template.
In some embodiments, the matching module 11 is further configured to match the image to be detected with the first template to obtain a target image region; and acquiring an image area corresponding to the target image area in the second template as a first image area based on the mapping relation between the first template and the second template. That is, step 0111 and step 0112 may be performed by the matching module 11.
In some embodiments, the processor 20 is further configured to match the image to be measured with the first template to obtain a target image region; and acquiring an image area corresponding to the target image area in the second template as a first image area based on the mapping relation between the first template and the second template. That is, step 0111 and step 0112 may be performed by processor 20.
Specifically, referring to fig. 5 and 7, the template image P2 includes a first template P21 and a second template P22. The first template P21 is an image (e.g., a grayscale image) of the minimal repeating unit Q, while the second template P22 is a circuit image of the minimal repeating unit Q, containing only the line portions of the minimal repeating unit Q. The first template P21 and the second template P22 each include images corresponding to 4 minimal repeating units Q arranged in a 2 × 2 matrix, ensuring that the image to be measured has a matching image region in both the first template P21 and the second template P22.
When matching the image to be detected with the template image P2, note that the acquired image to be detected is generally a grayscale image, so the image to be detected is first matched with the first template P21 to determine a target image area (such as the target image area a11 shown in fig. 5); for the matching method, refer to the matching of the image to be detected and the template image P2 in the previous embodiment. The first template P21 and the second template P22 have a preset mapping relationship: it can be understood that they have the same size, and the images at corresponding positions correspond to the same part of the standard object 200. Therefore, once the target image area a11 in the first template P21 is determined, the image area a12 corresponding to it (the area where the white box in fig. 7 is located) can be quickly found in the second template P22 according to the mapping relationship. Since it is the defects of the lines that are to be detected, the image area in the second template P22 corresponding to the target image area a11 is used as the first image area A1, so that the line portions of the image to be measured can be matched more accurately, improving the accuracy of line defect detection.
The second template P22 may be generated from the first template P21, for example by labeling the contours of the lines in the first template P21 to determine the region images where the lines are located. Alternatively, since the circuit of the object 200 to be measured is designed according to a preset circuit diagram, such as a Computer Aided Design (CAD) line drawing, the second template P22 can also be generated from the CAD line drawing; because the actual circuit has a physical width, the lines in the CAD drawing can be thickened so that their width in the second template P22 matches the width of the actual circuit.
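Thickening the CAD lines so their drawn width approaches the physical circuit width amounts to a binary dilation. A minimal NumPy sketch follows; the function name and the one-pixel, four-direction growth per pass are illustrative assumptions, since the patent does not fix a particular dilation scheme.

```python
import numpy as np

def thicken(lines: np.ndarray, passes: int = 1) -> np.ndarray:
    """Dilate a boolean line drawing: each pass grows every line pixel by
    one pixel up, down, left and right."""
    out = lines.copy()
    for _ in range(passes):
        grown = out.copy()
        grown[1:, :] |= out[:-1, :]   # grow downward
        grown[:-1, :] |= out[1:, :]   # grow upward
        grown[:, 1:] |= out[:, :-1]   # grow rightward
        grown[:, :-1] |= out[:, 1:]   # grow leftward
        out = grown
    return out

# A single line pixel grows into a 5-pixel cross after one pass.
lines = np.zeros((5, 5), dtype=bool)
lines[2, 2] = True
print(int(thicken(lines).sum()))  # 5
```

Repeating the pass (or using a larger structuring element, as morphology libraries do) widens the line further until it matches the real circuit width.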
Referring to fig. 2, 3 and 8, in some embodiments, step 012 includes:
0121: one or more pixels of the first image region having a pitch less than the predetermined pitch are identified as the same connected component to generate a connected component image.
In some embodiments, the identification module 12 is further configured to identify one or more pixels of the first image region having a pitch less than a predetermined pitch as the same connected component to generate a connected component image. That is, step 0121 may be performed by the identification module 12.
In some embodiments, the processor 20 is further configured to identify one or more pixels of the first image region having a pitch less than the predetermined pitch as the same connected component to generate a connected component image. That is, step 0121 may be performed by processor 20.
Specifically, when determining a connected domain, adjacent pixels of the first image area that lie in the line region and whose pitch (the distance between the centers of two pixels) is smaller than a predetermined pitch (e.g., 1 pixel or 2 pixels) are treated as pixels of the same connected domain. Since pixels are rectangular, when the predetermined pitch is 1 pixel, only pixels that share an edge and both lie in the line region are grouped into the same connected domain; when the predetermined pitch is 2 pixels, pixels whose corners are adjacent (diagonal neighbors) and that lie in the line region are also grouped into the same connected domain. In this way, the pixels belonging to the same connected domain can be determined quickly.
It is to be understood that the first image region is part of the circuit drawing, which may be a binary image: for example, pixels in the line region have the value 255 and pixels elsewhere have the value 0, so whether a pixel lies in the line region can be determined by checking whether its pixel value is 255. In this way, the connected domains in the first image region can be accurately identified.
The processor 20 then generates a connected domain image from each identified connected domain. Specifically, the processor 20 crops the image area where the minimum circumscribed rectangle of the connected domain is located and uses it as the connected domain image corresponding to that connected domain, which facilitates the subsequent comparison between the connected domain image and the image to be detected when determining the second image area.
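Cropping the minimum circumscribed rectangle of a connected domain reduces to finding the extreme row and column indices of its mask. A minimal sketch (the function name `bounding_rect` is assumed for illustration):

```python
import numpy as np

def bounding_rect(mask: np.ndarray):
    """Return ((top, left), (height, width)) of the minimum axis-aligned
    circumscribed rectangle of the True pixels in a connected-domain mask."""
    ys, xs = np.nonzero(mask)
    top, left = int(ys.min()), int(xs.min())
    height = int(ys.max()) - top + 1
    width = int(xs.max()) - left + 1
    return (top, left), (height, width)

# A 3x3 blob whose top-left corner sits at row 1, column 2.
mask = np.zeros((6, 6), dtype=bool)
mask[1:4, 2:5] = True
print(bounding_rect(mask))  # ((1, 2), (3, 3))
```

The connected domain image would then be the slice `image[top:top+height, left:left+width]` of the first image area.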
Referring to fig. 2, 3 and 9, in some embodiments, step 013 includes:
0131: enlarging the connected component image to generate a third image region;
0132: binarizing the image to be detected to generate a binarized image to be detected;
0133: acquiring a fourth image area corresponding to the third image area in the binary image to be detected;
0134: and matching the connected component image with the fourth image area to obtain a fifth image area matched with the connected component image in the fourth image area as a second image area.
In some embodiments, the obtaining module 13 is further configured to expand the connected component image to generate a third image region; binarizing the image to be detected to generate a binarized image to be detected; acquiring a fourth image area corresponding to the third image area in the binary image to be detected; and matching the connected component image with the fourth image area to obtain a fifth image area matched with the connected component image in the fourth image area as a second image area. That is, step 0131, step 0132, step 0133, and step 0134 may be performed by the acquisition module 13.
In some embodiments, processor 20 is further configured to expand the connected component image to generate a third image region; binarizing the image to be detected to generate a binarized image to be detected; acquiring a fourth image area corresponding to the third image area in the binary image to be detected; and matching the connected component image with the fourth image area to obtain a fifth image area matched with the connected component image in the fourth image area as a second image area. That is, step 0131, step 0132, step 0133 and step 0134 may be executed by processor 20.
Specifically, referring to fig. 10, take the image area where the minimum circumscribed rectangle N of a connected domain M is located as the connected domain image. Since the image to be detected and the first image area A1 are aligned after matching, a corresponding image area exists in the image to be detected for the connected domain image. However, the line area in the image to be detected may have a positional deviation (i.e., the line area may not be perfectly aligned with the standard line), so if matching is performed directly based on the minimum circumscribed rectangle N of the connected domain M, part of the corresponding line in the image to be detected may fall outside the image area corresponding to the minimum circumscribed rectangle N. Therefore, referring to fig. 11 and 12, processor 20 first expands the minimum circumscribed rectangle N, for example by scaling its side lengths by a predetermined multiple (such as 1.1, 1.2 or 1.5) about its center. For example, if the minimum circumscribed rectangle N is 10 × 10 pixels in size and the predetermined multiple is 1.2, the expanded circumscribed rectangle K is 12 × 12 pixels. Processor 20 may then use the image area where the expanded circumscribed rectangle K is located as the third image area A3, so that even if the line deviates, the fourth image area A4 corresponding to the third image area A3 in the image to be detected P1 can still cover the line.
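The rectangle expansion described above can be sketched as follows. The helper name `expand_rect` and the clamping to image bounds are assumptions for illustration; the patent only specifies scaling about the center by a predetermined multiple:

```python
def expand_rect(x0, y0, w, h, scale, img_w, img_h):
    """Scale a bounding rectangle about its center by `scale`
    (the predetermined multiple), clamped to the image bounds."""
    cx, cy = x0 + w / 2.0, y0 + h / 2.0
    nw, nh = round(w * scale), round(h * scale)
    nx0 = max(0, int(round(cx - nw / 2.0)))
    ny0 = max(0, int(round(cy - nh / 2.0)))
    nx1 = min(img_w, nx0 + nw)
    ny1 = min(img_h, ny0 + nh)
    return nx0, ny0, nx1 - nx0, ny1 - ny0

# A 10x10 rectangle N at (20, 20), multiple 1.2 -> a 12x12 rectangle K
# centered on the same point.
print(expand_rect(20, 20, 10, 10, 1.2, 100, 100))  # (19, 19, 12, 12)
```

Clamping matters when the connected domain sits near the image border, where the expanded rectangle would otherwise extend past the image.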
Processor 20 may first convert the image to be detected P1 into a binarized image to be detected of the same type as the second template, and then obtain the fourth image area A4 corresponding to the third image area A3 in the binarized image to be detected; alternatively, processor 20 may first obtain the fourth image area A4 corresponding to the third image area A3 in the image to be detected P1, and then binarize it to obtain the binarized image to be detected. Either way, the subsequent matching between the connected component image and the fourth image area A4 is facilitated.
The image to be detected P1 is a grayscale image in which the portions where the lines are located are close to white. By setting a preset threshold (such as 100 or 150), pixels belonging to the line region (e.g., with a pixel value greater than the preset threshold) and pixels belonging to the non-line region (e.g., with a pixel value less than or equal to the preset threshold) can be distinguished, so that the image to be detected is converted into the binarized image to be detected (e.g., line-region pixels are set to 255 and non-line-region pixels are set to 0).
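The thresholding rule above can be sketched in a few lines. The function name `binarize` and the sample array are illustrative assumptions:

```python
import numpy as np

def binarize(gray, threshold=150):
    """Pixels above the preset threshold become line pixels (255);
    all others become non-line pixels (0)."""
    return np.where(gray > threshold, 255, 0).astype(np.uint8)

gray = np.array([[30, 200],
                 [160, 90]], dtype=np.uint8)
print(binarize(gray))  # [[  0 255]
                       #  [255   0]]
```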
Then, referring to fig. 10 and 12, processor 20 matches the connected component image with the fourth image area A4. Since the fourth image area A4 is larger than the connected component image, a plurality of fifth image areas A5 of the same size as the connected component image may first be obtained from the fourth image area A4. Specifically, referring to fig. 13a, 13b and 13c, processor 20 may poll the fourth image area A4 with a polling frame of the same size as the connected component image, sliding the frame line by line over the entire fourth image area A4 to obtain the plurality of fifth image areas A5. Processor 20 then matches each fifth image area A5 against the connected component image to compute a matching degree, for example by taking the differences between the pixel values at corresponding positions of the fifth image area A5 and the connected component image, and using the sum or average of these differences as the matching degree. After computing the matching degree of every fifth image area A5, processor 20 takes the fifth image area A5 with the highest matching degree as the second image area A2 matched with the connected component image (as shown in fig. 13c), thereby achieving accurate matching for each connected component image and obtaining the second image area A2 corresponding to each connected component image.
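The polling-frame search can be sketched as a sliding-window scan. Here the matching degree is implemented as the sum of absolute pixel differences (smaller is better), one of the measures the passage suggests; the function name `best_match` and the toy data are assumptions:

```python
import numpy as np

def best_match(window_img, template):
    """Slide a polling frame of the template's size over the larger
    fourth image area and return the top-left offset of the best
    fifth image area (lowest sum of absolute differences)."""
    H, W = window_img.shape
    h, w = template.shape
    best, best_pos = None, None
    for y in range(H - h + 1):          # line-by-line polling
        for x in range(W - w + 1):
            sad = np.abs(window_img[y:y + h, x:x + w].astype(int)
                         - template.astype(int)).sum()
            if best is None or sad < best:
                best, best_pos = sad, (y, x)
    return best_pos, best

# Template hidden at offset (1, 2) inside a 6x6 region.
region = np.zeros((6, 6), dtype=np.uint8)
tmpl = np.full((3, 2), 255, dtype=np.uint8)
region[1:4, 2:4] = 255
print(best_match(region, tmpl))  # ((1, 2), 0)
```

The window at the returned offset plays the role of the second image area A2 in the passage above.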
Referring to fig. 2, 3 and 14, in some embodiments, step 014 includes:
0141: acquiring pixels corresponding to the connected domain in the second image region to generate a comparison image;
0142: comparing the pair image with the connected domain image to generate a differential image;
0143: and detecting the defects of the image to be detected according to the differential image.
In some embodiments, the comparison module 14 is further configured to obtain pixels corresponding to the connected domain in the second image region to generate a comparison image; compare the comparison image with the connected domain image to generate a differential image; and detect the defects of the image to be detected according to the differential image. That is, step 0141, step 0142 and step 0143 may be performed by the comparison module 14.
In some embodiments, processor 20 is further configured to obtain pixels corresponding to the connected domain in the second image region to generate a comparison image; compare the comparison image with the connected domain image to generate a differential image; and detect the defects of the image to be detected according to the differential image. That is, step 0141, step 0142 and step 0143 may be performed by processor 20.
Specifically, after the second image area matched with the connected domain image is acquired, the positions of the pixels corresponding to the connected domain in the second image area A2 can be determined from the position of the connected domain in the connected domain image, and the comparison image P3 is generated accordingly. For example, the pixels of the second image area A2 at positions corresponding to the connected domain may be set to 255 and the pixels at other positions of the second image area A2 set to 0, generating the comparison image P3.
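One plausible reading of this step — keeping, at the connected-domain positions, only those pixels that are also line pixels in A2, so that a missing line leaves a gap in P3 — can be sketched as follows. The masking interpretation and the name `make_comparison_image` are assumptions, not the patent's exact procedure:

```python
import numpy as np

def make_comparison_image(second_area, domain_mask):
    """Build the comparison image P3: positions that belong to the
    connected domain AND are line pixels in the second image area A2
    are set to 255; everything else is set to 0."""
    out = np.zeros_like(second_area)
    out[(domain_mask > 0) & (second_area > 0)] = 255
    return out

mask = np.array([[255, 255, 255],
                 [0,   0,   255]], dtype=np.uint8)   # connected domain
area = np.array([[255, 0,   255],                    # middle pixel missing
                 [0,   0,   255]], dtype=np.uint8)   # second image area A2
print(make_comparison_image(area, mask))  # gap survives at (0, 1)
```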
Fig. 15a and 15b show the second image area A2 without and with a line-missing defect, respectively, and fig. 16a and 16b are the comparison images P3 corresponding to fig. 15a and 15b. Processor 20 differences the comparison image P3 against the corresponding connected domain image: it obtains the pixel value difference at each corresponding position of the comparison image P3 and the connected domain image, and generates a differential image from the absolute values of these differences (for example, fig. 17a and 17b are the differential images P4 corresponding to fig. 15a and 15b, respectively). The differential image represents the difference between the comparison image P3 (corresponding to a circuit area in the image to be detected) and the connected domain image (corresponding to a circuit area in the second template), thereby realizing the detection of each circuit area in the image to be detected.
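The differencing step is a per-pixel absolute difference; nonzero pixels in the result mark where P3 and the connected domain image disagree. A minimal sketch (the name `diff_image` is assumed):

```python
import numpy as np

def diff_image(comparison, domain_img):
    """Differential image P4: absolute per-pixel difference between
    the comparison image P3 and the connected domain image."""
    return np.abs(comparison.astype(int) - domain_img.astype(int)).astype(np.uint8)

p3 = np.array([[255, 0],
               [0,   0]], dtype=np.uint8)   # comparison image with a gap
dom = np.array([[255, 255],
                [0,   0]], dtype=np.uint8)  # connected domain image
print(diff_image(p3, dom))  # [[  0 255]
                            #  [  0   0]]
```

Intermediate values are cast to int before subtracting so that uint8 wrap-around cannot corrupt the difference.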
Referring to fig. 17b, in one embodiment, processor 20 identifies connected domains in the differential image; for example, the portion where pixels with values greater than 0 (e.g., 255) are located forms a connected domain, and that connected domain corresponds to a missing part of a line (specifically, a line region) in the image to be detected. Processor 20 may determine the position and area of the missing line portion from the position and area of the connected domain. It will be appreciated that when the area of a connected domain is small (e.g., smaller than a predetermined area threshold such as 5, 7 or 10 pixels), that portion may be mere noise rather than a missing line, and the line region can be determined to be defect-free. When the area of the connected domain is large (e.g., greater than or equal to the predetermined area threshold), the connected domain can be determined to be a line-region defect. In this way, the position and area of the defect in each line region can be accurately determined, realizing defect detection of the image to be detected.
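The noise-versus-defect decision reduces to an area threshold. A minimal sketch (the helper name `classify_regions` and the example areas are assumptions):

```python
def classify_regions(region_areas, area_threshold=7):
    """Connected domains in the differential image smaller than the
    predetermined area threshold are treated as noise; the rest are
    reported as line-region defects."""
    return ["defect" if a >= area_threshold else "noise"
            for a in region_areas]

# Areas (in pixels) of three connected domains found in a differential image.
print(classify_regions([3, 12, 7]))  # ['noise', 'defect', 'defect']
```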
Referring to fig. 18, an embodiment of the present disclosure provides one or more non-transitory computer-readable storage media 300 containing a computer program 302; when the computer program 302 is executed by one or more processors 20, the processors 20 are caused to perform the detection method of any of the above embodiments.
For example, referring to fig. 1-3, the computer program 302, when executed by the one or more processors 20, causes the processors 20 to perform the steps of:
011: matching an image to be detected with a preset template image to obtain a first image area;
012: identifying a connected component of the first image region to generate a connected component image;
013: acquiring a second image area corresponding to the connected domain image in the image to be detected;
014: and comparing the second image area with the connected domain image to detect the defects of the image to be detected.
As another example, referring to fig. 2, 3 and 5, when the computer program 302 is executed by one or more processors 20, the processors 20 may further perform the steps of:
0121: one or more pixels of the first image region having a pitch less than the predetermined pitch are identified as the same connected component to generate a connected component image.
In the description herein, references to the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" and the like mean that a particular feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples, and the various embodiments or examples and their features described in this specification can be combined by those skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present application.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (12)

1. A method of detection, comprising:
matching the image to be detected with a preset template image to obtain a first image area;
identifying a connected component of the first image region to generate a connected component image;
acquiring a second image area corresponding to the connected domain image in the image to be detected;
and comparing the second image area with the connected domain image to detect the defects of the image to be detected.
2. The method according to claim 1, wherein the template image includes a first template and a second template, the first template is an image of a minimal repeating unit of the object to be tested, the second template is a circuit image of the minimal repeating unit, and the matching of the image to be tested and a preset template image to obtain a first image area includes:
matching the image to be detected with the first template to obtain a target image area;
and acquiring an image area corresponding to the target image area in the second template as the first image area based on the mapping relation between the first template and the second template.
3. The detection method of claim 2, wherein said first template comprises 4 images of said minimal repeating unit arranged in a 2 x 2 matrix and said second template comprises 4 of said circuit images arranged in a 2 x 2 matrix.
4. The detection method according to claim 2, further comprising:
labeling contours of lines in the first template to generate the second template; or
And acquiring a preset circuit diagram, and generating the second template according to the circuit diagram.
5. The detection method according to claim 1, wherein the identifying the connected component of the first image region to generate a connected component image comprises:
identifying one or more pixels in the first image region having a pitch less than a predetermined pitch as the same connected component to generate the connected component image.
6. The detection method according to claim 1, wherein the obtaining a second image region corresponding to the connected component image in the image to be detected includes:
enlarging the connected domain image to generate a third image region;
binarizing the image to be detected to generate a binarized image to be detected;
acquiring a fourth image area corresponding to the third image area in the binarized image to be detected;
and matching the connected domain image with the fourth image area to obtain the second image area matched with the connected domain image in the fourth image area.
7. The detection method according to claim 6, wherein the matching the connected component image and the fourth image region to obtain the second image region matched with the connected component image in the fourth image region comprises:
acquiring a plurality of fifth image areas with the same size as the connected domain images in the fourth image area;
matching the fifth image area with the connected domain image to obtain the matching degree of each fifth image area and the connected domain image;
acquiring the fifth image area with the highest matching degree; and
generating the second image region from the fifth image region.
8. The method according to claim 1, wherein the comparing the second image region with the connected domain image to detect the defect of the image to be detected comprises:
acquiring pixels corresponding to the connected domain in the second image area to generate a comparison image;
differentiating the comparison image and the connected domain image to generate a differential image;
and detecting the defects of the image to be detected according to the differential image.
9. The detection method according to claim 8, wherein the detecting the defects of the image to be detected according to the differential image comprises:
identifying the connected domain in the difference image;
and determining whether the connected domain image has defects and the positions of the defects according to the positions and the areas of the connected domains.
10. A detection device, comprising:
the matching module is used for matching the image to be detected with a preset template image to acquire a first image area;
the identification module is used for identifying the connected domain of the first image area so as to generate a connected domain image;
the acquisition module is used for acquiring a second image area corresponding to the connected domain image in the image to be detected; and
and the comparison module is used for comparing the second image area with the connected domain image so as to detect the defects of the image to be detected.
11. A detection device, comprising a processor configured to:
matching an image to be detected with a preset template image to obtain a first image area;
identifying a connected domain of the first image region to generate a connected domain image;
acquiring a second image area corresponding to the connected domain image in the image to be detected;
and comparing the second image area with the connected domain image to detect the defects of the image to be detected.
12. A non-transitory computer-readable storage medium storing a computer program that, when executed by one or more processors, causes the processors to perform the detection method of any one of claims 1 to 9.
CN202110557047.8A 2021-05-21 2021-05-21 Detection method and device, detection equipment and storage medium Pending CN115375608A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110557047.8A CN115375608A (en) 2021-05-21 2021-05-21 Detection method and device, detection equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115375608A true CN115375608A (en) 2022-11-22

Family

ID=84058392

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110557047.8A Pending CN115375608A (en) 2021-05-21 2021-05-21 Detection method and device, detection equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115375608A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116342589A (en) * 2023-05-23 2023-06-27 之江实验室 Cross-field scratch defect continuity detection method and system
CN116342589B (en) * 2023-05-23 2023-08-22 之江实验室 Cross-field scratch defect continuity detection method and system
CN117237347A (en) * 2023-11-14 2023-12-15 深圳思谋信息科技有限公司 PCB defect detection method and device, storage medium and electronic equipment
CN117237347B (en) * 2023-11-14 2024-03-29 深圳思谋信息科技有限公司 PCB defect detection method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination