CN115375610A - Detection method and device, detection equipment and storage medium - Google Patents


Info

Publication number
CN115375610A
CN115375610A (application CN202110558116.7A)
Authority
CN
China
Prior art keywords
image
area
area image
detected
region
Prior art date
Legal status
Pending
Application number
CN202110558116.7A
Other languages
Chinese (zh)
Inventor
陈鲁
肖遥
佟异
张鹏斌
张嵩
Current Assignee
Shenzhen Zhongke Feice Technology Co Ltd
Original Assignee
Shenzhen Zhongke Feice Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Zhongke Feice Technology Co Ltd filed Critical Shenzhen Zhongke Feice Technology Co Ltd
Priority to CN202110558116.7A
Publication of CN115375610A
Legal status: Pending

Classifications

    • G PHYSICS → G06 COMPUTING; CALCULATING OR COUNTING → G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis → G06T 7/0002 Inspection of images, e.g. flaw detection → G06T 7/0004 Industrial image inspection
    • G06T 7/00 Image analysis → G06T 7/10 Segmentation; Edge detection → G06T 7/11 Region-based segmentation
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement → G06T 2207/10 Image acquisition modality → G06T 2207/10004 Still image; Photographic image
    • G06T 2207/00 → G06T 2207/20 Special algorithmic details → G06T 2207/20021 Dividing image into blocks, subimages or windows
    • G06T 2207/00 → G06T 2207/30 Subject of image; Context of image processing → G06T 2207/30108 Industrial image inspection → G06T 2207/30164 Workpiece; Machine component

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The application provides a detection method, a detection apparatus, detection equipment, and a non-volatile computer-readable storage medium. The detection method comprises: obtaining, in a preset template, a first area image corresponding to an image to be detected; identifying a line area of the image to be detected to generate a second area image; obtaining, in the first area image, a third area image corresponding to the second area image; and comparing the second area image with the third area image to detect defects in the image to be detected. By precisely matching each part to be detected within the image after the image has been matched as a whole, the method improves the matching between the image to be detected and the preset template; each part to be detected is then compared with its corresponding third area image to determine its defects, completing defect detection of the image to be detected with a good detection result.

Description

Detection method and device, detection equipment and storage medium
Technical Field
The present application relates to the field of detection technologies, and in particular, to a detection method, a detection apparatus, a detection device, and a non-volatile computer-readable storage medium.
Background
When a workpiece is inspected for defects, template matching is used: the image of the workpiece's region to be detected is matched with the corresponding region in a standard template, completing an overall match of the region to be detected. However, the accuracy of this overall match cannot guarantee that each individual part within the region to be detected is matched accurately, so detection accuracy suffers.
Disclosure of Invention
The application provides a detection method, a detection device and a non-volatile computer readable storage medium.
The detection method comprises: obtaining, in a preset template, a first area image corresponding to an image to be detected; identifying a line area of the image to be detected to generate a second area image; obtaining, in the first area image, a third area image corresponding to the second area image; and comparing the second area image with the third area image to detect defects in the image to be detected.
The detection apparatus comprises a first acquisition module, an identification module, a second acquisition module, and a comparison module. The first acquisition module is used for obtaining, in a preset template, a first area image corresponding to an image to be detected; the identification module is used for identifying a line region of the image to be detected to generate a second area image; the second acquisition module is used for obtaining, in the first area image, a third area image corresponding to the second area image; and the comparison module is used for comparing the second area image with the third area image to detect defects in the image to be detected.
The detection device of the embodiment of the application comprises a processor. The processor is used for acquiring a first area image corresponding to an image to be detected in a preset template; identifying a line area of the image to be detected to generate a second area image; acquiring a third area image corresponding to the second area image in the first area image; and comparing the second area image with the third area image to detect the defects of the image to be detected.
The non-volatile computer-readable storage medium of the embodiments of the application stores a computer program that, when executed by one or more processors, causes the processors to perform the detection method: obtaining, in a preset template, a first area image corresponding to an image to be detected; identifying a line area of the image to be detected to generate a second area image; obtaining, in the first area image, a third area image corresponding to the second area image; and comparing the second area image with the third area image to detect defects in the image to be detected.
In the detection method, detection apparatus, detection equipment, and non-volatile computer-readable storage medium of the application, the image to be detected is first matched with the preset template to obtain a first area image that matches the image as a whole. The line regions in the image to be detected are then identified to obtain one or more second area images, and each second area image is matched again against the first area image to find its corresponding third area image in the preset template. This completes the precise matching of each part to be detected (corresponding to a second area image) in the image to be detected and improves the matching between the image to be detected and the preset template. Each part to be detected is then compared with its corresponding third area image to determine its defects, completing defect detection of the image to be detected with a good detection result.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
To more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic flow diagram of a detection method according to certain embodiments of the present application;
FIG. 2 is a block schematic diagram of a detection device according to certain embodiments of the present application;
FIG. 3 is a schematic plan view of a detection apparatus according to certain embodiments of the present application;
FIG. 4 is a schematic illustration of an image to be inspected according to certain embodiments of the present application;
FIG. 5 is a schematic illustration of a default template according to certain embodiments of the present application;
FIG. 6 is a schematic flow chart of a detection method according to certain embodiments of the present application;
FIG. 7 is a schematic illustration of a second default template according to some embodiments of the present application;
FIG. 8 is a schematic flow chart of a detection method according to certain embodiments of the present application;
FIG. 9 is a schematic illustration of a binarized image of an image to be examined according to certain embodiments of the present application;
FIG. 10 is a schematic illustration of an image of a line under test of an image to be inspected according to certain embodiments of the present application;
FIG. 11 is a schematic flow chart of a detection method according to certain embodiments of the present application;
FIGS. 12 and 13 are schematic illustrations of the detection method according to certain embodiments of the present application;
FIGS. 14a, 14b and 14c are schematic illustrations of the detection method of certain embodiments of the present application;
FIG. 15 is a schematic flow chart of a detection method according to certain embodiments of the present application;
FIGS. 16a, 16b, 17a and 17b are schematic illustrations of the detection method of certain embodiments of the present application; and
FIG. 18 is a schematic diagram of a connection between a processor and a computer-readable storage medium according to some embodiments of the present application.
Detailed Description
Embodiments of the present application will be further described below with reference to the accompanying drawings. The same or similar reference numbers in the drawings identify the same or similar elements or elements having the same or similar functionality throughout. In addition, the embodiments of the present application described below in conjunction with the accompanying drawings are exemplary and are only used for explaining the embodiments of the present application, and are not to be construed as limiting the present application.
Referring to fig. 1 to 3, the detection method according to the embodiment of the present disclosure includes the following steps:
011: acquiring a first area image corresponding to an image to be detected in a preset template;
012: identifying a line area of the image to be detected to generate a second area image;
013: acquiring a third area image corresponding to the second area image in the first area image;
014: and comparing the second area image with the third area image to detect the defect of the image to be detected.
The detection device 10 of the embodiment of the present application includes a first obtaining module 11, an identifying module 12, a second obtaining module 13, and a comparing module 14. The first obtaining module 11 is configured to obtain a first area image corresponding to an image to be detected in a preset template; the identification module 12 is used for identifying a line region of the to-be-detected image to generate a second region image; the second obtaining module 13 is configured to obtain a third area image corresponding to the second area image in the first area image; the comparison module 14 is used for comparing the second area image and the third area image to detect the defect of the image to be detected. That is, step 011 can be implemented by the first obtaining module 11, step 012 can be performed by the identifying module 12, step 013 can be performed by the second obtaining module 13, and step 014 can be performed by the comparing module 14.
The detection apparatus 100 of the present embodiment includes a processor 20. The processor 20 is configured to acquire a first region image corresponding to an image to be detected in a preset template; identifying a line area of the image to be detected to generate a second area image; acquiring a third area image corresponding to the second area image in the first area image; and comparing the second area image with the third area image to detect the defects of the image to be detected. That is, step 011, step 012, step 013, and step 014 may be performed by processor 20.
In particular, the detection device 100 may be a measuring machine. It is understood that the specific form of the inspection apparatus 100 is not limited to a measuring machine, and may be any apparatus capable of inspecting the object 200.
The detection apparatus 100 includes a processor 20, a motion platform 30, and a sensor 40. Both the processor 20 and the sensor 40 may be located on the motion platform 30. The moving platform 30 can be used to carry the object 200, and the moving platform 30 moves to drive the object 200 to move, so that the sensor 40 collects information of the object 200 (for example, collects an image to be detected of an area to be detected of the object 200).
For example, the motion platform 30 includes an XY motion platform 31 and a Z motion platform 32. The object 200 to be tested is disposed on the XY motion platform 31, and the sensor 40 is disposed on the Z motion platform 32. The XY motion platform 31 moves the object 200 within the horizontal plane, changing the relative position of the object 200 and the sensor 40 in that plane, while the Z motion platform 32 moves the sensor 40 perpendicular to the horizontal plane. Through the cooperation of the XY motion platform 31 and the Z motion platform 32, the three-dimensional position of the sensor 40 relative to the object 200 (i.e., their relative position both in and perpendicular to the horizontal plane) can be adjusted. Alternatively, the roles may be exchanged, with the XY motion platform 31 moving the sensor 40 in the horizontal plane and the Z motion platform 32 moving the object 200 perpendicular to it.
It is understood that the motion platform 30 is not limited to the above structure; it only needs to be able to change the three-dimensional position of the sensor 40 relative to the object 200 to be tested.
The sensor 40 may be a visible-light camera, a depth camera, a distance sensor, or the like. In the present embodiment, the sensor 40 is a visible-light camera.
The sensor 40 acquires images of the object 200 to be tested. For example, the object 200 includes a plurality of areas to be inspected; the sensor 40 captures an image of at least a partial area of the object 200 at a time, and, in cooperation with the motion platform 30, images of all areas of the object 200 are acquired.
When the processor 20 processes an image to be detected, it first obtains a preset template matched to that image. For example, the preset template may be selected according to parameter information of the object 200 to be tested (such as its type, model, and the like); the object 200 may be a wafer, a display panel, or the like.
The processor 20 then matches the image to be detected with the preset template. Matching means finding the region in the preset template that is substantially the same as the image to be detected. The preset template can be traversed with a traversal frame of the same size as the image to be detected; at each traversal position, the similarity between the region image inside the frame and the image to be detected is calculated. The similarity can be determined from the pixel-value differences at corresponding positions of the two images, for example their sum, average, or variance. After the whole preset template has been traversed, a set of similarity scores is obtained; the region image with the greatest similarity is taken as matching the image to be detected, and that region image is the first area image.
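The traversal described above can be sketched as a brute-force sliding-window search. The following is a minimal illustration, not the patent's actual implementation: it uses the sum of squared pixel differences (one of the measures the text mentions, where smaller means more similar) and returns the top-left corner of the best-matching region in the template.

```python
import numpy as np

def match_template(image, template):
    """Slide a frame the size of `image` over `template` and return the
    position (top-left corner) and score of the most similar region.
    Score is the sum of squared pixel differences: smaller is better."""
    ih, iw = image.shape
    th, tw = template.shape
    best_score, best_pos = None, None
    for y in range(th - ih + 1):
        for x in range(tw - iw + 1):
            window = template[y:y + ih, x:x + iw]
            score = np.sum((window.astype(float) - image.astype(float)) ** 2)
            if best_score is None or score < best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score
```

In practice a library routine (e.g. a normalized cross-correlation template matcher) would replace the Python loops, but the logic of traversing the preset template and keeping the most similar region image is the same.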
It can be understood that, referring to fig. 4, the positioning accuracy of the sensor 40 cannot guarantee that exactly one minimal repeating unit Q is captured each time, so that the image to be detected P1 covers exactly one minimal repeating unit Q; instead, the field of view of the sensor 40 may span the channel H and capture portions of several minimal repeating units Q at once.
In other embodiments, the object 200 to be detected has a positioning mark O for matching, and the processor 20 first identifies the image P1 to be detected and the positioning mark O in the preset template, and then determines the first area image through matching of the positioning mark O. For example, the positioning mark O is the center of a feature circle in the workpiece 200, and the centers of the feature circles in the image P1 to be inspected are matched with the centers of the feature circles in the preset template, so that the centers of the feature circles are aligned, thereby determining the first area image.
Referring to fig. 5, the preset template P2 may include images of several standard minimal repeating units Q. Since the field of view of the sensor 40 covers about one minimal repeating unit Q, the sensor 40 can, when spanning the channels, simultaneously capture portions of at most 4 minimal repeating units Q. The preset template P2 therefore includes standard images of 4 minimal repeating units Q arranged in a 2 × 2 matrix, ensuring that whatever part of the object 200 the sensor 40 captures, a matching first region image A1 can be found in the preset template P2 (such as the first region image A1 shown in fig. 4, which exists in the preset template P2 for the image to be inspected P1).
After the image to be detected P1 has been matched with the preset template P2 to obtain the first area image A1, the overall matching of the image to be detected P1 with the preset template P2 is complete. The processor 20 may then identify the line regions in the image to be detected P1 to generate one or more second region images.
A circuit (line) area is an area formed by pixels whose values exceed a preset threshold. It can be understood that the color of the circuit on a wafer differs greatly from that of the wafer substrate: the substrate is close to black, while the circuit is not black but red, yellow, and so on. The image to be detected collected by the sensor 40 may be a grayscale image, in which the circuit portions appear close to white. Pixels belonging to the circuit can therefore be identified by setting a preset threshold (such as 100, 150, etc.), and adjacent circuit pixels are grouped into the same circuit area, so that one or more circuit areas are identified. The circuit areas are the areas containing the circuits that need defect detection, and one or more second area images can be generated from the pixels of the circuit areas; alternatively, a second area image can be generated from the area image covered by the circumscribed rectangle of each circuit area.
After the second region images in the image to be detected are obtained, each second region image can be matched against the preset template again to find the third region image that matches it. The matching method can be similar to that used between the image to be detected and the preset template, and is not repeated here. In this way, every line region in the image to be detected is accurately matched with a corresponding third region image in the preset template, improving matching accuracy.
Finally, the processor 20 compares the third area image with the second area image, and detects the defect of each third area image according to the difference between the third area image and the second area image. For example, a difference image may be generated from pixel value differences at corresponding positions of the third region image and the second region image, and then the difference between the third region image and the second region image may be determined from the difference image, thereby determining the defect of each third region image.
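The difference-image comparison can be illustrated with a short sketch. This is a simplified stand-in for whatever comparison rule an implementation would use; the difference threshold value is illustrative, not from the patent.

```python
import numpy as np

def defect_mask(second, third, diff_threshold=30):
    """Compare the inspected region image (`second`) with its matched
    template region image (`third`): pixels whose absolute grey-value
    difference exceeds `diff_threshold` are flagged as defect
    candidates. Returns a boolean mask of the same shape."""
    diff = np.abs(second.astype(int) - third.astype(int))
    return diff > diff_threshold
```

A region with no flagged pixels would be considered defect-free; clusters of flagged pixels mark the location and extent of a defect.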
In the detection method, detection apparatus 10, and detection device 100 of this application, the image to be detected is matched with the preset template to obtain a first area image that matches it as a whole; the line regions in the image to be detected are then identified to obtain one or more second area images; each second area image is matched again with the first area image to find its corresponding third area image in the preset template. This completes the precise matching of every part to be detected (each corresponding to a second area image) in the image to be detected and improves the matching between the image to be detected and the preset template. Each part to be detected is then compared with its corresponding third area image to determine its defects, completing defect detection of the image to be detected with a good detection result.
Referring to fig. 2, 3 and 6, in some embodiments, the preset templates include a first preset template and a second preset template, the first preset template is an image of a minimum repeating unit of the device under test 200, the second preset template is a line image of the minimum repeating unit, and the step 011 includes:
0111: matching the image to be detected with a first preset template to obtain a target area image;
0112: and acquiring a region image corresponding to the target region image in the second preset template as a first region image based on the mapping relation between the first preset template and the second preset template.
In some embodiments, the first obtaining module 11 is further configured to match the image to be detected with a first preset template to obtain an image of the target area; and acquiring an area image corresponding to the target area image in the second preset template as a first area image based on the mapping relation between the first preset template and the second preset template. That is, step 0111 and step 0112 may be performed by the first acquisition module 11.
In some embodiments, the processor 20 is further configured to match the image to be inspected with a first preset template to obtain an image of the target area; and acquiring a region image corresponding to the target region image in the second preset template as a first region image based on the mapping relation between the first preset template and the second preset template. That is, step 0111 and step 0112 may be performed by processor 20.
Specifically, referring to fig. 5 and 7, the preset template P2 includes a first preset template P21 and a second preset template P22. The first preset template P21 is an image (such as a grayscale image) of the minimal repeating unit Q, and the second preset template P22 is a line image of the minimal repeating unit Q, containing only the line portions of the minimal repeating unit Q. Both templates include images corresponding to 4 minimal repeating units Q arranged in a 2 × 2 matrix, ensuring that the image to be detected has a matching region image in both the first preset template P21 and the second preset template P22.
When the image to be detected is matched with the preset template P2, the acquired image is generally a grayscale image, so it is first matched with the first preset template P21 to determine a target area image (such as the target area image a11 shown in fig. 5); the matching follows the procedure described for the image to be detected and the preset template P2 in the previous embodiment. The first preset template P21 and the second preset template P22 have a preset mapping relationship: they are the same size, and the images at corresponding positions correspond to the same part of the standard object 200 to be tested. Once the target area image a11 in the first preset template P21 is determined, the corresponding area image a12 (the area inside the white frame in fig. 7) can be quickly found in the second preset template P22 through this mapping. Since it is the line defects that are to be detected, the area image in the second preset template P22 corresponding to the target area image a11 is taken as the first area image A1, so that the line areas in the image to be detected can be matched more accurately and the accuracy of line defect detection is improved.
The second preset template P22 may be generated from the first preset template P21, for example by labeling the line contours in the first preset template P21 to determine the area images where the lines are located. Alternatively, since the lines of the object 200 are designed from a preset line drawing, such as a Computer Aided Design (CAD) line drawing, the second preset template P22 can be generated from the CAD drawing: because real lines have width, the lines in the CAD drawing can be thickened so that their thickness in the second preset template P22 matches that of the actual lines.
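Thickening a thin CAD line to the width of the real line is essentially a morphological dilation. The following is a minimal sketch under that assumption (4-neighbour dilation on a boolean line image; the number of passes controlling the added width is illustrative):

```python
import numpy as np

def thicken(line_img, passes=1):
    """Thicken a thin binary line drawing by repeated 4-neighbour
    dilation: each pass grows every line pixel into its up/down/
    left/right neighbours, widening the line by one pixel per side."""
    img = line_img.astype(bool)
    for _ in range(passes):
        grown = img.copy()
        grown[1:, :] |= img[:-1, :]   # grow downward
        grown[:-1, :] |= img[1:, :]   # grow upward
        grown[:, 1:] |= img[:, :-1]   # grow rightward
        grown[:, :-1] |= img[:, 1:]   # grow leftward
        img = grown
    return img
```

An image-processing library's dilation routine with a suitable structuring element would do the same job; the point is only that the CAD line widths can be brought into agreement with the imaged line widths before matching.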
Referring to fig. 2, 3 and 8, in some embodiments, step 012 includes:
0121: binarizing an image to be detected to generate a binary image;
0122: identifying one or more pixels with the distance smaller than the preset distance in the binary image as the same line region; and
0123: and generating a second area image according to the line area.
In some embodiments, the identification module 12 is further configured to binarize the image to be detected to generate a binarized image; identifying one or more pixels with the distance smaller than the preset distance in the binary image as the same line region; and generating a second area image according to the line area. That is, step 0121, step 0122 and step 0123 may be performed by the identification module 12.
In some embodiments, the processor 20 is further configured to binarize the image to be detected to generate a binarized image; identifying one or more pixels with the distance smaller than the preset distance in the binary image as the same line region; and generating a second area image according to the line area. That is, steps 0121, 0122 and 0123 may be executed by processor 20.
Specifically, referring to fig. 9, when determining a line region M, adjacent pixels in the image to be detected P1 that belong to the line region and whose pitch (the distance between the centers of two pixels) is smaller than a predetermined pitch (e.g., 1 pixel, 2 pixels, etc.) are treated as pixels of the same line region M. Taking rectangular pixels as an example: with a predetermined pitch of 1 pixel, only line-region pixels that share an edge are grouped into the same line region M; with a predetermined pitch of 2 pixels, line-region pixels whose corners touch are also grouped into the same line region M. In this way the pixels of each line region M can be determined quickly.
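This grouping is a connected-component labeling where the predetermined pitch selects the connectivity. A minimal sketch under that reading (pitch 1 as 4-connectivity, pitch 2 as 8-connectivity; threshold and pitch values are illustrative):

```python
def find_line_regions(gray, threshold=100, pitch=1):
    """Group pixels brighter than `threshold` into line regions.
    pitch=1 joins only edge-adjacent pixels (4-connectivity);
    pitch=2 also joins corner-adjacent pixels (8-connectivity).
    Returns a list of regions, each a list of (y, x) pixels."""
    h, w = len(gray), len(gray[0])
    if pitch <= 1:
        nbrs = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    else:
        nbrs = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if (dy, dx) != (0, 0)]
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if gray[y][x] > threshold and not seen[y][x]:
                stack, region = [(y, x)], []
                seen[y][x] = True
                while stack:  # flood fill from this seed pixel
                    cy, cx = stack.pop()
                    region.append((cy, cx))
                    for dy, dx in nbrs:
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and gray[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                regions.append(region)
    return regions
```

With pitch 1, two diagonally touching pixels fall into different line regions; with pitch 2 they merge into one, matching the corner-adjacency case described above.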
It can be understood that, to determine the line regions M in the image to be detected P1 more conveniently, the image may first be binarized according to a preset threshold to obtain a binarized image (such as the image to be detected P1 shown in fig. 9): pixels whose value exceeds the preset threshold are classified as line-region pixels and set to 255, while pixels whose value is less than or equal to the threshold are classified as non-line pixels and set to 0, so that the line regions M in the image to be detected are accurately identified.
The preset threshold may be a preset fixed value, such as 100, 150, etc.; or, since the image to be detected may be acquired in different illumination environments, the preset threshold may be determined according to the gray value distribution of the image to be detected P1, so as to determine the preset threshold adapted to each image to be detected P1, and improve the detection accuracy of the line region M.
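The binarization step, with either a fixed or a data-derived threshold, can be sketched as follows. The mean-based fallback here is only an illustrative stand-in for whatever rule an implementation derives from the grey-value distribution:

```python
import numpy as np

def binarize(gray, threshold=None):
    """Binarize an image: line pixels become 255, background 0.
    With threshold=None, derive the threshold from the grey-value
    distribution (here simply the image mean, as an illustrative
    adaptive rule for images taken under varying illumination)."""
    if threshold is None:
        threshold = gray.mean()
    return np.where(gray > threshold, 255, 0).astype(np.uint8)
```

A histogram-based method such as Otsu's would be a more robust choice of adaptive threshold, but any rule that separates the bright line pixels from the dark substrate serves the same purpose.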
It can be understood that, to match the line drawing of the second preset template better, after the binarized image of the image to be detected P1 and the line regions M have been determined, the line regions M may be processed further so that the line regions in the binarized image also form a line drawing, from which the second region image A2 is generated. Specifically, the edges of each line region M in the binarized image are identified first. For example, an 8-bit binary number can be generated from the pixel values of the 8 pixels surrounding each pixel, where pixel value 255 maps to bit 1 and pixel value 0 maps to bit 0. The processor 20 then looks up this binary number in a preset lookup table to decide whether the corresponding pixel is an edge pixel (i.e., a pixel located at the edge of a line area).
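The 8-neighbour code can be sketched as below. The bit ordering and the simple "not all neighbours are foreground" edge rule are illustrative assumptions; the patent only says a lookup table maps each code to an edge/non-edge decision:

```python
def neighbor_code(binary, y, x):
    """Pack the 8 neighbours of pixel (y, x) into an 8-bit code,
    bit = 1 where the neighbour's value is 255, scanning clockwise
    from the top-left neighbour. Out-of-bounds neighbours count as 0."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = len(binary), len(binary[0])
    code = 0
    for dy, dx in offsets:
        ny, nx = y + dy, x + dx
        bit = 1 if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] == 255 else 0
        code = (code << 1) | bit
    return code

def is_edge_pixel(binary, y, x):
    """A foreground pixel is an edge pixel unless every one of its
    8 neighbours is also foreground (code 255). A real implementation
    would consult a precomputed lookup table indexed by the code."""
    return binary[y][x] == 255 and neighbor_code(binary, y, x) != 255
```

Precomputing the 256-entry table once and indexing it with the code gives the constant-time per-pixel test the text describes.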
Referring to fig. 10, the processor 20 may determine the edge contour of each line region M according to the edge pixels in the binarized image, determine a corresponding pixel width in the binarized image according to the actual thickness of the line or the pixel width of the lines of the second preset template, and then process the edge contour so that the line width in each line region M is adjusted to that pixel width, thereby generating a line image P3 to be detected. The second area image A2 is then generated according to the adjusted line region M in the line image P3 to be detected, for example by taking the image region where the line region M is located as the second area image A2, so that the second area image A2 can be better matched with the first area image in the second preset template.
In other embodiments, the processor 20 generates the second area image A2 according to the adjusted line area M, which may specifically be: the processor 20 cuts out the area image where the minimum circumscribed rectangle N of the line area M is located, so as to serve as the second area image A2 corresponding to the line area M, thereby facilitating subsequent comparison between the second area image A2 and the first area image, so as to determine the third area image.
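Cropping the minimum circumscribed rectangle N of the line region M can be sketched as follows (illustrative; assumes an axis-aligned rectangle and a 0/255 binarized image):

```python
import numpy as np

def crop_min_bounding_rect(binary):
    """Crop the axis-aligned minimum circumscribed rectangle N of the line
    region M (foreground pixels == 255), to serve as the second area image A2."""
    ys, xs = np.nonzero(binary == 255)
    if ys.size == 0:
        return binary[:0, :0]  # no line region found
    return binary[ys.min():ys.max() + 1, xs.min():xs.max() + 1]

# A 2x4 line region inside a 6x6 image crops to a 2x4 second area image.
img = np.zeros((6, 6), dtype=np.uint8)
img[2:4, 1:5] = 255
a2 = crop_min_bounding_rect(img)
```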
Referring to fig. 2, 3 and 11, in some embodiments, step 013 includes:
0131: enlarging the second area image;
0132: acquiring a fourth area image corresponding to the expanded second area image in the first area image;
0133: and matching the second area image with the fourth area image to acquire a third area image matched with the second area image in the fourth area image.
In some embodiments, the second acquiring module 13 is further configured to enlarge the second area image; acquire a fourth area image corresponding to the expanded second area image in the first area image; and match the second area image with the fourth area image to acquire a third area image matched with the second area image in the fourth area image. That is, step 0131, step 0132 and step 0133 may be performed by the second acquiring module 13.
In some embodiments, the processor 20 is further configured to enlarge the second area image; acquire a fourth area image corresponding to the expanded second area image in the first area image; and match the second area image with the fourth area image to acquire a third area image matched with the second area image in the fourth area image. That is, step 0131, step 0132 and step 0133 may be executed by the processor 20.
Specifically, referring to fig. 7 and 11, the area image where the minimum circumscribed rectangle N of the line area M of the image to be detected P1 is located is taken as the second area image A2. Since the image to be detected P1 and the first area image A1 are aligned after matching, the second area image A2 has a corresponding area image in the first area image A1. However, since the line area M in the image to be detected P1 may have a position deviation (e.g., it cannot be completely aligned with the standard line), if the minimum circumscribed rectangle N of the line area M is matched directly with the first area image A1, part of the corresponding line in the first area image A1 may fall outside the area image corresponding to the minimum circumscribed rectangle N. Therefore, referring to fig. 12 and 13, the processor 20 first needs to enlarge the minimum circumscribed rectangle N, for example by a predetermined multiple, the predetermined multiple being the factor by which the side lengths of the circumscribed rectangle are enlarged, such as 1.1, 1.2, 1.5, and the like. For example, if the minimum circumscribed rectangle N is 10 × 10 pixels and the predetermined multiple is 1.2, the enlarged circumscribed rectangle K is 12 × 12 pixels, so that even if the line has a deviation, the fourth area image A4 corresponding to the image area where the circumscribed rectangle K is located in the first area image A1 can still cover the line.
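The enlargement of the circumscribed rectangle by a predetermined multiple can be sketched as follows (illustrative; the centre-preserving placement and the clipping to image bounds are assumptions, not specified by the patent):

```python
def enlarge_rect(x, y, w, h, factor=1.2, bounds=None):
    """Enlarge a bounding rectangle's side lengths by `factor`, keeping the
    same centre, and clip to the image bounds (width, height) if given.
    The patent names factors such as 1.1, 1.2, 1.5."""
    new_w, new_h = round(w * factor), round(h * factor)
    new_x = x - (new_w - w) // 2
    new_y = y - (new_h - h) // 2
    if bounds is not None:
        img_w, img_h = bounds
        new_x = max(0, min(new_x, img_w - new_w))
        new_y = max(0, min(new_y, img_h - new_h))
    return new_x, new_y, new_w, new_h

# A 10x10 rectangle enlarged by 1.2 becomes 12x12, as in the example above.
rect_k = enlarge_rect(5, 5, 10, 10, 1.2)  # -> (4, 4, 12, 12)
```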
Then, the processor 20 matches the second area image A2 with the fourth area image A4. Since the fourth area image A4 is larger than the second area image A2, a plurality of fifth area images A5 of the same size as the second area image A2 may first be obtained from the fourth area image A4 during matching. Specifically, referring to figs. 14a, 14b and 14c, the processor 20 may poll the fourth area image A4 with a polling frame of the same size as the second area image A2, moving the polling frame line by line until the entire fourth area image A4 has been polled, thereby obtaining the plurality of fifth area images A5. The processor 20 then matches each fifth area image A5 with the second area image A2 to calculate its matching degree, for example by taking the differences of the pixel values at corresponding positions of the fifth area image A5 and the second area image A2 and using the sum, average, or the like of all the pixel-value differences as the matching degree. After the matching degree corresponding to each fifth area image A5 has been calculated, the processor 20 takes the fifth area image A5 with the highest matching degree as the third area image A3 matched with the second area image A2 (as in fig. 14c), thereby achieving precise matching of each second area image A2 and obtaining the third area image A3 corresponding to each second area image A2.
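The polling-frame matching can be sketched as an exhaustive sliding-window search (illustrative; here the sum of absolute pixel differences is used as the score, a lower score corresponding to a higher matching degree):

```python
import numpy as np

def best_match(a2, a4):
    """Slide a polling frame the size of A2 over A4 line by line, score each
    candidate fifth area image A5 by the sum of absolute pixel differences,
    and return the best-matching A5 and its top-left offset within A4."""
    th, tw = a2.shape
    H, W = a4.shape
    best_score, best_pos = None, None
    for y in range(H - th + 1):
        for x in range(W - tw + 1):
            a5 = a4[y:y + th, x:x + tw]
            score = np.abs(a5.astype(int) - a2.astype(int)).sum()
            if best_score is None or score < best_score:
                best_score, best_pos = score, (y, x)
    y, x = best_pos
    return a4[y:y + th, x:x + tw], best_pos

# The 2x2 "line" sits offset inside A4; the search recovers its position.
a4 = np.zeros((6, 6), dtype=np.uint8)
a4[2:4, 3:5] = 255
a2 = np.full((2, 2), 255, dtype=np.uint8)
a3, pos = best_match(a2, a4)
```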
Referring to fig. 2, 3 and 15, in some embodiments, step 014 includes:
0141: differentiating the second region image and the third region image to generate a difference image;
0142: and detecting the defects of the line region image according to the difference image.
In some embodiments, the comparison module 14 is further configured to perform a difference operation on the second region image and the third region image to generate a difference image; and detecting the defects of the line region image according to the difference image. That is, steps 0141 and 0142 may be performed by the comparison module 14.
In some embodiments, the processor 20 is further configured to difference the second region image and the third region image to generate a difference image; and detect the defects of the line region image according to the difference image. That is, steps 0141 and 0142 may be performed by the processor 20.
Specifically, figs. 16a and 16b show the second region image A2 without and with a line-missing defect, respectively, and figs. 17a and 17b are the difference images P4 corresponding to figs. 16a and 16b, respectively. Specifically, the pixel-value difference of the pixels at corresponding positions in the second area image A2 and the third area image A3 is obtained, and the difference image P4 is generated from the absolute value of the pixel-value difference. The difference image P4 can represent the difference between the second area image A2 (a line area in the image to be detected) and the third area image A3 (the corresponding line area in the second preset template), thereby realizing the detection of each line area in the image to be detected.
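The difference-image step can be sketched as follows (illustrative):

```python
import numpy as np

def difference_image(a2, a3):
    """Generate the difference image P4 as the absolute pixel-value
    difference between A2 and A3 at each corresponding position."""
    return np.abs(a2.astype(int) - a3.astype(int)).astype(np.uint8)

# One extra line pixel in A2 shows up as a single non-zero pixel in P4.
a2 = np.array([[255, 0], [0, 0]], dtype=np.uint8)
a3 = np.array([[0, 0], [0, 0]], dtype=np.uint8)
p4 = difference_image(a2, a3)
```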
Referring to fig. 17b, in one embodiment, the processor 20 identifies connected domains in the difference image; for example, a portion of the difference image whose pixel values are greater than 0 (e.g., 255) forms a connected domain, which corresponds to an extra line portion in the image to be detected (specifically, in a line region). The processor 20 may determine the position and area of the extra line portion from the position and area of the connected domain, respectively. It can be understood that when the area of the connected domain is small (e.g., less than a preset area threshold, such as a preset area threshold of 5, 7, 10 pixels, etc.), the portion may be only noise rather than an extra line portion, and the line region may therefore be determined to be free of defects. When the area of the connected domain is larger (e.g., greater than or equal to the preset area threshold), the connected domain can be determined to be a defect of the line region. In this way, the position and area of the defect in each line region can be accurately determined, thereby realizing the defect detection of the image to be detected.
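The connected-domain analysis with an area threshold can be sketched as follows (illustrative; 4-connectivity and the BFS flood fill are implementation choices, not specified by the patent):

```python
import numpy as np
from collections import deque

def defect_components(diff, area_threshold=5):
    """Find connected domains of non-zero pixels in the difference image and
    keep those whose area reaches the threshold; smaller domains are treated
    as noise. Returns (area, (min_y, min_x, max_y, max_x)) per kept domain."""
    h, w = diff.shape
    visited = np.zeros(diff.shape, dtype=bool)
    defects = []
    for sy in range(h):
        for sx in range(w):
            if diff[sy, sx] == 0 or visited[sy, sx]:
                continue
            # BFS flood fill to collect one connected domain.
            queue, pixels = deque([(sy, sx)]), []
            visited[sy, sx] = True
            while queue:
                y, x = queue.popleft()
                pixels.append((y, x))
                for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and diff[ny, nx] and not visited[ny, nx]:
                        visited[ny, nx] = True
                        queue.append((ny, nx))
            if len(pixels) >= area_threshold:
                ys = [p[0] for p in pixels]
                xs = [p[1] for p in pixels]
                defects.append((len(pixels), (min(ys), min(xs), max(ys), max(xs))))
    return defects

# A 6-pixel blob is reported as a defect; an isolated pixel is ignored as noise.
diff = np.zeros((8, 8), dtype=np.uint8)
diff[1:4, 1:3] = 255
diff[6, 6] = 255
found = defect_components(diff, area_threshold=5)
```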
Referring to fig. 18, one or more non-transitory computer-readable storage media 300 containing a computer program 302 according to an embodiment of the present disclosure, when the computer program 302 is executed by one or more processors 20, enable the processors 20 to perform the detection method according to any of the embodiments described above.
For example, referring to fig. 1-3, the computer program 302, when executed by the one or more processors 20, causes the processors 20 to perform the steps of:
011: matching the image to be detected with a preset template to obtain a first area image;
012: identifying a line region of the image to be detected to generate a second region image;
013: acquiring a third area image corresponding to the second area image in the first area image;
014: and comparing the third area image with the second area image to detect the defects of the image to be detected.
As another example, referring to fig. 2, 3 and 5, when the computer program 302 is executed by one or more processors 20, the processors 20 may further perform the steps of:
0121: identifying one or more pixels whose spacing is smaller than a preset distance in the binarized image as the same line region to generate the second area image.
In the description of the present specification, reference to the description of "one embodiment", "some embodiments", "illustrative embodiments", "examples", "specific examples" or "some examples", etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, the various embodiments or examples and the features of the various embodiments or examples described in this specification can be combined by those skilled in the art without mutual inconsistency.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present application.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (12)

1. A method of detection, comprising:
acquiring a first area image corresponding to an image to be detected in a preset template;
identifying a line area of the image to be detected to generate a second area image;
acquiring a third area image corresponding to the second area image in the first area image;
and comparing the second area image with the third area image to detect the defects of the image to be detected.
2. The detection method according to claim 1, wherein the preset templates include a first preset template and a second preset template, the first preset template is an image of a minimal repeating unit of a to-be-detected piece, the second preset template is a circuit image of the minimal repeating unit, and the acquiring a first area image corresponding to the to-be-detected image in the preset template comprises:
matching the image to be detected with the first preset template to obtain a target area image;
and acquiring an area image corresponding to the target area image in the second preset template as the first area image based on the mapping relation between the first preset template and the second preset template.
3. The detection method according to claim 2, wherein the first preset template comprises 4 images of the minimal repeating unit arranged in a 2 × 2 matrix, and the second preset template comprises 4 circuit images of the minimal repeating unit arranged in a 2 × 2 matrix.
4. The detection method according to claim 2, further comprising:
labeling the contour of the line in the first preset template to generate a second preset template; or
And acquiring a preset circuit diagram, and generating the second preset template according to the circuit diagram.
5. The detection method according to claim 1, wherein the identifying a line area of the image to be detected to generate a second area image comprises:
binarizing the image to be detected to generate a binarized image;
identifying one or more pixels whose spacing is smaller than a preset distance in the binarized image as the same line region; and
and generating the second area image according to the line area.
6. The detection method according to claim 1, wherein the acquiring a third area image corresponding to the second area image in the first area image comprises:
enlarging the second area image;
acquiring a fourth area image corresponding to the enlarged second area image in the first area image;
and matching the second area image with the fourth area image to acquire the third area image matched with the second area image in the fourth area image.
7. The detection method according to claim 6, wherein the matching the second region image and the fourth region image to obtain the third region image in the fourth region image that matches the second region image comprises:
acquiring a plurality of fifth area images with the same size as the second area image in the fourth area image;
matching the fifth area image with the second area image to obtain the matching degree of each fifth area image with the second area image;
and acquiring the fifth area image with the highest matching degree as the third area image.
8. The detection method according to claim 1, wherein the comparing the second area image and the third area image to detect the defects of the image to be detected comprises:
differentiating the second region image and the third region image to generate a difference image;
and detecting the defects of the line region image according to the difference image.
9. The method of claim 8, wherein the detecting the defect of the line region image according to the difference image comprises:
identifying the connected components in the difference image;
and determining whether the line region image has defects and the positions of the defects according to the positions and the areas of the connected domains.
10. A detection device, comprising:
the first acquisition module is used for acquiring a first area image corresponding to the image to be detected in a preset template;
the identification module is used for identifying a line area of the to-be-detected image so as to generate a second area image;
the second acquisition module is used for acquiring a third area image corresponding to the second area image in the first area image; and
and the comparison module is used for comparing the second area image with the third area image so as to detect the defect of the image to be detected.
11. A detection device, comprising a processor configured to:
acquiring a first area image corresponding to an image to be detected in a preset template;
identifying a line area of the image to be detected to generate a second area image;
acquiring a third area image corresponding to the second area image in the first area image;
and comparing the second area image with the third area image to detect the defect of the image to be detected.
12. A non-transitory computer-readable storage medium storing a computer program that, when executed by one or more processors, causes the processors to perform the detection method of any one of claims 1 to 9.
CN202110558116.7A 2021-05-21 2021-05-21 Detection method and device, detection equipment and storage medium Pending CN115375610A (en)

Publication of CN115375610A: 2022-11-22. Legal status: pending.