CN112070766A - Defect detection method and device, detection equipment and readable storage medium


Info

Publication number
CN112070766A
Authority
CN
China
Prior art keywords
image
edge
point
line
boundary
Prior art date
Legal status
Granted
Application number
CN202011275270.5A
Other languages
Chinese (zh)
Other versions
CN112070766B (en)
Inventor
陈鲁
夏爱华
佟异
张嵩
Current Assignee
Shenzhen Zhongke Feice Technology Co Ltd
Original Assignee
Shenzhen Zhongke Flying Test Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Zhongke Flying Test Technology Co ltd
Priority to CN202011275270.5A
Publication of CN112070766A
Application granted
Publication of CN112070766B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/0004: Image analysis; inspection of images, e.g. flaw detection; industrial image inspection
    • G06T 5/70: Image enhancement or restoration; denoising; smoothing
    • G06T 7/13: Segmentation; edge detection
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/90: Determination of colour characteristics
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/25: Image preprocessing; determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V 10/44: Extraction of image or video features; local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Testing Or Measuring Of Semiconductors Or The Like (AREA)

Abstract

The application discloses a defect detection method for detecting the edge of a wafer, which comprises the following steps: processing the image information of the edge to create a first image; establishing a second image according to the straight lines in the contour lines of the first image whose inclination angles are within a preset angle range; reconstructing the image of the edge into a third image according to the first image and the second image, wherein the third image comprises a reconstructed edge boundary; and identifying the position of a defect of the edge according to the edge boundary and the first image. The application also discloses a defect detection device, detection equipment, and a computer-readable storage medium. Because the edge boundary is reconstructed, the edge of the wafer can be detected, the detected edge defects are accurate, and the rate of false detections on normal images is low.

Description

Defect detection method and device, detection equipment and readable storage medium
Technical Field
The present disclosure relates to the field of detection technologies, and in particular, to a defect detection method, a defect detection apparatus, a detection device, and a non-volatile computer-readable storage medium.
Background
For defect detection in the effective area of a wafer, the pattern repeats for each chip, so detection proceeds by first preparing a template image of the chip, aligning the template image with the inspection image, comparing the two after successful alignment, and judging a region to be defective when the difference between the two images exceeds a set threshold parameter. However, the boundary line of the wafer edge is prone to defects such as bending and abnormal bubbles, and because the pattern at the edge of the wafer does not repeat, comparative detection against a template image cannot be performed there. Inspection of the wafer edge is nevertheless important, so a detection method capable of detecting the edge of the wafer is needed.
Disclosure of Invention
The embodiment of the application provides a defect detection method, a defect detection device, detection equipment and a non-volatile computer readable storage medium.
The defect detection method of the embodiment of the application is used for detecting the edge of a wafer, and comprises the following steps: processing the image information of the edge to create a first image; establishing a second image according to a straight line of which the inclination angle is within a preset angle range in the contour line of the first image; reconstructing the image of the edge as a third image according to the first image and the second image, wherein the third image comprises a reconstructed edge boundary; and identifying the position of the defect of the edge according to the edge boundary and the first image.
In some embodiments, the processing the image information of the edge to create a first image comprises: carrying out binarization processing on the image information of the edge to obtain a binarized image; carrying out edge detection on the binary image to obtain a contour line of the edge; and establishing the first image according to the contour line of the edge.
In some embodiments, the binarizing the image information of the edge to obtain a binarized image includes: acquiring the pixel value of each pixel point in the edge image; establishing a histogram according to the number of pixel points corresponding to the pixel values, wherein the histogram comprises the pixel values and the number of the pixel points corresponding to the pixel values; acquiring a standard pixel value according to the distribution condition of the pixel values in the histogram; and carrying out binarization processing on the image of the edge according to the standard pixel value to obtain the binarized image.
In some embodiments, the binarizing the image of the edge according to the standard pixel value to obtain the binarized image includes: increasing the standard pixel value by a predetermined variable to obtain a binarized pixel threshold; and determining pixel points whose pixel values are within the binarized pixel threshold as a first color and pixel points whose pixel values are outside the binarized pixel threshold as a second color, so as to obtain the binarized image.
In some embodiments, before the binarizing the image information of the edge to obtain a binarized image, the processing the image information of the edge to create a first image further includes: acquiring a received initial image of the edge; and adjusting the initial image to the preset inclination angle to obtain the image information.
In some embodiments, the creating a second image according to a straight line in the contour line of the first image, the straight line having an inclination angle within a preset angle range, includes: performing straight line detection on the first image to obtain a straight line in the first image; determining the inclination angle of each straight line according to the result of the straight line detection; screening a straight line of the inclination angle within the preset angle range to serve as a foreground straight line; and establishing the second image according to the foreground straight line.
In some embodiments, the reconstructing the image of the edge from the first image and the second image into a third image, the third image including a reconstructed edge boundary, includes: establishing an initial reconstruction image; analyzing the first image and the second image to obtain the reconstructed edge boundary; marking the reconstructed edge boundary in the initial reconstructed image to obtain the third image.
In some embodiments, the analyzing the first image and the second image to obtain the reconstructed edge boundary includes: scanning the second image line by line along a first predetermined direction; setting an identifier to a first value when a point in the second image that meets a first preset condition is scanned; scanning the first image line by line along the first predetermined direction to obtain a first feature point; and reconstructing the edge boundary according to the first feature point.
In some embodiments, the scanning the first image along the first predetermined direction to acquire a first feature point includes: judging whether a first point in the first image, corresponding to the point in the second image that meets the first preset condition, meets a second preset condition; if so, scanning the first image from the first point along the first predetermined direction until a second point that does not meet the second preset condition is scanned, and then modifying the identifier to a second value; wherein, along the first predetermined direction, the points between the first point and the second point in the first image are the first feature points.
In some embodiments, after the identifier is modified to the second value, the scanning the first image along the first predetermined direction to obtain a first feature point further comprises: starting to scan the second image along the first predetermined direction from the point in the second image corresponding to the second point; modifying the identifier to the first value when a point in the second image that meets the first preset condition is scanned; scanning the first image along the first predetermined direction until a point in the first image that does not meet the second preset condition is scanned, and then modifying the identifier to the second value; and repeatedly executing the steps of scanning the first image, modifying the identifier, scanning the second image, and modifying the identifier until the end point of the current line of the first image is scanned.
In some embodiments, after the reconstructing of the edge boundary, the analyzing the first image and the second image to obtain the reconstructed edge boundary further includes: scanning the second image line by line along a second predetermined direction, the second predetermined direction being opposite to the first predetermined direction; modifying the identifier to the first value when a point in the second image that meets the first preset condition is scanned; scanning the first image line by line along the second predetermined direction to obtain a second feature point; and updating the reconstructed edge boundary according to the second feature point.
In some embodiments, the marking the reconstructed edge boundary in the initial reconstructed image to obtain the third image comprises: marking the points corresponding to the edge boundary in the initial reconstructed image with a feature color to obtain the third image.
In some embodiments, the identifying the position of the defect of the edge according to the edge boundary and the first image includes: determining a break region, in which the edge boundary is interrupted over a distance greater than a preset length, as an initial abnormal region; removing false abnormal regions from the initial abnormal regions according to the first image and the edge boundary to obtain real defect regions; and performing coordinate conversion on the coordinates of the real defect regions to determine the positions of the defects on the edge.
In some embodiments, the removing false abnormal regions from the initial abnormal regions according to the first image and the edge boundary to obtain real defect regions includes: judging whether the region corresponding to the initial abnormal region in the first image meets a third preset condition; if so, determining the initial abnormal region as a false abnormal region; and if not, determining the initial abnormal region as a real defect region.
The defect detection device of the embodiment of the application is used for detecting the edge of a wafer and comprises a processing module, an establishing module, a reconstructing module, and an identifying module. The processing module is used for processing the image information of the edge to establish a first image; the establishing module is used for establishing a second image according to the straight lines in the contour lines of the first image whose inclination angles are within a preset angle range; the reconstructing module is used for reconstructing the image of the edge into a third image according to the first image and the second image, wherein the third image comprises a reconstructed edge boundary; and the identifying module is used for identifying the position of a defect of the edge according to the edge boundary and the first image.
The detection device of the embodiment of the application comprises one or more processors and a memory; and one or more programs, wherein the one or more programs are stored in the memory and executed by the one or more processors, the programs comprising instructions for performing the defect detection method of any of the embodiments described above.
A non-transitory computer-readable storage medium containing a computer program according to an embodiment of the present application, which, when executed by one or more processors, causes the processors to perform the defect detection method according to any one of the above embodiments.
In the defect detection method, the defect detection device, and the detection equipment, the first image is obtained by processing the image information of the edge, and the second image is then established according to the straight lines in the contour lines of the first image whose inclination angles are within the preset angle range, so that non-straight background pattern noise and other straight-line noise can be removed and the resulting contour lines are more accurate. Further, the image of the edge is reconstructed into a third image according to the first image and the second image; since the third image includes the reconstructed edge boundary, the specific position of a defect can be identified from the reconstructed edge boundary and the first image. By reconstructing the edge boundary, the method reduces noise interference from the original image, so the obtained edge boundary is more accurate and the position of a detected edge defect is more accurate; in addition, because the edge boundary is reconstructed, the rate of false detections on normal images is lower.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow chart of a defect detection method according to an embodiment of the present application;
FIG. 2 is a block schematic diagram of a detection apparatus according to an embodiment of the present application;
FIG. 3 is a block diagram of a defect detection apparatus according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of a defect detection method according to an embodiment of the present application;
FIG. 5 is a block diagram of a processing module of the defect detecting apparatus according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram illustrating the principle of a defect detection method according to an embodiment of the present application;
FIG. 7 is a schematic flow chart of a defect detection method according to an embodiment of the present application;
FIG. 8 is a schematic flow chart of a defect detection method according to an embodiment of the present application;
FIG. 9 is a block diagram of a binarization unit of a processing module according to an embodiment of the present application;
FIG. 10 is a schematic flow chart of a defect detection method according to an embodiment of the present application;
FIG. 11 is a schematic diagram illustrating the principle of a defect detection method according to an embodiment of the present application;
FIG. 12 is a schematic flow chart of a defect detection method according to an embodiment of the present application;
FIG. 13 is a block diagram of a setup module of the defect detection apparatus according to an embodiment of the present disclosure;
FIG. 14 is a schematic flow chart of a defect detection method according to an embodiment of the present application;
FIG. 15 is a block diagram of a reconstruction module of the defect detection apparatus according to an embodiment of the present disclosure;
FIG. 16 is a schematic flow chart of a defect detection method according to an embodiment of the present application;
FIG. 17 is a block diagram of an analysis unit of a reconstruction block in an embodiment of the present application;
FIG. 18 is a schematic flow chart diagram of a defect detection method according to an embodiment of the present application;
FIG. 19 is a schematic flow chart diagram of a defect detection method according to an embodiment of the present application;
FIG. 20 is a schematic flow chart diagram of a defect detection method according to an embodiment of the present application;
FIG. 21 is a schematic flow chart diagram of a defect detection method according to an embodiment of the present application;
FIG. 22 is a schematic flow chart diagram illustrating a defect detection method according to an embodiment of the present application;
FIG. 23 is a schematic flow chart diagram of a defect detection method according to an embodiment of the present application;
FIG. 24 is a schematic diagram illustrating a defect detection method according to an embodiment of the present application;
FIG. 25 is a schematic flow chart diagram of a defect detection method according to an embodiment of the present application;
FIG. 26 is a block diagram of an identification module of the defect detection apparatus according to an embodiment of the present application;
FIG. 27 is a schematic flow chart diagram of a defect detection method according to an embodiment of the present application;
FIG. 28 is a block diagram of a removal unit of an identification module of an embodiment of the present application;
FIG. 29 is a schematic diagram illustrating a connection relationship between a computer-readable storage medium and a processor according to an embodiment of the present application.
Detailed Description
Embodiments of the present application will be further described below with reference to the accompanying drawings. The same or similar reference numbers in the drawings identify the same or similar elements or elements having the same or similar functionality throughout.
In addition, the embodiments of the present application described below in conjunction with the accompanying drawings are exemplary and are only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the present application.
In this application, unless expressly stated or limited otherwise, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or in indirect contact through an intervening medium. Also, a first feature being "on," "above," or "over" a second feature may mean that the first feature is directly above or obliquely above the second feature, or simply that the first feature is at a higher level than the second feature. A first feature being "under," "below," or "beneath" a second feature may mean that the first feature is directly below or obliquely below the second feature, or simply that the first feature is at a lower level than the second feature.
Referring to fig. 1 to 3, a defect inspection method according to an embodiment of the present disclosure is used for inspecting an edge of a wafer, and the defect inspection method includes the following steps:
01: processing the image information of the edge to create a first image;
02: establishing a second image according to a straight line of which the inclination angle is within a preset angle range in the contour line of the first image;
03: according to the first image and the second image, reconstructing the edge image into a third image, wherein the third image comprises a reconstructed edge boundary; and
04: and identifying the position of the defect of the edge according to the edge boundary and the first image.
The defect detecting apparatus 200 according to an embodiment of the present application is used for detecting an edge of a wafer, and the defect detecting apparatus 200 includes a processing module 210, an establishing module 220, a reconstructing module 230, and an identifying module 240. The processing module 210, the establishing module 220, the reconstructing module 230 and the identifying module 240 may be used to implement step 01, step 02, step 03 and step 04, respectively. That is, the processing module 210 may be configured to process the image information of the edge to create a first image; the establishing module 220 may be configured to establish the second image according to a straight line in the contour line of the first image, where an inclination angle is within a preset angle range; the reconstruction module 230 may be configured to reconstruct an edge image into a third image according to the first image and the second image, where the third image includes a reconstructed edge boundary; the identification module 240 may be configured to identify a location of a defect on the edge based on the edge boundary and the first image.
The inspection apparatus 100 of the embodiments of the present application includes one or more processors 10, a memory 20, and one or more programs, where the one or more programs are stored in the memory 20 and executed by the one or more processors 10, the programs including instructions for performing the defect inspection method of the embodiments of the present application. When the processor 10 executes the program, the processor 10 may be configured to implement step 01, step 02, step 03, and step 04. That is, the processor 10 may be configured to process image information of the edge to create a first image; establishing a second image according to a straight line of which the inclination angle is within a preset angle range in the contour line of the first image; according to the first image and the second image, reconstructing the edge image into a third image, wherein the third image comprises a reconstructed edge boundary; and identifying the position of the defect of the edge according to the edge boundary and the first image.
In the defect detection method, the defect detection device 200, and the detection apparatus 100 according to the embodiment of the present application, the first image is first obtained by processing the image information of the edge, and the second image is then created according to the straight lines in the contour lines of the first image whose inclination angles are within the preset angle range, so that non-straight background pattern noise and other straight-line noise can be removed and the obtained contour lines are relatively accurate. Further, the image of the edge is reconstructed into a third image according to the first image and the second image; since the third image includes the reconstructed edge boundary, the specific position of a defect can be identified from the reconstructed edge boundary and the first image. By reconstructing the edge boundary, the method reduces noise interference from the original image, so the obtained edge boundary is more accurate and the position of a detected edge defect is more accurate; in addition, because the edge boundary is reconstructed, the rate of false detections on normal images is lower.
The detection device 100 may be a wafer detection device, a semiconductor detection device, or other detection devices, and is not limited herein. The inspection apparatus 100 may be used to inspect the interior, edges or surfaces of semiconductor devices such as wafers, chips, etc. for the presence of defects such as bubbles, cracks, voids, scratches, abrasions, particles, etc. The wafer may be a wafer subjected to a photoresist coating process, a wafer subjected to a photolithography and development process, a wafer subjected to an etching process (including wet etching or dry etching), a wafer subjected to a chemical mechanical polishing process, a wafer subjected to a chemical vapor deposition process, or a wafer subjected to physical vapor deposition.
The material of the wafer may be silicon (Si), germanium (Ge), silicon germanium (GeSi), or silicon carbide (SiC); or silicon-on-insulator (SOI) or germanium-on-insulator (GOI); or other materials, such as group III-V compounds like gallium arsenide. The size (diameter) of the wafer may be 6 inches, 8 inches, 12 inches, 18 inches, or larger, without limitation.
Specifically, in step 01, the image information of the edge is processed to create a first image. The image information of the edge may be an edge image of the wafer captured by a camera on the detection apparatus 100, information on the wafer edge transmitted by a software layer, or a region of interest (ROI) within the complete edge of the wafer, among other possibilities not listed here. Further, processing the image information of the edge may specifically mean subjecting it to image correction, feathering, filtering, amplification, predetermined-line extraction, and similar processing, which are not listed exhaustively here. After the image information of the edge is processed, the first image is created according to the result of that processing.
In step 02, a second image is created according to the straight lines in the contour lines of the first image whose inclination angles are within a preset angle range. It is understood that straight lines with many different inclination angles may exist in the first image, for example 10°, 20°, 30°, 35°, 45°, 60°, 70°, 80°, 85°, 88°, 90°, 95°, and so on, so the straight lines in the first image need to be screened to obtain the required straight lines, i.e. the accurate contour lines. The preset angle range can be determined according to the inclination angle of the real contour line of the edge of the current wafer; establishing the second image from the straight lines whose inclination angles fall within this preset range makes the contour lines in the second image more accurate, and the subsequent detection result more accurate.
For example, if the inclination angle of the real contour line of the edge of the current wafer is 90°, the preset angle range may be set to [88°, 92°], and the straight lines in the contour lines of the first image with inclination angles within [88°, 92°] are screened to create the second image. For another example, if the inclination angle of the real contour line is 45°, the preset angle range may be set to [40°, 48°], and straight lines with inclination angles within [40°, 48°] are screened to establish the second image. Likewise, if the inclination angle of the real contour line is 0°, the preset angle range may be set to [-3°, 3°], and straight lines with inclination angles within [-3°, 3°] are screened to establish the second image.
In step 03, the image of the edge is reconstructed into a third image according to the first image and the second image, and the third image includes a reconstructed edge boundary. Because the first image contains the image information of the edge and the second image contains the accurate contour lines of the edge, reconstructing the image of the edge into the third image from both makes the reconstructed edge boundary accurate, so the defects detected in the subsequent detection process are accurate.
In step 04, the position of the defect of the edge is identified according to the edge boundary and the first image. The first image contains the initial image information of the wafer edge, and the reconstructed edge boundary is obtained in step 03; identifying the defect position of the wafer edge from the first image together with the reconstructed edge boundary is therefore more accurate and less error-prone. Specifically, a preliminary defect position may be determined from the edge boundary and then analyzed against the first image to determine the position of the real defect on the edge, so as to reduce the detection error.
Referring to fig. 4-6, in some embodiments, step 01 includes the following steps:
011: acquiring a received initial image of the edge; and
012: and adjusting the initial image to a preset inclination angle to obtain the image information of the edge.
In some embodiments, the processing module 210 includes an obtaining unit 211 and an adjusting unit 212, and the obtaining unit 211 may be configured to obtain an initial image of the received edge; the adjusting unit 212 may be configured to adjust the initial image to a preset tilt angle to obtain image information of the edge. That is, the obtaining unit 211 may be configured to implement step 011, and the adjusting unit 212 may be configured to implement step 012.
In some embodiments, the processor 10 may also be configured to obtain an initial image of the received edge; and adjusting the initial image to a preset inclination angle to obtain the image information of the edge. That is, the processor 10 may also be configured to implement step 011 and step 012.
Specifically, the initial image of the edge may be an image of the edge captured or scanned by a camera, a scanner, or the like. Lines with various inclination angles may exist in the initial image, and it is not clear which lines may be boundary lines of the edge and which are miscellaneous lines. Therefore, the image can be corrected to adjust it to the preset inclination angle, so that it can be more clearly distinguished which lines may be boundary lines of the edge and which are miscellaneous lines (i.e. lines other than the edge boundary).
Specifically, when the initial image is obtained, the difference between the real inclination angle of the edge and the preset inclination angle can be calculated, and the initial image adjusted to the preset inclination angle according to this difference to obtain the image information of the edge, so that a straight line in the image information whose inclination angle equals the preset inclination angle has a larger probability of being the boundary of the edge. For example, in the embodiment shown in fig. 6, the diagram T2 is obtained by rotating the diagram T1 to the preset inclination angle (90°).
In one embodiment, the preset inclination angle is 90° and the real inclination angle of the edge when the initial image of the edge was taken is 30°. It can be understood that a straight line with an inclination angle of 30° in the initial image has a larger probability of being the boundary of the edge. The difference between the real inclination angle and the preset inclination angle is 30° - 90° = -60°, which indicates that the initial image needs to be rotated clockwise by 60°; after the initial image is rotated clockwise by 60°, a straight line with an inclination angle of 90° in the rotated image has a larger probability of being the boundary of the edge.
In another embodiment, the preset inclination angle is 90° and the real inclination angle of the edge when the initial image was captured is 110°; a straight line with an inclination angle of 110° in the initial image thus has a larger probability of being the boundary of the edge. The difference between the real inclination angle and the preset inclination angle is 110° - 90° = 20°, which indicates that the initial image needs to be rotated counterclockwise by 20°; after the initial image is rotated counterclockwise by 20°, a straight line with an inclination angle of 90° in the rotated image has a larger probability of being the boundary of the edge.
In the above embodiments, the preset inclination angle is 90°. It can be understood that when the inclination angle of a straight line or line segment is 90°, the line is vertical; a straight line at the preset inclination angle, i.e. the boundary line of the edge, can then be identified accurately and rapidly, while the miscellaneous lines can be identified visually. Of course, the preset inclination angle is not limited to 90° and may take other values such as 0°, 15°, 30°, 45°, 60°, 135°, or 180°, which are not listed exhaustively here.
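As an illustration of steps 011 and 012, the following is a minimal OpenCV sketch of the tilt correction. The function name and defaults are illustrative, not defined by the patent, and the rotation sign depends on the coordinate convention (OpenCV treats a positive angle as counterclockwise), so the computed delta should be checked against the clockwise/counterclockwise examples above.

```python
import cv2

# Hypothetical helper for steps 011-012: rotate the received initial
# image so the real edge inclination lines up with the preset angle.
def align_to_preset(initial, true_angle_deg, preset_angle_deg=90.0):
    h, w = initial.shape[:2]
    delta = preset_angle_deg - true_angle_deg   # e.g. 90 - 30 = +60 deg
    m = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), delta, 1.0)
    return cv2.warpAffine(initial, m, (w, h))   # image information of the edge
```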
Referring to fig. 6, in some embodiments, the edge of the wafer in the received initial image is a closed figure, so after the initial image is obtained, ROI cropping may be performed on it, for example as in graph T1 of fig. 6, and the cropped ROI region is then adjusted to the preset inclination angle, for example as in graph T2 of fig. 6. Alternatively, the received initial image of the edge may first be adjusted to the preset inclination angle, with ROI cropping performed afterwards.
In one embodiment, ROI cropping is performed on the initial image multiple times until the complete edge has been cropped out, so that the complete edge can be detected. In another example, the regions of the initial image where the edge may have defects may be determined preliminarily using a magnifying glass, a microscope, or the like, and ROI cropping is then performed on these regions to further determine whether defects exist and their specific positions.
Referring to fig. 5 and 7, in some embodiments, after step 012, step 01 further includes the following steps:
013: carrying out binarization processing on the image information of the edge to obtain a binarized image;
014: carrying out edge detection on the binary image to obtain a contour line of an edge; and
015: and establishing a first image according to the contour line of the edge.
In some embodiments, the processing module 210 further includes a binarization unit 213, an edge detection unit 214, and a first establishing unit 215, where the binarization unit 213 may be configured to perform binarization processing on the image information of the edge to obtain a binarized image; the edge detection unit 214 may be configured to perform edge detection on the binarized image to obtain a contour line of an edge; the first creating unit 215 may be configured to create the first image based on the contour lines of the edges. That is, the binarization unit 213, the edge detection unit 214, and the first creation unit 215 may be used to implement step 013, step 014, and step 015, respectively.
In some embodiments, the processor 10 may be further configured to perform binarization processing on the image information of the edge to obtain a binarized image; carrying out edge detection on the binary image to obtain a contour line of an edge; and establishing a first image according to the contour line of the edge. That is, processor 10 may also be used to implement step 013, step 014 and step 015.
Specifically, the image information of the edge contains multiple colors, so the contour of the wafer edge cannot be identified quickly from it; the image information therefore needs to be binarized. A binarized image is obtained after the binarization processing, and in the binarized image the edge contour region and the other regions can be distinguished quickly and accurately.
Further, edge detection is performed on the binarized image to acquire the contour line information of the edge. By performing edge detection on the binarized image, the obtained contour line of the edge is clearer and the edge can be separated more cleanly. Specifically, edge detection may be performed with an edge detection algorithm such as Marr-Hildreth edge detection, Canny edge detection, the Sobel operator, the Prewitt operator, the Kirsch operator, or the Laplace operator, which are not described in detail here.
Further, after the contour line of the edge is obtained, the first image may be established according to it. Specifically, an initial first image may be created, and the contour line of the edge then marked at the corresponding positions in the initial first image to obtain the first image. The size and resolution of the first image may be the same as those of the image information of the edge, so that the points in the first image correspond one to one with the points in the adjusted initial image, and the specific position of a defect on the wafer edge can be determined more quickly when a defect is detected. Alternatively, the size and resolution of the first image may differ from those of the initial image of the edge acquired in step 011, for example the first image may be reduced or enlarged in a certain ratio relative to the initial image.
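A minimal sketch of steps 013 to 015 follows, assuming the binarized image from step 013 is available as an 8-bit single-channel array (a histogram-based binarization is sketched in the sections below). The Canny thresholds are illustrative; any of the edge detectors named above could be substituted.

```python
import cv2

# Hypothetical helper for steps 013-015.
def build_first_image(binary):
    # Step 014: edge detection on the binarized image; the resulting
    # contour map, same size as the input, serves as the first image
    # (step 015). Thresholds 50/150 are illustrative assumptions.
    return cv2.Canny(binary, 50, 150)
```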
Referring to fig. 8 and 9, in some embodiments, step 013 includes the following steps:
0131: acquiring a pixel value of each pixel point in the edge image;
0132: establishing a histogram according to the number of pixel points corresponding to the pixel values, wherein the histogram comprises the pixel values and the number of the pixel points corresponding to the pixel values;
0133: acquiring a standard pixel value according to the distribution condition of the pixel values in the histogram; and
0134: and carrying out binarization processing on the image of the edge according to the standard pixel value to obtain a binarized image.
In some embodiments, the binarization unit 213 further includes a first obtaining sub-unit 2131, a creating sub-unit 2132, a second obtaining sub-unit 2133, and a binarization sub-unit 2134. The first obtaining subunit 2131 may be configured to obtain a pixel value of each pixel point in an edge image; the creating subunit 2132 may be configured to create a histogram according to the number of pixel points corresponding to the pixel values, where the histogram includes the pixel values and the number of pixel points corresponding to the pixel values; the second obtaining subunit 2133 may be configured to obtain a standard pixel value according to a distribution of pixel values in the histogram; the binarization sub-unit 2134 may be configured to perform binarization processing on the image of the edge according to the standard pixel value to obtain a binarized image. That is, the first acquiring sub-unit 2131, the establishing sub-unit 2132, the second acquiring sub-unit 2133 and the binarization sub-unit 2134 may be respectively used to implement step 0131, step 0132, step 0133 and step 0134.
In some embodiments, the processor 10 may be further configured to obtain pixel values of each pixel point in the edge image; establishing a histogram according to the number of pixel points corresponding to the pixel values, wherein the histogram comprises the pixel values and the number of the pixel points corresponding to the pixel values; acquiring a standard pixel value according to the distribution condition of the pixel values in the histogram; and carrying out binarization processing on the image of the edge according to the standard pixel value to obtain a binarized image.
Specifically, the pixel values of the pixel points in the image of the edge obtained in step 012 are acquired, the number of pixel points corresponding to each pixel value is counted, and a histogram is created accordingly. The distribution of pixel values in the image of the edge can be read from the histogram, for example which pixel values dominate among the pixel points of the image.
Further, the standard pixel value may be determined according to the distribution of the pixel values. For example, if the pixel value with the largest count in the histogram is pixel value A, then pixel value A may be determined as the standard pixel value. For another example, if the pixel values with the largest counts in the histogram are pixel value A, pixel value B, and pixel value C, the average of A, B, and C may be taken as the standard pixel value; or A, B, and C may be combined with certain weights to obtain the standard pixel value.
Further, after the standard pixel value is determined, the image of the edge is binarized according to the standard pixel value to obtain the binarized image. When binarization is performed according to the standard pixel value, the contour line of the edge in the resulting binarized image is more accurate and clear, the edge contour is easier to identify and less susceptible to noise interference, and the subsequent detection result is more accurate.
In one example, pixel points with pixel values smaller than the standard pixel value are determined as a first color and pixel points with pixel values larger than the standard pixel value as a second color, so that a binarized image containing only the first and second colors is obtained. In another example, the pixel points corresponding to the standard pixel value are determined as the first color, and the pixel points corresponding to all other (non-standard) pixel values as the second color.
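A minimal sketch of this histogram-and-standard-value logic (steps 0131 to 0133), assuming an 8-bit grayscale input. The `top_k` and `weights` parameters are illustrative stand-ins for the "several most frequent values" and "certain weight" variants described above; they are not parameters defined by the patent.

```python
import numpy as np

# Hypothetical helper for steps 0131-0133: build the pixel-value
# histogram and derive a standard pixel value from its distribution.
def standard_pixel_value(gray, top_k=1, weights=None):
    hist = np.bincount(gray.ravel(), minlength=256)  # count per pixel value
    top = np.argsort(hist)[-top_k:][::-1]            # top_k most frequent values
    if weights is None:
        return float(top.mean())                     # single peak or plain average
    w = np.asarray(weights, dtype=float)
    return float((top * w).sum() / w.sum())          # weighted combination
```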
Further, referring to fig. 10 and 11, in some embodiments, step 0134 includes the following steps:
1341: increasing the standard pixel value by a predetermined variable to obtain a binarized pixel threshold value;
1342: determining pixel points of which the pixel values are within a binarization pixel threshold value as a first color; and
1343: determining pixel points whose pixel values are outside the binarized pixel threshold as a second color, so as to obtain the binarized image.
In some embodiments, the binarization subunit 2134 may be further configured to increase the standard pixel value by a predetermined variable to obtain a binarized pixel threshold; determine pixel points whose pixel values are within the binarized pixel threshold as a first color; and determine pixel points whose pixel values are outside the binarized pixel threshold as a second color, so as to obtain the binarized image. That is, the binarization subunit 2134 may also be used to implement step 1341, step 1342, and step 1343.
In some embodiments, the processor 10 may be further configured to increase the standard pixel value by a predetermined variable to obtain a binarized pixel threshold; determine pixel points whose pixel values are within the binarized pixel threshold as a first color; and determine pixel points whose pixel values are outside the binarized pixel threshold as a second color, so as to obtain the binarized image. That is, the processor 10 may also be used to implement step 1341, step 1342, and step 1343.
Specifically, the standard pixel value is obtained in step 0133; to improve the accuracy of the binarized image, a predetermined variable is added to the standard pixel value so that the binarized pixel threshold can be obtained. In one embodiment, the predetermined variable may be a preset value: for example, if the standard pixel value is a and the predetermined variables are -b and +c, the binarized pixel threshold may be [a-b, a+c]; for another example, if the standard pixel value is a and the predetermined variable is b, the binarized pixel threshold may be [a, a+b].
In another embodiment, the predetermined variable may be determined from the histogram established in step 0132, so that the resulting binarized pixel threshold is relatively accurate. For example, suppose the standard pixel value in the histogram is a. In the direction of decreasing pixel values, if the count dips to a turning point at pixel value b, then one predetermined variable is b-a and one binarized pixel threshold is a+(b-a)=b; in the direction of increasing pixel values, if the count dips to a turning point at pixel value c, then the other predetermined variable is c-a and the other binarized pixel threshold is a+(c-a)=c. Thus, the binarized pixel threshold is [b, c].
Further, after the binarized pixel threshold is determined, pixel points with pixel values within the threshold are determined as a first color, pixel points with pixel values outside the threshold are determined as a second color, and the corresponding points in the image of the edge are modified to the first and second colors to obtain the binarized image. The binarization effect obtained in this way is stable: the binarized image contains only points of the first and second colors, a region in the first color has a high probability of belonging to the wafer edge, a region in the second color has a low probability of belonging to the wafer edge, and the edge information of the wafer can be read intuitively from the binarized image.
The first color is different from the second color, so that the outline of the wafer can be clearly and quickly identified. For example, the first color may be black and the second color may be white. Alternatively, the first color may be white and the second color may be black. For another example, the first color may be red and the second color may be white. Also for example, the first color may be purple and the second color may be white. The first color and the second color may be other colors, which are not listed or limited herein.
The sequence of step 1342 and step 1343 is not limited herein. For example, step 1342 may be performed first, followed by step 1343. For another example, step 1343 may be performed first, and then step 1342 may be performed. Also for example, step 1342 and step 1343 may be performed simultaneously.
Referring to fig. 11, the embodiment shown there includes a histogram established from the pixel values of the pixel points in the image shown in fig. 6 and the corresponding counts of those pixel values. In fig. 11 it can be seen that the gray levels are concentrated at the two ends of the histogram and that many pixels have values between 0 and 36. The maximum count between pixel values 0 and 36 is therefore searched, the pixel value corresponding to that maximum is found to be 12, and 12 is determined as the standard pixel value. Adding the predetermined variables -4 and +4 to 12, the pixel values 8-16 are determined as the binarized pixel threshold; pixel points with values 8-16 in the image of the edge are modified to the first color, and pixel points with values 0-7 and 17-255 are modified to the second color. The modified image is the binarized image.
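The worked example above maps directly onto a small binarization routine (steps 1341 to 1343). This sketch hard-codes the illustrative constants from the example (standard value 12, predetermined variables -4/+4, white as the first color); none of these values are fixed by the method itself.

```python
import numpy as np

# Hypothetical helper for steps 1341-1343, using the fig. 11 example.
def binarize(gray, standard=12, var_lo=4, var_hi=4):
    lo, hi = standard - var_lo, standard + var_hi      # threshold range [8, 16]
    inside = (gray >= lo) & (gray <= hi)               # within the binarized threshold
    return np.where(inside, 255, 0).astype(np.uint8)   # first color / second color
```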
Referring to fig. 12 and 13, in some embodiments, step 02 includes the following steps:
021: performing straight line detection on the first image to obtain a straight line in the first image;
022: determining the inclination angle of each straight line according to the result of the straight line detection;
023: screening a straight line with the inclination angle within a preset angle range to serve as a foreground straight line; and
024: and establishing a second image according to the foreground straight line.
In some embodiments, the establishing module 220 includes a straight line detecting unit 221, a first determining unit 222, a screening unit 223, and a second establishing unit 224. The line detection unit 221 may be configured to perform line detection on the first image to obtain a line in the first image; the first determining unit 222 may be configured to determine the inclination angle of each straight line according to the result of the straight line detection; the screening unit 223 may be configured to screen a straight line of which the inclination angle is within a preset angle range as a foreground straight line; the second establishing unit 224 may be configured to establish the second image from the foreground straight line. That is, the line detecting unit 221, the first determining unit 222, the screening unit 223, and the second establishing unit 224 may be used to implement steps 021, 022, 023, and 024, respectively.
In some embodiments, the processor 10 may be further configured to perform line detection on the first image to obtain a line in the first image; determining the inclination angle of each straight line according to the result of the straight line detection; screening a straight line with the inclination angle within a preset angle range to serve as a foreground straight line; and establishing a second image according to the foreground straight line. That is, the processor 10 may also be used to implement step 021, step 022, step 023, and step 024.
Specifically, lines such as straight lines, arcs, and curves may exist in the obtained first image, while the edge contour line of the wafer is a straight line or a curve very close to a straight line. To avoid the influence of non-straight lines such as arcs and curves on the detection result, straight-line detection is performed on the first image so that the straight lines in it can be detected and the non-straight contours removed. Specifically, the first image may be processed with a line detection algorithm such as the Hough line detection algorithm, the LSD line detection algorithm, the FLD line detection algorithm, the EDLines line detection algorithm, or the LSWMS line detection algorithm, which are not listed exhaustively or limited here.
Further, based on the results of the line detection, the inclination angle of each straight line may be determined, so as to judge from it whether the line is a boundary of the edge. Because the initial image of the edge was adjusted in step 012, a straight line in the first image whose inclination angle is within the preset angle range is more likely to be the boundary of the wafer edge, while a straight line whose inclination angle is not within the preset range is unlikely to be the boundary.
Further, the straight lines whose inclination angles are within the preset angle range are screened out and determined as foreground straight lines. Straight lines whose inclination angles are not within the preset range are discarded, so the interfering lines around the edge can be discarded. A second image is then established according to the foreground straight lines. For example, a blank image is created and the foreground straight lines are marked in it, yielding the second image. For another example, the parts of the first image that are not foreground straight lines are removed, leaving only the foreground in the first image, and the processed first image is the second image. The color of the foreground straight lines in the second image differs from that of the background; for example, the foreground lines may be black and the background white. It is understood that the foreground straight lines include the boundary of the wafer edge, so the second image drawn from them includes the boundary of the wafer edge, with few other interfering straight lines.
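One way steps 021 to 024 might look in code, using OpenCV's probabilistic Hough transform as the line detector (the patent permits any of the algorithms listed above). The Hough parameters and the [88°, 92°] window are illustrative assumptions matching the 90° example given earlier.

```python
import cv2
import numpy as np

# Hypothetical helper for steps 021-024: detect lines in the first
# image, keep those in the preset angle range, draw them on a blank
# second image.
def build_second_image(first, angle_range=(88.0, 92.0)):
    second = np.zeros_like(first)
    lines = cv2.HoughLinesP(first, 1, np.pi / 180, threshold=80,
                            minLineLength=30, maxLineGap=5)
    if lines is None:
        return second                                          # nothing detected
    for x1, y1, x2, y2 in lines[:, 0]:
        angle = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1)))  # 0..180 deg
        if angle_range[0] <= angle <= angle_range[1]:          # foreground line
            cv2.line(second, (int(x1), int(y1)), (int(x2), int(y2)), 255, 1)
    return second
```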
Referring to fig. 14 and 15, in some embodiments, step 03 includes the following steps:
031: establishing an initial reconstruction image;
032: analyzing the first image and the second image to obtain a reconstructed edge boundary; and
033: the reconstructed edge boundary is marked in the initial reconstructed image to obtain a third image.
In some embodiments, the reconstruction module 230 includes a third establishing unit 231, an analyzing unit 232, and a marking unit 233. The third establishing unit 231 may be configured to establish an initial reconstructed image; the analyzing unit 232 may be configured to analyze the first image and the second image to obtain a reconstructed edge boundary; the marking unit 233 may be configured to mark the reconstructed edge boundary in the initial reconstructed image to obtain the third image. That is, the third establishing unit 231, the analyzing unit 232 and the marking unit 233 may be used to implement step 031, step 032 and step 033, respectively.
In some embodiments, the processor 10 may be further configured to: establishing an initial reconstruction image; analyzing the first image and the second image to obtain a reconstructed edge boundary; and marking the reconstructed edge boundary in the initial reconstructed image to obtain a third image. That is, the processor 10 may also be configured to implement step 031, step 032, and step 033.
Specifically, the size and resolution of the initial reconstructed image may be the same as those of the second image or the first image, and the size and resolution of the initial reconstructed image may also be scaled down or scaled up by the second image or the first image, so that the actual position of the defect on the wafer may be determined more quickly when the position of the defect is determined. The initial reconstructed image may be an image with empty contents, and the initial reconstructed image may be a pure color image, for example, each region in the initial reconstructed image is black, white, purple, blue, or the like.
Further, the first image and the second image are analyzed. For example, the foreground straight lines in the second image may be analyzed to preliminarily determine an edge boundary, and the first image may then be analyzed to modify that preliminary boundary; the modified boundary is the reconstructed edge boundary. For another example, the first image may be analyzed to determine a preliminary edge boundary, which is then modified by analyzing the second image; again, the modified boundary is the reconstructed edge boundary. By analyzing both images, the reconstructed edge boundary is more accurate, and the subsequent judgment of whether the edge has defects is therefore more reliable.
Furthermore, after the reconstructed edge boundary is obtained, it may be marked at the corresponding position of the initial reconstructed image, and the marked initial reconstructed image is the third image. Specifically, the coordinates of each point on the reconstructed edge boundary are determined, the points are mapped to the corresponding points of the initial reconstructed image, and those points are marked in a color different from the rest of the image, so that the edge boundary can be identified intuitively, clearly, and accurately.
Referring to fig. 16 and 17, in some embodiments, step 032 includes the following steps:
0321: scanning the second image line by line along a first predetermined direction;
0322: setting the identifier to be a first value when the point in the second image is scanned to meet a first preset condition;
0323: scanning the first image line by line along a first preset direction to obtain a first characteristic point; and
0324: and reconstructing an edge boundary according to the first characteristic point.
In some embodiments, the analysis unit 232 includes a first scanning subunit 2321, a first setting subunit 2322, a second scanning subunit 2323, and a reconstruction subunit 2324. The first scanning subunit 2321 may be configured to scan the second image line by line along a first predetermined direction; the first setting subunit 2322 may be configured to set the identifier to a first value when the point scanned in the second image satisfies the first preset condition; the second scanning subunit 2323 may be configured to perform line-by-line scanning on the first image along the first predetermined direction to obtain a first feature point; the reconstruction subunit 2324 may be configured to reconstruct the edge boundary from the first feature point. That is, the first scanning subunit 2321, the first setting subunit 2322, the second scanning subunit 2323 and the reconstruction subunit 2324 can be used to implement step 0321, step 0322, step 0323 and step 0324, respectively.
In some embodiments, the processor 10 may be further configured to: scanning the second image line by line along a first predetermined direction; setting the identifier to be a first value when the point in the second image is scanned to meet a first preset condition; scanning the first image line by line along a first preset direction to obtain a first characteristic point; and reconstructing an edge boundary line based on the first feature point. That is, processor 10 may also be used to implement step 0321, step 0322, step 0323 and step 0324.
Specifically, the second image includes foreground straight lines and a background portion, and consists of a plurality of rows of pixel points, each row containing a plurality of pixel points. The first predetermined direction may be from left to right, from right to left, from top to bottom, from bottom to top, or the like, which are not listed exhaustively here; the first predetermined direction may also be any direction perpendicular to the foreground straight lines. The pixel points of the first line of the second image are scanned one by one along the first predetermined direction; after the first line is finished, the pixel points of the second line are scanned in the same way, and so on until all lines of the second image have been scanned.
When scanning a certain line of pixel points in the second image, if a pixel point satisfying a first preset condition is scanned, the identifier is modified to a first value, for example, 1. The first preset condition may be that the pixel point is a point on a foreground straight line; or that the pixel point is at a designated position; or that the pixel point is of a specified color, for example, white; or alternatively that the pixel point is a point of the background.
When the identifier has the first value, the first image is scanned line by line along the first predetermined direction so as to acquire first feature points from it. Specifically, in one embodiment, the points in the first image that satisfy a predetermined condition are identified during this scan and determined as the first feature points. It will be appreciated that when the identifier does not have the first value, the first image is not scanned; instead, scanning of the second image continues until a point satisfying the first preset condition is reached.
Further, after the first feature points are obtained, the edge boundary may be reconstructed from them. Specifically, in one embodiment, the acquired first feature points are connected to form a line, and that line is the reconstructed edge boundary. In another embodiment, the first feature points may be fitted with a fitting algorithm, and the fitted line taken as the reconstructed edge boundary. Further, in yet another embodiment, the fitted line may be smoothed, the smoothed line being the reconstructed edge boundary; it is smoother and has a smaller error. In yet another embodiment, the points corresponding to the first feature points are modified in the initial reconstructed image to a color different from the background (for example, on a black background they may be modified to white, green, red, purple, and so on).
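As one possible rendering of the fitting-and-smoothing variant, the sketch below fits the first feature points of one boundary with a low-order polynomial (x as a function of y) and samples the fitted curve row by row. The polynomial degree and function name are illustrative assumptions.

```python
import numpy as np

def fit_boundary(feature_points, degree=2):
    """Fit a smooth edge boundary through the first feature points."""
    pts = np.asarray(feature_points, dtype=float)   # rows of (x, y)
    xs, ys = pts[:, 0], pts[:, 1]
    coeffs = np.polyfit(ys, xs, degree)             # least-squares fit, x = f(y)
    y_line = np.arange(ys.min(), ys.max() + 1)      # one sample per image row
    x_line = np.polyval(coeffs, y_line)             # smooth reconstructed boundary
    return np.column_stack([np.rint(x_line), y_line]).astype(int)
```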
Further, referring to fig. 18, in some embodiments, step 0323 includes the following steps:
3231: judging whether a first point in the first image, which corresponds to a point in the second image meeting a first preset condition, meets a second preset condition or not;
3232: if yes, scanning the first image from the first point along a first preset direction;
3233: resetting the identifier to be a second value until a second point which does not meet a second preset condition in the first image is scanned;
and along the first preset direction, points between the first point and the second point in the first image are first characteristic points.
In some embodiments, the second scanning subunit 2323 may also be configured to: judging whether a first point in the first image, which corresponds to a point in the second image meeting a first preset condition, meets a second preset condition or not; if yes, scanning the first image from the first point along a first preset direction; resetting the identifier to be a second value until a second point which does not meet a second preset condition in the first image is scanned; and along the first preset direction, points between the first point and the second point in the first image are first characteristic points. That is, the second scanning subunit may also be used to implement step 3231, step 3232 and step 3233.
In some embodiments, the processor 10 may be further configured to: judging whether a first point in the first image, which corresponds to a point in the second image meeting a first preset condition, meets a second preset condition or not; if yes, scanning the first image from the first point along a first preset direction; resetting the identifier to be a second value until a second point which does not meet a second preset condition in the first image is scanned; and along the first preset direction, points between the first point and the second point in the first image are first characteristic points. That is, processor 10 may also be used to implement step 3231, step 3232 and step 3233.
Specifically, when the point scanned in the second image that satisfies the first preset condition is a point A, a first point in the first image corresponding to point A is determined, and whether the first point satisfies a second preset condition is judged. The second preset condition may be preset; for example, it may include that the point is of a specific color (e.g., black), or that the point is in an edge region of the wafer, and so on.
Further, when the first point meets the second preset condition, the first image is scanned in the first predetermined direction starting from the first point; the identifier is reset to a second value when a second point in the first image that does not meet the second preset condition is scanned. In the first image, all points from the first point to the second point are first feature points (including the first point, excluding the second point); it can be understood that the first point and all points between it and the second point satisfy the second preset condition, and all of them can be regarded as points on the edge of the wafer. When the first point does not satisfy the second preset condition, the identifier is reset to the second value (for example, 0), and scanning of the second image in the first predetermined direction continues from point A.
In one embodiment, the first image comprises the binarized image obtained in step 013. When the point scanned in the second image that satisfies the first preset condition is a point A, a point B corresponding to point A is determined in the binarized image, and it is judged whether point B satisfies the second preset condition, here that the point is of a first color. If point B is of the first color, the line containing point B is scanned along the first predetermined direction starting from point B, until a point C of the second color is scanned on that line, at which time the identifier is reset to the second value. Point B and the points from point B to point C are all first feature points. If point B is of the second color, the identifier is reset to the second value and scanning of the second image resumes from point A; when a point D scanned in the second image satisfies the first preset condition, it is judged whether the point E corresponding to point D in the first image satisfies the second preset condition.
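The single-row scan of steps 3231 to 3233, under the assumptions of the example above (the first image is the binarized image, the first preset condition is "the point lies on a foreground straight line", and the second preset condition is "the point is of the first color"), might look like the following sketch. The color constants and function name are illustrative.

```python
FOREGROUND = 0      # assumed color of foreground straight lines in the second image
FIRST_COLOR = 0     # assumed first color in the binarized first image

def scan_row(first_row, second_row):
    """Scan one row of both images; return the columns of the first feature points."""
    identifier = 0                  # second value: currently scanning the second image
    features = []
    col = 0
    while col < len(second_row):
        if identifier == 0:
            # Scan the second image for a point A on a foreground straight line
            # whose counterpart B in the first image is of the first color.
            if second_row[col] == FOREGROUND and first_row[col] == FIRST_COLOR:
                identifier = 1      # first value: switch to scanning the first image
            else:
                col += 1            # keep scanning the second image
        else:
            if first_row[col] == FIRST_COLOR:
                features.append(col)  # B and the points between B and C are feature points
                col += 1
            else:
                identifier = 0      # point C found: reset, resume on the second image at C
    return features
```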
Referring to fig. 19, in some embodiments, after step 3233, step 0323 further includes the following steps:
3234: starting to scan the second image along a first preset direction from a point corresponding to the second point in the second image;
3235: setting the identifier to be a first value when a point satisfying a first preset condition in the second image is scanned;
3236: scanning the first image in a first predetermined direction;
3237: resetting the identifier to a second value when a point in the first image that does not meet the second preset condition is scanned; and
3238: and repeating the steps of scanning the first image, setting the identifier, scanning the second image and resetting the identifier until the end point of the current line of the first image is scanned.
In some embodiments, the second scanning subunit 2323 may also be configured to: starting to scan the second image along the first predetermined direction from the point in the second image corresponding to the second point; setting the identifier to a first value when a point satisfying the first preset condition in the second image is scanned; scanning the first image in the first predetermined direction; resetting the identifier to a second value when a point in the first image that does not meet the second preset condition is scanned; and repeating the steps of scanning the first image, setting the identifier, scanning the second image and resetting the identifier until the end point of the current line of the first image is scanned. That is, the second scanning subunit 2323 may also be used for implementing step 3234, step 3235, step 3236, step 3237 and step 3238.
In some embodiments, the processor 10 may be further configured to: starting to scan the second image along the first predetermined direction from the point in the second image corresponding to the second point; setting the identifier to a first value when a point satisfying the first preset condition in the second image is scanned; scanning the first image in the first predetermined direction; resetting the identifier to a second value when a point in the first image that does not meet the second preset condition is scanned; and repeating the steps of scanning the first image, setting the identifier, scanning the second image and resetting the identifier until the end point of the current line of the first image is scanned. That is, processor 10 may also be used to implement step 3234, step 3235, step 3236, step 3237 and step 3238.
Specifically, after step 3233 the second point may not yet be the end point of the current scanning line of the first image, and if scanning of the first image does not continue, too few first feature points are acquired, which may leave the reconstructed edge boundary incomplete. Accordingly, after step 3233, the point in the second image corresponding to the second point is acquired, and the second image is scanned in the first predetermined direction from that point. The identifier is set to the first value when a point satisfying the first preset condition is scanned in the second image.
Further, if the point satisfying the first preset condition in the second image obtained in step 3235 is a point D, a point E corresponding to the point D in the first image is obtained, and the first image is scanned in the first predetermined direction from the point E until a point F in the first image not satisfying the second preset condition is scanned, and the identifier is reset to the second value. It is understood that point E and points between point E and point F are also first feature points. Step 3234, step 3235 and step 3236 are then executed again until the end of the current line of the first image is scanned. Therefore, the obtained first feature points are more accurate and more in number, so that the edge boundary reconstructed according to the first feature points is more complete and correct, and the detected edge defects are more accurate.
It is understood that steps 3234, 3235 and 3236 are the same or approximately the same as steps 3231, 3232 and 3233, respectively. That is, analyzing the first image and the second image to obtain the reconstructed boundary mainly consists of repeating the steps of scanning the first image, setting the identifier, scanning the second image, and resetting the identifier, until the end point of the current line of the first image is scanned.
Further, after the end point of the current line of the first image is scanned, the next line of the first image is scanned. That is, steps 3231 to 3237 are performed on the pixel points of the next lines of the first image and the second image, respectively, to extract the first feature points of the next line. Once all lines of the first image have been scanned, the acquired first feature points are complete and accurate, and the edge boundary reconstructed from them is correspondingly more complete and accurate.
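Repeating this line scan over every line, as just described, could be sketched as follows, reusing the hypothetical scan_row above; the function name is again illustrative.

```python
def scan_image(first_img, second_img):
    """Collect first feature points (x, y) over all rows of the two images."""
    all_features = []
    for y in range(first_img.shape[0]):
        for x in scan_row(first_img[y], second_img[y]):
            all_features.append((x, y))
    return all_features
```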
In some embodiments, the edge of the wafer normally has two boundaries. If the left boundary is slightly curved, it cannot be detected by the straight-line detection, whereas the right boundary, being straight, can; only the right boundary then appears in the second image. When the second image is scanned from left to right, the identifier is set to the first value only when the right boundary is scanned, and only the corresponding position of the first image is then scanned, so the finally reconstructed edge contour may contain only the right boundary and therefore be inaccurate.
Further, referring to fig. 17 and fig. 20, after step 0324, step 032 further includes the following steps:
0325: scanning the second image line by line along a second predetermined direction, wherein the second predetermined direction is opposite to the first predetermined direction;
0326: setting the identifier to be a first value when the point in the second image is scanned to meet a first preset condition;
0327: scanning the first image line by line along a second preset direction to obtain a second characteristic point; and
0328: and updating the reconstructed edge boundary according to the second characteristic point.
In some embodiments, the analysis unit 232 further includes a third scanning subunit 2325, a second setting subunit 2326, a fourth scanning subunit 2327 and an updating subunit 2328. The third scanning subunit 2325 may be configured to scan the second image line by line along a second predetermined direction, where the second predetermined direction is opposite to the first predetermined direction; the second setting subunit 2326 may be configured to set the identifier to a first value when the point scanned in the second image satisfies the first preset condition; the fourth scanning subunit 2327 may be configured to perform line-by-line scanning on the first image along the second predetermined direction to obtain a second feature point; the updating subunit 2328 may be configured to update the reconstructed edge boundary according to the second feature point. It is understood that the third scanning subunit 2325, the second setting subunit 2326, the fourth scanning subunit 2327 and the updating subunit 2328 can be used to implement steps 0325, 0326, 0327 and 0328, respectively.
In some embodiments, the processor 10 may be further configured to: scanning the second image line by line along a second predetermined direction, wherein the second predetermined direction is opposite to the first predetermined direction; setting the identifier to be a first value when the point in the second image is scanned to meet a first preset condition; scanning the first image line by line along a second preset direction to obtain a second characteristic point; and updating the reconstructed edge boundary according to the second characteristic point. Processor 10 may also be used to implement step 0325, step 0326, step 0327, and step 0328.
In particular, the second predetermined direction is opposite to the first predetermined direction. For example, the first predetermined direction is from left to right, and the second predetermined direction may be from right to left. For another example, the first predetermined direction may be from right to left, and the second predetermined direction may be from left to right. Also for example, the first predetermined direction is from top to bottom, and the second predetermined direction may be from bottom to top. This is not intended to be exhaustive or to limit the second predetermined direction and the first predetermined direction.
The second image is scanned in the second predetermined direction, and the identifier is set to the first value when a point satisfying the first preset condition is scanned in the second image. The first image is then scanned along the second predetermined direction, and when a point meeting the second preset condition is scanned in the first image, that point is taken as a second feature point. The edge boundary reconstructed in step 0324 may then be updated according to the second feature points. The acquired second feature points may include at least part of the first feature points, that is, some second feature points coincide with first feature points. When updating the reconstructed edge boundary, the second feature points identical to first feature points may be removed, and the remaining second feature points marked in the image of the edge boundary reconstructed in step 0324, thereby updating the reconstructed edge boundary.
Scanning the first image and the second image along the second predetermined direction after scanning them along the first predetermined direction avoids an incomplete reconstructed edge boundary caused by a curved boundary on one side going undetected, so that the finally reconstructed edge boundary is more complete and accurate, and the edge defect detection based on it is more accurate.
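A simple way to realize the opposite-direction pass is to mirror both images horizontally, rerun the same scan, and map the resulting columns back to the original frame. The sketch below does this with the hypothetical scan_image above and merges the two point sets, removing duplicates as described.

```python
def bidirectional_features(first_img, second_img):
    """Merge feature points from scans along the first and second predetermined directions."""
    forward = set(scan_image(first_img, second_img))                 # e.g., left to right
    mirrored = scan_image(first_img[:, ::-1], second_img[:, ::-1])   # right to left
    width = first_img.shape[1]
    backward = {(width - 1 - x, y) for x, y in mirrored}             # back to original frame
    return forward | backward                                        # duplicates removed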
More specifically, referring to fig. 21, in some embodiments, step 0327 includes the following steps:
3271: judging whether a third point corresponding to a point in the first image and a point in the second image which meet a first preset condition meet a second preset condition or not;
3272: if yes, scanning the first image from the third point along a second preset direction; and
3273: resetting the identifier to be a second value until a fourth point which does not meet a second preset condition in the first image is scanned;
and along the second preset direction, the points between the third point and the fourth point in the first image are second characteristic points.
In some embodiments, the fourth scanning subunit 2327 may also be configured to: judging whether a third point in the first image, which corresponds to a point in the second image meeting the first preset condition, meets a second preset condition or not; if yes, scanning the first image from the third point along the second predetermined direction; and resetting the identifier to a second value until a fourth point not meeting the second preset condition in the first image is scanned. That is, the fourth scanning subunit 2327 may also be used for implementing step 3271, step 3272 and step 3273.
In some embodiments, the processor 10 may be further configured to: judging whether a third point in the first image, which corresponds to a point in the second image meeting the first preset condition, meets a second preset condition or not; if yes, scanning the first image from the third point along the second predetermined direction; and resetting the identifier to a second value until a fourth point not meeting the second preset condition in the first image is scanned. That is, the processor 10 can also be used to implement step 3271, step 3272 and step 3273.
Specifically, steps 3271, 3272 and 3273 are the same as steps 3231, 3232 and 3233, except that the scanning directions of the first image and the second image are opposite, and therefore, they are not described in detail herein. The second feature point obtained can be made more accurate by following step 3271, step 3272, and step 3273.
Further, referring to fig. 22, in some embodiments, after step 3273, step 0327 further includes the following steps:
3274: scanning the second image along a second predetermined direction from a point in the second image corresponding to the second point;
3275: setting the identifier to be a first value when a point satisfying a first preset condition in the second image is scanned;
3276: scanning the first image in a second predetermined direction;
3277: resetting the identifier to a second value when a point in the first image that does not meet the second preset condition is scanned; and
3278: and repeating the steps of scanning the first image, setting the identifier, scanning the second image and resetting the identifier until the end point of the current line of the first image is scanned.
In some embodiments, the fourth scanning subunit 2327 may also be configured to: scanning the second image along the second predetermined direction from the point in the second image corresponding to the fourth point; setting the identifier to a first value when a point satisfying the first preset condition in the second image is scanned; scanning the first image in the second predetermined direction; resetting the identifier to a second value when a point in the first image that does not meet the second preset condition is scanned; and repeating the steps of scanning the first image, setting the identifier, scanning the second image and resetting the identifier until the end point of the current line of the first image is scanned. That is, the fourth scanning subunit 2327 may also be used to implement steps 3274, 3275, 3276, 3277 and 3278.
In some embodiments, the processor 10 may be further configured to: scanning the second image along the second predetermined direction from the point in the second image corresponding to the fourth point; setting the identifier to a first value when a point satisfying the first preset condition in the second image is scanned; scanning the first image in the second predetermined direction; resetting the identifier to a second value when a point in the first image that does not meet the second preset condition is scanned; and repeating the steps of scanning the first image, setting the identifier, scanning the second image and resetting the identifier until the end point of the current line of the first image is scanned. That is, processor 10 may also be used to implement step 3274, step 3275, step 3276, step 3277, and step 3278.
Step 3274, step 3275, step 3276, step 3277 and step 3278 are the same as step 3234, step 3235, step 3236, step 3237 and step 3238, except that the scanning directions of the first image and the second image are opposite, and therefore will not be described in detail herein. All the points of the current scan line of the first image are scanned through step 3274, step 3275, step 3276, step 3277 and step 3278, so that the scanned second feature points are more complete.
Referring to fig. 23, in some embodiments, step 033 includes the following steps:
0331: and marking the point corresponding to the boundary of the edge in the initial reconstructed image as a characteristic color to obtain a third image.
In some embodiments, the labeling unit 233 may be further configured to label a point in the initial reconstructed image corresponding to the boundary of the edge as a feature color to obtain a third image. That is, the tagging unit 233 may also be used to implement step 0331.
In some embodiments, the processor 10 may be further configured to mark a point in the initial reconstructed image corresponding to the boundary of the edge as a characteristic color to obtain a third image. That is, processor 10 may also be used to implement step 0331.
Specifically, for the reconstructed edge boundary obtained in step 032, the points in the initial reconstructed image corresponding to the edge boundary are first determined and then marked as a feature color; the initial reconstructed image marked with the feature color is the third image. The feature color may be any color different from the background of the initial reconstructed image. For example, if the background is black, the feature color may be white, red, purple, etc.; if the background is white, the feature color may be black, red, or purple.
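Marking itself is straightforward; a minimal sketch, assuming the reconstructed boundary is given as a list of (x, y) pixel coordinates and the initial reconstructed image has a black background (the function name and color value are illustrative):

```python
import numpy as np

def mark_third_image(image_shape, boundary_points, feature_color=255):
    """Mark the reconstructed edge boundary in a blank initial reconstructed image."""
    third_image = np.zeros(image_shape, dtype=np.uint8)   # black background
    for x, y in boundary_points:
        third_image[y, x] = feature_color                 # boundary in a contrasting color
    return third_image
```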
Referring to fig. 6 and 24, fig. 24 includes a graph T3 obtained by reconstructing the edge boundaries of a wafer based on the graph T2 in fig. 6. In graph T3, the boundary lines of the edge are white and the background is black; the straight lines on both sides of the white area represent the reconstructed boundaries of the edge, and the area between the two boundaries is the edge, so that the condition of the edge, including any broken regions of the wafer, can be recognized clearly and intuitively in graph T3.
Referring to fig. 25 and 26, in some embodiments, step 04 includes the following steps:
041: determining a broken area with a broken distance larger than a preset length in the boundary line of the edge as an initial abnormal area;
042: removing the false abnormal region in the initial abnormal region according to the first image and the edge boundary to obtain a real defect region; and
043: and performing coordinate transformation on the coordinates of the real defect area to determine the position of the defect at the edge.
In some embodiments, the identification module 240 includes a second determination unit 241, a removal unit 242, and a coordinate conversion unit 243. The second determining unit 241 may be configured to determine a broken region, in which a broken distance is greater than a predetermined length, in the edge boundary as an initial abnormal region; the removing unit 242 may be configured to remove the false abnormal region in the initial abnormal region according to the first image and the edge boundary to obtain a real defect region; the coordinate transformation unit 243 may be configured to perform coordinate transformation on the coordinates of the real defect region to determine the position of the defect at the edge. That is, the second determining unit 241, the removing unit 242, and the coordinate transforming unit 243 may be respectively used to implement step 041, step 042, and step 043.
In some embodiments, the processor 10 may be further configured to determine a broken region in the edge boundary, where the broken distance is greater than a predetermined length, as an initial abnormal region; removing the false abnormal region in the initial abnormal region according to the first image and the edge boundary to obtain a real defect region; and carrying out coordinate conversion on the coordinates of the real defect area so as to determine the position of the defect at the edge. That is, processor 10 may also be used to implement step 041, step 042 and step 043.
Specifically, the reconstructed edge boundary is obtained in step 03. Whether an interrupted region exists in the edge boundary is identified, the break distance of each interrupted region is acquired, and an interrupted region whose break distance is greater than or equal to a predetermined length is determined as an initial abnormal region. The predetermined length may be a preset length value, or an empirical value obtained from repeated tests. A region whose break distance is less than the predetermined length can be regarded as normal rather than defective, while a region whose break distance is greater than or equal to the predetermined length is regarded as an initial abnormal region; it is understood that the initial abnormal regions include both normal regions (i.e., false abnormal regions) and defective regions (i.e., real defect regions).
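If the reconstructed boundary is represented as at most one point per image row, a break shows up as a run of missing rows, and the screening above reduces to a gap search. A minimal sketch under that assumption (the threshold value and function name are illustrative):

```python
def initial_abnormal_regions(boundary_points, predetermined_length=5):
    """Find breaks in a boundary whose break distance reaches the predetermined length."""
    rows = sorted(y for _, y in boundary_points)
    regions = []
    for prev, cur in zip(rows, rows[1:]):
        if cur - prev >= predetermined_length:   # break distance >= predetermined length
            regions.append((prev, cur))          # initial abnormal region (row span)
    return regions
```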
Further, the false abnormal regions among the initial abnormal regions need to be determined and removed, so that they do not interfere with the detection result; the detected defects are then real defects, and the detection accuracy for the wafer edge is high. Because the first image contains the image information of the wafer edge, the edge contour in the first image is relatively accurate, so the false abnormal regions and the real abnormal regions among the initial abnormal regions can be distinguished by combining the first image with the edge boundary, and the false abnormal regions can then be removed.
For example, it is judged whether the initial abnormal region is complete in the first image. If the region of the first image corresponding to the initial abnormal region is complete and not broken, the initial abnormal region is considered a false abnormal region rather than a defect on the edge; if the corresponding region of the first image is broken, the initial abnormal region can be considered a real defect region.
Further, the position coordinates of the real defect region in the third image are acquired; these may include the coordinates of a plurality of vertices or the coordinates of the contour line of the real defect region. The position coordinates of the real defect region in the third image are then converted into the original edge image according to the relationships between the third image and the second image, between the second image and the first image, and between the first image and the original edge image of the wafer, and the region is marked in the original edge image, for example in a prominent color (e.g., red, yellow, or black), so that the specific position of the defect on the edge can be identified visually.
For example, if the third image has the same size and resolution as the second image, the second image the same as the first image, and the first image the same as the original edge image of the wafer, then the third image has the same size and resolution as the original edge image; the coordinates of the pixel points in the third image then coincide with those in the original edge image, and the coordinates of the real defect region are directly the coordinates in the original image.
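The coordinate conversion of step 043 is then a per-axis scaling between image sizes; when all the images share size and resolution, the scale factors are 1 and the coordinates carry over unchanged, as noted above. A sketch with an illustrative function name:

```python
def to_original_coords(point, third_shape, original_shape):
    """Map a pixel coordinate from the third image into the original edge image."""
    x, y = point
    sx = original_shape[1] / third_shape[1]   # column scale factor
    sy = original_shape[0] / third_shape[0]   # row scale factor
    return int(round(x * sx)), int(round(y * sy))
```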
Referring to fig. 27 and 28, in some embodiments, step 042 includes the following steps:
0421: judging whether an area corresponding to the initial abnormal area in the first image meets a third preset condition or not;
0422: if so, determining the initial abnormal region as a false abnormal region; and
0423: and if not, determining the initial abnormal area as a real defect area.
In some embodiments, the removal unit 242 includes a determination subunit 2421, a first determination subunit 2422, and a second determination subunit 2423. The judging subunit 2421 may be configured to judge whether a region corresponding to the initial abnormal region in the first image satisfies a third preset condition; the first determining subunit 2422 may be configured to determine the initial abnormal region as the false abnormal region when the output result of the determining subunit is yes; the second determining subunit 2423 may be configured to determine the initial abnormal region as the real defective region when the output result of the judging subunit is no. That is, the judging subunit 2421 may be used to implement step 0421, the first determining subunit 2422 may be used to implement step 0422, and the second determining subunit 2423 may be used to implement step 0423.
In some embodiments, the processor 10 may be configured to determine whether a region corresponding to the initial abnormal region in the first image satisfies a third preset condition; if so, determining the initial abnormal region as a false abnormal region; and if not, determining the initial abnormal area as a real defect area. That is, processor 10 may be used to implement steps 0421, 0422 and 0423.
Specifically, the regions in the first image corresponding to the initial abnormal regions are determined, and it is judged whether those regions satisfy a third preset condition. The regions satisfying the third preset condition are determined as false abnormal regions and removed. The regions not satisfying the third preset condition are determined as real defect regions and marked in the third image. By removing the false abnormal regions, the identified edge defects are more accurate, improving the accuracy of wafer edge defect detection.
The third preset condition may be that the region is a normal region of the wafer edge in the first image; that the region is at a designated position in the first image; that the region is of a designated color in the first image; or that the proportion of the designated color within the region in the first image is greater than a preset percentage. Of course, the third preset condition may be other conditions, which are not listed here.
In one embodiment, the first image includes the binarized image obtained in step 013, and the third preset condition is that the region is of the second color. The initial abnormal regions include regions A1, A2, A3 and A4, which correspond to regions B1, B2, B3 and B4 in the binarized image, respectively. In the binarized image, regions B1 and B4 are of the first color and regions B2 and B3 are of the second color, so regions B2 and B3 meet the third preset condition while regions B1 and B4 do not; regions A2 and A3 are therefore false abnormal regions, and regions A1 and A4 are real defect regions. The coordinates of regions A1 and A4 are converted into the original image of the edge, so that the real defect positions of the wafer edge can be identified.
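The check of the third preset condition on the binarized image could be sketched as below, folding in the "proportion of the designated color greater than a preset percentage" variant mentioned above; the region format, color value, and ratio are illustrative assumptions.

```python
import numpy as np

def is_false_abnormal(binarized, region, second_color=255, min_ratio=0.9):
    """Return True if the corresponding region is (mostly) of the second color."""
    y0, y1, x0, x1 = region                       # bounding box in the binarized image
    patch = binarized[y0:y1, x0:x1]
    ratio = float(np.mean(patch == second_color))
    return ratio >= min_ratio                     # satisfies the third preset condition
```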
Referring to fig. 1 and fig. 2 again, the memory 20 is used for storing a computer program that can be executed on the processor 10, and the processor 10 executes the computer program to implement the defect detection method according to any of the above embodiments.
The memory 20 may comprise high-speed RAM memory, and may also include non-volatile memory, such as at least one disk memory. Further, the detection device may further include a communication interface 30 for communication between the memory 20 and the processor 10.
If the memory 20, the processor 10 and the communication interface 30 are implemented independently, the communication interface 30, the memory 20 and the processor 10 may be connected to one another through a bus and communicate with one another. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 2, but this does not mean that there is only one bus or one type of bus.
Optionally, in a specific implementation, if the memory 20, the processor 10, and the communication interface 30 are integrated on a chip, the memory 20, the processor 10, and the communication interface 30 may complete communication with each other through an internal interface.
The processor 10 may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more Integrated circuits configured to implement embodiments of the present Application.
Referring to fig. 29, a non-transitory computer readable storage medium 300 according to an embodiment of the present application includes a computer program 301, and when the computer program 301 is executed by one or more processors 400, the processor 400 executes the defect detection method according to any embodiment of the present application.
For example, referring to fig. 1, when the computer program 301 is executed by the processor 400, the processor 400 is configured to perform the following steps:
01: processing the image information of the edge to create a first image;
02: establishing a second image according to a straight line of which the inclination angle is within a preset angle range in the contour line of the first image;
03: according to the first image and the second image, reconstructing the edge image into a third image, wherein the third image comprises a reconstructed edge boundary; and
04: and identifying the position of the defect of the edge according to the edge boundary and the first image.
For another example, referring to fig. 16, when the computer program 301 is executed by the processor 400, the processor 400 is configured to perform the following steps:
0321: scanning the second image line by line along a first predetermined direction;
0322: setting the identifier to be a first value when the point in the second image is scanned to meet a first preset condition;
0323: scanning the first image line by line along a first preset direction to obtain a first characteristic point; and
0324: and reconstructing an edge boundary according to the first characteristic point.
Any process or method descriptions in flowcharts or otherwise described herein may be understood as representing modules, segments, or portions of code that include one or more executable instructions for implementing the steps of a custom logic function or process. The scope of the preferred embodiments of the present application includes alternate implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order depending on the functionality involved, as would be understood by those reasonably skilled in the art to which the present application pertains.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium. The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
In the description herein, reference to the description of the terms "certain embodiments," "one embodiment," "some embodiments," "illustrative embodiments," "examples," "specific examples," or "some examples" means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "a plurality" means at least two, e.g., two or three, unless specifically limited otherwise.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations of the above embodiments may be made by those of ordinary skill in the art within the scope of the present application, which is defined by the claims and their equivalents.

Claims (17)

1. A defect detection method for detecting the edge of a wafer is characterized by comprising the following steps:
processing the image information of the edge to create a first image;
establishing a second image according to a straight line of which the inclination angle is within a preset angle range in the contour line of the first image;
reconstructing the image of the edge as a third image according to the first image and the second image, wherein the third image comprises a reconstructed edge boundary; and
and identifying the position of the defect of the edge according to the edge boundary and the first image.
2. The defect detection method of claim 1, wherein the processing the image information of the edge to create a first image comprises:
carrying out binarization processing on the image information of the edge to obtain a binarized image;
carrying out edge detection on the binary image to obtain a contour line of the edge; and
and establishing the first image according to the contour line of the edge.
3. The defect detection method according to claim 2, wherein said binarizing the image information of the edge to obtain a binarized image comprises:
acquiring the pixel value of each pixel point in the edge image;
establishing a histogram according to the number of pixel points corresponding to the pixel values, wherein the histogram comprises the pixel values and the number of the pixel points corresponding to the pixel values;
acquiring a standard pixel value according to the distribution condition of the pixel values in the histogram; and
and carrying out binarization processing on the image of the edge according to the standard pixel value to obtain the binarized image.
4. The defect detection method according to claim 3, wherein said binarizing the image of the edge according to the standard pixel value to obtain the binarized image comprises:
increasing the standard pixel value by a predetermined variable to obtain a binary pixel threshold value; and
and determining pixel points of the pixel values within the binarization pixel threshold value as a first color, and determining pixel points of the pixel values outside the binarization pixel threshold value as a second color to obtain the binarization image.
5. The defect detection method according to claim 2, wherein said processing the image information of the edge to create a first image before said binarizing the image information of the edge to obtain a binarized image, further comprises:
acquiring a received initial image of the edge; and
and adjusting the initial image to the preset inclination angle to obtain the image information.
6. The method of claim 1, wherein the creating a second image according to a straight line in the contour line of the first image, the straight line having an inclination angle within a predetermined angle range, comprises:
performing straight line detection on the first image to obtain a straight line in the first image;
determining the inclination angle of each straight line according to the result of the straight line detection;
screening a straight line of the inclination angle within the preset angle range to serve as a foreground straight line; and
and establishing the second image according to the foreground straight line.
7. The method of claim 1, wherein the reconstructing the image of the edge as a third image according to the first image and the second image, the third image comprising a reconstructed edge boundary, comprises:
establishing an initial reconstruction image;
analyzing the first image and the second image to obtain the reconstructed edge boundary; and
marking the reconstructed edge boundary in the initial reconstructed image to obtain the third image.
8. The method of claim 7, wherein analyzing the first image and the second image to obtain the reconstructed edge boundary comprises:
scanning the second image line by line along a first preset direction;
setting an identifier to be a first value when the point in the second image is scanned to meet a first preset condition;
scanning the first image line by line along the first preset direction to obtain a first characteristic point; and
and reconstructing the edge boundary line according to the first characteristic point.
9. The defect detection method of claim 8, wherein said scanning the first image along the first predetermined direction to obtain a first feature point comprises:
judging whether a first point in the first image, which corresponds to a point in the second image meeting a first preset condition, meets a second preset condition;
if yes, scanning the first image from the first point along the first preset direction; and
resetting the identifier to a second value until a second point in the first image which does not meet the second preset condition is scanned;
wherein, along the first predetermined direction, a point between the first point and the second point in the first image is the first feature point.
10. The defect detection method of claim 9, wherein said scanning the first image in the first predetermined direction to obtain a first feature point after said resetting the identifier to the second value, further comprises:
starting to scan the second image along the first predetermined direction from a point in the second image corresponding to the second point;
setting the identifier to a first value when a point in the second image satisfying the first preset condition is scanned;
scanning the first image along the first predetermined direction;
resetting the identifier to a second value when a point in the first image that does not meet the second preset condition is scanned; and
and repeating the steps of scanning the first image, setting the identifier, scanning the second image and resetting the identifier until the end point of the current line of the first image is scanned.
11. The method of claim 8, wherein after the reconstructing the edge boundary, the analyzing the first image and the second image to obtain the reconstructed edge boundary further comprises:
scanning the second image line by line along a second predetermined direction, the second predetermined direction being opposite to the first predetermined direction;
when the point in the second image is scanned to meet the first preset condition, setting the identifier as the first value;
scanning the first image line by line along the second preset direction to obtain a second characteristic point; and
and updating the reconstructed edge boundary according to the second characteristic point.
12. The defect detection method of claim 7, wherein said marking the reconstructed edge boundary in the initial reconstructed image to obtain the third image comprises:
and marking the point corresponding to the boundary of the edge in the initial reconstruction image as a characteristic color to obtain the third image.
13. The method of claim 1, wherein the identifying the location of the defect at the edge based on the edge boundary and the first image comprises:
determining a broken area with a breaking distance of the edge boundary being greater than a preset length as an initial abnormal area;
removing false abnormal regions in the initial abnormal region according to the first image and the edge boundary to obtain a real defect region; and
and performing coordinate transformation on the coordinates of the real defect area to determine the position of the defect on the edge.
14. The method of claim 13, wherein the removing false abnormal regions from the initial abnormal region according to the first image and the edge boundary to obtain real defect regions comprises:
judging whether an area corresponding to the initial abnormal area in the first image meets a third preset condition or not;
if so, determining the initial abnormal region as the false abnormal region; and
if not, determining the initial abnormal area as the real defect area.
15. A defect inspection apparatus for inspecting an edge of a wafer, the apparatus comprising:
the processing module is used for processing the image information of the edge to establish a first image;
the establishing module is used for establishing a second image according to a straight line of which the inclination angle is within a preset angle range in the contour line of the first image;
a reconstruction module, configured to reconstruct an image of the edge as a third image according to the first image and the second image, where the third image includes a reconstructed edge boundary;
and the identification module is used for identifying the position of the defect of the edge according to the edge boundary and the first image.
16. An inspection apparatus for inspecting an edge of a wafer, the apparatus comprising:
one or more processors, memory; and
one or more programs, wherein the one or more programs are stored in the memory and executed by the one or more processors, the programs comprising instructions for performing the defect detection method of any of claims 1 to 14.
17. A non-transitory computer readable storage medium containing a computer program which, when executed by one or more processors, causes the processors to perform the defect detection method of any one of claims 1 to 14.
CN202011275270.5A 2020-11-16 2020-11-16 Defect detection method and device, detection equipment and readable storage medium Active CN112070766B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011275270.5A CN112070766B (en) 2020-11-16 2020-11-16 Defect detection method and device, detection equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011275270.5A CN112070766B (en) 2020-11-16 2020-11-16 Defect detection method and device, detection equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN112070766A (en) 2020-12-11
CN112070766B (en) 2021-02-19

Family

ID=73655474

Country Status (1)

Country Link
CN (1) CN112070766B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07167793A * 1993-12-16 1995-07-04 Hitachi Ltd Phase difference semiconductor inspection device and its production method
CN109166109A * 2018-08-14 2019-01-08 Zhuhai Gree Intelligent Equipment Co., Ltd. Defect detection method, device, storage medium and processor
CN109211937A * 2018-08-28 2019-01-15 Xi'an Polytechnic University Detection system and method for curved-strip defects in underwear elastic webbing
CN110286126A * 2019-06-17 2019-09-27 Zhejiang University Partitioned-region detection method for wafer surface defects based on visual images
CN110767564A * 2019-10-28 2020-02-07 Jiangsu Normal University Semiconductor Materials and Equipment Research Institute (Pizhou) Co., Ltd. Wafer detection method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GUO, Yiqiang: "Research on Visual Inspection of Wafer Surface Defects", China Master's Theses Full-text Database, Information Science and Technology series *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114913112A * 2021-02-08 2022-08-16 Dongfang Jingyuan Microelectronics Technology (Beijing) Co., Ltd. Method, device and equipment for detecting double edges of wafer
CN113052822A * 2021-03-26 2021-06-29 Shenzhen Zhongke Feice Technology Co., Ltd. Adjustment method, detection device, and computer-readable storage medium
CN113052822B * 2021-03-26 2024-06-04 Shenzhen Zhongke Feice Technology Co., Ltd. Adjustment method, detection apparatus, and computer-readable storage medium
CN113533375A * 2021-08-26 2021-10-22 Huizhou Techuang Electronic Technology Co., Ltd. Forward and reverse scanning modeling detection method for printed circuit board
CN116428985A * 2023-06-13 2023-07-14 Jiangsu Jingchuang Advanced Electronic Technology Co., Ltd. Edge coordinate acquisition method, wafer alignment identification method and wafer circular cutting method
CN116428985B * 2023-06-13 2023-08-29 Jiangsu Jingchuang Advanced Electronic Technology Co., Ltd. Edge coordinate acquisition method, wafer alignment identification method and wafer circular cutting method
CN117036350A * 2023-10-08 2023-11-10 Baoding Laifu Automotive Lighting Group Cangzhou Co., Ltd. Defect detection method, device, terminal and storage medium for metal lamp holder welding mud
CN117036350B * 2023-10-08 2023-12-15 Baoding Laifu Automotive Lighting Group Cangzhou Co., Ltd. Defect detection method, device, terminal and storage medium for metal lamp holder welding mud

Similar Documents

Publication Publication Date Title
CN112070766B (en) Defect detection method and device, detection equipment and readable storage medium
US8027527B2 (en) Method for analyzing defect data and inspection apparatus and review system
CN105139386B Image processing method for fast automatic detection of defective solder joints on electric connectors
JP7271873B2 (en) Inspection device, image forming device and inspection method
CN115020267B (en) Semiconductor surface defect detection method
JPH11328408A (en) Device for processing data and information storage medium
CN111080661A (en) Image-based line detection method and device and electronic equipment
US20060067569A1 (en) Image inspection device, image inspection method, and image inspection program
CN114693610A (en) Welding seam surface defect detection method, equipment and medium based on machine vision
CN116485779B (en) Adaptive wafer defect detection method and device, electronic equipment and storage medium
CN113785181A (en) OLED screen point defect judgment method and device, storage medium and electronic equipment
KR102242996B1 Method for detecting atypical defects in automobile injection-molded products
CN117094975A (en) Method and device for detecting surface defects of steel and electronic equipment
CN116843626A (en) Machine vision steel wire rope surface defect detection method based on multi-feature fusion
CN113902697A (en) Defect detection method and related device
CN114359251A (en) Automatic identification method for concrete surface damage
CN113935927A (en) Detection method, device and storage medium
CN113379726A (en) Line detection method, device, equipment and computer readable storage medium
CN117495856A (en) Wafer surface detection method, device, equipment and medium based on deep learning
CN112213314B (en) Detection method and detection system for wafer side surface defects
JPH06207909A (en) Inspection apparatus for surface defect
CN116563298A (en) Cross line center sub-pixel detection method based on Gaussian fitting
CN113643245B (en) Method and device for measuring screen defects and computer readable storage medium
CN115619813A (en) SEM image foreground extraction method and device, computer equipment and storage medium
JPH0718811B2 (en) Defect inspection method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 518110 101, 201, 301, No.2, Shanghenglang fourth industrial zone, Tongsheng community, Dalang street, Longhua District, Shenzhen City, Guangdong Province

Patentee after: Shenzhen Zhongke feice Technology Co.,Ltd.

Address before: 518110 101, 201, 301, No.2, Shanghenglang fourth industrial zone, Tongsheng community, Dalang street, Longhua District, Shenzhen City, Guangdong Province

Patentee before: Shenzhen Zhongke Flying Test Technology Co.,Ltd.