CN116804633A - Detection method and detection system - Google Patents

Detection method and detection system

Info

Publication number
CN116804633A
Authority
CN
China
Prior art keywords
detected
light
imaging
target
measured
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210263513.6A
Other languages
Chinese (zh)
Inventor
陈鲁 (Chen Lu)
刘健鹏 (Liu Jianpeng)
顾玥 (Gu Yue)
张鹏斌 (Zhang Pengbin)
张嵩 (Zhang Song)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Zhongke Feice Technology Co Ltd
Original Assignee
Shenzhen Zhongke Feice Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Zhongke Feice Technology Co Ltd
Priority to CN202210263513.6A
Publication of CN116804633A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G01N 21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N 21/95 Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N 21/9501 Semiconductor wafers
    • G01N 2021/8887 Scan or image signal processing specially adapted therefor based on image processing techniques
    • G01N 2201/00 Features of devices classified in G01N21/00
    • G01N 2201/06 Illumination; Optics
    • G01N 2201/061 Sources

Landscapes

  • Chemical & Material Sciences (AREA)
  • Biochemistry (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Immunology (AREA)
  • Analytical Chemistry (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A detection method and a detection system. The method comprises: acquiring a region to be detected of a target to be detected, wherein the region to be detected comprises the object center position of the target to be detected; performing first detection on the target to be detected through a first detection module to acquire the height of the target to be detected along the direction perpendicular to the surface of the object to be detected, wherein the first detection module comprises an image collector, and the image collector comprises a target surface for collecting images. The first detection comprises: imaging the target to be detected through the first detection module, and acquiring the spot position of an imaging spot of the region to be detected of the target to be detected on the target surface of the image collector; and acquiring the height of the target to be detected along the direction perpendicular to the surface of the object to be detected according to the spot position. The invention is beneficial to obtaining a high-precision detection result.

Description

Detection method and detection system
Technical Field
The embodiment of the invention relates to the field of optical detection, in particular to a detection method and a detection system.
Background
With the rapid development of integrated circuit manufacturing technology, advanced packaging forms such as 2.5D/3D integration and wafer-level packaging have become the main direction of packaging technology development.
With the high-density development of integrated circuit manufacturing, package sizes keep shrinking and interconnection density keeps increasing; the size and spacing of the bumps used to connect chips in an integrated circuit are becoming smaller and smaller, and the problem of interconnection short circuits caused by solder deformation is becoming more prominent, so the need for three-dimensional defect detection of chip bump coplanarity is increasingly urgent.
Currently, optical detection methods are generally adopted for three-dimensional defect detection.
Disclosure of Invention
Embodiments of the invention provide a detection method and a detection system that are beneficial to obtaining a high-precision detection result.
In order to solve the above problems, an embodiment of the present invention provides a detection method, including: acquiring a region to be detected of the target to be detected, wherein the region to be detected comprises the object center position of the target to be detected; performing first detection on the target to be detected through a first detection module, and acquiring the height of the target to be detected along the direction perpendicular to the surface of the object to be detected, wherein the first detection module comprises an image collector; the first detection includes: imaging the target to be detected through a first detection module, and obtaining the spot position of an imaging spot of a region to be detected of the target to be detected on the target surface of the image collector; and acquiring the height of the target to be detected along the direction perpendicular to the surface of the object to be detected according to the light spot position.
The embodiment of the invention also provides a detection system, which comprises a first detection module, wherein the first detection module comprises an illumination module, an imaging module and a processing module. The illumination module is used for generating incident light directed at the object to be detected; the incident light irradiates the object to be detected to generate an illumination spot and is reflected by the object to be detected to form detection light. The imaging module comprises an image acquisition component, which is used for receiving the detection light and obtaining imaging information of the target to be detected according to the detection light, the imaging information comprising an imaging spot formed by the target to be detected in the image acquisition component. The processing module is used for processing the imaging information to obtain the position of the imaging point of the target to be detected, and for obtaining the height of the target to be detected along the direction perpendicular to the surface of the object to be detected according to the position of the imaging point of the target to be detected.
Compared with the prior art, the technical scheme of the embodiment of the invention has the following advantages:
in the detection method provided by the embodiment of the invention, the first detection is performed on the target to be detected and the three-dimensional information of the target to be detected is obtained from the imaging light spots, which greatly facilitates obtaining the three-dimensional information of the target to be detected.
In the detection system provided by the embodiment of the invention, the target to be detected is detected and its three-dimensional information is obtained from the imaging light spots, which greatly facilitates obtaining the three-dimensional information of the target to be detected and is favorable for obtaining a detection result of high stability and high precision.
Drawings
FIG. 1 is a flow chart of an embodiment of the detection method of the present invention;
FIG. 2 is a schematic diagram of an embodiment of a detection system and an optical path diagram according to the present invention;
FIG. 3 is an enlarged view of a portion of any one of the bumps under test of FIG. 2;
FIG. 4 is a top view of an embodiment of the detection system of the present invention.
Detailed Description
As described in the background, optical detection is a commonly used technology for detecting a target to be detected on an object to be detected. However, when conventional optical detection methods are used, the accuracy of the detection result still needs to be improved.
In order to solve the technical problem, an embodiment of the present invention provides a detection method, including: acquiring a region to be detected of the target to be detected, wherein the region to be detected comprises the object center position of the target to be detected; performing first detection on the target to be detected through a first detection module, and acquiring the height of the target to be detected along the direction perpendicular to the surface of the object to be detected, wherein the first detection module comprises an image collector; the first detection includes: imaging the target to be detected through a first detection module, and obtaining the spot position of an imaging spot of a region to be detected of the target to be detected on the target surface of the image collector; and acquiring the height of the target to be detected along the direction perpendicular to the surface of the object to be detected according to the light spot position.
In the detection method provided by the embodiment of the invention, the first detection is performed on the target to be detected and the three-dimensional information of the target to be detected is obtained from the imaging light spots, which greatly facilitates obtaining the three-dimensional information of the target to be detected.
In order that the above objects, features and advantages of embodiments of the invention may be readily understood, a more particular description of the invention will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings.
Referring to fig. 1, fig. 1 is a flowchart of an embodiment of the detection method of the present invention.
In this embodiment, the detection method includes the following steps:
step S1: acquiring a region to be detected of the target to be detected, wherein the region to be detected comprises the object center position of the target to be detected;
step S2: performing first detection on the target to be detected through a first detection module, and acquiring the height of the target to be detected along the direction perpendicular to the surface of the object to be detected, wherein the first detection module comprises an image collector;
step S21: the first detection includes: imaging the target to be detected through a first detection module to obtain the spot position of an imaging spot of a region to be detected of the target to be detected on the target surface of the image collector;
step S22: and acquiring the height of the target to be detected along the direction perpendicular to the surface of the object to be detected according to the light spot position.
In the detection method provided by the embodiment of the invention, the first detection is performed on the target to be detected and the three-dimensional information of the target to be detected is obtained from the imaging light spots, which greatly facilitates obtaining the three-dimensional information of the target to be detected.
The individual steps of the detection method are described below in connection with the detection system.
Referring to fig. 2 to 4 in combination, fig. 2 is a schematic structural diagram and optical path diagram of an embodiment of the detection system of the present invention, fig. 3 is a partial enlarged view of any bump to be detected in fig. 2, and fig. 4 is a top view of an embodiment of the detection system of the present invention. The detection method of this embodiment is described in detail below with reference to these figures.
Step S1 is executed to obtain a to-be-measured area 810c of the to-be-measured object, where the to-be-measured area 810c includes the object center position of the to-be-measured object.
In this embodiment, the target to be measured is the bump 101 to be measured.
Specifically, as an example, the object to be detected is a wafer 100, the target to be detected is a bump 101 to be detected on the surface of the wafer 100, and the detection device of this embodiment is used for detecting three-dimensional coplanarity defects of the bumps on the surface of the wafer 100.
The region to be measured 810c is the region of the target to be measured that needs to be detected; the region to be measured 810c may be the entire surface of the target to be measured or a partial region of its surface.
In this embodiment, the region to be measured 810c is a circular region of preset radius centered on the object center position; the preset radius is smaller than the minimum dimension of the target to be measured in the plane parallel to the surface of the object to be measured.
The minimum dimension of the target to be measured in the plane parallel to the surface of the object to be measured refers to the minimum radius of the target to be measured in that plane.
Making the region to be measured 810c a circular region of preset radius centered on the object center position helps keep the light spots corresponding to the region to be measured 810c relatively uniform, and making the preset radius smaller than the minimum dimension of the target to be measured in the plane parallel to the surface of the object to be measured ensures that every target to be measured can contain its region to be measured 810c.
In this embodiment, the projection of the target to be measured on the surface of the object to be measured is circular, and the radius of the region to be measured 810c is less than or equal to 1/10 of the radius of the target to be measured.
It should be noted that the ratio of the radius of the region to be measured 810c to the radius of the target to be measured should not be too large. If this ratio is too large, the range of imaging light spots that subsequently has to be processed for the region to be measured 810c of the target remains large, so it is difficult to reduce the processing load, save computing power, and improve the detection efficiency of the detection system. For this reason, in this embodiment, the radius of the region to be measured 810c is less than or equal to 1/10 of the radius of the target to be measured.
In this embodiment, since the target to be measured is the bump 101 to be measured, the region to be measured 810c of the target is the region containing the maximum height of the bump 101 to be measured and is used to obtain the height of the bump 101 to be measured. Subsequently, only the imaging light spots of the region to be measured 810c of the bump 101 to be measured are processed, which reduces the range and amount of imaging light spots to be processed, saves computing power, and helps improve the detection efficiency of the detection system.
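As an illustration only (not part of the patent; the function and parameter names below are hypothetical, and Python with NumPy is assumed), the following sketch selects such a circular region to be measured around a bump center so that only its pixels are processed later:

    import numpy as np

    def region_to_be_measured(image, center_xy, target_radius_px, ratio=0.1):
        # Boolean mask selecting a circular region of preset radius around the
        # object center position; the preset radius is taken as ratio * target
        # radius (<= 1/10 of the target radius, as in this embodiment).
        preset_radius = ratio * target_radius_px
        h, w = image.shape
        yy, xx = np.mgrid[0:h, 0:w]
        cx, cy = center_xy
        return (xx - cx) ** 2 + (yy - cy) ** 2 <= preset_radius ** 2

    # Example: a 200 x 200 image of one bump whose center was found by pre-scanning.
    img = np.zeros((200, 200))
    roi = region_to_be_measured(img, center_xy=(100, 100), target_radius_px=80)
    print(int(roi.sum()), "pixels are processed instead of", img.size)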
Specifically, in this embodiment, a second detection module pre-scans the object to be detected to obtain the object center position of the target to be detected; the region to be measured 810c is then obtained according to the object center position, and the region to be measured 810c contains the object center position of the target to be measured.
In this embodiment, obtaining the region to be measured 810c according to the object center position allows the region to be measured 810c to be distributed more uniformly around the object center position. Specifically, a radius is set for the region to be measured 810c, and the area within that radius, centered on the object center position, is taken as the region to be measured 810c.
As an example, the second detection module is an imaging device, and pre-scanning the object to be detected includes: imaging the object to be detected to obtain an image of the object to be detected, and obtaining the object center position of the target to be detected according to the image.
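The embodiment does not spell out how the object center position is extracted from the pre-scan image; one simple possibility, sketched below under the assumption that the bump appears brighter than its surroundings (NumPy assumed, names hypothetical), is to take the intensity-weighted centroid of the above-threshold pixels:

    import numpy as np

    def object_center_from_image(image):
        # Intensity-weighted centroid of the bright bump region, returned as
        # (x, y) in pixel coordinates; the threshold suppresses the background.
        img = np.asarray(image, dtype=float)
        thr = img.min() + 0.5 * (img.max() - img.min())
        w = np.where(img > thr, img - thr, 0.0)
        yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
        return float((xx * w).sum() / w.sum()), float((yy * w).sum() / w.sum())

    # Example: a synthetic bump image centered near (30, 20).
    yy, xx = np.mgrid[0:64, 0:64]
    img = np.exp(-(((xx - 30) / 5.0) ** 2 + ((yy - 20) / 5.0) ** 2))
    print([round(c, 1) for c in object_center_from_image(img)])  # about [30.0, 20.0]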
After the region to be measured 810c of the target to be detected is obtained and before the first detection is performed on the target to be detected by the first detection module, the method further includes: performing focusing processing on the center of the target to be detected according to the object center position of the target to be detected.
Since the target to be measured is the bump 101 to be measured, its surface is arc-shaped and the incident light is reflected at different angles at positions of different heights on the target, so the imaging sharpness differs between positions of different heights. Because this embodiment needs to obtain the height at the object center position of the target, the imaging of that position should be as clear as possible; therefore, before the first detection module performs the first detection on the target to be detected, focusing processing is performed on the center of the target according to the object center position of the target to be detected.
Specifically, in this embodiment, the first detection module moves relative to the object to be detected during detection. When, according to the object center position, the center of the target to be detected is determined to lie within the detection field of the first detection module, the distance between the first detection module and the object to be detected along the optical axis of the first detection module is adjusted; when the first detection module images the target to be detected clearly, the target to be detected is detected.
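The embodiment does not specify how image clarity is judged during focusing; a common approach, sketched below purely as an assumption (the capture function and distances are hypothetical), is to step the distance along the optical axis and keep the position with the highest sharpness score:

    import numpy as np

    def sharpness(image):
        # Mean squared gradient: larger values indicate a sharper image.
        gy, gx = np.gradient(np.asarray(image, dtype=float))
        return float(np.mean(gx ** 2 + gy ** 2))

    def focus_on_center(capture_at, z_positions):
        # capture_at(z) is assumed to set the module-to-object distance to z
        # and return an image of the region around the bump center; the z with
        # the sharpest image is returned as the focus position.
        return max(z_positions, key=lambda z: sharpness(capture_at(z)))

    # Toy demonstration: synthetic images whose contrast peaks at z = 0.3 (arbitrary units).
    def fake_capture(z, z_best=0.3):
        x = np.linspace(0.0, 2.0 * np.pi, 64)
        contrast = np.exp(-(((z - z_best) / 0.1) ** 2))
        return contrast * np.tile(np.sin(5.0 * x), (64, 1))

    print(focus_on_center(fake_capture, np.linspace(0.0, 0.6, 13)))  # about 0.3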
Step S2 is executed, in which a first detection module is used to perform a first detection on the target to be detected, so as to obtain the height of the target to be detected along the direction perpendicular to the surface of the object to be detected, where the first detection module includes an image collector 810.
In the embodiment of the invention, the first detection is performed on the target to be detected and the three-dimensional information of the target to be detected is obtained from the imaging light spot, which greatly facilitates obtaining the three-dimensional information of the target to be detected.
In this embodiment, imaging the target to be detected through the first detection module includes performing step S21: the target to be detected is imaged through the first detection module, and the spot position of the imaging spot of the region to be measured 810c of the target to be detected on the target surface 80 of the image collector 810 is obtained.
Specifically, during the first detection, the first detection module generates incident light focused on the object to be detected; the incident light irradiates the object to be detected to generate an illumination light spot 800s and is reflected by the object to be detected to form detection light. At least part of the detection light is collected by the image collector 810, and imaging information of the target to be measured is obtained from the collected detection light; the imaging information includes the spot position of the imaging spot of the region to be measured 810c of the target to be measured.
In this embodiment, the incident light irradiates the object to be measured to generate the illumination spot 800s, and the illumination spot 800s is linear.
A linear light spot 800s is generated on the object to be detected, so that the detection device can be used for completing linear scanning on the object to be detected.
When the object to be detected is scanned with linear light spots 800s, the edges of adjacent linear light spots 800s can just abut or partially overlap along the scanning direction. Completing the line scan with linear light spots 800s helps all the linear light spots 800s completely cover the object to be detected while making full use of each linear light spot 800s, so the scan of the object to be detected can be completed with a small number of linear light spots 800s, which helps improve scanning efficiency. Specifically, the detection device of this embodiment is used for measuring the microscopic three-dimensional morphology, namely the height, of the bump 101 to be measured.
In this embodiment, the incident angle α of the incident light is smaller than 45 degrees or larger than 45 degrees.
In this embodiment, an incident angle smaller than 45 degrees allows the incident angle of the incident light to be adjusted flexibly and reduces the probability that the target to be detected on the object to be detected is shadowed, so the target to be detected receives fuller illumination, which helps the imaging module image the target to be detected more accurately. An incident angle larger than 45 degrees likewise allows flexible adjustment of the incident angle while increasing the space above the object to be detected, making it easier to install other detection devices.
In this embodiment, the incident angle of the incident light is 25 degrees to 35 degrees.
The incident light undergoes specular reflection at the surface of the object to be detected, so the incident angle of the incident light is equal to the reflection angle of the detection light. An incident angle of 25 degrees to 35 degrees allows the target to be detected to be illuminated more fully while making it easier to arrange the relative positions of the illumination module and the imaging module, and also helps the imaging module receive the detection light completely, which facilitates more accurate imaging in the imaging module.
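The patent does not state a quantitative relation between a height change of the target and the resulting movement of the imaging spot. Purely as an illustrative assumption, for the specular triangulation geometry described here (incident angle α measured from the surface normal, target surface 80 perpendicular to the detection light, and M denoting the imaging magnification of the detection path, a symbol introduced here for illustration), the commonly used relation would be

    \[ s \;=\; 2\,M\,\Delta z\,\sin\alpha \qquad\Longleftrightarrow\qquad \Delta z \;=\; \frac{s}{2\,M\,\sin\alpha}, \]

where Δz is the height change of the reflection point on the target to be measured and s is the corresponding shift of the imaging spot on the target surface 80; for α between 25 and 35 degrees the geometric factor 2 sin α lies between roughly 0.85 and 1.15.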
Referring to fig. 3 in combination, in this embodiment, taking an incident angle α of the incident light smaller than 45 degrees as an example, fig. 3 shows a partial enlarged view of any bump to be measured. For convenience of explanation, two bumps 101 of different heights are overlaid in fig. 3: the first bump to be measured 101a is drawn as a solid black bump and the second bump to be measured 101b as a dashed outline. When the incident light irradiates the first bump to be measured 101a, it is reflected at point P to form probe light (the optical path of the probe light reflected at point P is indicated by a dotted line), and the imaging spot on the target surface 80 is at point P'; when the incident light irradiates the second bump to be measured 101b, it is reflected at point Q to form probe light (the optical path of the probe light reflected at point Q is indicated by a solid line), and the imaging spot on the target surface 80 is at point Q'.
In this embodiment, the target surface 80 is perpendicular to the incident direction of the probe light it receives, so the conjugate image of the target surface 80 lies on the base surface 80e passing through point P, and the base surface 80e is perpendicular to the main optical path 10b of the probe light. However, because the incident angle α of the incident light is smaller than 45 degrees, the incident light and the probe light are not perpendicular to each other, so the base surface 80e does not coincide with the optical axis 10a of the incident light; that is, point Q on the second bump to be measured 101b does not lie on the base surface 80e. The imaging spot formed by the incident light reflected at point Q is therefore a diffuse spot: point Q is not imaged sharply at point Q' but forms a diffuse spot around Q'. To obtain an accurate position of point Q', the diffuse spot needs to be processed.
In this embodiment, the target surface 80 is perpendicular to the incident direction of the corresponding received detection light, so that the setting of the target surface 80 in this embodiment is easier, and the modification of the original detection system is reduced.
In this embodiment, the region to be measured 810c of the target to be measured forms imaging spots at corresponding spot positions on the target surface 80 of the image collector 810, and one region to be measured 810c may correspond to a plurality of imaging spots.
During the first detection, when a plurality of adjacent linear light spots 800s cover one region to be measured 810c, that region to be measured 810c correspondingly forms a plurality of imaging spots.
In this embodiment, collecting at least part of the probe light with the image collector 810 includes: translating the linear light spot 800s relative to the object to be detected along the scanning direction (the X direction in fig. 4) on the surface of the object to be detected by a preset step, thereby scanning the object to be detected. The scanning direction is perpendicular to the length direction of the linear light spot 800s, and during the scan the preset step is smaller than or equal to the width of the linear light spot 800s.
In this embodiment, translating the linear light spot 800s along the scanning direction by the preset step allows the surface of the object to be detected to be imaged in the imaging module; the scanning direction is perpendicular to the length direction of the linear light spot 800s, whose length is large and whose width is small.
It should be noted that the preset step should not be too large. If the preset step is too large, gaps tend to appear between two adjacent linear light spots 800s, so the surface of the object to be detected is scanned incompletely, the target to be detected is difficult to cover completely, and the detection result is affected. For this reason, in this embodiment the preset step is smaller than or equal to the width of the linear light spot 800s, which ensures that two adjacent linear light spots 800s just abut or partially overlap.
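As a small numeric illustration of this constraint (the field width, spot width and step below are made-up values, not taken from the patent), the sketch computes how many line positions a scan needs once the preset step is chosen:

    import math

    def scan_positions(field_width_um, spot_width_um, step_um):
        # Number of linear-spot positions needed to cover field_width_um.
        # A step larger than the spot width would leave gaps between adjacent
        # linear light spots, so it is rejected.
        if step_um > spot_width_um:
            raise ValueError("preset step must not exceed the linear spot width")
        return 1 + math.ceil(max(0.0, field_width_um - spot_width_um) / step_um)

    # Example: a 2 mm field, a 10 um wide linear spot and an 8 um step (20 % overlap).
    print(scan_positions(2000.0, 10.0, 8.0))  # -> 250 positions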
In this embodiment, there are a plurality of image collectors 810, and the first detection module further includes a beam splitter 700. The beam splitter 700 transmits the received probe light along different optical paths in a plurality of directions and projects the probe light transmitted along these different optical paths into the respective image acquisition components 810; the image acquisition components 810 acquire the probe light alternately in turn to form images of different regions of the object to be detected.
Referring to fig. 3, in this embodiment the imaging spot formed by the incident light reflected at point Q is a diffuse spot, that is, point Q is not imaged sharply at point Q' but forms a diffuse spot around Q'. The diffuse spot is therefore processed by the processing module 900 to obtain an accurate position of point Q', so that the heights of the bumps 101a and 101b to be measured can be obtained from the positions of points P' and Q'.
Specifically, in this embodiment, acquiring the spot position of the imaging spot of the region to be measured 810c of the target to be measured on the target surface 80 of the image collector 810 includes: performing center extraction on the imaging spot to obtain the center of the imaging spot, and taking the center of the imaging spot as the image center position.
In this embodiment, the center of the diffuse spot is extracted; the obtained center of the diffuse spot is the accurate position of point Q'.
In this embodiment, the method for extracting the center of the imaging light spot includes a gray level gravity center method, a quadratic curve fitting vertex-taking method, a gaussian curve fitting vertex-taking method, a centroid method or a maximum position method.
The gray-level gravity center method, the quadratic curve fitting vertex method, the Gaussian curve fitting vertex method, the centroid method and the maximum value position method are all common center-extraction methods in the optical field; they are mature and simple to compute, which is favorable for accurate center extraction and for obtaining an accurate position of point Q'.
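A minimal sketch of the gray-level gravity center method named above, under the assumption that the spot is reduced to a one-dimensional intensity profile along the triangulation axis (NumPy assumed; the threshold is an illustrative choice, and the other listed methods would replace the weighted average with a curve fit or a maximum search):

    import numpy as np

    def gray_level_gravity_center(profile, threshold=None):
        # Sub-pixel center of a (possibly diffuse) spot profile: the gray-level
        # weighted average position of the pixels above the background threshold.
        p = np.asarray(profile, dtype=float)
        if threshold is None:
            threshold = p.min() + 0.1 * (p.max() - p.min())
        w = np.where(p > threshold, p - threshold, 0.0)
        x = np.arange(p.size)
        return float((x * w).sum() / w.sum())

    # Example: a blurred spot whose true center lies near pixel 42.6.
    x = np.arange(100)
    spot = np.exp(-0.5 * ((x - 42.6) / 6.0) ** 2)
    spot += 0.02 * np.random.default_rng(0).random(100)   # a little background noise
    print(round(gray_level_gravity_center(spot), 1))       # close to 42.6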
In this embodiment, processing the imaging light spot includes: the imaging light spot of the region to be measured 810c of the target to be measured is processed to obtain the position of the imaging point of the region to be measured 810c of the target to be measured.
Processing only the imaging light spots of the region to be measured 810c of the target to be measured reduces the range and amount of imaging light spots to be processed, saves computing power, and helps improve the detection efficiency of the detection system.
It should be noted that, in this embodiment, when the pre-scan of the object to be detected is sufficiently accurate, the region to be measured 810c may simply be the object center point of the target to be measured, and the imaging spot of that center point is a single imaging point. In that case no center-extraction algorithm is needed: the position of the imaging point is taken directly as the image center position, and the height of the target to be measured along the direction perpendicular to the surface of the object to be detected is obtained from this image center position.
Step S22 is executed: the height of the target to be detected along the direction perpendicular to the surface of the object to be detected is obtained according to the spot position.
Accordingly, in the present embodiment, the position of the imaging point corresponds to the height of the bump 101 to be tested.
In this embodiment, obtaining the height of the target to be measured along the direction perpendicular to the surface of the object to be measured according to the spot position includes: acquiring the height of the target to be measured according to the image center position.
Specifically, in this embodiment, according to the triangulation method, each position on the target surface 80 of the image acquisition component 810 corresponds to a fixed height of the target to be measured. The probe light images the target to be measured onto the target surface 80, so the height of the corresponding point of the target to be measured can be obtained from the position of its imaging point.
In this embodiment, as shown in fig. 4, each linear light spot 800s spans the bump 101 to be measured and the surface around the bottom of the bump 101 to be measured. A separate reference surface therefore does not need to be set: the difference between the imaging-point positions of a point on the bump 101 to be measured and a point on the surface around the bottom of the bump directly gives the height difference between them, and thus the height of the bump 101 to be measured. This helps reduce detection errors caused by an uneven reference surface and therefore helps obtain the height of the bump 101 to be measured more accurately.
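To make the conversion concrete, the sketch below combines the triangulation relation assumed earlier (spot shift = 2·M·Δz·sin α; the magnification and angle values here are hypothetical, not taken from the patent) with the base-surface reference described above; when one region to be measured yields several imaging spots, their heights can be averaged as discussed in the following paragraphs:

    import numpy as np

    def height_from_shift(spot_shift_um, magnification=10.0, alpha_deg=30.0):
        # Assumed specular triangulation relation: shift = 2 * M * dz * sin(alpha).
        # The magnification and incident angle are illustrative values only.
        return spot_shift_um / (2.0 * magnification * np.sin(np.radians(alpha_deg)))

    def bump_height(bump_spot_positions_um, base_spot_positions_um, **kwargs):
        # Height of the bump relative to the surface around its bottom: the
        # difference of the (averaged) imaging-point positions on the target
        # surface 80, converted to a height, so no separate reference plane is needed.
        shift = np.mean(bump_spot_positions_um) - np.mean(base_spot_positions_um)
        return float(height_from_shift(shift, **kwargs))

    # Example: spot positions (in um on the target surface) from two adjacent
    # line positions covering the same region to be measured.
    print(round(bump_height([612.0, 611.4], [412.2, 412.6]), 1))  # about 19.9 um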
In this embodiment, when spot positions are obtained from a plurality of imaging spots of the target to be measured, the average of the heights obtained from these imaging spots along the direction perpendicular to the surface of the object to be measured is taken as the height of the region to be measured 810c.
During the first detection, when a plurality of adjacent linear light spots 800s cover one region to be measured 810c, that region to be measured 810c correspondingly forms a plurality of imaging spots, so taking the average of the heights obtained from these imaging spots as the height of the region to be measured 810c helps obtain a more accurate height.
It should be noted that, after the height of the target to be measured along the direction perpendicular to the surface of the object to be measured is obtained, the images obtained on the target surface 80 may be stitched to calibrate the obtained height of the target to be measured and to obtain the height variation of the target to be measured.
Correspondingly, the embodiment also provides a detection system.
Referring to fig. 2 to 4, fig. 2 is a schematic structural diagram and optical path diagram of an embodiment of the detection system of the present invention, fig. 3 is a partial enlarged view of any bump to be detected in fig. 2, and fig. 4 is a top view of an embodiment of the detection system of the present invention.
The detection system is used for detecting a target to be detected on an object to be detected, and comprises a first detection module, wherein the first detection module comprises an illumination module, an imaging module and a processing module 900. The illumination module is used for generating incident light directed at the object to be detected; the incident light irradiates the object to be detected to generate illumination light spots 800s and is reflected by the object to be detected to form detection light. The imaging module comprises an image acquisition component 810, which is used for receiving the detection light and obtaining imaging information of the target to be detected according to the detection light; the imaging information includes the imaging spot formed by the target to be detected in the image acquisition component 810. The processing module 900 is used for processing the imaging information to obtain the position of the imaging point of the target to be detected, and for obtaining the height of the target to be detected along the direction perpendicular to the surface of the object to be detected according to the position of the imaging point of the target to be detected.
In this embodiment, the target to be measured is the bump 101 to be measured.
Specifically, as an example, the object to be detected is a wafer 100, the target to be detected is a bump 101 to be detected formed on the surface of the wafer 100, and the detection device of this embodiment is used for detecting three-dimensional coplanarity defects of the bumps on the surface of the wafer 100. Specifically, the detection device of this embodiment is used for measuring the microscopic three-dimensional morphology, namely the height, of the bump 101 to be measured.
In the actual detection process, the target to be detected also has a region to be measured 810c, which is the region of the target to be detected that needs to be detected; the region to be measured 810c may be the entire surface of the target to be detected or a partial region of its surface.
In this embodiment, the region to be measured 810c is a circular region of preset radius centered on the center of the target to be measured; the preset radius is smaller than the minimum dimension of the target to be measured in the plane parallel to the surface of the object to be measured.
The minimum dimension of the target to be measured in the plane parallel to the surface of the object to be measured refers to the minimum radius of the target to be measured in that plane.
Making the region to be measured 810c a circular region of preset radius centered on the center position of the target to be measured helps keep the light spots corresponding to the region to be measured 810c relatively uniform, and making the preset radius smaller than the minimum dimension of the target to be measured in the plane parallel to the surface of the object to be measured ensures that every target to be measured can contain its region to be measured 810c.
In this embodiment, the projection of the target to be measured on the surface of the object to be measured is circular, and the radius of the region to be measured 810c is less than or equal to 1/10 of the radius of the target to be measured.
It should be noted that the ratio of the radius of the region to be measured 810c to the radius of the target to be measured should not be too large. If this ratio is too large, the range of imaging light spots that subsequently has to be processed for the region to be measured 810c of the target remains large, so it is difficult to reduce the processing load, save computing power, and improve the detection efficiency of the detection system. For this reason, in this embodiment, the radius of the region to be measured 810c is less than or equal to 1/10 of the radius of the target to be measured.
In this embodiment, since the target to be measured is the bump 101 to be measured, the region to be measured 810c of the target is the region containing the maximum height of the bump 101 to be measured and is used to obtain the height of the bump 101 to be measured. In the actual detection process, only the imaging light spots of the region to be measured 810c of the bump 101 to be measured are processed, which reduces the range and amount of imaging light spots to be processed, saves computing power, and helps improve the detection efficiency of the detection system.
The illumination module is used for generating incident light directed at the object to be measured. In this embodiment, the incident angle of the incident light is smaller than 45 degrees or larger than 45 degrees. An incident angle smaller than 45 degrees allows the incident angle to be adjusted flexibly and reduces the probability that the target to be detected on the object to be detected is shadowed, so the target to be detected receives fuller illumination and is imaged more accurately in the imaging module; an incident angle larger than 45 degrees likewise allows flexible adjustment of the incident angle while increasing the space above the object to be detected, making it easier to install other detection devices.
In this embodiment, the incident angle of the incident light is 25 degrees to 35 degrees.
The incident light undergoes specular reflection at the surface of the object to be detected, so the incident angle of the incident light is equal to the reflection angle of the detection light. An incident angle of 25 degrees to 35 degrees allows the target to be detected to be illuminated more fully while making it easier to arrange the relative positions of the illumination module and the imaging module, and also helps the imaging module receive the detection light completely, which facilitates more accurate imaging in the imaging module.
The illumination module is used for generating linear light spots 800s on the object to be detected, and the extending direction of the linear light spots 800s is perpendicular to the incident surface of the incident light.
The incidence plane is the plane containing the incident light and the normal to the surface of the object to be measured.
The illumination module generates linear light spots 800s on the object to be detected, so that the detection device can be used for completing linear scanning on the object to be detected.
When the object to be detected is scanned with linear light spots 800s, the edges of adjacent linear light spots 800s can just abut or partially overlap along the scanning direction. Completing the line scan with linear light spots 800s helps all the linear light spots 800s completely cover the object to be detected while making full use of each linear light spot 800s, so the scan of the object to be detected can be completed with a small number of linear light spots 800s, which helps improve scanning efficiency.
The illumination module includes a light source assembly 200, a first slit element 400 and a first lens group 500 arranged in sequence along the optical path transmission direction (the direction of the arrows on the dotted line in fig. 2): the light source assembly 200 is used for generating a light beam, the first slit element 400 is used for transmitting the light beam to produce linear incident light, and the first lens group 500 is used for focusing the linear incident light onto the object to be measured to generate the linear light spot 800s.
In this embodiment, the light source assembly 200 is used to generate a linear light beam.
In this embodiment, the light beam generated by the light source assembly 200 passes through the first slit element 400 along the optical path transmission direction to generate the linear light spot 800s on the object to be measured. Compared with a circular beam, a linear beam generated by the light source assembly 200 couples better with the first slit element 400, which reduces the waste of beam energy and improves the coupling efficiency.
In this embodiment, the light source assembly 200 includes a light source, and the light source is an incoherent light source.
The light source assembly 200 generates a light beam using light emitted from a light source.
In this embodiment, the light source is an incoherent light source. Compared with a coherent light source (for example, a laser light source), an incoherent light source introduces less noise, so the image it forms in the imaging module has a higher signal-to-noise ratio, which helps the imaging module obtain accurate imaging information of the target to be detected.
Specifically, in the present embodiment, the type of the light source includes an LED light source, a halogen lamp, or a xenon lamp.
The LED light source, the halogen lamp or the xenon lamp is an incoherent light source, and has the characteristics of smaller volume, longer service life, higher luminous efficiency, lower power consumption and the like.
In this embodiment, the light source assembly 200 includes: a light source and a shaping element 220, the shaping element 220 being configured to shape light emitted by the light source to produce a linear light beam.
In practical applications, the incoherent light source generally emits a circular light beam, and therefore, the shaping element 220 is used to shape the light emitted by the light source, so as to ensure that the light source assembly 200 generates a linear light beam.
In this embodiment, the shaping element 220 includes an optical fiber bundle; the shape of the entrance port of the optical fiber bundle matches the spot shape of the light emitted by the light source, and the optical fibers at the exit port of the bundle are arranged in a line.
Using an optical fiber bundle to shape the light emitted by the light source is simple and convenient, and the port shapes of the optical fiber bundle are easy to adjust, so the entrance port of the bundle can be adapted to the spot of the light emitted by the light source and the exit port can be adapted to the desired shape of the incident light.
In this embodiment, matching the entrance port of the optical fiber bundle to the spot shape of the light emitted by the light source improves the coupling efficiency between the light emitted by the light source and the optical fiber bundle, and arranging the exit port of the bundle in a line ensures that the light source assembly 200 generates a linear light beam.
In other embodiments, the light source assembly may also include an imaging element for directly generating a linear light beam.
In this embodiment, the light source assembly 200 includes a light source and a filter color wheel for controlling the spectrum emitted by the light source assembly 200.
As an example, the light source assembly 200 includes a light box 210, with a light source and a filter color wheel disposed in the light box 210 for controlling the spectrum emitted by the light box 210.
The light box 210 is used as a means for placing a light source and a color filter wheel.
In this embodiment, the light emitted from the light source passes through the color filter wheel and then is emitted from the light box 210, so that the spectrum emitted from the light box 210 can be controlled by the color filter wheel.
As an example, the light source emits white light, and after passing through the filter color wheel, the light box 210 may emit blue, green, yellow, or cyan light.
In this embodiment, the pass band of the filter color wheel is adjusted according to the spectrum of the surface coating of the object to be measured, for example according to the spectrum of the coating on the wafer surface. Specifically, the spectrum transmitted by the filter color wheel is made consistent with the spectrum of the surface coating of the target to be detected, which improves the signal-to-noise ratio of the detection light generated at the surface of the target to be detected.
The first slit member 400 is for obtaining linear incident light.
The first slit member 400 includes a fixed slit, a single-side adjustable asymmetric slit, or a double-side adjustable symmetric slit.
In this embodiment, the length direction of the slit opening in the first slit element 400 is perpendicular to the incidence plane of the incident light, so linear incident light matching the slit opening is obtained; accordingly, in this embodiment, the length of the obtained linear incident light is controlled by setting the length of the slit opening of the first slit element 400.
The first lens group 500 is used for focusing the linear incident light onto the object to be measured to generate a linear light spot, and controlling the size of the generated linear light spot. As an example, the first lens group 500 is a first microscope objective.
In this embodiment, the size and pitch of the bumps 101 to be measured are generally small, so a finer linear light spot is required to measure the bumps 101 to be measured.
In this embodiment, the first lens group 500 images the first slit element 400 onto the object to be detected at reduced magnification, so the width of the linear incident light passing through the first slit element 400 is reduced and a linear light spot of smaller width can be obtained.
In this embodiment, the first slit element 400 and the first lens group 500 are used to shape and image the light beam generated by the light source assembly 200. Because the first lens group 500 images the first slit element 400 onto the object to be detected at reduced magnification, a linear light spot 800s of smaller size can be obtained; the linear light spot 800s forms detection light at the target to be detected, which helps obtain higher-precision imaging information of the target to be detected and, correspondingly, a detection result of high stability and high precision.
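The patent gives no numbers for the slit width or the reduction ratio; purely as an illustration, if the first lens group 500 images the slit at a (hypothetical) reduction magnification M with |M| < 1, the width of the linear light spot on the object would be

    \[ w_{\text{spot}} \;=\; |M|\, w_{\text{slit}}, \]

so, for example, a 100 µm wide slit imaged at an assumed 0.1x reduction would give a linear light spot roughly 10 µm wide.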
In this embodiment, the illumination module further includes an optical fiber coupler 300 disposed between the light source assembly 200 and the first slit element 400; the optical fiber coupler 300 is used to couple the light beam to the slit opening of the first slit element 400.
In this embodiment, the light source assembly 200 generates a linear light beam through the optical fiber bundle, and the beam leaving the bundle consists of discrete spots. The optical fiber coupler 300 is therefore used to converge the beam and couple it to the slit opening of the first slit element 400, which improves the illumination efficiency of the incident light, controls the incident light, reduces its divergence, and improves the uniformity of the image formed in the imaging module.
In this embodiment, the optical fiber coupler 300 includes a cylindrical mirror 310, which is disposed on the side of the optical fiber coupler 300 near its light-exit end face, or on the side near its light-entry end face.
The cylindrical mirror 310 serves to improve the illumination efficiency of the incident light and the uniformity of the incident light.
In this embodiment, the imaging module is configured to collect probe light, and obtain imaging information of the target to be detected according to the probe light.
In this embodiment, the imaging information includes an imaging spot formed by the object to be measured in the image acquisition component 810.
In this embodiment, the image capturing component 810 includes a target surface 80 for receiving the probe light, and the target surface 80 of the image capturing component 810 is perpendicular to the incident direction of the received probe light.
Referring to fig. 3 in combination, in this embodiment, taking an incident angle α of the incident light smaller than 45 degrees as an example, fig. 3 shows a partial enlarged view of any bump to be measured. For convenience of explanation, two bumps 101 of different heights are overlaid in fig. 3: the first bump to be measured 101a is drawn as a solid black bump and the second bump to be measured 101b as a dashed outline. When the incident light irradiates the first bump to be measured 101a, it is reflected at point P to form probe light (the optical path of the probe light reflected at point P is indicated by a dotted line), and the imaging spot on the target surface 80 is at point P'; when the incident light irradiates the second bump to be measured 101b, it is reflected at point Q to form probe light (the optical path of the probe light reflected at point Q is indicated by a solid line), and the imaging spot on the target surface 80 is at point Q'.
In this embodiment, the target surface 80 of the image acquisition component 810 is perpendicular to the incident direction of the probe light it receives, so the conjugate image of the target surface 80 lies on the base surface 80e passing through point P, and the base surface 80e is perpendicular to the main optical path 10b of the probe light. However, because the incident angle α of the incident light is not 45 degrees, the incident light and the probe light are not perpendicular to each other, so the base surface 80e does not coincide with the optical axis 10a of the incident light; that is, point Q on the second bump to be measured 101b does not lie on the base surface 80e. The imaging spot formed by the incident light reflected at point Q is therefore a diffuse spot: point Q is not imaged sharply at point Q' but forms a diffuse spot around Q'. To obtain an accurate position of point Q', the diffuse spot needs to be processed.
In this embodiment, the target surface 80 of the image acquisition component 810 is perpendicular to the incident direction of the corresponding received probe light, so that the setting of the image acquisition component 810 in this embodiment is easier, and the modification to the original detection system is reduced.
In this embodiment, the imaging module further includes an imaging assembly 600, where the imaging assembly 600 is configured to collect probe light into the image acquisition assembly 810.
In this embodiment, the imaging assembly 600 includes a second lens group 620, a diaphragm 630 and a tube lens 640 sequentially disposed along the optical path transmission direction, where the second lens group 620 is configured to collect the probe light and make the probe light incident into the diaphragm 630, and the tube lens 640 is configured to receive the probe light passing through the diaphragm 630 and collect the probe light.
In this embodiment, the second lens group 620 is used to amplify the optical path of the probe light, so that the imaging of the object to be measured is clearer. As an example, the second lens group 620 is a second microscope objective.
In this embodiment, the diaphragm 630 is used to control the light quantity of the probe light passing through, and the diaphragm 630 is also used to limit the telecentricity of the chief ray of the probe light, so that the telecentricity of the probe light is infinitely close to 0, which is beneficial to making the imaging quality of each view field uniform in the imaging module, and thus is beneficial to obtaining imaging information with higher precision.
In this embodiment, the diaphragm 630 images the object to be measured at infinity, and the tube lens 640 re-images the object to be measured at a finite distance; the diaphragm 630 and the tube lens 640 therefore cooperate so that the probe light passing through the imaging assembly 600 produces imaging on the image acquisition component 810 that is uniform, clear and accurate.
In this embodiment, the imaging assembly 600 further includes a second slit element 610 disposed on the side of the second lens group 620 facing away from the diaphragm 630; the second slit element 610 is used to reduce stray light in the probe light.
In this embodiment, the second slit element 610 is used to limit the Numerical Aperture (NA) of the optical system of the probe light, reducing the divergence of the probe light.
In this embodiment, the number of image capturing components 810 is plural.
Employing a plurality of image acquisition components 810 for image acquisition helps increase the maximum frame rate of the image capturing component 800, and thereby its image acquisition efficiency.
Specifically, compared with a single image acquisition component, the maximum frame rate of the image capturing component 800 is multiplied by the number of image acquisition components 810.
Referring to fig. 4, in this embodiment the number of image acquisition components 810 is 2. This improves the image acquisition efficiency of the image acquisition components 810 without making the optical paths of the probe light too complex, so the optical paths in the imaging module remain easy to arrange while the acquisition efficiency is ensured.
Specifically, the 2 image acquisition components 810 work alternately, acquiring imaging information of the linear light spots 810s and 820s respectively, until the linear scan of the object to be detected is completed.
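The alternating operation of the two image acquisition components can be sketched as the simple scheduling loop below. The Camera class and its grab_line() method are hypothetical placeholders rather than a real device API; the sketch only illustrates how distributing successive line acquisitions over N components multiplies the achievable line rate by N.

```python
# Minimal sketch of alternating ("ping-pong") line acquisition over two cameras.
# Camera and grab_line() are hypothetical placeholders, not a real device API.

class Camera:
    def __init__(self, name, max_line_rate_hz):
        self.name = name
        self.max_line_rate_hz = max_line_rate_hz

    def grab_line(self, line_index):
        # Placeholder: return an identifier instead of real pixel data.
        return (self.name, line_index)


def scan(cameras, num_lines):
    """Distribute successive line acquisitions over the cameras in turn."""
    frames = []
    for i in range(num_lines):
        cam = cameras[i % len(cameras)]   # alternate: 0, 1, 0, 1, ...
        frames.append(cam.grab_line(i))
    return frames


cams = [Camera("810-A", max_line_rate_hz=1000), Camera("810-B", max_line_rate_hz=1000)]
lines = scan(cams, num_lines=8)
# Each camera only handles every second line, so the combined line rate of the
# image capturing component can be up to len(cams) times a single camera's rate.
```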
In this embodiment, the imaging assembly 600 further includes a beam splitter 700 disposed at the side of the imaging assembly 600 closest to the light-exit end. The beam splitter 700 splits the received probe light into beams travelling along different optical paths in a plurality of directions and projects them into the respective image acquisition components 810; each image acquisition component 810 acquires its probe light in turn, forming images of different areas of the object to be detected.
The beam splitter 700 enables image acquisition by the plurality of image acquisition components 810; for each image acquisition component 810 to operate normally, the sub-probe beams split by the beam splitter 700 correspond one-to-one to the image acquisition components 810.
Specifically, in this embodiment, the number of the image capturing assemblies 810 is 2, and the beam splitter 700 is configured to reflect the probe light and project the reflected probe light into one of the image capturing assemblies 810, and is also configured to transmit the probe light and project the transmitted probe light into the other image capturing assembly 810.
In this embodiment, the beam splitter 700 includes a beam splitting prism.
The beam splitting prism splits the probe light; in this embodiment it separates the horizontal and vertical polarization components of a probe beam, and accordingly splits one probe beam into a sub-probe beam travelling along the original optical path direction and a sub-probe beam travelling perpendicular to it. The beam splitting prism has the advantages of low stress, high extinction ratio, good imaging quality and small beam deflection angle.
In this embodiment, the splitting ratio of the beam splitting prism is 1:1, so the two sub-probe beams are relatively uniform and the imaging quality on the two image acquisition components 810 is highly consistent.
In this embodiment, the detection system further includes: the processing module 900 is configured to process the imaging light spot to obtain a position of an imaging point of the target to be measured, where the position of the imaging point corresponds to a height of the target to be measured.
Accordingly, in the present embodiment, the position of the imaging point corresponds to the height of the bump 101 to be tested.
Referring to fig. 3, in this embodiment the imaging light spot of the light reflected at point Q is a diffuse spot; point Q is not sharply imaged at point Q' but forms a diffuse spot around Q'. The diffuse spot is therefore processed by the processing module 900 to obtain a well-defined position of point Q', so that the heights of the bumps to be measured 101a and 101b are obtained from the well-defined positions of points P' and Q'.
Specifically, in this embodiment, the processing module 900 is configured to perform center extraction on the imaging light spot, so as to obtain the center of the imaging light spot, where the center of the imaging light spot is the position of the imaging point.
In this embodiment, center extraction is performed on the diffuse spot, and the center obtained is taken as the well-defined position of point Q'.
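A minimal sketch of such center extraction is given below, using the gray-scale centroid method (one of the options listed later in claim 5). The function name, the background handling and the synthetic test spot are illustrative assumptions, not part of the described system.

```python
# Gray-scale centroid extraction of a diffuse spot.
# `spot` is assumed to be a small 2-D intensity patch cut from the target-surface image.

import numpy as np

def gray_centroid(spot, background=0.0):
    """Return the (row, col) sub-pixel centroid of an intensity patch."""
    w = np.clip(spot.astype(float) - background, 0.0, None)  # suppress background
    total = w.sum()
    if total == 0:
        raise ValueError("empty spot")
    rows, cols = np.indices(w.shape)
    return (rows * w).sum() / total, (cols * w).sum() / total

# Example: a synthetic blurred spot whose true center is near (4.0, 6.0).
yy, xx = np.mgrid[0:9, 0:13]
spot = np.exp(-((yy - 4.0) ** 2 + (xx - 6.0) ** 2) / 4.0)
print(gray_centroid(spot))   # approximately (4.0, 6.0)
```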
Specifically, according to the triangulation method, each position on the target surface 80 of the image acquisition component 810 corresponds to a fixed height; that is, each imaging position on the target surface 80 corresponds to a height of the target to be measured. In this embodiment the target to be measured is imaged onto the target surface 80 by the probe light, so the height of the corresponding point of the target to be measured can be obtained from the position of its imaging point.
In this embodiment, as shown in fig. 4, each linear light spot 800s spans the bump to be measured 101 and the surface around its bottom. No separate reference surface therefore needs to be set: it suffices to take the difference between the imaging-point positions of a point on the bump to be measured 101 and a point on the surface around its bottom, which directly gives the height difference between the two points and thus the height of the bump to be measured 101. This helps reduce detection errors caused by an uneven reference surface, and thus helps obtain the height of the bump to be measured 101 more accurately.
The probe light reflected by the P-point and the Q-point correspondingly form different imaging points in the imaging module, and therefore, the positions of the imaging points correspond to the height information of the bump 101 to be measured.
Specifically, in this embodiment the illumination module, the object to be measured and the imaging module form a triangle, so the height of the bump to be measured 101 can be obtained from the position of the imaging point by triangulation.
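As a worked illustration of this triangulation step, the sketch below converts the lateral shift of an imaging point on the target surface 80 into a height, and uses the difference between a point on the bump and a point on the surrounding surface so that no reference surface is needed. The 2·sin(α) factor and the numerical values of the shift, incidence angle and magnification are assumptions of this sketch (a simplified specular-reflection model with the target surface perpendicular to the probe light), not figures taken from the embodiment.

```python
import math

def height_from_shift(spot_shift_um, incidence_angle_deg, magnification):
    """
    Convert a lateral shift of the imaging point on the target surface into a
    height change of the measured point. Assumes specular reflection at
    incidence angle alpha and a detector perpendicular to the probe light: a
    height change dh then displaces the reflected ray by 2*dh*sin(alpha), and
    the detector sees that displacement multiplied by the magnification M.
    """
    alpha = math.radians(incidence_angle_deg)
    return spot_shift_um / (2.0 * magnification * math.sin(alpha))

# Bump height without a separate reference surface: difference between the
# imaging-point position of a point on the bump (e.g. P') and of a point on the
# surrounding surface at the bump bottom, both taken from the same line spot.
shift_bump_um = 42.0       # hypothetical centroid position of the bump point
shift_surface_um = 7.0     # hypothetical centroid position of the surrounding surface
bump_height = height_from_shift(shift_bump_um - shift_surface_um,
                                incidence_angle_deg=30.0, magnification=5.0)
print(f"bump height = {bump_height:.2f} um")   # 35 / (2*5*0.5) = 7.00 um
```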
In this embodiment, the detection system further includes a second detection module (not shown) for pre-scanning the object to be detected to obtain a central position of the target to be detected, and obtaining a region to be detected 810c according to the central position, wherein the region to be detected 810c includes the central position.
In this embodiment, when the target to be measured is the bump to be measured 101, the region to be detected 810c of the target is the region where the maximum height of the bump to be measured 101 is located, i.e. the region used to obtain the height of the bump to be measured 101.
In this embodiment, the processing module is configured to process the imaging light spot of the region to be detected 810c of the target to be detected, and obtain the position of the imaging point of the region to be detected 810c.
Processing only the imaging light spots of the region to be detected 810c of the target narrows the range of imaging spots to be processed, which reduces the workload of the processing module 900, saves computing power and improves the detection throughput of the detection system.
Specifically, as an example, the second detection module is an imaging device, configured to obtain an image of the object to be detected, and obtain a center position of the object to be detected according to the image of the object to be detected.
It should be noted that, in this embodiment, when the pre-scan of the object to be measured is sufficiently precise, the region to be detected 810c reduces to the central position itself, and the imaging light spot of the central point in the imaging module is a single imaging point. The processing module 900 then processes the imaging light spot of the central position of the target to be measured and obtains the position of the imaging point directly, without any center-extraction algorithm, and the height of the target along the direction perpendicular to the surface of the object to be measured is obtained from that imaging-point position.
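A minimal sketch of restricting the spot processing to the region to be detected 810c obtained from the pre-scan is shown below. The mask shape (a circle around the pre-scanned center), the function names and the image dimensions are assumptions introduced for illustration only.

```python
import numpy as np

def region_to_be_detected(center_rc, radius_px, image_shape):
    """Boolean mask of a circular region around the pre-scanned center position."""
    rows, cols = np.indices(image_shape)
    cr, cc = center_rc
    return (rows - cr) ** 2 + (cols - cc) ** 2 <= radius_px ** 2

def restrict_to_region(frame, center_rc, radius_px):
    """Zero out everything outside the region to be detected, so the spot
    processing step only has to handle the reduced pixel range."""
    mask = region_to_be_detected(center_rc, radius_px, frame.shape)
    return np.where(mask, frame, 0.0)

frame = np.random.rand(480, 640)                     # placeholder target-surface image
restricted = restrict_to_region(frame, center_rc=(240, 320), radius_px=25)
```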
Although the present invention is disclosed above, the present invention is not limited thereto. Various changes and modifications may be made by those skilled in the art without departing from the spirit and scope of the invention, and the scope of the invention should therefore be determined by the appended claims.

Claims (20)

1. A detection method for detecting an object to be detected on an object to be detected, comprising:
acquiring a region to be detected of the target to be detected, wherein the region to be detected comprises the object center position of the target to be detected;
performing first detection on the target to be detected through a first detection module, and acquiring the height of the target to be detected along the direction perpendicular to the surface of the object to be detected, wherein the first detection module comprises an image collector;
the first detection includes: imaging the target to be detected through a first detection module to obtain the spot position of an imaging spot of a region to be detected of the target to be detected on the target surface of the image collector; and acquiring the height of the target to be detected along the direction perpendicular to the surface of the object to be detected according to the light spot position.
2. The method of claim 1, wherein imaging the object to be measured with the first detection module comprises: generating incident light focused on an object to be detected through the first detection module, wherein the incident light irradiates the object to be detected to generate an illumination light spot, and the incident light is reflected by the object to be detected to form detection light;
And collecting at least part of the detection light through the image collector, and acquiring imaging information of the target to be detected according to the collected detection light, wherein the imaging information comprises the spot position of an imaging spot of a region to be detected of the target to be detected.
3. The detection method according to claim 2, wherein the incident angle of the incident light is less than 45 degrees or greater than 45 degrees.
4. The detection method according to claim 1, wherein acquiring the spot position of the imaging spot of the region to be detected of the target to be detected on the target surface of the image collector comprises: extracting the center of the imaging light spot to obtain the center of the imaging light spot, and taking the center of the imaging light spot as an image center position;
the step of obtaining the height of the object to be measured along the direction perpendicular to the surface of the object to be measured according to the light spot position comprises the following steps: and acquiring the height of the target to be detected according to the image center position.
5. The method of claim 4, wherein the method of center-extracting the imaging spot comprises a gray-scale centroid method, a quadratic curve fitting vertex-finding method, a gaussian curve fitting vertex-finding method, a centroid method, or a maximum position method.
6. The detection method according to claim 1, wherein obtaining the region to be detected of the target to be detected comprises: pre-scanning the object to be detected through a second detection module to obtain the object center position; and acquiring the region to be detected according to the object center position, wherein the region to be detected comprises the object center position of the target to be detected.
7. The detection method according to claim 6, wherein the second detection module is an imaging device; pre-scanning the object to be detected comprises: imaging the object to be detected through the second detection module to obtain an image of the object to be detected, and obtaining the object center position according to the image of the object to be detected.
8. The detection method according to claim 1, wherein, after obtaining the region to be detected of the target to be detected and before performing the first detection on the target to be detected through the first detection module, the method further comprises: focusing on the center of the object to be detected according to the object center position.
9. The detection method according to claim 1, wherein the region to be detected is a circular region centered on the object center position with a preset radius; the preset radius is smaller than the minimum dimension of the target to be detected in a plane parallel to the surface of the object to be detected.
10. The detection method according to claim 1, wherein the projection of the target to be detected on the surface of the object to be detected is circular; the radius of the region to be detected is less than or equal to 1/10 of the radius of the target to be detected.
11. The detection method according to claim 1, wherein, in acquiring the spot position of the imaging spot of the region to be detected of the target to be detected on the target surface of the image collector, one region to be detected corresponds to a plurality of the imaging spots;
and, in acquiring the height of the target to be detected along the direction perpendicular to the surface of the object to be detected according to the spot positions, the heights of the target to be detected obtained from the plurality of imaging spots are averaged, and the average value is taken as the height of the region to be detected of the target to be detected.
12. The detection method according to claim 2, wherein the incident light irradiates the object to be detected to generate an illumination spot, and the illumination spot is linear;
collecting at least part of the probe light by the image collector comprises: translating the linear light spot relative to the object to be detected along a scanning direction, in the direction of the surface of the object to be detected, by a preset step length, so as to scan the object to be detected, wherein the scanning direction is perpendicular to the length direction of the linear light spot;
And in the process of scanning the object to be detected, the preset step length is smaller than or equal to the width of the linear light spot.
13. A detection system for performing the detection method according to any one of claims 1 to 12, for detecting a target to be detected on an object to be detected, the detection system comprising:
the first detection module comprises an illumination module, an imaging module and a processing module, wherein,
the illumination module is used for generating incident light irradiated to an object to be detected, the incident light irradiates the object to be detected to generate illumination light spots, and the incident light is reflected by the object to be detected to form detection light;
the imaging module comprises an image acquisition component, wherein the image acquisition component is used for receiving detection light and obtaining imaging information of the target to be detected according to the detection light, and the imaging information comprises imaging light spots formed by the target to be detected in the image acquisition component;
the processing module is used for processing the imaging information to obtain the position of the imaging point of the object to be detected, and obtaining the height of the object to be detected along the direction perpendicular to the surface of the object to be detected according to the position of the imaging point of the object to be detected.
14. The detection system of claim 13, wherein the processing module is configured to perform a center extraction of the imaging spot to obtain a center of the imaging spot, the center of the imaging spot being a location of the imaging point.
15. The detection system of claim 13, wherein the incident angle of the incident light is less than 45 degrees or greater than 45 degrees;
the image acquisition component comprises a target surface for receiving the detection light, and the target surface of the image acquisition component is perpendicular to the incidence direction of the received detection light.
16. The detection system of claim 13, wherein the detection system further comprises: the second detection module is used for pre-scanning the object to be detected, obtaining the central position of the object to be detected, and obtaining a region to be detected according to the central position, wherein the region to be detected comprises the central position;
the processing module is used for processing the imaging light spots of the to-be-detected area of the to-be-detected object to obtain the position of the imaging point of the to-be-detected area.
17. The detection system of claim 16, wherein the second detection module is further configured to obtain an image of the object to be detected, and obtain a center position of the target to be detected based on the image of the object to be detected;
The processing module is used for processing the imaging light spot at the central position of the target to be detected to obtain the position of the imaging point at the central position of the target to be detected.
18. The detection system of claim 13, wherein the number of image acquisition assemblies is plural;
the imaging assembly includes: the beam splitter is used for transmitting the received detection light along a plurality of different light path directions and respectively projecting the detection light transmitted along the plurality of different light path directions into the image acquisition assembly.
19. The detection system of claim 13, wherein the illumination module comprises a light source assembly, a first slit element and a first mirror assembly sequentially arranged along the transmission direction of the light path, the light source assembly is used for generating a light beam, the first slit element is used for transmitting the light beam to generate linear incident light, and the first mirror assembly is used for focusing the linear incident light onto the object to be detected to generate a linear light spot.
20. The detection system of claim 19, wherein the light source assembly comprises: the light source and the shaping element are used for shaping the light emitted by the light source to generate linear light beams; the shaping element comprises: the shape of the incident port of the optical fiber bundle is matched with the shape of the light spot of the light emitted by the light source, and the optical fibers of the emergent port of the optical fiber bundle are arranged in a straight shape.
CN202210263513.6A 2022-03-17 2022-03-17 Detection method and detection system Pending CN116804633A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210263513.6A CN116804633A (en) 2022-03-17 2022-03-17 Detection method and detection system

Publications (1)

Publication Number Publication Date
CN116804633A true CN116804633A (en) 2023-09-26

Family

ID=88078460


Similar Documents

Publication Publication Date Title
JP4546830B2 (en) Dark field inspection system
US7224826B2 (en) Fluorescent intensity measuring method and apparatus
US6291816B1 (en) System and method for measuring object features with coordinated two and three dimensional imaging
JP4741986B2 (en) Optical inspection method and optical inspection apparatus
US20040109170A1 (en) Confocal distance sensor
CN110376573B (en) Laser radar installation and adjustment system and installation and adjustment method thereof
KR20130033401A (en) System and method for inspecting a wafer
KR20100083743A (en) System and method for inspecting a wafer
KR20100083745A (en) System and method for inspecting a wafer
US6052189A (en) Height measurement device and height measurement method
CN109540005B (en) Optical detection system and optical detection method
CN116046803A (en) Multi-channel detection system for defects of non-pattern wafer
CN114486910A (en) Device and method for detecting surface defects of planar optical element
CN113299575B (en) Focusing method and apparatus, focusing device, and storage medium
CN112731773A (en) Electron beam exposure machine, focusing method and device
JP2003017536A (en) Pattern inspection method and inspection apparatus
CN217655026U (en) Bright and dark field detection device
CN116804633A (en) Detection method and detection system
CN217505694U (en) Detection device
CN218067678U (en) Detection device
CN217605696U (en) Detection device
CN116794042A (en) Detection system and detection method
CN215263090U (en) Illumination module and detection device
CN110763689B (en) Surface detection device and method
CN218584684U (en) Detection system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination