CN216816499U - Detection system - Google Patents

Detection system

Info

Publication number
CN216816499U
CN216816499U · Application CN202122639168.5U
Authority
CN
China
Prior art keywords
detected
detector
light
detection
piece
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202122639168.5U
Other languages
Chinese (zh)
Inventor
刘健鹏
陈鲁
张嵩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Zhongke Feice Technology Co Ltd
Original Assignee
Shenzhen Zhongke Feice Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Zhongke Feice Technology Co Ltd
Priority to CN202122639168.5U
Application granted
Publication of CN216816499U
Legal status: Active
Anticipated expiration

Landscapes

  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The application discloses a detection system. The detection system comprises a light source, a detector and a lens. The light source is used for generating detection light that irradiates a piece to be detected, the lens receives the detection light reflected by the piece to be detected, and the detector images according to the reflected detection light. The detector is a time delay integration camera, and the lens is an object-side telecentric lens. In the detection system of the embodiment of the application, because the lens is an object-side telecentric lens, the chief rays of the imaging light path of the lens are parallel in every field of view, so the reflectivity of the piece to be detected does not vary with the angle of the light passing between the lens and the piece, and the detection image of the piece to be detected is uniform when the detector images according to the detection light reflected by the piece. In addition, because the detector is a time delay integration camera, its higher light efficiency compensates for the drop in light efficiency caused by the smaller aperture of the object-side telecentric lens, thereby ensuring the imaging effect of the detection image of the piece to be detected.

Description

Detection system
Technical Field
The present application relates to the field of semiconductor inspection technology, and more particularly, to an inspection system.
Background
At present, when a workpiece such as a wafer is inspected, an illumination light source is generally used to irradiate the piece to be detected, so that an imaging device can acquire a relatively clear image of the piece to be detected, from which the piece is then inspected.
SUMMARY OF THE UTILITY MODEL
The embodiment of the application provides a structure of a detection system.
The detection system of the embodiment of the application includes a light source, a detector and a lens. The light source is used for generating detection light that irradiates a piece to be detected, the lens receives the detection light reflected by the piece to be detected, and the detector is used for imaging according to the reflected detection light. The detector is a time delay integration camera, and the lens is an object-side telecentric lens.
In some embodiments, the light source comprises a first illumination device, and an illumination optical path of the first illumination device and an imaging optical path of the detector are coincident.
In some embodiments, the detection system includes a light splitter, the light source further includes a second illumination device, and the light emitted by the first illumination device is reflected by the light splitter and then vertically incident on the object to be detected; when the light splitting sheet is positioned on an imaging light path of the detector, the first lighting device irradiates the piece to be detected, so that the detector generates a first detection image according to the detection light reflected by the piece to be detected; when the light splitting sheet leaves an imaging light path of the detector, the second illuminating device irradiates the piece to be detected, so that the detector generates a second detection image according to the detection light reflected by the piece to be detected.
In some embodiments, the probe light emitted by the second illumination device forms an acute angle with a normal of the surface of the object.
In some embodiments, the angle between the light emitted by the second illumination device and the optical axis of the detector is in the interval [70 degrees, 85 degrees ].
In certain embodiments, the detection system further comprises a processor electrically connected to the first illumination device, the second illumination device, and the detector, the processor configured to: control the light splitting sheet to be positioned on an imaging light path of the detector and control the first lighting device to irradiate the piece to be detected, so that the detector generates a first detection image according to the detection light reflected by the piece to be detected; and/or control the light splitting sheet to leave the imaging light path of the detector and control the second lighting device to irradiate the piece to be detected, so that the detector generates a second detection image according to the detection light reflected and scattered by the piece to be detected; and detect the piece to be detected according to the first detection image and/or the second detection image.
In some embodiments, the processor is further configured to determine the wavelengths of the probe lights emitted by the first and second illumination devices according to the reflectivity of the surface to be inspected of the object to be inspected to different wavelengths of light and the type of defect to be inspected of the object to be inspected.
In some embodiments, the processor is further configured to relatively move the object to be tested and the detection system according to a predetermined trajectory, so that the detector respectively acquires the first detection image and the second detection image of the entire surface to be tested of the object to be tested when the first illumination device and the second illumination device illuminate the object to be tested.
In some embodiments, the predetermined trajectory includes a plurality of scanning trajectories, and the processor is further configured to control the detector to capture a first sub-image of a partial region of the surface to be detected while the first illumination device illuminates the piece to be detected and the piece to be detected moves along the scanning trajectories relative to the detection system; splice the first sub-images corresponding to the plurality of scanning trajectories to generate the first detection image; control the detector to capture a second sub-image of a partial region of the surface to be detected while the second illumination device illuminates the piece to be detected and the piece to be detected moves along the scanning trajectories; and splice the second sub-images corresponding to the plurality of scanning trajectories to generate the second detection image.
In some embodiments, the number of the detectors is multiple, and the multiple detectors are respectively used for imaging different areas of the surface to be measured; the processor is further configured to: forming a first image to be spliced according to the plurality of first sub-images acquired by each detector; splicing a plurality of first images to be spliced to obtain a first detection image; forming a second image to be spliced according to the plurality of second sub-images acquired by each detector; and splicing a plurality of the second images to be spliced to obtain the second detection image.
In some embodiments, when the imaging optical paths of a plurality of the detectors overlap, overlapping portions of adjacent first images to be stitched among the plurality of first images to be stitched are identified, and the plurality of first images to be stitched are spliced according to the overlapping portions of the adjacent first images to be stitched to obtain the first detection image; when the imaging optical paths of the plurality of detectors do not overlap, the plurality of first images to be stitched are stitched directly to obtain the first detection image. Likewise, when the imaging optical paths of the plurality of detectors overlap, overlapping portions of adjacent second images to be stitched among the plurality of second images to be stitched are identified, and the plurality of second images to be stitched are spliced according to those overlapping portions to obtain the second detection image; when the imaging optical paths of the plurality of detectors do not overlap, the plurality of second images to be stitched are stitched directly to obtain the second detection image.
In some embodiments, the integration order of the detector is between 128 and 512 times that of a single line scan.
In the detection system of the embodiment of the application, because the lens is an object-side telecentric lens, the chief rays of the imaging light path of the lens are parallel in every field of view, so the reflectivity of the piece to be detected does not vary with the angle of the light passing between the lens and the piece, and the detection image of the piece to be detected is uniform when the detector images according to the detection light reflected by the piece. In addition, because the detector is a time delay integration camera, its higher light efficiency compensates for the drop in light efficiency caused by the smaller aperture of the object-side telecentric lens, thereby ensuring the imaging effect of the detection image of the piece to be detected.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIGS. 1 and 2 are schematic plan views of a detection system according to certain embodiments of the present application;
fig. 3-6 are schematic diagrams of a scenario of a detection system according to some embodiments of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below by referring to the drawings are exemplary only for explaining the embodiments of the present application, and are not construed as limiting the embodiments of the present application.
In the description of the present application, it is to be understood that the terms "center," "longitudinal," "lateral," "length," "width," "thickness," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," "clockwise," and "counterclockwise" indicate orientations or positional relationships based on those illustrated in the figures; they are used only for convenience in describing the present application and to simplify the description, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and thus are not to be considered limiting of the present application. Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
In the description of the present application, it is to be noted that, unless otherwise explicitly specified or limited, the terms "mounted" and "connected" are to be construed broadly, e.g., as meaning a fixed connection, a removable connection, or an integral connection; a mechanical connection, an electrical connection, or communication between elements; and a direct connection or an indirect connection through intervening media, or an internal relationship between two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as the case may be.
In this application, unless expressly stated or limited otherwise, the first feature "on" or "under" the second feature may comprise direct contact of the first and second features, or may comprise contact of the first and second features not directly but through another feature in between. Also, the first feature "on," "above" and "over" the second feature may include the first feature being directly above and obliquely above the second feature, or simply indicating that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature includes the first feature being directly under and obliquely below the second feature, or simply meaning that the first feature is at a lesser elevation than the second feature.
The following disclosure provides many different embodiments or examples for implementing different features of the application. In order to simplify the disclosure of the present application, specific example components and arrangements are described below. Of course, they are merely examples and are not intended to limit the present application. Moreover, the present application may repeat reference numerals and/or letters in the various examples, such repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed. In addition, examples of various specific processes and materials are provided herein, but one of ordinary skill in the art may recognize applications of other processes and/or use of other materials.
Referring to fig. 1, an exemplary embodiment of a detection system 100 is provided. The detection system 100 includes a light source 10, a detector 20 and a lens 30. The light source 10 is configured to generate detection light that irradiates the piece 200 to be detected, the lens 30 receives the detection light reflected by the piece 200 to be detected, and the detector 20 is configured to image according to the reflected detection light. The detector 20 is a time delay integration camera, and the lens 30 is an object-side telecentric lens.
The piece 200 to be detected may be made of a material whose conductivity lies between that of a conductor and an insulator, such as an elemental semiconductor, an inorganic compound semiconductor, an organic compound semiconductor, or another crystalline semiconductor. In the embodiment of the present application, the piece 200 to be detected is described taking a wafer as an example; it can be understood that the piece 200 to be detected is not limited to a wafer and may also be an optical element such as a concave lens, a convex lens, or a mirror.
In the detection system 100 of the application, because the lens 30 is an object-side telecentric lens, the chief rays of the imaging light path of the lens 30 are parallel in every field of view, so the reflectivity of the piece 200 to be detected does not vary with the angle of the light passing between the lens 30 and the piece 200 to be detected. When the detector 20 images according to the detection light reflected by the piece 200 to be detected, the uniformity of the detection image of the piece 200 to be detected is therefore good. In addition, because the detector 20 is a time delay integration camera, its higher light efficiency compensates for the drop in light efficiency caused by the smaller aperture of the object-side telecentric lens 30, thereby ensuring the imaging effect of the detection image of the piece 200 to be detected.
The following is further described with reference to the accompanying drawings.
Referring to fig. 1, the detection system 100 includes a light source 10, a detector 20, and a lens 30.
The light source 10 includes a first illumination device 11 and a second illumination device 12. The first illumination device 11 and the second illumination device 12 may be fiber-coupled light sources (fiber lasers); light emitted by a fiber-coupled light source is little affected by external factors, so a high-brightness light output can be achieved and the illumination effect on the piece 200 to be detected is good.
Referring to fig. 2 and 3, the illumination optical path of the first illumination device 11 and the imaging optical path of the detector 20 are kept coincident.
Specifically, in certain embodiments, the detection system 100 further includes a beam splitter 40. The light emitted from the first illumination device 11 is reflected by the beam splitter 40 and is then perpendicularly incident on the piece 200 to be detected. As shown in fig. 3, the beam splitter 40 is inclined at an angle α of 45 degrees relative to the first illumination device 11, so when the detection light emitted by the first illumination device 11 reaches the beam splitter 40, the beam splitter 40 reflects it with a reflection angle β of 45 degrees, and the light enters the piece 200 to be detected perpendicularly; similarly, the light perpendicularly incident on the piece 200 to be detected is reflected by the piece 200 to be detected back toward the detector 20 along the same direction. The position of the beam splitter 40 coincides with the optical axis of the detector 20, which ensures that the detection light reflected by the beam splitter 40 coincides with the imaging light path of the detector 20, that is, the illumination light path of the first illumination device 11 and the imaging light path of the detector 20 remain coincident.
Referring to fig. 1, fig. 2 and fig. 4, the detection light emitted from the second illumination device 12 forms an acute angle with the normal of the surface of the piece 200 to be detected, i.e., the detection light emitted from the second illumination device 12 is obliquely incident on the piece 200 to be detected.
As shown in fig. 4, the angle between the detection light emitted from the second illumination device 12 and the normal of the surface of the piece 200 to be detected is γ, which is acute, and the angle δ between that light and the optical axis of the detector 20 lies within the interval [70 degrees, 85 degrees]. Since the sum of γ and δ is 90 degrees, γ lies within the interval [15 degrees, 20 degrees]. That is, γ may be any value between 15 and 20 degrees, for example 15, 16, 16.4, 17, 18.1, 19 or 20 degrees, and δ may be any value between 70 and 85 degrees, for example 70, 71, 71.5, 72.3, 75, 77.7, 80, 82, 84 or 85 degrees. When γ increases, δ decreases, and when γ decreases, δ increases.
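For illustration only, the relationship between the two angles can be captured by a single check, since γ and δ are complementary and each must stay within its interval. The Python sketch below is not part of the claimed detection system; the function name and degree units are assumptions of the sketch.

def check_dark_field_geometry(gamma_deg: float) -> float:
    """Given the angle gamma between the dark field detection light and the
    surface normal of the piece to be detected, return the angle delta to the
    optical axis of the detector, checking both against the ranges above."""
    delta_deg = 90.0 - gamma_deg              # gamma and delta are complementary
    if not 15.0 <= gamma_deg <= 20.0:
        raise ValueError(f"gamma = {gamma_deg} deg lies outside [15, 20]")
    if not 70.0 <= delta_deg <= 85.0:
        raise ValueError(f"delta = {delta_deg} deg lies outside [70, 85]")
    return delta_deg


print(check_dark_field_geometry(17.0))        # -> 73.0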
Further, when the beam splitter 40 is located in the imaging optical path of the detector 20, the first illumination device 11 illuminates the piece 200 to be detected so that the detector 20 generates a first detection image from the detection light reflected by the piece 200 to be detected. When the beam splitter 40 leaves the imaging optical path of the detector 20, the second illumination device 12 illuminates the piece 200 to be detected so that the detector 20 generates a second detection image from the detection light reflected by the piece 200 to be detected. The first illumination device 11 is a bright field light source, and the second illumination device 12 is a dark field light source; accordingly, the first detection image generated by the detector 20 is a bright field image, and the second detection image generated by the detector 20 is a dark field image.
Specifically, a first predetermined station and a second predetermined station may be set in the detection system 100. As shown in fig. 3 and 4, the first predetermined station is located within the field of view of the imaging device and the second predetermined station is located outside it; that is, the first predetermined station is on the imaging optical path of the detector 20 and the second predetermined station is not. When the first illumination device 11 is located at the first predetermined station, the light source provided for the detector 20 is only the first illumination device 11, and the first illumination device 11 also drives the beam splitter 40 into the first predetermined station, so that the bright field light emitted by the first illumination device 11 is directed by the beam splitter 40 into the imaging optical path of the detector 20 to generate a first detection image, i.e., a bright field image, of the piece 200 to be detected. When the first illumination device 11 is switched from the first predetermined station to the second predetermined station, it drives the beam splitter 40 out of the imaging optical path of the detector 20, so that the light source provided for the detector 20 is only the second illumination device 12; the dark field light emitted by the second illumination device 12 is reflected by the piece 200 to be detected directly into the imaging optical path of the detector 20, and the detector 20 can generate a second detection image, namely a dark field image, of the piece 200 to be detected.
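For illustration only, the two-station switching sequence described above can be summarized as a small control skeleton. The Python sketch below is an assumption: the class name and the move_to/on/off/grab calls stand in for whatever stage, illuminator and camera interfaces a real implementation would use, and are not taken from the patent.

from dataclasses import dataclass

@dataclass
class TwoStationIlluminationController:
    """Illustrative controller for the two-station scheme; names are hypothetical."""
    stage: object         # motion stage carrying illumination device 11 and beam splitter 40
    bright_field: object  # first illumination device 11
    dark_field: object    # second illumination device 12
    detector: object      # detector 20 (TDI camera)

    def capture_first_detection_image(self):
        """Bright field: beam splitter 40 sits in the imaging optical path."""
        self.dark_field.off()
        self.stage.move_to("first_predetermined_station")
        self.bright_field.on()
        return self.detector.grab()   # bright field image

    def capture_second_detection_image(self):
        """Dark field: beam splitter 40 has left the imaging optical path."""
        self.bright_field.off()
        self.stage.move_to("second_predetermined_station")
        self.dark_field.on()
        return self.detector.grab()   # dark field image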
As shown in fig. 5, the reflective surface of the piece 200 to be detected produces specular reflection. When the piece 200 to be detected is illuminated by the first illumination device 11, i.e., the bright field light source, the extent of the bright field light source should lie within twice the field of view of the detector 20 so as to provide uniform illumination and ensure that the detector 20 can form a first detection image of high contrast. When the dark field light source irradiates the piece 200 to be detected, the dark field light source needs to lie outside twice the field of view of the detector 20 to ensure that the light specularly reflected by the piece 200 to be detected does not enter the detector 20, thereby ensuring the imaging effect of the three-dimensional structure on the surface of the piece 200 to be detected.
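For illustration only, the placement rule for the two light sources reduces to a simple check against twice the field of view. The sketch below assumes that the "range" of a light source can be expressed as a single lateral extent comparable with the detector field of view; the parameter names are hypothetical.

def source_placement_ok(source_extent: float, field_of_view: float,
                        bright_field: bool) -> bool:
    """A bright field source should lie within twice the detector field of view,
    while a dark field source must lie outside that range so that specularly
    reflected light does not enter the detector."""
    limit = 2.0 * field_of_view
    return source_extent <= limit if bright_field else source_extent > limit


print(source_placement_ok(35.0, 15.0, bright_field=True))    # False: wider than twice the field of view
print(source_placement_ok(40.0, 15.0, bright_field=False))   # True: outside twice the field of view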
Currently, the image of a workpiece is usually obtained either by bright field imaging, which provides an image with good contrast within the resolution range of the optical system, or by dark field imaging, which can reveal minute defects smaller than the resolving capability of the optical system. However, the light source brightness and the imaging angle of a bright field imaging system and a dark field imaging system differ greatly; if bright field imaging and dark field imaging are performed simultaneously, the bright field image and the dark field image are easily mixed together and difficult to distinguish, and the bright field light source and the dark field light source interfere with each other, so that light energy is lost and the imaging effect is poor.
The detection system 100 according to the embodiment of the present application is configured with the first predetermined station and the second predetermined station, so that when imaging the piece 200 to be detected, the position of the first illumination device 11 can be switched to generate the first detection image and the second detection image of the piece 200 to be detected respectively. Since the first illumination device 11 is a bright field light source and the second illumination device 12 is a dark field light source, both a bright field image and a dark field image can be acquired: accurate image information of the planar structure of the piece 200 to be detected is obtained from the bright field light source, and accurate image information of the three-dimensional structure is obtained from the dark field light source, which ensures the accuracy of the detection result. Moreover, after the first illumination device 11 is switched to the second predetermined station, it no longer hinders the detection light emitted by the second illumination device 12 from entering the imaging light path of the imaging device, so the energy of the detection light is not lost and the imaging effect of the generated image of the piece 200 to be detected is ensured.
It should be noted that, since the second illumination device 12 is a dark field light source, the smaller the angle δ between the detection light emitted by the second illumination device 12 and the optical axis of the detector 20, the higher the signal-to-noise ratio of the generated image. In the detection system 100 of the embodiment of the present application, the angle δ lies within the interval [70 degrees, 85 degrees], which preserves the dark field illumination of the piece 200 to be detected while preventing the signal-to-noise ratio of the second detection image, generated by the detector 20 from the detection light reflected by the piece 200 to be detected, from dropping too low, thereby ensuring the imaging effect of the second detection image.
Further, when the second illumination device 12 illuminates the piece 200 to be detected, the first illumination device 11 has been switched from the first predetermined station to the second predetermined station, so that the beam splitter 40 has left the imaging optical path. The detection light emitted by the second illumination device 12 toward the piece 200 to be detected can therefore enter the imaging optical path directly, energy loss of that detection light is avoided, and the imaging effect of the second detection image is further improved.
Referring to fig. 1 and 2, the detector 20 is a Time Delay Integration (TDI) camera, and the lens 30 is an object-side telecentric lens.
At present, the lens in a detector is usually an imaging lens with an object-side non-telecentric structure. With such a lens, the chief rays of the different fields of view in the detector 20 are not parallel: only the chief ray of the central field of view is parallel to the optical axis of the lens 30, that is, the chief ray angle of the central field of view is zero, while the chief rays of the other fields of view are inclined at some angle.
When the piece 200 to be detected is a wafer, the reflection from the back surface of the wafer is usually specular, and a multilayer film is attached to the back surface, so the light travelling between the detector 20 and the back surface of the wafer undergoes an interference reflection effect. This effect makes the reflectivity and the reflection spectrum change with angle. Therefore, when the lens 30 in the detector 20 is an imaging lens with an object-side non-telecentric structure, the chief ray angles of the different fields of view differ, the resulting reflectivity varies across the field of view, and the image of the piece 200 to be detected finally produced by the detector 20 is bright in the middle and dark on both sides, or dark in the middle and bright on both sides. When images of different positions of the piece 200 to be detected are spliced to obtain the final detection image, seams then appear at the joints between those images, which can lead to misjudgment of the detection result of the piece 200 to be detected.
Therefore, in the detection system 100 of the embodiment of the present application, the lens 30 is an object-side telecentric lens, which ensures that the chief rays of all fields of view between the lens 30 and the piece 200 to be detected are parallel. This avoids the problem that, when the light undergoes interference on reflection at the multilayer film on the back of the wafer, the reflectivity changes with angle, so the uniformity of the detection image generated by the detector 20 is good, uneven brightness is avoided, and misjudgment of the detection result of the piece 200 to be detected is prevented.
Moreover, the detector 20 is a Time Delay Integration (TDI) camera. As noted above, the lens 30 of the detection system 100 in the embodiment of the present application is an object-side telecentric lens, whose relatively small aperture reduces the light efficiency. Using a TDI camera as the detector 20 raises the light efficiency, compensating for the energy reduction caused by the smaller aperture of the object-side telecentric lens and thereby ensuring the quality of the detection image output by the detector 20.
The integration order of the detector 20 is between 128 and 512 times that of a single line scan, for example 128, 256 or 512. The integration order of the detector 20 is related to its response efficiency: the higher the integration order, the higher the response efficiency of the detector 20, which compensates for the energy reduction caused by the smaller aperture of the object-side telecentric lens. By using a TDI camera with a higher integration order, the light efficiency of the first detection image and the second detection image finally generated by the detector 20, i.e., of the bright field image and the dark field image, is higher, thereby ensuring the imaging effect of the detection image of the piece 200 to be detected.
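For illustration only, the benefit of a higher integration order can be shown with an idealized model in which a TDI sensor accumulates one line exposure per stage, so the collected signal grows roughly linearly with the integration order. The sketch below is only that idealized model, not the actual response of the detector 20.

def tdi_relative_signal(integration_order: int, single_line_signal: float = 1.0) -> float:
    """Idealized TDI model: the same object line is exposed once per stage while
    the charge shifts in step with the scan, so the accumulated signal grows
    roughly linearly with the integration order (noise is ignored here)."""
    return integration_order * single_line_signal


for order in (128, 256, 512):                 # integration orders mentioned above
    print(order, tdi_relative_signal(order))  # 128x, 256x, 512x the single-line signal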
Referring to fig. 1, in some embodiments, the detection system 100 further includes a processor 50, and the processor 50 is electrically connected to the first illumination device 11, the second illumination device 12, and the detector 20. The processor 50 is configured to: control the beam splitter 40 to be located in the imaging optical path of the detector 20 and control the first illumination device 11 to illuminate the piece 200 to be detected, so that the detector 20 generates a first detection image according to the detection light reflected by the piece 200 to be detected; control the beam splitter 40 to leave the imaging optical path of the detector 20 and control the second illumination device 12 to illuminate the piece 200 to be detected, so that the detector 20 generates a second detection image according to the detection light reflected by the piece 200 to be detected; and detect the piece 200 to be detected according to the first detection image and/or the second detection image.
Specifically, when inspecting the piece 200 to be detected, as shown in fig. 3, the processor 50 moves the first illumination device 11 to the first predetermined station to control the beam splitter 40 to be located on the imaging optical path of the detector 20, and controls the first illumination device 11 to illuminate the piece 200 to be detected. Since the first illumination device 11 is the bright field light source, the first detection image generated by the detector 20 according to the detection light reflected by the piece 200 to be detected is a bright field image.
As shown in fig. 4, the processor 50 can also move the first illumination device 11 to the second predetermined station, which is not located in the imaging optical path of the detector 20; that is, the processor 50 controls the beam splitter 40 to leave the imaging optical path of the detector 20. At this time the first illumination device 11 no longer emits light, and instead the second illumination device 12 illuminates the piece 200 to be detected. Since the second illumination device 12 is the dark field light source, the second detection image generated by the detector 20 according to the light reflected by the piece 200 to be detected is a dark field image.
Thus, the detection system 100 can obtain a bright field image and a dark field image of the piece 200 to be detected respectively, and, as noted above, because a TDI camera is used as the detector 20, a bright field image and a dark field image with high contrast can be obtained. The bright field light source allows accurate image information of the planar structure of the piece 200 to be detected to be acquired, and the dark field light source allows accurate image information of the three-dimensional structure of the piece 200 to be detected to be acquired.
When the processor 50 detects the to-be-detected piece 200 according to the first detection image, the accuracy of detecting the planar structure of the to-be-detected piece 200 can be ensured, and when the processor 50 detects the to-be-detected piece 200 according to the second detection image, the accuracy of detecting the surface three-dimensional structure of the to-be-detected piece 200 can be ensured. When the processor 50 detects the to-be-detected piece 200 according to the first detection image and the second detection image, the accuracy of detecting the whole structure of the to-be-detected piece 200 can be ensured.
In addition, since the first illumination device 11 is switched to the second predetermined station and exits the imaging optical path of the detector 20 before the dark field image is acquired, it is ensured that, when the piece 200 to be detected is irradiated by the second illumination device 12, the light reflected by the piece 200 to be detected does not pass through the beam splitter 40, thereby avoiding energy loss.
In some embodiments, the processor 50 is configured to determine the wavelengths emitted by the first and second illumination devices 11 and 12 according to the reflectivity of the surface to be inspected of the piece 200 to different wavelengths of light and the type of defect to be inspected of the piece 200.
Specifically, in one embodiment, before inspecting the piece 200 to be detected, the detection system 100 may determine the wavelengths of the detection light emitted by the first illumination device 11 and the second illumination device 12 according to the reflectivity of the surface to be detected of the piece 200 to be detected to light of different wavelengths. For example, when the surface to be detected reflects green light with a wavelength of 492 nm to 577 nm most strongly, the wavelengths of the detection light emitted by the first illumination device 11 and the second illumination device 12 may be set to 492 nm to 577 nm, i.e., green detection light. In this way, when the first illumination device 11 or the second illumination device 12 illuminates the piece 200 to be detected, the intensity of the light reflected by the piece 200 to be detected toward the detector 20 is the strongest, so the imaging effect of the generated first detection image and second detection image is ensured.
In another embodiment, before inspecting the object 200, the inspection system 100 may determine the wavelengths of the probe lights emitted by the first and second illumination devices 11 and 12 according to the types of defects to be inspected on the object 200. For example, when the defect type is a bump, if the probe light emitted from the first illumination device 11 or the second illumination device 12 is white light of 400nm to 760nm, the defect can be better displayed in the first inspection image and the second inspection image, and therefore, the inspection system 100 can set the wavelength of the probe light emitted from the first illumination device 11 and the second illumination device 12 to 400nm to 760nm and emit white probe light. For another example, when the type of the defect to be detected of the device under test 200 is a scratch, if the detection light emitted by the first illumination device 11 or the second illumination device 12 is green light with a wavelength of 492nm to 577nm, the defect lower than the surface of the device under test 200, i.e., the scratch, can be more clearly exposed, and therefore, the detection system 100 can set the wavelength of the detection light emitted by the first illumination device 11 and the second illumination device 12 to 492nm to 577nm and emit green detection light. Therefore, the detection system 100 can be ensured to identify the type of the defect to be detected of the object 200 more accurately through the first detection image and the second detection image.
In yet another embodiment, before inspecting the piece 200 to be detected, the detection system 100 may determine the wavelengths of the detection light emitted by the first illumination device 11 and the second illumination device 12 according to both the reflectivity of the surface to be detected to light of different wavelengths and the type of the defect to be detected. For example, when the surface to be detected reflects green light with a wavelength of 492 nm to 577 nm most strongly and the type of the defect to be detected is a scratch, the detection system 100 may set the wavelengths of the detection light emitted by the first illumination device 11 and the second illumination device 12 to 492 nm to 577 nm and emit green detection light, thereby ensuring both the imaging effect of the first detection image and the second detection image and the accuracy of identifying the type of the defect. For another example, when the surface to be detected reflects red light with a wavelength of 622 nm to 770 nm most strongly and the type of the defect to be detected is a bump, setting both illumination devices to red light would not ensure that the detection system 100 can accurately identify the defect type from the first detection image and the second detection image; the detection system 100 may therefore set the wavelength of the detection light emitted by the first illumination device 11 to 622 nm to 770 nm (red detection light) and the wavelength of the detection light emitted by the second illumination device 12 to 400 nm to 760 nm (white detection light), thereby ensuring both the imaging effect of the first detection image and the accuracy of identifying the type of the defect through the second detection image.
Therefore, the detection system 100 determines the wavelengths emitted by the first illumination device 11 and the second illumination device 12 according to the reflectivity of the surface to be detected of the piece 200 to be detected to different wavelengths and the type of the defect to be detected. On the one hand, this ensures that the surface to be detected strongly reflects the detection light emitted by the first illumination device 11 and the second illumination device 12, which ensures the imaging effect; on the other hand, it ensures that the detection system 100 can accurately identify the type of the defect to be detected from the generated detection images.
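For illustration only, this selection logic can be expressed as a lookup keyed by defect type with a reflectivity-based fallback. The sketch below is a simplification with assumed table contents and function names; as the examples above show, the two illumination devices may even be assigned different bands, which this sketch does not model.

# Hypothetical lookup used only to illustrate the selection logic above;
# actual bands, defect types and reflectivity data come from the inspection recipe.
BAND_BY_DEFECT = {
    "bump":    (400, 760),   # white light shows protruding defects well
    "scratch": (492, 577),   # green light exposes defects below the surface
}

def choose_probe_band(defect_type, reflectivity_by_band):
    """Prefer the band suited to the defect type; otherwise fall back to the band
    that the surface to be detected reflects most strongly."""
    if defect_type in BAND_BY_DEFECT:
        return BAND_BY_DEFECT[defect_type]
    return max(reflectivity_by_band, key=reflectivity_by_band.get)


print(choose_probe_band("scratch", {(492, 577): 0.8, (622, 770): 0.6}))  # (492, 577)
print(choose_probe_band("pit",     {(492, 577): 0.8, (622, 770): 0.9}))  # (622, 770)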
Referring to fig. 1, in some embodiments, the processor 50 is further configured to relatively move the object 200 and the inspection system 100 according to a predetermined trajectory, so that the detector 20 respectively obtains a first inspection image and a second inspection image of the entire surface of the object 200 when the first illumination device 11 and the second illumination device 12 illuminate the object 200.
Referring to fig. 6, the detection system 100 further includes a moving platform 60. The piece 200 to be detected is placed on the moving platform 60, and the moving platform 60 can drive the piece 200 to be detected to move along the first direction, the second direction, the third direction and the fourth direction. Therefore, when the processor 50 controls the moving platform 60 to drive the piece 200 to be detected along the predetermined track, the detector 20 can respectively acquire the first detection image and the second detection image of the entire surface to be detected while the first illumination device 11 and the second illumination device 12 illuminate the piece 200 to be detected. For example, when the processor 50 controls the moving platform 60 to move in the first direction, the first illumination device 11 illuminates the piece 200 to be detected so that the detector 20 acquires the first detection image; when the processor 50 controls the moving platform 60 to move along the second direction, the first illumination device 11 is switched to the second predetermined station (as shown in fig. 4) and the second illumination device 12 illuminates the piece 200 to be detected so that the detector 20 acquires the second detection image.
Referring to fig. 6, the predetermined track includes a plurality of scanning tracks, for example a scanning track M, a scanning track N, a scanning track P and a scanning track Q, and the number of the detectors 20 is one. The processor 50 is configured to control the detector 20 to capture a first sub-image of a partial area of the surface to be detected when the first illumination device 11 illuminates the piece 200 to be detected and the piece 200 to be detected moves along a scanning track relative to the detection system 100, and to splice the first sub-images corresponding to the plurality of scanning tracks to generate the first detection image. The processor 50 is further configured to control the detector 20 to capture a second sub-image of a partial area of the surface to be detected when the second illumination device 12 illuminates the piece 200 to be detected and the piece 200 to be detected moves along a scanning track, and to splice the second sub-images corresponding to the plurality of scanning tracks to generate the second detection image.
Specifically, the moving platform 60 drives the piece 200 to be detected to move in the first direction along the scanning track M and the scanning track P, and in the second direction along the scanning track N and the scanning track Q. If the first illumination device 11 illuminates the piece 200 to be detected during these movements, four first sub-images corresponding to the four scanning tracks are generated, and the processor 50 can splice the four first sub-images to generate the first detection image. When the processor 50 controls the moving platform 60 to drive the piece 200 to be detected along the predetermined track, the piece may first move along the scanning track M in the first direction, then translate along the fourth direction to the position of the scanning track N, move along the scanning track N in the second direction, translate along the fourth direction to the position of the scanning track P, move along the scanning track P in the first direction, translate along the fourth direction to the position of the scanning track Q, and finally move along the scanning track Q in the second direction, so as to complete the scanning of the entire surface to be detected of the piece 200 to be detected.
Likewise, when the moving platform 60 drives the piece 200 to be detected to move in the first direction along the scanning track M and the scanning track P, and in the second direction along the scanning track N and the scanning track Q, if the piece 200 to be detected is illuminated by the second illumination device 12, four second sub-images corresponding to the four scanning tracks are generated, and the processor 50 can splice the four second sub-images to generate the second detection image. When the second illumination device 12 illuminates the piece 200 to be detected, the first illumination device 11 has been switched to the second predetermined station, so that the beam splitter 40 leaves the imaging optical path of the detector 20 and the first illumination device 11 no longer illuminates the piece 200 to be detected. The moving platform 60 is moved in the same manner as when the first detection image is generated.
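For illustration only, the scan-and-splice step amounts to acquiring one strip per scanning track and concatenating the strips. The sketch below assumes the strips are already aligned and of equal height; grab_strip is a hypothetical stand-in for acquisition by the detector 20.

import numpy as np

def scan_and_stitch(grab_strip, tracks=("M", "N", "P", "Q")):
    """Drive the piece to be detected along each scanning track, grab one
    sub-image (strip) per track, and splice the strips side by side into one
    detection image."""
    strips = [np.asarray(grab_strip(track)) for track in tracks]
    return np.concatenate(strips, axis=1)    # adjacent strips along the stepping direction


# usage with a dummy grabber standing in for the detector 20
fake_grab = lambda track: np.zeros((1024, 256))
print(scan_and_stitch(fake_grab).shape)      # (1024, 1024)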
The scanning track M, the scanning track N, the scanning track P, and the scanning track Q are symmetrically arranged along the center of the moving platform 60, so that after the moving platform 60 drives the to-be-detected part 200 to move according to 4 scanning tracks, the detection image of all areas of the to-be-detected part 200 can be obtained.
In some embodiments, the width of the piece 200 to be detected may be an integral multiple of the width of the field of view of the detector 20 (as shown in fig. 6, both widths are measured along the third direction or the fourth direction). When the width of the piece 200 to be detected equals the width of the field of view of the detector 20, the moving platform 60 only needs to drive the piece 200 to be detected once along a scanning track in the first direction or the second direction for the detector 20 to acquire the complete first detection image or second detection image. When the width of the piece 200 to be detected is twice the width of the field of view of the detector 20, one pass along a scanning track lets the detector 20 acquire the first sub-image or second sub-image of half of the piece 200 to be detected, and repeating the pass acquires the first sub-image or second sub-image of the other half, so that the first detection image or second detection image is obtained by splicing the two first sub-images or the two second sub-images. In this way, when a plurality of first sub-images or second sub-images are spliced, different sub-images contain no repeated images of the same region of the piece 200 to be detected, which facilitates the splicing.
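For illustration only, the integral-multiple condition translates directly into the number of scan passes required, as in the following small arithmetic sketch with assumed dimensions.

import math

def passes_needed(piece_width: float, fov_width: float) -> int:
    """Number of scan passes needed to cover the surface to be detected when the
    piece width is an integral multiple of the detector field-of-view width."""
    passes = piece_width / fov_width
    if not math.isclose(passes, round(passes)):
        raise ValueError("piece width should be an integral multiple of the field of view")
    return round(passes)


print(passes_needed(300.0, 300.0))   # same width as the field of view -> 1 pass
print(passes_needed(300.0, 150.0))   # twice the field of view         -> 2 passes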
In addition, there may be a plurality of detectors 20. For example, when there are two detectors 20 and the moving platform 60 drives the piece 200 to be detected along a scanning track, the two detectors 20 can acquire two first sub-images or two second sub-images at the same time. It can be understood that the larger the number of detectors 20, the larger the area of the piece 200 to be detected that is imaged in one pass along a scanning track, and the number of scanning tracks can be correspondingly reduced. In this way, the number of splices needed to form the first detection image and the second detection image is reduced, which ensures the imaging effect of the first detection image and the second detection image and therefore the accuracy of detecting the piece 200 to be detected.
When the processor 50 controls the moving platform 60 to move along the scanning tracks, the sub-images acquired by each detector 20 form one image to be stitched, and the processor 50 stitches the images to be stitched into a detection image. As shown in fig. 6, there are two detectors 20. One detector 20 corresponds to the scanning track M and the scanning track N: after the moving platform 60 drives the piece 200 to be detected to move in the first direction and then in the second direction while the first illumination device 11 illuminates it, this detector 20 forms two first sub-images, and the processor 50 forms a first image to be stitched from the two first sub-images corresponding to the scanning track M and the scanning track N. The other detector 20 corresponds to the scanning track P and the scanning track Q: during the same movements it forms another two first sub-images, and the processor 50 forms another first image to be stitched from the two first sub-images corresponding to the scanning track P and the scanning track Q. The processor 50 then stitches the two first images to be stitched to obtain the first detection image. Similarly, when the processor 50 controls the moving platform 60 to move along the scanning tracks, each detector 20 also acquires two second sub-images to form a second image to be stitched, and the processor 50 stitches the plurality of second images to be stitched to obtain the second detection image.
It should be noted that the imaging optical paths of the multiple detectors 20 may or may not overlap. For example, with two detectors 20 whose imaging optical paths overlap, before the processor 50 forms the first detection image from the plurality of first images to be stitched, it first identifies the overlapping portion of the two first images to be stitched; when stitching them, the processor 50 cuts the overlapping portion from one of the first images to be stitched, so that the stitched result contains only one copy of the overlapping portion, thereby forming the first detection image of the piece 200 to be detected. Similarly, before forming the second detection image from the plurality of second images to be stitched, the processor 50 identifies the overlapping portion of the two second images to be stitched and cuts it from one of them when stitching, so that only one copy of the overlapping portion remains, thereby forming the second detection image of the piece 200 to be detected.
If the imaging optical paths of the two detectors 20 do not overlap, the processor 50 can directly stitch the plurality of first images to be stitched to obtain the first detection image, and directly stitch the plurality of second images to be stitched to obtain the second detection image.
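For illustration only, the two stitching cases (overlapping and non-overlapping imaging optical paths) reduce to either cutting the duplicated region once or concatenating directly. The sketch below assumes the overlap is a known, fixed number of pixel columns.

import numpy as np

def stitch_pair(left: np.ndarray, right: np.ndarray, overlap_px: int) -> np.ndarray:
    """Splice two images to be stitched from two detectors: when their imaging
    optical paths overlap by overlap_px columns, the duplicated columns are cut
    from one image before joining; with no overlap they are joined directly."""
    if overlap_px > 0:
        right = right[:, overlap_px:]        # keep only one copy of the overlapping region
    return np.concatenate([left, right], axis=1)


a = np.ones((4, 6))
b = np.ones((4, 6))
print(stitch_pair(a, b, overlap_px=2).shape)   # (4, 10) - overlap removed once
print(stitch_pair(a, b, overlap_px=0).shape)   # (4, 12) - no overlap, direct splice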
Finally, it should be noted that the present application protects the structural arrangement of the detection system; the steps executed by the processor are all prior art, and the present application makes no improvement to them, that is, the steps executed by the processor in the present application are conventional technical means.
In the description herein, reference to the terms "certain embodiments," "one embodiment," "some embodiments," "illustrative embodiments," "examples," "specific examples," or "some examples" means that a particular feature, structure, material, or characteristic described in connection with the embodiments or examples is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations of the above embodiments may be made by those of ordinary skill in the art within the scope of the present application, which is defined by the claims and their equivalents.

Claims (12)

1. A detection system, characterized in that it comprises a light source, a detector and a lens, wherein the light source is used for generating detection light that irradiates a piece to be detected, the lens receives the detection light reflected by the piece to be detected, and the detector is used for imaging according to the reflected detection light; the detector is a time delay integration camera, and the lens is an object-side telecentric lens.
2. The detection system according to claim 1, wherein the light source comprises a first illumination device, and an illumination light path of the first illumination device and an imaging light path of the detector are coincident.
3. The detection system according to claim 2, wherein the detection system comprises a light splitting sheet, the light source further comprises a second illumination device, and the light emitted from the first illumination device is reflected by the light splitting sheet and then perpendicularly incident on the piece to be detected; when the light splitting sheet is positioned on an imaging light path of the detector, the first illumination device irradiates the piece to be detected, so that the detector generates a first detection image according to the detection light reflected by the piece to be detected; when the light splitting sheet leaves the imaging light path of the detector, the second illumination device irradiates the piece to be detected, so that the detector generates a second detection image according to the detection light reflected by the piece to be detected.
4. The inspection system of claim 3, wherein the probe light emitted from the second illumination device forms an acute angle with a normal to the surface of the dut.
5. The detection system according to claim 4, wherein the angle between the detection light emitted by the second illumination device and the optical axis of the detector lies in the interval [70 degrees, 85 degrees].
6. The detection system according to claim 3, further comprising a processor electrically connected to the first illumination device, the second illumination device and the detector, the processor being configured to:
control the beam splitter to be positioned on the imaging light path of the detector and control the first illumination device to irradiate the piece to be detected, so that the detector generates the first detection image according to the detection light reflected by the piece to be detected; and/or
control the beam splitter to move out of the imaging light path of the detector and control the second illumination device to irradiate the piece to be detected, so that the detector generates the second detection image according to the detection light reflected by the piece to be detected; and
detect the piece to be detected according to the first detection image and/or the second detection image.
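By way of a purely illustrative sketch of the control flow recited in claim 6 (not part of the claimed subject matter), the processor could sequence the two illumination modes as follows; the beam splitter, illumination and detector driver objects and their method names are hypothetical:

```python
def acquire_detection_images(beam_splitter, illum_first, illum_second, detector,
                             take_first=True, take_second=True):
    """Sequence the two illumination modes of claim 6.

    Order of operations only: beam splitter into the imaging path + first
    illumination -> first detection image; beam splitter out of the path +
    second illumination -> second detection image.
    """
    first_image = second_image = None
    if take_first:
        beam_splitter.move_into_imaging_path()
        illum_first.turn_on()
        first_image = detector.capture()       # first detection image
        illum_first.turn_off()
    if take_second:
        beam_splitter.move_out_of_imaging_path()
        illum_second.turn_on()
        second_image = detector.capture()      # second detection image
        illum_second.turn_off()
    return first_image, second_image
```

Defect detection on the returned images, the last step of claim 6, would run downstream of this call.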
7. The detection system according to claim 6, wherein the processor is further configured to: determine the wavelengths of the detection light emitted by the first illumination device and the second illumination device according to the reflectivity of the surface to be detected of the piece to be detected to light of different wavelengths and to the type of defect to be detected on the piece to be detected.
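A hedged sketch of one way such a wavelength choice could be scored, assuming a measured reflectivity curve and an illustrative mapping from defect type to preferred band (neither the bands nor the scoring rule are taken from the patent):

```python
# Illustrative only: which band suits which defect type is an assumption.
PREFERRED_BANDS_NM = {"scratch": (400, 550), "particle": (550, 700)}

def choose_wavelength(reflectivity_by_nm, defect_type):
    """Pick an illumination wavelength from reflectivity data and defect type."""
    lo, hi = PREFERRED_BANDS_NM.get(defect_type, (400, 700))
    candidates = {nm: r for nm, r in reflectivity_by_nm.items() if lo <= nm <= hi}
    if not candidates:                      # fall back to the full measured range
        candidates = reflectivity_by_nm
    # Favour the wavelength the surface to be detected reflects most strongly.
    return max(candidates, key=candidates.get)

reflectivity = {405: 0.32, 450: 0.41, 532: 0.55, 633: 0.47}   # made-up data
print(choose_wavelength(reflectivity, "scratch"))             # -> 532
```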
8. The detection system according to claim 6, wherein the processor is further configured to: move the piece to be detected and the detection system relative to each other along a preset trajectory, so that the detector acquires the first detection image and the second detection image of the entire surface to be detected of the piece to be detected when the first illumination device and the second illumination device respectively illuminate the piece to be detected.
9. The detection system according to claim 8, wherein the preset trajectory comprises a plurality of scan trajectories, and the processor is further configured to:
when the first illumination device illuminates the piece to be detected and the piece to be detected moves along a scan trajectory relative to the detection system, control the detector to capture a first sub-image of a partial area of the surface to be detected;
stitch the first sub-images corresponding to the plurality of scan trajectories to generate the first detection image;
when the second illumination device illuminates the piece to be detected and the piece to be detected moves along a scan trajectory relative to the detection system, control the detector to capture a second sub-image of a partial area of the surface to be detected; and
stitch the second sub-images corresponding to the plurality of scan trajectories to generate the second detection image.
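A minimal sketch of the scan-and-stitch loop of claims 8 and 9, assuming a serpentine set of scan trajectories, a hypothetical motion-stage driver, and a line-scan detector whose `scan_strip` call returns the sub-image acquired along one trajectory:

```python
import numpy as np

def scan_and_stitch(stage, detector, num_tracks, track_length_mm, track_pitch_mm):
    """Acquire one sub-image per scan trajectory and stitch them into one image."""
    strips = []
    for i in range(num_tracks):
        y = i * track_pitch_mm
        forward = (i % 2 == 0)                          # serpentine: alternate direction
        x_from, x_to = (0.0, track_length_mm) if forward else (track_length_mm, 0.0)
        stage.move_to(x=x_from, y=y)
        strip = detector.scan_strip(x_from=x_from, x_to=x_to)  # sub-image of a partial area
        if not forward:
            strip = np.flip(strip, axis=0)              # re-orient reverse-scanned strips
        strips.append(strip)
    return np.concatenate(strips, axis=1)               # stitch strips across trajectories
```

Running this once under the first illumination device and once under the second yields the first and second detection images, respectively.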
10. The detection system according to claim 9, wherein the number of the detectors is plural, the plurality of detectors are respectively configured to image different areas of the surface to be detected, and the processor is further configured to:
form a first image to be stitched from the plurality of first sub-images acquired by each detector;
stitch the plurality of first images to be stitched to obtain the first detection image;
form a second image to be stitched from the plurality of second sub-images acquired by each detector; and
stitch the plurality of second images to be stitched to obtain the second detection image.
11. The detection system according to claim 10, wherein when the imaging light paths of the plurality of detectors overlap, overlapping portions of adjacent first images to be stitched are identified among the plurality of first images to be stitched, and the plurality of first images to be stitched are stitched according to the overlapping portions of the adjacent first images to be stitched to obtain the first detection image;
when the imaging light paths of the plurality of detectors do not overlap, the plurality of first images to be stitched are stitched directly to obtain the first detection image;
when the imaging light paths of the plurality of detectors overlap, overlapping portions of adjacent second images to be stitched are identified among the plurality of second images to be stitched, and the plurality of second images to be stitched are stitched according to the overlapping portions of the adjacent second images to be stitched to obtain the second detection image; and
when the imaging light paths of the plurality of detectors do not overlap, the plurality of second images to be stitched are stitched directly to obtain the second detection image.
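A hedged sketch of the overlap-aware branch of claim 11, assuming the images to be stitched share the same height and that a simple mean-squared-difference search over trailing and leading columns stands in for whatever matching the processor actually performs:

```python
import numpy as np

def find_overlap(left, right, max_overlap=64):
    """Estimate how many columns of `right` duplicate the right edge of `left`."""
    best_w, best_err = 0, np.inf
    for w in range(1, max_overlap + 1):
        a = left[:, -w:].astype(np.float64)
        b = right[:, :w].astype(np.float64)
        err = np.mean((a - b) ** 2)
        if err < best_err:
            best_err, best_w = err, w
    return best_w

def stitch_with_overlap(images, max_overlap=64):
    """Stitch adjacent images to be stitched using the identified overlapping portions."""
    result = images[0]
    for nxt in images[1:]:
        w = find_overlap(result, nxt, max_overlap)
        result = np.concatenate([result, nxt[:, w:]], axis=1)  # drop duplicated columns
    return result
```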
12. The detection system according to claim 1, wherein the integration order of the detector is 128 to 512 times that of a single line scan.
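For intuition on the claimed range of integration orders, the following back-of-the-envelope Python sketch shows how per-pixel signal and shot-noise-limited SNR scale with the number of TDI stages; the line rate, per-line signal and read noise values are illustrative assumptions, not figures from the patent:

```python
import math

def tdi_scaling(stages, line_rate_hz=50_000, signal_per_line_e=20, read_noise_e=5):
    """Approximate per-pixel signal, SNR and dwell time for a TDI detector."""
    signal = stages * signal_per_line_e                    # electrons accumulated per pixel
    snr = signal / math.sqrt(signal + read_noise_e ** 2)   # shot noise + read noise
    dwell_s = stages / line_rate_hz                        # total exposure per object line
    return signal, snr, dwell_s

for n in (1, 128, 256, 512):                               # single line vs. claimed orders
    s, snr, t = tdi_scaling(n)
    print(f"stages={n:3d}  signal={s:6d} e-  SNR={snr:6.1f}  dwell={t*1e3:5.2f} ms")
```

Signal grows roughly linearly with the integration order while SNR grows roughly with its square root, which is why higher orders improve the usable light efficiency at a given line rate.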
CN202122639168.5U 2021-10-29 2021-10-29 Detection system Active CN216816499U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202122639168.5U CN216816499U (en) 2021-10-29 2021-10-29 Detection system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202122639168.5U CN216816499U (en) 2021-10-29 2021-10-29 Detection system

Publications (1)

Publication Number Publication Date
CN216816499U true CN216816499U (en) 2022-06-24

Family

ID=82048272

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202122639168.5U Active CN216816499U (en) 2021-10-29 2021-10-29 Detection system

Country Status (1)

Country Link
CN (1) CN216816499U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117405690A (en) * 2023-12-15 2024-01-16 深圳市什方智造科技有限公司 Multi-ray detection assembly and battery detection device

Similar Documents

Publication Publication Date Title
KR101612535B1 (en) System and method for inspecting a wafer
KR101646743B1 (en) System and method for inspecting a wafer
US7679756B2 (en) Device for a goniometric examination of optical properties of surfaces
US7433055B2 (en) Device for the examination of optical properties of surfaces
US7276719B2 (en) Device for a goniometric examination of the optical properties of surfaces
US7382457B2 (en) Illumination system for material inspection
KR101638883B1 (en) System and method for inspecting a wafer
US20040109170A1 (en) Confocal distance sensor
JP4847128B2 (en) Surface defect inspection equipment
KR20110127165A (en) System and method for detecting defects of substrate
JP2003503701A (en) Lighting module
JPS58219441A (en) Apparatus for detecting defect on surface of convex object
KR20160004099A (en) Defect inspecting apparatus
CN216816499U (en) Detection system
US20020167660A1 (en) Illumination for integrated circuit board inspection
KR20160121716A (en) Surface inspection apparatus based on hybrid illumination
JP2022049881A (en) Optical device
KR101124567B1 (en) Wafer inspecting apparatus having hybrid illumination
KR100633798B1 (en) Apparatus for testing installation condition and outer shape of a semiconductor
CN113075216A (en) Detection device and detection method
WO2008120882A1 (en) Apparatus for inspection of three-dimensional shape and method for inspection using the same
KR100389967B1 (en) Automatized defect inspection system
JP2004093211A (en) Nondestructive inspection system
CN212567282U (en) Detection device and detection equipment
US20220364993A1 (en) Inspection system for optical surface inspection of a test specimen

Legal Events

Date Code Title Description
GR01 Patent grant