CN113607756A - Detection method and detection equipment - Google Patents

Detection method and detection equipment

Info

Publication number
CN113607756A
CN113607756A (application number CN202110873817.XA)
Authority
CN
China
Prior art keywords
workpiece, image, sub, area, images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110873817.XA
Other languages
Chinese (zh)
Inventor
陈鲁
王天民
张嵩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Skyverse Ltd
Shenzhen Zhongke Feice Technology Co Ltd
Original Assignee
Shenzhen Zhongke Feice Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Zhongke Feice Technology Co Ltd filed Critical Shenzhen Zhongke Feice Technology Co Ltd
Priority to CN202110873817.XA priority Critical patent/CN113607756A/en
Publication of CN113607756A publication Critical patent/CN113607756A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84: Systems specially adapted for particular applications
    • G01N21/88: Investigating the presence of flaws or contamination
    • G01N21/95: Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/958: Inspecting transparent materials or objects, e.g. windscreens
    • G01N21/01: Arrangements or apparatus for facilitating the optical investigation
    • G01N2021/0106: General arrangement of respective parts
    • G01N2021/0112: Apparatus in one mechanical, optical or electronic block

Landscapes

  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses a detection method. The detection method comprises a first acquisition step: projecting first illumination light onto a workpiece and acquiring a first sub-image of the workpiece; a second acquisition step: projecting the first illumination light and second illumination light onto the workpiece and acquiring a second sub-image of the workpiece; and a detection step: detecting a first area of the workpiece according to the first sub-image and a second area of the workpiece according to the second sub-image, wherein the reflectivity of the first area differs from that of the second area. The application also discloses a detection device. Because the first sub-image is acquired while only the first illumination light is projected onto the workpiece, and the second sub-image is acquired while both the first illumination light and the second illumination light are projected onto the workpiece, the two sub-images can respectively meet the detection requirements of the first area and the second area with different reflectivities, which improves the accuracy of the detection result of the workpiece.

Description

Detection method and detection equipment
Technical Field
The present disclosure relates to the field of industrial detection technologies, and in particular, to a detection method and a detection apparatus.
Background
When a panel is inspected, a camera in an Automatic Optical Inspection (AOI) device is usually used to capture an image of the panel, and the captured image is analysed to detect defect information in the panel. The quality of the image captured by the camera therefore has a crucial influence on the accuracy of the detection result.
Disclosure of Invention
The embodiment of the application provides a detection method and detection equipment.
The detection method of the embodiment of the application comprises the following steps:
a first acquisition step: projecting first illumination light to a workpiece and acquiring a first sub-image of the workpiece;
a second acquisition step: projecting first illumination light and second illumination light to the workpiece, and acquiring a second sub-image of the workpiece; and
a detection step: detecting a first area of the workpiece according to the first sub-image, and detecting a second area of the workpiece according to the second sub-image, wherein the reflectivity of the first area is different from that of the second area.
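The three steps can be sketched as a minimal control loop. The `Light` and `Camera` classes below are hypothetical stand-ins for the hardware interfaces (the patent does not prescribe any API); a captured "image" here merely records which sources were lit during the exposure.

```python
# Minimal sketch of the three-step flow. Light and Camera are hypothetical
# stand-ins for the hardware interfaces; a captured "image" just records
# which illumination sources were on during the exposure.

class Light:
    def __init__(self):
        self.on = False

    def project(self):
        self.on = True

    def stop(self):
        self.on = False

class Camera:
    def capture(self, lights):
        # Record the indices of the sources that were on during the exposure.
        return {"lit": [i for i, light in enumerate(lights) if light.on]}

def acquire(first_light, second_light, camera):
    # First acquisition step: only the first illumination light is projected.
    first_light.project()
    second_light.stop()
    first_sub_image = camera.capture([first_light, second_light])

    # Second acquisition step: both illumination lights are projected.
    second_light.project()
    second_sub_image = camera.capture([first_light, second_light])

    return first_sub_image, second_sub_image

first, second, cam = Light(), Light(), Camera()
img1, img2 = acquire(first, second, cam)
```

In the detection step, `img1` would then be used to inspect the higher-reflectivity first area (avoiding overexposure) and `img2` the lower-reflectivity second area (avoiding underexposure).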
In certain embodiments, the detection method further comprises:
a driving step: driving relative motion between the workpiece and the field of view in which images are acquired; and
during the driving step, performing the first acquisition step and the second acquisition step alternately and repeatedly, so as to acquire a plurality of first sub-images and a plurality of second sub-images of different parts of the workpiece.
In some embodiments, the time interval between adjacent first and second acquisition steps is equal to the time required for the workpiece and the field of view to move a set distance relative to each other, the set distance being half the size of the field of view along the direction of relative motion between the workpiece and the field of view.
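Given that relationship, the acquisition trigger interval follows directly from the scan speed and the field-of-view size. A short worked example (the function name and the numbers are illustrative, not taken from the patent):

```python
# Time between adjacent first and second acquisition steps when the set
# distance is half the field of view along the scan direction.
# Function name and numbers are illustrative, not from the patent.

def trigger_interval(fov_size_mm, scan_speed_mm_per_s):
    set_distance = fov_size_mm / 2.0           # half the FOV in the motion direction
    return set_distance / scan_speed_mm_per_s  # time to traverse that distance

# e.g. a 10 mm field of view scanned at 50 mm/s
interval = trigger_interval(10.0, 50.0)  # 0.1 s between acquisitions
```

Because the set distance is half the field of view, sub-images of the same type, taken two triggers (one full field of view) apart, tile the workpiece without gaps.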
In certain embodiments, the detection method further comprises:
a first driving step: driving relative motion between the workpiece and the field of view in which images are acquired, in a first direction;
during the first driving step, performing the first acquisition step a plurality of times to acquire a plurality of first sub-images of different parts of the workpiece;
a second driving step: driving relative motion between the workpiece and the field of view in a second direction, the second direction being opposite to the first direction; and
during the second driving step, performing the second acquisition step a plurality of times to acquire a plurality of second sub-images of different parts of the workpiece.
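This bidirectional variant can be sketched as two passes over the same positions; the function, position indices, and step count below are illustrative assumptions, not part of the patent:

```python
# Two-pass sketch of the bidirectional variant: the forward pass captures
# only first sub-images, the return pass only second sub-images.
# Position indices and step count are illustrative assumptions.

def bidirectional_scan(num_positions):
    first_subs, second_subs = [], []
    # First driving step: move along the first direction, first light only.
    for pos in range(num_positions):
        first_subs.append(("first", pos))
    # Second driving step: move back in the opposite direction, both lights on.
    for pos in reversed(range(num_positions)):
        second_subs.append(("second", pos))
    return first_subs, second_subs

forward, backward = bidirectional_scan(3)
```

Compared with alternating exposures within a single pass, this variant keeps the lighting state constant during each pass, at the cost of traversing the workpiece twice.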
In certain embodiments, the detecting step comprises:
compositing a plurality of the first sub-images to obtain a first image, and compositing a plurality of the second sub-images to obtain a second image;
acquiring a first image area corresponding to the first area in the first image according to the distribution information of the first area on the workpiece, and acquiring a second image area corresponding to the second area in the second image according to the distribution information of the second area on the workpiece; and
detecting the first area according to the first image area, and detecting the second area according to the second image area.
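A minimal sketch of this compositing-and-masking flow, assuming the sub-images are adjacent strips along the scan axis and using a placeholder median-deviation defect test (the patent does not specify the defect criterion; all array shapes and thresholds here are illustrative):

```python
import numpy as np

# Sketch of the detecting step: stitch each set of sub-images into a full
# image, then inspect each region only in the image acquired under the
# lighting suited to it. The median-deviation defect test is a placeholder;
# shapes and thresholds are illustrative.

def composite(sub_images):
    # Sub-images cover adjacent strips of the workpiece along the scan axis.
    return np.concatenate(sub_images, axis=1)

def detect(image, region_mask, threshold):
    # Flag pixels in the region whose brightness deviates from the region
    # median by more than the threshold.
    region = image[region_mask]
    return np.abs(region - np.median(region)) > threshold

# First image: bright (high-reflectivity) area imaged under one light.
first_image = composite([np.full((4, 4), 200.0), np.full((4, 4), 200.0)])
# Second image: darker (low-reflectivity) area imaged under both lights.
second_image = composite([np.full((4, 4), 90.0), np.full((4, 4), 90.0)])

# Distribution information of the two areas on the workpiece, as masks.
first_mask = np.zeros((4, 8), dtype=bool)
first_mask[:, :4] = True
second_mask = ~first_mask

first_image[0, 0] = 255.0  # inject one bright defect into the first area
defects_first = detect(first_image, first_mask, threshold=30.0)
defects_second = detect(second_image, second_mask, threshold=30.0)
```

The masks stand in for the "distribution information" of the areas on the workpiece: each region is evaluated only in the image whose lighting suits it.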
The detection device of the embodiment of the application comprises a first light source, a second light source, an imaging device and a processing device;
when the first acquisition step is carried out, the first light source projects first illumination light to a workpiece, and the imaging device acquires a first sub-image of the workpiece;
in the second acquisition step, the first light source projects first illumination light to the workpiece, the second light source projects second illumination light to the workpiece, and the imaging device acquires a second sub-image of the workpiece;
when the detecting step is carried out, the processing device detects a first area of the workpiece according to the first sub-image and detects a second area of the workpiece according to the second sub-image, wherein the reflectivity of the first area is different from that of the second area.
In certain embodiments, the detection apparatus further comprises a drive device; when the driving step is implemented, the driving device drives the workpiece to move relative to the field of view of the collected image;
during the driving step, the first acquisition step and the second acquisition step are performed alternately and repeatedly, so as to acquire a plurality of first sub-images and a plurality of second sub-images of different parts of the workpiece.
In some embodiments, the time interval between adjacent first and second acquisition steps is equal to the time required for the workpiece and the field of view to move a set distance relative to each other, the set distance being half the size of the field of view along the direction of relative motion between the workpiece and the field of view.
In certain embodiments, the detection apparatus further comprises a drive device;
when the first driving step is performed, the driving device drives relative motion between the workpiece and the field of view in which images are acquired, in a first direction; during the first driving step, the first acquisition step is performed a plurality of times so that the imaging device acquires a plurality of first sub-images of different parts of the workpiece;
when the second driving step is performed, the driving device drives relative motion between the workpiece and the field of view in a second direction, the second direction being opposite to the first direction; during the second driving step, the second acquisition step is performed a plurality of times so that the imaging device acquires a plurality of second sub-images of different parts of the workpiece.
In certain embodiments, the processing device, when performing the detecting step, is configured to:
compositing a plurality of the first sub-images to obtain a first image, and compositing a plurality of the second sub-images to obtain a second image;
acquiring a first image area corresponding to the first area in the first image according to the distribution information of the first area on the workpiece, and acquiring a second image area corresponding to the second area in the second image according to the distribution information of the second area on the workpiece; and
detecting the first area according to the first image area, and detecting the second area according to the second image area.
In the detection method and the detection apparatus of the embodiments of the application, the first sub-image is acquired while only the first illumination light is projected onto the workpiece, and the second sub-image is acquired while both the first illumination light and the second illumination light are projected onto the workpiece. The two sub-images can therefore respectively meet the detection requirements of the first area and the second area with different reflectivities, which improves the accuracy of the detection result of the workpiece.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic perspective view of a detection apparatus according to certain embodiments of the present disclosure;
FIG. 2 is a schematic plan view of a detection apparatus according to certain embodiments of the present disclosure;
FIG. 3 is a schematic diagram of a portion of a detection apparatus according to certain embodiments of the present application;
FIG. 4 is a schematic illustration of a workpiece according to certain embodiments of the present application;
FIG. 5 is a schematic flow chart of a detection method according to some embodiments of the present disclosure;
FIG. 6 is a schematic flow chart of a detection method according to some embodiments of the present disclosure;
FIGS. 7-9 are schematic illustrations of the detection method according to certain embodiments of the present application;
FIG. 10 is a schematic flow chart of a detection method according to some embodiments of the present disclosure;
FIG. 11 is a schematic plan view of a test device according to certain embodiments of the present application;
FIG. 12 is a schematic flow chart of a detection method according to some embodiments of the present disclosure.
Description of the main elements and symbols:
detection apparatus 100; first light source 10; second light source 20; imaging device 30; processing device 40; driving device 50; rechecking device 60; first body 70; second body 80; workpiece 200.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the embodiments of the present application.
Referring to fig. 1 to 3, fig. 1 is a schematic perspective view of a detection apparatus 100 according to some embodiments of the present disclosure, fig. 2 is a schematic plan view of the detection apparatus 100, and fig. 3 is a schematic partial structure view of the detection apparatus 100. The detection apparatus 100 includes a first light source 10, a second light source 20, an imaging device 30, a processing device 40, and a driving device 50.
Specifically, the inspection apparatus 100 may be an inspection tool or a part of a manufacturing tool, for example, the inspection apparatus 100 may be a semiconductor inspection tool. The inspection apparatus 100 may be used to inspect the workpiece 200 to detect the presence of defects in the workpiece 200, collect information about defects, classify defects, process defects, and the like.
The workpiece 200 may be any component that needs to be inspected; for example, the workpiece 200 may be a display panel, a substrate, a wafer, a chip, a film, a cover plate, a housing, or the like, which is not limited herein. Figs. 1 to 3 of the present application take a display panel as an example of the workpiece 200. It is understood that, depending on the manufacturing process, materials, and so on, the physical properties of different regions of the workpiece 200 may differ, such as their reflectivity, conductivity, and density. Therefore, even for the same workpiece 200, a single inspection manner may not suffice; an inspection method that applies different inspection manners to regions of different reflectivity is described in more detail later.
Both the first light source 10 and the second light source 20 may be used to project illumination light to the workpiece 200. The first light source 10 and the second light source 20 can be controlled to emit light or not to emit light relatively independently, and the light types and the intensities of the light emitted by the first light source 10 and the light emitted by the second light source 20 can be the same or different. In addition, the first light source 10 and the second light source 20 may be different light emitting parts in the same package, for example, different lamp beads in the same lamp bead array; the first light source 10 and the second light source 20 may also be different illuminants packaged independently, for example, the first light source 10 and the second light source 20 are two different light boxes, which is not limited herein. It should be noted that the illumination light emitted from the first light source 10 and the illumination light emitted from the second light source 20 may be shaped, attenuated, changed in transmission path, etc. by optical elements, and then projected onto the workpiece 200, and the optical elements with different functions may be adopted for different requirements, and will not be described in detail herein.
The imaging device 30 may be used to capture an image of the workpiece 200 to facilitate detection of defect information of the workpiece 200 from the image. The imaging device 30 may be a line camera or an area camera, and is not limited thereto. After the first light source 10 and/or the second light source 20 project illumination light to the workpiece 200, the imaging device 30 receives the illumination light reflected and/or scattered by the workpiece 200 to capture an image of the workpiece 200. In one example, the optical axis of the illumination light emitted by the first light source 10 and/or the second light source 20 is coaxially arranged with the optical axis of the imaging device 30, or is symmetrical with respect to the normal of the workpiece 200, so that the imaging device 30 performs bright field detection on the workpiece 200; in another example, the optical axis of the illumination light emitted by the first light source 10 and/or the second light source 20 is not symmetrical with the optical axis of the imaging device 30 with respect to the normal of the workpiece 200, so that the imaging device 30 performs dark field detection on the workpiece 200; in yet another example, the optical axis of the illumination light emitted by one of the first light source 10 and the second light source 20 is disposed coaxially with the optical axis of the imaging device 30, or is symmetrical with respect to the normal of the workpiece 200, and the optical axis of the illumination light emitted by the other of the first light source 10 and the second light source 20 is not symmetrical with respect to the optical axis of the imaging device 30 with respect to the normal of the workpiece 200.
It is understood that after the imaging device 30 receives the illumination light reflected and/or scattered by the workpiece 200, the light may be shaped, attenuated, filtered, or redirected by optical elements before an image is generated from it; this is not described in detail herein. In the example shown in figs. 1 and 2, there may be a plurality of imaging devices 30, and the plurality of imaging devices 30 may detect different portions of the workpiece 200 at the same time, thereby improving detection efficiency.
The processing device 40 may be a processing unit in the detection apparatus 100, the processing device 40 may be configured to send control instructions to the first light source 10, the second light source 20, and the imaging device 30, and the processing device 40 may also be configured to process images acquired by the imaging device 30.
The drive device 50 may be used to drive the workpiece 200 relative to the imaging device 30. In one example, the driving device 50 may drive the workpiece 200 to move in a lateral direction (horizontal direction as shown in fig. 2) relative to the imaging device 30; in another example, the driving device 50 may drive the imaging device 30 to move longitudinally (vertically as shown in fig. 2) relative to the workpiece 200; in yet another example, the driving device 50 can also drive the workpiece 200 to rotate relative to the imaging device 30, so that the imaging device 30 can be aligned with different portions of the workpiece 200 and images of the different portions can be acquired.
In the example shown in fig. 1 and 2, the detection apparatus 100 further includes a rechecking device 60, a first body 70, and a second body 80.
The rechecking device 60 may be a camera with high imaging accuracy, and the rechecking device 60 may be an area camera or a line camera. In one example, after the workpiece 200 is inspected by the imaging device 30, the imaging device 30 can determine the position of the defect in the workpiece 200, and the rechecking device 60 can inspect the position of the defect in the workpiece 200 to further obtain the information of the defect. The number of the rechecking devices 60 may be multiple, and as shown in fig. 1 and fig. 2, the number of the rechecking devices 60 is two, the two rechecking devices 60 are respectively located at two sides of the imaging device 30, and the two rechecking devices 60 can simultaneously recheck a plurality of defects at different positions on the workpiece 200, so as to improve the detection efficiency.
The first body 70 and the second body 80 may collectively serve as a frame of the inspection apparatus 100. The driving device 50 may be mounted on the first body 70, and the workpiece 200 is also placed on the first body 70. At least a portion of the second body 80 may span over the first body 70, the imaging device 30 and the review device 60 may be mounted on the second body 80, and the imaging device 30 and the review device 60 may slide relative to the second body 80.
The following description will focus on an inspection method for inspecting defects of the workpiece 200 using the inspection apparatus 100.
Referring to fig. 5, fig. 5 is a schematic flow chart of a detection method according to some embodiments of the present disclosure, the detection method including the steps of:
01: a first acquisition step: projecting first illumination light to the workpiece 200 and acquiring a first sub-image of the workpiece 200;
02: a second acquisition step: projecting the first illumination light and the second illumination light to the workpiece 200, and acquiring a second sub-image of the workpiece 200; and
03: a detection step: a first area of the workpiece 200 is inspected according to the first sub-image, and a second area of the workpiece 200 is inspected according to the second sub-image, wherein the reflectivity of the first area is different from that of the second area.
The inspection apparatus 100 of the embodiment of the present application may be used to implement the inspection method of the embodiment of the present application, and when the first capturing step is implemented, the first light source 10 projects the first illumination light to the workpiece 200, and the imaging device 30 captures the first sub-image of the workpiece 200. In performing the second capturing step, the first light source 10 projects first illumination light toward the workpiece 200, and the second light source 20 projects second illumination light toward the workpiece 200, and the imaging device 30 captures a second sub-image of the workpiece 200. In performing the detecting step, the processing device 40 detects a first region of the workpiece 200 according to the first sub-image, and detects a second region of the workpiece 200 according to the second sub-image, wherein the reflectivity of the first region is different from that of the second region.
In the detection method and the detection apparatus 100 of the embodiments of the application, the first sub-image is acquired while only the first illumination light is projected onto the workpiece 200, and the second sub-image is acquired while both the first illumination light and the second illumination light are projected onto the workpiece 200. The two sub-images can therefore respectively meet the detection requirements of the first area and the second area with different reflectivities, which improves the accuracy of the detection result of the workpiece 200.
Referring to fig. 4, fig. 4 is a schematic structural diagram of a workpiece 200 according to some embodiments of the present disclosure, in which the workpiece 200 includes a first region and a second region, and the reflectivity of the first region is different from that of the second region, so that when the same illumination light is projected onto the first region and the second region, the intensities of the illumination light reflected by the first region and the second region are different, and the brightness of the image captured by the imaging device 30 is different. The reflectivity of the first area may be greater than the reflectivity of the second area, or the reflectivity of the first area may be less than the reflectivity of the second area.
In the embodiment of the present application, the reflectivity of the first region is higher than that of the second region, and the brightness of the first region collected by the imaging device 30 is higher than that of the second region under the same illumination light. Taking the workpiece 200 as an example of a display panel, the first region may be a circuit layout region in the display panel, and the second region may be an effective display region. It should be noted that the specific distribution of the first region and the second region on the workpiece 200 in fig. 4 is only for convenience of description, and should not be construed as a limitation to the specific structure of the workpiece 200 in the present application.
In performing step 01, first illumination light is projected onto the workpiece 200 and a first sub-image of the workpiece 200 is acquired. Specifically, in one embodiment, the first illumination light may be projected onto the first region and the second region simultaneously, and the first sub-image acquired by the imaging device 30 then includes images of both the first region and the second region, which relaxes the need to precisely control the projection range of the first illumination light and the acquisition range of the imaging device 30. In another embodiment, the first illumination light may be projected only onto the first region, so that the first sub-image acquired by the imaging device 30 includes only the image of the first region; detecting the first region through such a first sub-image reduces interference from images of other regions.
In step 02, the first illumination light and the second illumination light are projected onto the workpiece 200 and a second sub-image of the workpiece 200 is acquired. It can be understood that, compared with projecting only the first illumination light, projecting the first illumination light and the second illumination light simultaneously makes the second sub-image brighter and less prone to underexposure. As in step 01, in one embodiment the first illumination light and the second illumination light may be projected onto the first region and the second region simultaneously, so that the second sub-image acquired by the imaging device 30 includes images of both regions; in another embodiment, the two illumination lights may be projected only onto the second region, so that the second sub-image includes only the image of the second region.
In step 03, a first area of the workpiece 200 is detected according to the first sub-image, and a second area of the workpiece 200 is detected according to the second sub-image. The acquisition conditions of the first sub-image are better suited to imaging the first area clearly, so detecting the first area from the first sub-image is more accurate: for example, because the reflectivity of the first area is higher, acquiring the first sub-image under only the first illumination light avoids overexposing the first area in the first sub-image. Likewise, the acquisition conditions of the second sub-image are better suited to imaging the second area clearly, so detecting the second area from the second sub-image is more accurate: for example, because the reflectivity of the second area is lower, acquiring the second sub-image under the first illumination light and the second illumination light simultaneously avoids underexposing the second area in the second sub-image.
Referring to fig. 6, fig. 6 is a schematic flow chart of a detection method according to some embodiments of the present disclosure. In some embodiments, the detection method further includes step 04, a driving step: driving relative motion between the workpiece 200 and the field of view in which images are acquired. During the driving step, the first acquisition step and the second acquisition step are performed alternately and repeatedly to acquire first and second sub-images of different portions of the workpiece 200.
The driving device 50 of the present embodiment may be used to perform the driving step, i.e., the driving device 50 drives the relative motion between the workpiece 200 and the field of view in which images are acquired. During the driving step, the inspection apparatus 100 performs the first and second acquisition steps alternately and repeatedly to acquire a plurality of first sub-images and a plurality of second sub-images of different portions of the workpiece 200.
In this embodiment, driving relative motion between the workpiece 200 and the field of view allows the imaging device 30 to acquire images of different parts of the workpiece 200 at different times, so that a complete image of the workpiece 200 can be assembled from multiple acquisitions. Meanwhile, performing the first and second acquisition steps alternately ensures that the information of the first region is embodied clearly and completely in the plurality of first sub-images, and the information of the second region in the plurality of second sub-images.
Specifically, in performing step 04, relative motion between the workpiece 200 and the field of view in which images are acquired is driven. Referring to fig. 2, in an actual detection environment, the area of the surface to be inspected on the workpiece 200 may be much larger than the field of view of the imaging device 30, and the imaging device 30 cannot capture the entire surface in a single acquisition. The workpiece 200 and the field of view can therefore be driven to move relative to each other so that different portions of the workpiece 200 enter the field of view in turn, and the images acquired by the imaging device 30 over multiple acquisitions are stitched together to inspect the entire surface of the workpiece 200.
In the process of step 04, the first and second acquisition steps are performed alternately and cyclically to acquire first and second sub-images of different portions of the workpiece 200. Referring to the examples shown in figs. 7 to 9, which are schematic diagrams illustrating the principle of the inspection method according to some embodiments of the present application, the field of view FOV of the acquired image is indicated between the two dashed lines in fig. 7, and the relative positional relationship between the workpiece 200 and the field of view FOV is illustrated at times T1, T2, T3, T4 and T5. It can be seen that, as time passes, the relative positional relationship between the workpiece 200 and the field of view FOV gradually changes, and the portion of the workpiece 200 entering the field of view FOV changes accordingly; the light-colored portion of the workpiece 200 is the first region, and the dark-colored portion is the second region.
Fig. 8 shows a driving signal L1 of the first light source 10, a driving signal S1 of the imaging device 30, and a driving signal L2 of the second light source 20, respectively, in the process of implementing step 04. The driving signal L1 of the first light source 10 is always kept at the high level, which indicates that the first light source 10 continuously projects the first illumination light onto the workpiece 200 while step 04 is performed. In the driving signal S1 of the imaging device 30, a high level indicates that the imaging device 30 collects light and performs exposure, and a low level indicates that the imaging device 30 does not collect light. In the driving signal L2 of the second light source 20, a high level indicates that the second light source 20 projects the second illumination light onto the workpiece 200, and a low level indicates that the second light source 20 does not emit light.
Referring to figs. 7 to 9, at time T1, a first acquisition step is performed: the first light source 10 projects the first illumination light onto the workpiece 200, and the imaging device 30 acquires a first sub-image P1. At time T2, a second acquisition step is performed: the first light source 10 projects the first illumination light onto the workpiece 200, the second light source 20 projects the second illumination light onto the workpiece 200, and the imaging device 30 acquires a second sub-image P2. At time T3, a first acquisition step is performed and the imaging device 30 acquires a first sub-image P3; at time T4, a second acquisition step is performed and the imaging device 30 acquires a second sub-image P4; at time T5, a first acquisition step is performed and the imaging device 30 acquires a first sub-image P5. It can be seen that the first sub-images P1, P3 and P5 reflect information of different locations on the workpiece 200, and the second sub-images P2 and P4 likewise reflect information of different locations on the workpiece 200. Subsequently, a complete image of the workpiece 200 can be obtained by combining the first sub-images P1, P3 and P5, or by combining the second sub-images P2 and P4.
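As an illustration, the alternating schedule described above can be sketched in a few lines of Python; the function name and the frame labels below are illustrative, not part of the patent:

```python
# Sketch of the alternating acquisition schedule of figs. 7-9 (illustrative).
# The first light source (signal L1) stays on throughout; the second light
# source (signal L2) is added on every other exposure, so odd-numbered frames
# are first sub-images and even-numbered frames are second sub-images.

def acquisition_schedule(num_exposures):
    """Return, per exposure, which light sources are on and the image type."""
    schedule = []
    for t in range(1, num_exposures + 1):
        second_on = (t % 2 == 0)              # L2 pulses only on alternate frames
        image_type = "second" if second_on else "first"
        schedule.append({
            "time": t,
            "first_light": True,              # L1 is held high continuously
            "second_light": second_on,
            "image": f"P{t} ({image_type} sub-image)",
        })
    return schedule

for frame in acquisition_schedule(5):
    print(frame["image"])
```

Run over five exposures, this reproduces the sequence of figs. 7 to 9: P1, P3 and P5 are first sub-images, while P2 and P4 are second sub-images.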
Of course, figs. 7 to 9 are for illustration only; when the detection method is actually implemented, the number of alternating cycles of the first acquisition step and the second acquisition step may be greater or smaller. In addition, the first light source 10 need not be a normally-on light source: the driving signal for the first light source 10 may instead be the same as the driving signal of the imaging device 30, which is not limited herein.
In some embodiments, the time interval between adjacent first and second acquisition steps is equal to the time required to drive the relative motion between the workpiece 200 and the field of view by a set distance, where the set distance is half the size of the field of view in the direction of relative movement between the workpiece 200 and the field of view.
Specifically, referring to the example shown in figs. 7 and 8, if the direction of relative movement between the workpiece 200 and the field of view FOV is transverse, and the dimension of the field of view FOV along that direction is the distance D, then the set distance is half of the distance D. In the process of driving the workpiece 200 to move relative to the field of view FOV, the time required for the relative movement to cover the set distance D/2 is the time interval ΔT between adjacent first and second acquisition steps.
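As a worked example of this relation, the trigger interval ΔT is simply the set distance D/2 divided by the relative speed; the field-of-view size and stage speed below are assumed values, not taken from the patent:

```python
# Illustrative calculation of the trigger interval described above: with the
# set distance equal to half the field-of-view size D, the time between an
# adjacent first and second acquisition step is (D / 2) / v for a stage moving
# at constant relative speed v. The numeric values are assumptions.

def trigger_interval(fov_size_mm, speed_mm_per_s):
    """Time between adjacent first/second acquisition steps, in seconds."""
    set_distance = fov_size_mm / 2.0       # half the FOV along the motion axis
    return set_distance / speed_mm_per_s

D = 10.0   # field-of-view size along the motion direction, mm (assumed)
v = 25.0   # relative stage speed, mm/s (assumed)
print(trigger_interval(D, v))  # 0.2 s between adjacent acquisitions
```

With these assumed values the imaging device would be triggered every 0.2 s, which also corresponds to the exposure cadence of signal S1 in fig. 8.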
With this arrangement, the plurality of first sub-images acquired by performing the plurality of first acquisition steps can reflect information of all portions of the workpiece 200, with no portion of the workpiece 200 missing from the first sub-images; similarly, any portion of the workpiece 200 is reflected in at least one second sub-image. Moreover, while the first sub-images and the second sub-images together cover all portions of the workpiece 200, the imaging device 30 acquires images at the lowest possible frequency, which reduces the use frequency of the imaging device 30 as much as possible and prolongs its service life.
Referring to fig. 10, fig. 10 is a schematic flow chart of a detection method according to some embodiments of the present disclosure, in some embodiments, the detection method further includes:
05: a first driving step: driving relative motion between the workpiece 200 and the field of view of the captured image in a first direction; and
06: a second driving step: driving the workpiece 200 and the field of view of the captured image to move relative to each other in a second direction, the second direction being opposite to the first direction;
in the process of performing step 05, the first acquisition step is performed a plurality of times to acquire a plurality of first sub-images of different portions of the workpiece 200; in the process of performing step 06, the second acquisition step is performed a plurality of times to acquire a plurality of second sub-images of different portions of the workpiece 200.
When the driving device 50 of the embodiment of the present application performs the first driving step, the driving device 50 drives the workpiece 200 and the field of view of the captured image to move relatively along the first direction; in the course of performing the first driving step, the inspection apparatus 100 performs the first acquisition step a plurality of times so that the imaging device 30 acquires a plurality of first sub-images of different portions of the workpiece 200. When the driving device 50 performs the second driving step, the driving device 50 drives the workpiece 200 to move relative to the field of view of the acquired image along a second direction opposite to the first direction; while the second driving step is performed, the second acquisition step is performed a plurality of times so that the imaging device 30 acquires a plurality of second sub-images of different portions of the workpiece 200.
When step 05 is performed, relative motion between the workpiece 200 and the field of view in which images are acquired is driven along a first direction, and the first acquisition step is performed a plurality of times to acquire a plurality of first sub-images of different portions of the workpiece 200. Referring to the example shown in fig. 2, in step 05 the workpiece 200 may move along the first direction while the imaging device 30 remains stationary; during this movement, the first light source 10 remains on and the second light source 20 remains off, and the imaging device 30 acquires a first sub-image at regular time intervals. The time interval between two adjacent first sub-images acquired by the imaging device 30 may be the time required for the workpiece 200 to move along the first direction by the width of the field of view.
When step 05 is performed, the workpiece 200 moves from one end of the first body 70 to the other; when the workpiece 200 reaches the end of the first body 70, steps 05 and 01 may be stopped, and steps 06 and 02 may be started.
When step 06 is performed, relative motion between the workpiece 200 and the field of view of the acquired image is driven along the second direction, and the second acquisition step is performed a plurality of times to acquire a plurality of second sub-images of different portions of the workpiece 200. Referring to the example shown in fig. 11, which is a schematic plan view of the inspection apparatus 100 according to some embodiments of the present application, step 06 may be implemented while the workpiece 200 moves along the second direction and the imaging device 30 remains stationary; during this movement, the first light source 10 and the second light source 20 both remain on, and the imaging device 30 acquires a second sub-image at regular time intervals. The time interval between two adjacent second sub-images acquired by the imaging device 30 may be the time required for the workpiece 200 to move along the second direction by the width of the field of view.
Steps 06 and 02 may be stopped after the workpiece 200 has moved from the position shown in fig. 11 back to the position shown in fig. 2.
By acquiring the plurality of first sub-images and second sub-images in this manner, the light-emitting states of the first light source 10 and the second light source 20 do not need to be switched frequently, and the detection process is easy to control.
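The control flow of this two-pass variant can be sketched as follows; this is a simplified illustration with hypothetical names, and real hardware control is omitted:

```python
# Minimal sketch of the two-pass scheme of figs. 10-11 (names hypothetical).
# Forward pass (step 05): only the first light source is on, so every frame
# is a first sub-image. Return pass (step 06): both light sources stay on,
# so every frame is a second sub-image. No per-frame light switching occurs.

def two_pass_scan(num_frames_per_pass):
    first_subs, second_subs = [], []
    # Step 05: move along the first direction, first light source only.
    for i in range(num_frames_per_pass):
        first_subs.append(("L1 only", f"first_sub_{i}"))
    # Step 06: move back along the opposite direction, both light sources on.
    for i in range(num_frames_per_pass):
        second_subs.append(("L1+L2", f"second_sub_{i}"))
    return first_subs, second_subs

firsts, seconds = two_pass_scan(3)
print(len(firsts), len(seconds))  # 3 3
```

Compared with the alternating scheme of step 04, each light-source state here is held constant for an entire pass, which is what makes the detection process easy to control.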
Referring to fig. 12, fig. 12 is a schematic flow chart of a detection method according to some embodiments of the present disclosure, in some embodiments, step 04 includes the steps of:
041: synthesizing a plurality of first sub-images to obtain a first image, and synthesizing a plurality of second sub-images to obtain a second image;
042: acquiring a first image area corresponding to the first area in the first image according to the distribution information of the first area on the workpiece 200, and acquiring a second image area corresponding to the second area in the second image according to the distribution information of the second area on the workpiece 200; and
043: the first area is detected according to the first image area, and the second area is detected according to the second image area.
Processing device 40 of the embodiment of the present application may be configured to implement steps 041, 042 and 043, that is, processing device 40 may be configured to synthesize a plurality of first sub-images to obtain a first image, and synthesize a plurality of second sub-images to obtain a second image; acquiring a first image area corresponding to the first area in the first image according to the distribution information of the first area on the workpiece 200, and acquiring a second image area corresponding to the second area in the second image according to the distribution information of the second area on the workpiece 200; and detecting the first area according to the first image area and detecting the second area according to the second image area.
When step 041 is performed, the plurality of first sub-images are composited to obtain a first image, and the plurality of second sub-images are composited to obtain a second image. It can be understood that each first sub-image or second sub-image may be an image of only part of the workpiece 200, so the first sub-images need to be composited into the first image and the second sub-images into the second image in order to detect the workpiece 200 more comprehensively from the first image and the second image. Referring to the example shown in fig. 9, the first image P11 is composed of the first sub-image P1, the first sub-image P3 and the first sub-image P5; during composition, the edge portions of these first sub-images may need to be cropped, which will not be described in detail herein. Similarly, the second image P21 is composed of the second sub-image P2 and the second sub-image P4.
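As a minimal illustration of the compositing in step 041 (pure Python with rows-as-lists; the column overlap between adjacent sub-images is an assumed value derived from the stage motion, and real implementations would operate on camera frames):

```python
# Sketch of step 041: stitch sub-images taken at successive positions into one
# image. Each image is a list of pixel rows; the overlap (in columns) between
# adjacent sub-images is assumed known from the stage motion and is cropped
# from the leading edge of each subsequent sub-image.

def stitch(sub_images, overlap_cols):
    """Concatenate sub-images left-to-right, cropping duplicated columns."""
    stitched = [row[:] for row in sub_images[0]]
    for sub in sub_images[1:]:
        for row_out, row_in in zip(stitched, sub):
            row_out.extend(row_in[overlap_cols:])   # drop overlapping columns
    return stitched

# Two 2x4 sub-images overlapping by one column.
p1 = [[1, 2, 3, 4], [5, 6, 7, 8]]
p3 = [[4, 9, 9, 9], [8, 9, 9, 9]]
full = stitch([p1, p3], overlap_cols=1)
print(full)  # [[1, 2, 3, 4, 9, 9, 9], [5, 6, 7, 8, 9, 9, 9]]
```

The same routine would be applied once to the first sub-images to form the first image and once to the second sub-images to form the second image.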
When step 042 is performed, a first image region corresponding to the first region is obtained in the first image according to the distribution information of the first region on the workpiece 200, and a second image region corresponding to the second region is obtained in the second image according to the distribution information of the second region on the workpiece 200. As described above, the first sub-images are acquired under an illumination condition adapted to the reflectivity of the first region, and the second sub-images are acquired under an illumination condition adapted to the reflectivity of the second region. Therefore, if the first region needs to be detected, it should preferentially be detected based on the first image composed of the plurality of first sub-images; if the second region needs to be detected, it should preferentially be detected based on the second image composed of the plurality of second sub-images.
The positions at which the first region and the second region are distributed on the workpiece 200 are characteristics of the workpiece 200 itself and are known in the actual detection process. Therefore, the first image region corresponding to the first region can be obtained from the first image based on the distribution positions of the first region on the workpiece 200, and this first image region is better suited to detecting the first region; similarly, the second image region is better suited to detecting the second region.
When step 043 is performed, the first region is detected according to the first image region, and the second region is detected according to the second image region. Because the first image region is unlikely to be overexposed, detecting the first region according to the first image region improves the detection accuracy of the first region; because the second image region is unlikely to be underexposed, detecting the second region according to the second image region improves the detection accuracy of the second region.
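Steps 042 and 043 can be sketched as mask-based region extraction followed by a simple intensity check; the layout mask, expected intensities and tolerances below are illustrative assumptions, not values from the patent:

```python
# Sketch of steps 042/043: use the known layout of the workpiece (here a
# boolean mask per pixel, True = first region) to pick each region out of the
# image acquired under its matching illumination, then run a toy defect check.
# The mask and intensity thresholds are illustrative assumptions.

def extract_region(image, mask, want_first):
    """Collect the pixels of one region from the matching image."""
    return [px for row, mrow in zip(image, mask)
            for px, m in zip(row, mrow) if m == want_first]

def detect_defects(pixels, expected, tol):
    """Flag pixels deviating from the region's expected intensity."""
    return [px for px in pixels if abs(px - expected) > tol]

layout = [[True, True, False],
          [True, False, False]]          # first region = light-colored part
first_image  = [[100, 101, 250], [99, 251, 249]]   # well exposed for region 1
second_image = [[30, 29, 120],  [31, 119, 200]]    # well exposed for region 2

region1 = extract_region(first_image,  layout, True)
region2 = extract_region(second_image, layout, False)
print(detect_defects(region1, expected=100, tol=5))   # [] -> region 1 passes
print(detect_defects(region2, expected=120, tol=10))  # [200] -> one defect
```

Note that each region is checked only in the image whose exposure suits it, which is the point of acquiring the first and second sub-images separately.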
In summary, in the detection method and the detection apparatus 100 according to the embodiment of the present application, the first sub-image is an image acquired when the first illumination light is projected onto the workpiece 200, and the second sub-image is an image acquired when the first illumination light and the second illumination light are projected onto the workpiece 200, so that the first sub-image and the second sub-image can respectively meet the detection requirements of the first area and the second area with different reflectivities, and the accuracy of the detection result of the workpiece 200 is improved.
In the description herein, reference to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (10)

1. A detection method, characterized in that the detection method comprises:
a first acquisition step: projecting first illumination light to a workpiece and acquiring a first sub-image of the workpiece;
a second acquisition step: projecting first illumination light and second illumination light to the workpiece, and acquiring a second sub-image of the workpiece; and
a detection step: and detecting a first area of the workpiece according to the first sub-image, and detecting a second area of the workpiece according to the second sub-image, wherein the reflectivity of the first area is different from that of the second area.
2. The detection method according to claim 1, further comprising:
a driving step: driving relative motion between the workpiece and a field of view of an acquired image; and
in the process of implementing the driving step, the first acquisition step and the second acquisition step are alternately and cyclically implemented so as to acquire a plurality of first sub-images and a plurality of second sub-images of different parts of the workpiece.
3. The detection method according to claim 2, wherein a time interval between adjacent first and second acquisition steps is equal to the time required to drive relative movement between the workpiece and the field of view by a set distance, the set distance being half the size of the field of view in the direction of relative movement between the workpiece and the field of view.
4. The detection method according to claim 1, further comprising:
a first driving step: driving the workpiece and the field of view of the acquired image to move relatively along a first direction;
during the implementation of the first driving step, implementing the first acquisition step for a plurality of times to acquire a plurality of first sub-images of different parts of the workpiece;
a second driving step: driving relative motion between the workpiece and a field of view of the captured image in a second direction, the second direction being opposite the first direction; and
in the process of implementing the second driving step, implementing the second acquisition step a plurality of times so as to acquire a plurality of second sub-images of different parts of the workpiece.
5. The detection method according to any one of claims 2 to 4, wherein the detection step comprises:
compositing a plurality of the first sub-images to obtain a first image, compositing a plurality of the second sub-images to obtain a second image;
acquiring a first image area corresponding to the first area in the first image according to the distribution information of the first area on the workpiece, and acquiring a second image area corresponding to the second area in the second image according to the distribution information of the second area on the workpiece; and
detecting the first area according to the first image area, and detecting the second area according to the second image area.
6. A detection device is characterized by comprising a first light source, a second light source, an imaging device and a processing device;
when the first acquisition step is carried out, the first light source projects first illumination light to a workpiece, and the imaging device acquires a first sub-image of the workpiece;
in the second acquisition step, the first light source projects first illumination light to the workpiece, the second light source projects second illumination light to the workpiece, and the imaging device acquires a second sub-image of the workpiece;
when the detecting step is carried out, the processing device detects a first area of the workpiece according to the first sub-image and detects a second area of the workpiece according to the second sub-image, wherein the reflectivity of the first area is different from that of the second area.
7. The detection apparatus according to claim 6, wherein the detection apparatus further comprises a driving device; when the driving step is implemented, the driving device drives the workpiece to move relative to the field of view of the collected image;
in the process of implementing the driving step, the first acquisition step and the second acquisition step are alternately and cyclically implemented so as to acquire a plurality of first sub-images and a plurality of second sub-images of different parts of the workpiece.
8. The detection apparatus of claim 7, wherein a time interval between adjacent first and second acquisition steps is equal to the time required to drive relative movement between the workpiece and the field of view by a set distance, the set distance being half the size of the field of view in the direction of relative movement between the workpiece and the field of view.
9. The detection apparatus according to claim 6, wherein the detection apparatus further comprises a driving device;
when the first driving step is implemented, the driving device drives the workpiece and the field of view for collecting the image to move relatively along a first direction; during the implementation of the first driving step, implementing the first acquisition step a plurality of times so as to enable the imaging device to acquire a plurality of first sub-images of different parts of the workpiece;
when the second driving step is carried out, the driving device drives the workpiece and the field of view for collecting the image to move relatively along a second direction, and the second direction is opposite to the first direction; in the process of implementing the second driving step, the second acquiring step is implemented for a plurality of times so that the imaging device acquires a plurality of second sub-images of different parts of the workpiece.
10. The detection apparatus of any one of claims 7 to 9, wherein the processing device, when performing the detection step, is configured to:
compositing a plurality of the first sub-images to obtain a first image, compositing a plurality of the second sub-images to obtain a second image;
acquiring a first image area corresponding to the first area in the first image according to the distribution information of the first area on the workpiece, and acquiring a second image area corresponding to the second area in the second image according to the distribution information of the second area on the workpiece; and
detecting the first area according to the first image area, and detecting the second area according to the second image area.
CN202110873817.XA 2021-07-30 2021-07-30 Detection method and detection equipment Pending CN113607756A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110873817.XA CN113607756A (en) 2021-07-30 2021-07-30 Detection method and detection equipment

Publications (1)

Publication Number Publication Date
CN113607756A true CN113607756A (en) 2021-11-05

Family

ID=78338817

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110873817.XA Pending CN113607756A (en) 2021-07-30 2021-07-30 Detection method and detection equipment

Country Status (1)

Country Link
CN (1) CN113607756A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090323052A1 (en) * 2008-06-25 2009-12-31 Shai Silberstein Dynamic Illumination in Optical Inspection Systems
US20110032350A1 (en) * 2008-04-18 2011-02-10 Olympus Corporation Illumination device and image acquisition apparatus
CN102821591A (en) * 2011-06-09 2012-12-12 雅马哈发动机株式会社 Component imaging method, component imaging device, and component mounting device
CN109030495A (en) * 2018-06-26 2018-12-18 大连鉴影光学科技有限公司 A kind of optical element defect inspection method based on machine vision technique
US20200202504A1 (en) * 2018-12-21 2020-06-25 Kla-Tencor Corporation Differential Imaging For Single-Path Optical Wafer Inspection
CN111551556A (en) * 2020-05-20 2020-08-18 上海御微半导体技术有限公司 Defect detection device and defect detection method
CN211347985U (en) * 2019-11-08 2020-08-25 北京大恒图像视觉有限公司 Machine vision detection device applied to surface detection industry
CN112240887A (en) * 2020-12-14 2021-01-19 惠州高视科技有限公司 Battery appearance defect detection system and method
CN113155845A (en) * 2021-04-09 2021-07-23 武汉精测电子集团股份有限公司 Light source, setting method thereof, optical detection method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination