CN110441315B - Electronic component testing apparatus and method - Google Patents

Electronic component testing apparatus and method

Info

Publication number
CN110441315B
CN110441315B CN201910711395.9A CN201910711395A
Authority
CN
China
Prior art keywords
image
electronic component
defect
combined image
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910711395.9A
Other languages
Chinese (zh)
Other versions
CN110441315A (en
Inventor
杨维平
姜勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intel Products Chengdu Co Ltd
Intel Corp
Original Assignee
Intel Products Chengdu Co Ltd
Intel Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intel Products Chengdu Co Ltd, Intel Corp filed Critical Intel Products Chengdu Co Ltd
Priority to CN201910711395.9A priority Critical patent/CN110441315B/en
Publication of CN110441315A publication Critical patent/CN110441315A/en
Application granted granted Critical
Publication of CN110441315B publication Critical patent/CN110441315B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8883Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges involving the calculation of gauges, generating models
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Testing Or Measuring Of Semiconductors Or The Like (AREA)
  • Image Processing (AREA)

Abstract

There is provided an apparatus for testing an electronic component, comprising: an image capture device for capturing a first image of the electronic component when light in a first direction illuminates the electronic component and capturing a second image of the electronic component when light in a second direction illuminates the electronic component, the first direction being different from the second direction; and an image combining unit for combining the first and second images to generate a combined image, the distribution of gray values at respective pixels in the combined image being indicative of the relative depth at each corresponding location in the electronic component. From the resulting combined image, protrusion defects and/or depression defects can be detected in a simple manner.

Description

Electronic component testing apparatus and method
Technical Field
The present application relates to the field of electronic component manufacturing, particularly to testing of electronic components, and more particularly to detection of surface defects of electronic components.
Background
Surface defects of electronic parts such as semiconductor chips can cause problems of yield reduction, performance deterioration, and the like of semiconductor devices. A conventional test machine can detect a defect on the surface of an electronic component in an image by acquiring an image of the electronic component as an object to be inspected.
A convex defect and a concave defect on the surface of an electronic component look similar in an image captured by a conventional testing machine. For example, under certain lighting conditions, both appear as a bright spot on the image, or both appear as a dark spot, making it difficult to distinguish a convex defect from a concave defect. Being able to distinguish them is nevertheless important, because a depression defect tends to be more severe than a protrusion defect: a depression defect can be an actual defect, while a protrusion defect may simply be dust on the surface of the electronic component.
On the one hand, after a defect is detected by a conventional testing machine, it is possible to judge manually whether the detected defect is a convex defect or a concave defect.
On the other hand, a depth image of the inspected object, containing depth information with which convex and concave defects can be distinguished, can be obtained by other means such as a stereo camera, a camera array, or structured light.
It is desirable to provide an improved solution for distinguishing between raised defects and recessed defects.
Disclosure of Invention
An improved apparatus and method for testing electronic components are provided that enable low-cost, efficient, and accurate detection of bump and recess defects in electronic components.
According to an embodiment of the present invention, there is provided an apparatus for testing an electronic component. The apparatus comprises: an image pickup device for picking up a first image of the electronic component when light in a first direction irradiates the electronic component, and picking up a second image of the electronic component when light in a second direction irradiates the electronic component, the first direction being different from the second direction, wherein the electronic component is an electronic component to be tested; an image combining unit for combining the first and second images to generate a combined image, a distribution of gray values at respective pixels in the combined image characterizing relative depths at respective corresponding locations in the electronic component; and an output unit for outputting the combined image.
According to another embodiment of the invention, a method for testing an electronic component is provided. The method comprises the following steps: receiving a first image of the electronic component captured when light in a first direction illuminates the electronic component and a second image of the electronic component captured when light in a second direction illuminates the electronic component, the first direction being different from the second direction, wherein the electronic component is an electronic component to be tested; combining the first and second images to generate a combined image, the distribution of gray values at each pixel in the combined image characterizing the relative depth at each corresponding location in the electronic component; and outputting the combined image.
According to various embodiments of the present invention, it is only necessary to illuminate the electronic component to be tested from different directions with a light source, acquire at least two images of the component corresponding to those directions, and combine the acquired images so that the gray-value distribution in the combined image characterizes relative depth information; bumps and/or pits in the electronic component can then be detected from the combined image. On the one hand, by simply combining images acquired under illumination from different directions, the gray-value distribution of the combined image represents the relative depth at each position, so that bumps or pits in the electronic component can be detected automatically from that distribution, saving time and labor. On the other hand, the apparatus and method for testing electronic components according to embodiments of the present invention require only simple modifications of a current test machine. Compared with a depth image carrying true depth information acquired by a stereo camera, a camera array, structured light, or the like, the combined image according to embodiments of the present invention only needs to convey depth cues (it may be called a pseudo-3D image) without obtaining true depth values, which greatly reduces cost while keeping the computation simple. In particular, the at least two images described above can be obtained with the single camera already provided in a conventional inspection machine, without additional cameras or camera arrays; it is only necessary to move the light source or provide light sources at multiple positions while keeping a single camera at a fixed position. In addition, the light illuminating the electronic component from different directions need not be designed as complicated structured light. The complexity and cost of the apparatus are therefore greatly reduced.
Still further advantages of the present invention will be apparent to those of ordinary skill in the art upon reading and understanding the following detailed description.
Drawings
FIG. 1 shows a block diagram of an apparatus for testing electronic components according to one embodiment of the invention;
FIGS. 2(a) and 2(b) respectively show images acquired when an electronic component to be tested is illuminated in different directions according to one embodiment of the present invention;
FIG. 3 illustrates a combined image resulting from combining images acquired when electronic components to be tested are illuminated in different directions according to one embodiment of the present invention;
FIGS. 4(a) and 4(b) illustrate a convex defect and a concave defect, respectively, in a combined image according to one embodiment of the present invention; and
Fig. 5 shows a flow diagram of a method for testing an electronic component according to an embodiment of the invention.
Detailed Description
Fig. 1 shows a block diagram of an apparatus 100 for testing electronic components according to an embodiment of the invention.
The apparatus 100 comprises a support 101 on which an electronic component to be tested, for example a semiconductor chip 10 having a surface, can be carried. During the manufacture of semiconductor chips, defects may be left on the chip surface due to the process technology and other factors. One skilled in the art will appreciate that the electronic component can be any electronic component for which defect detection is desired, such as a metal sheet, a wire, etc.
The apparatus 100 further comprises an image acquisition device 102, such as a camera, which can be arranged above the support 101 at a position from which the semiconductor chip 10 can be imaged. The camera 102 is capable of capturing at least two images of the semiconductor chip 10 from the same angle without moving itself, each image corresponding to a different direction of the light illuminating the semiconductor chip 10. For example, when light from one direction irradiates the semiconductor chip 10, the camera captures an image of the semiconductor chip 10. In this way, after light from at least two directions 11, 12 has illuminated the semiconductor chip 10 in turn, at least two images of the semiconductor chip 10 are obtained. It should be noted that illuminating the semiconductor chip 10 is to be understood as lighting it brightly enough for the camera to image the entire chip. It can be appreciated that light from different directions necessarily causes the same edge of the semiconductor chip 10 to be illuminated to different degrees in the different images.
The apparatus 100 further comprises an image combination unit 103 capable of generating a combined image of the semiconductor chip 10 from at least two images acquired by the camera. The combined image is different from an image acquired with the semiconductor chip 10 illuminated from one direction, wherein the distribution of the gray values of the individual pixels in the combined image is capable of representing the relative depth of the parts of the semiconductor chip 10 in the image. The distribution of the gray values of the respective pixels particularly refers to the distribution of the gray values between the specific pixel and the respective pixels in its neighborhood, for example, the relative relationship between the gray value of the specific pixel and the gray value of the respective pixels in its neighborhood.
For example, the distribution of the gray values of the individual pixels of the combined image (i.e., the relative relationship of the gray values of the individual pixels to the pixels in their neighborhood) can cause the image to visually exhibit a stereoscopic effect.
In one example, a gray-value difference (e.g., a gray gradient) is determined for each pixel in a particular image relative to the pixels in its neighborhood. In this way, if the difference between a specific image region and its surrounding region is found to satisfy a predetermined criterion, it can be judged whether that image region is a convex defect or a concave defect. The predetermined criterion, which comprises a threshold value, can be set manually, for example empirically, or can be determined by a machine learning method, as will be described below.
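For illustration only, the neighborhood comparison just described could be sketched along the following lines in Python; the window size and threshold below are placeholder values chosen for demonstration and are not taken from this patent:

```python
import numpy as np
from scipy import ndimage

def neighborhood_difference(combined: np.ndarray, size: int = 15) -> np.ndarray:
    """Difference between each pixel of the combined (pseudo-3D) image and the
    mean gray value of its neighborhood; positive means brighter than the
    surroundings, negative means darker."""
    img = combined.astype(np.float32)
    local_mean = ndimage.uniform_filter(img, size=size)
    return img - local_mean

def flag_candidate_regions(diff: np.ndarray, threshold: float = 20.0) -> np.ndarray:
    """Binary mask of pixels whose deviation from the neighborhood exceeds a
    threshold; the sign pattern of `diff` inside each flagged region is what
    would then separate bumps from pits."""
    return np.abs(diff) > threshold
```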
Since the combined image does not give actual depth values, it can be understood as a pseudo-3D image. Although this embodiment of the invention has been described using gray values of the pixels, the skilled person will appreciate that other properties of the pixels can be used to characterize relative depth information in the combined image, for example the chrominance values of a color image.
The apparatus 100 further comprises an output unit 104 for outputting the combined image.
In one embodiment, the output unit 104 includes a display unit.
The display unit can be used to display the combined image, for example to a user for viewing. In general, a user can directly judge the protrusion defect and/or the depression defect in the semiconductor chip 10 from the displayed combined image and the visual stereoscopic effect embodied in the pixel gray value distribution on the image. This requires that the defects be sufficiently visible in the image to be discernible by the naked eye.
In a preferred embodiment, to enable more robust and accurate defect detection, and to enable a more automated process flow, the apparatus 100 further comprises a defect detection and classification unit 105.
In this case, the combined image is input to the defect detection and classification unit 105 for further processing to identify bumps or pits in it. The defect detection and classification unit 105 can automatically identify defects of the semiconductor chip 10 based on the combined image, and can automatically determine whether a defect is a convex defect or a concave defect. The skilled person will be able to envisage a number of existing image processing techniques for identifying protrusion defects and/or depression defects from the distribution of gray values in the combined image. For example, by image segmentation, regions of the image whose pixels have similar gray values can be separated out to form different regions on the image. From the gray-value difference between different regions and/or the characteristics of the transition regions between them, convex defects and/or concave defects in the combined image can be identified.
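A minimal sketch of such a segmentation step, assuming simple gray-value thresholds and connected-component labeling (the cut-off values are illustrative assumptions, not values from the patent), might look like this:

```python
import numpy as np
from scipy import ndimage

def segment_candidate_regions(combined: np.ndarray, low: int = 60, high: int = 200):
    """Label connected dark and bright regions of the combined image.

    In a pseudo-3D combined image, a bright patch adjacent to a dark patch
    along the illumination axis is a candidate bump or pit; which of the two
    it is follows from which side is bright and which is dark."""
    dark_labels, n_dark = ndimage.label(combined < low)
    bright_labels, n_bright = ndimage.label(combined > high)
    return dark_labels, n_dark, bright_labels, n_bright
```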
In a preferred embodiment, the defect detection and classification unit 105 is able to identify convex defects and/or concave defects in the combined images based on a pre-trained model by a machine learning method.
For example, the model is trained on a plurality of images having a particular gray-value distribution and corresponding known convex and/or concave defects. The gray-value distributions in these training images are similar to the gray-value distribution of the combined image according to the invention, i.e. they likewise characterize the relative depth of parts shown in the images. In one example, the predetermined criterion for judging a protrusion defect or a depression defect can be determined by such a machine learning method.
The combined image obtained according to the principles of the present invention is input into a trained model that is capable of automatically outputting whether a bump defect and/or a pit defect is included in the input combined image.
The machine learning method based on the training model further increases the accuracy and robustness of defect identification.
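Purely as an illustrative sketch (the framework and architecture are assumptions; the patent only requires a model pre-trained on images with known bump/pit labels), such a classifier could be realized as a small convolutional network over grayscale patches of the combined image:

```python
import torch
import torch.nn as nn

class BumpPitClassifier(nn.Module):
    """Illustrative 3-class classifier (no defect / bump / pit) for grayscale
    patches cut from the combined image; architecture chosen for demonstration."""
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_classes)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, H, W) patches of the combined image, scaled to [0, 1]
        return self.head(self.features(x))
```

Training would then proceed in the usual supervised way on patches labeled with known convex and concave defects.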
The detection result of the convex defect and/or the concave defect by the defect detection and classification unit 105 can be output through the output unit 104.
In one embodiment, the apparatus 100 further comprises an illumination device 106. The illumination device 106 can include a light source commonly used in the art for illumination, such as a point light source. The lighting device 106 can include a plurality of light sources arranged in different locations to emit light from different directions. The plurality of light sources are sequentially activated to illuminate the semiconductor chip 10 from different directions 11, 12, respectively. Alternatively, the lighting device 106 can include only a single light source. The single light source can be rotated to different angles or moved to different positions by an actuating device (not shown) comprised in the lighting device 106 to illuminate the semiconductor chip 10 from different directions 11, 12, respectively.
It can be appreciated that the illumination device 106 can be provided as part of the apparatus 100. Alternatively, the lighting device 106 can be an external lighting device of the apparatus 100.
In a preferred embodiment, light is emitted toward the semiconductor chip 10 from two opposite directions above it, each illuminating the chip in turn. For example, light is emitted to the semiconductor chip 10 from two opposite directions located at its upper left and upper right, respectively, and one image of the semiconductor chip 10 is acquired under the illumination from each direction.
Those skilled in the art will appreciate that the image acquisition device 102, the image combination unit 103, the output unit 104 and the defect detection and classification unit 105 can be implemented as software modules controlling hardware having corresponding functions.
Fig. 2(a) shows an image 201 acquired when the semiconductor chip is illuminated from the upper left of the two opposing directions, and fig. 2(b) shows an image 202 acquired when the semiconductor chip is illuminated from the upper right of the two opposing directions. The two images are combined into a combined image.
As can be seen from fig. 2(a) and 2(b), the images acquired when the semiconductor chip is illuminated from a single direction each have their own distribution of gray values, but the gray values in these images only give a planar visual effect. Even though it can be determined from these images that different regions have different shades, it cannot be intuitively determined which regions are deeper and which regions are shallower.
Fig. 3 shows a combined pseudo-3D image 301 obtained by combining image 201 and image 202 according to an embodiment of the invention. The corresponding pixels of image 201 and image 202 are combined by the following formula:

log(x1 + 1.0) - log(x2 + 1.0)    (1),

wherein x1 represents the gray value of the corresponding pixel of image 201, and x2 represents the gray value of the corresponding pixel of image 202. The logarithms of x1 and x2 are taken to expand the dynamic range of the gray values so that the resulting combined image contains less noise. Before taking the logarithm, 1 is added to each gray value to ensure that the logarithm is never computed for a zero value.

The values obtained according to equation (1) above are normalized to the range 0-255 to form the gray values of the corresponding pixels of image 301. Thus, the gray value of each pixel of image 301 is given by:

normalization(log(x1 + 1.0) - log(x2 + 1.0), 0, 255)    (2),

such that each pixel of the resulting image 301 has a gray value in the range 0-255.
In addition to the logarithm-based combination above, image 201 and image 202 may also be combined, and the gray value of each pixel of image 301 determined, by the following formula:

normalization(1/(256 - x1) - 1/(256 - x2), 0, 255)    (3).
Other ways of combining the images can be envisaged by the skilled person in the light of the teachings of the present application.
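As a minimal sketch of equations (1)-(3) (the exact normalization mapping is not spelled out in the text, so simple min-max scaling to 0-255 is assumed here), the combination could be implemented as follows:

```python
import numpy as np

def _normalize_to_255(diff: np.ndarray) -> np.ndarray:
    # Min-max scale the signed difference into the 0-255 gray range
    # (assumed realization of the "normalization(..., 0, 255)" step).
    norm = (diff - diff.min()) / (diff.max() - diff.min() + 1e-12)
    return (norm * 255.0).astype(np.uint8)

def combine_log(img1: np.ndarray, img2: np.ndarray) -> np.ndarray:
    """Pseudo-3D combination per equations (1)-(2): log-difference of the
    gray values of corresponding pixels, then normalized to 0-255."""
    x1, x2 = img1.astype(np.float64), img2.astype(np.float64)
    return _normalize_to_255(np.log(x1 + 1.0) - np.log(x2 + 1.0))

def combine_reciprocal(img1: np.ndarray, img2: np.ndarray) -> np.ndarray:
    """Alternative combination per equation (3)."""
    x1, x2 = img1.astype(np.float64), img2.astype(np.float64)
    return _normalize_to_255(1.0 / (256.0 - x1) - 1.0 / (256.0 - x2))
```

Here img1 and img2 would be the 8-bit grayscale images acquired under the two illumination directions (images 201 and 202).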
As can be clearly seen in fig. 3, the image 301 visually conveys a stereoscopic effect even though it contains only gray-value information and no depth value for any pixel. The three-dimensional structure of the surface of the semiconductor chip 10 can therefore be determined from the image 301, both visually and by an automatic algorithm.
Fig. 4(a) and 4(b) show partial magnified images 401, 402, respectively, of a combined image according to an embodiment of the invention, wherein the partial magnified image 401 of fig. 4(a) shows a circled convex defect and the partial magnified image 402 of fig. 4(b) shows a circled concave defect. Comparing images 401 and 402, the appearances of the protrusion defect and the depression defect differ markedly and conform to the human visual perception of a protrusion and a depression, respectively.
Returning to fig. 1, the principles of the present application allow the angle θ between the direction of the light emitted from the light source and the surface of the semiconductor chip 10 to be as small as 5°-25°, within the allowable noise level. As a result, the present application enables the detection of bumps and/or pits with a height or depth as small as 10 μm. By contrast, stereo cameras based on binocular parallax generally require the two cameras to be placed close to directly above the semiconductor chip, so such a small included angle θ is not possible, and the sensitivity to shallow defects achieved by the present application cannot be reached.
Experimental data have shown that the principles of the present application can achieve a detection accuracy of up to 99.55% for defects deeper or higher than 30 μm and up to 97.69% for defects between 10 μm and 30 μm.
Fig. 5 shows a flow diagram of a method for testing an electronic component according to an embodiment of the invention.
In 501, a first image of the electronic component captured when light in a first direction illuminates the electronic component and a second image of the electronic component captured when light in a second direction illuminates the electronic component are received, the first direction being different from the second direction.
In 502, the first and second images are combined to generate a combined image, a distribution of gray values at respective pixels in the combined image characterizing relative depths at respective corresponding locations in the electronic component.
In 503, the combined image is output.
Although the embodiments of the present invention have been described with reference to combining a first image and a second image, it is to be understood that a plurality of images may also be combined as necessary.
It is understood that the method according to the present application has the same and/or similar embodiments as the apparatus according to the present application.
The method according to the present application may be implemented by running a computer program in the form of code. The computer program can be stored in any suitable storage medium, for example, a non-transitory computer readable storage medium.
A memory storing the above-mentioned computer program can be included, together with a processor capable of running the computer program stored in that memory to perform the method of the invention, in a system for testing electronic components.
While the method of the present invention has been described above with reference to only the embodiment shown in fig. 5, it is to be understood that the operations included in the above embodiment are not restrictive, and may be deleted, combined, changed, split and/or recombined as necessary to add/modify/delete the corresponding functions.
The systems and methods of the present invention have been described above with reference to various embodiments, which may include a particular feature, structure, or characteristic, but not every embodiment necessarily includes the particular feature, structure, or characteristic. In addition, some embodiments may have some, all, or none of the features described for other embodiments.
Various features of different embodiments or examples may be combined in various ways with some features included and others excluded to accommodate various different applications. The figures and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may be combined into a single functional element. Alternatively, some elements may be divided into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, the order of the processes described herein may be changed and is not limited to the manner described herein. Moreover, the operations of any flow diagram need not be implemented in the order shown; nor does it necessarily require all operations to be performed. Further, those operations that are not dependent on other operations may be performed in parallel with the other operations. The scope of the embodiments is in no way limited by these specific examples. Many variations, such as differences in the order of operations, product compositions, and structures, are possible, whether or not explicitly set forth in the specification.

Claims (14)

1. An apparatus for testing electronic components, comprising:
an image pickup device for picking up a first image of the electronic component when light in a first direction irradiates the electronic component, and picking up a second image of the electronic component when light in a second direction irradiates the electronic component, the first direction being different from the second direction, wherein the electronic component is an electronic component to be tested;
an image combining unit for combining the first and second images acquired under different light irradiation directions to generate a combined image, a distribution of gray values at respective pixels in the combined image characterizing relative depths at respective corresponding positions in the electronic component; and
an output unit for outputting the combined image;
wherein the image combining unit determines data representing a difference in gray values at corresponding pixels in the first and second images based on the gray values of the corresponding pixels in the first and second images, and determines a gray value for each pixel in the combined image based on the data.
2. The apparatus of claim 1, wherein the output unit further comprises:
a display unit for displaying the combined image.
3. The apparatus of claim 1, further comprising:
a defect detection and classification unit for detecting a bump defect and/or a pit defect in the electronic component based on the combined image;
wherein the output unit is used for outputting the detection result of the convex defect and/or the concave defect.
4. The apparatus of claim 3, wherein,
the defect detection and classification unit detects bump defects and/or recess defects in the electronic component based on a pre-trained model, wherein the pre-trained model is trained from a plurality of images, each of the plurality of images showing known bump defects and/or recess defects and having a corresponding gray value distribution.
5. The apparatus of any of claims 1-4, further comprising:
an illumination device for illuminating the electronic component from the first direction and the second direction, respectively.
6. The apparatus of any one of claims 1-4,
the image combining unit determines the data by the following formula:
log(x1 + 1.0) - log(x2 + 1.0),
wherein x1 is the gray value of a pixel in the first image, and x2 is the gray value of the corresponding pixel in the second image;
and the image combining unit determines a gray value for each pixel in the combined image by normalizing the data.
7. The apparatus of any one of claims 1-4,
wherein the first direction and the second direction each form an angle in the range of 5°-25° with the surface of the electronic component.
8. A method for testing an electronic component, comprising:
receiving a first image of the electronic component captured when light in a first direction illuminates the electronic component and a second image of the electronic component captured when light in a second direction illuminates the electronic component, the first direction being different from the second direction, wherein the electronic component is an electronic component to be tested;
combining the first and second images acquired at different light illumination directions to generate a combined image, the distribution of gray values at respective pixels in the combined image characterizing the relative depth at each corresponding location in the electronic component; and
outputting the combined image;
wherein combining the first image and the second image comprises:
determining data representing a difference in grayscale values at corresponding pixels in the first and second images based on grayscale values of the corresponding pixels in the first and second images; and
determining a grayscale value for each pixel in the combined image based on the data.
9. The method of claim 8, wherein outputting the combined image further comprises:
displaying the combined image.
10. The method of claim 8, further comprising:
detecting a bump defect and/or a recess defect in the electronic component based on the combined image; and
outputting the detection result of the convex defect and/or the concave defect.
11. The method of claim 10, wherein detecting bump defects and/or recess defects in the electronic component based on the combined image further comprises:
detecting bump defects and/or recess defects in the electronic component based on a pre-trained model, wherein the pre-trained model is trained from a plurality of images, each of the plurality of images showing known bump defects and/or recess defects and having a respective grey value distribution.
12. The method of any of claims 8-11, wherein the data is determined by the formula:
log(x1 + 1.0) - log(x2 + 1.0),
wherein x1 is the gray value of a pixel in the first image, and x2 is the gray value of the corresponding pixel in the second image; and
determining a grayscale value for each pixel in the combined image by normalizing the data.
13. The method according to any one of claims 8-11,
wherein the first direction and the second direction each form an angle in the range of 5°-25° with the surface of the electronic component.
14. A system for testing electronic components, comprising:
a memory storing computer program code; and
a processor running the computer program code to perform the method of any of claims 8-13.
CN201910711395.9A 2019-08-02 2019-08-02 Electronic component testing apparatus and method Active CN110441315B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910711395.9A CN110441315B (en) 2019-08-02 2019-08-02 Electronic component testing apparatus and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910711395.9A CN110441315B (en) 2019-08-02 2019-08-02 Electronic component testing apparatus and method

Publications (2)

Publication Number Publication Date
CN110441315A CN110441315A (en) 2019-11-12
CN110441315B true CN110441315B (en) 2022-08-05

Family

ID=68433006

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910711395.9A Active CN110441315B (en) 2019-08-02 2019-08-02 Electronic component testing apparatus and method

Country Status (1)

Country Link
CN (1) CN110441315B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111044522B (en) * 2019-12-14 2022-03-11 中国科学院深圳先进技术研究院 Defect detection method and device and terminal equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102567960A (en) * 2010-12-31 2012-07-11 同方威视技术股份有限公司 Image enhancing method for security inspection system
CN108830796A (en) * 2018-06-20 2018-11-16 重庆大学 Based on the empty high spectrum image super-resolution reconstructing method combined and gradient field is lost of spectrum

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4974261A (en) * 1988-11-15 1990-11-27 Matsushita Electric Works, Ltd. Optical surface inspection method
DE4410603C1 (en) * 1994-03-26 1995-06-14 Jenoptik Technologie Gmbh Detecting faults during inspection of masks, LCDs, circuit boards and semiconductor wafers
US8223327B2 (en) * 2009-01-26 2012-07-17 Kla-Tencor Corp. Systems and methods for detecting defects on a wafer
KR102058427B1 (en) * 2017-12-21 2019-12-23 동우 화인켐 주식회사 Apparatus and method for inspection
CN108445007B (en) * 2018-01-09 2020-11-17 深圳市华汉伟业科技有限公司 Detection method and detection device based on image fusion
CN109871895B (en) * 2019-02-22 2021-03-16 北京百度网讯科技有限公司 Method and device for detecting defects of circuit board
CN109934808B (en) * 2019-03-04 2020-11-27 佛山市南海区广工大数控装备协同创新研究院 PCB defect classification method based on image multi-shape normal gradient difference

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102567960A (en) * 2010-12-31 2012-07-11 同方威视技术股份有限公司 Image enhancing method for security inspection system
CN108830796A (en) * 2018-06-20 2018-11-16 重庆大学 Based on the empty high spectrum image super-resolution reconstructing method combined and gradient field is lost of spectrum

Also Published As

Publication number Publication date
CN110441315A (en) 2019-11-12

Similar Documents

Publication Publication Date Title
KR102084535B1 (en) Defect inspection device, defect inspection method
US10282629B2 (en) Main-subject detection method, main-subject detection apparatus, and non-transitory computer readable storage medium
US11747284B2 (en) Apparatus for optimizing inspection of exterior of target object and method thereof
JP5342413B2 (en) Image processing method
CN116503388B (en) Defect detection method, device and storage medium
CN108431824A (en) Image processing system
CN109870730B (en) Method and system for regular inspection of X-ray machine image resolution test body
CN110441315B (en) Electronic component testing apparatus and method
CN105572133B (en) Flaw detection method and device
KR101696086B1 (en) Method and apparatus for extracting object region from sonar image
JP6623545B2 (en) Inspection system, inspection method, program, and storage medium
Abdusalomov et al. Robust shadow removal technique for improving image enhancement based on segmentation method
KR20150009842A (en) System for testing camera module centering and method for testing camera module centering using the same
CN108827594B (en) Analytical force detection method and detection system of structured light projector
KR102015620B1 (en) System and Method for detecting Metallic Particles
CN113348484A (en) Computerized system and method for obtaining information about an area of an object
JP4967132B2 (en) Defect inspection method for object surface
JP6595800B2 (en) Defect inspection apparatus and defect inspection method
WO2019161486A1 (en) Image processing system for inspecting object distance and dimensions using a hand-held camera with a collimated laser
EP4010873B1 (en) Use of an hdr image in a visual inspection process
CN116523882B (en) Vision-based optical target area accuracy detection method and system
CN117459700B (en) Color luminosity three-dimensional imaging method, system, electronic equipment and medium
Khatyreva et al. Unsupervised anomaly detection for industrial manufacturing using multiple perspectives of free falling parts
KR20190042180A (en) Cover-glass analyzing method
KR102655139B1 (en) Camera module false feature classification method and system according to image segmentation inspection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant