CN110992326A - A fast tilt correction method for QFN chip pin image - Google Patents

A fast tilt correction method for QFN chip pin image

Info

Publication number
CN110992326A
Authority
CN
China
Prior art keywords
image
point
corner
points
contour
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911182587.1A
Other languages
Chinese (zh)
Other versions
CN110992326B (en)
Inventor
巢渊
周伟
刘文汇
唐寒冰
李龑
李兴成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu University of Technology
Original Assignee
Jiangsu University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu University of Technology filed Critical Jiangsu University of Technology
Priority to CN201911182587.1A priority Critical patent/CN110992326B/en
Publication of CN110992326A publication Critical patent/CN110992326A/en
Application granted granted Critical
Publication of CN110992326B publication Critical patent/CN110992326B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/24Aligning, centring, orientation detection or correction of the image
    • G06V10/242Aligning, centring, orientation detection or correction of the image by image rotation, e.g. by 90 degrees
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/30Noise filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20112Image segmentation details
    • G06T2207/20164Salient point detection; Corner detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30164Workpiece; Machine component

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract


The invention discloses a fast tilt correction method for a QFN chip pin image, comprising the following steps: (1) acquiring a QFN chip pin image and applying filtering and binarization; (2) extracting the contour of the central pad of the chip image using a polygon approximation method; (3) proposing an improved Harris corner detection algorithm to obtain the contour vertices; (4) fitting a straight line through the two farthest-apart vertices by the least squares method and using this line as the angle identification direction; (5) taking the image centroid coordinates as the rotation center, quickly correcting the chip pin image and removing the white border. The correction method provides a theoretical basis for correcting QFN chips faster and more accurately, and improves the efficiency of visual inspection of QFN package defects.


Description

QFN chip pin image rapid inclination correction method
Technical Field
The invention belongs to the field of image processing algorithm design. It improves the Harris corner detection algorithm and, combined with a polygon approximation method, designs a fast tilt correction method for QFN chip pin images.
Background
QFN (Quad Flat No-lead Package) is a leadless package, square or rectangular in shape, that conducts heat through the central pad at the bottom of the package; conductive pads surrounding the central pad around the package periphery provide the electrical connections. Because QFN chips have a certain dimensional error in production, a margin is left when the tray pockets that carry them are manufactured; as a result, a chip may sit tilted after being placed into a tray pocket, which affects the visual inspection of its packaging quality.
Meanwhile, because the chips are small, low-accuracy manual detection and identification can no longer meet the requirements of current chip production, and an effective technique that can quickly and accurately correct chip images is urgently needed. Existing research on QFN chips mainly focuses on structural improvement, manufacturing processes, appearance inspection and defect detection, and no documents or patents related to fast tilt correction of QFN chip images have been found so far. Designing a method for fast tilt correction of QFN chip pin images therefore fills this gap in existing research and is particularly important for improving the efficiency of visual inspection of QFN package defects.
Disclosure of Invention
Aiming at the problems in the prior art, the invention proposes an improved Harris corner detection algorithm and, combined with a polygon approximation method, designs a fast tilt correction method for QFN chip pin images that is faster than traditional algorithms and significantly more efficient.
The technical scheme of the invention is as follows: a fast tilt correction method for QFN chip pin images comprises the following steps:
1. preprocessing a chip pin image acquired by an industrial personal computer:
1.1 image filtering:
to remove noise and reduce image distortion, a 5 × 5 Gaussian filter is convolved with the image to smooth it and reduce the obvious effect of noise on the edge detector; in image processing, a two-dimensional Gaussian function is commonly used for filtering, calculated as follows:
G(x, y) = A · exp(−(x² + y²) / (2σ²))
where G(x, y) is the two-dimensional Gaussian function, (x, y) are the point coordinates, σ is the standard deviation, and A is a normalization coefficient that makes the weights sum to one;
1.2 binarization processing:
the image is binarized using a fixed-threshold method, calculated as follows:
g(x, y) = 255 if f(x, y) ≥ T, and g(x, y) = 0 otherwise
where f(x, y) represents the distribution function of image pixel values, g(x, y) represents the pixel-value distribution function after threshold segmentation, and the fixed threshold T = 145;
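The two preprocessing operations map directly onto standard OpenCV calls. The following minimal Python sketch is illustrative only; the grayscale input, the σ derived from the 5×5 kernel, and the use of 255 as the foreground value are assumptions, while T = 145 follows the text:

    import cv2

    def preprocess(gray):
        # step 1.1: 5x5 Gaussian filtering; sigma=0 lets OpenCV derive it from the kernel size
        smoothed = cv2.GaussianBlur(gray, (5, 5), 0)
        # step 1.2: fixed-threshold binarization with T = 145
        _, binary = cv2.threshold(smoothed, 145, 255, cv2.THRESH_BINARY)
        return binary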
2. extracting the target contour by using a polygon approximation method, which specifically comprises the following steps:
2.1 edge detection on the chip pin image:
Canny operator edge detection is used to obtain the edge contour information of the chip pin image;
2.2 extracting the target contour, namely the contour of the central pad of the chip pin image, by the polygon approximation method:
the contour of the central pad of the chip pin image is extracted by the polygon approximation method, and the remaining contour parts are filtered out, which simplifies the subsequent image processing. The polygon approximation method first picks the two farthest points on the target contour and connects them; then the point on the target contour farthest from this line segment is found and added to the approximated new contour, i.e. the triangle formed by connecting the three points is taken as the contour; finally, starting from any side of the triangle, the previous step is repeated, the farthest point is added to the new contour, and the iteration continues until the required output accuracy is met;
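The iterative farthest-point refinement described above corresponds to the Douglas-Peucker scheme, which OpenCV exposes as approxPolyDP. The sketch below is only an illustration of this step; the Canny thresholds, the rule that the central pad is the largest contour, and the approximation tolerance are assumptions rather than values from the patent:

    import cv2

    def extract_pad_polygon(binary):
        # step 2.1: Canny edge detection (thresholds assumed)
        edges = cv2.Canny(binary, 50, 150)
        # OpenCV 4.x returns (contours, hierarchy)
        contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
        # assume the central pad is the largest closed contour
        pad = max(contours, key=cv2.contourArea)
        # step 2.2: Douglas-Peucker polygon approximation; tolerance assumed
        eps = 0.01 * cv2.arcLength(pad, True)
        return cv2.approxPolyDP(pad, eps, True)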
3. an improved Harris corner detection algorithm is provided, and a target contour vertex is obtained, and the method specifically comprises the following steps:
3.1. determining a selection threshold, and acquiring a target contour corner:
in general, when the difference between the pixel gray levels of two black-and-white images is less than 10%–15% of the maximum pixel gray value, the human eye can hardly distinguish them, so a threshold N is selected to extract the corner points of the image target contour, with the formula:
N=255×12%≈30
3.2. extracting corner points of the target contour:
All corner points are extracted and stored, and three of them, M_(a−n), M_a and M_(a+n), are read in sequence to form a template of three elements, with M_a as the template center and the subscript of a point denoting its serial number among all corner points. All corner points are traversed; starting from the initial value, M_a is taken as the real-time operating point and the two points whose serial numbers differ from it by n are taken. M_a forms two edges with the other two points of the template, and the included angle between these two edges is taken as the corner response value of point M_a. The distance between point M_a and point M_(a−n) determines edge L1, the distance between point M_a and point M_(a+n) determines edge L2, and the distance between point M_(a−n) and point M_(a+n) determines edge L3. The angle at point M_a is then obtained from the three edges by the law of cosines, calculated as follows:
θ_a = arccos((L1² + L2² − L3²) / (2 · L1 · L2))
Whether a corner point is retained is judged by the included angle formed by the two edges, i.e. the corner response value. Let g be the corner response threshold: if the angle at point M_a is greater than or equal to g, the point is saved as a required corner point; otherwise it is removed;
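A minimal sketch of the law-of-cosines corner response described above, assuming the candidate corner points are supplied as an ordered array; the step n = 5 and the retention threshold g = 160° are illustrative values, not values given in the patent:

    import numpy as np

    def corner_angle_filter(points, n=5, g=160.0):
        # points: ordered (x, y) corner candidates; returns those whose
        # included angle (corner response value) satisfies the retention rule
        pts = np.asarray(points, dtype=float)
        kept = []
        num = len(pts)
        for a in range(num):
            p_prev, p_cur, p_next = pts[(a - n) % num], pts[a], pts[(a + n) % num]
            L1 = np.linalg.norm(p_cur - p_prev)
            L2 = np.linalg.norm(p_cur - p_next)
            L3 = np.linalg.norm(p_prev - p_next)
            if L1 == 0 or L2 == 0:
                continue  # degenerate template, skip
            cos_t = np.clip((L1**2 + L2**2 - L3**2) / (2 * L1 * L2), -1.0, 1.0)
            angle = np.degrees(np.arccos(cos_t))
            if angle >= g:  # retention rule as stated in the text
                kept.append(p_cur)
        return np.array(kept)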
3.3. eliminating adjacent angular points, and keeping contour vertexes:
After the contour corner points are extracted, other corner points may still exist around them. To eliminate this, adjacent corner points are removed and the remaining points are taken as the contour corner vertices. Let the image height be H and the image width be W, let Corner(x, y) indicate whether there is a corner at image position (x, y), with Corner(x, y) = 1 meaning that (x, y) is a corner, and let m×m (m > 1) be the size of the matrix centered on (x, y); then:
count = Σ Corner(i, j), with (i, j) ranging over the m×m matrix centered on (x, y)
where m ≤ x ≤ H, m ≤ y ≤ W, and count is the number of corner points within the matrix range centered on (x, y); all neighboring corner points within the matrix range centered on (x, y) are removed, and the corner points remaining at the corners of the target contour are kept as vertices, to facilitate the subsequent image correction;
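One way to implement the adjacency elimination is to keep a single representative corner per m×m window, as in the sketch below; treating the corners as a binary mask and the value m = 9 are assumptions made for illustration:

    import numpy as np

    def suppress_adjacent_corners(corner_mask, m=9):
        # corner_mask[y, x] == 1 where Corner(x, y) = 1
        H, W = corner_mask.shape
        r = m // 2
        kept = np.zeros_like(corner_mask)
        claimed = np.zeros((H, W), dtype=bool)
        for y, x in zip(*np.nonzero(corner_mask)):
            if claimed[y, x]:
                continue          # already suppressed by a neighbouring corner
            kept[y, x] = 1
            y0, y1 = max(0, y - r), min(H, y + r + 1)
            x0, x1 = max(0, x - r), min(W, x + r + 1)
            claimed[y0:y1, x0:x1] = True  # suppress the m x m neighbourhood
        return kept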
4. fitting a straight line by a least square method:
the two vertices of the longest side of the target contour are fitted with a straight line by the least squares method, and this line serves as the angle identification direction of the chip pin image;
5. quickly correcting the chip according to the image centroid and removing the white border, which specifically includes:
5.1. taking the centroid of the image as the rotation center:
to correct the image accurately, the center position of the image is determined by the centroid method, and the centroid coordinate point O is used as the rotation center of the image;
5.2. obtaining the image tilt angle and correcting the chip pin image.
Further, in step 4, the two vertex coordinates are defined as p(x, y) and q(x, y), and the least squares fitted straight line is calculated as follows:
y=ax+b
a and b are the slope and intercept, respectively, of the linear equation, then:
a = Σ(x_i − x̄)(y_i − ȳ) / Σ(x_i − x̄)²,  b = ȳ − a·x̄
where x̄ and ȳ are the means of the abscissas and ordinates of the vertices p and q, respectively:
x̄ = (x_p + x_q)/2,  ȳ = (y_p + y_q)/2
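With only the two longest-edge vertices, the least squares formulas above reduce to the line through p and q; a minimal sketch, assuming x_p ≠ x_q:

    import numpy as np

    def fit_direction_line(p, q):
        xs = np.array([p[0], q[0]], dtype=float)
        ys = np.array([p[1], q[1]], dtype=float)
        x_bar, y_bar = xs.mean(), ys.mean()
        a = np.sum((xs - x_bar) * (ys - y_bar)) / np.sum((xs - x_bar) ** 2)  # slope
        b = y_bar - a * x_bar                                                # intercept
        return a, b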
Further, in step 5.1, the upper left corner of the image is set as the starting coordinate (0, 0) and the lower right corner as the end coordinate (m, n), and the image centroid formula is as follows:
x_0 = ΣΣ x·f(x, y) / ΣΣ f(x, y),  y_0 = ΣΣ y·f(x, y) / ΣΣ f(x, y)  (the sums running over all image pixels)
where (x_0, y_0) are the centroid coordinates, m and n are respectively the number of rows and columns of the image (both integers greater than or equal to 2), and f(x, y) is the gray value of the image at point (x, y).
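A short sketch of the gray-level-weighted centroid used as the rotation center; the row/column indexing convention is an assumption:

    import numpy as np

    def image_centroid(gray):
        f = gray.astype(np.float64)
        total = f.sum()
        ys, xs = np.mgrid[0:f.shape[0], 0:f.shape[1]]
        x0 = (xs * f).sum() / total  # gray-weighted column coordinate
        y0 = (ys * f).sum() / total  # gray-weighted row coordinate
        return x0, y0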
Further, in step 5.2, the chip image inclination angle α can be obtained by the following calculation formula:
tan α = l / x
the horizontal offset x of the pin image can be obtained as follows:
x = p_x − h_x
the pin image tilt angle α is then:
α = arctan(l / x)
where l is the vertical distance from point p to point O, OA is the extension of the centroid coordinate in the positive x-axis direction, and h(x, y) is the coordinate of the intersection of pq and OA.
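Since OA is horizontal and both p and h lie on the fitted line pq, the ratio l / x equals the magnitude of the fitted slope a, so the tilt angle can be taken directly from the slope. A minimal sketch under that observation; the sign convention (positive = counterclockwise) is an assumption:

    import numpy as np

    def tilt_angle_deg(a):
        # a: slope of the fitted line y = a*x + b; equivalent to arctan(l / x)
        return float(np.degrees(np.arctan(a)))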
The invention has the beneficial effects that:
the quick inclination correction method for the QFN chip pin image, disclosed by the invention, provides a certain theoretical basis for correcting the QFN chip more quickly and accurately, and improves the visual detection efficiency of QFN package defects.
Drawings
FIG. 1a is a QFN chip original image, FIG. 1b is a Gaussian filter image, and FIG. 1c is a binary image;
FIG. 2a is a Canny edge detection image, and FIG. 2b is a polygon approximation contour image;
FIG. 3 is a corner image acquired from a threshold;
FIG. 4 is a diagram illustrating the extraction of the vertices of the contour of an image object;
FIG. 5 is a schematic diagram of the straight line fitted by the least squares method through the two vertices of the longest contour side;
FIG. 6 is a schematic view of angular misalignment;
FIG. 7a is the image after the rotation correction, and FIG. 7b is the image after the white border is removed;
FIG. 8 is a schematic diagram of calibration labeling after angle detection by Hough transform algorithm;
FIG. 9 is a schematic diagram of calibration labeling after detecting the angle by the minimum circumscribed rectangle method;
FIG. 10 is a schematic diagram of calibration labeling after detecting angles based on an improved Harris corner detection algorithm;
FIG. 11 is a table of angle deviation data statistics for three algorithms;
FIG. 12 is a statistical table of operational time data for three algorithms;
FIG. 13 is a flow chart of a method for rapidly correcting the inclination of the pin image of the QFN chip.
Detailed Description
The following examples further illustrate the present invention but are not to be construed as limiting the invention. Modifications and substitutions to methods, procedures, or conditions of the invention may be made without departing from the spirit of the invention.
In order to improve the visual inspection efficiency of QFN package defects, the present embodiment discloses a fast tilt correction method for a QFN chip pin image. Fig. 1a is the original image and serves as the illustrative image of this embodiment; the specific correction process includes the following steps:
(1) preprocessing a chip pin image acquired by an industrial personal computer:
(1.1) image filtering:
To remove noise and reduce image distortion, a 5 × 5 Gaussian filter is convolved with the image (see fig. 1b) to smooth it and reduce the obvious effect of noise on the edge detector. In image processing, a two-dimensional Gaussian function is commonly used for filtering; its formula is as follows:
G(x, y) = A · exp(−(x² + y²) / (2σ²))
where G(x, y) is the two-dimensional Gaussian function, (x, y) are the point coordinates, σ is the standard deviation, and A is a normalization coefficient that makes the weights sum to one.
(1.2) binarization processing:
the image is binarized by using a fixed threshold method (as shown in fig. 1c), and the calculation formula is as follows:
g(x, y) = 255 if f(x, y) ≥ T, and g(x, y) = 0 otherwise
where f(x, y) represents the distribution function of image pixel values, g(x, y) represents the pixel-value distribution function after threshold segmentation, and the fixed threshold T = 145;
(2) extracting the target contour by using a polygon approximation method, which specifically comprises the following steps:
(2.1) carrying out edge detection on the chip pin image:
Canny operator edge detection is used to obtain the edge contour information of the chip pin image, as shown in FIG. 2a;
(2.2) extracting a target contour by using a polygon approximation method, namely extracting a contour of a central bonding pad of a chip pin image:
the outline of the central bonding pad of the chip pin image is extracted by a polygon approximation method (as shown in fig. 2b), and the rest outline parts are filtered, so that the subsequent image processing is simplified. The polygon approximation method is to pick out two farthest points from the target contour and connect the points; then, a point which is farthest away from the line segment is searched from the target contour, the point is added into the approximated new contour, namely, a triangle formed by connecting the three points is used as the contour; and finally, starting from any one side of the triangle, repeating the previous step, adding the point with the farthest distance into the new contour, and continuously iterating until the output precision requirement is met.
(3) An improved Harris corner detection algorithm is provided, and a target contour vertex is obtained, and the method specifically comprises the following steps:
(3.1) determining a selection threshold, and acquiring a target contour corner:
In general, when the difference between the pixel gray levels of two black-and-white images is less than 10%–15% of the maximum pixel gray value, the human eye can hardly distinguish them, so a threshold N is selected to extract the corner points of the image target contour (as shown in fig. 3), with the formula:
N=255×12%≈30
(3.2) extracting corner points of the target contour:
All corner points are extracted and stored, and three of them, M_(a−n), M_a and M_(a+n), are read in sequence to form a template of three elements, with M_a as the template center and the subscript of a point denoting its serial number among all corner points. All corner points are traversed; starting from the initial value, M_a is taken as the real-time operating point and the two points whose serial numbers differ from it by n are taken. M_a forms two edges with the other two points of the template, and the included angle between these two edges is taken as the corner response value of point M_a. The distance between point M_a and point M_(a−n) determines edge L1, the distance between point M_a and point M_(a+n) determines edge L2, and the distance between point M_(a−n) and point M_(a+n) determines edge L3. The angle at point M_a is then obtained from the three edges by the law of cosines, calculated as follows:
θ_a = arccos((L1² + L2² − L3²) / (2 · L1 · L2))
and judging whether the angular point is reserved or not according to an included angle formed by the two edges, namely an angular point response value. Let g be the angular response value, if point MaIf the number is more than or equal to g, the corner points are saved as the needed corner points; otherwise, removing.
(3.3) eliminating adjacent angular points, and keeping contour vertexes:
After the contour corner points are extracted, other corner points may still exist around them. To eliminate this, adjacent corner points are removed and the remaining points are taken as the contour corner vertices. Assume the image height is H and the image width is W, let Corner(x, y) indicate whether there is a corner at image position (x, y), with Corner(x, y) = 1 meaning that (x, y) is a corner, and let m×m (m > 1) be the size of the matrix centered on (x, y); then:
count = Σ Corner(i, j), with (i, j) ranging over the m×m matrix centered on (x, y)
where m ≤ x ≤ H, m ≤ y ≤ W, and count is the number of corner points within the matrix range centered on (x, y); all neighboring corner points within the matrix range centered on (x, y) are removed, and the corner points remaining at the corners of the target contour are kept as vertices (as shown in FIG. 4), to facilitate the subsequent image correction.
(4) Fitting a straight line by a least square method:
and performing linear fitting on two vertexes of the longest edge of the target contour by using a least square method to serve as the angle identification direction of the chip pin image, as shown in fig. 5.
(5) Quickly correcting the chip according to the image centroid and removing the white border, which specifically includes:
(5.1) taking the centroid of the image as the rotation center:
To correct the image accurately, the center position of the image is determined by the centroid method, and the centroid coordinate point O is used as the rotation center of the image.
(5.2) obtaining an image inclination angle, correcting a chip pin image:
when the QFN chip is packaged into a carrier tape, angular deviation which is difficult to distinguish by naked eyes exists, in order to calculate a deviation angle, an improved Harris corner detection algorithm is proposed, a polygonal approximation contour is combined, a least square method is utilized to perform straight line fitting on two vertexes of a longest side, at the moment, a chip inclination angle α and a chip horizontal direction deviation value x have a right triangle relationship, as shown in fig. 6, the chip is corrected according to an inclination angle α (as shown in fig. 7a), and white edges existing in a chip pin image after rotation correction are removed (as shown in fig. 7 b).
In the step (4), two vertex coordinates are respectively set as p (x, y) and q (x, y), and the least square method fitting straight line calculation formula is as follows:
y=ax+b
a and b are the slope and intercept, respectively, of the linear equation, then:
a = Σ(x_i − x̄)(y_i − ȳ) / Σ(x_i − x̄)²,  b = ȳ − a·x̄
where x̄ and ȳ are the means of the abscissas and ordinates of the vertices p and q, respectively:
x̄ = (x_p + x_q)/2,  ȳ = (y_p + y_q)/2
in the step (5.1), the upper left corner of the image is set as the start point coordinate (0, 0), the lower right corner is set as the end point coordinate (m, n), and the centroid formula of the image is as follows:
x_0 = ΣΣ x·f(x, y) / ΣΣ f(x, y),  y_0 = ΣΣ y·f(x, y) / ΣΣ f(x, y)  (the sums running over all image pixels)
where (x_0, y_0) are the centroid coordinates, m and n are respectively the number of rows and columns of the image (both integers greater than or equal to 2), and f(x, y) is the gray value of the image at point (x, y).
In step (5.2), the chip image inclination angle α is obtained from the following calculation formula:
tan α = l / x
The horizontal offset x of the pin image can be obtained as follows:
x = p_x − h_x
The pin image tilt angle α is then:
α = arctan(l / x)
where l is the vertical distance from point p to point O, OA is the extension of the centroid coordinate in the positive x-axis direction, and h(x, y) is the coordinate of the intersection of pq and OA.
Fig. 13 is a flowchart of the fast tilt correction method for QFN chip pin images. The method is experimentally verified and compared as follows:
(1) comparison of the correction accuracy of the correction method disclosed in this embodiment with the traditional Hough transform and the minimum circumscribed rectangle method:
In this embodiment, the experiments were run on a computer with 4 GB of memory and an AMD A10-7300 processor with Radeon R6 graphics (10 Compute Cores 4C+6G) at 1.9 GHz, and the Visual Studio version is 2013. Ten different QFN chip pin images were selected as experimental objects, the program was run in the same environment, and the correction method disclosed in this embodiment was compared with the traditional Hough transform and the minimum circumscribed rectangle method. Figs. 8, 9 and 10 are schematic diagrams of the calibration labeling after angle detection by the three algorithms. Angle detection and verification are carried out on the corrected central pad of the chip: the gray frame line is the minimum circumscribed rectangle of the pentagonal part of the chip obtained by the different algorithms, and the black frame line is the ideal, manually marked circumscribed rectangle of the pentagonal part of the chip. It is easy to see that a certain angular deviation exists in figs. 8 and 9. The schematic diagram of the circumscribed rectangle marked after correction by the method disclosed in this embodiment is shown in fig. 10; the marked gray frame line nearly coincides with the manually marked black frame line, so the tilt angle obtained by the correction method disclosed in this embodiment is more accurate. Fig. 11 lists the chip tilt angles detected by the three algorithms.
(2) comparison of the correction time of the correction method disclosed in this embodiment with the traditional Hough transform and the minimum circumscribed rectangle method:
In order to measure the difference in running time between the correction method disclosed in this embodiment and the traditional Hough transform and the minimum circumscribed rectangle method, the running times on 10 different QFN chip pin images were compared experimentally. Fig. 12 lists the running times of the three algorithms. Taking image number 5 as an example, the traditional Hough transform takes 357 ms and the minimum circumscribed rectangle method takes 116 ms for correction, while the correction method disclosed in this embodiment completes the chip image correction in only 18 ms. The running time of the correction method disclosed in this embodiment is only 1/34 that of the traditional Hough transform and only 1/10 that of the minimum circumscribed rectangle method. Therefore, the fast tilt correction method for QFN chip images proposed in this embodiment is not only highly accurate but also significantly reduces the running time, achieving higher computational efficiency.
In summary, the fast tilt correction method for QFN chip pin images provided by this embodiment of the invention is faster and more efficient than conventional algorithms. It can be used in the correction stage of QFN chip inspection to provide clear and accurate images for chip tilt correction and chip appearance defect detection, improving the efficiency of visual inspection of QFN package defects.
The foregoing illustrates and describes the principles, general features, and advantages of the present invention. However, the above description is only an example of the present invention, the technical features of the present invention are not limited thereto, and any other embodiments that can be obtained by those skilled in the art without departing from the technical solution of the present invention should be covered by the claims of the present invention.

Claims (4)

1. A method for fast tilt correction of a QFN chip pin image, characterized by comprising the following steps:
(1) preprocessing the chip pin image acquired by an industrial personal computer:
1.1) image filtering: to remove noise and reduce image distortion, a 5×5 Gaussian filter is convolved with the image; in image processing, a two-dimensional Gaussian function is used for filtering, calculated as:
G(x, y) = A · exp(−(x² + y²) / (2σ²))
where G(x, y) is the two-dimensional Gaussian function, (x, y) are the point coordinates, σ is the standard deviation, and A is a normalization coefficient that makes the weights sum to one;
1.2) binarization processing: the image is binarized with a fixed-threshold method, calculated as:
g(x, y) = 255 if f(x, y) ≥ T, and g(x, y) = 0 otherwise
where f(x, y) is the distribution function of image pixel values, g(x, y) is the pixel-value distribution function after threshold segmentation, and the fixed threshold T = 145;
(2) extracting the target contour with a polygon approximation method, specifically comprising:
2.1) performing edge detection on the chip pin image: Canny operator edge detection is used to obtain the edge contour information of the chip pin image;
2.2) extracting the target contour with the polygon approximation method: the contour of the central pad of the chip pin image is extracted by polygon approximation and the remaining contour parts are filtered out; the polygon approximation method first picks the two farthest points on the target contour and connects them; then the point on the target contour farthest from this line segment is found and added to the approximated new contour, i.e. the triangle formed by connecting the three points is taken as the contour; finally, starting from any side of the triangle, the previous step is repeated, the farthest point is added to the new contour, and the iteration continues until the required output accuracy is met;
(3) proposing an improved Harris corner detection algorithm and obtaining the target contour vertices, specifically comprising:
3.1) determining the selection threshold and obtaining the target contour corner points: in general, when the difference between the pixel gray levels of two black-and-white images is less than 10%–15% of the maximum pixel gray value, the human eye can hardly distinguish them, so a threshold N is selected to extract the corner points of the image target contour, with the formula:
N = 255 × 12% ≈ 30
3.2) extracting the corner points of the target contour: all corner points are extracted and stored, and three of them, M_(a−n), M_a and M_(a+n), are read in sequence to form a template of three elements, with M_a as the template center and the subscript of a point denoting its serial number among all corner points; all corner points are traversed and, starting from the initial value, M_a is taken as the real-time operating point and the two points whose serial numbers differ from it by n are taken; M_a forms two edges with the other two points of the template, and the included angle between the two edges is taken as the corner response value of point M_a; the distance between point M_a and point M_(a−n) determines edge L1, the distance between point M_a and point M_(a+n) determines edge L2, and the distance between point M_(a−n) and point M_(a+n) determines edge L3; the angle at point M_a is obtained from the three edges by the law of cosines, calculated as:
θ_a = arccos((L1² + L2² − L3²) / (2 · L1 · L2))
whether a corner point is retained is judged by the included angle formed by the two edges, i.e. the corner response value; let g be the corner response value: if the angle at point M_a is greater than or equal to g, the point is saved as a required corner point, otherwise it is removed;
3.3) eliminating adjacent corner points and keeping the contour vertices: after the contour corner points are extracted, other corner points may still exist around them; to eliminate this, adjacent corner points are removed and the remaining points are taken as the contour corner vertices; let the image height be H, the image width be W, Corner(x, y) indicate whether there is a corner at image position (x, y), with Corner(x, y) = 1 meaning that (x, y) is a corner, and m×m (m > 1) be the size of the matrix centered on (x, y); then:
count = Σ Corner(i, j), with (i, j) ranging over the m×m matrix centered on (x, y)
where m ≤ x ≤ H, m ≤ y ≤ W, and count is the number of corner points within the matrix range centered on (x, y); all neighboring corner points within the matrix range centered on (x, y) are removed, and the corner points remaining at the corners of the target contour are kept as vertices, to facilitate the subsequent image correction;
(4) fitting a straight line by the least squares method: the two vertices of the longest side of the target contour are fitted with a straight line by the least squares method, and this line serves as the angle identification direction of the chip pin image;
(5) quickly correcting the chip according to the image centroid and removing the white border, specifically comprising:
5.1) taking the image centroid as the rotation center: to correct the image accurately, the centroid method is used to determine the center position of the image, and the centroid coordinate point O is taken as the rotation center of the image;
5.2) obtaining the image tilt angle and correcting the chip pin image.

2. The method for fast tilt correction of a QFN chip pin image according to claim 1, characterized in that in step 4 the two vertex coordinates are p(x, y) and q(x, y), and the least squares fitted straight line is calculated as:
y = ax + b
where a and b are respectively the slope and intercept of the line, with:
a = Σ(x_i − x̄)(y_i − ȳ) / Σ(x_i − x̄)²,  b = ȳ − a·x̄
where x̄ and ȳ are the means of the abscissas and ordinates of the vertices p and q, respectively: x̄ = (x_p + x_q)/2, ȳ = (y_p + y_q)/2.

3. The method for fast tilt correction of a QFN chip pin image according to claim 2, characterized in that in step 5.1 the upper left corner of the image is set as the starting coordinate (0, 0), the lower right corner as the end coordinate (m, n), and the image centroid is calculated as:
x_0 = ΣΣ x·f(x, y) / ΣΣ f(x, y),  y_0 = ΣΣ y·f(x, y) / ΣΣ f(x, y)  (the sums running over all image pixels)
where (x_0, y_0) are the centroid coordinates, m and n are respectively the number of rows and columns of the image (both integers greater than or equal to 2), and f(x, y) is the gray value of the image at point (x, y).

4. The method for fast tilt correction of a QFN chip pin image according to claim 3, characterized in that in step 5.2 the chip image tilt angle α is obtained from:
tan α = l / x
the horizontal offset x of the pin image is obtained as:
x = p_x − h_x
and the pin image tilt angle α is:
α = arctan(l / x)
where l is the vertical distance from point p to point O, OA is the extension of the centroid coordinate in the positive x-axis direction, and h(x, y) is the coordinate of the intersection of pq and OA.
CN201911182587.1A 2019-11-27 2019-11-27 QFN chip pin image rapid inclination correction method Active CN110992326B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911182587.1A CN110992326B (en) 2019-11-27 2019-11-27 QFN chip pin image rapid inclination correction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911182587.1A CN110992326B (en) 2019-11-27 2019-11-27 QFN chip pin image rapid inclination correction method

Publications (2)

Publication Number Publication Date
CN110992326A true CN110992326A (en) 2020-04-10
CN110992326B CN110992326B (en) 2022-08-09

Family

ID=70087207

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911182587.1A Active CN110992326B (en) 2019-11-27 2019-11-27 QFN chip pin image rapid inclination correction method

Country Status (1)

Country Link
CN (1) CN110992326B (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111537518A (en) * 2020-05-25 2020-08-14 珠海格力智能装备有限公司 Method and device for detecting defects of capacitor terminal, storage medium and processor
CN111754461A (en) * 2020-05-28 2020-10-09 江苏理工学院 Method and device for locating image character area of semiconductor chip
CN111950315A (en) * 2020-10-19 2020-11-17 江苏理工学院 A method, device and storage medium for segmentation and identification of multiple barcode images
CN112733843A (en) * 2020-12-30 2021-04-30 深圳市路远智能装备有限公司 Visual identification method of BGA (ball grid array)
CN113379681A (en) * 2021-05-20 2021-09-10 深圳技术大学 Method and system for obtaining inclination angle of LED chip, electronic device and storage medium
CN113733827A (en) * 2021-10-19 2021-12-03 长沙立中汽车设计开发股份有限公司 Device and method for detecting relative rotation angle between semitrailer trailer and trailer
CN113763279A (en) * 2021-09-10 2021-12-07 厦门理工学院 A Precise Correction Processing Method for Image with Rectangular Frame
CN114229396A (en) * 2022-02-18 2022-03-25 深圳市创新特科技有限公司 Correcting device and correcting method for pick-and-place position of circuit board
CN115308222A (en) * 2022-07-11 2022-11-08 江苏汤谷智能科技有限公司 System and method for identifying bad chip appearance based on machine vision
CN116051389A (en) * 2022-08-10 2023-05-02 荣耀终端有限公司 Calibration image correction method and device and electronic equipment
CN116309325A (en) * 2023-02-08 2023-06-23 深圳市振华兴智能技术有限公司 Patch detection method and system based on deep learning
CN117274246A (en) * 2023-11-17 2023-12-22 深圳市大族封测科技股份有限公司 Bonding pad identification method, computer equipment and storage medium
WO2024016686A1 (en) * 2022-07-18 2024-01-25 宁德时代新能源科技股份有限公司 Corner detection method and apparatus
CN117830336A (en) * 2024-03-04 2024-04-05 福建帝视科技集团有限公司 Polygonal contour detection method and device based on line scanning camera imaging
CN118212210A (en) * 2024-04-02 2024-06-18 常州信息职业技术学院 Point symmetry-based automobile dipped beam type feature point detection method and system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104359402A (en) * 2014-11-17 2015-02-18 南京工业大学 Detection method for rectangular pin element visual positioning
CN109785316A (en) * 2019-01-22 2019-05-21 湖南大学 A kind of chip surface defect detection method
CN110933926A (en) * 2019-11-13 2020-03-27 浙江工业大学 An automatic correction method for the angle of the nozzle component of the placement machine based on the corner point detection

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104359402A (en) * 2014-11-17 2015-02-18 南京工业大学 Detection method for rectangular pin element visual positioning
CN109785316A (en) * 2019-01-22 2019-05-21 湖南大学 A kind of chip surface defect detection method
CN110933926A (en) * 2019-11-13 2020-03-27 浙江工业大学 An automatic correction method for the angle of the nozzle component of the placement machine based on the corner point detection

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Sun Yanli et al., "Performance analysis of SIFT in automatic registration of high-resolution SAR images", Electronic Design Engineering *
Jin Bin et al., "A new SAR image registration algorithm based on local invariant features", Journal of Harbin Engineering University *

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111537518B (en) * 2020-05-25 2024-05-28 珠海格力智能装备有限公司 Method and device for detecting flaws of capacitor terminal, storage medium and processor
CN111537518A (en) * 2020-05-25 2020-08-14 珠海格力智能装备有限公司 Method and device for detecting defects of capacitor terminal, storage medium and processor
CN111754461A (en) * 2020-05-28 2020-10-09 江苏理工学院 Method and device for locating image character area of semiconductor chip
CN111754461B (en) * 2020-05-28 2024-03-01 江苏理工学院 Method and device for positioning image character area of semiconductor chip
CN111950315B (en) * 2020-10-19 2023-11-07 江苏理工学院 Method, device and storage medium for segmenting and identifying multiple bar code images
CN111950315A (en) * 2020-10-19 2020-11-17 江苏理工学院 A method, device and storage medium for segmentation and identification of multiple barcode images
CN112733843A (en) * 2020-12-30 2021-04-30 深圳市路远智能装备有限公司 Visual identification method of BGA (ball grid array)
CN113379681A (en) * 2021-05-20 2021-09-10 深圳技术大学 Method and system for obtaining inclination angle of LED chip, electronic device and storage medium
CN113763279A (en) * 2021-09-10 2021-12-07 厦门理工学院 A Precise Correction Processing Method for Image with Rectangular Frame
CN113733827A (en) * 2021-10-19 2021-12-03 长沙立中汽车设计开发股份有限公司 Device and method for detecting relative rotation angle between semitrailer trailer and trailer
CN114229396A (en) * 2022-02-18 2022-03-25 深圳市创新特科技有限公司 Correcting device and correcting method for pick-and-place position of circuit board
CN114229396B (en) * 2022-02-18 2022-05-13 深圳市创新特科技有限公司 Correcting device and correcting method for taking and placing positions of circuit board
CN115308222B (en) * 2022-07-11 2024-02-09 江苏汤谷智能科技有限公司 System and method for identifying poor chip appearance based on machine vision
CN115308222A (en) * 2022-07-11 2022-11-08 江苏汤谷智能科技有限公司 System and method for identifying bad chip appearance based on machine vision
WO2024016686A1 (en) * 2022-07-18 2024-01-25 宁德时代新能源科技股份有限公司 Corner detection method and apparatus
CN116051389A (en) * 2022-08-10 2023-05-02 荣耀终端有限公司 Calibration image correction method and device and electronic equipment
CN116309325A (en) * 2023-02-08 2023-06-23 深圳市振华兴智能技术有限公司 Patch detection method and system based on deep learning
CN117274246A (en) * 2023-11-17 2023-12-22 深圳市大族封测科技股份有限公司 Bonding pad identification method, computer equipment and storage medium
CN117274246B (en) * 2023-11-17 2024-02-20 深圳市大族封测科技股份有限公司 Bonding pad identification method, computer equipment and storage medium
CN117830336A (en) * 2024-03-04 2024-04-05 福建帝视科技集团有限公司 Polygonal contour detection method and device based on line scanning camera imaging
CN118212210A (en) * 2024-04-02 2024-06-18 常州信息职业技术学院 Point symmetry-based automobile dipped beam type feature point detection method and system
CN118212210B (en) * 2024-04-02 2024-12-20 常州信息职业技术学院 Vehicle low beam light pattern feature point detection method and system based on point symmetry

Also Published As

Publication number Publication date
CN110992326B (en) 2022-08-09

Similar Documents

Publication Publication Date Title
CN110992326A (en) A fast tilt correction method for QFN chip pin image
CN111612781B (en) Screen defect detection method and device and head-mounted display equipment
CN103914827B (en) The visible detection method of weather strip for automobile profile defects
CN106446894B (en) A method of based on outline identification ball-type target object location
CN115063421B (en) Pole piece region detection method, system and device, medium and defect detection method
CN114529459B (en) Method, system and medium for enhancing image edge
CN107045634B (en) Text positioning method based on maximum stable extremum region and stroke width
CN110276750A (en) A Method for Extracting Straight Line Length of Wafer with Arbitrary Tilt Angle and Isolating Grain Area
CN105488492B (en) A color image preprocessing method, road recognition method and related device
CN115294099B (en) Method and system for detecting hairline defect in steel plate rolling process
CN110647882A (en) Image correction method, device, equipment and storage medium
CN106709500B (en) Image feature matching method
CN105260694B (en) A kind of two-dimension code area localization method based on multistage key extraction with analysis
CN113808131B (en) Method, system, device and medium for identifying connector defects
CN108596925A (en) The heronsbill module surface screw hole site image processing method of view-based access control model
CN115409787A (en) Detection method for base defect of small pluggable transceiver optical module
CN117152165A (en) Photosensitive chip defect detection method and device, storage medium and electronic equipment
CN110288619A (en) Detection method of screw hole position on the surface of sunflower module based on vision
CN112419207A (en) Image correction method, device and system
CN114049380B (en) Target object positioning and tracking method, device, computer equipment and storage medium
CN118520893A (en) Method, device and storage medium for identifying bar code label applied to AOI
WO2024016686A1 (en) Corner detection method and apparatus
CN109934817B (en) A detection method for fruit body external contour deformity
CN112923852B (en) SD card pose detection method based on dynamic corner positioning
CN116434071B (en) Determination method, determination device, equipment and medium for normalized building mask

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant