CN110992326B - QFN chip pin image rapid inclination correction method - Google Patents


Info

Publication number
CN110992326B
Authority
CN
China
Prior art keywords
image
point
corner
points
contour
Prior art date
Legal status
Active
Application number
CN201911182587.1A
Other languages
Chinese (zh)
Other versions
CN110992326A (en)
Inventor
巢渊
周伟
刘文汇
唐寒冰
李龑
李兴成
Current Assignee
Jiangsu University of Technology
Original Assignee
Jiangsu University of Technology
Priority date
Filing date
Publication date
Application filed by Jiangsu University of Technology filed Critical Jiangsu University of Technology
Priority to CN201911182587.1A priority Critical patent/CN110992326B/en
Publication of CN110992326A publication Critical patent/CN110992326A/en
Application granted granted Critical
Publication of CN110992326B publication Critical patent/CN110992326B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/24Aligning, centring, orientation detection or correction of the image
    • G06V10/242Aligning, centring, orientation detection or correction of the image by image rotation, e.g. by 90 degrees
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/30Noise filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20112Image segmentation details
    • G06T2207/20164Salient point detection; Corner detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30164Workpiece; Machine component

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a rapid inclination correction method for QFN chip pin images, comprising the following steps: (1) acquiring a QFN chip pin image and performing filtering and binarization; (2) extracting the contour of the central bonding pad of the chip image by a polygon approximation method; (3) proposing an improved Harris corner detection algorithm and obtaining the contour vertices; (4) fitting a straight line through the two farthest vertices by the least squares method and taking the line as the angle identification direction; (5) quickly correcting the chip pin image and removing the white border, with the image centroid coordinate as the rotation center. The correction method provides a theoretical basis for correcting QFN chips more quickly and accurately, and improves the efficiency of visual inspection of QFN packaging defects.

Description

QFN chip pin image rapid inclination correction method
Technical Field
The invention belongs to the field of image processing algorithm design, and provides a method for rapidly correcting the inclination of QFN (quad flat no-lead) chip pin images by improving the Harris corner detection algorithm and combining it with a polygon approximation method.
Background
QFN (Quad Flat No-lead Package) is a leadless package, square or rectangular in shape, that conducts heat through the central bonding pad on the bottom of the package; conductive pads surrounding this central pad provide the electrical connections. Because QFN chips have a certain dimensional error in production, a margin is left when the tray pocket is manufactured, so a chip may sit tilted when placed into the pocket, which affects visual inspection of the packaging quality of the chip.
Meanwhile, because the chip is small, low-accuracy manual inspection and identification cannot meet the requirements of current chip production, and an effective technology that can quickly and accurately correct the chip image is urgently needed. Prior research on QFN chips focuses mainly on structural improvement, manufacturing processes, appearance inspection and defect detection; no documents or patents concerning rapid inclination correction of QFN chip images have been found. Designing a method for rapidly correcting the inclination of QFN chip pin images therefore fills this gap and is particularly important for improving the efficiency of visual inspection of QFN packaging defects.
Disclosure of Invention
Aiming at the problems in the prior art, the invention proposes an improved Harris corner detection algorithm and, combined with a polygon approximation method, designs a rapid inclination correction method for QFN chip pin images that is faster and markedly more efficient than traditional algorithms.
The technical scheme of the invention is as follows: a method for fast inclination correction of QFN chip pin images comprises the following steps:
1. preprocessing a chip pin image acquired by an industrial personal computer:
1.1 image filtering:
in order to remove noise and reduce image distortion, a 5 × 5 Gaussian filter is convolved with the image, smoothing it and reducing the influence of obvious noise on the edge detector; in image processing a two-dimensional Gaussian function is commonly used for filtering, calculated as follows:
G(x, y) = A·exp(−(x² + y²) / (2σ²))
wherein G (x, y) is a two-dimensional Gaussian function, (x, y) is a point coordinate, sigma is a standard deviation, A is a normalization coefficient, and the sum of different weights is one;
1.2. and (3) binarization processing:
carrying out binarization processing on the image by adopting a fixed threshold method, wherein a calculation formula is as follows:
g(x, y) = 255, if f(x, y) ≥ T;  g(x, y) = 0, if f(x, y) < T
wherein f (x, y) represents a distribution function of image pixel values, g (x, y) represents a distribution function of pixel values after threshold segmentation, and a fixed threshold T is 145;
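For illustration, a minimal sketch of this preprocessing step, assuming OpenCV is available (the file name and the σ handling are illustrative assumptions, not part of the original disclosure):

    import cv2

    # Read the chip pin image as a gray-level image (file name is illustrative).
    img = cv2.imread("qfn_chip.png", cv2.IMREAD_GRAYSCALE)

    # 5 x 5 Gaussian filtering; passing 0 lets OpenCV derive sigma from the kernel size.
    smoothed = cv2.GaussianBlur(img, (5, 5), 0)

    # Fixed-threshold binarization with T = 145, as described above.
    _, binary = cv2.threshold(smoothed, 145, 255, cv2.THRESH_BINARY)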
2. extracting the target contour by using a polygon approximation method, which specifically comprises the following steps:
2.1 edge detection is carried out on the chip pin image:
adopting Canny operator edge detection to obtain edge contour information of a chip pin image;
2.2, extracting a target contour by using a polygon approximation method, namely the contour of a central bonding pad of a chip pin image:
The contour of the central bonding pad of the chip pin image is extracted by a polygon approximation method, and the remaining contour parts are filtered out, which simplifies the subsequent image processing. The polygon approximation first picks the two farthest points of the target contour and connects them; then the point of the target contour farthest from this line segment is found and added to the approximated new contour, so that the triangle formed by connecting the three points serves as the contour; finally, starting from any side of the triangle, the previous step is repeated, adding the farthest point to the new contour and iterating until the required output precision is met;
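A minimal sketch of this step, assuming OpenCV 4.x; cv2.approxPolyDP performs a Douglas-Peucker style polygon approximation, and the Canny thresholds and the epsilon value are illustrative assumptions:

    import cv2

    # 'binary' is the binarized image from the preprocessing step above.
    edges = cv2.Canny(binary, 50, 150)

    # Take the largest external contour as the central-pad (target) contour.
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    target = max(contours, key=cv2.contourArea)

    # Polygon approximation: epsilon sets the output precision of the iteration.
    epsilon = 0.01 * cv2.arcLength(target, True)
    polygon = cv2.approxPolyDP(target, epsilon, True)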
3. an improved Harris corner detection algorithm is provided, and a target contour vertex is obtained, and the method specifically comprises the following steps:
3.1. determining a selection threshold, and acquiring a target contour corner:
in general, when the difference between the gray levels of two pixels in a black-and-white image is less than 10%–15% of the maximum gray level, the human eye can hardly distinguish them; therefore a threshold N is selected to extract the corner points of the target contour of the image, with the formula:
N=255×12%≈30
3.2. extracting corner points of the target contour:
All corner points are extracted and stored. Three corner points M_(a−n), M_a and M_(a+n) are read in sequence to form a template of three elements, with M_a as the template center; the subscript of a point is its serial number among all corner points. All corner points are traversed: starting from the initial value, M_a is taken as the current operating point and the two points whose serial numbers are n before and n after it are selected. M_a forms two edges with the other two points of the template, and the included angle between these two edges is taken as the corner response value of point M_a. The distance between point M_a and point M_(a−n) determines edge L_1, the distance between point M_a and point M_(a+n) determines edge L_2, and the distance between point M_(a−n) and point M_(a+n) determines edge L_3. From the three edges, the angle at point M_a is obtained by the cosine theorem, calculated as follows:
∠M_a = arccos((L_1² + L_2² − L_3²) / (2·L_1·L_2))
Whether a corner point is retained is judged by the included angle formed by the two edges, i.e. the corner response value. Let g be the corner response threshold: if the corner response value of point M_a is greater than or equal to g, the point is saved as a required corner point; otherwise it is removed;
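A minimal sketch of the corner response computation, assuming the candidate corner points are stored in contour order in a NumPy array; the wrap-around indexing and the degree-valued threshold g are illustrative assumptions:

    import numpy as np

    def filter_corners(points, n, g):
        """Keep point M_a when the angle formed at M_a by the template
        (M_(a-n), M_a, M_(a+n)) is at least the response threshold g (degrees)."""
        kept = []
        num = len(points)
        for a in range(num):
            m_prev, m_cur, m_next = points[(a - n) % num], points[a], points[(a + n) % num]
            l1 = np.linalg.norm(m_cur - m_prev)    # edge L1
            l2 = np.linalg.norm(m_cur - m_next)    # edge L2
            l3 = np.linalg.norm(m_prev - m_next)   # edge L3
            # Cosine theorem gives the included angle at M_a.
            cos_a = (l1 ** 2 + l2 ** 2 - l3 ** 2) / (2 * l1 * l2 + 1e-12)
            angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
            if angle >= g:                         # retain per the rule above
                kept.append(m_cur)
        return np.asarray(kept)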
3.3. eliminating adjacent angular points, and keeping contour vertexes:
After the contour corner points are extracted, other corner points may still exist around them. To eliminate this, adjacent corner points are removed and the remaining points are taken as the contour vertices. Let the image height be H and the image width be W, and let Corner(x, y) indicate whether there is a corner point at image position (x, y): when Corner(x, y) = 1, there is a corner point at (x, y). Let m × m (m > 1) be the size of the window centered on (x, y); then:
count = Σ Corner(i, j), summed over all points (i, j) in the m × m window centered on (x, y)
wherein m ≤ x ≤ H and m ≤ y ≤ W, and count is the number of corner points within the m × m window centered on (x, y); all adjacent corner points within the window centered on (x, y) are removed, and the remaining corner points at the corners of the target contour are kept as vertices to facilitate the subsequent image correction;
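A minimal sketch of the adjacent-corner elimination, assuming the detected corners are given as an H × W binary map; the tie-breaking rule (keep the first corner encountered in each window) is an illustrative assumption:

    import numpy as np

    def suppress_adjacent(corner_map, m):
        """Within every m x m window (m > 1) centered on a detected corner,
        keep a single corner so that each contour corner yields one vertex."""
        h, w = corner_map.shape
        out = corner_map.copy()
        r = m // 2
        for y in range(h):
            for x in range(w):
                if out[y, x] == 1:
                    ys, ye = max(0, y - r), min(h, y + r + 1)
                    xs, xe = max(0, x - r), min(w, x + r + 1)
                    if out[ys:ye, xs:xe].sum() > 1:   # the 'count' from the formula above
                        out[ys:ye, xs:xe] = 0
                        out[y, x] = 1                  # restore the center point
        return out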
4. fitting a straight line by a least square method:
performing linear fitting on two vertexes of the longest edge of the target contour by using a least square method to serve as angle identification directions of the chip pin image;
5. quickly correcting the chip image based on the image centroid and removing the white border, specifically comprising:
5.1. taking the centroid of the image as the rotation center:
in order to accurately correct the image, the center position of the image is determined by using a centroid method, and a centroid coordinate O point is used as the rotation center of the image;
5.2. and acquiring an image inclination angle and correcting the chip pin image.
Further, in step 4, the coordinates of the two vertices are respectively set as p(x_p, y_p) and q(x_q, y_q), and the least squares straight-line fit is calculated as follows:
y=ax+b
a and b are the slope and intercept, respectively, of the linear equation, then:
a = Σ(x_i − x̄)(y_i − ȳ) / Σ(x_i − x̄)²,   b = ȳ − a·x̄
wherein the sums are taken over the two vertices p and q, and x̄ and ȳ are respectively the mean values of their abscissas and ordinates: x̄ = (x_p + x_q)/2, ȳ = (y_p + y_q)/2.
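For illustration, a minimal sketch of the straight-line fit, assuming NumPy; with only the two vertices p and q the least-squares fit reduces to the line through them, which matches the angle-identification use described above (the function name is illustrative):

    import numpy as np

    def fit_direction(p, q):
        """Fit y = a*x + b through the two farthest vertices of the longest edge.
        Assumes the edge is not exactly vertical (x_p != x_q)."""
        xs = np.array([p[0], q[0]], dtype=float)
        ys = np.array([p[1], q[1]], dtype=float)
        a, b = np.polyfit(xs, ys, 1)   # slope a, intercept b
        return a, b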
further, in step 5.1, the upper left corner of the image is set as the start point coordinate (0, 0), the lower right corner is set as the end point coordinate (m, n), and the centroid formula of the image is as follows:
x_0 = ΣΣ x·f(x, y) / ΣΣ f(x, y),   y_0 = ΣΣ y·f(x, y) / ΣΣ f(x, y), where the sums run over all pixels from (0, 0) to (m, n)
wherein (x_0, y_0) is the centroid coordinate, m and n are respectively the numbers of rows and columns of the image (both integers greater than or equal to 2), and f(x, y) is the gray value of the image at point (x, y).
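For illustration, a minimal sketch of the centroid calculation, assuming NumPy and a gray-level image array f(x, y) (the function name is illustrative):

    import numpy as np

    def gray_centroid(gray):
        """Return the gray-value-weighted centroid (x0, y0), used as rotation center."""
        g = gray.astype(np.float64)
        h, w = g.shape
        ys, xs = np.mgrid[0:h, 0:w]
        total = g.sum()
        x0 = (xs * g).sum() / total
        y0 = (ys * g).sum() / total
        return x0, y0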
Further, in step 5.2, the chip image tilt angle α can be obtained by the following calculation formula:
tan α = l / x
the horizontal offset x of the pin image can be obtained as follows:
x = |p_x − h_x|
the pin image tilt angle α is:
α = arctan(l / x)
wherein l is the vertical distance from point p to point O, OA is the ray from the centroid O in the positive x-axis direction, and h(x, y) is the intersection point of line pq with OA.
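A minimal sketch of the final correction step, assuming OpenCV and a gray-level input; the rotation sign convention and the non-white bounding-box crop used to remove the white border are illustrative assumptions, not taken from the original text:

    import cv2
    import numpy as np

    def correct_tilt(gray, p, q, centroid):
        """Rotate the chip image about the centroid by the tilt angle of line pq,
        then crop the white border introduced by the rotation."""
        # Tilt angle between the fitted line pq and the horizontal direction OA.
        alpha = np.degrees(np.arctan2(q[1] - p[1], q[0] - p[0]))

        h, w = gray.shape
        center = (float(centroid[0]), float(centroid[1]))
        rot = cv2.getRotationMatrix2D(center, alpha, 1.0)
        corrected = cv2.warpAffine(gray, rot, (w, h), borderValue=255)

        # Remove the white border: keep the bounding box of non-white pixels.
        ys, xs = np.where(corrected < 250)
        return corrected[ys.min():ys.max() + 1, xs.min():xs.max() + 1]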
The invention has the beneficial effects that:
the quick inclination correction method for the QFN chip pin image, disclosed by the invention, provides a certain theoretical basis for correcting the QFN chip more quickly and accurately, and improves the visual detection efficiency of QFN package defects.
Drawings
FIG. 1a is a QFN chip original image, FIG. 1b is a Gaussian filter image, and FIG. 1c is a binary image;
FIG. 2a is a Canny edge detection image, and FIG. 2b is a polygon approximation contour image;
FIG. 3 is a corner image acquired from a threshold;
FIG. 4 is a diagram illustrating the extraction of the vertices of the contour of an image object;
FIG. 5 is a schematic diagram of the straight line fitted by the least squares method through the two vertices of the longest contour edge;
FIG. 6 is a schematic view of angular misalignment;
FIG. 7a is the image after the rotation correction, and FIG. 7b is the image after the white border is removed;
FIG. 8 is a schematic diagram of calibration labeling after angle detection by Hough transform algorithm;
FIG. 9 is a schematic diagram of calibration labeling after angle detection by the minimum circumscribed rectangle method;
FIG. 10 is a schematic diagram of calibration labeling after detecting angles based on an improved Harris corner detection algorithm;
FIG. 11 is a table of angle deviation data statistics for three algorithms;
FIG. 12 is a statistical table of operational time data for three algorithms;
FIG. 13 is a flow chart of a method for rapidly correcting the inclination of the pin image of the QFN chip.
Detailed Description
The following examples further illustrate the present invention but are not to be construed as limiting the invention. Modifications and substitutions to methods, procedures, or conditions of the invention may be made without departing from the spirit of the invention.
In order to improve the efficiency of visual inspection of QFN packaging defects, this embodiment discloses a method for rapid inclination correction of QFN chip pin images. FIG. 1a is the original image and is used as the illustrative image of this embodiment; the specific correction process comprises the following steps:
(1) preprocessing a chip pin image acquired by an industrial personal computer:
(1.1) image filtering:
To remove noise and reduce image distortion, a 5 × 5 Gaussian filter is convolved with the image (see FIG. 1b), smoothing it and reducing the influence of obvious noise on the edge detector. In image processing a two-dimensional Gaussian function is commonly used for filtering, calculated as follows:
G(x, y) = A·exp(−(x² + y²) / (2σ²))
wherein G (x, y) is a two-dimensional Gaussian function, (x, y) is a point coordinate, sigma is a standard deviation, and A is a normalization coefficient, so that the sum of different weights is one.
(1.2) binarization processing:
the image is binarized by using a fixed threshold method (as shown in fig. 1c), and the calculation formula is as follows:
g(x, y) = 255, if f(x, y) ≥ T;  g(x, y) = 0, if f(x, y) < T
where f (x, y) represents a distribution function of image pixel values, g (x, y) represents a distribution function of pixel values after threshold segmentation, and a fixed threshold T is 145;
(2) extracting the target contour by using a polygon approximation method, which specifically comprises the following steps:
(2.1) carrying out edge detection on the chip pin image:
adopting Canny operator edge detection to obtain edge contour information of the chip pin image, as shown in FIG. 2 a;
(2.2) extracting a target contour by using a polygon approximation method, namely extracting a contour of a central bonding pad of a chip pin image:
the outline of the central bonding pad of the chip pin image is extracted by a polygon approximation method (as shown in fig. 2b), and the rest outline parts are filtered, so that the subsequent image processing is simplified. The polygon approximation method is to pick out two farthest points from the target contour and connect the points; then, a point which is farthest away from the line segment is searched from the target contour, the point is added into the approximated new contour, namely, a triangle formed by connecting the three points is used as the contour; and finally, starting from any one side of the triangle, repeating the previous step, adding the point with the farthest distance into the new contour, and continuously iterating until the output precision requirement is met.
(3) An improved Harris corner detection algorithm is provided, and a target contour vertex is obtained, and the method specifically comprises the following steps:
(3.1) determining a selection threshold, and acquiring a target contour corner:
in general, when the difference between the gray levels of two pixels in a black-and-white image is less than 10%–15% of the maximum gray level, the human eye can hardly distinguish them; therefore a threshold N is selected to extract the corner points of the target contour of the image (as shown in FIG. 3), with the formula:
N=255×12%≈30
(3.2) extracting corner points of the target contour:
All corner points are extracted and stored. Three corner points M_(a−n), M_a and M_(a+n) are read in sequence to form a template of three elements, with M_a as the template center; the subscript of a point is its serial number among all corner points. All corner points are traversed: starting from the initial value, M_a is taken as the current operating point and the two points whose serial numbers are n before and n after it are selected. M_a forms two edges with the other two points of the template, and the included angle between these two edges is taken as the corner response value of point M_a. The distance between point M_a and point M_(a−n) determines edge L_1, the distance between point M_a and point M_(a+n) determines edge L_2, and the distance between point M_(a−n) and point M_(a+n) determines edge L_3. From the three edges, the angle at point M_a is obtained by the cosine theorem, calculated as follows:
∠M_a = arccos((L_1² + L_2² − L_3²) / (2·L_1·L_2))
Whether a corner point is retained is judged by the included angle formed by the two edges, i.e. the corner response value. Let g be the corner response threshold: if the corner response value of point M_a is greater than or equal to g, the point is saved as a required corner point; otherwise it is removed.
(3.3) eliminating adjacent angular points, and keeping contour vertexes:
After the contour corner points are extracted, other corner points may still exist around them. To eliminate this, adjacent corner points are removed and the remaining points are taken as the contour vertices. Let the image height be H and the image width be W, and let Corner(x, y) indicate whether there is a corner point at image position (x, y): when Corner(x, y) = 1, there is a corner point at (x, y). Let m × m (m > 1) be the size of the window centered on (x, y); then:
count = Σ Corner(i, j), summed over all points (i, j) in the m × m window centered on (x, y)
wherein m ≤ x ≤ H and m ≤ y ≤ W, and count is the number of corner points within the m × m window centered on (x, y); all adjacent corner points within the window centered on (x, y) are removed, and the remaining corner points at the corners of the target contour are kept as vertices (as shown in FIG. 4) to facilitate the subsequent image correction.
(4) Fitting a straight line by a least square method:
and performing linear fitting on two vertexes of the longest edge of the target contour by using a least square method to serve as the angle identification direction of the chip pin image, as shown in fig. 5.
(5) Quickly correcting the chip image based on the image centroid and removing the white border, specifically comprising:
(5.1) taking the centroid of the image as the rotation center:
for accurately correcting the image, the center position of the image is determined by using a centroid method, and a centroid coordinate O point is used as the rotation center of the image.
(5.2) obtaining an image inclination angle, correcting a chip pin image:
When a QFN chip is packaged into a carrier tape, there is an angular deviation that is difficult to distinguish with the naked eye. To calculate this deviation angle, the Harris corner detection algorithm is improved and combined with the polygon-approximated contour, and a straight line is fitted through the two vertices of the longest side by the least squares method. The chip inclination angle α and the horizontal deviation x then satisfy a right-triangle relationship, as shown in FIG. 6. The chip is corrected according to the inclination angle α (as shown in FIG. 7a), and the white border present in the chip pin image after rotation correction is removed (as shown in FIG. 7b).
In step (4), the coordinates of the two vertices are respectively set as p(x_p, y_p) and q(x_q, y_q), and the least squares straight-line fit is calculated as follows:
y=ax+b
a and b are the slope and intercept, respectively, of the linear equation, then:
a = Σ(x_i − x̄)(y_i − ȳ) / Σ(x_i − x̄)²,   b = ȳ − a·x̄
wherein the sums are taken over the two vertices p and q, and x̄ and ȳ are respectively the mean values of their abscissas and ordinates: x̄ = (x_p + x_q)/2, ȳ = (y_p + y_q)/2.
in the step (5.1), the upper left corner of the image is set as the start point coordinate (0, 0), the lower right corner is set as the end point coordinate (m, n), and the centroid formula of the image is as follows:
x_0 = ΣΣ x·f(x, y) / ΣΣ f(x, y),   y_0 = ΣΣ y·f(x, y) / ΣΣ f(x, y), where the sums run over all pixels from (0, 0) to (m, n)
wherein (x_0, y_0) is the centroid coordinate, m and n are respectively the numbers of rows and columns of the image (both integers greater than or equal to 2), and f(x, y) is the gray value of the image at point (x, y).
In the step (5.2), the chip image inclination angle α can be obtained by the following calculation formula:
tan α = l / x
the horizontal offset x of the pin image can be obtained as follows:
x = |p_x − h_x|
the pin image tilt angle α is:
α = arctan(l / x)
wherein l is the vertical distance from point p to point O, OA is the ray from the centroid O in the positive x-axis direction, and h(x, y) is the intersection point of line pq with OA.
FIG. 13 is a flowchart of the rapid inclination correction method for QFN chip pin images. The method is experimentally verified and compared as follows:
(1) Correction accuracy of the method disclosed in this embodiment compared with the traditional Hough transform and the minimum circumscribed rectangle method:
The experimental environment of this embodiment: 4 GB of memory, an AMD A10-7300 Radeon R6 processor (10 Compute Cores 4C+6G @ 1.9 GHz), and Visual Studio 2013. Ten different QFN chip pin images were selected as experimental objects and the programs were run in the same environment, comparing the correction method disclosed in this embodiment with the traditional Hough transform and the minimum circumscribed rectangle method. FIGS. 8, 9 and 10 are schematic diagrams of the calibration labeling after angle detection by the three algorithms. Angle detection and verification were carried out with the corrected chip central bonding pad; the gray frame line is the minimum circumscribed rectangle of the pentagonal part of the chip obtained by each algorithm, and the black frame line is the ideal circumscribed rectangle of the pentagonal part marked manually. It is easy to see that a certain angular deviation exists in FIGS. 8 and 9. The circumscribed rectangle marked after correction by the method disclosed in this embodiment is shown in FIG. 10: the marked gray frame line approximately coincides with the manually marked black frame line, so the inclination angle obtained by the disclosed correction method is more accurate. FIG. 11 lists the chip inclination angles detected by the three algorithms.
(2) Correction time of the method disclosed in this embodiment compared with the traditional Hough transform and the minimum circumscribed rectangle method:
To measure the difference in running time between the correction method disclosed in this embodiment and the traditional Hough transform and minimum circumscribed rectangle methods, the running times on 10 different QFN chip pin images were compared. FIG. 12 lists the running times of the three algorithms. Taking image number 5 as an example, the traditional Hough transform takes 357 ms and the minimum circumscribed rectangle method takes 116 ms, while the correction method disclosed in this embodiment completes the chip image correction in only 18 ms. The mean time for correcting the 10 chip pin images is 412.6 ms for the Hough transform and 125.8 ms for the minimum circumscribed rectangle method; the running time of the disclosed correction method is only about 1/34 of the former and about 1/10 of the latter. Therefore, the rapid inclination correction method for QFN chip images proposed in this embodiment is not only highly accurate but also significantly reduces the running time and has higher computational efficiency.
In summary, the rapid inclination correction method for QFN chip pin images provided by this embodiment is faster and more efficient than conventional algorithms. It can be used in the correction stage of QFN chip inspection to provide clear and accurate images for chip inclination correction and appearance defect detection, improving the efficiency of visual inspection of QFN packaging defects.
The foregoing illustrates and describes the principles, general features, and advantages of the present invention. However, the above description is only an example of the present invention, the technical features of the present invention are not limited thereto, and any other embodiments that can be obtained by those skilled in the art without departing from the technical solution of the present invention should be covered by the claims of the present invention.

Claims (4)

1. A method for rapidly correcting the inclination of a QFN chip pin image is characterized by comprising the following steps:
(1) preprocessing a chip pin image acquired by an industrial personal computer:
1.1) image filtering:
in order to remove noise and reduce image distortion, a 5 × 5 Gaussian filter is used for convolution with the image, and in image processing a two-dimensional Gaussian function is used for filtering, calculated as follows:
G(x, y) = A·exp(−(x² + y²) / (2σ²))
wherein G (x, y) is a two-dimensional Gaussian function, (x, y) is a point coordinate, sigma is a standard deviation, A is a normalization coefficient, and the sum of different weights is one;
1.2) binarization treatment:
carrying out binarization processing on the image by adopting a fixed threshold method, wherein a calculation formula is as follows:
g(x, y) = 255, if f(x, y) ≥ T;  g(x, y) = 0, if f(x, y) < T
wherein f (x, y) represents a distribution function of image pixel values, g (x, y) represents a distribution function of pixel values after threshold segmentation, and a fixed threshold T is 145;
(2) extracting the target contour by using a polygon approximation method, which specifically comprises the following steps:
2.1) carrying out edge detection on the chip pin image:
adopting Canny operator edge detection to obtain edge contour information of a chip pin image;
2.2) extracting the target contour by a polygon approximation method:
extracting the outline of a central bonding pad of a chip pin image by a polygon approximation method, and filtering out the rest outline parts, wherein the polygon approximation method is to select two farthest points from a target outline for connection; then, a point which is farthest away from the line segment is searched from the target contour, the point is added into the approximated new contour, namely, a triangle formed by connecting the three points is used as the contour; finally, starting from any one side of the triangle, repeating the previous step, adding the point with the farthest distance into the new contour, and continuously iterating until the output precision requirement is met;
(3) an improved Harris corner detection algorithm is provided, and a target contour vertex is obtained, and the method specifically comprises the following steps:
3.1) determining a selection threshold value, and acquiring a target contour corner:
in general, when the difference between the gray levels of two pixels in a black-and-white image is less than 10%–15% of the maximum gray level, the human eye can hardly distinguish them; therefore the threshold N is selected to extract the corner points of the target contour of the image, with the formula:
N=255×12%≈30
3.2) extracting corner points of the target contour:
All corner points are extracted and stored. Three corner points M_(a−n), M_a and M_(a+n) are read in sequence to form a template of three elements, with M_a as the template center; the subscript of a point is its serial number among all corner points. All corner points are traversed: starting from the initial value, M_a is taken as the current operating point and the two points whose serial numbers are n before and n after it are selected. M_a forms two edges with the other two points of the template, and the included angle between these two edges is taken as the corner response value of point M_a. The distance between point M_a and point M_(a−n) determines edge L_1, the distance between point M_a and point M_(a+n) determines edge L_2, and the distance between point M_(a−n) and point M_(a+n) determines edge L_3. From the three edges, the angle at point M_a is obtained by the cosine theorem, calculated as follows:
∠M_a = arccos((L_1² + L_2² − L_3²) / (2·L_1·L_2))
Whether a corner point is retained is judged by the included angle formed by the two edges, i.e. the corner response value. Let g be the corner response threshold: if the corner response value of point M_a is greater than or equal to g, the point is saved as a required corner point; otherwise it is removed;
3.3) eliminating adjacent angular points, and keeping contour vertexes:
After the contour corner points are extracted, other corner points may still exist around them. To eliminate this, adjacent corner points are removed and the remaining points are taken as the contour vertices. Let the image height be H and the image width be W, and let Corner(x, y) indicate whether there is a corner point at image position (x, y): when Corner(x, y) = 1, there is a corner point at (x, y). Let m × m be the size of the window centered on (x, y), where m is greater than 1; then:
count = Σ Corner(i, j), summed over all points (i, j) in the m × m window centered on (x, y)
wherein m ≤ x ≤ H and m ≤ y ≤ W, and count is the number of corner points within the m × m window centered on (x, y); all adjacent corner points within the window centered on (x, y) are removed, and the remaining corner points at the corners of the target contour are kept as vertices to facilitate the subsequent image correction;
(4) fitting a straight line by a least square method:
performing linear fitting on two vertexes of the longest edge of the target contour by using a least square method to serve as angle identification directions of the chip pin image;
(5) quickly correcting the chip image based on the image centroid and removing the white border, specifically comprising:
5.1) taking the centroid of the image as the rotation center:
in order to accurately correct the image, the center position of the image is determined by using a centroid method, and a centroid coordinate O point is used as the rotation center of the image;
and 5.2) acquiring an image inclination angle and correcting the chip pin image.
2. The method as claimed in claim 1, wherein in step 4 the coordinates of the two vertices are respectively defined as p(x_p, y_p) and q(x_q, y_q), and the least squares straight-line fit is calculated as follows:
y=ax+b
a and b are the slope and intercept, respectively, of the linear equation, then:
a = Σ(x_i − x̄)(y_i − ȳ) / Σ(x_i − x̄)²,   b = ȳ − a·x̄
wherein the sums are taken over the two vertices p and q, and x̄ and ȳ are respectively the mean values of their abscissas and ordinates: x̄ = (x_p + x_q)/2, ȳ = (y_p + y_q)/2.
3. The method for rapid inclination correction of QFN chip pin images as claimed in claim 2, wherein in step 5.1 the upper left corner of the image is set as the starting point coordinate (0, 0) and the lower right corner as the end point coordinate (m, n), and the centroid of the image is given by:
x_0 = ΣΣ x·f(x, y) / ΣΣ f(x, y),   y_0 = ΣΣ y·f(x, y) / ΣΣ f(x, y), where the sums run over all pixels from (0, 0) to (m, n)
wherein (x_0, y_0) is the centroid coordinate, m and n are respectively the numbers of rows and columns of the image, m and n are integers greater than or equal to 2, and f(x, y) is the gray value of the image at point (x, y).
4. The method for rapid inclination correction of QFN chip pin images as claimed in claim 3, wherein in step 5.2 the chip image inclination angle α is calculated by the following formula:
tan α = l / x
the horizontal offset x of the pin image can be obtained as follows:
x = |p_x − h_x|
the pin image tilt angle α is:
α = arctan(l / x)
wherein l is the vertical distance from point p to point O, OA is the ray from the centroid O in the positive x-axis direction, and h(x, y) is the intersection point of line pq with OA.
CN201911182587.1A 2019-11-27 2019-11-27 QFN chip pin image rapid inclination correction method Active CN110992326B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911182587.1A CN110992326B (en) 2019-11-27 2019-11-27 QFN chip pin image rapid inclination correction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911182587.1A CN110992326B (en) 2019-11-27 2019-11-27 QFN chip pin image rapid inclination correction method

Publications (2)

Publication Number Publication Date
CN110992326A CN110992326A (en) 2020-04-10
CN110992326B (en) 2022-08-09

Family

ID=70087207

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911182587.1A Active CN110992326B (en) 2019-11-27 2019-11-27 QFN chip pin image rapid inclination correction method

Country Status (1)

Country Link
CN (1) CN110992326B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111537518B (en) * 2020-05-25 2024-05-28 珠海格力智能装备有限公司 Method and device for detecting flaws of capacitor terminal, storage medium and processor
CN111754461B (en) * 2020-05-28 2024-03-01 江苏理工学院 Method and device for positioning image character area of semiconductor chip
CN111950315B (en) * 2020-10-19 2023-11-07 江苏理工学院 Method, device and storage medium for segmenting and identifying multiple bar code images
CN113379681B (en) * 2021-05-20 2022-11-04 深圳技术大学 Method and system for obtaining inclination angle of LED chip, electronic device and storage medium
CN113763279A (en) * 2021-09-10 2021-12-07 厦门理工学院 Accurate correction processing method for image with rectangular frame
CN113733827A (en) * 2021-10-19 2021-12-03 长沙立中汽车设计开发股份有限公司 Device and method for detecting relative rotation angle between semitrailer trailer and trailer
CN114229396B (en) * 2022-02-18 2022-05-13 深圳市创新特科技有限公司 Correcting device and correcting method for taking and placing positions of circuit board
CN115308222B (en) * 2022-07-11 2024-02-09 江苏汤谷智能科技有限公司 System and method for identifying poor chip appearance based on machine vision
CN115830049B (en) * 2022-07-18 2024-08-09 宁德时代新能源科技股份有限公司 Corner detection method and device
CN116051389A (en) * 2022-08-10 2023-05-02 荣耀终端有限公司 Calibration image correction method and device and electronic equipment
CN116309325A (en) * 2023-02-08 2023-06-23 深圳市振华兴智能技术有限公司 Patch detection method and system based on deep learning
CN117274246B (en) * 2023-11-17 2024-02-20 深圳市大族封测科技股份有限公司 Bonding pad identification method, computer equipment and storage medium
CN117830336B (en) * 2024-03-04 2024-06-14 福建帝视科技集团有限公司 Polygonal contour detection method and device based on line scanning camera imaging

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104359402A (en) * 2014-11-17 2015-02-18 南京工业大学 Detection method for rectangular pin element visual positioning
CN109785316A (en) * 2019-01-22 2019-05-21 湖南大学 A kind of apparent defect inspection method of chip

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110933926B (en) * 2019-11-13 2021-04-06 浙江工业大学 Automatic correction method for angle of suction nozzle element of chip mounter based on angular point detection

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104359402A (en) * 2014-11-17 2015-02-18 南京工业大学 Detection method for rectangular pin element visual positioning
CN109785316A (en) * 2019-01-22 2019-05-21 湖南大学 A kind of apparent defect inspection method of chip

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Performance analysis of SIFT in automatic registration of high-resolution SAR images; Sun Yanli et al.; Electronic Design Engineering; 2011-04-30; Vol. 19, No. 07; full text *
A new SAR image registration algorithm based on local invariant features; Jin Bin et al.; Journal of Harbin Engineering University; 2014-11-30; Vol. 46, No. 11; full text *

Also Published As

Publication number Publication date
CN110992326A (en) 2020-04-10

Similar Documents

Publication Publication Date Title
CN110992326B (en) QFN chip pin image rapid inclination correction method
CN111612781B (en) Screen defect detection method and device and head-mounted display equipment
CN109978839B (en) Method for detecting wafer low-texture defects
CN109785291B (en) Lane line self-adaptive detection method
CN108369650B (en) Method for identifying possible characteristic points of calibration pattern
CN115063421B (en) Pole piece region detection method, system and device, medium and defect detection method
CN103914827B (en) The visible detection method of weather strip for automobile profile defects
CN114529459B (en) Method, system and medium for enhancing image edge
CN110933926B (en) Automatic correction method for angle of suction nozzle element of chip mounter based on angular point detection
CN110647882A (en) Image correction method, device, equipment and storage medium
CN113808131B (en) Method, system, device and medium for identifying connector defects
CN115830033A (en) Automobile hub surface defect detection method based on machine vision
CN104657728B (en) Processing in Barcode Recognizing System based on computer vision
CN115439523A (en) Method and equipment for detecting pin size of semiconductor device and storage medium
CN117152165B (en) Photosensitive chip defect detection method and device, storage medium and electronic equipment
CN116958125B (en) Electronic contest host power supply element defect visual detection method based on image processing
CN114037657A (en) Lithium battery tab defect detection method combining region growth and annular correction
CN117611589B (en) Tablet personal computer quality detection method and system
CN116993966A (en) Casting polishing vision intelligent positioning method and system
WO2024016686A1 (en) Corner detection method and apparatus
JP2014106713A (en) Program, method, and information processor
CN112381844A (en) Self-adaptive ORB feature extraction method based on image blocking
CN112923852B (en) SD card position detection method based on dynamic angular point positioning
CN114049380B (en) Target object positioning and tracking method, device, computer equipment and storage medium
CN116309780A (en) Water gauge water level identification method based on target detection

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant