CN115393320A - Area array image processing method, device, equipment and storage medium - Google Patents

Area array image processing method, device, equipment and storage medium

Info

Publication number
CN115393320A
CN115393320A
Authority
CN
China
Prior art keywords
calculating
wafer
image
camera
scanned
Prior art date
Legal status
Pending
Application number
CN202211030665.8A
Other languages
Chinese (zh)
Inventor
孙序东
华凯
Current Assignee
Seizet Technology Shenzhen Co Ltd
Original Assignee
Seizet Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Seizet Technology Shenzhen Co Ltd
Priority to CN202211030665.8A
Publication of CN115393320A

Classifications

    • G06T 7/00 Image analysis › G06T 7/0002 Inspection of images, e.g. flaw detection › G06T 7/0004 Industrial image inspection
    • G06T 5/80
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 2207/10 Image acquisition modality › G06T 2207/10004 Still image; Photographic image
    • G06T 2207/30 Subject of image › G06T 2207/30108 Industrial image inspection › G06T 2207/30148 Semiconductor; IC; Wafer
    • G06T 2207/30 Subject of image › G06T 2207/30244 Camera pose
    (All under G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL.)

Abstract

The invention relates to the technical field of semiconductor wafer inspection and discloses an area array image processing method, device, equipment and storage medium. The method calculates a scanning path for a wafer disc to be scanned and a plurality of trigger point coordinate positions for each line of the path; moves a module along the scanning path to carry the wafer disc to be scanned, which is placed on the module; and, whenever the module reaches a trigger point coordinate position, triggers a camera mounted above the wafer disc so that the camera captures the wafer disc in the current field of view. Triggering the camera at the calculated trigger point coordinate positions ensures that the images of all chips are complete and meet the inspection requirements. The wafer images fed back by the camera are then received, corrected and stitched to obtain an inspection image, avoiding the chip position deviation caused by the camera's field-of-view size and mounting angle, so that all Dies can be segmented and inspection efficiency and accuracy are improved.

Description

Area array image processing method, device, equipment and storage medium
Technical Field
The present invention relates to the field of wafer inspection technologies, and in particular, to a method, an apparatus, a device, and a storage medium for processing an area array image.
Background
In the field of semiconductor wafer inspection, before the diameters of the solder balls on a wafer can be inspected, the region of each individual chip (Die) must first be segmented. The camera's field-of-view size and mounting angle introduce positional deviation of the Dies within a field of view, so the images of all Dies across the whole wafer disc (Wafer) cannot be guaranteed to be complete and to meet the inspection requirements. How to process the wafer area array image so that all Dies can be segmented, thereby improving inspection efficiency and accuracy, is therefore an urgent technical problem to be solved.
The above is only for the purpose of assisting understanding of the technical aspects of the present invention, and does not represent an admission that the above is prior art.
Disclosure of Invention
The invention mainly aims to provide an area array image processing method, an area array image processing device, area array image processing equipment and a computer storage medium, in order to solve the prior-art technical problem that chip images are incomplete, owing to the camera mounting angle and similar factors, and therefore cannot meet the inspection requirements.
In order to achieve the above object, the present invention provides an area array image processing method, including the steps of:
calculating a scanning path of a wafer disc to be scanned, and calculating a plurality of trigger point coordinate positions of each line in the scanning path;
moving a module according to the scanning path to drive the wafer disc to be scanned to move, wherein the wafer disc to be scanned is placed on the module;
when the module moves to the trigger point coordinate position, triggering a camera to enable the camera to shoot a wafer disc to be scanned under the current view field, wherein the camera is arranged above the wafer disc to be scanned;
and receiving the wafer image fed back by the camera, and correcting and splicing the wafer image to obtain a detection image.
Preferably, said calculating a plurality of trigger point coordinate positions for each line in said scan path comprises:
s21: traversing the array in the scanning path, and searching the positions of a top chip and an upper right chip in the current visual field;
s22: calculating whether the initial position of the next visual field is within a safe distance from a solder ball on the chip or not according to the positions of the top chip and the upper right chip in the current visual field to obtain a first calculation result;
s23: determining the next trigger position according to the first calculation result, and calculating the coordinate positions of the trigger points of the whole row in sequence;
s24: calculating the downward moving distance, and calculating whether the solder ball is in the visual field of a second preset direction to obtain a second calculation result;
s25: determining the downward movement distance of the module according to the second calculation result;
s26: and repeating the steps S22 to S25, and sequentially calculating and obtaining a plurality of trigger point coordinate positions of each line in the scanning path.
Preferably, the step S23: determining the next trigger position according to the first calculation result, and calculating the coordinate positions of the trigger points of the whole row in sequence, wherein the method comprises the following steps:
if the distance is within the safe distance, taking the width of one visual field as the distance between the next trigger position and the end position of the current visual field, and sequentially calculating the coordinate position of the trigger point of the whole line;
and if the current trigger point is not within the safe distance, the distance between the next trigger position and the end position of the current visual field is less than the width of one visual field, and the coordinate positions of the trigger points of the whole row are calculated in sequence.
Preferably, if the distance between the next trigger position and the end position of the current field of view is not within the safety distance, the calculating the coordinate position of the trigger point of the whole row sequentially includes:
if the distance is not within the safe distance, calculating a difference value between the width of a view field and half of the distance between the solder balls, taking the difference value as the distance between the next trigger position and the ending position of the current view field, and sequentially calculating the coordinate position of the trigger point of the whole line.
Preferably, the receiving the wafer image fed back by the camera, correcting and splicing the wafer image, and obtaining a detection image includes:
receiving a plurality of wafer images fed back by the camera;
calculating the pixel precision of the module in each coordinate axis direction;
correcting the widths of the plurality of wafer images in the moving directions of the X axis and the Y axis according to the pixel precision to obtain a plurality of corrected images;
calculating the corresponding relation between the edges of the correction images and the positions of the actual axes according to the widths of the correction images in the moving directions of the X axis and the Y axis;
calculating the row and column contained in the positions of the plurality of correction images;
calculating chip positions in rows and columns contained in the plurality of correction images, and searching pixels belonging to the chip position range according to the chip positions;
and intercepting pixels belonging to the image in the chip position range according to the corresponding relation between the edges of the plurality of corrected images and the actual axis position, and splicing the images to obtain a detection image.
Preferably, after the calculating the pixel precision of the module in each coordinate axis direction, the method further includes:
calculating an exposure compensation value, and carrying out exposure compensation on a plurality of wafer images according to the exposure compensation value;
correspondingly, the correcting widths of the plurality of wafer images in the moving directions of the X axis and the Y axis according to the pixel precision to obtain a plurality of corrected images comprises:
and correcting the widths of the plurality of images subjected to exposure compensation in the moving directions of the X axis and the Y axis according to the pixel precision to obtain a plurality of corrected images.
Preferably, the capturing pixels belonging to the image within the chip position range according to the correspondence between the edges of the plurality of corrected images and the actual axis position, and performing image stitching to obtain a detection image includes:
calculating an image area belonging to the chip position range;
correcting the initial position of the image area according to the corresponding relation between the edges of the plurality of corrected images and the position of the actual axis and the central coordinate of the image area;
converting the image area according to the corrected initial position to obtain pixels belonging to the chip position range;
intercepting pixels belonging to the image in the chip position range to obtain all chip images of the wafer disc to be scanned;
and carrying out image splicing on all the chip images to obtain a detection image.
In order to achieve the above object, the present invention also provides an area array image processing apparatus including:
the calculation module is used for calculating a scanning path of the wafer disc to be scanned and calculating a plurality of trigger point coordinate positions of each line in the scanning path;
the moving module is used for moving the module according to the scanning path so as to drive the wafer disc to be scanned to move, and the wafer disc to be scanned is placed on the module;
the trigger module is used for triggering a camera when the module moves to the trigger point coordinate position so that the camera shoots a wafer disc to be scanned under the current view field, and the camera is arranged above the wafer disc to be scanned;
and the correction module is used for receiving the wafer image fed back by the camera, correcting and splicing the wafer image and obtaining a detection image.
Further, to achieve the above object, the present invention also proposes an area array image processing apparatus including: a memory, a processor and an area array image processing program stored on the memory and executable on the processor, the area array image processing program when executed by the processor implementing the steps of the area array image processing method as described above.
Furthermore, in order to achieve the above object, the present invention also provides a storage medium having an area array image processing program stored thereon, wherein the area array image processing program, when executed by a processor, implements the steps of the area array image processing method as described above.
According to the method, a scanning path of the wafer disc to be scanned is calculated, together with a plurality of trigger point coordinate positions for each line of the path; the module is moved along the scanning path to carry the wafer disc to be scanned, which is placed on the module; and, when the module reaches a trigger point coordinate position, the camera mounted above the wafer disc is triggered to capture the wafer disc in the current field of view. Triggering the camera at the calculated trigger point coordinate positions ensures that the images of all chips are complete and meet the inspection requirements. The wafer images fed back by the camera are then received, corrected and stitched to obtain an inspection image; correcting and stitching the images avoids the chip position deviation caused by the camera's field-of-view size and mounting angle, so that all Dies can be segmented and inspection efficiency and accuracy are improved.
Drawings
Fig. 1 is a schematic structural diagram of an area array image processing apparatus in a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a first embodiment of an area array image processing method according to the present invention;
FIG. 3 is a schematic diagram illustrating a wafer to be scanned in an embodiment of an area array image processing method according to the present invention;
FIG. 4 is a flowchart illustrating a specific step of a second embodiment of an area array image processing method according to the present invention;
fig. 5 is a block diagram of the first embodiment of the area array image processing apparatus according to the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an area array image processing apparatus in a hardware operating environment according to an embodiment of the present invention.
As shown in fig. 1, the area array image processing apparatus may include: a processor 1001, such as a Central Processing Unit (CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used to enable connective communication between these components. The user interface 1003 may include a display screen (Display) and may optionally further include a standard wired interface and a wireless interface; in the present invention, the wired interface of the user interface 1003 may be a USB interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a Wireless Fidelity (Wi-Fi) interface). The memory 1005 may be a Random Access Memory (RAM) or a non-volatile memory (NVM), such as a disk memory. The memory 1005 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the configuration shown in fig. 1 does not constitute a limitation of the area array image processing apparatus, which may include more or fewer components than those shown, some components in combination, or a different arrangement of components.
As shown in fig. 1, the memory 1005, as a kind of computer storage medium, may include an operating system, a network communication module, a user interface module, and an area array image processing program.
In the area array image processing apparatus shown in fig. 1, the network interface 1004 is mainly used for connecting to a background server and performing data communication with the background server; the user interface 1003 is mainly used for connecting user equipment; the area array image processing apparatus calls an area array image processing program stored in the memory 1005 through the processor 1001 and performs the area array image processing method provided by the embodiment of the present invention.
Based on the above hardware structure, an embodiment of the area array image processing method of the present invention is provided.
Referring to fig. 2, fig. 2 is a flowchart illustrating a first embodiment of the area array image processing method according to the present invention.
In a first embodiment, the area array image processing method includes the steps of:
step S10: and calculating a scanning path of the wafer disc to be scanned, and calculating a plurality of trigger point coordinate positions of each line in the scanning path.
It should be understood that the execution subject of the embodiment is the area array image processing device, which may be an electronic device such as a personal computer, an industrial personal computer, a robot, or a server; the embodiment is not limited thereto. Before step S10, the machine first performs automatic focusing, leveling and positioning. Automatic focusing: the area array camera is focused on the wafer disc so that the camera acquires the sharpest image, and the pixel precision can be determined. The pixel precision in the X and Y directions is (XPixUnit, YPixUnit), and the field-of-view size in the X and Y directions is (XLength, YLength).
Automatic leveling: a template is used to align the Wafer coordinate directions with the motion directions of the module; since the camera cannot be mounted perfectly parallel to the module, the angle between the module and the camera field of view after leveling is the camera mounting angle DisAngle.
Automatic positioning: determine the physical location (X_ij, Y_ij) of each Die in the Wafer; the positions of all Dies form a two-dimensional array, where (X_ij, Y_ij) is the physical coordinate of the upper-left corner of the Die in row i, column j. Although the two-dimensional array is rectangular, the edge portions have no coordinate values.
As shown in fig. 3, fig. 3 is a schematic diagram of a wafer disc to be scanned; the small squares in fig. 3 are Dies. Starting from the first row of the array representing the wafer disc to be scanned, the integer number of rows fitting in one field of view is calculated. The position of the leftmost Die in the field of view is (lefttop.X, lefttop.Y) and that of the rightmost Die is (righttop.X, righttop.Y), so the row is scanned from lefttop.X to righttop.X. Starting from the physical position of the leftmost Die, the integer number of Dies contained in one field of view in the X direction is calculated, and it is checked in turn whether the starting position of the next field of view keeps a safe distance from the solder balls; when the starting position is not at a safe distance, the moving distance to the next trigger position is made smaller than one field-of-view width. The position required for each trigger is calculated, giving the trigger positions and scanning positions of the whole row in sequence. The downward moving distance is then calculated: when a solder ball would lie on the Y-direction field boundary, the downward distance is made smaller than the Y-direction field length so that the Y-direction field edge does not cut a solder ball. Repeating these operations yields the scanning positions and trigger positions for the whole wafer disc to be scanned.
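The per-row trigger calculation described above can be sketched as follows. This is an illustrative model only, not reference code from the patent: it assumes solder balls on a regular pitch starting at the leftmost Die, and all names (`row_trigger_positions`, `field_width`, `safe_margin`, etc.) are hypothetical.

```python
def row_trigger_positions(left_x, right_x, field_width, ball_pitch, safe_margin):
    """Return trigger X coordinates covering one row of Dies.

    left_x / right_x: physical X of the leftmost / rightmost Die in the row.
    A step of one full field width is used when the next field would start at
    a safe distance from a solder ball; otherwise the step is shortened by
    half the ball pitch so the field edge does not cut a ball.
    """
    triggers = [left_x]
    x = left_x
    while x + field_width < right_x:
        # Distance from the would-be next field start to the nearest ball
        # centre (simplified: balls on a regular pitch starting at left_x).
        offset = (x + field_width - left_x) % ball_pitch
        dist_to_ball = min(offset, ball_pitch - offset)
        if dist_to_ball >= safe_margin:
            x += field_width                     # safe: advance a full field
        else:
            x += field_width - ball_pitch / 2.0  # shorten the step
        triggers.append(x)
    return triggers
```

With a 30-unit field, a 10-unit ball pitch and a 2-unit margin over a 100-unit row, this yields triggers at 0, 25, 55 and 85, the shortened first step keeping the second field edge clear of a ball.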
Step S20: and moving the module according to the scanning path to drive the wafer disc to be scanned to move, wherein the wafer disc to be scanned is placed on the module.
It should be noted that the wafer disc to be scanned is fixed on the module, and when the module moves, the wafer disc to be scanned is driven to move, so that the wafer disc to be scanned can be sequentially displayed under the view of the camera through the moving module, and shooting and subsequent wafer detection are performed.
Step S30: when the module moves to the trigger point coordinate position, a camera is triggered to enable the camera to shoot the wafer disc to be scanned under the current view field, and the camera is arranged above the wafer disc to be scanned.
It will be appreciated that, according to the calculation of step S10, the module moves and the camera is triggered to capture a frame at each trigger point. The left and upper edges of each acquired image are thereby guaranteed not to cut through a solder ball.
Step S40: and receiving the wafer image fed back by the camera, and correcting and splicing the wafer image to obtain a detection image.
It is to be understood that the pixel precision in the axis moving directions and the exposure compensation are calculated; the size of each wafer image in the axis directions is corrected according to the pixel precision, and the corrected images are compensated according to the exposure compensation. The correspondence between the image edges and the actual axis coordinate positions is calculated, the rows and columns contained in each image position are determined, the Die positions within those rows and columns are calculated, and the pixels belonging to each Die range are found in the image, giving the image area (in module coordinates) belonging to each Die. This area is converted to pixel coordinates, and its initial position is corrected according to the centre coordinate, thereby removing the deviation caused by the mounting angle; the pixels belonging to the current Die area are then cropped to obtain the inspection image.
In this embodiment, a scanning path of the wafer disc to be scanned is calculated, together with a plurality of trigger point coordinate positions for each line of the path; the module is moved along the scanning path to carry the wafer disc to be scanned, which is placed on the module; and, when the module reaches a trigger point coordinate position, the camera mounted above the wafer disc is triggered to capture the wafer disc in the current field of view. Triggering the camera at the calculated trigger point coordinate positions ensures that the images of all chips are complete and meet the inspection requirements. The wafer images fed back by the camera are then received, corrected and stitched to obtain an inspection image; correcting and stitching the images avoids the chip position deviation caused by the camera's field-of-view size and mounting angle, so that all Dies can be segmented and inspection efficiency and accuracy are improved.
Referring to fig. 4, a second embodiment of the area array image processing method of the present invention is proposed based on the above-described first embodiment.
In a second embodiment, said calculating a plurality of trigger point coordinate positions for each line in said scan path comprises:
step S21: traversing the array in the scanning path, and searching the positions of a top chip and an upper right chip in the current view;
step S22: calculating whether the initial position of the next visual field is within a safe distance from a solder ball on the chip or not according to the positions of the top chip and the upper right chip in the current visual field to obtain a first calculation result;
step S23: determining the next trigger position according to the first calculation result, and calculating the coordinate positions of the trigger points of the whole row in sequence;
step S24: calculating the downward movement distance, and calculating whether the solder ball is in the visual field in a second preset direction to obtain a second calculation result;
step S25: determining the downward movement distance of the module according to the second calculation result;
step S26: and repeating the steps S22 to S25, and sequentially calculating and obtaining a plurality of trigger point coordinate positions of each line in the scanning path.
It can be understood that the top Die is found by traversing the array clockwise, and finally the upper-right Die is found. Starting from the first row of the array, the integer number of rows belonging to one field of view is calculated; the position of the leftmost Die in the field is (lefttop.X, lefttop.Y) and that of the rightmost Die is (righttop.X, righttop.Y), so the row is scanned from lefttop.X to righttop.X. Starting from the physical position of the leftmost Die, the integer number of Dies contained in one field of view in the X direction is calculated, and it is checked in turn whether the starting field position keeps a safe distance from the solder balls, where the safe distance is half the pitch between two solder balls on a chip, or 30 pixels from a solder ball.
Further, in the present embodiment, the step S23: determining the next trigger position according to the first calculation result, and calculating the coordinate positions of the trigger points of the whole row in sequence, wherein the method comprises the following steps: if the distance is within the safe distance, taking the width of one visual field as the distance between the next trigger position and the end position of the current visual field, and sequentially calculating the coordinate position of the trigger point of the whole line; and if the current trigger point is not within the safe distance, the distance between the next trigger position and the end position of the current visual field is less than the width of one visual field, and the coordinate positions of the trigger points of the whole row are calculated in sequence.
Further, in order to ensure that a safe distance is maintained between the starting view position and the solder ball, in this embodiment, if the starting view position is not within the safe distance, the distance between the next trigger position and the end position of the current view is less than a view width, and the sequentially calculating the coordinate positions of the trigger points of the entire row includes:
if the distance is not within the safe distance, calculating a difference value between the width of one view field and half of the distance between the solder balls, taking the difference value as the distance between the next trigger position and the end position of the current view field, and sequentially calculating the coordinate position of the trigger point of the whole line.
Further, in this embodiment, the receiving a wafer image fed back by the camera, correcting and stitching the wafer image, and obtaining a detection image includes:
receiving a plurality of wafer images fed back by the camera;
calculating the pixel precision of the module in each coordinate axis direction;
correcting the widths of the plurality of wafer images in the moving directions of the X axis and the Y axis according to the pixel precision to obtain a plurality of corrected images;
calculating the corresponding relation between the edges of the correction images and the positions of the actual axes according to the widths of the correction images in the moving directions of the X axis and the Y axis;
calculating the row and column contained in the positions of the plurality of correction images;
calculating chip positions in rows and columns contained in the plurality of correction images, and searching pixels belonging to the chip position range according to the chip positions;
and intercepting pixels of the images in the chip position range according to the corresponding relation between the edges of the plurality of corrected images and the actual axis position, and splicing the images to obtain a detection image.
It should be noted that the pixel precision in the axis moving directions of the module is calculated as follows:
the mounting angle in radians: degree = DisAngle × PI / 180, where DisAngle is the camera mounting angle and PI is the circular constant π;
pixel precision in the X-axis moving direction: deltaX = XPixUnit / cos(degree), where XPixUnit is the X-direction pixel precision;
pixel precision in the Y-axis moving direction: deltaY = YPixUnit / cos(degree), where YPixUnit is the Y-direction pixel precision.
Further, in this embodiment, after the calculating the pixel precision of the module in each coordinate axis direction, the method further includes:
calculating an exposure compensation value, and carrying out exposure compensation on a plurality of wafer images according to the exposure compensation value;
correspondingly, the correcting widths of the plurality of wafer images in the moving directions of the X axis and the Y axis according to the pixel precision to obtain a plurality of corrected images comprises:
and correcting the widths of the plurality of images subjected to exposure compensation in the moving directions of the X axis and the Y axis according to the pixel precision to obtain a plurality of corrected images.
It will be appreciated that the exposure compensation is calculated as: exposure offset compensation exposureOffset = speed × exposureTime / 1000 × k, where speed is the axis running speed of the module, exposureTime is the exposure time, and k is the exposure compensation fine-adjustment coefficient, with a value in the range 0 to 1, for example 0.71.
Correcting the size of the image in the axis directions of the module:
corrected width in the X-axis moving direction: imageXLen = width × deltaX, where width is the camera field-of-view width in pixels;
corrected width in the Y-axis moving direction: imageYLen = height × deltaY, where height is the camera field-of-view height in pixels.
Calculating the corresponding relation between the image edge and the actual axis coordinate position:
calculating the coordinate position of the image:
image left side left = ptCenter.X + imageXLen/2, where ptCenter.X is the trigger position center X coordinate;
image right side right = ptCenter.X - imageXLen/2, where ptCenter.X is the trigger position center X coordinate;
image upper side top = ptCenter.Y - imageYLen/2, where ptCenter.Y is the trigger position center Y coordinate;
image lower side bottom = ptCenter.Y + imageYLen/2, where ptCenter.Y is the trigger position center Y coordinate.
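The size correction and the four edge formulas above can be sketched together. Note the sign convention taken directly from the patent's formulas: the left edge is ptCenter.X plus half the field width, i.e. the stage X coordinate grows toward the image's left, while Y grows downward. Function and parameter names are illustrative assumptions.

```python
def field_edges(width, height, delta_x, delta_y, pt_center):
    """Map one field-of-view image to axis (stage) coordinates.

    width/height: camera view size in pixels; delta_x/delta_y: corrected
    pixel precision; pt_center: (x, y) axis coordinate of the trigger
    position center.
    """
    image_x_len = width * delta_x    # imageXLen = width * deltaX
    image_y_len = height * delta_y   # imageYLen = height * deltaY
    cx, cy = pt_center
    left = cx + image_x_len / 2      # X grows toward the image's left edge
    right = cx - image_x_len / 2
    top = cy - image_y_len / 2       # Y grows downward
    bottom = cy + image_y_len / 2
    return image_x_len, image_y_len, (left, right, top, bottom)
```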
Calculating the row and column contained in the position of the image:
the starting row topRow = (int)((top - leftTop.Y)/dieYLength), where leftTop.Y is the Y coordinate of the head position of each row trigger and dieYLength is the die length in the Y direction;
the ending row bottomRow = (int)((bottom - leftTop.Y)/dieYLength), where leftTop.Y is the Y coordinate of the head position of each row trigger and dieYLength is the die length in the Y direction;
the starting column leftCol = (int)((leftTop.X - left)/dieXLength), where leftTop.X is the X coordinate of the head position of each row trigger and dieXLength is the die length in the X direction;
the ending column rightCol = (int)((leftTop.X - right)/dieXLength), where leftTop.X is the X coordinate of the head position of each row trigger and dieXLength is the die length in the X direction.
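The four row/column formulas above amount to dividing the field edges by the die pitch, again with X growing leftward. A minimal sketch (names assumed for illustration):

```python
def die_row_col_range(edges, left_top, die_x_len, die_y_len):
    """Row/column indices of the dies covered by one field image.

    edges: (left, right, top, bottom) axis coordinates of the field;
    left_top: (x, y) head position of each row trigger;
    die_x_len/die_y_len: die length (pitch) along X / Y.
    """
    left, right, top, bottom = edges
    lt_x, lt_y = left_top
    top_row = int((top - lt_y) / die_y_len)        # topRow
    bottom_row = int((bottom - lt_y) / die_y_len)  # bottomRow
    left_col = int((lt_x - left) / die_x_len)      # leftCol (X grows leftward)
    right_col = int((lt_x - right) / die_x_len)    # rightCol
    return top_row, bottom_row, left_col, right_col
```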
Further, in this embodiment, intercepting the pixels belonging to the image within the chip position range according to the correspondence between the edges of the plurality of corrected images and the actual axis positions, and performing image stitching to obtain the detection image, includes:
calculating an image area belonging to the chip position range;
correcting the initial position of the image area according to the corresponding relation between the edges of the plurality of corrected images and the position of the actual axis and the central coordinate of the image area;
converting the image area according to the corrected initial position to obtain pixels belonging to the chip position range;
intercepting pixels belonging to the image in the chip position range to obtain all chip images of the wafer disc to be scanned;
and carrying out image splicing on all the chip images to obtain a detection image.
It will be appreciated that the Die positions of the rows and columns contained in the image are calculated, and the pixels falling within each Die range are then found in the image.
Calculating the image area belonging to the Die, namely calculating the module coordinates belonging to the Die, specifically:
left side of the area the Die belongs to: cellL = Math.Min(cell.LeftTop.X, imageLT.X), where cell.LeftTop.X is the X coordinate of the Die's top-left position and imageLT.X is the X coordinate of the top-left position of the field-of-view image;
upper side of the area the Die belongs to: cellT = Math.Max(cell.LeftTop.Y, imageLT.Y), where cell.LeftTop.Y is the Y coordinate of the Die's top-left position and imageLT.Y is the Y coordinate of the top-left position of the field-of-view image;
right side of the area the Die belongs to: cellR = Math.Max(cell.RightBottom.X, imageRB.X), where cell.RightBottom.X is the X coordinate of the Die's bottom-right position and imageRB.X is the X coordinate of the bottom-right position of the field-of-view image;
lower side of the area the Die belongs to: cellB = Math.Min(cell.RightBottom.Y, imageRB.Y), where cell.RightBottom.Y is the Y coordinate of the Die's bottom-right position and imageRB.Y is the Y coordinate of the bottom-right position of the field-of-view image;
and the center point of the area the Die belongs to: SzPoint ptCenter = new SzPoint((imageLT.X + imageRB.X)/2, (imageLT.Y + imageRB.Y)/2), where imageLT.X/imageLT.Y are the coordinates of the top-left position of the field-of-view image and imageRB.X/imageRB.Y are the coordinates of its bottom-right position.
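The Min/Max clipping above intersects one die's box with the field-of-view box. Under the patent's axis convention (X grows toward the left, Y grows downward), Min bounds the left and bottom edges and Max the top and right edges. A sketch with illustrative names:

```python
def clip_die_to_view(cell_lt, cell_rb, image_lt, image_rb):
    """Intersect one die's axis-coordinate box with the field image box.

    cell_lt/cell_rb: (x, y) top-left and bottom-right of the die;
    image_lt/image_rb: (x, y) top-left and bottom-right of the field image.
    """
    cell_l = min(cell_lt[0], image_lt[0])  # cellL: larger X is further left
    cell_t = max(cell_lt[1], image_lt[1])  # cellT
    cell_r = max(cell_rb[0], image_rb[0])  # cellR
    cell_b = min(cell_rb[1], image_rb[1])  # cellB
    pt_center = ((image_lt[0] + image_rb[0]) / 2,
                 (image_lt[1] + image_rb[1]) / 2)
    return (cell_l, cell_t, cell_r, cell_b), pt_center
```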
In order to correct the positional deviation caused by the angle influence, it is necessary to correct the start position based on the center coordinates, convert the corrected start position into an image area, and calculate pixel coordinates.
Actual image coordinate position, left side (crop position): partImageL = (int)((imageLT.X - cellL)/deltaX + (cellT - ptCenter.Y)/deltaY × sin(degree));
actual image coordinate position, upper side (crop position): partImageT = (int)((cellT - imageLT.Y)/deltaY + (cellL - ptCenter.X)/deltaX × sin(degree)).
and intercepting the pixels belonging to the current Die area according to the calculated interception position and obtaining a detection image.
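The crop-position formulas for partImageL and partImageT convert axis coordinates back to pixel offsets and add a small skew term proportional to sin(degree). A minimal sketch (names assumed; `degree` is again taken to be the camera-to-module angle in degrees):

```python
import math

def crop_start(image_lt, cell_l, cell_t, pt_center, delta_x, delta_y, degree):
    """Pixel coordinates of the die crop's top-left corner in the image.

    image_lt: (x, y) axis coordinate of the image's top-left; cell_l/cell_t:
    clipped die edges; pt_center: trigger-position center; delta_x/delta_y:
    corrected pixel precision; degree: assumed misalignment angle.
    """
    rad = math.radians(degree)
    part_image_l = int((image_lt[0] - cell_l) / delta_x
                       + (cell_t - pt_center[1]) / delta_y * math.sin(rad))
    part_image_t = int((cell_t - image_lt[1]) / delta_y
                       + (cell_l - pt_center[0]) / delta_x * math.sin(rad))
    return part_image_l, part_image_t
```

With degree = 0 the skew terms vanish and the result is a plain axis-to-pixel division.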
In this embodiment, the obtained detection image can be divided among all the Dies, with each divided image containing only one Die, so that the detection efficiency and the detection accuracy can be improved.
Furthermore, the present invention also proposes a storage medium having stored thereon an area array image processing program which, when executed by a processor, implements the steps of the area array image processing method as described above.
Further, referring to fig. 5, an embodiment of the present invention further provides an area array image processing apparatus, including:
the calculation module 10 is configured to calculate a scanning path of a wafer disc to be scanned, and calculate coordinate positions of multiple trigger points in each line in the scanning path;
the moving module 20 is configured to move a module according to the scanning path to drive the wafer tray to be scanned to move, where the wafer tray to be scanned is placed on the module;
the triggering module 30 is configured to trigger a camera when the module moves to the trigger point coordinate position, so that the camera shoots a wafer disc to be scanned in a current field of view, and the camera is disposed above the wafer disc to be scanned;
and the correction module 40 is configured to receive the wafer image fed back by the camera, correct and splice the wafer image, and obtain a detection image.
Other embodiments or specific implementation manners of the area array image processing apparatus according to the present invention may refer to the above method embodiments, and are not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element introduced by the phrase "comprising a/an" does not exclude the presence of other like elements in the process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments. In the unit claims enumerating several means, several of these means can be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method of the foregoing embodiments may be implemented by software plus a necessary general-purpose hardware platform, and certainly may also be implemented by hardware, but in many cases the former is the better implementation. Based on such understanding, the part of the technical solution of the present invention that contributes to the prior art may be embodied in the form of a software product, which is stored in a storage medium (e.g., Read-Only Memory (ROM)/Random Access Memory (RAM), a magnetic disk, or an optical disk) and includes several instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention and is not intended to limit the scope of the present invention; all equivalent structures or equivalent process transformations made using the description and drawings of the present invention, or applied directly or indirectly in other related technical fields, are likewise included in the patent protection scope of the present invention.

Claims (10)

1. An area array image processing method, characterized by comprising the steps of:
calculating a scanning path of a wafer disc to be scanned, and calculating a plurality of trigger point coordinate positions of each line in the scanning path;
moving a module according to the scanning path to drive the wafer disc to be scanned to move, wherein the wafer disc to be scanned is placed on the module;
when the module moves to the trigger point coordinate position, triggering a camera to enable the camera to shoot a wafer disc to be scanned under the current view field, wherein the camera is arranged above the wafer disc to be scanned;
and receiving the wafer image fed back by the camera, and correcting and splicing the wafer image to obtain a detection image.
2. The area array image processing method of claim 1, wherein said calculating a plurality of trigger point coordinate positions for each line in the scan path comprises:
s21: traversing the array in the scanning path, and searching the positions of a top chip and an upper right chip in the current view;
s22: calculating whether the initial position of the next view is within a safe distance from a solder ball on the chip according to the positions of the top chip and the upper right chip in the current view to obtain a first calculation result;
s23: determining the next trigger position according to the first calculation result, and calculating the coordinate positions of the trigger points of the whole row in sequence;
s24: calculating the downward movement distance, and calculating whether the solder ball is in the visual field in a second preset direction to obtain a second calculation result;
s25: determining the downward moving distance of the module according to the second calculation result;
s26: and repeating the steps S22 to S25, and sequentially calculating and obtaining a plurality of trigger point coordinate positions of each line in the scanning path.
3. The area array image processing method according to claim 2, wherein said step S23 of determining the next trigger position according to the first calculation result and sequentially calculating the coordinate positions of the trigger points of the whole row comprises:
if the distance is within the safe distance, taking the width of one visual field as the distance between the next trigger position and the end position of the current visual field, and sequentially calculating the coordinate position of the trigger point of the whole line;
and if the current trigger point is not within the safe distance, the distance between the next trigger position and the ending position of the current visual field is less than the width of one visual field, and the coordinate positions of the trigger points of the whole row are calculated in sequence.
4. The area array image processing method of claim 3, wherein said taking a distance less than one field-of-view width as the distance between the next trigger position and the end position of the current field of view if not within the safe distance, and sequentially calculating the coordinate positions of the trigger points of the whole row, comprises:
if the distance is not within the safe distance, calculating a difference value between the width of one view field and half of the distance between the solder balls, taking the difference value as the distance between the next trigger position and the end position of the current view field, and sequentially calculating the coordinate position of the trigger point of the whole line.
5. The area-array image processing method of claim 1, wherein the receiving the wafer image fed back by the camera, correcting and stitching the wafer image to obtain a detection image, comprises:
receiving a plurality of wafer images fed back by the camera;
calculating the pixel precision of the module in each coordinate axis direction;
correcting the widths of the plurality of wafer images in the moving directions of the X axis and the Y axis according to the pixel precision to obtain a plurality of corrected images;
calculating the corresponding relation between the edges of the plurality of corrected images and the position of the actual axis according to the width of the corrected images in the moving direction of the X axis and the Y axis;
calculating the row and column contained in the positions of the plurality of correction images;
calculating chip positions in rows and columns contained in the plurality of correction images, and searching pixels belonging to the chip position range according to the chip positions;
and intercepting pixels belonging to the image in the chip position range according to the corresponding relation between the edges of the plurality of corrected images and the actual axis position, and splicing the images to obtain a detection image.
6. The area array image processing method of claim 5, wherein after calculating the pixel precision of the module in each coordinate axis direction, further comprising:
calculating an exposure compensation value, and carrying out exposure compensation on a plurality of wafer images according to the exposure compensation value;
correspondingly, the correcting widths of the plurality of wafer images in the moving directions of the X axis and the Y axis according to the pixel precision to obtain a plurality of corrected images comprises:
and correcting the widths of the multiple images subjected to exposure compensation in the moving directions of the X axis and the Y axis according to the pixel precision to obtain multiple corrected images.
7. The area array image processing method of claim 5, wherein the obtaining the detection image by intercepting pixels belonging to the image within the chip position range according to the corresponding relationship between the edges of the plurality of correction images and the actual axis position and performing image stitching comprises:
calculating an image area belonging to the chip position range;
correcting the initial position of the image area according to the corresponding relation between the edges of the plurality of corrected images and the position of the actual axis and the central coordinate of the image area;
converting the image area according to the corrected initial position to obtain pixels belonging to the chip position range;
intercepting pixels belonging to the image in the chip position range to obtain all chip images of the wafer disc to be scanned;
and carrying out image splicing on all the chip images to obtain a detection image.
8. An area array image processing apparatus, characterized by comprising:
the calculation module is used for calculating a scanning path of the wafer disc to be scanned and calculating a plurality of trigger point coordinate positions of each line in the scanning path;
the moving module is used for moving the module according to the scanning path so as to drive the wafer disc to be scanned to move, and the wafer disc to be scanned is placed on the module;
the trigger module is used for triggering a camera when the module moves to the trigger point coordinate position so that the camera shoots a wafer disc to be scanned under the current view field, and the camera is arranged above the wafer disc to be scanned;
and the correction module is used for receiving the wafer images fed back by the camera, correcting and splicing the wafer images and obtaining a detection image.
9. An area array image processing apparatus characterized by comprising: a memory, a processor and an area array image processing program stored on the memory and executable on the processor, the area array image processing program when executed by the processor implementing the steps of the area array image processing method as claimed in any one of claims 1 to 7.
10. A storage medium having an area array image processing program stored thereon, the area array image processing program implementing the steps of the area array image processing method according to any one of claims 1 to 7 when executed by a processor.
CN202211030665.8A 2022-08-26 2022-08-26 Area array image processing method, device, equipment and storage medium Pending CN115393320A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211030665.8A CN115393320A (en) 2022-08-26 2022-08-26 Area array image processing method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211030665.8A CN115393320A (en) 2022-08-26 2022-08-26 Area array image processing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115393320A true CN115393320A (en) 2022-11-25

Family

ID=84122999

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211030665.8A Pending CN115393320A (en) 2022-08-26 2022-08-26 Area array image processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115393320A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116342694A (en) * 2023-05-19 2023-06-27 恩纳基智能科技无锡有限公司 High-efficiency wafer pre-scanning method based on image processing
CN116342694B (en) * 2023-05-19 2023-08-18 恩纳基智能科技无锡有限公司 High-efficiency wafer pre-scanning method based on image processing
CN116598219A (en) * 2023-07-18 2023-08-15 上海孤波科技有限公司 Visualized wafer map generation method and device and electronic equipment
CN116598219B (en) * 2023-07-18 2023-10-27 上海孤波科技有限公司 Visualized wafer map generation method and device and electronic equipment

Similar Documents

Publication Publication Date Title
CN115393320A (en) Area array image processing method, device, equipment and storage medium
US10977829B2 (en) Depth camera calibration device and method thereof
JP5296967B2 (en) 3D shape measuring device
JP5923824B2 (en) Image processing device
CN108581869B (en) Camera module alignment method
US10535157B2 (en) Positioning and measuring system based on image scale
US20190149788A1 (en) Calibration method of depth image capturing device
JPWO2018174011A1 (en) Position detecting device and position detecting method
JP4932202B2 (en) Part program generating apparatus for image measuring apparatus, part program generating method for image measuring apparatus, and part program generating program for image measuring apparatus
US20120044343A1 (en) Image measuring apparatus and image measuring method
KR101677822B1 (en) Scanning-electron-microscope image processing device and scanning method
TWI504859B (en) Method for photographing and piecing together the images of an object
CN102005038B (en) Image edge positioning method
CN115684012A (en) Visual inspection system, calibration method, device and readable storage medium
JP4634250B2 (en) Image recognition method and apparatus for rectangular parts
JP5148564B2 (en) Appearance inspection method and appearance inspection apparatus for inspecting using the method
CN114440768A (en) Wafer detection method, device and equipment of 3D measuring machine and storage medium
JP3632461B2 (en) Image recognition method
JP2005181250A (en) Method and device for inspecting liquid crystal display panel
JP6655422B2 (en) Image processing apparatus, mounting apparatus, image processing method, and program
KR20210153672A (en) Wire shape measuring device, wire three-dimensional image generation method and wire shape measuring method
JP2014106109A (en) Inspection device, inspection method, inspection program and recording medium
JP2006112930A (en) Object-shape discriminating method and apparatus
JP7283150B2 (en) Control device, inspection system, control method, program
CN116634134B (en) Imaging system calibration method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination