CN114603726A - Center positioning method - Google Patents

Center positioning method Download PDF

Info

Publication number
CN114603726A
CN114603726A (application CN202011418739.6A)
Authority
CN
China
Prior art keywords
center
detection
chuck table
unit
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011418739.6A
Other languages
Chinese (zh)
Inventor
小池彩子
田中诚
小岛芳昌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Disco Corp
Original Assignee
Disco Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Disco Corp filed Critical Disco Corp
Priority to CN202011418739.6A priority Critical patent/CN114603726A/en
Publication of CN114603726A publication Critical patent/CN114603726A/en
Pending legal-status Critical Current

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B28: WORKING CEMENT, CLAY, OR STONE
    • B28D: WORKING STONE OR STONE-LIKE MATERIALS
    • B28D5/00: Fine working of gems, jewels, crystals, e.g. of semiconductor material; apparatus or devices therefor
    • B28D5/02: Fine working of gems, jewels, crystals, e.g. of semiconductor material; apparatus or devices therefor by rotary tools, e.g. drills
    • B28D5/022: Fine working of gems, jewels, crystals, e.g. of semiconductor material; apparatus or devices therefor by rotary tools, e.g. drills, by cutting with discs or wheels
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B28: WORKING CEMENT, CLAY, OR STONE
    • B28D: WORKING STONE OR STONE-LIKE MATERIALS
    • B28D5/00: Fine working of gems, jewels, crystals, e.g. of semiconductor material; apparatus or devices therefor
    • B28D5/0058: Accessories specially adapted for use with machines for fine working of gems, jewels, crystals, e.g. of semiconductor material
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B28: WORKING CEMENT, CLAY, OR STONE
    • B28D: WORKING STONE OR STONE-LIKE MATERIALS
    • B28D5/00: Fine working of gems, jewels, crystals, e.g. of semiconductor material; apparatus or devices therefor
    • B28D5/0058: Accessories specially adapted for use with machines for fine working of gems, jewels, crystals, e.g. of semiconductor material
    • B28D5/0064: Devices for the automatic drive or the program control of the machines

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Machine Tool Sensing Apparatuses (AREA)
  • Container, Conveyance, Adherence, Positioning, Of Wafer (AREA)

Abstract

Provided is a center positioning method capable of easily grasping the actual positional relationship between an imaging member and a detection member. The center positioning method comprises the following steps: a first center coordinate detection step (ST1) of obtaining the coordinates (X1, Y1) of the center of the chuck table from the image captured by the imaging unit; a second center coordinate detection step (ST2) of obtaining the coordinates (X2, Y2) of the center of the chuck table from the detection result of the detection unit; an inter-coordinate distance calculation step (ST3) of calculating the inter-center distances α = (X1-X2) and β = (Y1-Y2); and a center matching step (ST4) of matching the imaging center of the imaging unit with the detection center of the detection unit by positioning the detection center of the detection unit at the coordinates (X0-α, Y0-β) when the imaging center of the imaging unit is positioned at the coordinates (X0, Y0).

Description

Center positioning method
Technical Field
The invention relates to a center positioning method.
Background
A cutting apparatus is used to divide a workpiece such as a semiconductor wafer into individual chips (see, for example, patent documents 1 and 2).
Conventional cutting apparatuses such as those disclosed in patent documents 1 and 2 include an imaging member that images a workpiece and a detection member that measures the height of the workpiece. In such conventional cutting apparatuses, the relative positional relationship between the imaging member and the detection member is predetermined as a design value.
Patent document 1: Japanese Patent Laid-Open Publication No. 2015-112698
Patent document 2: Japanese Patent Laid-Open Publication No. 2005-093710
However, in such conventional cutting apparatuses, the mounting position of at least one of the imaging member and the detection member has a mounting error, so the positional relationship determined by the design values differs from the actual positional relationship.
Therefore, there is the following problem: even if such a conventional cutting apparatus uses the detection member to detect the height at a position recognized from the image captured by the imaging member and then performs cutting, the machining cannot accurately reflect the height detected by the detection member.
To solve this problem, patent document 1 proposes a method of measuring the actual positional relationship between the imaging member and the detection member by using a dummy chip. However, the method disclosed in patent document 1 requires a dummy chip, which is relatively troublesome.
Disclosure of Invention
The present invention has been made in view of the above problems, and an object thereof is to provide a center positioning method capable of easily grasping an actual positional relationship between an imaging member and a detection member.
In order to solve the above problems and achieve the object, a center positioning method according to the present invention is a center positioning method for positioning a detection center of a detection member at an imaging center of an imaging member in an apparatus, the apparatus including: a circular chuck table for holding a workpiece; an X-axis moving member that moves the chuck table in an X-axis direction; an imaging member that images the workpiece held by the chuck table; a first Y-axis moving member that moves the imaging member in a Y-axis direction; a detection member that detects the height of the workpiece held by the chuck table; and a second Y-axis moving member that moves the detection member in the Y-axis direction, wherein the center positioning method includes the steps of: a first center coordinate detection step of operating the X-axis moving member and the first Y-axis moving member and obtaining the coordinates (X1, Y1) of the center of the chuck table from the image captured by the imaging member; a second center coordinate detection step of operating the X-axis moving member and the second Y-axis moving member and obtaining the coordinates (X2, Y2) of the center of the chuck table from the detection result of the detection member; an inter-coordinate distance calculating step of setting the X coordinate of the center of the chuck table obtained from the image captured by the imaging member to X1, setting the X coordinate of the center of the chuck table obtained from the detection result of the detection member to X2, calculating the inter-center distance α = (X1-X2), setting the Y coordinate of the center of the chuck table obtained from the image captured by the imaging member to Y1, setting the Y coordinate of the center of the chuck table obtained from the detection result of the detection member to Y2, and calculating the inter-center distance β = (Y1-Y2); and a center matching step of matching the imaging center of the imaging member with the detection center of the detection member by positioning the detection center of the detection member at the coordinates (X0-α, Y0-β) when the imaging center of the imaging member is positioned at the coordinates (X0, Y0).
In the center positioning method, the first center coordinate detecting step may be performed by imaging, with the imaging member, a feature point located on the outer peripheral side of the center of the chuck table to obtain the coordinates (X11, Y11) of the feature point, rotating the chuck table by an arbitrary angle and imaging the feature point again to obtain the coordinates (X12, Y12) of the feature point, and obtaining, as the coordinates (X1, Y1) of the center of the chuck table, the coordinates of the position located, from the midpoint, at a distance of 1/2 of the distance between the 2 coordinates, on a straight line (linear function) that passes perpendicularly through the midpoint of the obtained coordinates (X11, Y11) of the feature point and the coordinates (X12, Y12) of the feature point; and the second center coordinate detecting step may detect the coordinates of 3 or more points at which the detection member is positioned on the outer periphery of the chuck table and the detection result changes abruptly, and obtain the coordinates (X2, Y2) of the center of the chuck table from the detected coordinates of the 3 or more points.
In the center positioning method, the first center coordinate detecting step may image a region including the outer periphery of the chuck table, detect, from the captured image, the coordinates of 3 or more points at which the imaging center of the imaging member is positioned on the outer periphery, and obtain the coordinates (X1, Y1) of the center of the chuck table from the detected coordinates of the 3 or more points; and the second center coordinate detecting step may detect the coordinates of 3 or more points at which the detection member is positioned on the outer periphery of the chuck table and the detection result changes abruptly, and obtain the coordinates (X2, Y2) of the center of the chuck table from the detected coordinates of the 3 or more points.
In the above-described center positioning method, the imaging member may be disposed on a first processing member coupled to the first Y-axis moving member, and the detection member may be disposed on a second processing member coupled to the second Y-axis moving member.
In the above-described center positioning method, the first Y-axis moving member and the second Y-axis moving member may be one Y-axis moving member, and the imaging member and the detection member may be disposed on a processing member coupled to the Y-axis moving member.
The present invention has an effect of easily grasping the actual positional relationship between the imaging member and the detection member.
Drawings
Fig. 1 is a perspective view showing a configuration example of a processing apparatus for carrying out the centering method according to embodiment 1.
Fig. 2 is a plan view schematically showing an imaging center of an imaging unit of the machining apparatus shown in fig. 1.
Fig. 3 is a plan view schematically showing a detection center of a detection unit of the processing apparatus shown in fig. 1.
Fig. 4 is a plan view of a coordinate system showing an imaging position of an imaging unit and a detection position of a detection unit of the machining apparatus shown in fig. 1.
Fig. 5 is a flowchart showing a flow of the center positioning method according to embodiment 1.
Fig. 6 is a plan view showing a state where the image pickup unit picks up the characteristic points of the holding surface of the chuck table in the first center coordinate detection step shown in fig. 5.
Fig. 7 is a plan view showing a state in which the imaging means images the characteristic points of the holding surface of the chuck table rotated by a predetermined angle from the state of fig. 6 in the first center coordinate detection step shown in fig. 5.
Fig. 8 is a side view schematically showing a state where the detection unit is moved along the holding surface of the chuck table in the second center coordinate detection step of the positioning method shown in fig. 5.
Fig. 9 is a graph showing the detection result of the detection unit shown in fig. 8.
Fig. 10 is a plan view showing three points of the outer periphery of the chuck table determined according to the detection result of the detection unit shown in fig. 9.
Fig. 11 is a plan view schematically showing a state in which the imaging unit images three points on the outer periphery of the chuck table in the first center coordinate detection step of the center positioning method according to embodiment 2.
Fig. 12 is a perspective view showing a configuration example of a processing apparatus for performing the centering method according to embodiment 1 and embodiment 2.
Description of the reference symbols
1, 1-1: a processing device (apparatus); 10: a chuck table; 12: a center; 13: a feature point; 21: a first cutting unit (first processing member); 21-1: a cutting unit (machining member); 22: a second cutting unit (second processing member); 30: an imaging unit (imaging means); 31: an image; 32: an imaging center; 40: a detection unit (detection member); 42: a detection center; 51: an X-axis moving unit (X-axis moving member); 52: a first Y-axis moving unit (first Y-axis moving member); 52-1: a Y-axis moving unit (Y-axis moving member); 53: a second Y-axis moving unit (second Y-axis moving member); 200: a workpiece; ST1: a first center coordinate detection step; ST2: a second center coordinate detection step; ST3: an inter-coordinate distance calculating step; ST4: a center matching step.
Detailed Description
A mode (embodiment) for carrying out the present invention will be described in detail with reference to the drawings. The present invention is not limited to the contents described in the following embodiments. The components described below include substantially the same components as those easily conceived by those skilled in the art. Further, the following structures may be combined as appropriate. Various omissions, substitutions, and changes in the structure can be made without departing from the spirit of the invention.
[ embodiment mode 1 ]
A center positioning method according to embodiment 1 of the present invention will be described with reference to the drawings. Fig. 1 is a perspective view showing a configuration example of a processing apparatus for carrying out the centering method according to embodiment 1. Fig. 2 is a plan view schematically showing an imaging center of an imaging unit of the machining apparatus shown in fig. 1. Fig. 3 is a plan view schematically showing a detection center of a detection unit of the processing apparatus shown in fig. 1. Fig. 4 is a plan view of a coordinate system showing an imaging position of an imaging unit and a detection position of a detection unit of the machining apparatus shown in fig. 1. Fig. 5 is a flowchart showing a flow of the center positioning method according to embodiment 1.
(processing apparatus)
The centering method according to embodiment 1 is performed by the machining apparatus 1 shown in fig. 1 as the apparatus. The machining apparatus 1 is a cutting apparatus that performs cutting machining on a workpiece 200 shown in fig. 1. In embodiment 1, the workpiece 200 is a disc-shaped wafer, such as a semiconductor wafer or an optical device wafer, whose base material is silicon, sapphire, gallium, or the like. The workpiece 200 has devices 203 formed in the regions partitioned by a plurality of lines to divide 202 arranged in a lattice shape on the front surface 201.
The workpiece 200 of the present invention may be a so-called TAIKO (registered trademark) wafer in which the center portion is thinned and a thick portion is formed on the outer peripheral portion, and may be a rectangular package substrate, a ceramic substrate, a ferrite substrate, a substrate containing at least one of nickel and iron, or the like having a plurality of devices sealed with resin, in addition to the wafer. In embodiment 1, the back surface 204 of the workpiece 200 is supported by the ring frame 205 by being bonded to the adhesive tape 206 having the ring frame 205 attached to the outer peripheral edge thereof.
The machining apparatus 1 shown in fig. 1 is a cutting apparatus that holds a workpiece 200 on a chuck table 10 and performs a cutting process (corresponding to a machining process) along a line to divide 202 by a cutting tool 23. As shown in fig. 1, a processing apparatus 1 includes: a chuck table 10 having a circular planar shape, which sucks and holds the workpiece 200 by a holding surface 11; a first cutting unit 21 as a first processing member that cuts the workpiece 200 held by the chuck table 10 by a cutting tool 23; a second cutting unit 22 as a second processing member for cutting the workpiece 200 held by the chuck table 10 by a cutting tool 23; an imaging unit 30 as an imaging means for imaging the workpiece 200 held by the chuck table 10; a detection unit 40 as a detection means for measuring the height of the workpiece 200 held by the chuck table 10; and a control unit 100.
As shown in fig. 1, the machining apparatus 1 includes a moving unit 50 that moves the chuck table 10 and the cutting unit 20 relative to each other. The moving unit 50 has: an X-axis moving unit 51 that performs machining feed of the chuck table 10 in an X-axis direction parallel to the horizontal direction; a first Y-axis moving unit 52 that index-feeds the first cutting unit 21 and the photographing unit 30 in a Y-axis direction parallel to the horizontal direction and perpendicular to the X-axis direction; a second Y-axis moving unit 53 that index-feeds the first cutting unit 21 and the detecting unit 40 in the Y-axis direction; a first Z-axis moving unit 54 that performs a cutting-in feed of the first cutting unit 21 in a Z-axis direction parallel to a vertical direction perpendicular to both the X-axis direction and the Y-axis direction; a second Z-axis moving unit 55 that performs a cutting-in feed of the second cutting unit 22 in the Z-axis direction; and a rotation moving unit 56 that rotates the chuck table 10 about an axis parallel to the Z-axis direction. As shown in fig. 1, the machining apparatus 1 is a 2-spindle cutting machine having a first cutting unit 21 and a second cutting unit 22, and is a so-called facing dual type (cutting apparatus).
The X-axis moving unit 51 is an X-axis moving member that moves the chuck table 10 in the X-axis direction, which is a machining feed direction, and thereby relatively feeds the chuck table 10 and the cutting unit 20 in the X-axis direction. The first Y-axis moving unit 52 is a first Y-axis moving member that relatively index-feeds the chuck table 10, the first cutting unit 21, and the photographing unit 30 in the Y-axis direction by moving the first cutting unit 21 and the photographing unit 30 in the Y-axis direction, which is an index-feed direction. The second Y-axis moving unit 53 is a second Y-axis moving member that relatively indexes the chuck table 10, the second cutting unit 22, and the detection unit 40 in the Y-axis direction by moving the second cutting unit 22 and the detection unit 40 in the Y-axis direction, which is the index feeding direction.
The first Z-axis moving unit 54 is a first Z-axis moving member that moves the first cutting unit 21 and the imaging unit 30 in the cutting feed direction, that is, the Z-axis direction, and thereby cuts and feeds the chuck table 10, the first cutting unit 21, and the imaging unit 30 relatively in the Z-axis direction. The second Z-axis moving means 55 is a second Z-axis moving member that moves the second cutting means 22 and the detection means 40 in the direction of the cutting feed, that is, the Z-axis direction, and thereby cuts and feeds the chuck table 10, the second cutting means 22, and the detection means 40 in the Z-axis direction in a relative manner.
The X-axis moving means 51, the Y-axis moving means 52, 53, and the Z-axis moving means 54, 55 include a known ball screw provided to be rotatable about the axis, a known motor for rotating the ball screw about the axis, and a known guide rail for supporting the chuck table 10 or the cutting unit 20 to be movable in the X-axis direction, the Y-axis direction, or the Z-axis direction.
The chuck table 10 has a disk shape, and the holding surface 11 for holding the workpiece 200 is formed of porous ceramics or the like. The chuck table 10 is provided to be movable in the X-axis direction by the X-axis moving unit 51 over the entire range of a machining region below the cutting unit 20 and a carrying-in/out region that is separated from the lower side of the cutting unit 20 and carries in/out the workpiece 200, and is provided to be rotatable about an axis parallel to the Z-axis direction by the rotating moving unit 56. The chuck table 10 is connected to a vacuum suction source, not shown, and sucks and holds the workpiece 200 placed on the holding surface 11 by being sucked by the vacuum suction source. In embodiment 1, the chuck table 10 sucks and holds the back surface 204 side of the workpiece 200 through the adhesive tape 206.
The first cutting unit 21 and the second cutting unit 22 are cutting members to which a cutting tool 23 for cutting the workpiece 200 held on the chuck table 10 is detachably attached. The first cutting unit 21 is coupled to the first Y-axis moving unit 52, and is provided to be movable in the Y-axis direction by the first Y-axis moving unit 52 and movable in the Z-axis direction by the first Z-axis moving unit 54 with respect to the workpiece 200 held on the chuck table 10. The first cutting unit 21 is provided on one column portion of the gate-shaped support frame 3 erected from the apparatus main body 2 via the first Y-axis moving unit 52, the first Z-axis moving unit 54, and the like.
The second cutting unit 22 is connected to the second Y-axis moving unit 53, and is provided to be movable in the Y-axis direction by the second Y-axis moving unit 53 and movable in the Z-axis direction by the second Z-axis moving unit 55 with respect to the workpiece 200 held on the chuck table 10. The second cutting unit 22 is provided on the other column portion of the support frame 3 via the second Y-axis moving unit 53, the second Z-axis moving unit 55, and the like. The support frame 3 connects the upper ends of the column portions to each other via a horizontal beam. The first and second cutting units 21 and 22 can position the cutting tool 23 at an arbitrary position on the holding surface 11 of the chuck table 10 by the Y- axis moving units 52 and 53 and the Z- axis moving units 54 and 55.
Each of the cutting units 21 and 22 includes: an extremely thin cutting tool 23 as a cutting abrasive having a substantially annular shape; a spindle housing 24 provided to be movable in the Y-axis direction and the Z-axis direction by Y- axis moving units 52, 53 and Z- axis moving units 54, 55; and a spindle which is provided in the spindle housing 24 so as to be rotatable about the axis and serves as a rotation axis to which the cutting tool 23 is attached at the tip.
The imaging unit 30 is provided on the first cutting unit 21. In embodiment 1, the imaging unit 30 is fixed to the first cutting unit 21 so as to move integrally with the first cutting unit 21. The imaging unit 30 has a plurality of imaging elements that image the region to be divided of the workpiece 200 before cutting held on the chuck table 10. The imaging element is, for example, a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor. The imaging unit 30 images the workpiece 200 held on the chuck table 10 to obtain an image 31 such as the example shown in fig. 2, which is used for alignment of the workpiece 200 and the cutting tool 23 and the like, and outputs the obtained image 31 to the control unit 100.
In embodiment 1, as shown in fig. 2, the image 31 captured by the imaging unit 30 is a rectangle whose longitudinal direction is parallel to the Y-axis direction and whose short-side direction is parallel to the X-axis direction. The imaging center 32 of the imaging unit 30 is the center of each of the X-axis direction and the Y-axis direction of the image 31. The photographing unit 30 is opposed to a photographing center 32, i.e., a position indicating the center of the image 31, along the Z-axis direction.
The detection unit 40 is provided on the second cutting unit 22. In embodiment 1, the detection unit 40 is fixed to the second cutting unit 22 so as to move integrally with the second cutting unit 22. In embodiment 1, the detection unit 40 is a back pressure sensor that detects the height of the workpiece 200 held on the chuck table 10, that is, the position in the Z-axis direction; however, the present invention is not limited thereto, and the detection unit may be a laser displacement meter or a contact sensor. The detection unit 40 detects the height, that is, the position in the Z-axis direction, at the detection center 42 within the detection range 41 shown in fig. 3, and outputs the detection result to the control unit 100.
In embodiment 1, as shown in fig. 3, the detection range 41 of the detection unit 40 is circular. The detection center 42 of the detection unit 40 is the center of each of the detection ranges 41 in the X-axis direction and the Y-axis direction. The detection unit 40 is opposed to a position indicating the detection center 42 along the Z-axis direction.
The machining apparatus 1 further includes: an X-axis direction position detection unit 61 for detecting the position of the chuck table 10 in the X-axis direction; a first Y-axis direction position detection unit 62 for detecting the Y-axis direction positions of the first cutting unit 21 and the photographing unit 30; a second Y-axis direction position detection unit 63 for detecting the Y-axis direction positions of the second cutting unit 22 and the detection unit 40; a first Z-axis direction position detection unit 64 for detecting the Z-axis direction positions of the first cutting unit 21 and the photographing unit 30; and a second Z-axis direction position detection unit 65 for detecting the Z-axis direction positions of the second cutting unit 22 and the detection unit 40.
The X-axis direction position detection unit 61 and the Y-axis direction position detection units 62 and 63 can each be constituted by a linear scale parallel to the X-axis direction or the Y-axis direction and a reading head. The Z-axis direction position detection units 64 and 65 detect the Z-axis direction positions of the cutting units 21 and 22 from the pulses of the motors of the Z-axis movement units 54 and 55. The X-axis direction position detection unit 61, the Y-axis direction position detection units 62 and 63, and the Z-axis direction position detection units 64 and 65 output the X-axis direction position of the chuck table 10 and the Y-axis direction or Z-axis direction positions of the cutting unit 20, the imaging unit 30, and the detection unit 40 to the control unit 100.
In embodiment 1, the position in the Z-axis direction is determined based on the height from the holding surface 11 with the holding surface 11 of the chuck table 10 as a reference position. In embodiment 1, a coordinate system 301 (hereinafter referred to as a first coordinate system) defined by the X-axis direction and the Y-axis direction of the first cutting unit 21 and the imaging unit 30, and a coordinate system 302 (hereinafter referred to as a second coordinate system) defined by the X-axis direction and the Y-axis direction of the second cutting unit 22 and the detection unit 40 are the same in the X-axis direction and different in the Y-axis direction as shown in fig. 4.
In embodiment 1, the positions of the first cutting unit 21 and the imaging unit 30 in the X-axis direction and the Y-axis direction are determined by the distances parallel to the horizontal direction of the X-axis direction and the Y-axis direction from a reference position 301-1 (an example is shown in fig. 4, and hereinafter referred to as a first reference position) predetermined in the first coordinate system 301. The positions of the second cutting unit 22 and the detection unit 40 in the X-axis direction and the Y-axis direction are determined by distances parallel to the horizontal direction of the X-axis direction and the Y-axis direction from a reference position 302-1 (an example is shown in fig. 4, and hereinafter referred to as a second reference position) predetermined in the second coordinate system 302.
The control unit 100 controls each component of the machining apparatus 1 to cause the machining apparatus 1 to perform a machining operation on the workpiece 200. The control unit 100 is a computer having: an arithmetic processing device having a microprocessor such as a CPU (Central Processing Unit); a storage device having a memory such as a ROM (Read Only Memory) or a RAM (Random Access Memory); and an input/output interface device. The arithmetic processing device of the control unit 100 performs arithmetic processing in accordance with a computer program stored in the storage device, and outputs control signals for controlling the machining apparatus 1 to the components of the machining apparatus 1 via the input/output interface device.
The control unit 100 is connected to a display unit, not shown, including a liquid crystal display device or the like for displaying the state of the machining operation, images, and the like, and an input unit, not shown, used when the operator registers machining content information and the like. The input unit is configured by at least one of an external input device such as a touch panel and a keyboard provided in the display unit.
As shown in fig. 1, the control unit 100 includes a center coordinate detecting unit 101, a coordinate system matching unit 102, and a machining control unit 103. The center coordinate detecting unit 101 and the coordinate system matching unit 102 determine a relative relationship between the first coordinate system 301 and the second coordinate system 302.
The central coordinate detecting unit 101 obtains the coordinates (X1, Y1) of the center 12 (shown in fig. 4) of the chuck table 10 using the image 31 captured by the imaging unit 30 in the first coordinate system 301. The center coordinate detecting unit 101 obtains the coordinates (X2, Y2) of the center 12 of the chuck table 10 positioned at the position where the coordinates (X1, Y1) of the center 12 are obtained in the first coordinate system 301, using the detection result of the detecting unit 40 in the second coordinate system 302.
The coordinate system matching unit 102 obtains the relationship between the first coordinate system 301 and the second coordinate system 302 from the coordinates (X1, Y1) of the center 12 of the chuck table 10 obtained in the first coordinate system 301 and the coordinates (X2, Y2) of the center 12 of the chuck table 10 obtained in the second coordinate system 302.
The machining control unit 103 stores in advance the relative position between the imaging center 32 and the lower end of the cutting edge of the cutting tool 23 of the first cutting unit 21, and the relative position between the detection center 42 and the lower end of the cutting edge of the cutting tool 23 of the second cutting unit 22. Using the relationship between the first coordinate system 301 and the second coordinate system 302 obtained by the coordinate system matching unit 102 and the relative positions stored in advance, the machining control unit 103 controls each component, based on the image 31 captured by the imaging unit 30, the detection result of the detection unit 40, and the detection results of the X-axis direction position detection unit 61 and the Y-axis direction position detection units 62 and 63, so that the imaging center 32 matches the detection center 42, that is, so that the position imaged by the imaging unit 30 matches the position detected by the detection unit 40, and thereby controls the machining operation of the machining apparatus 1.
The functions of the center coordinate detecting unit 101, the coordinate system matching unit 102, and the machining control unit 103 are realized by the arithmetic processing unit executing a computer program stored in the storage device.
(center positioning method)
The centering method is performed when at least one of the chuck table 10, the imaging unit 30, and the detection unit 40 is newly installed, when at least one of the chuck table 10, the imaging unit 30, and the detection unit 40 is replaced due to wear or a failure, or the like. The center positioning method is a method of positioning the detection center 42 of the detection unit 40 at the imaging center 32 of the imaging unit 30, that is, a method of finding the relationship between the first coordinate system 301 and the second coordinate system 302, and is a method of matching the position imaged by the imaging unit 30 with the position detected by the detection unit 40. When the control unit 100 receives an instruction to start the centering method from the operator, the machining device 1 starts the centering method. As shown in fig. 5, the center positioning method includes a first center coordinate detecting step ST1, a second center coordinate detecting step ST2, an inter-coordinate distance calculating step ST3, and a center matching step ST 4.
(first center coordinate detecting step)
Fig. 6 is a plan view showing a state where the image pickup unit picks up the characteristic points of the holding surface of the chuck table in the first center coordinate detection step shown in fig. 5. Fig. 7 is a plan view showing a state in which the imaging means images the characteristic points of the holding surface of the chuck table rotated by a predetermined angle from the state of fig. 6 in the first center coordinate detection step shown in fig. 5.
The first center coordinate detecting step ST1 is a step of operating the X-axis moving means 51 and the first Y-axis moving means 52 to obtain the coordinates (X1, Y1) of the center 12 of the chuck table 10 in the first coordinate system 301 from the image 31 captured by the imaging means 30. In the first center coordinate detecting step ST1, the operator operates the input means to operate the X-axis moving means 51 and the first Y-axis moving means 52 so that the feature point 13 that is recognizable from the other portion of the holding surface 11 and is located on the outer periphery side of the center 12 faces the imaging means 30 in the Z-axis direction.
In the first center coordinate detecting step ST1, the operator operates the input means, and the feature point 13 is photographed by the photographing means 30 as shown in fig. 6. Then, in the first center coordinate detecting step ST1, the center coordinate detecting unit 101 of the control unit 100 obtains the coordinates (X11, Y11) of the feature point 13 in the first coordinate system 301 from the detection results of the X-axis direction position detecting means 61 and the first Y-axis direction position detecting means 62.
In the first center coordinate detecting step ST1, the operator operates the input means to operate the rotation moving means 56 to rotate the chuck table 10 by an arbitrary predetermined angle θ about the axis as shown in fig. 7. In embodiment 1, the predetermined angle θ is 90 degrees, but the present invention is not limited to 90 degrees.
In the first center coordinate detecting step ST1, the operator operates the input means to operate the X-axis moving means 51 and the first Y-axis moving means 52 so that the feature point 13 of the holding surface 11 faces the imaging means 30 in the Z-axis direction. In the first center coordinate detecting step ST1, the operator operates the input means and the image pickup means 30 picks up the image of the feature point 13. Then, in the first center coordinate detecting step ST1, the center coordinate detecting unit 101 of the control unit 100 obtains the coordinates (X12, Y12) of the feature point 13 in the first coordinate system 301 from the detection results of the X-axis direction position detecting means 61 and the first Y-axis direction position detecting means 62.
In the first center coordinate detecting step ST1, the center coordinate detecting unit 101 of the control unit 100 obtains the distance d between the 2 coordinates (X11, Y11) and the coordinates (X12, Y12) of the feature point 13 by using the following formula 1.
d = √((X12-X11)² + (Y12-Y11)²) … (formula 1)
In the first center coordinate detecting step ST1, the center coordinate detecting unit 101 of the control unit 100 obtains the distance r between the position defined by the coordinates (X11, Y11) and the center 12 of the chuck table 10 by using the following formula 2.
r = d/(2sin(θ/2)) … (formula 2)
In the first center coordinate detecting step ST1, the center coordinate detecting unit 101 of the control unit 100 obtains the coordinates (X1, Y1) of the center 12 of the chuck table 10 in the first coordinate system 301 by using the following equations 3 and 4.
X1 = (X11+X12)/2 ± √(r²-(d/2)²)·(Y12-Y11)/d … (formula 3)
Y1 = (Y11+Y12)/2 ∓ √(r²-(d/2)²)·(X12-X11)/d … (formula 4)
As described above, in the first center coordinate detecting step ST1 of the center positioning method according to embodiment 1, the coordinates (X1, Y1) of the center 12 of the chuck table 10 are obtained as the coordinates of the position, on the straight line 15, at which the straight lines 17 and 18 drawn to the coordinates (X11, Y11) and the coordinates (X12, Y12) form the angle θ, the straight line 15 passing through the midpoint 16 of the obtained coordinates (X11, Y11) of the feature point 13 and the coordinates (X12, Y12) of the feature point 13 and being perpendicular to the straight line 14 connecting the coordinates (X11, Y11) and the coordinates (X12, Y12). In other words, the first center coordinate detecting step ST1 of the center positioning method according to embodiment 1 obtains, as the coordinates (X1, Y1) of the center 12 of the chuck table 10, the coordinates of the position located, from the midpoint, at a distance of 1/2 of the distance d between the 2 coordinates (X11, Y11) and (X12, Y12), on a straight line (linear function) that passes perpendicularly through the midpoint of the obtained coordinates (X11, Y11) of the feature point 13 and the coordinates (X12, Y12) of the feature point 13.
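For illustration only (this is not part of the patent disclosure), the calculation corresponding to formulas 1 to 4 as reconstructed above can be sketched in Python as follows. The function name and the numeric example are assumptions, and the sketch returns both candidate positions on the perpendicular bisector, since the patent text fixes a single center by the rotation direction of the chuck table.

```python
import math

def chuck_center_from_feature_point(p1, p2, theta_deg):
    """Estimate the chuck table center from two images of the same feature point,
    taken before and after rotating the table by theta_deg about its center.

    p1, p2: (x, y) coordinates (X11, Y11) and (X12, Y12) of the feature point.
    Returns both candidate centers on the perpendicular bisector of p1-p2; which
    one is the real center depends on the rotation direction, which this sketch
    does not assume.
    """
    (x11, y11), (x12, y12) = p1, p2
    theta = math.radians(theta_deg)

    # formula 1: distance d between the two observed feature-point coordinates
    d = math.hypot(x12 - x11, y12 - y11)

    # formula 2: distance r from the feature point to the center
    # (the chord d subtends the rotation angle theta at the center)
    r = d / (2.0 * math.sin(theta / 2.0))

    # midpoint of the chord and a unit vector along its perpendicular bisector
    mx, my = (x11 + x12) / 2.0, (y11 + y12) / 2.0
    ux, uy = (y12 - y11) / d, -(x12 - x11) / d

    # formulas 3 and 4: step from the midpoint along the perpendicular bisector
    h = math.sqrt(max(r * r - (d / 2.0) ** 2, 0.0))
    return (mx + h * ux, my + h * uy), (mx - h * ux, my - h * uy)

# Assumed example: feature point at (30, 0), then at (0, 30) after a 90-degree
# rotation; one of the two candidates is (0, 0), the actual center of rotation.
print(chuck_center_from_feature_point((30.0, 0.0), (0.0, 30.0), 90.0))
```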
(second center coordinate detecting step)
Fig. 8 is a side view schematically showing a state where the detection unit is moved along the holding surface of the chuck table in the second center coordinate detection step of the positioning method shown in fig. 5. Fig. 9 is a graph showing the detection result of the detection unit shown in fig. 8. Fig. 10 is a plan view showing three points of the outer periphery of the chuck table determined according to the detection result of the detection unit shown in fig. 9.
The second center coordinate detecting step ST2 is a step of operating the X-axis moving means 51 and the second Y-axis moving means 53 to obtain the coordinates (X2, Y2) of the center 12 of the chuck table 10 in the second coordinate system 302 from the detection result of the detecting means 40. In the second center coordinate detecting step ST2, the operator operates the input means to operate the X-axis moving means 51 and the second Y-axis moving means 53, and detects the position of the holding surface 11 in the Z-axis direction by the detecting means 40 while moving the detecting means 40 along the holding surface 11 as shown in fig. 8.
Then, as shown in fig. 9, a position 19 at which the position in the Z-axis direction changes abruptly appears in the detection result of the detection unit 40. The horizontal axis of fig. 9 represents the distance from a predetermined position in the direction parallel to the holding surface 11, and the vertical axis of fig. 9 represents the position in the Z-axis direction detected by the detection unit 40, with higher positions toward the top of fig. 9. The position 19 of the detection unit 40 when the detection result changes abruptly corresponds to a position located on the outer periphery of the chuck table 10.
In the second center coordinate detecting step ST2, the center coordinate detecting unit 101 of the control unit 100 specifies the positions 19-1, 19-2, and 19-3 of 3 or more points of the detecting unit 40 when the detection result abruptly changes, based on the detection result of the detecting unit 40. In embodiment 1, in the second center coordinate detecting step ST2, as shown in fig. 10, the center coordinate detecting unit 101 of the control unit 100 specifies 3 points, that is, the positions 19-1, 19-2, and 19-3 of the detecting unit 40 when the detection result abruptly changes, but in the present invention, the specified positions 19-1, 19-2, and 19-3 are not limited to 3 points.
In the second center coordinate detecting step ST2, the center coordinate detecting unit 101 of the control unit 100 obtains, in the second coordinate system 302, the coordinates (X21, Y21) of the position 19-1, the coordinates (X22, Y22) of the position 19-2, and the coordinates (X23, Y23) of the position 19-3 of the identified 3 points, based on the detection results of the X-axis direction position detecting unit 61 and the second Y-axis direction position detecting unit 63.
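As an aside for illustration (not part of the patent text), picking the positions 19 out of such a height scan could be sketched as follows; the threshold value, function name, and data layout are assumptions, since the patent only states that an abrupt change in the detection result marks the outer periphery.

```python
def find_edge_positions(scan, height_jump=1.0):
    """Return the (x, y) positions at which the detected height changes abruptly.

    scan: list of ((x, y), z) samples recorded while the detection unit 40 is
          moved along the holding surface 11 (fig. 8); z is the detected Z-axis
          position.
    height_jump: minimum height difference between neighbouring samples treated
          as an abrupt change, i.e. the detection unit crossing the outer
          periphery of the chuck table 10. The value 1.0 is a placeholder, not a
          value taken from the patent.
    """
    edges = []
    for (xy0, z0), (xy1, z1) in zip(scan, scan[1:]):
        if abs(z1 - z0) >= height_jump:
            edges.append(xy1)  # a position 19 where the detection result jumps
    return edges

# Assumed example: a scan crossing the table edge between the 3rd and 4th samples.
scan = [((0.0, y), 0.0) for y in (0.0, 1.0, 2.0)] + [((0.0, y), -30.0) for y in (3.0, 4.0)]
print(find_edge_positions(scan))  # -> [(0.0, 3.0)]
```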
Here, when the coordinates of the center 12 of the chuck table 10 in the second coordinate system 302 are (X2, Y2) and the radius of a circle passing through the three points 19-1, 19-2, and 19-3 is R, the following formula 5 is established.
(X-X2)² + (Y-Y2)² = R² … (formula 5)
In the second center coordinate detecting step ST2, the center coordinate detecting unit 101 of the control unit 100 substitutes the coordinates (X21, Y21), (X22, Y22), and (X23, Y23) into X, Y of formula 5 to obtain the coordinates (X2, Y2) of the center 12 of the chuck table 10 in the second coordinate system 302.
In this way, the second center coordinate detecting step ST2 of the center positioning method according to embodiment 1 detects the coordinates (X21, Y21), (X22, Y22), and (X23, Y23) of the 3 or more positions 19-1, 19-2, 19-3 at which the detection unit 40 is positioned on the outer periphery of the chuck table 10 and the detection result changes abruptly, and obtains the coordinates (X2, Y2) of the center 12 of the chuck table 10 in the second coordinate system 302 from the detected coordinates of the 3 or more positions 19-1, 19-2, 19-3.
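For illustration only, the following sketch solves formula 5 for (X2, Y2) from three detected periphery positions by eliminating R² between pairs of the equations; the function name and the numeric example are assumptions made here, not values from the patent.

```python
def circle_center_from_three_points(p1, p2, p3):
    """Solve formula 5 for the center (cx, cy) of the circle through p1, p2, p3.

    Subtracting the equation for p1 from those for p2 and p3 removes R**2 and
    leaves two linear equations in (cx, cy), solved here with Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2.0 * (x2 - x1), 2.0 * (y2 - y1)
    c1 = x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2.0 * (x3 - x1), 2.0 * (y3 - y1)
    c2 = x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        raise ValueError("the three points are collinear; no unique circle")
    cx = (c1 * b2 - c2 * b1) / det
    cy = (a1 * c2 - a2 * c1) / det
    return cx, cy

# Assumed example: three points on a circle of radius 150 centred at (10, -5).
print(circle_center_from_three_points((160.0, -5.0), (10.0, 145.0), (-140.0, -5.0)))
# -> approximately (10.0, -5.0)
```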
(inter-coordinate distance calculating Process)
The inter-coordinate distance calculating step ST3 is the following step: the X coordinate of the center 12 of the chuck table 10 in the first coordinate system 301 obtained based on the image 31 captured by the imaging unit 30 is set to X1, the X coordinate of the center 12 of the chuck table 10 in the second coordinate system 302 obtained based on the detection result of the detection unit 40 is set to X2, and the distance α between the centers 12 is obtained as (X1-X2); likewise, the Y coordinate of the center 12 of the chuck table 10 in the first coordinate system 301 obtained based on the image 31 captured by the imaging unit 30 is set to Y1, the Y coordinate of the center 12 of the chuck table 10 in the second coordinate system 302 obtained based on the detection result of the detection unit 40 is set to Y2, and the distance β between the centers 12 is obtained as (Y1-Y2). In the inter-coordinate distance calculating step ST3, the coordinate system matching unit 102 of the control unit 100 calculates X1-X2 to obtain the distance α, and calculates Y1-Y2 to obtain the distance β.
(center matching Process)
The center matching step ST4 is a step of matching the imaging center 32 of the imaging unit 30 with the detection center 42 of the detection unit 40 by positioning the detection center 42 of the detection unit 40 at the coordinates (X0-α, Y0-β) when the imaging center 32 of the imaging unit 30 is positioned at the coordinates (X0, Y0). In the center matching step ST4, when the coordinates of the imaging center 32 of the imaging unit 30 in the first coordinate system 301 are (X0, Y0) and the coordinates of the detection center 42 of the detection unit 40 in the second coordinate system 302 are (X02, Y02), the coordinate system matching unit 102 of the control unit 100 converts the coordinate system of the detection center 42 of the detection unit 40 from the second coordinate system 302 to the first coordinate system 301 by using the following formulas 6 and 7.
X02 = X0 - α … (formula 6)
Y02 = Y0 - β … (formula 7)
In this way, in the center matching step ST4, the coordinate system matching unit 102 of the control unit 100 converts the coordinate system of the detection center 42 of the detection unit 40 from the second coordinate system 302 to the first coordinate system 301, so that when the imaging center 32 of the imaging unit 30 is positioned at the position corresponding to the coordinates (X0, Y0), the detection center 42 of the detection unit 40 is positioned at the position corresponding to the coordinates (X0-α, Y0-β), and the imaging center 32 of the imaging unit 30 in the first coordinate system 301 and the detection center 42 of the detection unit 40 in the second coordinate system 302 are matched. In the center matching step ST4, the coordinate system matching unit 102 of the control unit 100 uses formulas 6 and 7 to determine, for arbitrary coordinates, the relationship between the coordinates (X0, Y0) in the first coordinate system 301 and the coordinates (X02, Y02) in the second coordinate system 302, and then the center positioning method ends.
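Combining the inter-coordinate distance calculating step ST3 with formulas 6 and 7, a minimal sketch (not part of the patent; function names and all numeric values are assumed for illustration) of what the coordinate system matching unit 102 computes might look like this:

```python
def center_offsets(center_img, center_det):
    """Inter-coordinate distance calculating step ST3: alpha = X1 - X2, beta = Y1 - Y2."""
    (x1, y1), (x2, y2) = center_img, center_det
    return x1 - x2, y1 - y2

def detection_target(imaging_target, alpha, beta):
    """Center matching step ST4: when the imaging center is positioned at (X0, Y0)
    in the first coordinate system, the detection center is sent to
    (X0 - alpha, Y0 - beta) in the second coordinate system (formulas 6 and 7)."""
    x0, y0 = imaging_target
    return x0 - alpha, y0 - beta

# Assumed values: the table center was found at (120.0, 80.0) by the imaging unit
# and at (118.7, 82.4) by the detection unit.
alpha, beta = center_offsets((120.0, 80.0), (118.7, 82.4))
print(detection_target((100.0, 50.0), alpha, beta))  # -> (98.7, 52.4)
```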
(machining operation of machining device)
In the machining apparatus 1, the operator registers the machining content information in the control unit 100, and places the workpiece 200 before cutting on the holding surface 11 of the chuck table 10. Then, the machining device 1 starts the machining operation when an instruction to start the machining operation is given from the operator. When the machining operation is started, the machining device 1 sucks and holds the back surface 204 side on the holding surface 11 of the chuck table 10 via the adhesive tape 206.
In the machining operation, the X-axis moving unit 51 of the machining apparatus 1 moves the chuck table 10 toward the machining area, the imaging unit 30 images the workpiece 200, and alignment is performed based on the image 31 captured by the imaging unit 30. The detection unit 40 detects the position of the workpiece 200 in the Z-axis direction.
The machining apparatus 1 divides the workpiece 200 into the individual devices 203 by cutting into the workpiece 200 along the lines to divide 202 with the cutting tool 23 while relatively moving the workpiece 200 and the cutting unit 20 along the lines to divide 202. When the machining apparatus 1 cuts the workpiece 200 along the lines to divide 202, the machining control unit 103 of the control unit 100 controls each component using the above formulas 6 and 7 so that the position imaged by the imaging unit 30 matches the position detected by the detection unit 40, and thereby controls the machining operation of the machining apparatus 1. The machining apparatus 1 cuts all the lines to divide 202, and ends the machining operation when the workpiece 200 has been divided into the individual devices 203.
As described above, the center positioning method according to embodiment 1 can obtain the relationship between the coordinates (X0, Y0) of the imaging center 32 of the imaging unit 30 and the coordinates (X02, Y02) of the detection center 42 of the detection unit 40 in the first coordinate system 301, based on the image 31 captured by the imaging unit 30, the detection result of the detection unit 40, and the detection results of the respective position detection units 61, 62, 63, and can grasp the positional relationship between the imaging unit 30 and the detection unit 40 in a state where the chuck table 10 is not mounted. As a result, the center positioning method according to embodiment 1 has an effect of easily grasping the actual positional relationship between the imaging unit 30 and the detection unit 40.
[ embodiment 2 ]
A center positioning method according to embodiment 2 of the present invention will be described with reference to the drawings. Fig. 11 is a plan view schematically showing a state in which the imaging unit images three points on the outer periphery of the chuck table in the first center coordinate detection step of the center positioning method according to embodiment 2. In fig. 11, the same components as those in embodiment 1 are denoted by the same reference numerals, and description thereof is omitted. The center positioning method according to embodiment 2 is the same as embodiment 1 except that the first center coordinate detecting step ST1 is different from embodiment 1.
In the first center coordinate detecting step ST1 of embodiment 2, the operator operates the input means to operate the X-axis moving means 51 and the first Y-axis moving means 52 so that the outer periphery of the chuck table 10 faces the imaging means 30 in the Z-axis direction. In the first center coordinate detecting step ST1 of embodiment 2, the operator operates the input means to intermittently rotate the chuck table 10 by a predetermined angle by the rotating and moving means 56, and while the chuck table 10 is stopped, the imaging means 30 images the region 33 including the outer periphery of the chuck table 10 at three or more places. In embodiment 2, the region 33 is imaged at 3 places by the imaging means 30, but the present invention is not limited to 3 places.
In the first center coordinate detecting step ST1 of embodiment 2, the center coordinate detecting unit 101 of the control unit 100 obtains the coordinates (X13, Y13) of the position 32-1, the coordinates (X14, Y14) of the position 32-2, and the coordinates (X15, Y15) of the position 32-3 of the imaging center 32 when the imaging unit 30 images each area 33, from the detection results of the X-axis direction position detecting unit 61 and the first Y-axis direction position detecting unit 62.
In the first center coordinate detecting step ST1 of embodiment 2, the center coordinate detecting unit 101 of the control unit 100 substitutes the respective coordinates (X13, Y13), (X14, Y14), and (X15, Y15) into X, Y of expression 5 to obtain the coordinates (X1, Y1) of the center 12 of the chuck table 10 in the first coordinate system 301.
As described above, the first center coordinate detecting step ST1 of the center positioning method according to embodiment 2 images an area including the outer periphery of the chuck table 10, detects coordinates (X13, Y13), (X14, Y14), and (X15, Y15) of 3 or more points of the imaging center 32 of the imaging unit 30 positioned on the outer periphery from the imaged image 31, and obtains the coordinates (X1, Y1) of the center 12 of the chuck table 10 from the detected coordinates (X13, Y13), (X14, Y14), and (X15, Y15) of the 3 or more points.
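Since formula 5 is reused here with imaging-center coordinates instead of detection positions, the same circle-center sketch given for the second center coordinate detecting step applies unchanged; the snippet below assumes that circle_center_from_three_points from that earlier sketch is available, and the coordinate values are assumed for illustration.

```python
# Reusing circle_center_from_three_points from the sketch for the second center
# coordinate detecting step; the inputs are now the imaging-center coordinates
# (X13, Y13), (X14, Y14), (X15, Y15) recorded at three captures of the periphery.
periphery_shots = [(260.0, 40.0), (110.0, 190.0), (-40.0, 40.0)]  # assumed values
x1_coord, y1_coord = circle_center_from_three_points(*periphery_shots)
print(x1_coord, y1_coord)  # -> approximately (110.0, 40.0) for these assumed points
```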
The center positioning method according to embodiment 2 can grasp the positional relationship between the imaging unit 30 and the detection unit 40 based on the image 31 captured by the imaging unit 30, the detection result of the detection unit 40, and the detection results of the respective position detection units 61, 62, and 63. As a result, the center positioning method of embodiment 2 has the effect of easily grasping the actual positional relationship between the imaging unit 30 and the detection unit 40, as in embodiment 1.
[ DEFORMATION ] OF THE PREFERRED EMBODIMENT
A center positioning method according to a modification of embodiment 1 or embodiment 2 of the present invention will be described with reference to the drawings. Fig. 12 is a perspective view showing a configuration example of a processing apparatus for performing the centering method according to embodiment 1 and embodiment 2. In fig. 12, the same parts as those in embodiment 1 are denoted by the same reference numerals, and description thereof is omitted.
The center positioning method according to the modification is the same as that of embodiment 1 except that it uses the machining apparatus 1-1. As shown in fig. 12, the machining apparatus 1-1 is an apparatus that includes only one cutting unit 21-1, one Y-axis moving unit 52-1, one Z-axis moving unit 54-1, one Y-axis direction position detecting unit 62-1, and one Z-axis direction position detecting unit 64-1, with the detecting unit 40 disposed in the cutting unit 21-1.
The cutting unit 21-1 of the machining apparatus 1-1 shown in fig. 12 has the same configuration as the first cutting unit 21 of embodiment 1 and the like, the Y-axis moving unit 52-1 has the same configuration as the first Y-axis moving unit 52 of embodiment 1 and the like, the Z-axis moving unit 54-1 has the same configuration as the first Z-axis moving unit 54 of embodiment 1 and the like, the Y-axis direction position detecting unit 62-1 has the same configuration as the first Y-axis direction position detecting unit 62 of embodiment 1 and the like, and the Z-axis direction position detecting unit 64-1 has the same configuration as the first Z-axis direction position detecting unit 64 of embodiment 1 and the like.
In this way, in the modification, the first Y-axis moving unit 52 and the second Y-axis moving unit 53 are the Y-axis moving unit 52-1 as one Y-axis moving unit, and the imaging unit 30 and the detection unit 40 are disposed on the cutting unit 21-1 as the machining member coupled to the Y-axis moving unit 52-1.
The center positioning method according to the modification can grasp the positional relationship between the imaging unit 30 and the detection unit 40 based on the image 31 captured by the imaging unit 30, the detection result of the detection unit 40, and the detection results of the respective position detection units 61, 62, and 63. As a result, the center positioning method according to the modification has the effect that the actual positional relationship between the imaging unit 30 and the detection unit 40 can be easily grasped, as in embodiment 1.
The present invention is not limited to the above embodiments. That is, various modifications can be made without departing from the scope of the present invention.

Claims (5)

1. A center positioning method of positioning a detection center of a detection member at a photographing center of a photographing member in an apparatus, the apparatus comprising:
a circular chuck table for holding a workpiece;
an X-axis moving member which moves the chuck table in an X-axis direction;
an imaging member that images the workpiece held by the chuck table;
a first Y-axis moving member that moves the photographing member in a Y-axis direction;
the detection component detects the height of the processed object held by the chuck worktable; and
a second Y-axis moving member that moves the detecting member in the Y-axis direction,
wherein,
the center positioning method comprises the following steps:
a first center coordinate detection step of operating the X-axis moving member and the first Y-axis moving member and obtaining the coordinates (X1, Y1) of the center of the chuck table from the image captured by the imaging member;
a second center coordinate detection step of operating the X-axis moving member and the second Y-axis moving member and obtaining the coordinates (X2, Y2) of the center of the chuck table from the detection result of the detection member;
an inter-coordinate distance calculating step of setting the X coordinate of the center of the chuck table obtained from the image captured by the imaging member to X1, setting the X coordinate of the center of the chuck table obtained from the detection result of the detection member to X2, calculating the inter-center distance α = (X1-X2), setting the Y coordinate of the center of the chuck table obtained from the image captured by the imaging member to Y1, setting the Y coordinate of the center of the chuck table obtained from the detection result of the detection member to Y2, and calculating the inter-center distance β = (Y1-Y2); and
a center matching step of matching the imaging center of the imaging member with the detection center of the detection member by positioning the detection center of the detection member at the coordinates (X0-α, Y0-β) when the imaging center of the imaging member is positioned at the coordinates (X0, Y0).
2. The center positioning method according to claim 1,
the first center coordinate detecting step is performed by imaging, with the imaging member, a feature point located on the outer peripheral side of the center of the chuck table to obtain the coordinates (X11, Y11) of the feature point, rotating the chuck table by an arbitrary angle and imaging the feature point again to obtain the coordinates (X12, Y12) of the feature point, and obtaining, as the coordinates (X1, Y1) of the center of the chuck table, the coordinates of the position located, from the midpoint, at a distance of 1/2 of the distance between the 2 coordinates, on a straight line (linear function) that passes perpendicularly through the midpoint of the obtained coordinates (X11, Y11) of the feature point and the coordinates (X12, Y12) of the feature point,
the second center coordinate detecting step detects the coordinates of 3 or more points at which the detection member is positioned on the outer periphery of the chuck table and the detection result changes abruptly, and obtains the coordinates (X2, Y2) of the center of the chuck table from the detected coordinates of the 3 or more points.
3. The center positioning method according to claim 1,
the first center coordinate detecting step images a region including the outer periphery of the chuck table, detects, from the captured image, the coordinates of 3 or more points at which the imaging center of the imaging member is positioned on the outer periphery, and obtains the coordinates (X1, Y1) of the center of the chuck table from the detected coordinates of the 3 or more points,
the second center coordinate detecting step detects the coordinates of 3 or more points at which the detection member is positioned on the outer periphery of the chuck table and the detection result changes abruptly, and obtains the coordinates (X2, Y2) of the center of the chuck table from the detected coordinates of the 3 or more points.
4. The center positioning method according to claim 1,
the imaging member is disposed on a first processing member coupled to the first Y-axis moving member,
the detection member is disposed on a second processing member coupled to the second Y-axis moving member.
5. The center positioning method according to claim 1,
the first Y-axis moving member and the second Y-axis moving member are one Y-axis moving member,
the imaging member and the detection member are disposed on a processing member coupled to the Y-axis moving member.
CN202011418739.6A 2020-12-07 2020-12-07 Center positioning method Pending CN114603726A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011418739.6A CN114603726A (en) 2020-12-07 2020-12-07 Center positioning method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011418739.6A CN114603726A (en) 2020-12-07 2020-12-07 Center positioning method

Publications (1)

Publication Number Publication Date
CN114603726A true CN114603726A (en) 2022-06-10

Family

ID=81855658

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011418739.6A Pending CN114603726A (en) 2020-12-07 2020-12-07 Center positioning method

Country Status (1)

Country Link
CN (1) CN114603726A (en)

Similar Documents

Publication Publication Date Title
JP7336960B2 (en) Centering method
JP2015229231A (en) Processing device for peripheral edge of plate material and processing method for peripheral edge of curved plate
JP7382833B2 (en) processing equipment
CN107768242B (en) Cutting method for workpiece
JP2018078145A (en) Cutting apparatus
CN110504191B (en) Inspection jig and inspection method
JP2016153154A (en) Wafer positioning method
JP6128977B2 (en) Plate material peripheral edge processing apparatus and processing accuracy measurement and correction method
JP6955975B2 (en) Wafer processing method
CN114603726A (en) Center positioning method
TWI854057B (en) Center positioning method
JP7300938B2 (en) Kerf recognition method
CN112397381A (en) Wafer processing method and cutting device
JP7550633B2 (en) DICING APPARATUS AND METHOD FOR INSPECTING DICING APPARATUS
CN111497047B (en) Origin position registration method for cutting device
CN111515915B (en) Alignment method
JP7232071B2 (en) Camera position deviation detection method in processing equipment
JP5538015B2 (en) Method of determining machining movement amount correction value in machining apparatus
CN115863211A (en) Processing device
TW202138114A (en) Cutting apparatus
JP2022081254A (en) Processing device
TW202427645A (en) Detection device, processing device and registration method reduce burden of operator and increase operational efficiency
JP2024161298A (en) Alignment apparatus and method
TWI600499B (en) Hard brittle plate grinding device and processing precision measurement and correction method
JP2022083252A (en) Processing device and processing method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination