CN116153807A - Processing device - Google Patents
- Publication number
- CN116153807A (application number CN202211402795.XA)
- Authority
- CN
- China
- Prior art keywords
- moving image
- unit
- moving
- image
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L21/00—Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
- H01L21/67—Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere
- H01L21/67005—Apparatus not specifically provided for elsewhere
- H01L21/67242—Apparatus for monitoring, sorting or marking
- H01L21/67259—Position monitoring, e.g. misposition detection or presence detection
- H01L21/67265—Position monitoring, e.g. misposition detection or presence detection of substrates stored in a container, a magazine, a carrier, a boat or the like
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/04—Programme control other than numerical control, i.e. in sequence controllers or logic controllers
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/18—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
- G05B19/409—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by using manual data input [MDI] or by using control panel, e.g. controlling functions with the panel; characterised by control panel details or by setting parameters
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L21/00—Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
- H01L21/67—Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere
- H01L21/67005—Apparatus not specifically provided for elsewhere
- H01L21/67011—Apparatus for manufacture or treatment
- H01L21/67092—Apparatus for mechanical treatment
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L21/00—Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
- H01L21/67—Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere
- H01L21/67005—Apparatus not specifically provided for elsewhere
- H01L21/67242—Apparatus for monitoring, sorting or marking
- H01L21/67276—Production flow monitoring, e.g. for increasing throughput
Abstract
The invention provides a processing apparatus capable of sharing information with high accuracy. The processing apparatus (1) comprises: a holding table (10) for holding a workpiece (200); a processing unit (20) for processing the workpiece held by the holding table; a display unit (110) that displays various information; and a control unit (100). The control unit includes: a moving image input unit (101) that receives a moving image captured by an imaging device (70); a moving image storage unit (102) that stores the moving image input to the moving image input unit; and a moving image playback unit (103) that plays a moving image stored in the moving image storage unit. The moving image input unit (101) causes the display unit to display the moving image input from the imaging device, and the moving image playback unit (103) causes the display unit to display the moving image being played.
Description
Technical Field
The present invention relates to a processing apparatus.
Background
After a semiconductor wafer on which a plurality of devices such as ICs (integrated circuits) and LSIs (large-scale integrated circuits) are formed is ground to a predetermined thickness by a grinding apparatus, the wafer is divided into individual device chips by a processing apparatus such as a dicing apparatus or a laser processing apparatus, and the divided device chips are used in electronic equipment such as mobile phones and personal computers.
In recent years, as the functions of the processing apparatuses used in such a series of semiconductor manufacturing steps have expanded, the operating methods, maintenance methods, and fault-handling methods for the various processes have become increasingly complicated.
Conventionally, when an operator or other user of such a processing apparatus did not know a maintenance method, a fault-handling method, or the like, the user resolved the question or problem by searching a thick manual for the relevant section or by asking a skilled person familiar with the apparatus. However, finding the relevant section in a thick manual can take a long time, and even when it is found, it is often difficult to understand intuitively. Relying on a skilled person has its own problems: such a person is not always nearby, and much time may pass before the problem is solved.
To address these problems, the following techniques have been considered: a technique for storing a previously created maintenance procedure in the processing apparatus and displaying it (see, for example, Patent Document 1); and techniques that allow notes to be freely written, or sounds to be recorded and played back, on the processing apparatus so that a memo can be left for another operator (see, for example, Patent Documents 2 and 3).
Patent Document 1: Japanese Laid-open Patent Publication No. 2010-49466
Patent Document 2: Japanese Laid-open Patent Publication No. 2014-231123
Patent Document 3: Japanese Laid-open Patent Publication No. 2020-183001
However, when a new function is added to the processing apparatus, for example, its maintenance method and fault-handling method have not yet been stored, so instruction is required each time and the operator must learn it anew. In addition, it is sometimes difficult to convey detailed operations of the apparatus through information sharing by writing or voice alone.
Disclosure of Invention
The present invention has been made in view of the above circumstances, and an object thereof is to provide a processing apparatus capable of sharing information with high accuracy.
In order to achieve the above object, a processing apparatus according to the present invention processes a workpiece, the processing apparatus comprising: a holding table for holding the workpiece; a processing unit for processing the workpiece held by the holding table; a display unit that displays various information; and a control unit that controls each constituent element, the control unit including: a moving image input unit that receives a moving image captured by an imaging device; a moving image storage unit that stores the moving image input to the moving image input unit; and a moving image playback unit that plays a moving image stored in the moving image storage unit, wherein the moving image input unit causes the display unit to display the moving image input from the imaging device, and the moving image playback unit causes the display unit to display the moving image being played.
In the processing apparatus, the control unit may include a moving image editing unit capable of editing the moving image stored in the moving image storage unit.
The invention has the effect of enabling information to be shared with high accuracy.
Drawings
Fig. 1 is a perspective view showing a configuration example of a processing apparatus according to embodiment 1.
Fig. 2 is a diagram showing an example of a display screen of a top menu displayed on the display unit of the processing apparatus shown in fig. 1.
Fig. 3 is a diagram showing an example of a display screen of a selection screen of a moving image displayed on the display unit when a button image of "moving image data" of the top menu shown in fig. 2 is pressed.
Fig. 4 is a diagram showing a moving image or the like displayed on the display unit when a button image of "record" of the moving image selection screen shown in fig. 3 is pressed.
Fig. 5 is a diagram showing the moving image data selection screen displayed on the display unit when the "play" or "edit" button image of the moving image selection screen shown in fig. 3 is pressed.
Fig. 6 is a diagram showing the display screen displayed on the display unit when the "play" button image of the moving image selection screen shown in fig. 3 is pressed and then the button image for the moving image named "flange cleaning" on the moving image data selection screen shown in fig. 5 is pressed.
Fig. 7 is a diagram showing the display screen displayed on the display unit when the "edit" button image of the moving image selection screen shown in fig. 3 is pressed and then the button image for the moving image named "flange cleaning" on the moving image data selection screen shown in fig. 5 is pressed.
Description of the reference numerals
1: processing apparatus; 10: holding table; 20: cutting unit (processing unit); 100: control unit; 101: moving image input unit; 102: moving image storage unit; 103: moving image playback unit; 104: moving image editing unit; 110: display unit; 151, 171: moving image; 200: workpiece.
Detailed Description
The mode (embodiment) for carrying out the present invention will be described in detail with reference to the accompanying drawings. The present invention is not limited to the following embodiments. The constituent elements described below include those that can be easily understood by those skilled in the art and those that are substantially the same. The structures described below may be appropriately combined. Various omissions, substitutions and changes in the structure may be made without departing from the spirit of the invention.
Embodiment 1
A processing apparatus according to embodiment 1 of the present invention will be described with reference to the accompanying drawings. Fig. 1 is a perspective view showing a configuration example of a processing apparatus according to embodiment 1.
(object to be processed)
The processing apparatus 1 of embodiment 1 shown in fig. 1 is a cutting apparatus that performs cutting on a workpiece 200. The workpiece 200 processed by the processing apparatus 1 is a wafer, such as a disk-shaped semiconductor wafer or optical device wafer, having a substrate 201 of silicon, sapphire, gallium, or the like. The workpiece 200 has a plurality of planned dividing lines 203 intersecting each other on its front surface 202, and devices 204 are formed in the regions defined by these lines 203. Each device 204 is an integrated circuit such as an IC (integrated circuit) or LSI (large-scale integration), an image sensor such as a CCD (charge-coupled device) or CMOS (complementary metal-oxide semiconductor) sensor, or a memory (semiconductor memory device).
In embodiment 1, as shown in fig. 1, an adhesive tape 209, which has a circular shape with a diameter larger than the outer diameter of the workpiece 200 and has an annular frame 210 attached to its outer edge portion, is attached to the back surface 206 of the workpiece 200, opposite the front surface 202, so that the workpiece 200 is supported in the opening of the annular frame 210. The workpiece 200 is divided into the individual devices 204 by cutting along the planned dividing lines 203.
(processing device)
The processing apparatus 1 shown in fig. 1 is a cutting apparatus that holds the workpiece 200 on the holding table 10, cuts it (this corresponds to the machining) along the planned dividing lines 203 with a cutting tool 21, and thereby divides the workpiece 200 into the individual devices 204.
As shown in fig. 1, the processing apparatus 1 includes: the holding table 10, which holds the workpiece 200 by suction on its holding surface 11; the cutting units 20, which cut the workpiece 200 held by the holding table 10; an imaging unit 30, which images the workpiece 200 held by the holding table 10; and the control unit 100. As shown in fig. 1, the processing apparatus 1 is a cutting apparatus having two cutting units 20, that is, a so-called facing dual-spindle cutting apparatus.
The processing apparatus 1 further includes a moving unit 40 that moves the holding table 10 and the cutting units 20 relative to each other. The moving unit 40 has at least: a machining feed unit 41 that performs machining feed of the holding table 10 in an X-axis direction parallel to the horizontal direction; an index feed unit 42 that performs index feed of the cutting unit 20 in a Y-axis direction parallel to the horizontal direction and perpendicular to the X-axis direction; a plunge feed unit 43 that performs plunge feed of the cutting unit 20 in a Z-axis direction parallel to the vertical direction and perpendicular to both the X-axis and Y-axis directions; and a rotary moving unit 44 that rotates the holding table 10 about an axis parallel to the Z-axis direction. That is, the moving unit 40 moves the holding table 10 and the cutting unit 20 relative to each other in the X-axis, Y-axis, and Z-axis directions and about the axis.
The machining feed unit 41 moves the holding table 10 and the rotary moving unit 44 in the X-axis direction as the machining feed direction, thereby relatively moving the cutting unit 20 and the holding table 10 in the X-axis direction. The indexing unit 42 moves the cutting unit 20 in the Y-axis direction as the indexing direction, thereby relatively moving the cutting unit 20 and the holding table 10 in the Y-axis direction. The plunge feed unit 43 moves the cutting unit 20 in the Z-axis direction as the plunge feed direction, thereby relatively moving the cutting unit 20 and the holding table 10 in the Z-axis direction.
The rotary movement unit 44 is supported by the machining feed unit 41, supports the holding table 10, and is disposed so as to be movable in the X-axis direction together with the holding table 10. The rotary movement unit 44 rotates the holding table 10 around the axis, thereby relatively rotating the cutting unit 20 and the holding table 10 around the axis.
The machining feed unit 41, the index feed unit 42, and the plunge feed unit 43 have: a known ball screw provided rotatably around an axis; a known motor that rotates a ball screw around an axis; and a well-known guide rail for supporting the holding table 10 or the cutting unit 20 so as to be movable in the X-axis direction, the Y-axis direction, or the Z-axis direction. The rotary movement unit 44 has a motor for rotating the holding table 10 around the axis.
The holding table 10 has a disk shape, and the holding surface 11 for holding the workpiece 200 is formed of porous ceramics or the like. The holding table 10 is provided so as to be movable in the X-axis direction between a machining area below the cutting unit 20 and a carry-in/out area for carrying in/out the workpiece 200 away from the lower side of the cutting unit 20 by the machining feed unit 41. The holding table 10 is rotatably provided around an axis parallel to the Z-axis direction by the rotary moving unit 44.
The holding surface 11 of the holding table 10 is connected to a suction source, not shown, and the holding surface 11 sucks by the suction source, thereby sucking and holding the workpiece 200 placed on the holding surface 11. In embodiment 1, the holding table 10 attracts and holds the back surface 206 side of the workpiece 200 via the adhesive tape 209. As shown in fig. 1, a plurality of clamping portions 12 for clamping the ring frame 210 are provided around the holding table 10.
The cutting unit 20 is a machining unit in which a cutting tool 21 is attached to a spindle 23 and cuts a workpiece 200 held by the holding table 10. The cutting units 20 are provided so as to be movable in the Y-axis direction by the indexing unit 42 and so as to be movable in the Z-axis direction by the plunge feeding unit 43, respectively, with respect to the workpiece 200 held by the holding table 10.
The cutting unit 20 is provided on a door-shaped support frame 3 erected from the apparatus main body 2 via an indexing unit 42, an plunge feeding unit 43, and the like. The cutting unit 20 can position the cutting tool 21 at an arbitrary position on the holding surface 11 of the holding table 10 by the index feed unit 42 and the plunge feed unit 43.
The cutting unit 20 has: a cutting tool 21; a spindle case 22 provided so as to be movable in the Y-axis direction and the Z-axis direction by the index feed unit 42 and the plunge feed unit 43; a spindle 23 rotatably provided to the spindle case 22 around an axis, the cutting tool 21 being attached to a tip of the spindle 23; and a spindle motor, not shown, for rotating the spindle 23 around the axis.
The cutting tool 21 is an extremely thin, substantially annular cutting blade fixed to the tip of the spindle 23. In embodiment 1, the cutting tool 21 is a so-called hub blade having: an annular circular base; and a circular cutting edge disposed on the outer periphery of the base for cutting the workpiece 200. The cutting edge is formed of abrasive grains such as diamond or CBN (cubic boron nitride) and a bond material such as metal or resin, and has a predetermined thickness. In the present invention, the cutting tool 21 may instead be a so-called washer blade consisting only of a cutting edge.
The axes of the cutting tool 21 and the spindle 23 of the cutting unit 20 are set parallel to the Y-axis direction.
The imaging unit 30 is fixed to the cutting unit 20 so as to move integrally with it. The imaging unit 30 includes an imaging element that, before cutting, images the region of the workpiece 200 held by the holding table 10 that is to be divided. The imaging element is, for example, a CCD (charge-coupled device) or CMOS (complementary metal-oxide semiconductor) imaging element. The imaging unit 30 images the workpiece 200 held by the holding table 10 to obtain an image for alignment, i.e., for positioning the workpiece 200 relative to the cutting tool 21, and outputs the obtained image to the control unit 100.
The processing apparatus 1 further includes: X-axis direction position detecting means, not shown, for detecting the X-axis position of the holding table 10; Y-axis direction position detecting means, not shown, for detecting the Y-axis position of the cutting unit 20; and Z-axis direction position detecting means for detecting the Z-axis position of the cutting unit 20. The X-axis and Y-axis position detecting means may each be constituted by a linear scale parallel to the corresponding axis and a read head. The Z-axis position detecting means detects the Z-axis position of the cutting unit 20 from motor pulses. These position detecting means output the X-axis position of the holding table 10 and the Y-axis and Z-axis positions of the cutting unit 20 (the position of the lower end of its cutting edge) to the control unit 100.
In embodiment 1, the X-axis, Y-axis, and Z-axis positions of the holding table 10 and the cutting unit 20 of the processing apparatus 1 are determined with respect to a predetermined reference position, not shown, that is, as distances in the X-axis, Y-axis, and Z-axis directions from that reference position. An XY coordinate of the processing apparatus 1 (the pair of the X-axis distance and the Y-axis distance from the reference position) can thus indicate any position on the workpiece 200 held by the holding surface 11 of the holding table 10.
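The coordinate convention described above can be sketched in a few lines of code. This is an illustrative model only; the type and function names are not from the patent, and the units (millimeters) are an assumption.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class StagePosition:
    """Position of the holding table / cutting unit, expressed as
    distances (assumed mm) from the fixed reference position."""
    x: float
    y: float
    z: float

def xy_coordinate(pos: StagePosition) -> tuple:
    # An XY coordinate is simply the pair of distances from the
    # reference position along the X and Y axes; it identifies a
    # point on the workpiece held on the holding surface.
    return (pos.x, pos.y)

p = StagePosition(x=12.5, y=-3.0, z=40.0)
assert xy_coordinate(p) == (12.5, -3.0)
```

Because every position is a distance from one shared reference, positions reported by the X-, Y-, and Z-axis detecting means can be compared and combined directly.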
The processing device 1 further includes: a cassette lifter 60 that mounts a cassette 50 that accommodates a plurality of work pieces 200 before and after cutting, and lifts and lowers the cassette 50 in the Z-axis direction; and a not-shown conveying unit that conveys the workpiece 200 between the cassette 50 and the holding table 10 while allowing the workpiece 200 to enter and exit the cassette 50.
The control unit 100 controls the components of the processing apparatus 1 to cause it to perform the machining operation on the workpiece 200. The control unit 100 is a computer having: an arithmetic processing device with a microprocessor such as a CPU (central processing unit); a storage device with memory such as ROM (read-only memory) and RAM (random-access memory); and an input/output interface device. The arithmetic processing device performs arithmetic processing in accordance with a computer program stored in the storage device and outputs control signals for controlling the processing apparatus 1 to its components via the input/output interface device.
Further, the control unit 100 is connected to an imaging device 70 that is separate from the imaging unit 30. The imaging device 70 is provided at an arbitrary position on the processing apparatus 1 and includes an imaging element, for example a CCD (charge-coupled device) or CMOS (complementary metal-oxide semiconductor) imaging element, that images an arbitrary position of the processing apparatus 1. The imaging device 70 captures a moving image of an arbitrary position of the processing apparatus 1 and outputs the captured moving image to the control unit 100.
In particular, the imaging device 70 preferably captures moving images, from the exterior to the interior of the processing apparatus 1, of a failure location, a maintenance location, or a location where a function has been added. A moving image is a video composed of still images acquired continuously, for example 30 still images per second; displaying those 30 still images in sequence within one second shows the moving state of an arbitrary position of the processing apparatus 1. The imaging device 70 is, for example, a well-known USB (universal serial bus) camera or a stereo camera.
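The "30 still images per second" model above can be made concrete with a small sketch that computes when each still image of a stored moving image should be shown during playback. The function name and the constant frame rate are illustrative assumptions, not part of the patent.

```python
def playback_schedule(num_frames: int, fps: int = 30) -> list:
    """Timestamps (seconds) at which each still image of a moving
    image should be displayed, assuming a constant frame rate of
    `fps` stills per second (e.g. 30, as described above)."""
    return [i / fps for i in range(num_frames)]

# One second of video at 30 fps: frames shown at 0, 1/30, 2/30, ...
times = playback_schedule(30)
assert len(times) == 30
assert times[0] == 0.0
assert abs(times[-1] - 29 / 30) < 1e-9
```

Displaying the stills at these timestamps reproduces the moving state of the recorded location; a different camera or frame rate only changes the `fps` argument.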
The processing apparatus 1 further includes: a display unit 110 connected to the control unit 100 and constituted by a liquid crystal display device or the like, which displays various information such as the state of the machining operation and images; and an input unit 120 with which an operator registers processing conditions and the like. The display unit 110 is controlled by the arithmetic processing device of the control unit 100 and displays various information stored in the storage device of the control unit 100. The display unit 110 includes, for example, a display panel such as a liquid crystal display or an organic electroluminescence display and a touch panel overlapping the display panel. The input unit 120 is constituted by the touch panel provided on the display unit 110 and external input devices such as a keyboard.
The control unit 100 includes a moving image input unit 101, a moving image storage unit 102, a moving image playback unit 103, and a moving image editing unit 104. When the imaging device 70 is connected to the control unit 100, the moving image input unit 101 receives the moving image captured by the imaging device 70 and causes the display unit 110 to display it.
The moving image storage unit 102 stores the moving image input to the moving image input unit 101. The moving image playback unit 103 plays a moving image stored in the moving image storage unit 102 and causes the display unit 110 to display it. The moving image editing unit 104 can edit the moving images stored in the moving image storage unit 102.
The functions of the moving image input unit 101, the moving image playback unit 103, and the moving image editing unit 104 are realized by the arithmetic processing device executing arithmetic processing in accordance with a computer program stored in the storage device. The function of the moving image storage unit 102 is realized by the storage device.
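Since these four sections are realized as a program on the control unit's computer, their division of labor can be sketched as one small class. Everything here (class name, method names, representing a moving image as a list of frames, trimming as the editing operation) is an illustrative assumption; the patent does not specify an implementation.

```python
class MovingImageController:
    """Sketch of the moving image input / storage / playback /
    editing sections of the control unit (units 101-104)."""

    def __init__(self):
        self._store = {}  # moving image storage section (102)

    def input_and_display(self, frames, display):
        # input section (101): show the live image on the display
        display.extend(frames)
        return frames

    def record(self, name, frames):
        # storage section (102): keep the input moving image by name
        self._store[name] = list(frames)

    def play(self, name, display):
        # playback section (103): show a stored image on the display
        display.extend(self._store[name])

    def edit(self, name, start, end):
        # editing section (104): e.g. trim to a frame range
        self._store[name] = self._store[name][start:end]

ctrl = MovingImageController()
screen = []  # stands in for the display unit 110
ctrl.record("flange cleaning", ["f0", "f1", "f2", "f3"])
ctrl.edit("flange cleaning", 1, 3)
ctrl.play("flange cleaning", screen)
assert screen == ["f1", "f2"]
```

The point of the sketch is the separation of concerns: storage persists what input received, while playback and editing operate only on what storage holds.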
Next, display screens displayed on the display unit 110 will be described with reference to the drawings. Fig. 2 shows an example of the top menu displayed on the display unit of the processing apparatus shown in fig. 1. Fig. 3 shows an example of the moving image selection screen displayed when the "moving image data" button image of the top menu shown in fig. 2 is pressed. Fig. 4 shows a moving image and the like displayed when the "record" button image of the moving image selection screen shown in fig. 3 is pressed. Fig. 5 shows the moving image data selection screen displayed when the "play" or "edit" button image of the moving image selection screen shown in fig. 3 is pressed. Fig. 6 shows the display screen displayed when the "play" button image of the moving image selection screen shown in fig. 3 is pressed and then the button image for the moving image named "flange cleaning" on the moving image data selection screen shown in fig. 5 is pressed. Fig. 7 shows the display screen displayed when the "edit" button image is pressed instead, followed by the same "flange cleaning" button image.
The display unit 110 displays the top menu 130 shown in fig. 2 under the control of the control unit 100. The top menu 130 is displayed on the display unit 110 after the power of the machining device 1 is turned on, before a machining operation is performed, and so on. In the main area 111 in the center of the top menu 130 shown in fig. 2, button images 131, 132, 133, 134, 135, 136, 137, and 138 of the respective modes "full-automatic", "manual operation", "device data", "tool maintenance", "operator maintenance", "machine maintenance", "engineering maintenance", and "moving image data" are displayed so as to be selectable. In the software keyboard area 112 below the main area 111, button images of item keys such as "tool setting" and "device data" are displayed so as to be selectable, and no button image is displayed in the input area 113.
When the control unit 100 accepts the pressing of the button image 138 of the "moving image data" of the top menu 130 shown in fig. 2, the display unit 110 is controlled by the control unit 100 to display the moving image selection screen 140 as shown in fig. 3. The moving image selection screen 140 is a screen for selecting whether to store (i.e., record) the moving image input from the imaging device 70, play a moving image stored in the moving image storage unit 102, or edit a moving image stored in the moving image storage unit 102.
The moving image selection screen 140 displays the following button images in the main area 111 so as to be selectable: a button image 141 of "record" for storing (i.e., recording) the moving image input from the imaging device 70; a button image 142 of "play" for playing the moving image stored in the moving image storage unit 102; and a button image 143 of "edit" for editing the moving image stored in the moving image storage unit 102. In addition, the moving image selection screen 140 displays a "return" button image 144 for returning to the top menu 130 in the input area 113.
When the moving image input unit 101 of the control unit 100 receives a press of the "record" button image 141 of the moving image selection screen 140 shown in fig. 3, the display unit 110 is controlled by the moving image input unit 101 of the control unit 100 to display the moving image 151 captured by the imaging device 70 in the main area 111 as shown in fig. 4. In addition, the display screen 150 shown in fig. 4, which displays the moving image 151 input from the imaging device 70 in the main area 111, displays the following button images in the input area 113: a "record" button image 152 for storing the moving image 151 in the moving image storage unit 102; a "stop" button image 153 for stopping the recording of the moving image 151; and a "moving image name input" button image 154 for storing a moving image name in the moving image storage unit 102 in association with the moving image stored in the moving image storage unit 102. In addition, the display screen 150 shown in fig. 4 displays a "return" button image 155 for returning to the moving image selection screen 140 in the input area 113.
In this way, when the moving image input unit 101 receives a press of the button image 141 of the "record" of the moving image selection screen 140 shown in fig. 3, the moving image 151 input from the imaging device 70 is displayed in the main area 111 of the display unit 110. In addition, when the control unit 100 accepts the pressing of the "back" button image 144 of the input area 113 of the selection screen 140 of the moving image, the display unit 110 displays the top menu 130.
When the moving image input unit 101 of the control unit 100 receives a press of the button image 152 of the "record" in the input area 113 of the display screen 150 of the moving image 151 input from the imaging device 70 displayed in the main area 111 shown in fig. 4, the moving image storage unit 102 starts storing the moving image 151 input from the imaging device 70 (i.e., records), and when the press of the button image 153 of "stop" is received, the moving image storage unit 102 stops storing the moving image 151 input from the imaging device 70 (i.e., records). After the moving image input unit 101 of the control unit 100 receives the pressing of the "stop" button image 153 and the moving image storage unit 102 stops storing (i.e., recording) the moving image 151 input from the imaging device 70, when the pressing of the "moving image name input" button image 154 is received, the moving image storage unit 102 stores the moving image name input from the input unit 120 in association with the stored moving image 151.
In this way, the moving image storage unit 102 stores the moving image 151 input to the moving image input unit 101, and stores the moving image name input from the input unit 120 in association with the moving image 151. When the moving image input unit 101 receives a press of the "return" button image 155 in the input area 113 of the display screen 150 of the moving image 151 input from the imaging device 70 displayed in the main area 111 as shown in fig. 4, the display unit 110 displays the moving image selection screen 140.
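The record, stop, and name-input flow just described can be modeled as a small state machine: recording starts from the live view, stopping enables name input, and presses that are not meaningful in the current state are ignored. State and button names below are assumptions for illustration, not part of the patent:

```python
# Valid (state, button) transitions for the recording screen (display screen 150).
RECORDING_TRANSITIONS = {
    ("live",      "record"): "recording",
    ("recording", "stop"):   "stopped",
    ("stopped",   "name"):   "named",      # "moving image name input" after stop
    ("stopped",   "record"): "recording",  # start a new recording
}

def press(state, button):
    """Return the next screen state; an invalid press leaves the state unchanged."""
    return RECORDING_TRANSITIONS.get((state, button), state)
```

For example, "moving image name input" has no effect while recording is in progress, matching the description that the name is stored after recording stops.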
The moving image 151 displayed in the main area 111 on the display screen 150 shown in fig. 4 includes a character 156 reading "live", showing that the moving image 151 input from the imaging device 70 is displayed in real time. The moving image 151 displayed in the main area 111 on the display screen 150 shown in fig. 4 is a moving image obtained by the imaging device 70 capturing the replacement work of the holding table 10. Accordingly, the moving image name associated with the moving image 151 stored in the moving image storage unit 102 is, for example, "holding table replacement". In the present invention, the moving image 151 stored in the moving image storage unit 102 is preferably, for the processing apparatus 1: a moving image prompting attention not described in the manual of the processing apparatus 1, such as attention accompanying aged deterioration; a moving image describing a new function; a moving image showing the situation when a failure occurs; or a moving image showing operations such as inspection and assembly.
When the control unit 100 accepts the pressing of the "play" button image 142 of the selection screen 140 of the moving image shown in fig. 3, the display unit 110 is controlled by the control unit 100 to display the selection screen 160 of the moving image data as shown in fig. 5. The selection screen 160 of moving image data is a screen displaying button images 161 and 162 of "moving image names" associated with moving images 151 and 171 (shown in fig. 4 and 6) stored in the moving image storage unit 102.
In the example shown in fig. 5, the selection screen 160 of the moving image data displays a button image 161 named "holding table replacement" and a button image 162 named "flange cleaning" in a selectable manner in the main area 111. The button image 161 named "holding table replacement" is associated with the moving image 151 named "holding table replacement" and allows it to be selected. The button image 162 named "flange cleaning" is associated with the moving image 171 (shown in fig. 6) named "flange cleaning" and allows it to be selected.
The moving image 151 named "holding table replacement" is a moving image obtained by the imaging device 70 capturing the work of replacing the holding table 10. The moving image 171 named "flange cleaning" is a moving image obtained by the imaging device 70 capturing the operation of cleaning the end surface 25 of the flange 24 (shown in fig. 6), which is mounted on the tip of the spindle 23 of the cutting unit 20 and supports the cutting tool 21. In addition, the selection screen 160 of the moving image data displays a "return" button image 163 for returning to the moving image selection screen 140 in the input area 113.
When the moving image playing unit 103 of the control unit 100 receives a press of either of the button images 161 and 162 of the selection screen 160 of the moving image data shown in fig. 5, the moving image playing unit 103 plays the moving image 151 or 171 associated with the pressed button image 161 or 162 and displays it in the main area 111 of the display unit 110. For example, when the moving image playing unit 103 of the control unit 100 receives a press of the button image 162 named "flange cleaning" on the selection screen 160 of the moving image data shown in fig. 5, the display unit 110 is controlled by the moving image playing unit 103 of the control unit 100 to display the moving image 171 named "flange cleaning" stored in the moving image storage unit 102 in the main area 111 as shown in fig. 6. The display screen 170 of fig. 6, which displays the moving image 171 named "flange cleaning" stored in the moving image storage unit 102 in the main area 111, displays the following button images in the input area 113: a "play" button image 172 for playing the moving image 171; and a "stop" button image 173 for stopping the play of the moving image 171. In addition, the display screen 170 shown in fig. 6 displays a "return" button image 174 for returning to the selection screen 160 of the moving image data in the input area 113.
In this way, when the moving image playing unit 103 accepts a press of either of the button images 161 and 162 showing the moving image names on the selection screen 160 of the moving image data shown in fig. 5, the first still image of the moving image 151 or 171 stored in the moving image storage unit 102 in association with the pressed button image 161 or 162 is displayed in the main area 111 of the display unit 110. For example, when the moving image playing unit 103 receives a press of the button image 162 named "flange cleaning" on the selection screen 160 of the moving image data shown in fig. 5, the first still image of the moving image 171 named "flange cleaning", stored in the moving image storage unit 102 in association with the pressed button image 162, is displayed in the main area 111 of the display unit 110. In addition, when the control unit 100 accepts a press of the "return" button image 163 in the input area 113 of the selection screen 160 of the moving image data shown in fig. 5, the display unit 110 displays the moving image selection screen 140.
When the moving image playing unit 103 accepts a press of the "play" button image 172 in the input area 113 of the display screen 170, on which the first still image of the moving image 151 or 171 stored in the moving image storage unit 102 in association with the pressed button image 161 or 162 is displayed in the main area 111, it starts playing the moving image 151 or 171 and displays the moving image 151 or 171 in the main area 111. When the moving image playing unit 103 accepts a press of the "stop" button image 173 in the input area 113 of the display screen 170, it stops the play of the moving image 151 or 171, and the still image of the moving image 151 or 171 at the moment the "stop" button image 173 was pressed is displayed in the main area 111.
For example, when the moving image playing section 103 receives a press of the "flange-cleaning" button image 162 of the moving image data selection screen 160, the first still image of the moving image 171 stored in the moving image storage section 102 is displayed in the main area 111, and when the press of the "play" button image 172 of the input area 113 of the display screen 170 is received, the playing of the moving image 171 called "flange-cleaning" is started, and the moving image 171 is displayed in the main area 111. When the moving image playing section 103 receives a press of the "stop" button image 173 of the input area 113 of the display screen 170, the playing of the moving image 171 called "flange cleaning" is stopped, and a still image when the "stop" button image 173 of the moving image 171 called "flange cleaning" is pressed is displayed in the main area 111.
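The playback behavior described above, a first still image on selection, advancing on "play", and freezing on "stop", can be sketched as follows (hypothetical names; plain values stand in for decoded video frames):

```python
class PlaybackViewer:
    """Models display screen 170: the first still image is shown on selection;
    "play" advances through frames; "stop" freezes on the current frame.
    A sketch, not the patent's implementation."""

    def __init__(self, frames):
        self._frames = frames
        self._index = 0        # first still image shown when the name is selected
        self._playing = False

    def current_still(self):
        return self._frames[self._index]

    def play(self, steps=1):
        # advance playback by the given number of frames, clamped to the end
        self._playing = True
        self._index = min(self._index + steps, len(self._frames) - 1)

    def stop(self):
        # the display stays frozen on current_still()
        self._playing = False
```

Selecting a name shows frame 0; pressing "stop" mid-play leaves the frame that was showing at that moment on screen, as described for button image 173.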
When the control unit 100 accepts the pressing of the "edit" button image 143 of the moving image selection screen 140 shown in fig. 3, the display unit 110 is controlled by the control unit 100 to display the moving image data selection screen 160 as shown in fig. 5. When the moving image editing unit 104 of the control unit 100 accepts the pressing of any one of the button images 161, 162 of the selection screen 160 of the moving image data shown in fig. 5, the display unit 110 is controlled by the moving image editing unit 104 to display the moving images 151, 171 associated with any one of the pressed button images 161, 162 in the main area 111.
For example, when the moving image editing unit 104 of the control unit 100 receives a press of the button image 162 named "flange cleaning" on the selection screen 160 of the moving image data shown in fig. 5, the display unit 110 is controlled by the moving image editing unit 104 of the control unit 100 to display the moving image 171 named "flange cleaning" stored in the moving image storage unit 102 in the main area 111 as shown in fig. 7. The display screen 180 shown in fig. 7, which displays the moving image 171 named "flange cleaning" stored in the moving image storage unit 102 in the main area 111, displays the following button images in the input area 113: a "play" button image 181 for playing the moving image 171; a "stop" button image 182 for stopping the play of the moving image 171; and a "reverse play" button image 183 for playing the moving image 171 in reverse. Here, reverse play refers to playing the moving image 171 backward toward its first still image.
In addition, the display screen 180 shown in fig. 7 displays the following button images in the input area 113: a "partial deletion" button image 184 for deleting a part of the moving image 171; a button image 185 for setting "start point setting" of a start point of the deleted part of the moving image 171; a button image 186 for setting "end point setting" of the end point of the deleted part of the moving image 171; a button image 187 for inputting "character input" such as characters to the moving image 171; and a button image 188 for returning to "back" of the selection screen 160 of moving image data.
In this way, when the moving image editing unit 104 accepts a press of either of the button images 161 and 162 showing the moving image names on the selection screen 160 of the moving image data shown in fig. 5, the first still image of the moving image 151 or 171 stored in the moving image storage unit 102 in association with the pressed button image 161 or 162 is displayed in the main area 111 of the display unit 110. For example, when the moving image editing unit 104 receives a press of the button image 162 named "flange cleaning" on the selection screen 160 of the moving image data shown in fig. 5, the first still image of the moving image 171 named "flange cleaning", stored in the moving image storage unit 102 in association with the pressed button image 162, is displayed in the main area 111 of the display unit 110.
When the moving image editing unit 104 accepts the pressing of the button image 181 of the "play" of the input area 113 of the display screen 180 of the first still image of the moving images 151, 171 stored in the moving image storage unit 102, which is associated with any of the pressed button images 161, 162, displayed in the main area 111, the playing of the moving images 151, 171 is started, and the moving images 151, 171 are displayed in the main area 111. When the moving image editing unit 104 accepts the pressing of the "stop" button image 182 in the input area 113 of the display screen 180, the playback of the moving images 151 and 171 is stopped, and a still image when the "stop" button image 182 of the moving images 151 and 171 is pressed is displayed in the main area 111. When the moving image editing unit 104 accepts the pressing of the button image 183 of "reverse play" in the input area 113 of the display screen 180, the moving images 151 and 171 are reverse played, and the reverse-played moving images 151 and 171 are displayed in the main area 111.
For example, when the moving image editing unit 104 receives a press of the "flange cleaning" button image 162 of the moving image data selection screen 160, the first still image of the moving image 171 stored in the moving image storage unit 102 is displayed in the main area 111. When the moving image editing unit 104 receives a press of the "play" button image 181 in the input area 113 of the display screen 180, play of the moving image 171 called "flange cleaning" is started, and the moving image 171 is displayed in the main area 111. When the moving image editing unit 104 receives a press of the "stop" button image 182 in the input area 113 of the display screen 180, the moving image 171 named "flange cleaning" is stopped from being played, and a still image when the "stop" button image 182 of the moving image 171 named "flange cleaning" is pressed is displayed in the main area 111. When the video editing unit 104 receives a press of the button image 183 of "reverse play" in the input area 113 of the display screen 180, the video 171 named "flange cleaning" is reverse played, and the video 171 named "flange cleaning" is displayed in the main area 111.
When the moving image editing unit 104 sequentially accepts presses of the "partial deletion" button image 184, the "start point setting" button image 185, and the "end point setting" button image 186 while the moving image 151 or 171, or a still image thereof, is displayed in the main area 111, the portion of the moving image 151 or 171 between the still image displayed in the main area 111 when the "start point setting" button image 185 was pressed and the still image displayed in the main area 111 when the "end point setting" button image 186 was pressed is deleted. When the moving image editing unit 104 receives a press of the "character input" button image 187 while the moving image 151 or 171, or a still image thereof, is displayed in the main area 111, characters or the like input from the input unit 120 are inserted at predetermined positions of the moving image 151 or 171, and the moving image 151 or 171 with the characters or the like inserted is stored in the moving image storage unit 102.
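Treating a stored moving image as a sequence of frames, the "partial deletion" and "character input" edits can be sketched like this (the index conventions and annotation model are assumptions for illustration):

```python
def delete_part(frames, start, end):
    """Delete the frames between the start point and end point (inclusive),
    mirroring the "start point setting" / "end point setting" /
    "partial deletion" buttons."""
    if not 0 <= start <= end < len(frames):
        raise ValueError("start/end points out of range")
    return frames[:start] + frames[end + 1:]

def insert_characters(frames, position, text):
    """Attach characters to a frame position, modeling "character input"
    as a per-frame annotation."""
    annotated = list(frames)
    annotated[position] = (annotated[position], text)
    return annotated
```

In a real editor the endpoints would be the frames showing when each setting button was pressed, and the edited result would be written back to the moving image storage unit 102.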
In embodiment 1, the moving image editing unit 104 performs, as editing of the moving images 151 and 171, "partial deletion" for deleting a part of the moving images 151 and 171 and "character input" for inserting characters at predetermined positions of the moving images 151 and 171. However, the present invention is not limited to these, and any editing of the moving images 151 and 171, such as inserting another moving image or a still image, may be performed. In the present invention, the moving image editing unit 104 may also insert a desired handwritten character or graphic into the moving image 151 or 171 when the operator operates the main area 111 while the moving image 151 or 171 is displayed there.
In this way, the moving image editing unit 104 can edit the moving images 151 and 171 stored in the moving image storage unit 102. When the moving image editing unit 104 of the control unit 100 receives a press of the "return" button image 188 in the input area 113 of the display screen 180 shown in fig. 7, the display unit 110 displays the selection screen 160 of moving image data. In this way, the display unit 110 of the processing apparatus 1 can display the moving images 151 and 171 acquired by the imaging device 70 in real time, and the control unit 100 can record and play the moving images 151 and 171 acquired by the imaging device 70.
The machining device 1 having the above-described configuration sets the machining conditions in the control unit 100, sets the cassette 50 accommodating the workpiece 200 in the cassette lifter 60, and starts the machining operation when the control unit 100 receives a start instruction of the machining operation.
When the machining operation is started, the control unit 100 controls the conveying unit to take out one workpiece 200 from the cassette 50, and the workpiece 200 is placed on the holding surface 11 of the holding table 10 in the carry-in/out area via the adhesive tape 209. In the machining operation, the machining device 1 sucks and holds the workpiece 200 on the holding surface 11 via the adhesive tape 209, and clamps the annular frame 210 by the clamp 12.
In the machining apparatus 1, the control unit 100 rotates the spindle 23 about its axis and controls the moving unit 40 to move the holding table 10 below the imaging unit 30, and the imaging unit 30 images the workpiece 200 sucked and held by the holding table 10 to perform alignment. In the machining operation, the control unit 100 controls the moving unit 40 and the like according to the machining conditions, and the machining device 1 cuts the workpiece 200 along the lines 203 by making the cutting tool 21 cut into the lines 203 down to the adhesive tape 209 while relatively moving the cutting tool 21 and the workpiece 200 along the lines 203.
The machining device 1 cuts the line 203 for dividing the workpiece 200 according to machining conditions to divide the workpiece 200 into the individual devices 204. When the machining device 1 cuts all the lines 203 for dividing the workpiece 200, the holding table 10 is moved from the machining area toward the carry-in/out area.
The processing apparatus 1 stops the movement of the holding table 10 in the carry-in/out area, stops the suction and holding of the workpiece 200 by the holding table 10, and releases the clamping by the clamping section 12. The machining device 1 controls the conveying means by the control means 100 to convey the machined workpiece 200 from the holding table 10 into the cassette 50. When the machining device 1 cuts all the objects to be machined 200 in the cassette 50, the machining operation is ended.
The control unit 100 of the processing apparatus 1 according to embodiment 1 described above includes: a moving image input unit 101 for displaying moving images 151 and 171 obtained by shooting by the shooting device 70 on the display unit 110; a moving image storage unit 102 for storing moving images 151 and 171 input to the moving image input unit 101; and a moving image playing unit 103 that plays the moving images 151 and 171 stored in the moving image storage unit 102 and displays the moving images on the display unit 110.
Therefore, the processing apparatus 1 can store moving images 151 and 171 of a maintenance method, a trouble-handling method when a new function is added, and the like, captured by the imaging device 70, in the moving image storage unit 102 while checking them on the display unit 110, and can therefore store moving images of appropriate maintenance methods, trouble-handling methods, and the like in the moving image storage unit 102. Further, by checking the moving images stored in the moving image storage unit 102, the detailed operation and the like of the processing apparatus 1, which are difficult to convey when sharing information by text, voice, and the like, can be easily grasped.
This enables the processing device 1 to share information with high accuracy.
In particular, since the machining device 1 can share, as moving images, operations such as those performed when an additional function is required or when a failure occurs, the user of the machining device 1 can easily convey the user's wishes and points of confusion to others, which contributes to better proposals and problem solving. Further, by storing in advance moving images of inspections during assembly of the machining device 1 and of work on it, the machining device 1 can make it easier to find the cause of a failure or the like. In addition, the processing device 1 also has the following effect: if the operation of a function under development is filmed, other persons who do not know the function can view the moving image when handling trouble or adding the function, so that the operation is easy to picture.
Further, since the processing apparatus 1 captures a moving image at an arbitrary position by the imaging device 70, the degree of freedom is high. In addition, when the imaging device 70 is a stereoscopic camera, the processing device 1 can obtain a moving image that has a wider field of view than a monocular camera and is easy to understand.
The present invention is not limited to the above embodiment. That is, various modifications may be made and implemented within a range not departing from the gist of the present invention. In embodiment 1, the machining device 1 is a cutting device, but in the present invention, the machining device is not limited to the cutting device, and may be, for example, various machining devices such as: a grinding device having a grinding means as a processing means for grinding the workpiece 200; a grinding device having a grinding unit as a processing unit for grinding the workpiece 200; and a laser processing apparatus having a laser irradiation unit as a processing unit for performing laser processing on the object 200. In the present invention, the workpiece 200 is not limited to a wafer, and may be a plate-shaped workpiece 200 such as various package substrates, ceramic substrates, or glass substrates.
In the above embodiment, the button image 138 described as "moving image data" is displayed in the main area 111 of the top menu 130, but the present invention is not limited thereto. For example, the button image 138 of "moving image data" may be displayed in a region (for example, the input region 113, the software keyboard region 112, or the like) different from the main region 111 so that the button image 138 of "moving image data" can be selected and operated even when the button image 131 of "full-automatic" or the button image 132 of "manual operation" is selected.
In the present invention, the control unit 100 may confirm the ID (identification) of the operator and set the display targets and editing authority for the moving images 151 and 171 for each ID, that is, for each operator. The moving images 151 and 171 to be played and edited may include various information such as text, sound, still images, graphics, and the shooting date.
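Per-operator display targets and editing authority, as suggested above, could be held in a simple table keyed by operator ID. The IDs and moving image names below are invented for illustration:

```python
# Hypothetical permission table: operator ID -> allowed moving images per action.
PERMISSIONS = {
    "op-001": {"view": {"flange cleaning", "holding table replacement"},
               "edit": {"flange cleaning"}},
    "op-002": {"view": {"flange cleaning"},
               "edit": set()},
}

def is_allowed(operator_id, action, moving_image_name):
    """Return True if the operator may perform the action ("view" or "edit")
    on the named moving image; unknown operators are denied everything."""
    return moving_image_name in PERMISSIONS.get(operator_id, {}).get(action, set())
```

The control unit 100 would consult such a table before displaying the "play" or "edit" button for a given moving image name.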
The processing device 1 may also capture the operator's ID (identification) with an imaging device and analyze it by image recognition, or perform face authentication from an image of the operator captured by an imaging device, and change the displayed screen depending on the operator so that, for example, a screen suited to a skilled person is displayed for a skilled person and a screen suited to a beginner is displayed for a beginner.
The processing device 1 may be provided with a bar code reader or the like that reads the ID of the cassette 50 or the like and sets the corresponding menu (record) data. In the present invention, therefore, the processing device 1 may read the ID of the cassette 50 with the attached imaging device at a designated timing and automatically set the corresponding menu data. Bar codes come in a plurality of types, such as one-dimensional codes and QR codes (registered trademark), but even if the operation changes, the same imaging device can handle them. In addition, even when the ID of the ring frame 210 is read with the attached imaging device, the processing device 1 can cope by adding the same imaging device, and it is not necessary to prepare a reader for each operation.
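Reading a cassette ID and setting the matching menu data amounts to a lookup from the decoded code to registered settings. The IDs and menu names here are invented for illustration:

```python
# Hypothetical mapping from a cassette ID (decoded from a bar code or QR code)
# to the menu data to set automatically.
MENU_DATA_BY_CASSETTE_ID = {
    "CASS-0001": "menu-A",
    "CASS-0002": "menu-B",
}

def set_menu_data(cassette_id):
    """Return the menu data for the read ID; an unregistered ID is an error
    the device would report rather than guessing a menu."""
    try:
        return MENU_DATA_BY_CASSETTE_ID[cassette_id]
    except KeyError:
        raise ValueError(f"no menu data registered for ID {cassette_id!r}")
```

The same lookup works whether the ID comes from a one-dimensional bar code, a QR code, or the ring frame 210, which is why one imaging device can cover all of them.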
In the present invention, if the moving images 151 and 171 captured by the imaging device 70 were stored at all times, their file size would increase, so recording may be started and stopped with a certain operation in the imaging field of the imaging device 70, or a certain operation in the processing device 1 (for example, a transition to a predetermined screen, a screen operation by the operator, or the like), as a trigger.
For example, as a 1st aspect, the processing apparatus 1 may install the imaging device 70 connected to the control unit 100 on a conveying unit or the like and start and end shooting by the imaging device 70 with the start and end of the machining operation as triggers. In this case, when a problem occurs, the processing device 1 can handle it by analyzing the moving image.
As a 2nd aspect, the processing device 1 may start and end shooting by the imaging device 70 with the detection results of sensors that detect the moving speed of each of the units 41, 42, 43, and 44 and of a sensor that detects a fall of the workpiece 200 as triggers. In this case, if an operation delay continues for a predetermined time or longer, the machining device 1 can issue an alarm and automatically stop the machining operation.
In addition, as a 3rd aspect, the processing device 1 may designate screens for operations in which operation errors are likely, such as a notch inspection error or tool replacement, and start and end shooting by the imaging device 70 with the display of such a screen as a trigger. In this case, the processing device 1 can determine the difference between the images before and after shooting and issue an alarm. Specifically, problems such as forgetting to attach the cutting tool 21, or damaging the workpiece 200 or the machining device 1 by leaving a replacement jig in the machining device 1, can be avoided.
In addition, as a 4th aspect, the processing device 1 may start and end shooting by the imaging device 70 with the opening and closing of the cover covering the device main body 2 as triggers. In this case, the processing device 1 can determine the difference between the images before and after shooting and issue an alarm. Specifically, there is an operation in which the device cover is opened due to a fall of the workpiece 200, a vacuum abnormality of the conveying unit, or the like, the workpiece 200 is taken out, and mass production is restarted; if production is restarted with a workpiece 200 still left in the processing apparatus, the workpiece 200 is damaged. By determining the difference between the images before and after shooting, forgetting to take out the workpiece 200 can be prevented.
In addition, as a 5th aspect, the processing device 1 may be provided with a plurality of imaging devices 70, at a position where the operator can be imaged and at a position where the inside of the processing device can be imaged, and start and end shooting by the imaging devices 70 with a screen operation that transitions to a specific screen on the touch-panel display unit 110 (for example, a designated screen such as a replacement screen for the cutting tool 21 or an error recovery screen) as a trigger. In this case, when an operator error occurs, the machining device 1 can record who performed the operation and the situation at the time it occurred. Further, by automatically splicing the moving image captured in this case onto the beginning of the moving images of the 1st, 2nd, 3rd, and 4th aspects exemplified above, "who did what" can be recorded in a single moving image.
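All five aspects reduce to mapping device events to recording start and stop. The event names below are illustrative assumptions; any of the triggers described (machining start/end, sensor detection, screen display, cover open/close) could be wired in the same way:

```python
# Hypothetical event names for the triggers described in aspects 1 to 5.
START_TRIGGERS = {"machining_started", "cover_opened", "error_screen_shown"}
STOP_TRIGGERS = {"machining_ended", "cover_closed", "error_screen_closed"}

class TriggeredRecorder:
    """Starts and stops recording in response to device events, so that
    moving images are stored only around the operations of interest
    instead of at all times."""

    def __init__(self):
        self.recording = False
        self.completed_clips = 0

    def on_event(self, event):
        if event in START_TRIGGERS and not self.recording:
            self.recording = True
        elif event in STOP_TRIGGERS and self.recording:
            self.recording = False
            self.completed_clips += 1   # one stored clip per start/stop pair
```

Recording only between triggers keeps the file size of the stored moving images bounded, which is the motivation the text gives for trigger-based recording.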
Claims (2)
1. A processing apparatus for processing a workpiece, characterized in that
the processing apparatus comprises:
a holding table that holds the workpiece;
a processing unit that processes the workpiece held by the holding table;
a display unit that displays various kinds of information; and
a control unit that controls each constituent element,
wherein the control unit includes:
a moving image input unit to which a moving image captured by an imaging device is input;
a moving image storage unit that stores the moving image input to the moving image input unit; and
a moving image playback unit that plays back the moving image stored in the moving image storage unit,
the moving image input unit causing the display unit to display the moving image input from the imaging device, and the moving image playback unit causing the display unit to display the moving image to be played back.
2. The processing apparatus according to claim 1, wherein,
the control unit includes a moving image editing unit capable of editing the moving image stored in the moving image storage unit.
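The structure recited in claims 1 and 2 can be sketched as follows. This is a hypothetical illustration only: the in-memory storage keyed by clip name and the display callback standing in for the display unit are assumptions introduced here, not part of the claims.

```python
class MovingImageStorageUnit:
    """Moving image storage unit: stores moving images by name."""

    def __init__(self):
        self._store = {}

    def save(self, name, moving_image):
        self._store[name] = list(moving_image)

    def load(self, name):
        return list(self._store[name])


class ControlUnit:
    """Control unit combining the input, storage, playback, and editing units."""

    def __init__(self, display):
        self.display = display  # callback standing in for the display unit
        self.storage = MovingImageStorageUnit()

    def input_moving_image(self, name, frames):
        # Moving image input unit: store the captured moving image
        # and show it on the display unit as it is input.
        self.storage.save(name, frames)
        for frame in frames:
            self.display(frame)

    def play_moving_image(self, name):
        # Moving image playback unit: show a stored moving image.
        for frame in self.storage.load(name):
            self.display(frame)

    def edit_moving_image(self, name, start, end):
        # Moving image editing unit (claim 2): trim a stored moving image.
        self.storage.save(name, self.storage.load(name)[start:end])
```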
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021188877A JP2023075769A (en) | 2021-11-19 | 2021-11-19 | Processing apparatus |
JP2021-188877 | 2021-11-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116153807A true CN116153807A (en) | 2023-05-23 |
Family
ID=86337868
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211402795.XA Pending CN116153807A (en) | 2021-11-19 | 2022-11-10 | Processing device |
Country Status (4)
Country | Link |
---|---|
JP (1) | JP2023075769A (en) |
KR (1) | KR20230073989A (en) |
CN (1) | CN116153807A (en) |
TW (1) | TW202322202A (en) |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010049466A (en) | 2008-08-21 | 2010-03-04 | Disco Abrasive Syst Ltd | Working device |
JP2014231123A (en) | 2013-05-29 | 2014-12-11 | 株式会社ディスコ | Processing device and information exchange method |
JP2020183001A (en) | 2019-05-08 | 2020-11-12 | 株式会社ディスコ | Processing device |
2021
- 2021-11-19 JP JP2021188877A patent/JP2023075769A/en active Pending

2022
- 2022-11-07 TW TW111142382A patent/TW202322202A/en unknown
- 2022-11-10 CN CN202211402795.XA patent/CN116153807A/en active Pending
- 2022-11-14 KR KR1020220151155A patent/KR20230073989A/en unknown
Also Published As
Publication number | Publication date |
---|---|
TW202322202A (en) | 2023-06-01 |
KR20230073989A (en) | 2023-05-26 |
JP2023075769A (en) | 2023-05-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6935168B2 (en) | Processing equipment | |
JP2009054904A (en) | Cutting method and cutting device | |
CN113146064A (en) | Processing device | |
JP2019038045A (en) | Dressing board, its usage method and cutting device | |
CN116153807A (en) | Processing device | |
JP6918421B2 (en) | Processing equipment and how to use the processing equipment | |
KR20180041585A (en) | Wafer processing method | |
JP7408235B2 (en) | processing equipment | |
JP2010040727A (en) | Method of dividing plate-like body | |
JP2021089938A (en) | Processing device | |
JP7455459B2 (en) | processing equipment | |
US11768478B2 (en) | Processing apparatus | |
JP5538015B2 (en) | Method of determining machining movement amount correction value in machining apparatus | |
JP7362334B2 (en) | Processing method | |
JP2024088288A (en) | Detection device, processing device, and registration method | |
JP7368138B2 (en) | processing equipment | |
JP2022116492A (en) | Machining device, program, and storage medium | |
JP2022083252A (en) | Processing device and processing method | |
JP2023068917A (en) | Processing device | |
JP2024089876A (en) | Machining device | |
CN115132610A (en) | Machining device and machining method | |
JP2023032082A (en) | Intersection region detection method | |
JP2022039182A (en) | Alignment method | |
JP2020109816A (en) | Diagnostic method | |
JP2022035316A (en) | Diametrical dimension detection method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination |