CN107186347B - Laser processing system and processing control method

Info

Publication number: CN107186347B (application published as CN107186347A)
Application number: CN201710034885.0A
Authority: CN (China)
Other languages: Chinese (zh)
Inventors: 福井浩, 二神义弘, 湊口伸哉, 芦原克充, 阪本达典
Applicant and current assignee: Omron Corp
Legal status: Active (granted)

Classifications

    • B23K26/032 Observing, e.g. monitoring, the workpiece using optical means
    • B23K26/042 Automatically aligning the laser beam
    • B23K26/082 Scanning systems, i.e. devices involving movement of the laser beam relative to the laser head
    • B23K26/0869 Devices involving movement of the laser head in at least one axial direction
    • B23K26/355 Working by laser beam for surface treatment: texturing
    • B23K26/359 Working by laser beam for surface treatment by providing a line or line pattern, e.g. a dotted break initiation line
    • B23K26/36 Removing material
    • B23K26/361 Removing material for deburring or mechanical trimming
    • B23K26/38 Removing material by boring or cutting
    • B23K26/382 Removing material by boring
    • B23K26/702 Auxiliary equipment
    • G06T1/20 Processor architectures; Processor configuration, e.g. pipelining
    • B23K2101/007 Marks, e.g. trade marks
    • G05B2219/45041 Laser cutting

Abstract

The laser marker of the present invention includes: a controller having an oscillator that oscillates laser light; and a marker head that scans the laser light over a processing surface of an object to be processed under the control of the controller. When a setting has been made for causing an image processing apparatus to execute a predetermined scene, the controller transmits a command instructing the image processing apparatus to execute that scene. Upon receiving the command, the image processing apparatus calculates the offset amount of the object from a reference position, using image data obtained by imaging the object with the marker head, and notifies the controller of the offset amount. The controller compensates the position at which the laser is scanned based on the offset amount, and then causes the marker head to scan.

Description

Laser processing system and processing control method
Technical Field
The present invention relates to a laser processing system having a laser processing device and a processing control method of the laser processing system.
Background
Conventionally, laser processing apparatuses that process an object (workpiece) with a laser beam are known. One type of laser processing apparatus is the laser marker, which irradiates the surface of a marking object (workpiece) with a laser beam to form characters, figures, and the like. In recent years, laser markers have been developed that can perform various kinds of processing, such as punching, peeling, and cutting, in addition to marking.
Japanese Patent Laid-Open No. 2013-086173 discloses a laser processing apparatus intended to easily correct an error in the laser irradiation position. The laser processing apparatus includes a mirror, an optical axis operating mechanism, a camera sensor, and an error correcting mechanism. These components are described below.
The mirror reflects the laser light from the laser oscillator toward a prescribed surface on which the object is placed. The optical axis operating mechanism positions the optical axis of the laser light at a desired target irradiation position by changing the direction of the mirror. The camera sensor captures an image, reflected in the mirror, of the target irradiation position and the region around it. The error correcting mechanism detects, with reference to the image captured by the camera sensor, the error between the target irradiation position instructed to the optical axis operating mechanism and the position of the actual laser optical axis on the prescribed surface. So that the laser beam is irradiated onto the target irradiation position during machining, the laser processing apparatus determines an instruction compensation amount to be applied to the optical axis operating mechanism based on this error.
Japanese Patent Laid-Open No. 2013-184171 discloses, as such a laser marker, a marking device for continuously marking a predetermined drawing pattern, without blurring, on a continuously conveyed marking object. The marking device includes a conveying unit, a marking-unit first moving unit, a marking-unit second moving unit, a marking position compensation unit, a marking position overlap control unit, a synchronous movement control unit, and an integrated control unit.
The conveying unit continuously conveys the marking object in a first direction. The marking-unit first moving unit moves the marking unit in the first direction. The marking-unit second moving unit moves the marking unit in a second direction. The integrated control unit controls the marking position overlap control unit and the synchronous movement control unit so that, while the marking object is continuously conveyed in the first direction, the marking light (laser) is irradiated for a predetermined time onto the same position on the marking object in both the first and second directions.
A laser marker marks by emitting laser light from the marker head toward positions determined by the marking layout. If the object is placed at a position deviating from the position assumed in the marking layout, marking cannot be performed at the desired position. To prevent this, an image processing apparatus (also referred to as a "vision sensor") can be used as described below.
More specifically, the image processing apparatus checks, using image data of the marking object captured by the camera, how far the marking object is displaced from a preset reference position. That is, the image processing apparatus calculates the offset amount. The image processing apparatus notifies the laser marker of the inspection result (the calculated offset amount) via a PLC (Programmable Logic Controller). The laser marker compensates the scanning position of the laser based on the received offset amount; specifically, it performs position compensation of the marking layout based on the offset amount. The integrated control unit in Japanese Patent Laid-Open No. 2013-184171 corresponds to such a PLC.
To realize this, however, the PLC must run a control program (for example, a ladder program) that controls the operations of the laser marker and the image processing apparatus. The user therefore has to create the control program in advance.
Disclosure of Invention
The present invention has been made in view of the above problems, and an object of the present invention is to provide a laser processing system in which a laser processing apparatus can perform laser processing together with an image processing apparatus without using a PLC.
According to one embodiment of the present invention, a laser processing system includes a laser processing apparatus and an image processing apparatus. The laser processing apparatus includes: a controller having an oscillator that oscillates laser light; and a head that scans the laser light over a processing surface of an object to be processed under the control of the controller. When a setting has been made for causing the image processing apparatus to execute a first process, the controller transmits a first command instructing the image processing apparatus to execute the first process. Upon receiving the first command, the image processing apparatus calculates the offset amount of the object from a reference position using image data obtained by imaging the object, and notifies the controller of the offset amount. The controller compensates the scanning position of the laser based on the offset amount, and then causes the head to scan.
Preferably, the image processing apparatus is capable of performing a plurality of processes including the first process. The controller transmits a second command for specifying the first process to the image processing apparatus before transmitting the first command. The image processing apparatus confirms, based on the second command, that the designated first process is included in the plurality of processes, and then notifies the controller of the confirmation result.
Preferably, the controller transmits the first command to the image processing apparatus after receiving the notification of the result of the confirmation.
Preferably, when, before receiving the first command, the image processing apparatus receives from the controller not the second command but a third command designating a second process that is not included in the plurality of processes, the image processing apparatus transmits to the controller a predefined notification indicating that the designated process cannot be executed.
Preferably, the controller transmits the second command to the image processing apparatus when information for specifying the processing target object is stored.
Preferably, the controller does not cause the head to scan if the offset amount has not been received from the image processing apparatus within a preset time after the first command was transmitted.
Preferably, the controller stores an application program for setting the layout of the marks to be marked with the laser. Via a user interface displayed when the application is run, the controller receives and stores the setting for causing the image processing apparatus to execute the first process.
According to another embodiment of the present invention, a laser processing system includes a laser processing apparatus and an image processing apparatus. The laser processing apparatus includes: a controller having an oscillator that oscillates laser light; and a marker head that scans the laser light over a processing surface of an object to be processed under the control of the controller. When a setting has been made for causing the image processing apparatus to execute a first process, the controller transmits a first command instructing the image processing apparatus to execute the first process. Upon receiving the first command, the image processing apparatus calculates the position of the object using image data obtained by imaging the object, and notifies the controller of the calculated position. The controller calculates the offset amount of the calculated position from a reference position, compensates the scanning position of the laser based on the offset amount, and then causes the marker head to scan.
According to yet another embodiment of the present invention, a processing control method is implemented in a laser processing system having: a controller having an oscillator that oscillates laser light; a marker head that scans the oscillated laser light over a processing surface of an object to be processed; and an image processing apparatus. The processing control method includes: a step in which, on the condition that a setting has been made for causing the image processing apparatus to execute a predetermined process, the controller transmits a command instructing the image processing apparatus to execute the process; a step in which, on the condition that the command has been received, the image processing apparatus calculates the offset amount of the object from a reference position using image data obtained by imaging the object; a step in which the image processing apparatus notifies the controller of the calculated offset amount; and a step in which the controller compensates the scanning position of the laser based on the offset amount and then causes the head to scan.
The above and other objects, features, embodiments and advantages of the present invention will become apparent from the following detailed description of the present invention with reference to the accompanying drawings.
Drawings
Fig. 1 is a block diagram showing a schematic configuration of a laser processing system.
Fig. 2 is a block diagram showing the structure of the laser processing system in more detail.
Fig. 3 is a block diagram showing hardware included in the control board.
Fig. 4 is a configuration diagram showing hardware included in the image processing apparatus.
Fig. 5 is a diagram showing a user interface displayed on a display device by the controller.
Fig. 6 is a flowchart for explaining a process flow in the laser processing system.
Fig. 7 is a diagram for explaining the compensation of the marking position performed by the laser marker.
Detailed Description
Embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the drawings, the same or corresponding portions are denoted by the same reference numerals, and their description will not be repeated. Hereinafter, a laser marker will be described as an example of the laser processing apparatus. The laser marker of the present embodiment has a function of marking characters, symbols, and the like, and may also perform processing other than marking, such as punching, peeling, and cutting.
<A. Overview of laser processing system 1>
Fig. 1 is a block diagram showing a schematic configuration of a laser processing system 1. Referring to Fig. 1, the laser processing system 1 includes a laser marker 2 and an image processing apparatus 3 (also referred to as a "vision sensor"). The laser marker 2 includes a controller 21 and a marker head 26.
The controller 21 controls the operation of the marker head 26. The controller 21 includes a laser oscillator that oscillates the laser light L; its details will be described later.
The marker head 26 has a camera unit 261. Under the control of the controller 21, the marker head 26 irradiates the laser light L onto the object 8 (the left object 8 in the state of Fig. 1) placed on the member 9 provided for placing the objects 8. Specifically, the marker head scans the laser light L over the processing surface of the object 8. In the example of Fig. 1, when the processing (a series of operations such as scanning) of one object 8 is finished, the member 9 is moved to the left (the direction of arrow A in the figure) so that the next object 8 (the right object 8 in Fig. 1) is irradiated with the laser light L.
The camera unit 261 has an imaging device (specifically, a camera) and a communication device. The imaging device of the camera unit 261 is configured to be able to image a predetermined area, with the object 8 as the subject. The communication device of the camera unit 261 transmits the image data obtained by imaging to the image processing apparatus 3 via the communication cable 12.
The marker head 26 is connected to the oscillator in the controller 21 via an optical fiber 28, and to the controller 21 via a control cable 29; specifically, the control cable 29 connects the marker head 26 to a control board in the controller 21. The controller 21 and the marker head 26 are connected in the same manner as in the related art, so a detailed description is omitted here.
In the laser processing system 1 of the present embodiment, the controller 21 of the laser marker 2 is directly connected to the image processing apparatus 3 via a LAN, typically Ethernet (registered trademark). Specifically, the controller 21 is directly connected to the image processing apparatus 3 via the Ethernet cable 11, without an intervening control device such as a PLC.
Details of the communication performed between the controller 21 and the image processing apparatus 3 will be described later (Fig. 6).
<B. Detailed structure of laser processing system 1>
Fig. 2 is a block diagram showing the structure of the laser processing system 1 in more detail. Referring to Fig. 2, as described above, the laser processing system 1 has the image processing apparatus 3, and the controller 21 and marker head 26 that constitute the laser marker 2.
The controller 21 includes a laser oscillator 240, a control board 210, a driver 220, and a driver power supply 230. A display device 6 and an input device 7 can be connected to the controller 21; they are used when the user changes the settings of the controller 21.
(b1. controller 21)
(1) Laser oscillator 240
The laser oscillator 240 includes an optical fiber 241; semiconductor lasers 242, 243, and 249A to 249D; isolators 244 and 246; couplers 245 and 248; and a band-pass filter 247.
The semiconductor laser 242 is a seed light source for generating seed light. The semiconductor laser 242 is driven by the driver 220 to generate pulsed seed light.
The isolator 244 transmits light in one direction only and blocks light incident from the opposite direction. Specifically, the isolator 244 transmits the seed light generated by the semiconductor laser 242 and blocks return light from the optical fiber 241. This prevents damage to the semiconductor laser 242.
The semiconductor laser 243 is an excitation light source for generating excitation light for exciting the rare earth element added to the core of the optical fiber 241.
The coupler 245 couples the seed light from the semiconductor laser 242 and the excitation light from the semiconductor laser 243, and causes the coupled light to enter the optical fiber 241.
The excitation light incident on the optical fiber 241 from the semiconductor laser 243 via the coupler 245 is absorbed by the rare earth element contained in the core of the optical fiber 241. The rare earth element is thereby excited, producing a population inversion. In this state, the seed light from the semiconductor laser 242 enters the core of the optical fiber 241 and causes stimulated emission, which amplifies the seed light (pulsed light). That is, the seed light is amplified by making the seed light and the excitation light incident on the fiber amplifier constituted by the optical fiber 241.
The isolator 246 transmits the pulse light output from the optical fiber 241, and blocks the light returning to the optical fiber 241.
The band-pass filter 247 is configured to transmit light in a prescribed wavelength band, specifically a band including the peak wavelength of the pulsed light output from the optical fiber 241. Any spontaneous emission light emitted from the optical fiber 241 is removed by the band-pass filter 247.
The laser light that has passed through the band-pass filter 247 enters, via the coupler 248, the optical fiber 28 provided for transmitting the laser light. The semiconductor lasers 249A to 249D generate excitation light so that this laser light is amplified in the optical fiber 28.
The coupler 248 couples the pulse light having passed through the band-pass filter 247 and the light from the semiconductor lasers 249A to 249D, and inputs the coupled light to the optical fiber 28.
The configuration of the laser oscillator 240 shown in Fig. 2 is merely an example; the present invention is not limited to it.
(2) Control board 210
The control board 210 includes a control unit 211, a pulse generation unit 212, a storage unit 213, and communication processing units 214 to 217.
The control unit 211 controls the overall operation of the controller 21 by controlling the pulse generation unit 212 and the driver 220. Specifically, the control unit 211 does so by running the operating system and application programs stored in the storage unit 213.
The pulse generation unit 212 generates an electric signal having a predetermined repetition frequency and a predetermined pulse width. Under the control of the control unit 211, the pulse generation unit 212 outputs the electric signal or stops outputting it. The electric signal from the pulse generation unit 212 is supplied to the semiconductor laser 242.
The storage unit 213 stores various data in addition to the operating system and the application programs. The data will be described later.
The communication processing unit 214 is an interface for communicating with the marker head 26. The control unit 211 transmits control signals to the marker head 26 via the communication processing unit 214 and the control cable 29.
The communication processing unit 215 is an interface for communicating with the image processing apparatus 3. The control unit 211 transmits various commands to the image processing apparatus 3 via the communication processing unit 215 and the Ethernet cable 11, and receives responses corresponding to those commands from the image processing apparatus 3 via the Ethernet cable 11 and the communication processing unit 215. Details of the data (commands, responses, and the like) exchanged with the image processing apparatus 3 via the communication processing unit 215 will be described later (Fig. 6).
The communication processing unit 216 receives an input from the input device 7. The communication processing unit 216 notifies the control unit 211 of the received input.
The communication processing unit 217 transmits the image data generated by the control unit 211 to the display device 6. The display device 6 then displays an image (user interface) generated based on the image data. An example of the user interface displayed on the display device 6 will be described later (Fig. 5).
(3) Driver 220 and driver power supply 230
The driver power supply 230 supplies power to the driver 220, and the driver 220 supplies drive currents to the semiconductor lasers 242, 243, and 249A to 249D, which oscillate when supplied with a drive current. The drive current supplied to the semiconductor laser 242 is modulated by the electric signal from the pulse generation unit 212, so the semiconductor laser 242 performs pulsed oscillation and outputs, as the seed light, pulsed light having the predetermined repetition frequency and pulse width described above. The semiconductor lasers 243 and 249A to 249D, on the other hand, are supplied with continuous drive currents by the driver 220, and therefore oscillate continuously and output continuous light as the excitation light.
(b2. marker head 26)
The marker head 26 has the camera unit 261, an isolator 262, a collimator lens 263, a galvano scanner 264, and a condenser lens 265. The isolator 262 passes the pulsed light output from the optical fiber 28 and blocks light returning toward the optical fiber 28. The pulsed light that has passed through the isolator 262 is output to the atmosphere from the collimator lens 263 attached to the isolator 262 and enters the galvano scanner 264. The galvano scanner 264 scans the laser light in at least one of a first axis direction (specifically, an axis parallel to arrow A in Fig. 1) and a second axis direction orthogonal to the first axis direction. The condenser lens 265 condenses the laser light L scanned by the galvano scanner 264.
(b3. image processing apparatus 3)
The image processing apparatus 3 includes a control unit 31, a storage unit 32, and communication processing units 33 and 34.
The control unit 31 controls the overall operation of the image processing apparatus 3 by running an operating system and an application program stored in the storage unit 32.
The storage unit 32 stores various data in addition to the operating system and the application program.
The communication processing unit 33 is an interface for communicating with the controller 21. The control unit 31 receives commands from the controller 21 via the Ethernet cable 11 and the communication processing unit 33, and transmits responses corresponding to those commands to the controller 21 via the communication processing unit 33 and the Ethernet cable 11.
The communication processing unit 34 is an interface for communicating with the camera unit 261 of the marker head 26. The control unit 31 receives image data from the camera unit 261 via the communication cable 12 and the communication processing unit 34.
(b4. hardware configuration of control board 210 and image processing apparatus 3)
Fig. 3 is a block diagram showing the hardware included in the control board 210. Referring to Fig. 3, the control board 210 includes a processor 110, a memory 120, a communication interface 130, and a pulse generation circuit 140.
The memory 120 includes, for example, a ROM (Read Only Memory) 121, a RAM (Random Access Memory) 122, and a flash memory 123. The flash memory 123 stores the operating system, the application programs, and the various data described above. The memory 120 corresponds to the storage unit 213 shown in Fig. 2.
The processor 110 controls the overall operation of the controller 21. The control unit 211 shown in Fig. 2 is realized by the processor 110 running the operating system and application programs stored in the memory 120. The various data stored in the memory 120 are referred to when an application is run.
The communication interface 130 is an interface for communicating with external devices (for example, the image processing apparatus 3, the marker head 26, the display device 6, and the input device 7). The communication interface 130 corresponds to the communication processing units 214 to 217 of Fig. 2.
The pulse generation circuit 140 corresponds to the pulse generation unit 212 of Fig. 2. That is, the pulse generation circuit 140 generates an electric signal having a predetermined repetition frequency and a predetermined pulse width based on instructions from the processor 110.
Fig. 4 is a configuration diagram showing the hardware included in the image processing apparatus 3. Referring to Fig. 4, the image processing apparatus 3 includes an arithmetic processing circuit 150, a memory 160, and a communication interface 170. The arithmetic processing circuit 150 includes a main processor 151 and an image processing dedicated processor 152.
The memory 160 includes, for example, a ROM 161, a RAM 162, and a flash memory 163. The flash memory 163 stores the operating system, the application programs, and the various data described above. The memory 160 corresponds to the storage unit 32 shown in Fig. 2. The memory 160 may also include a Hard Disk Drive (HDD).
The control unit 31 shown in Fig. 2 is realized by the arithmetic processing circuit 150 running the operating system and application programs stored in the memory 160. When an application is executed, the various data stored in the memory 160 (for example, the image data of the object 8 transmitted from the camera unit 261) are referred to.
The main processor 151 controls the overall operation of the image processing apparatus 3. The image processing dedicated processor 152 performs predetermined processing on the image data transmitted from the camera unit 261 of the marker head 26. An ASIC (Application Specific Integrated Circuit) for image processing may be provided instead of the image processing dedicated processor 152.
The communication interface 170 is an interface for communicating with external devices (for example, the controller 21 and the camera unit 261 of the marker head 26). The communication interface 170 corresponds to the communication processing units 33 and 34 of Fig. 2.
The hardware configurations shown in Figs. 3 and 4 are merely examples; the present invention is not limited to them.
<C. Advance input and storage>
(c1. controller 21)
Fig. 5 is a diagram showing a user interface 700 displayed on the display device 6 by the controller 21. The user interface 700 is realized by the control unit 211 running an application program stored in the storage unit 213. Specifically, the user interface 700 is displayed on the display device 6 connected to the controller 21 by the processor 110 running a program having a function for setting the layout of the marks.
The controller 21 can switch the screen mode in accordance with the workflow. Fig. 5 shows the screen of the edit mode, used when creating and editing marking data. Upon receiving a user operation clicking the button 703, the controller 21 switches from the edit mode screen to the operation mode screen used for actual marking and processing. Upon receiving a user operation clicking a button displayed on the operation mode screen, the controller 21 switches back from the operation mode screen to the edit mode screen.
To check the created and edited marking data, the user clicks the button 702. The controller 21 then displays a test marking screen on the display device 6. Using the test marking screen, the user can easily run a simulation with the pilot laser or with the laser used for actual marking.
The setting processing performed in the edit mode shown in Fig. 5 will be described below.
The user inputs the reference position of the object to be processed (hereinafter referred to as "reference position P") using the input items of the common settings. The reference position P is the position (ideal position) where the user assumes the object 8 will be located. The same position is set as the reference position in the image processing apparatus 3; in short, the controller 21 and the image processing apparatus 3 store the same position (more precisely, the same coordinate values) as the reference position. Hereinafter, the coordinates of the reference position P are denoted (xp, yp) in a coordinate system C composed of an X axis and a Y axis.
The user further draws letters, graphics, and symbols for marking using the drawing region 701. The coordinate system C is set in the drawing region 701.
With the tab 705 selected, the controller 21 receives user settings related to a function called DFL (Direct Finder Link). DFL is a function in which the controller 21 transmits commands directly to the image processing apparatus 3 and receives the corresponding responses from the image processing apparatus 3, thereby realizing various kinds of processing such as compensation of the marking position (compensation in the X-axis direction, the Y-axis direction, and the rotation angle) and a read check of a marked two-dimensional code (determining whether the two-dimensional code can be read).
The user interface 700 includes: a checkbox 710; an item 720 for connection settings related to the connected image processing apparatus 3; and an item 730 for setting the DFL execution items.
The checkbox 710 is used to set whether to enable DFL.
The item 720 has an input field 721 for entering an IP (Internet Protocol) address and a button 722 for starting a connection test. The IP address of the image processing apparatus 3 is entered in the input field 721.
The item 730 includes checkboxes 731 and 732, and input fields 733 to 736 in which the user enters numerical values and the like. The checkbox 731 is used to set whether to perform position compensation. The checkbox 732 is used to set whether to perform a read check of the two-dimensional code after marking the object 8.
The input field 733 receives a print position compensation scene number. A "scene" is a process that includes at least one processing item. For example, a scene includes a plurality of processing items such as preprocessing, edge detection, matching, and offset calculation; typically, these include a processing item corresponding to an inspection. Scenes are preset by the user, generally at least one for each type of object 8.
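As an illustration only (the patent defines no data structures for a scene), a scene can be thought of as a numbered sequence of processing items executed by the image processing apparatus 3. The following Python sketch is hypothetical; the names Scene, preprocess, detect_edges, match_model, and compute_offset are invented for this example.

    # Hypothetical model of a "scene": a numbered sequence of processing
    # items run by the image processing apparatus. Illustration only.
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Scene:
        number: int            # scene number, as entered in input field 733
        steps: List[Callable]  # ordered processing items

    def preprocess(img): ...      # e.g. filtering / binarization
    def detect_edges(img): ...    # find edges and end points of the object
    def match_model(img): ...     # match against a registered model
    def compute_offset(img): ...  # offset (xd, yd, theta) from the reference

    # Generally, at least one scene is preset per type of object 8:
    print_position_scene = Scene(
        number=5,
        steps=[preprocess, detect_edges, match_model, compute_offset],
    )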
The input field 734 receives a block number. A "block" is graphic data, such as a mark, characters, or a 2D code, used to laser-process the object 8. When a block number is entered, the controller 21 identifies the data used for processing the object 8.
A layer number is entered in the input field 735. A "layer" is a group of data having the same height, provided for objects 8 whose portions differ in height; position compensation can be performed for each layer.
The timeout time is entered in the input field 736. The timeout time is the upper limit of the time the controller 21 waits for a response from the image processing apparatus 3; that is, the time within which the controller 21 must receive a response from the image processing apparatus 3 after transmitting a command to it.
The user interface 700 further includes a button 750 for saving the entered settings as default values and a button 740 for restoring the values to the defaults.
The controller 21 can write the contents set via the user interface 700 (the marking settings) to an external memory, for example in the form of a file, or transmit them to an external device. The settings can thus be transferred to a laser marker (not shown) other than the laser marker 2.
As described above, the controller 21 stores an application program for setting the layout of the marks to be marked with the laser light L. Via the user interface 700 displayed when this application is run, the controller 21 receives and stores the setting for causing the image processing apparatus 3 to execute the scene specified by the print position compensation scene number (hereinafter also referred to as the "first scene"). As described above, the controller 21 also receives, via the user interface 700, the input of the IP address of the image processing apparatus 3 and the input of the information for specifying the object 8.
(c2. image processing apparatus 3)
Like the controller 21, the image processing apparatus 3 stores the coordinates (xp, yp) of the reference position P in the storage unit 32 by user setting.
In addition, a plurality of scenes are stored in the image processing apparatus 3; that is, the image processing apparatus 3 is configured to be able to execute each of the plurality of scenes. Which scene the image processing apparatus 3 executes is determined by an instruction (command) from the controller 21.
<D. Control structure in laser processing system 1>
Fig. 6 is a flowchart explaining the flow of processing in the laser processing system 1. For ease of understanding, Fig. 6 shows both the processing in the controller 21 of the laser marker 2 and the processing in the image processing apparatus 3.
Fig. 6 illustrates the processing in a state where, in the user interface 700 shown in Fig. 5, the checkbox 731 is checked, the checkbox 732 is unchecked, and a print position compensation scene number has been entered in the input field 733.
Referring to Fig. 6, in step S2 the controller 21 determines whether DFL is specified. Specifically, the controller 21 determines whether the checkbox 710 in the user interface 700 is checked; in short, whether the setting enabling the DFL function has been made.
When determining that DFL is not specified (NO in step S2), the controller 21 performs normal marking (printing). That is, the controller 21 performs marking without the offset-based position compensation described later. This ends the processing for one object 8.
On the other hand, if it determines that DFL is specified (YES in step S2), the controller 21 determines in step S4 whether a compensation target is stored. Specifically, the controller 21 determines whether a block number and a layer number have been specified in the user interface 700.
If it determines that no compensation target is stored (NO in step S4), the controller 21 ends the series of processing. If it determines that a compensation target is stored (YES in step S4), the controller 21 makes a scene switching request in step S6.
The "scene switching request" instructs the image processing apparatus 3 which scene it is to execute. Specifically, the scene switching request transmits a command based on the scene number entered in the user interface 700 (in the case of Fig. 5, the print position compensation scene number entered in the input field 733) to the device specified by the IP address (specifically, the image processing apparatus 3). That is, the scene switching request is a process of transmitting a command for specifying a scene to the image processing apparatus 3.
As described above, the controller 21 transmits this command to the image processing apparatus 3 directly via the Ethernet cable 11, without passing through a control device such as a PLC.
When the image processing apparatus 3 receives this command from the controller 21, it executes the command in step S102. Specifically, based on the command received from the controller 21, the image processing apparatus 3 determines whether the scene designated by the controller 21 (the scene specified by the print position compensation scene number) can be executed; more precisely, whether the designated scene is included in the plurality of scenes stored in advance.
When it confirms that the scene designated by the controller 21 is included in the plurality of scenes, the image processing apparatus 3 notifies the controller 21 of that confirmation result (specifically, a preset code) as the response corresponding to the command. When it determines that the designated scene is not included in the plurality of scenes, the image processing apparatus 3 likewise notifies the controller 21 of that confirmation result (specifically, a different preset code) as the response corresponding to the command.
On the condition that the image processing apparatus 3 has transmitted the confirmation result, the controller 21 receives it in step S8. In step S10, the controller 21 determines whether the scene switching request was approved. If the controller 21 does not receive a response from the image processing apparatus 3 within the time entered as the timeout time, it treats the request as "not approved".
If it determines that the request was not approved (NO in step S10), the controller 21 ends the series of processing. If it determines that the request was approved (YES in step S10), the controller 21 makes a scene execution request in step S12.
The "scene execution request" instructs the image processing apparatus 3 to execute the designated scene. Specifically, the scene execution request transmits to the image processing apparatus 3 a command for causing it to execute the scene notified to it by the scene switching request in step S6.
As with the command for the scene switching request, the controller 21 transmits the command for the scene execution request to the image processing apparatus 3 directly via the Ethernet cable 11, without passing through a control device such as a PLC.
When the image processing apparatus 3 receives this command from the controller 21, it executes the command in step S104. Specifically, the image processing apparatus 3 calculates the offset amount D of the object 8 from the reference position P using image data captured by the camera unit 261 of the marker head 26.
The image processing apparatus 3 notifies the controller 21 of the calculated offset amount D as the response corresponding to the command. If the calculation of the offset amount D fails, the image processing apparatus 3 instead notifies the controller 21 of that result (specifically, a preset code).
When the image processing apparatus 3 transmits the calculated offset amount D, or the code indicating that the offset calculation failed, the controller 21 receives that response (result) in step S14. In other words, the controller 21 receives the offset amount D or the above code from the image processing apparatus 3.
In step S16, the controller 21 determines whether the scene execution request was approved; specifically, whether the offset amount D could be acquired from the image processing apparatus 3. Here too, if the controller 21 does not receive a response from the image processing apparatus 3 within the time entered as the timeout time, it treats the request as "not approved".
If it determines that the request was not approved (NO in step S16), the controller 21 ends the series of processing. If it determines that the request was approved (YES in step S16), the controller 21 performs the marking position compensation in step S18. Specifically, before marking, the controller 21 shifts the marking position (the position where the laser light L is scanned) from the reference position P by the offset amount D acquired from the image processing apparatus 3. The details of this processing will be described later (Fig. 7).
After performing the marking position compensation, the controller 21 instructs the marker head 26 to start marking in step S20; that is, it causes the marker head 26 to mark the object 8 with the mark drawn on the user interface 700. This completes the marking of one object 8. Thereafter, this series of processing is repeated for each object 8.
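For illustration, the controller side of the exchange in steps S6 to S20 can be sketched as follows. The patent specifies only that commands and responses travel directly over the Ethernet cable 11; the port number, message strings, and function names in this Python sketch are assumptions, not part of the patent.

    import socket

    TIMEOUT_S = 5.0  # timeout time from input field 736 (unit assumed to be seconds)

    def compensate_marking_position(xd, yd, theta):
        ...  # shift and rotate the mark layout (see <E. Position compensation>)

    def start_marking():
        ...  # step S20: instruct the marker head 26 to scan

    def run_dfl_marking(ip, scene_no):
        """Sketch of the controller side of steps S6-S20 (message format assumed)."""
        with socket.create_connection((ip, 9876), timeout=TIMEOUT_S) as sock:
            # Step S6: scene switching request, sent directly to the vision
            # sensor over Ethernet -- no PLC in between.
            sock.sendall(f"SWITCH_SCENE {scene_no}\n".encode())
            try:
                reply = sock.recv(1024).decode().strip()  # steps S102/S8
            except socket.timeout:
                return False  # no response within timeout: "not approved" (S10: NO)
            if reply != "OK":  # preset code, e.g. scene not among stored scenes
                return False

            # Step S12: scene execution request.
            sock.sendall(b"EXECUTE_SCENE\n")
            try:
                reply = sock.recv(1024).decode().strip()  # steps S104/S14
            except socket.timeout:
                return False  # no offset within timeout (S16: NO)
            if reply.startswith("ERR"):  # offset calculation failed
                return False

            # Step S18: compensate the marking position by offset D = (xd, yd, theta).
            xd, yd, theta = (float(v) for v in reply.split())
            compensate_marking_position(xd, yd, theta)

            start_marking()  # step S20
            return True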
The above processing is summarized as follows. Below, the command transmitted from the controller 21 to the image processing apparatus 3 in step S12 is called the "first command", and the command transmitted from the controller 21 to the image processing apparatus 3 in step S6 is called the "second command".
(1) Steps S6, S102, S18, and S20. When a setting has been made for causing the image processing apparatus 3 to execute the first scene (specifically, the scene specified by the print position compensation scene number), the controller 21 transmits to the image processing apparatus 3 a first command instructing execution of the first scene. Upon receiving the first command, the image processing apparatus 3 calculates the offset amount D of the object 8 from the reference position P using image data obtained by imaging the object 8, and notifies the controller 21 of the offset amount D. The controller 21 compensates the scanning position of the laser light L based on the offset amount D, and then causes the marker head 26 to perform the scanning.
With this configuration, there is no need for a control device such as a PLC to run a control program (for example, a ladder program) for controlling the operations of the laser marker 2 and the image processing apparatus 3, so the user does not need to create such a control program in advance. Thus, according to the laser processing system 1, the laser marker 2 can perform laser processing together with the image processing apparatus 3 without going through a PLC.
(2) Steps S6 and S102. The image processing apparatus 3 is capable of executing a plurality of scenes including the first scene. Before transmitting the first command, the controller 21 transmits to the image processing apparatus 3 a second command for specifying the first scene. After confirming, based on the second command, that the specified first scene is included in the plurality of scenes, the image processing apparatus 3 notifies the controller 21 of the confirmation result.
(3) Step S12. The controller 21 transmits the first command to the image processing apparatus 3 on the condition that the notification of the confirmation result has been received.
(4) Step S102. If, before receiving the first command, the image processing apparatus 3 receives from the controller 21 not the second command but a third command specifying a second scene that is not included in the plurality of scenes, it transmits to the controller 21 a predefined notification indicating that the specified scene cannot be executed.
(5) Step S4. When the information for specifying the object 8 is stored, the controller 21 transmits the second command to the image processing apparatus 3.
(6) Step S16. The controller 21 does not cause the marker head 26 to scan if the offset amount D has not been received from the image processing apparatus 3 within the preset time after the first command was transmitted.
<E. Position compensation>
Fig. 7 is a diagram explaining the compensation of the marking position performed by the laser marker 2, that is, the compensation (step S18 in Fig. 6) performed by the laser marker 2 using the offset amount D acquired from the image processing apparatus 3.
Fig. 7A shows the ideal position 810 of the object 8, that is, the position (hereinafter also referred to as "ideal position 810") where the user assumes the object 8 will be located when marking is performed with the laser light L.
Referring to Fig. 7A, the user draws a mark 820 in the drawing area 701 of the user interface 700 on the assumption that the object 8 is located at the ideal position 810. That is, the user creates the marking layout.
The reference position P is a position registered by the user. Fig. 7A shows, as an example, a case where the position coinciding with one of the end points of the object 8 (see end point Q in Fig. 7B) is stored as the reference position P.
Fig. 7B explains the actual position of the object 8 detected from the image data captured by the camera unit 261. For convenience of explanation, Fig. 7B shows the actual position 810A of the object 8 in the same drawing area 701 as Fig. 7A; in reality, a drawing like Fig. 7B does not appear on the user interface 700.
As described above, the laser marker 2 (more precisely, the controller 21) acquires the offset amount D from the image processing apparatus 3. The offset amount D includes an offset xd in the X-axis direction, an offset yd in the Y-axis direction, and a rotation angle θ. In detail, the offset amount D consists of the offset (xd, yd) from the coordinates (xp, yp) of the reference position P, and the rotation angle from a default angle (typically 0 degrees). The image processing apparatus 3 can calculate the offset amount D by detecting the edges and end points of the object (in this embodiment, the object 8) in the captured image.
If the default angle is not 0 degrees, the image processing apparatus 3 preferably stores the default angle. If the image processing apparatus 3 does not store it, the controller 21 of the laser marker 2 may correct the rotation angle θ acquired from the image processing apparatus 3 by the default angle.
As shown in Fig. 7B, the end point Q, which in Fig. 7A coincided with the reference position P, has actually moved from the coordinates (xp, yp) to the coordinates (xq, yq), shifted by xd in the X-axis direction and by yd in the Y-axis direction. Furthermore, the object 8 is rotated by θ from the state of Fig. 7A in a predetermined direction (clockwise in the figure) around the end point Q.
When it acquires the offset amount D from the image processing apparatus 3, the laser marker 2 compensates the marking position by shifting it by the offset amount D. In the example of Fig. 7, the marking position is compensated to the position of the mark 820 shown in Fig. 7B. After the marking position has been compensated, the laser marker 2 marks the mark 820 on the object 8 at the compensated position.
As described above, the laser processing system can compensate the marking position not only for offsets in the X-axis and Y-axis directions but also for offsets due to rotation.
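The geometry of this compensation can be written out explicitly: a layout point defined relative to the reference position P is rotated by θ about the end point and translated by (xd, yd). The Python sketch below is an illustration under the assumption that θ is positive clockwise, as in Fig. 7B; the function name and sign convention are not from the patent.

    import math

    def compensate_point(x, y, xp, yp, xd, yd, theta_deg):
        """Map a layout point (x, y) to its compensated marking position.

        (xp, yp): reference position P; (xd, yd): translation part of offset D;
        theta_deg: rotation angle, assumed positive clockwise about end point Q.
        """
        t = math.radians(theta_deg)
        # Layout point relative to the reference position P
        rx, ry = x - xp, y - yp
        # Clockwise rotation by theta about the end point
        cx = rx * math.cos(t) + ry * math.sin(t)
        cy = -rx * math.sin(t) + ry * math.cos(t)
        # Follow the end point to its actual location Q = (xp + xd, yp + yd)
        return xp + xd + cx, yp + yd + cy

    # Example: P = (10, 10), offset D = (xd = 2, yd = -1, theta = 5 degrees)
    print(compensate_point(12.0, 10.0, 10.0, 10.0, 2.0, -1.0, 5.0))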
B. Modifications
(1) In the above description, the configuration in which the image processing apparatus 3 calculates the offset amount D has been described as an example, but the present invention is not limited to this. For example, as described below, the laser processing system 1 may be configured such that the controller 21 calculates the offset amount D.
Upon receiving the first command (the command transmitted from the controller 21 to the image processing apparatus 3 in step S12), the image processing apparatus 3 calculates the position (actual position) of the object 8 using the image data and notifies the controller 21 of the calculated position.
The controller 21 calculates the offset amount D of the calculated position with respect to the reference position P. The controller 21 then compensates the scanning position of the laser light L based on the offset amount D and causes the marking head 26 to scan. Through this compensation process, the predetermined marking can be performed at the desired position on the object 8.
With this configuration as well, the laser marker 2 can perform laser processing in cooperation with the image processing apparatus 3 without going through a PLC.
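A rough sketch of this modification follows: the controller derives the offset amount D itself from the actual position notified by the image processing apparatus. The class and method names are hypothetical and serve only to illustrate the division of work.

```python
class Controller:
    """Modification (1): the controller, not the image processing apparatus,
    computes the offset amount D from the notified actual position."""

    def __init__(self, reference_position):
        self.xp, self.yp = reference_position  # registered reference position P

    def on_position_notified(self, xq, yq):
        # Offset amount D (translation components only, for brevity).
        xd, yd = xq - self.xp, yq - self.yp
        self.compensate_and_scan(xd, yd)

    def compensate_and_scan(self, xd, yd):
        # Shift the scanning position of the laser light, then start the head.
        print(f"scan position shifted by ({xd}, {yd}); starting the marking head")

Controller((10.0, 20.0)).on_position_notified(13.0, 18.5)
# scan position shifted by (3.0, -1.5); starting the marking head
```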
(2) The configuration in which the marking head 26 includes the camera unit 261 has been described as an example, but the present invention is not limited thereto. The camera unit 261 may be a separate unit provided independently of the marking head 26.
While embodiments of the present invention have been described above, the embodiments disclosed herein are illustrative in all respects and should not be regarded as restrictive. The scope of the present invention is defined by the appended claims, and all changes that come within the meaning and range of equivalency of the claims are intended to be embraced therein.

Claims (8)

1. A laser processing system having a laser processing apparatus and an image processing apparatus, wherein,
the laser processing apparatus includes: a controller having an oscillator for oscillating laser light; and a head configured to scan the laser light on a processing surface of an object to be processed under the control of the controller,
the controller, upon receiving an input for determining a scene number of a scene including a plurality of processes, transmits to the image processing apparatus a first command for instructing execution of the scene determined by the scene number,
the image processing apparatus, upon receiving the first command, calculates an offset amount of the object with respect to a reference position using image data obtained by imaging the object, and notifies the controller of the offset amount,
the controller causes the head to perform the scanning after compensating for a scanning position of the laser based on the offset amount,
the image processing apparatus is capable of executing a plurality of scenes including a first scene,
the controller transmits a second command for specifying the first scene to the image processing apparatus before transmitting the first command,
the image processing apparatus, after confirming based on the second command that the specified first scene is included in the plurality of scenes, notifies the controller of a result of the confirmation.
2. The laser processing system of claim 1,
the controller transmits the first command to the image processing apparatus after receiving the notification of the result of the confirmation.
3. The laser processing system of claim 1 or 2,
the image processing apparatus, in a case where, prior to receiving the first command, the second command is not received from the controller but a third command for specifying a second scene not included in the plurality of scenes is received, sends the controller a predefined notification indicating that the specified scene cannot be executed.
4. The laser processing system of claim 1 or 2,
the controller transmits the second command to the image processing apparatus when information for specifying the object to be processed is stored.
5. The laser processing system of claim 1 or 2,
the controller does not cause the head to perform the scanning when the offset amount is not received from the image processing apparatus within a predetermined time after the first command is transmitted.
6. The laser processing system of claim 1 or 2,
the controller stores an application program for setting a layout of marks marked by the laser,
the controller receives, via a user interface displayed when the application program is executed, a setting for causing the image processing apparatus to execute the first scene, and stores the setting.
7. A laser processing system having a laser processing apparatus and an image processing apparatus,
the laser processing apparatus includes: a controller having an oscillator for oscillating laser light; and a head configured to scan the laser light on a processing surface of an object to be processed under the control of the controller,
the controller, upon receiving an input for determining a scene number of a scene including a plurality of processes, transmits to the image processing apparatus a first command for instructing execution of the scene determined by the scene number,
the image processing apparatus, upon receiving the first command, calculates a position of the object using image data obtained by imaging the object, and notifies the controller of the calculated position,
the controller calculates an offset amount of the calculated position with respect to a reference position, and causes the head to perform the scanning after compensating a scanning position of the laser light based on the offset amount,
the image processing apparatus is capable of executing a plurality of scenes including a first scene,
the controller transmits a second command for specifying the first scene to the image processing apparatus before transmitting the first command,
the image processing apparatus, after confirming based on the second command that the specified first scene is included in the plurality of scenes, notifies the controller of a result of the confirmation.
8. A machining control method in a laser machining system, wherein,
the laser processing system includes: a controller having an oscillator for oscillating laser light; a head for scanning the oscillated laser light on a processing surface of an object to be processed; and an image processing apparatus,
the processing control method comprises the following steps:
a step in which the controller, upon receiving an input for determining a scene number of a scene including a plurality of processes, transmits, to the image processing apparatus, a first command for instructing execution of the scene determined by the scene number;
a step in which the image processing apparatus, upon receiving the first command, calculates an offset amount of the object to be processed with respect to a reference position using image data obtained by imaging the object to be processed;
a step in which the image processing apparatus notifies the controller of the calculated offset amount;
a step in which the controller causes the head to perform the scanning after compensating a scanning position of the laser based on the offset amount,
the image processing apparatus is capable of executing a plurality of scenes including a first scene,
the controller transmits a second command for specifying the first scene to the image processing apparatus before transmitting the first command,
the image processing apparatus, after confirming based on the second command that the specified first scene is included in the plurality of scenes, notifies the controller of a result of the confirmation.
CN201710034885.0A 2016-03-15 2017-01-17 Laser processing system and processing control method Active CN107186347B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-050602 2016-03-15
JP2016050602A JP6601285B2 (en) 2016-03-15 2016-03-15 Laser processing system and processing control method

Publications (2)

Publication Number Publication Date
CN107186347A CN107186347A (en) 2017-09-22
CN107186347B (en) 2020-05-08

Family

ID=57758481

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710034885.0A Active CN107186347B (en) 2016-03-15 2017-01-17 Laser processing system and processing control method

Country Status (5)

Country Link
US (1) US20170266758A1 (en)
EP (1) EP3219430A1 (en)
JP (1) JP6601285B2 (en)
KR (2) KR20170107361A (en)
CN (1) CN107186347B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108391847A (en) * 2018-03-16 2018-08-14 山西大学 A kind of cigarette laser marking system and method based on image procossing
JP6904927B2 (en) * 2018-07-30 2021-07-21 ファナック株式会社 Robot system and calibration method
CN109249138B (en) * 2018-11-12 2020-12-01 广州里程科技发展有限公司 Cross-platform motion control system of laser engraving machine
KR102209714B1 (en) * 2018-12-13 2021-01-29 (주)미래컴퍼니 A method of cutting a structure having brittle materials and its apparatus
JP2021003794A (en) 2019-06-27 2021-01-14 ファナック株式会社 Device and method for acquiring deviation amount of work position of tool
JP6795060B1 (en) * 2019-07-11 2020-12-02 オムロン株式会社 Control device and laser machining system equipped with it, laser machining method
WO2021038730A1 (en) * 2019-08-27 2021-03-04 日本電信電話株式会社 Photometric device
CN110682005B (en) * 2019-10-16 2021-04-27 佛山科学技术学院 Laser marking real-time correction method and control device
US11305377B2 (en) * 2019-12-23 2022-04-19 Precitec Gmbh & Co. Kg Add-on module for interposing between a control device and a laser machining head of a laser machining system
CN113199150B (en) * 2020-01-16 2023-08-25 大族激光科技产业集团股份有限公司 Two-dimensional code laser carving method and system for lens surface of camera
CN114101917A (en) * 2020-08-26 2022-03-01 复盛应用科技股份有限公司 Laser engraving method
JP2022139478A (en) * 2021-03-12 2022-09-26 オムロン株式会社 Laser printing data generator and laser marker system
US20230330764A1 (en) * 2022-04-19 2023-10-19 Path Robotics, Inc. Autonomous assembly robots

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002316276A (en) * 2001-04-18 2002-10-29 Nec Corp Device for correcting marking deformation and method for correcting marking in laser marking apparatus
JP5027606B2 (en) * 2007-09-26 2012-09-19 株式会社キーエンス Laser machining apparatus, machining data generation method, and computer program
US9132585B2 (en) * 2009-12-30 2015-09-15 Resonetics, LLC Laser machining system and method for machining three-dimensional objects from a plurality of directions
US9511448B2 (en) * 2009-12-30 2016-12-06 Resonetics, LLC Laser machining system and method for machining three-dimensional objects from a plurality of directions
DE102010060958A1 (en) * 2010-12-02 2012-06-06 Scanlab Ag Laser processing apparatus comprises processing laser for generating processing laser beam to structural change, scan laser for generating scan laser beam, light detector, and reference structure for calibrating unit of control device
JP5545275B2 (en) 2011-07-13 2014-07-09 日本精工株式会社 Method for manufacturing shock absorbing steering shaft
JP5385356B2 (en) 2011-10-21 2014-01-08 株式会社片岡製作所 Laser processing machine

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101722363A (en) * 2008-10-21 2010-06-09 三菱电机株式会社 Laser processing apparatus
CN101961819A (en) * 2009-07-22 2011-02-02 中国科学院沈阳自动化研究所 Device for realizing laser welding and seam tracking and control method thereof
CN102582274A (en) * 2011-01-12 2012-07-18 株式会社其恩斯 Laser processing system and laser processing device
CN104159698A (en) * 2012-03-06 2014-11-19 东丽工程株式会社 Marking device and method

Also Published As

Publication number Publication date
KR102060913B1 (en) 2019-12-30
JP6601285B2 (en) 2019-11-06
CN107186347A (en) 2017-09-22
EP3219430A1 (en) 2017-09-20
US20170266758A1 (en) 2017-09-21
KR20170107361A (en) 2017-09-25
KR20190023062A (en) 2019-03-07
JP2017164764A (en) 2017-09-21

Similar Documents

Publication Publication Date Title
CN107186347B (en) Laser processing system and processing control method
JP6305270B2 (en) Laser processing apparatus and working distance measurement method
JP2009092557A (en) Apparatus for inspecting solder printing
WO2017110786A1 (en) Laser processing device
US20190022794A1 (en) Laser machining apparatus that machines surface of workpiece by irradiating laser beam thereon
JP2008068312A (en) Laser beam machining apparatus, offset adjusting method in height direction in three-dimensional laser beam machining, and control program of laser beam machining apparatus
KR101298706B1 (en) Method of calibrating marking for laser marking system
JP4958507B2 (en) Laser processing equipment
WO1993023781A1 (en) Scanner
JP6345028B2 (en) LASER PRINTING DEVICE, PRINT PROCESSING CONTROL METHOD AND SETTING PROGRAM
CN113798684A (en) Laser processing device and processing method
JP7338501B2 (en) LASER PROCESSING APPARATUS AND CONTROL METHOD OF LASER PROCESSING APPARATUS
JP4467333B2 (en) Laser processing apparatus and processing method
WO2021176800A1 (en) Laser machining device and method for controlling laser machining device
WO2016147977A1 (en) Image-rendering device
US20230211623A1 (en) Laser processing system
JP2002035981A (en) Scanning type laser beam processing device and laser beam processing method capable of processing simulation
JP5274404B2 (en) Laser processing apparatus and laser processing method
JP2010075952A (en) Laser beam machining method and laser beam machining apparatus
JP6645172B2 (en) Laser processing equipment
JP6355580B2 (en) LASER MARKING DEVICE, PRINT CHECK METHOD AND COMPUTER PROGRAM FOR CHECKING PRINT RESULT USING THE LASER MARKING DEVICE
JP2010158704A (en) Laser beam machining method and laser beam machining apparatus
JP2004341594A (en) Working position correcting method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant