WO2016093053A1 - Laser processing machine and nozzle mounting method - Google Patents

Laser processing machine and nozzle mounting method

Info

Publication number
WO2016093053A1
WO2016093053A1 (PCT/JP2015/082923)
Authority
WO
WIPO (PCT)
Prior art keywords
nozzle
unit
image
processing
nozzles
Prior art date
Application number
PCT/JP2015/082923
Other languages
French (fr)
Japanese (ja)
Inventor
中西 啓一 (Keiichi Nakanishi)
Original Assignee
村田機械株式会社 (Murata Machinery, Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 村田機械株式会社 (Murata Machinery, Ltd.)
Priority: JP2016563599A (patent JP6269859B2)
Publication: WO2016093053A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K26/00 Working by laser beam, e.g. welding, cutting or boring
    • B23K26/14 Working by laser beam, e.g. welding, cutting or boring, using a fluid stream, e.g. a jet of gas, in conjunction with the laser beam; Nozzles therefor
    • B23K26/70 Auxiliary operations or equipment

Definitions

  • The present invention relates to a laser processing machine and a nozzle mounting method.
  • A laser processing machine is used for processing such as cutting a workpiece with laser light, and its nozzle for emitting the laser light is replaceable. The optimum nozzle type (hole diameter, single/double) must be selected according to the workpiece. Conventionally, the operator refers to the processing conditions and replaces the nozzle manually. It has also been proposed to confirm that the nozzle mounted on the processing head is appropriate by image recognition of the mounted nozzle and of a mark formed on the nozzle (see Patent Documents 1 and 2). To automate nozzle replacement, an exchange system called a nozzle changer has been proposed.
  • In a nozzle changer, a plurality of nozzles are arranged in a nozzle accommodating portion, and one of the nozzles is selected and attached to the processing head.
  • Conventionally, the operator sets a predetermined type of nozzle at a predetermined position of the nozzle accommodating portion, or, after arranging a plurality of nozzles in the nozzle accommodating portion, registers the position and type of each nozzle in the nozzle changer through a user interface.
  • An object of the present invention is to provide a laser processing machine and a nozzle mounting method capable of mounting a nozzle in a short time.
  • The laser processing machine of the present invention includes: a hollow processing head that has a replaceable nozzle, guides laser light for processing a workpiece, and emits the laser light from the nozzle; a nozzle housing portion in which a plurality of nozzles are arranged and held; an imaging unit that acquires an image of a nozzle via the inside of the processing head; and a nozzle identification unit that recognizes the nozzle type from the image acquired by the imaging unit.
  • An illumination unit that irradiates illumination light onto the nozzle whose image is acquired by the imaging unit may be included. The illumination unit may irradiate the nozzle with illumination light through the processing head. An identification mark may be formed on the nozzle at a position where the imaging unit can capture an image of it, and the nozzle identification unit may recognize the identification mark from the image acquired by the imaging unit to recognize the nozzle type. The identification mark may be formed around the nozzle hole as viewed from the penetration direction of the nozzle hole. The identification mark may include a main part indicating the nozzle type and a reference mark formed at a position defined with respect to the main part.
  • A nozzle determination unit that determines whether the nozzle type recognized from the identification mark matches the nozzle type recognized by extracting a characteristic portion of the nozzle shape from the image acquired by the imaging unit may be included. Further, the nozzle identification unit may extract a characteristic part of the nozzle shape from the image acquired by the imaging unit and recognize the nozzle type. A moving unit that moves the processing head and the nozzle accommodating portion relative to each other may also be included. Further, the nozzle identification unit may recognize the nozzle position based on the amount of movement between the processing head and the nozzle housing portion by the moving unit.
  • The moving unit may scan the image recognition area of the imaging unit in the circumferential direction by relatively moving the processing head and the nozzle accommodating portion in the circumferential direction centered on the penetrating direction of the nozzle hole of the nozzle.
  • A nozzle management table that associates the nozzle position recognized by the nozzle identification unit with the nozzle type, a processing table that associates a plurality of processing conditions according to the workpiece with nozzle types, and a mounting nozzle specifying unit that specifies the nozzle to be mounted by referring to the processing table and the nozzle management table based on the processing conditions may be included.
  • The imaging unit may be capable of imaging a workpiece that is being processed or has been processed by the laser light, and a processing state detection unit that detects the processing state of the workpiece from an image of such a workpiece acquired by the imaging unit may be included.
  • The nozzle mounting method of the present invention uses a laser processing machine including: a hollow processing head that has a replaceable nozzle, guides laser light for processing a workpiece, and emits the laser light from the nozzle; a nozzle housing portion in which a plurality of nozzles are arranged and held; and an imaging unit that acquires an image of a nozzle through the processing head. The method recognizes the nozzle type from the image acquired by the imaging unit and mounts, among the recognized nozzles, the nozzle to be mounted on the machining head.
  • According to the present invention, since an image of each nozzle in the nozzle accommodating portion is acquired and recognized by the nozzle identification unit, erroneous nozzle mounting can be prevented and nozzle mounting can be automated. Further, since the image of the nozzle is acquired through the inside of the processing head, the nozzle can be mounted on the processing head shortly after the target nozzle is found, shortening the time required for mounting the nozzle.
  • A good image can be acquired because the nozzle is illuminated with illumination light.
  • When the illumination unit irradiates the nozzle through the processing head, the illumination light can be applied efficiently to the area to be imaged.
  • When the nozzle identification unit recognizes the nozzle type from the identification mark of the nozzle, the nozzle type can be recognized efficiently.
  • When the identification mark is formed around the nozzle hole, the identification mark can be imaged easily by the imaging unit.
  • When the identification mark includes a main part and a reference mark, the image recognition area of the imaging unit can be guided reliably to the main part by finding the reference mark.
  • An apparatus including the nozzle determination unit can determine whether the nozzle type recognized from the identification mark matches the actual nozzle type.
  • When the nozzle identification unit extracts a characteristic part of the nozzle shape and recognizes the nozzle type, no identification mark or the like needs to be formed on the nozzle.
  • The moving unit can easily move the image recognition area of the imaging unit. When the nozzle identification unit recognizes the nozzle position from the amount of movement between the machining head and the nozzle storage unit by the moving unit, the position of each nozzle can be acquired easily, even when the nozzles are arranged randomly in the nozzle storage unit.
  • When the moving unit scans the image recognition area of the imaging unit in the circumferential direction of the nozzle hole, the identification mark formed around the nozzle hole can be imaged efficiently.
  • With the nozzle management table, the processing table, and the mounting nozzle specifying unit, the nozzle to be mounted can be specified easily based on the processing conditions of the workpiece.
  • Management of nozzles to be mounted later is facilitated either by acquiring the nozzle types of all the nozzles in the nozzle storing unit, or by stopping once the mounting target nozzle is found in the nozzle storing unit. In either case, mounting the wrong nozzle can be prevented and nozzle replacement can be automated.
  • When the imaging unit is shared with a machining state detection unit that detects the machining state of the workpiece, the imaging unit can be used both for confirming the machining state of the workpiece and for recognizing the nozzle type.
  • FIG. 1 is a diagram showing an example of the laser processing machine according to the embodiment.
  • (a) is a diagram showing a state in which the processing head faces the nozzle accommodating portion, (b) is a plan view of the nozzle accommodating portion, and (c) is a plan view showing an example of the nozzle.
  • (a) to (e) are process drawings showing a nozzle identification process.
  • (a) to (e) are plan views showing other examples of the identification mark.
  • (a) to (e) are plan views showing examples of nozzle classifications. A further flowchart shows another example of identifying a nozzle, and another flowchart shows an example of the nozzle mounting method.
  • (a) is a diagram showing an example of the nozzle management table, and (b) is a diagram showing an example of the processing table.
  • FIG. 1 is a diagram showing a laser beam machine 1 according to the present embodiment.
  • This laser processing machine 1 irradiates a workpiece W (processing object) with laser light, and performs processing such as cutting, formation of recesses such as grooves, drilling of through holes and slits, marking, and the like on the workpiece W.
  • The laser beam machine 1 includes a laser oscillator 2, a machining head 3, a moving unit 4, a nozzle housing unit 5, an imaging unit 6, an illumination unit 7, an image processing unit 8, a control unit 9, and a storage unit 10.
  • the laser oscillator 2 generates, for example, infrared laser light as laser light for processing the workpiece W.
  • Laser light emitted from the laser oscillator 2 is supplied to the processing head 3 via a light guide member such as an optical fiber.
  • the processing head 3 irradiates the supplied laser beam toward the workpiece W (downward).
  • the processing head 3 includes a body 11 (processing head main body), an output optical system 12 accommodated in the body 11, and a nozzle 13 disposed on the light output side of the output optical system 12.
  • the emission optical system 12 includes a lens 15 supported by the body 11, a mirror 16, a lens 17, and the like.
  • the laser light travels horizontally from the side of the body 11 and enters the mirror 16 through the lens 15.
  • For the mirror 16, a dichroic mirror is used, which reflects the laser light (infrared laser light) and transmits light other than the laser light (for example, visible light).
  • the laser beam reflected by the mirror 16 travels downward.
  • the optical axis 12a on the exit side of the exit optical system 12 is set, for example, in the vertical direction.
  • the laser beam reflected by the mirror 16 is irradiated to the workpiece W through the lens 17 and the nozzle 13.
  • the lens 17 condenses the laser light so that the laser light forms a spot of a predetermined size on the workpiece W.
  • the nozzle 13 is attached to the body 11 in a replaceable manner.
  • the nozzle 13 is attached to the opening 20 at the lower end of the body 11.
  • the nozzle 13 includes a proximal end portion 21 attached to the body 11 and a distal end portion 22 disposed toward the workpiece.
  • the base end portion 21 has a cylindrical shape and has a screw portion on the outer peripheral surface thereof.
  • the opening portion 20 of the body 11 has a screw portion that can be screwed to the screw portion of the nozzle 13.
  • the nozzle 13 is attached to and detached from the body 11 by the relative rotation of the nozzle 13 and the opening 20.
  • a mechanism for relatively rotating the nozzle 13 and the opening 20 is provided in one or both of the processing head 3 and a nozzle housing 5 described later.
  • the mounting of the nozzle 13 to the body 11 is not limited to screw connection.
  • the nozzle 13 may be attached to the body 11 by a mechanism for clamping a part of the nozzle, an electromagnet, or the like.
  • the distal end portion 22 of the nozzle 13 is disposed toward the workpiece W with the proximal end portion 21 attached to the body 11.
  • the nozzle 13 has a nozzle hole 23 (see FIG. 2C) penetrating from the base end portion 21 to the tip end portion 22, and emits laser light to the workpiece W through the nozzle hole 23.
  • a plurality of nozzles 13 are prepared according to processing conditions, and are appropriately replaced.
  • The nozzles 13 are classified, for example, into those having different diameters of the nozzle hole 23 and those having an assist gas injection port concentric with the nozzle hole 23. These nozzles 13 are arranged in the nozzle accommodating portion 5.
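The classification just described can be sketched as a small record type. Everything below (the attribute names and the example values) is illustrative only and is not taken from the patent:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class NozzleType:
    """Hypothetical record of the attributes used above to classify nozzles."""
    hole_diameter_mm: float       # diameter of the nozzle hole 23
    double: bool                  # True for concentric double nozzle holes
    has_assist_gas_port: bool     # assist gas port concentric with the hole

# Illustrative entries only; the real diameters are not given in the text.
single_1_2 = NozzleType(hole_diameter_mm=1.2, double=False, has_assist_gas_port=False)
double_2_0 = NozzleType(hole_diameter_mm=2.0, double=True, has_assist_gas_port=True)

assert single_1_2 != double_2_0
```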
  • the moving unit 4 can relatively move the workpiece W and the machining head 3.
  • The laser beam machine 1 holds the workpiece W by a workpiece holding unit (not shown) and moves the machining head 3 in the X, Y, and Z directions by the moving unit 4, thereby moving the workpiece W and the machining head 3 relative to each other.
  • the laser beam machine 1 processes the workpiece W by irradiating the workpiece W with laser light from the machining head 3 while moving the machining head 3 by the moving unit 4.
  • the nozzle accommodating portion 5 includes a plurality of slots 26 in which a plurality of nozzles 13 are arranged.
  • the nozzle 13 is placed in the slot 26 with the tip 22 facing downward.
  • the nozzle accommodating portion 5 may be installed at a position where the machining head 3 can be accessed, or may be formed so as to be movable below the machining head 3.
  • The nozzle housing 5 includes a lid (not shown) that is opened when the nozzle 13 is replaced. The nozzle accommodating portion 5 may also hold the nozzles 13 by attraction.
  • the imaging unit 6 acquires an image used for recognition of the nozzle 13.
  • the imaging unit 6 has a fixed relative position to the processing head 3.
  • the imaging unit 6 includes an imaging optical system 30 that forms an image of an object below the opening 20.
  • the imaging unit 6 is accommodated in a space partitioned by a window 31 on the upper side of the body 11.
  • the imaging optical system 30 uses the lens 17 of the emission optical system 12 in addition to the lens 32.
  • the optical axis 30 a of the imaging optical system 30 is coaxial with a part of the optical axis 12 a of the emission optical system 12.
  • the image of the object below the opening 20 enters the imaging unit 6 through the lens 17, the mirror 16, a half mirror 33 described later, and the lens 32.
  • the optical axis 30 a of the imaging optical system 30 may be set off the optical axis 12 a of the emission optical system 12.
  • the optical axis 30a of the imaging optical system 30 is not limited to being coaxial with a part of the optical axis 12a of the emission optical system 12, and may be arranged as a separate axis.
  • The imaging unit 6 uses an image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor) sensor to capture the image formed by the imaging optical system 30.
  • the imaging unit 6 is accommodated in a casing 34 provided on the body 11 of the processing head 3.
  • the imaging unit 6 can image the processing position on the workpiece W through the nozzle hole 23 of the nozzle 13.
  • the imaging part 6 can also image the processing state of the workpiece W by laser light.
  • the imaging unit 6 outputs the captured image information to the image processing unit 8 described later.
  • the illumination unit 7 irradiates illumination light to the target nozzle 13 from which an image is acquired by the imaging unit 6.
  • the illumination unit 7 irradiates the visual field of the imaging unit 6 with illumination light through the processing head 3.
  • the illumination unit 7 is accommodated in the body 11 and fixed to the body 11.
  • the illumination unit 7 includes a light source 35, a lens 36, and a half mirror 33.
  • the light source 35 is, for example, an LED (Light Emitting Diode) or the like, and emits light (for example, visible light) in a wavelength band in which the imaging unit 6 has sensitivity.
  • By using an incoherent light source such as an LED, interference fringes can be prevented from appearing in the image captured by the imaging unit 6, and the nozzle can be recognized more accurately.
  • the light from the light source 35 is reflected by the half mirror 33 through the lens 36, passes through the mirror 16, passes through the lens 17, and is irradiated below the opening 20 of the body 11.
  • The laser processing machine 1 need not include the illumination unit 7; for example, the imaging unit 6 may capture images using external light or the like.
  • the image processing unit 8 performs image processing on the image information output from the imaging unit 6. For example, the image processing unit 8 performs processing such as binarization processing and pattern matching in addition to processing such as gamma correction and white balance.
  • the image processing unit 8 supplies the processed image information to the control unit 9.
  • the image processing unit 8 controls illumination light irradiation by the illumination unit 7 and imaging by the imaging unit 6. The control of the imaging unit 6 and the illumination unit 7 may be performed by the control unit 9 described later.
  • the control unit 9 comprehensively controls the laser processing machine 1 including the laser oscillator 2 and the moving unit 4.
  • The control unit 9 includes, for example, a CPU (Central Processing Unit), and controls each unit by calling a program stored in the storage unit 10 and causing the CPU to execute it.
  • the control unit 9 includes a nozzle identification unit 41, a nozzle determination unit 42, a mounting nozzle specification unit 43, a nozzle mounting instruction unit 44, and a processing state detection unit 45.
  • the nozzle identification unit 41 recognizes the nozzle type from the image of the nozzle 13 acquired by the imaging unit 6.
  • The nozzle type distinguishes the nozzles 13 by, for example, shape (shape of the nozzle hole, external shape of the nozzle), dimensions (nozzle hole diameter, nozzle outer diameter, nozzle height), surface treatment (presence and type of plating), and the like.
  • The nozzle type is used to classify the plurality of nozzle forms, in other words, to specify the form of each nozzle.
  • The shape of the nozzle hole means, for example, a single nozzle hole or concentric double nozzle holes.
  • The external shape of the nozzle means, for example, a flat, steeply inclined, or stepped shape.
  • the nozzle identification unit 41 recognizes the nozzle position based on the amount of movement between the machining head 3 and the nozzle storage unit 5 by the moving unit 4.
  • For example, the nozzle identification unit 41 acquires the amount of movement of the machining head 3 by the moving unit 4 and calculates in which slot 26 (see FIG. 2B) the target nozzle 13 whose type is to be recognized is arranged.
  • The nozzle identification unit 41 generates, for example, data that associates the position of the slot 26 (for example, XY coordinates) with the nozzle type of the nozzle 13 held in that slot, and registers this data in the nozzle management table D3 of the storage unit 10.
  • the control unit 9 can find the nozzle 13 to be mounted with reference to the nozzle management table D3 without newly recognizing the nozzle type for the nozzle 13 whose nozzle type has already been recognized.
  • the nozzle management table D3 is information that associates the nozzle position recognized by the nozzle identification unit 41 and the nozzle type with respect to the plurality of nozzles 13 arranged in the nozzle accommodating unit 5.
  • The nozzle determination unit 42 determines whether the nozzle type recognized from an identification mark M (see FIG. 2C) described later matches the nozzle type recognized by extracting a characteristic part of the shape of the nozzle 13 from the image acquired by the imaging unit 6. Whether the control unit 9 includes the nozzle determination unit 42 is optional.
  • the mounting nozzle specifying unit 43 specifies the mounting target nozzle 13 by referring to the processing table D4 and the nozzle management table D3 of the storage unit 10 based on the processing conditions.
  • The machining conditions are set according to the workpiece to be machined and include, for example, at least one operating condition among the movement of the machining head 3 by the moving unit 4, the laser oscillator 2, and the assist gas injection.
  • The operating conditions of the moving unit 4 are, for example, the speed (e.g., cutting speed) and acceleration at which the machining head 3 is moved.
  • the operating conditions of the laser oscillator 2 are, for example, laser light output, frequency, duty ratio, and the like.
  • the assist gas injection conditions include, for example, presence / absence of assist gas injection, pressure, and flow rate.
  • the processing table D4 is information in which a plurality of processing conditions corresponding to the workpiece W and nozzle types are associated with each other.
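The two tables and the lookup performed by the mounting nozzle specifying unit 43 can be sketched roughly as follows. The table contents, key strings, and function are hypothetical stand-ins for D3, D4, and the specifying unit, not data from the patent:

```python
# Hypothetical contents: D3 maps slot coordinates to nozzle types, and D4
# maps a processing-condition key to the nozzle type that condition needs.
nozzle_management_table_d3 = {
    (0, 0): "N1.2-single",
    (0, 1): "N2.0-double",
    (1, 0): "N1.2-single",
}
processing_table_d4 = {
    "mild-steel-6mm-cut": "N2.0-double",
    "stainless-2mm-cut": "N1.2-single",
}

def specify_mounting_nozzle(condition):
    """Return a slot holding a nozzle of the type required by the condition,
    or None if the accommodating portion holds no such nozzle."""
    required = processing_table_d4[condition]
    for slot, nozzle_type in nozzle_management_table_d3.items():
        if nozzle_type == required:
            return slot
    return None

assert specify_mounting_nozzle("mild-steel-6mm-cut") == (0, 1)
```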
  • the nozzle mounting instruction unit 44 instructs to mount the nozzle 13 of the nozzle housing unit 5 on the processing head 3.
  • the nozzle mounting instruction unit 44 may instruct mounting of the mounting target nozzles 13 after acquiring the nozzle types related to all the nozzles 13 in the nozzle housing unit 5. Since the nozzle types for all the nozzles 13 are acquired, the nozzle management table D3 is completed, and the nozzle 13 to be mounted can be easily specified based on the data of the nozzle management table D3 at the next nozzle replacement.
  • The acquisition of the nozzle types for all the nozzles 13 may be performed at a timing such as when the laser processing machine 1 is started up or after the lid of the nozzle housing 5 is opened and closed.
  • the nozzle mounting instruction unit 44 may instruct the mounting of the nozzle 13 when the nozzle type of the nozzle 13 of the nozzle accommodating unit 5 is sequentially obtained and the mounting target nozzle 13 is found.
  • In this case, the nozzle management table D3 is unnecessary, and the burden on the storage unit 10 can be reduced.
  • the nozzle position and the nozzle type acquired until the mounting target nozzle 13 is found may be registered in the nozzle management table D3 and referred to at the next nozzle replacement.
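The sequential strategy described above (recognize the nozzles one by one, register what has been seen so far, and stop at the first match) might look like the following in outline. All names, and the `recognize` callable standing in for the image-based identification, are illustrative:

```python
def find_nozzle_sequentially(slots, recognize, target_type, table_d3):
    """Scan slots one by one; register each recognized nozzle in table_d3
    (for reuse at the next replacement) and stop as soon as the mounting
    target type is found."""
    for slot in slots:
        nozzle_type = recognize(slot)
        table_d3[slot] = nozzle_type        # partial registration
        if nozzle_type == target_type:
            return slot                     # stop early: target found
    return None

d3 = {}
types_in_slots = {1: "A", 2: "B", 3: "C"}
assert find_nozzle_sequentially([1, 2, 3], types_in_slots.get, "B", d3) == 2
assert d3 == {1: "A", 2: "B"}               # slot 3 was never imaged
```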
  • the machining state detection unit 45 detects the machining state of the workpiece W from an image obtained by the imaging unit 6 imaging the workpiece W being processed or processed by the laser beam.
  • With the machining state detection unit 45, the imaging unit 6 can be shared between imaging the machining state and imaging the nozzle 13.
  • the machining state detection unit 45 can also extract numerical information such as a laser machining cutting width from an image of the workpiece W, for example.
  • the control unit 9 may execute feedback control of laser processing based on the numerical information extracted by the processing state detection unit 45. Note that whether or not the control unit 9 includes the machining state detection unit 45 is arbitrary.
  • The laser beam machine 1 can recognize the nozzle type of each nozzle 13 arranged in the nozzle housing portion 5. Thereby, for example, the laser processing machine 1 can automatically find the nozzle 13 to be replaced from the plurality of types of nozzles 13 arranged in the nozzle housing 5 and automatically perform at least part of the nozzle replacement process.
  • FIG. 2A is a view showing a state in which the machining head 3 is opposed to the nozzle accommodating portion 5
  • FIG. 2B is a plan view of the nozzle accommodating portion 5
  • FIG. 2C is a plan view showing an example of the nozzle 13.
  • The plurality of nozzles 13 are arranged in the plurality of slots 26 of the nozzle accommodating portion 5, and the machining head 3 is positioned with respect to any one of the nozzles 13 in the slots 26.
  • Each of the plurality of slots 26 has a defined position in the XY directions.
  • The form of the nozzle accommodating portion 5 is an example; any structure capable of holding a plurality of nozzles 13 may be used.
  • the nozzle 13 has an identification mark M.
  • the identification mark M is formed at a position where the imaging unit 6 can capture an image in a state where the nozzle 13 is disposed in the nozzle housing unit 5.
  • the identification mark M is formed around the nozzle hole 23 when viewed from the penetration direction of the nozzle hole 23 of the nozzle 13.
  • the identification mark M is formed on the end surface 24 around the base end portion 21.
  • the identification mark M has, for example, a main part M1 and a reference mark M2.
  • the main part M1 has a character code indicating the nozzle type.
  • the reference mark M2 is formed at a position defined with respect to the main part M1. For example, the reference mark M2 is formed at a position closer to the nozzle hole 23 than the main portion M1.
  • FIG. 3 is a flowchart showing an example of identifying the nozzle 13 by the identification mark M.
  • FIG. 4 is a flowchart showing an example of recognition of the reference mark M2.
  • FIG. 5 is a flowchart showing an example of recognition of characters and codes (main part M1).
  • FIG. 6 is a process diagram showing the nozzle 13 identification process. This process creates a nozzle management table D3 in which the accommodation position of the nozzle 13 (position of the slot 26) in the nozzle accommodation unit 5 is associated with the nozzle type of the nozzle 13.
  • In step S1, the control unit 9 moves the machining head 3 to the reference position of the target slot 26 in the nozzle housing unit 5.
  • In step S1, the control unit 9 controls the moving unit 4 to move the machining head 3 so that the image recognition area FV of the imaging unit 6 is arranged at the reference position of the slot 26, as shown in FIG. 6.
  • the reference position of the slot 26 is set outside the base end portion 21 of the nozzle 13 and at a position where the reference mark M2 exists in the circumferential direction around the nozzle hole 23.
  • step S2 the controller 9 executes a process of recognizing the reference mark M2 of the nozzle 13 accommodated in the target slot 26 (see FIG. 4 below).
  • In step S11, the control unit 9 determines whether or not the reference mark M2 exists in the image recognition area FV.
  • control unit 9 causes the image capturing unit 6 to capture an image of the image recognition area FV.
  • the control unit 9 causes the image processing unit 8 to process the image captured by the imaging unit 6 and determines whether or not the reference mark M2 is detected in the captured image based on the processing result.
  • In step S12, the control unit 9 determines whether or not to continue the recognition of the reference mark M2.
  • For example, the control unit 9 counts the number of times step S12 has been executed and determines that recognition is to be continued when the count is equal to or less than a predetermined number.
  • When it is determined in step S12 that recognition is to be continued, the control unit 9 relatively moves the image recognition area and the nozzle 13 (step S13) and repeats the processes of steps S11 and S12.
  • The relative movement in step S13 is performed, for example, by stepping the machining head 3 in the circumferential direction around the nozzle hole 23 by driving the moving unit 4.
  • By repeating these steps, the reference mark M2 normally enters the image recognition area FV, as shown in FIG. 6B.
  • The control unit 9 determines that the recognition of the reference mark M2 is not to be continued when the processing in step S12 exceeds the predetermined number of times (step S12; NO). In that case, the control unit 9 determines that recognition of the reference mark M2 of the nozzle 13 held in the target slot 26 has failed, and ends the reference mark recognition process.
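The search loop of steps S11 to S13, with the retry limit of step S12, can be sketched as follows. The callables stand in for the imaging unit, image processing unit, and moving unit; they and the toy data are illustrative, not from the patent:

```python
def search_reference_mark(capture, detect_mark, step_circumferential, max_tries):
    """Sketch of steps S11-S13: image the recognition area FV, test for the
    reference mark M2, and step the view circumferentially around the nozzle
    hole until the mark is found or the retry budget (step S12) runs out."""
    for _ in range(max_tries):
        image = capture()             # acquire an image of area FV
        if detect_mark(image):        # step S11: mark present?
            return True               # found: continue with centering
        step_circumferential()        # step S13: move head one step
    return False                      # step S12 limit exceeded: failure

# Toy stand-ins: the mark becomes visible on the third view.
views = iter([None, None, "M2"])
found = search_reference_mark(lambda: next(views), lambda im: im == "M2",
                              lambda: None, max_tries=5)
assert found is True
```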
  • When the reference mark M2 is detected, the control unit 9 detects the position P1 of the reference mark M2 on the captured image in step S14 (see FIG. 6(c)). For example, the control unit 9 calculates the center position P1 of the reference mark M2 from the edge of the reference mark M2 detected by the image processing unit 8. Next, the control unit 9 calculates the difference Pd between the center P0 of the image recognition area FV and the position P1 of the reference mark M2 (step S15).
  • the center P0 of the image recognition area FV is the position of the optical axis 30a of the imaging optical system 30.
  • the control unit 9 calculates, for example, a difference Pd between a pixel position corresponding to the center position P1 of the reference mark M2 on the captured image and a pixel position corresponding to the center P0 of the captured image.
  • This difference value is represented, for example, as a vector whose X-direction and Y-direction components are each expressed in numbers of pixels.
  • The control unit 9 can also convert the positional difference into an actual-size value by using a conversion coefficient between the size of one pixel of the captured image and the actual size (for example, in millimeters) on the nozzle 13. This conversion coefficient can be determined in advance, for example, by imaging a scale placed at the height at which the identification mark is formed on the nozzle 13.
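The pixel-to-actual-size conversion described here can be illustrated with assumed calibration numbers; the 10 mm scale spanning 400 pixels below is an invented example, not a value from the patent:

```python
# Assumed calibration: a 10 mm scale, imaged at the height of the
# identification mark, spans 400 pixels (invented numbers).
mm_per_pixel = 10.0 / 400.0

def pixel_diff_to_mm(dx_px, dy_px):
    """Convert the per-axis pixel components of the difference Pd to mm."""
    return dx_px * mm_per_pixel, dy_px * mm_per_pixel

assert pixel_diff_to_mm(40, -8) == (1.0, -0.2)
```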
  • Next, the control unit 9 determines whether or not the absolute value |Pd| of the difference Pd is equal to or less than a threshold value (step S16). The absolute value |Pd| corresponds to the distance (deviation amount) between the center P0 of the image recognition area and the reference mark M2. The threshold value may be, for example, 0 or a numerical value in the vicinity of 0.
  • When it is determined that the absolute value |Pd| is greater than the threshold value (step S16; NO), the control unit 9 controls the moving unit 4 to move the image recognition area FV and the nozzle 13 relative to each other according to the difference Pd (step S17).
  • In step S17, the control unit 9 moves the processing head 3 using the difference Pd so that the center P0 of the image recognition area FV approaches the center position P1 of the reference mark M2. Subsequently, the control unit 9 performs the processing from step S14 to step S16 again. When it is determined that the absolute value |Pd| is equal to or less than the threshold value (step S16; YES), the recognition of the reference mark M2 is regarded as successful.
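The centering loop of steps S14 to S17 (capture, measure Pd, move, repeat until |Pd| is within the threshold) could be sketched as below. All hardware interfaces are stand-in stubs invented for illustration; the patent does not specify this code.

```python
def center_on_mark(capture, detect_mark_center, move_head, threshold_px=1.0):
    """Repeat steps S14-S17: image, compute the difference Pd, and move the
    head until the mark lies within `threshold_px` of the area center P0."""
    while True:
        image, p0 = capture()                    # captured image and its center P0
        p1 = detect_mark_center(image)           # mark center P1 (step S14)
        pd = (p1[0] - p0[0], p1[1] - p0[1])      # difference Pd (step S15)
        if (pd[0] ** 2 + pd[1] ** 2) ** 0.5 <= threshold_px:   # step S16
            return pd
        move_head(pd)                            # step S17: move toward the mark

# Simulated hardware: the mark is at (120, 90); the head starts at (100, 100).
state = {"head": [100.0, 100.0]}
MARK = (120.0, 90.0)

def capture():
    return None, (0.0, 0.0)   # image unused here; P0 is the optical-axis center

def detect_mark_center(_):
    # Mark position relative to the current head position.
    return (MARK[0] - state["head"][0], MARK[1] - state["head"][1])

def move_head(pd):
    state["head"][0] += pd[0]
    state["head"][1] += pd[1]

center_on_mark(capture, detect_mark_center, move_head)
print(state["head"])   # [120.0, 90.0] — head centered over the mark
```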
  • The control unit 9 estimates the position of the main part M1 from the recognized position of the reference mark M2 in step S3 of FIG. 3 (see FIG. 5). In step S21 of FIG. 5, the control unit 9 controls the moving unit 4 to move the processing head 3 to the recognition position of the characters (main part M1) (see FIG. 6D). In step S21, the control unit 9 controls the moving unit 4 so that at least a part of the main part M1 (for example, the beginning of the character string) enters the image recognition area FV.
  • The difference Pd calculated in step S15 may be stored as an offset value, and this offset value may be used in step S21 of FIG. 5 to move the image recognition area FV to the position of the main part M1.
  • the control unit 9 causes the imaging unit 6 to capture an image and causes the image processing unit 8 to process the captured image (step S22).
  • the image processing unit 8 performs processing such as character recognition by OCR or graphic recognition by binarization (eg, barcode recognition) (step S22).
  • the control unit 9 determines whether or not the main part M1 of the identification mark M has been successfully recognized (step S23).
  • Next, the control unit 9 determines whether or not the recognition of the main part M1 is to be continued (step S24). When it is determined that the recognition of the main part M1 is to be continued (step S24; YES), the control unit 9 changes a recognition parameter (step S25) and repeats the processing of steps S22 and S23.
  • The recognition parameters are, for example, the brightness of the illumination unit 7, the algorithm used in the process of step S22, or its parameters (e.g., the binarization threshold).
  • The control unit 9 counts the number of times the process of step S24 has been executed, and determines that the recognition of the main part M1 is not to be continued when that number reaches a specified number (step S24; NO). In this case, the control unit 9 determines that the recognition of the main part M1 has failed and ends the recognition process of the main part M1.
  • When it is determined that the recognition of the main part M1 has succeeded (step S23; YES), the control unit 9 stores the recognized information of the main part M1 in the storage unit 10.
  • the main part M1 is a character code and includes, for example, a predetermined data amount (eg, 7 digits) of numbers and symbols.
  • the number of digits that can be recognized in the recognition process of the main part M1 by one imaging is one digit.
  • The control unit 9 performs the identification processing the number of times (for example, 7 times) necessary to recognize the main part M1 having the predetermined data amount.
  • The control unit 9 stores the information obtained by the first identification in the storage unit 10, adds the information obtained by each subsequent identification, and updates the identification information.
  • Next, the control unit 9 determines whether or not to continue the recognition of the main part M1 (step S27). For example, when the acquisition of the identification information of the predetermined data amount is not completed, the control unit 9 determines to continue the recognition of the main part M1 (step S27; YES). In this case, the control unit 9 moves the machining head 3 by a predetermined amount in step S28 (see FIG. 6E) and repeats the processing from step S22 to step S27. When the acquisition of the identification information of the predetermined data amount is completed, the control unit 9 determines that the recognition of the main part M1 is not to be continued (step S27; NO). In this case, the control unit 9 completes the recognition of the main part M1.
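The accumulation described above (one character recognized per imaging, the head stepped a predetermined amount, the identification information updated until the predetermined data amount is reached) might be sketched as follows. The per-digit recognizer and head-motion callbacks are hypothetical stand-ins.

```python
def recognize_main_part(capture_digit, move_head, digits=7):
    """Steps S22-S28 sketch: recognize one character per imaging, step the
    head between captures, and accumulate the identification string."""
    identification = ""
    for i in range(digits):
        identification += capture_digit(i)   # recognize one digit and append it
        if i < digits - 1:
            move_head()                      # step S28: advance to the next character
    return identification

code = "0011800"  # hypothetical 7-digit identification code on the nozzle
result = recognize_main_part(lambda i: code[i], lambda: None)
print(result)   # 0011800
```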
  • The control unit 9 compares the information of the main part M1 recognized in step S3 with the registration information stored in advance in the storage unit 10, and determines the nozzle type (step S4).
  • This registration information is information in which identification mark information and nozzle type information are associated with each other.
  • The control unit 9 determines whether or not the nozzle type has been successfully recognized (step S5). When it is determined that the nozzle type has been successfully recognized (step S5; YES), the control unit 9 associates the XY coordinate position (nozzle position) of the slot 26 being identified with the nozzle type and registers them in the nozzle management table D3 (step S6). After the process of step S6, or when it is determined in step S5 that the recognition of the nozzle type has failed (step S5; NO), the control unit 9 determines whether or not to set the next slot 26 as a recognition target (step S7).
  • When it is determined in step S7 that the next slot 26 is to be set as a recognition target (step S7; YES), the control unit 9 sets the reference position of the slot 26 to be recognized next (step S8), and repeats the processing from step S1 to step S7.
  • When it is determined that the next slot 26 is not to be set as a recognition target (step S7; NO), the control unit 9 ends the series of processes.
  • FIGS. 7A to 7E are diagrams showing other examples of identification marks, identification marks MA to ME, respectively.
  • the identification mark MA in FIG. 7A includes the main part M1 and does not include the reference mark.
  • the main part M1 includes a three-digit number (001), a symbol ( ⁇ ), and a three-digit number (180).
  • the identification mark MA does not need to include the reference mark M2, and in this case, the recognition process of the reference mark M2 can be omitted.
  • The types (e.g., alphabetic characters, kanji, hiragana, numbers) and number of characters included in the main part M1, and the types and number of symbols, are not limited to the above example and can be set arbitrarily.
  • the main part M1 is represented by a figure.
  • the main portion M1 includes ten circles arranged in the circumferential direction of the nozzle 13, and the circle colors (for example, white and black) are set to a plurality of colors.
  • When the circles have two colors, the main part M1 can express a 10-digit binary number string, that is, 10 bits of information.
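Reading ten two-color circles as a 10-bit value could look like the sketch below. The bit convention (black = 1, white = 0) and the reading order are assumptions for illustration; the patent only states that ten circles encode 10 bits.

```python
def decode_circles(colors):
    """Decode ten circles arranged around the nozzle hole as a 10-bit value,
    treating black as 1 and white as 0 (an assumed convention)."""
    bits = "".join("1" if c == "black" else "0" for c in colors)
    return int(bits, 2)

colors = ["black", "white"] * 5      # alternating pattern around the nozzle
print(decode_circles(colors))        # 682 (0b1010101010)
```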
  • the identification mark MC in FIG. 7C includes ten fan-shaped figures arranged in the circumferential direction of the nozzle 13.
  • the shape, number, and color type of the figure are not limited to the above example and can be arbitrarily set.
  • the identification mark ME in FIG. 7E is a so-called one-dimensional barcode. Note that a two-dimensional barcode may be used instead of the one-dimensional barcode. Further, the identification marks MA to ME shown in FIGS. 7A to 7E may be combined with the reference mark M2.
  • FIGS. 8A to 8D are diagrams showing examples of nozzles having different nozzle types.
  • The nozzles 13A to 13C in FIGS. 8A to 8C each have one nozzle hole 23, and their nozzle hole diameters φ are 3 mm, 2 mm, and 1.5 mm, respectively.
  • The nozzle types of the nozzles 13A to 13C are represented, for example, as “normal nozzle, φ3 mm”, “normal nozzle, φ2 mm”, and “normal nozzle, φ1.5 mm”.
  • The nozzle 13D of FIG. 8D has double nozzle holes 23, and the inner nozzle hole diameter φ is 2 mm.
  • The nozzle type of the nozzle 13D is represented, for example, as “double nozzle, φ2 mm”.
  • the nozzle shown in FIG. 8 is an example, and the nozzle type is not limited.
  • the nozzle type may be classified according to the size of the nozzle outer diameter, the nozzle shape, the nozzle surface treatment, and the like.
  • Classification by the dimensions of the nozzle outer diameter can be based, for example, on the outer diameter at the distal or proximal end, the nozzle height, or the like.
  • Classification by nozzle shape can be based, for example, on a flat profile in the nozzle height direction, a steeply inclined profile, or a step formed on part of the outer periphery.
  • Classification by nozzle surface treatment can be based, for example, on the presence or absence of plating applied to the nozzle surface, the type of plating, and the like. The nozzle type may also be classified in ways other than those described above.
  • FIG. 9 shows a method for recognizing the nozzle type from the shape feature of the nozzle 13.
  • FIG. 9 is a flowchart showing another example of identifying nozzles.
  • the same reference numerals are given to those that perform the same processing as in FIG. 3, and the description thereof is omitted or simplified.
  • the control unit 9 moves the machining head 3 to the center position of the target slot 26.
  • the control unit 9 controls the moving unit 4 to move the processing head 3 so that the image recognition area FV is arranged in the nozzle hole 23 of the nozzle 13.
  • The image recognition area FV at this time may be set, for example, to a size that allows the entire base end portion 21 of the nozzle 13 to be imaged, or an image may be obtained by scanning a small image recognition area FV as in FIG. 6.
  • In step S32, the control unit 9 causes the imaging unit 6 to capture an image and causes the image processing unit 8 to process the captured image, thereby extracting feature amounts such as the edge of the nozzle hole 23.
  • In step S33, the control unit 9 collates the feature amounts extracted in step S32 with a database in which feature amounts and nozzle types are associated with each other. For example, from the edge of the nozzle hole 23 detected in step S32, the control unit 9 determines whether the nozzle is a normal nozzle (see FIGS. 8A to 8C) or a double nozzle (see FIG. 8D). The control unit 9 also calculates the nozzle hole diameter from the edge of the nozzle hole 23 detected in step S32.
  • The control unit 9 searches the database for information indicating the distinction between the normal nozzle and the double nozzle and information indicating the nozzle hole diameter, and acquires the corresponding nozzle type. Thereafter, the control unit 9 performs the processing from step S5 to step S8 described with reference to FIG. 3, associates the slot position with the nozzle type for each nozzle determined in step S5 to have been successfully recognized, and creates the nozzle management table D3.
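The feature-to-type collation of step S33 might be sketched as follows. The feature database entries, the diameter tolerance, and the function name are hypothetical; the patent only states that hole count (normal/double) and hole diameter are matched against a database.

```python
def classify_nozzle(hole_count, hole_diameter_mm, tolerance=0.25):
    """Step S33 sketch: match extracted features (number of detected hole
    edges and measured hole diameter) against a feature/type database."""
    database = [
        (1, 3.0, "normal nozzle, φ3 mm"),
        (1, 2.0, "normal nozzle, φ2 mm"),
        (1, 1.5, "normal nozzle, φ1.5 mm"),
        (2, 2.0, "double nozzle, φ2 mm"),
    ]
    for holes, diameter, nozzle_type in database:
        if holes == hole_count and abs(diameter - hole_diameter_mm) <= tolerance:
            return nozzle_type
    return None   # recognition failed (corresponds to step S5; NO)

print(classify_nozzle(1, 2.04))   # normal nozzle, φ2 mm
print(classify_nozzle(2, 2.1))    # double nozzle, φ2 mm
```

A measured diameter rarely matches the nominal value exactly, so a small tolerance window is used when comparing against each database entry.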
  • The control unit 9 may perform only one of the nozzle identification process using the identification mark M (see FIGS. 4 and 5) and the nozzle identification process using the shape (see FIG. 9), or may perform both. For example, the nozzle determination unit 42 (see FIG. 1) of the control unit 9 may determine whether the result of the identification process using the identification mark M and the result of the identification process using the shape match, register the nozzle type in the nozzle management table D3 if they match, and not register it if they do not match.
  • FIG. 10 is a flowchart showing a nozzle mounting method.
  • FIG. 11A shows an example of the nozzle management table D3
  • FIG. 11B shows an example of the processing table D4.
  • the nozzle management table D3 in FIG. 11A includes a management number item, a nozzle position item, and a nozzle type item.
  • The management number is an arbitrarily set ID, and may be, for example, the number of the slot 26 of the nozzle accommodating portion 5.
  • the nozzle position is a position where the nozzle 13 is arranged in the nozzle accommodating portion 5, and may be, for example, the XY coordinates of the slot 26.
  • the nozzle position and the nozzle type are associated by a management number.
  • For example, the nozzle 13 corresponding to management number 1 is a normal nozzle with a nozzle hole diameter of 3 mm, located at X-direction coordinate X1 and Y-direction coordinate Y1.
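A minimal sketch of the nozzle management table D3 is shown below: the management number associates a slot position with the recognized nozzle type. The coordinate values are invented placeholders (the patent uses symbolic X1, Y1), and `position_of` is a hypothetical helper corresponding to the lookup of step S44.

```python
# Nozzle management table D3: management number -> slot position and nozzle type.
nozzle_table_d3 = {
    1: {"position": (10.0, 20.0), "type": "normal nozzle, φ3 mm"},
    2: {"position": (30.0, 20.0), "type": "double nozzle, φ2 mm"},
}

def position_of(nozzle_type):
    """Return the slot position registered for a required nozzle type,
    or None if no such nozzle is registered."""
    for entry in nozzle_table_d3.values():
        if entry["type"] == nozzle_type:
            return entry["position"]
    return None

print(position_of("double nozzle, φ2 mm"))   # (30.0, 20.0)
```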
  • In step S42 of FIG. 10, the control unit 9 acquires the processing conditions.
  • The processing conditions may be input by an operator, or may be sent to the control unit 9 by wire or wirelessly from a host controller (for example, a controller that supervises the system) based on the workpiece W carried into the laser processing machine 1.
  • the control unit 9 acquires processing conditions from an operator input or the like.
  • In step S43, the control unit 9 collates the processing conditions with the processing table D4, determines the nozzle type according to the processing conditions, and specifies the nozzle 13 to be mounted.
  • This step S43 is performed by the mounting nozzle specifying unit 43 (see FIG. 1) of the control unit 9.
  • the processing table D4 includes a management number item, a processing condition item, and a nozzle type item.
  • the management number is an ID that is arbitrarily set.
  • the processing conditions include items such as cutting speed, laser oscillation output, and assist gas pressure.
  • The processing table D4 may be input in advance by an operator, or may be sent from the host controller to the control unit 9 as appropriate. The nozzle type corresponding to each set of processing conditions is selected and registered in advance.
  • the mounting nozzle specifying unit 43 specifies the nozzle 13 of the nozzle type that matches the given processing condition as the mounting target nozzle.
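The collation of step S43 (processing conditions against the processing table D4) might look like the sketch below. The condition names and numeric values are invented for illustration; the patent lists cutting speed, laser oscillation output, and assist gas pressure as example items.

```python
def select_nozzle_type(conditions, processing_table_d4):
    """Step S43 sketch: find the D4 entry whose registered conditions match
    the given processing conditions, and return its nozzle type."""
    for entry in processing_table_d4:
        if all(entry["conditions"].get(k) == v for k, v in conditions.items()):
            return entry["nozzle_type"]
    return None

d4 = [
    {"conditions": {"cutting_speed": 2000, "laser_output": 4000, "gas_pressure": 0.8},
     "nozzle_type": "normal nozzle, φ2 mm"},
    {"conditions": {"cutting_speed": 1000, "laser_output": 6000, "gas_pressure": 1.2},
     "nozzle_type": "double nozzle, φ2 mm"},
]
given = {"cutting_speed": 1000, "laser_output": 6000, "gas_pressure": 1.2}
print(select_nozzle_type(given, d4))   # double nozzle, φ2 mm
```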
  • In step S44, the control unit 9 collates the nozzle type of the mounting target nozzle 13 specified in step S43 with the nozzle management table D3, and acquires the position of the mounting target nozzle 13.
  • The control unit 9 then controls the moving unit 4 based on the information on the position (slot position or XY coordinates) of the mounting target nozzle 13, moves the machining head 3 to that position, and mounts the target nozzle on the machining head 3 (step S45). As described above, the nozzle 13 can be mounted automatically by performing steps S41 to S45.
  • FIG. 12 is a flowchart showing another example of the nozzle mounting method.
  • In FIG. 12, the nozzle mounting process is performed in a state where the mounting target nozzle 13 is not registered in the nozzle management table D3, or where the nozzle management table D3 does not exist.
  • the same reference numerals are assigned to the same processes as those in FIGS. 3 and 9, and the description thereof is omitted or simplified.
  • The control unit 9 recognizes the nozzle type of the nozzle 13 in the target slot through the processing of steps S31 to S33 in FIG. 9 and step S5 in FIG. 3.
  • The control unit 9 determines whether or not the recognized nozzle type is the nozzle type to be mounted (step S51).
  • The nozzle type to be mounted is determined by the control unit 9 with reference to, for example, the processing table D4. When the control unit 9 determines that the recognized nozzle type is not the mounting target nozzle type (step S52; NO), or determines in step S5 that the recognition of the nozzle type has failed, it sets the reference position of the next slot 26 (step S53) and repeats the processing from step S31 to step S5.
  • When it is determined that the recognized nozzle type is the nozzle type to be mounted, the control unit 9 mounts the nozzle 13 on the processing head 3 (step S54). That is, when the mounting target nozzle 13 is found, the control unit 9 mounts it on the processing head 3 without recognizing the next nozzle 13. In this case, since the imaging unit 6 acquires images through the processing head 3, the nozzle 13 can be mounted on the processing head 3 without greatly moving the processing head 3, and the time required for nozzle mounting can be reduced. Specifically, if the imaging unit 6 captured images from outside the processing head 3, the processing head 3 and the imaging unit 6 would be located at different positions, and it would be necessary to move the processing head 3 from the imaging position where the imaging unit 6 faces the nozzle 13 to the mounting position where the processing head 3 faces the nozzle 13. In the present embodiment, the imaging position and the mounting position are the same or close to each other, so the nozzle 13 can be attached to the processing head 3 without moving the processing head 3 or with only fine adjustment.
  • the control unit 9 may register the identified nozzle type in the nozzle management table D3 until the nozzle 13 to be mounted is found. In this case, when the nozzle 13 to be mounted is registered in the nozzle management table D3 in the subsequent nozzle replacement, the nozzle replacement can be performed using the nozzle management table D3. In addition, when the nozzle 13 to be mounted is not in the nozzle management table D3 in the nozzle replacement after the next time, nozzle identification is performed from a nozzle position that is not in the nozzle management table D3. Thereby, the number of nozzles 13 to be identified can be reduced. Further, when the mounting target nozzle 13 is registered in the nozzle management table D3, the nozzle identification may be performed again at the nozzle position. The nozzle 13 may be mounted if the result of the newly performed nozzle identification matches the information in the nozzle management table D3, and if not matched, the mounting of the nozzle 13 may be stopped to notify the operator of the abnormality.
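The sequential method of FIG. 12 (identify slots one by one, register results along the way, and stop as soon as the target type is found) might be sketched as follows. All names, slot coordinates, and callbacks are hypothetical stand-ins for the machine's actual interfaces.

```python
def find_and_mount(slots, identify, target_type, mount):
    """FIG. 12 sketch: identify each slot in turn, recording results in a
    nozzle management table, and mount as soon as the target type is found,
    skipping the remaining slots (steps S31 through S54)."""
    table_d3 = {}
    for slot, position in slots.items():
        nozzle_type = identify(position)        # steps S31-S33
        table_d3[slot] = nozzle_type            # register while searching
        if nozzle_type == target_type:          # step S51; YES
            mount(position)                     # step S54
            return slot, table_d3
    return None, table_d3

slots = {1: (0, 0), 2: (10, 0), 3: (20, 0)}
types = {(0, 0): "normal nozzle, φ3 mm",
         (10, 0): "double nozzle, φ2 mm",
         (20, 0): "normal nozzle, φ1.5 mm"}
mounted = []
slot, d3 = find_and_mount(slots, types.get, "double nozzle, φ2 mm", mounted.append)
print(slot, mounted)   # 2 [(10, 0)]
print(3 in d3)         # False: slot 3 was never identified
```

Because the search stops at the first match, slot 3 is never imaged, which is the time saving the text describes; the partially filled table can still seed the nozzle management table for later exchanges.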
  • the control unit 9 controls the laser oscillator 2 and irradiates the machining position of the workpiece W with laser light from the machining head 3 to process the workpiece W. At that time, the control unit 9 controls the imaging unit 6 to cause the imaging unit 6 to capture the machining state of the workpiece W by the laser light.
  • the machining state detection unit 45 of the control unit 9 detects the machining state of the workpiece W from the image obtained by the imaging unit 6 imaging the workpiece W being processed or processed. The machining state detection unit 45 may determine whether the machining state of the workpiece W is good or bad as detection of the machining state of the workpiece W.
  • the processing state detection unit 45 may extract numerical information such as a cutting width of laser processing from the image of the workpiece W.
  • the control unit 9 may execute feedback control of laser processing based on this numerical information.
  • the imaging unit 6 can be shared for both the identification of the nozzle 13 and the detection of the machining state, and an increase in device cost can be suppressed.
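The patent notes that numerical information such as a cutting width can be extracted from the workpiece image. Purely as a hypothetical illustration (the patent does not specify any algorithm), a kerf width could be estimated from one binarized image row as follows; the 0.05 mm/pixel scale is an assumed calibration.

```python
def kerf_width_pixels(row):
    """Estimate the cutting (kerf) width along one binarized image row:
    the span between the first and last 'cut' pixel (value 1)."""
    cut = [i for i, v in enumerate(row) if v == 1]
    if not cut:
        return 0
    return cut[-1] - cut[0] + 1

row = [0, 0, 1, 1, 1, 1, 0, 0]           # one row of a binarized workpiece image
print(kerf_width_pixels(row))            # 4 pixels
print(kerf_width_pixels(row) * 0.05)     # 0.2 mm at an assumed 0.05 mm/pixel
```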
  • FIG. 13 is a diagram showing another example of the illumination unit.
  • the same members as those in FIGS. 1 and 2 are denoted by the same reference numerals, and the description thereof is omitted or simplified.
  • In FIG. 13, the illumination unit 7A is provided outside the body 11, below the processing head 3.
  • The illumination unit 7A is supported by the lower end of the body 11.
  • The illumination unit 7A includes a light source unit 37 in which a plurality of light sources (for example, LEDs) are arranged in a ring shape. When the illumination unit 7A is externally attached to the processing head 3 in this way, the illumination unit 7A can be easily replaced or inspected.
  • The technical scope of the present invention is not limited to the above-described embodiment.
  • One or more of the requirements described in the above embodiment may be omitted.
  • The requirements described in the above embodiment can be combined as appropriate.
  • Although the above embodiment does not mention the imaging magnification of the imaging unit 6, imaging may be performed by switching the image recognition area FV of the imaging unit 6 between a low magnification and a high magnification, for example.
  • For example, the image recognition area FV may first be imaged at a low magnification, and after the presence of the identification mark M is confirmed, the center of the image recognition area FV may be moved to that position and imaging switched to a high magnification so that the identification mark M is imaged at an enlarged scale.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Plasma & Fusion (AREA)
  • Mechanical Engineering (AREA)
  • Laser Beam Processing (AREA)

Abstract

The present invention: makes it possible to recognize a nozzle to be mounted, prevents an incorrect nozzle from being mounted, supports automation of nozzle mounting, and reduces the amount of time for mounting a nozzle. Provided is a laser processing machine (1) for processing a workpiece (W), wherein the laser processing machine (1) includes: a hollow processing head (3) that has an exchangeable nozzle (13) and guides a laser beam for processing the workpiece (W) so that it is emitted from the nozzle (13); an imaging unit (6) for obtaining, through the inside of the processing head (3), an image of the nozzles (13) held in a nozzle-accommodating unit (5) in which a plurality of nozzles (13) are arranged; and a nozzle identifying unit (41) for recognizing the nozzle type from the image of the nozzles (13) obtained by the imaging unit (6).

Description

Laser processing machine and nozzle mounting method
 The present invention relates to a laser processing machine and a nozzle mounting method.
 Laser processing machines are used for processing such as cutting a workpiece with laser light, and machines in which the nozzle that emits the laser light is replaceable are known. For this nozzle, it is necessary to select the optimum type (hole diameter, normal/double, etc.) according to the workpiece. Conventionally, an operator referred to the processing conditions and replaced the nozzle manually. It has also been proposed to confirm that the nozzle mounted on the processing head is appropriate by recognizing an image of the mounted nozzle or a mark formed on the nozzle (see Patent Documents 1 and 2). In addition, to automate nozzle replacement, an exchange system called a nozzle changer has been proposed. In a nozzle changer, a plurality of nozzles are arranged in a nozzle accommodating portion, and one of the nozzles is selected and attached to the processing head. In this case, the operator either sets a nozzle of a predetermined type at a predetermined position of the nozzle accommodating portion, or, after arranging a plurality of nozzles in the nozzle accommodating portion, sets the position and type of each nozzle in the nozzle changer through a user interface.
 Patent Document 1: JP-A-2005-334922. Patent Document 2: JP-A-2004-322127.
 When an operator replaces nozzles manually, not only is automation hindered, but mounting a wrong type of nozzle by mistake leads to defective processing of the workpiece. Further, since Patent Documents 1 and 2 both check the nozzle after mounting, an incorrect nozzle must be remounted by the operator or the like, so troublesome work by the operator remains. Even when a nozzle changer is used, if the operator sets a different type of nozzle by mistake, the wrong nozzle is mounted, leading to defective processing of the workpiece. Similarly, when the position and type of each nozzle are set through a user interface, an operator's erroneous setting leads to defective processing of the workpiece. Furthermore, having the operator install nozzles or perform settings through a user interface takes time, which reduces productivity.
 The present invention has been made in view of the above circumstances, and an object thereof is to provide a laser processing machine and a nozzle mounting method that make it possible to recognize the nozzle to be mounted, avoid mounting an incorrect nozzle, support automation of nozzle mounting, and mount a nozzle in a short time.
 The laser processing machine of the present invention includes: a hollow processing head that has a replaceable nozzle and guides laser light for processing a workpiece so that it is emitted from the nozzle; an imaging unit that acquires, through the inside of the processing head, an image of the nozzles held in a nozzle accommodating portion in which a plurality of nozzles are arranged; and a nozzle identification unit that recognizes the nozzle type from the nozzle image acquired by the imaging unit.
 The laser processing machine may include an illumination unit that irradiates illumination light onto the nozzle whose image is to be acquired by the imaging unit. The illumination unit may irradiate the nozzle with illumination light through the inside of the processing head. An identification mark may be formed on the nozzle at a position that can be imaged by the imaging unit, and the nozzle identification unit may recognize the nozzle type by discriminating the identification mark in the image acquired by the imaging unit. The identification mark may be formed around the nozzle hole as viewed from the penetration direction of the nozzle hole. The identification mark may include a main part carrying the nozzle type and a reference mark formed at a position defined relative to the main part. A nozzle determination unit may be included that determines whether the nozzle type recognized from the identification mark matches the nozzle type recognized by extracting characteristic portions of the nozzle shape from the image acquired by the imaging unit. The nozzle identification unit may recognize the nozzle type by extracting characteristic portions of the nozzle shape from the image acquired by the imaging unit. A moving unit that relatively moves the processing head and the nozzle accommodating portion may be included. The nozzle identification unit may recognize the nozzle position based on the amount of relative movement between the processing head and the nozzle accommodating portion by the moving unit. The moving unit may relatively move the processing head and the nozzle accommodating portion in a circumferential direction centered on the penetration direction of the nozzle hole, thereby scanning the image recognition area of the imaging unit in the circumferential direction. For the plurality of nozzles arranged in the nozzle accommodating portion, the machine may include a nozzle management table that associates the nozzle positions recognized by the nozzle identification unit with the nozzle types, a processing table that associates a plurality of processing conditions according to the workpiece with nozzle types, and a mounting nozzle specifying unit that specifies the nozzle to be mounted by referring to the processing table and the nozzle management table based on the processing conditions. A nozzle mounting instruction unit may be included that instructs mounting of a nozzle from the nozzle accommodating portion onto the processing head, the instruction being issued either after the nozzle types of all the nozzles in the nozzle accommodating portion have been acquired, or when the nozzle to be mounted is found while the nozzle types are acquired sequentially. The imaging unit may be capable of imaging a workpiece being processed or already processed by the laser light, and a processing state detection unit may be included that detects the processing state of the workpiece from the image captured by the imaging unit.
 The nozzle mounting method of the present invention is a method for a laser processing machine including a hollow processing head that has a replaceable nozzle and guides laser light for processing a workpiece so that it is emitted from the nozzle, and an imaging unit that acquires, through the inside of the processing head, an image of the nozzles held in a nozzle accommodating portion in which a plurality of nozzles are arranged, the method including: recognizing the nozzle type from the nozzle image acquired by the imaging unit; and mounting, among the recognized nozzles, the nozzle to be mounted on the processing head.
 According to the present invention, since an image of each nozzle in the nozzle accommodating portion is acquired and recognized by the nozzle identification unit, mounting of an incorrect nozzle can be prevented and automation of nozzle mounting can be supported. Further, since the nozzle image is acquired through the inside of the processing head, the nozzle can be mounted on the processing head in a short time once the nozzle to be mounted is found, shortening the time required for nozzle mounting.
 また、照明部を含むものでは、照明光によりノズルを照明するので良好な画像を取得できる。また、加工ヘッド内を介してノズルに照明光を照射するものでは、撮像する領域に効率よく照明光を照射できる。また、ノズル識別部が、ノズルの識別マークによりノズル種別を認識するものでは、識別マークによりノズル種別を効率よく認識できる。また、識別マークがノズル穴の周囲に形成されたものでは、撮像部によって識別マークを容易に撮像できる。また、識別マークが主要部と基準マークとを含むものでは、基準マークを見つけることにより撮像部の画像認識エリアを確実に主要部に案内できる。また、ノズル判定部を含むものでは、識別マークにより認識したノズル種別が実際のノズル種別と一致するかを判定できる。また、ノズル識別部がノズルの形状の特徴部分を抽出してノズル種別を認識するものでは、ノズルに識別マーク等の形成が不要となる。また、加工ヘッドとノズル収容部とを相対的に移動させる移動部を含むものでは、移動部によって撮像部の画像認識エリアを容易に移動できる。また、ノズル識別部が移動部による加工ヘッドとノズル収容部との移動量に基づいてノズル位置を認識するものでは、ノズル収容部等にランダムに配置された各ノズルのノズル位置を容易に取得できる。また、移動部によって撮像部の画像認識エリアをノズル穴の周方向に走査するものでは、ノズル穴の周りに形成された識別マークを効率よく撮像できる。また、ノズル管理テーブルと加工テーブルと装着ノズル特定部とを含むものでは、装着対象のノズルをワークの加工条件に基づいて容易に特定できる。また、ノズル装着指示部を含むものでは、ノズル収容部の全てのノズルに関するノズル種別を取得することで、後に装着するノズルの管理が容易となり、またはノズル収容部から装着対象のノズルを見つけた際にそのノズルの装着を行うので、いずれの場合も誤ったノズルの装着を防止でき、ノズル交換の自動化に対応できる。また、撮像部がワークの加工状態を検出する加工状態検出部を含むものでは、ワークの加工状態の確認と、ノズル種別の認識とで撮像部を兼用できる。 Moreover, in the case of including an illumination unit, a good image can be acquired because the nozzle is illuminated with illumination light. In addition, in the case of irradiating the nozzle with illumination light through the processing head, the illumination light can be efficiently applied to the area to be imaged. Further, when the nozzle identification unit recognizes the nozzle type from the identification mark of the nozzle, the nozzle type can be efficiently recognized from the identification mark. Further, when the identification mark is formed around the nozzle hole, the identification mark can be easily imaged by the imaging unit. In addition, when the identification mark includes a main part and a reference mark, the image recognition area of the imaging unit can be reliably guided to the main part by finding the reference mark. In addition, the apparatus including the nozzle determination unit can determine whether the nozzle type recognized by the identification mark matches the actual nozzle type. 
When the nozzle identification unit extracts a characteristic portion of the nozzle shape to recognize the nozzle type, no identification mark or the like needs to be formed on the nozzle. When a moving unit that moves the processing head and the nozzle housing unit relative to each other is included, the image recognition area of the imaging unit can be moved easily by the moving unit. When the nozzle identification unit recognizes the nozzle position based on the amount of relative movement between the processing head and the nozzle housing unit, the position of each nozzle placed at random in the nozzle housing unit or the like can be obtained easily. When the moving unit scans the image recognition area of the imaging unit in the circumferential direction of the nozzle hole, an identification mark formed around the nozzle hole can be imaged efficiently. When a nozzle management table, a processing table, and a mounting-nozzle specifying unit are included, the nozzle to be mounted can be specified easily based on the processing conditions of the workpiece. When a nozzle mounting instruction unit is included, either the nozzle types of all the nozzles in the nozzle housing unit are acquired, which facilitates later management of the nozzles to be mounted, or the target nozzle is mounted as soon as it is found in the nozzle housing unit; in either case, mounting of an incorrect nozzle can be prevented and automated nozzle replacement is supported. When a processing state detection unit that detects the processing state of the workpiece from images of the imaging unit is included, the imaging unit can serve both for checking the processing state of the workpiece and for recognizing the nozzle type.
FIG. 1 is a diagram showing an example of a laser processing machine according to the embodiment.
FIG. 2(a) is a diagram showing a state in which the processing head faces the nozzle housing unit, FIG. 2(b) is a plan view of the nozzle housing unit, and FIG. 2(c) is a plan view showing an example of the nozzle.
FIG. 3 is a flowchart showing an example of identifying a nozzle.
FIG. 4 is a flowchart showing an example of recognizing the reference mark.
FIG. 5 is a flowchart showing an example of recognizing characters and a code.
FIGS. 6(a) to 6(e) are process diagrams showing the nozzle identification process.
FIGS. 7(a) to 7(e) are plan views showing other examples of the identification mark.
FIGS. 8(a) to 8(e) are plan views showing examples of nozzle types.
FIG. 9 is a flowchart showing another example of identifying a nozzle.
FIG. 10 is a flowchart showing an example of a nozzle mounting method.
FIG. 11(a) is a diagram showing an example of a nozzle management table, and FIG. 11(b) is a diagram showing an example of a processing table.
FIG. 12 is a flowchart showing another example of the nozzle mounting method.
FIG. 13 is a diagram showing another example of the illumination unit.
 Hereinafter, embodiments of the present invention will be described with reference to the drawings; however, the present invention is not limited to them. In the drawings, to illustrate the embodiments, the scale is changed as appropriate, for example by enlarging or emphasizing certain parts. In the following figures, directions are described using an XYZ coordinate system in which the horizontal directions are the X and Y directions and the vertical direction, perpendicular to the XY plane, is the Z direction.
 FIG. 1 shows a laser processing machine 1 according to the present embodiment. The laser processing machine 1 irradiates a workpiece W (object to be processed) with laser light and can perform processing such as cutting, forming recesses such as grooves, drilling through holes and slits, and marking. As shown in FIG. 1, the laser processing machine 1 includes a laser oscillator 2, a processing head 3, a moving unit 4, a nozzle housing unit 5, an imaging unit 6, an illumination unit 7, an image processing unit 8, a control unit 9, and a storage unit 10.
 The laser oscillator 2 generates, for example, infrared laser light as the laser light for processing the workpiece W. The laser light emitted from the laser oscillator 2 is supplied to the processing head 3 via a light guide member such as an optical fiber. The processing head 3 irradiates the supplied laser light toward the workpiece W (downward). The processing head 3 includes a body 11 (processing head main body), an output optical system 12 housed in the body 11, and a nozzle 13 disposed on the light output side of the output optical system 12.
 The output optical system 12 includes a lens 15, a mirror 16, a lens 17, and the like, supported by the body 11. The laser light, for example, travels horizontally from the side of the body 11 and enters the mirror 16 through the lens 15. The mirror 16 is, for example, a dichroic mirror that reflects the laser light (infrared laser light) and transmits light other than the laser light (for example, visible light). The laser light reflected by the mirror 16 travels downward. The optical axis 12a on the output side of the output optical system 12 is set, for example, in the vertical direction. The laser light reflected by the mirror 16 passes through the lens 17 and the nozzle 13 and is applied to the workpiece W. The lens 17 condenses the laser light so that it forms a spot of a predetermined size on the workpiece W.
 The nozzle 13 is attached to the body 11 in a replaceable manner, at an opening 20 at the lower end of the body 11. The nozzle 13 includes a base end portion 21 attached to the body 11 and a tip portion 22 directed toward the workpiece. The base end portion 21 is cylindrical and has a threaded portion on its outer peripheral surface, and the opening 20 of the body 11 has a threaded portion that can be screwed onto it; the nozzle 13 is attached to and detached from the body 11 by rotating the nozzle 13 and the opening 20 relative to each other. A mechanism for this relative rotation is provided in one or both of the processing head 3 and the nozzle housing unit 5 described later. Note that the attachment of the nozzle 13 to the body 11 is not limited to a screw connection; for example, the nozzle 13 may be attached to the body 11 by a mechanism that clamps part of the nozzle, by an electromagnet, or the like.
 The tip portion 22 of the nozzle 13 is directed toward the workpiece W when the base end portion 21 is attached to the body 11. The nozzle 13 has a nozzle hole 23 (see FIG. 2(c)) penetrating from the base end portion 21 to the tip portion 22, and emits laser light to the workpiece W through this nozzle hole 23. A plurality of nozzles 13 are prepared according to the processing conditions and are exchanged as appropriate. The types of nozzles 13 are distinguished, for example, by different diameters of the nozzle hole 23 or by the presence of an assist gas injection port concentric with the nozzle hole 23. These nozzles 13 are placed in the nozzle housing unit 5.
 The moving unit 4 can move the workpiece W and the processing head 3 relative to each other. For example, the laser processing machine 1 holds the workpiece W with a workpiece holder (not shown) and moves the processing head 3 in the X, Y, and Z directions with the moving unit 4, thereby moving the workpiece W and the processing head 3 relative to each other. The laser processing machine 1 of this embodiment processes the workpiece W by irradiating it with laser light from the processing head 3 while moving the processing head 3 with the moving unit 4.
 The nozzle housing unit 5 includes a plurality of slots 26 in which a plurality of nozzles 13 are placed. Each nozzle 13 is placed in a slot 26 with its tip portion 22 facing downward. The nozzle housing unit 5 may be installed at a position accessible to the processing head 3, or may be formed so as to be movable to below the processing head 3. The nozzle housing unit 5 includes a lid (not shown) and is formed so that the lid is opened when a nozzle 13 is replaced. The nozzle housing unit 5 may also hold a placed nozzle 13 by suction.
 The imaging unit 6 acquires the images used to recognize the nozzles 13. The position of the imaging unit 6 relative to the processing head 3 is fixed. The imaging unit 6 includes an imaging optical system 30 that forms an image of an object below the opening 20, and is housed in a space partitioned by a window 31 on the upper side of the body 11. The imaging optical system 30 uses the lens 17 of the output optical system 12 in addition to a lens 32. The optical axis 30a of the imaging optical system 30 is coaxial with part of the optical axis 12a of the output optical system 12. The image of an object below the opening 20 enters the imaging unit 6 through the lens 17, the mirror 16, a half mirror 33 described later, and the lens 32. Note that the optical axis 30a of the imaging optical system 30 is not limited to being coaxial with part of the optical axis 12a of the output optical system 12; it may be set off that axis and arranged as a separate axis.
 The imaging unit 6 uses an image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor) sensor to capture the image formed by the imaging optical system 30. The imaging unit 6 is housed in a casing 34 provided on the body 11 of the processing head 3. The imaging unit 6 can image the processing position on the workpiece W through the nozzle hole 23 of the nozzle 13, and can also image the state of laser processing of the workpiece W. The imaging unit 6 outputs the captured image information to the image processing unit 8 described later.
 The illumination unit 7 irradiates illumination light onto the nozzle 13 whose image is to be acquired by the imaging unit 6. For example, the illumination unit 7 irradiates the field of view of the imaging unit 6 with illumination light through the processing head 3. The illumination unit 7 is housed in and fixed to the body 11, and includes a light source 35, a lens 36, and the half mirror 33. The light source 35 is, for example, an LED (Light-Emitting Diode) and emits light in a wavelength band to which the imaging unit 6 is sensitive (for example, visible light). Using an incoherent light source such as an LED prevents interference fringes from appearing in the images captured by the imaging unit 6, so the nozzle can be recognized more accurately. The light from the light source 35 passes through the lens 36, is reflected by the half mirror 33, passes through the mirror 16 and then through the lens 17, and is emitted below the opening 20 of the body 11. The laser processing machine 1 need not include the illumination unit 7; imaging by the imaging unit 6 may instead rely on, for example, external light.
 The image processing unit 8 performs image processing on the image information output from the imaging unit 6, for example gamma correction and white balance as well as binarization and pattern matching. The image processing unit 8 supplies the processed image information to the control unit 9. The image processing unit 8 also controls the illumination by the illumination unit 7 and the imaging by the imaging unit 6; alternatively, the imaging unit 6 and the illumination unit 7 may be controlled by the control unit 9 described later.
 The control unit 9 performs overall control of the laser processing machine 1, including the laser oscillator 2 and the moving unit 4. The control unit 9 includes, for example, a CPU (Central Processing Unit), and controls each unit by loading programs stored in the storage unit 10 or the like and executing them on the CPU. The control unit 9 includes a nozzle identification unit 41, a nozzle determination unit 42, a mounting-nozzle specifying unit 43, a nozzle mounting instruction unit 44, and a processing state detection unit 45.
 The nozzle identification unit 41 recognizes the nozzle type from the image of the nozzle 13 acquired by the imaging unit 6. Here, nozzles 13 exist in a plurality of varieties (nozzle forms) defined by arbitrary combinations of, for example, shape (shape of the nozzle hole, external shape of the nozzle), dimensions (nozzle hole diameter, nozzle outer diameter, nozzle height), and surface treatment (presence and kind of plating). The nozzle type is therefore used as a classification of these varieties, in other words, as an identifier specifying the variety (nozzle form) of each nozzle. Although details are described later, the shape of the nozzle hole means, for example, a single nozzle hole or concentric double nozzle holes, and the external shape of the nozzle means, for example, a flat, steeply tapered, or stepped shape.
 The nozzle identification unit 41 recognizes the nozzle position based on the amount of relative movement between the processing head 3 and the nozzle housing unit 5 produced by the moving unit 4. For example, the nozzle identification unit 41 obtains the amount of movement of the processing head 3 by the moving unit 4 and calculates in which slot 26 (see FIG. 2(b)) the nozzle 13 whose type is being recognized is placed. The nozzle identification unit 41 then generates, for example, data associating the position of the slot 26 (for example, its XY coordinates) with the nozzle type of the nozzle 13 held in that slot, and registers this data in the nozzle management table D3 of the storage unit 10.
 For a nozzle 13 whose type has already been recognized, the control unit 9 can find the nozzle 13 to be mounted by referring to the nozzle management table D3 without recognizing the type again. The nozzle management table D3 is information that associates, for the plurality of nozzles 13 placed in the nozzle housing unit 5, the nozzle positions recognized by the nozzle identification unit 41 with their nozzle types.
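The relationship described above can be sketched as a simple lookup structure. The following is a minimal illustration only; the slot coordinates, nozzle type names, and helper functions are hypothetical and not part of the patent.

```python
# Sketch of the nozzle management table D3: it maps each slot position
# in the nozzle housing unit to the nozzle type recognized there.
# Coordinates and type names below are invented examples.

def slot_from_movement(origin_xy, movement_xy):
    """Derive a slot position from the head's movement amount
    relative to a known origin of the nozzle housing unit."""
    return (origin_xy[0] + movement_xy[0], origin_xy[1] + movement_xy[1])

nozzle_management_table = {}  # (x, y) slot position -> nozzle type

def register_nozzle(slot_xy, nozzle_type):
    """Associate a slot position with a recognized nozzle type."""
    nozzle_management_table[slot_xy] = nozzle_type

def find_slot_for_type(nozzle_type):
    """Return a slot holding the requested type, or None if absent."""
    for slot_xy, ntype in nozzle_management_table.items():
        if ntype == nozzle_type:
            return slot_xy
    return None

# Example: the head moved (30.0, 10.0) mm from the housing-unit origin
slot = slot_from_movement((100.0, 200.0), (30.0, 10.0))
register_nozzle(slot, "single-hole d1.0")
```

With the table populated, a later replacement can locate a nozzle of a given type without repeating image recognition, which is the point of registering positions and types together.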
 The nozzle determination unit 42 determines whether the nozzle type recognized from the identification mark M described later (see FIG. 2(c)) matches the nozzle type recognized by extracting a characteristic portion of the shape of the nozzle 13 from the image acquired by the imaging unit 6. Whether the control unit 9 includes the nozzle determination unit 42 is optional.
 The mounting-nozzle specifying unit 43 specifies the nozzle 13 to be mounted by referring, based on the processing conditions, to the processing table D4 and the nozzle management table D3 in the storage unit 10. The processing conditions are set according to the workpiece to be processed and include, for example, at least one of the operating conditions of the moving unit 4 (for the processing head 3), of the laser oscillator 2, and of the assist gas injection. The operating conditions of the moving unit 4 are, for example, the speed at which the processing head 3 is moved (e.g., the cutting speed) and its acceleration. The operating conditions of the laser oscillator 2 are, for example, the laser output, frequency, and duty ratio. The assist gas injection conditions are, for example, whether assist gas is injected, its pressure, and its flow rate. The processing table D4 is information that associates a plurality of processing conditions corresponding to workpieces W with nozzle types.
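The two-step lookup performed by the mounting-nozzle specifying unit (processing conditions to nozzle type via table D4, then nozzle type to slot position via table D3) can be sketched as follows. All keys and values are illustrative assumptions, not data from the patent.

```python
# Sketch of mounting-nozzle specification: the processing table D4
# maps processing conditions to a nozzle type, and the nozzle
# management table D3 maps slot positions to the stored nozzle types.
# Materials, thicknesses, and type names are invented examples.

processing_table = {  # D4: (material, thickness_mm) -> nozzle type
    ("mild steel", 3.0): "single-hole d1.0",
    ("stainless", 6.0): "double-hole d2.0",
}

nozzle_management_table = {  # D3: slot position -> nozzle type
    (0, 0): "single-hole d1.0",
    (0, 1): "double-hole d2.0",
}

def specify_mounting_nozzle(conditions):
    """Return (nozzle_type, slot) for the given processing conditions,
    or None if no suitable nozzle is stored in the housing unit."""
    nozzle_type = processing_table.get(conditions)
    if nozzle_type is None:
        return None
    for slot, ntype in nozzle_management_table.items():
        if ntype == nozzle_type:
            return nozzle_type, slot
    return None
```

In practice the conditions key would carry whatever parameters the machine uses (cutting speed, laser output, assist gas settings); the tuple key here just stands in for that record.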
 The nozzle mounting instruction unit 44 instructs that a nozzle 13 in the nozzle housing unit 5 be mounted on the processing head 3. The nozzle mounting instruction unit 44 may instruct mounting of the target nozzle 13 after the nozzle types of all the nozzles 13 in the nozzle housing unit 5 have been acquired. Since the nozzle types of all the nozzles 13 are acquired, the nozzle management table D3 is complete, and at the next nozzle replacement the nozzle 13 to be mounted can easily be specified from the data in the table. The nozzle types of all the nozzles 13 may be acquired, for example, when the laser processing machine 1 starts up or after the lid of the nozzle housing unit 5 has been opened.
 Alternatively, the nozzle mounting instruction unit 44 may acquire the nozzle types of the nozzles 13 in the nozzle housing unit 5 one after another and instruct mounting as soon as the target nozzle 13 is found. In this case the nozzle management table D3 is unnecessary, reducing the load on the storage unit 10. However, the nozzle positions and types acquired before the target nozzle 13 is found may still be registered in the nozzle management table D3 and referred to at the next nozzle replacement.
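The "mount as soon as found" strategy above, including the optional recording of nozzles seen along the way, can be sketched as a short scan loop. The `recognize_type` callable stands in for the image-based recognition step and is a hypothetical stub.

```python
# Sketch of the sequential search: recognize each slot's nozzle type
# in turn and stop as soon as the target type is found, keeping the
# partial table entries gathered on the way for later reuse.

def find_and_mount(slots, recognize_type, target_type):
    """Return (slot holding the target type, entries seen so far);
    (None, entries) if the target is not present."""
    seen = {}  # partial nozzle-management-table entries
    for slot in slots:
        ntype = recognize_type(slot)   # image capture + recognition stub
        seen[slot] = ntype
        if ntype == target_type:
            return slot, seen          # instruct mounting here
    return None, seen
```

Returning the partial `seen` dictionary mirrors the option described above of registering already-recognized positions and types for the next replacement.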
 The processing state detection unit 45 detects the processing state of the workpiece W from images, captured by the imaging unit 6, of the workpiece W being processed or already processed by the laser light. With the processing state detection unit 45, the imaging unit 6 can be shared between imaging the processing state and imaging the nozzles 13. Detecting the processing state of the workpiece W includes, as one example, judging whether the processing state is acceptable. The processing state detection unit 45 can also extract numerical information, such as the kerf width of a laser cut, from the image of the workpiece W; in that case, the control unit 9 may perform feedback control of the laser processing based on the numerical information extracted by the processing state detection unit 45. Whether the control unit 9 includes the processing state detection unit 45 is optional.
 An example of the nozzle mounting method according to this embodiment will now be described based on the laser processing machine 1 configured as above. The laser processing machine 1 of this embodiment can recognize the nozzle type of each nozzle 13 placed in the nozzle housing unit 5. This allows the laser processing machine 1, for example, to automatically find the nozzle 13 to be exchanged among the plural kinds of nozzles 13 placed in the nozzle housing unit 5 and to perform at least part of the nozzle replacement process automatically.
 FIG. 2(a) shows a state in which the processing head 3 faces the nozzle housing unit 5, FIG. 2(b) is a plan view of the nozzle housing unit 5, and FIG. 2(c) is a plan view showing an example of the nozzle 13. As shown in FIGS. 2(a) and 2(b), the plurality of nozzles 13 are placed in the plurality of slots 26 of the nozzle housing unit 5, and the processing head 3 is positioned over one of the nozzles 13 in the slots 26. Each of the slots 26 has a defined position in the XY directions. The form of the nozzle housing unit 5 is only an example, and any configuration capable of holding a plurality of nozzles 13 can be used. Further, the nozzle 13 previously mounted on the processing head 3 has already been removed from the body 11 by a rotation mechanism or the like (not shown).
 As shown in FIG. 2(c), each nozzle 13 has an identification mark M. The identification mark M is formed at a position that the imaging unit 6 can image while the nozzle 13 is placed in the nozzle housing unit 5: it is formed around the nozzle hole 23 as seen from the penetration direction of the nozzle hole 23, on the end face 24 around the base end portion 21. The identification mark M has, for example, a main part M1 and a reference mark M2. The main part M1 carries a character code indicating the nozzle type. The reference mark M2 is formed at a defined position relative to the main part M1, for example at a position closer to the nozzle hole 23 than the main part M1. Once the reference mark M2 is detected, the position of the main part M1 can be estimated, because the relative position between the reference mark M2 and the main part M1 is known.
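The estimation step at the end of the paragraph amounts to adding a known design offset to the detected mark position. A minimal sketch, with a hypothetical offset value:

```python
# Sketch: locating the main part M1 from the detected reference mark
# M2 using their known relative position. The offset is an invented
# example; in practice it would be fixed by the mark's design.

M1_OFFSET_FROM_M2 = (2.5, 0.0)  # assumed (dx, dy) offset in mm

def estimate_main_part_position(m2_xy):
    """Estimate the M1 position from a detected M2 position."""
    return (m2_xy[0] + M1_OFFSET_FROM_M2[0],
            m2_xy[1] + M1_OFFSET_FROM_M2[1])
```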
 Next, the processing executed by the laser processing machine 1 of this embodiment will be described. FIG. 3 is a flowchart showing an example of identifying a nozzle 13 by its identification mark M. FIG. 4 is a flowchart showing an example of recognizing the reference mark M2. FIG. 5 is a flowchart showing an example of recognizing the characters and code (main part M1). FIG. 6 is a process diagram showing the identification of the nozzle 13. This processing creates the nozzle management table D3, which associates the storage position of each nozzle 13 in the nozzle housing unit 5 (the position of its slot 26) with the nozzle type of that nozzle 13.
 First, in step S1, the control unit 9 moves the processing head 3 to the reference position of the target slot 26 in the nozzle housing unit 5. In step S1, the control unit 9 controls the moving unit 4 so that the image recognition area FV of the imaging unit 6 is placed at the reference position of the slot 26, as shown in FIG. 6(a). The reference position of the slot 26 is set outside the base end portion 21 of the nozzle 13, at a position such that the reference mark M2 lies somewhere along the circumferential path around the nozzle hole 23. Next, in step S2, the control unit 9 executes the process of recognizing the reference mark M2 of the nozzle 13 held in the target slot 26 (see FIG. 4 below).
 First, as shown in FIG. 4, in step S11 the control unit 9 determines whether the reference mark M2 is present in the image recognition area FV. In step S11, the control unit 9 causes the imaging unit 6 to capture an image of the image recognition area FV, has the image processing unit 8 process the captured image, and determines from the processing result whether the reference mark M2 has been detected in the captured image.
 If the control unit 9 determines that the reference mark M2 is not detected in the image recognition area (step S11; NO), it determines whether to continue the recognition of the reference mark M2 (step S12). The control unit 9 counts how many times the process of step S12 has been executed and decides to continue recognizing the reference mark M2 when the count is at or below a predetermined number. If the control unit 9 decides to continue (step S12; YES), it moves the image recognition area and the nozzle 13 relative to each other (step S13) and repeats the processes of steps S11 and S12. The relative movement in step S13 is performed, for example, by driving the moving unit 4 to step the processing head 3 in the circumferential direction around the nozzle hole 23. By repeating steps S11 to S13, the reference mark M2 normally enters the image recognition area FV, as shown in FIG. 6(b).
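The circumferential search in steps S11 to S13 can be sketched as a bounded scan around the nozzle hole. The `detect_mark` callable stands in for the capture-and-image-processing step of step S11 and is a hypothetical stub; the step count and radius are invented parameters.

```python
import math

# Sketch of the reference-mark search (steps S11-S13): step the image
# recognition area along a circle around the nozzle hole until the
# mark is detected or a retry limit is reached.

def find_reference_mark(detect_mark, radius_mm, max_steps=36):
    """Scan positions on a circle of the given radius; return the
    angle (degrees) at which detect_mark reports success, else None
    (corresponding to recognition failure, step S12; NO)."""
    for step in range(max_steps):
        angle = step * (360.0 / max_steps)
        x = radius_mm * math.cos(math.radians(angle))
        y = radius_mm * math.sin(math.radians(angle))
        if detect_mark(x, y):   # capture an image at (x, y) and detect
            return angle
    return None
```

Bounding the loop with `max_steps` plays the role of the predetermined retry count in step S12, so a slot with no readable mark terminates cleanly instead of scanning forever.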
 If the processing of step S12 has exceeded the predetermined number, the control unit 9 decides not to continue the recognition of the reference mark M2 (step S12; NO). The control unit 9 then treats the recognition of the reference mark M2 of the nozzle 13 held in the target slot 26 as having failed and ends this reference mark recognition process.
 If the control unit 9 determines that the reference mark M2 has been detected in the image recognition area (step S11; YES), it detects the position (P1) of the reference mark M2 in the captured image in step S14 (see FIG. 6(c)). For example, the control unit 9 calculates the position P1 of the center of the reference mark M2 from the edges of the reference mark M2 detected by the image processing unit 8. Next, the control unit 9 calculates the difference Pd between the center P0 of the image recognition area FV and the position P1 of the reference mark M2 (step S15). The center P0 of the image recognition area FV is the position of the optical axis 30a of the imaging optical system 30.
 The control unit 9 calculates, for example, the difference Pd between the pixel position corresponding to the center position P1 of the reference mark M2 on the captured image and the pixel position corresponding to the center P0 of the captured image. This difference is expressed, for example, as a vector whose components are the positional difference in the X direction and the positional difference in the Y direction, each expressed as a number of pixels. The control unit 9 can also convert the positional difference into an actual dimension using a conversion coefficient between the size of one pixel of the captured image and the actual dimension (for example, in millimeters) on the nozzle 13. This conversion coefficient can be determined in advance, for example, by imaging a scale placed at the height at which the identification mark is formed on the nozzle 13.
 The control unit 9 determines whether the absolute value |Pd| of the difference Pd is less than a specified value (step S16). The absolute value |Pd| corresponds to the distance (amount of deviation) between the center P0 of the image recognition area and the reference mark M2. The specified value for |Pd| may be, for example, 0 or a value close to 0. When the control unit 9 determines that |Pd| is equal to or greater than the specified value (step S16; NO), it controls the moving unit 4 to move the image recognition area FV and the nozzle 13 relative to each other according to the difference Pd (step S17). In step S17, the control unit 9 uses the difference Pd to move the processing head 3 in the direction that brings the center P0 of the image recognition area FV closer to the center position P1 of the reference mark M2. The control unit 9 then repeats the processes of steps S14 to S16. When the control unit 9 determines that |Pd| is less than the specified value (step S16; YES), it treats the reference mark M2 as having been successfully recognized and ends the reference mark recognition process.
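The centering iteration of steps S14 to S17, including the pixel-to-millimeter conversion described above, can be sketched as below. The conversion coefficient, tolerance, and callback names are illustrative assumptions; the patent does not fix their values.

```python
# Sketch of steps S14-S17: compute the pixel offset Pd between the area center
# P0 and the mark center P1, convert it to millimeters with a pre-calibrated
# mm-per-pixel coefficient, and move the head until |Pd| is below a limit.
import math

MM_PER_PIXEL = 0.01  # assumed conversion coefficient, found by imaging a scale

def center_on_mark(get_mark_pixel, p0, move_head_mm, limit_px=1.0, max_iter=10):
    for _ in range(max_iter):
        p1 = get_mark_pixel()                    # step S14: mark center (px)
        pd = (p1[0] - p0[0], p1[1] - p0[1])      # step S15: difference vector
        if math.hypot(pd[0], pd[1]) < limit_px:  # step S16: |Pd| < limit?
            return True
        # step S17: move the head so that P0 approaches P1 (pixels -> mm)
        move_head_mm(pd[0] * MM_PER_PIXEL, pd[1] * MM_PER_PIXEL)
    return False
```

With an ideal head that removes exactly the commanded offset, the loop converges in one correction step.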
 After the reference mark recognition process ends, the control unit 9 estimates the position of the main part M1 from the recognized position of the reference mark M2 in step S3 of FIG. 3, and executes the recognition process for the main part M1 (see FIG. 5 below). In step S21 of FIG. 5, the control unit 9 controls the moving unit 4 to move the processing head 3 to the recognition position of the characters (main part M1) (see FIG. 6(d)). In step S21, the control unit 9 controls the moving unit 4 so that at least a part of the main part M1 (for example, the first character) enters the image recognition area FV.
 In the process shown in FIG. 4, when the reference mark M2 has entered the image recognition area FV, the difference Pd calculated in step S15 may be stored as an offset value, and this offset value may be used to move the image recognition area FV to the position of the main part M1 in step S21 of FIG. 5. In this case, after the reference mark M2 enters the image recognition area FV, the alignment of the center P0 of the image recognition area FV with the position P1 of the reference mark M2 (the processes of steps S16 and S17 in FIG. 4) can be omitted.
 The control unit 9 causes the imaging unit 6 to capture an image and causes the image processing unit 8 to process the captured image (step S22). In step S22, the image processing unit 8 performs processing such as character recognition by OCR or figure recognition by binarization (for example, barcode recognition). Next, the control unit 9 determines whether the main part M1 of the identification mark M has been successfully recognized (step S23).
 When the control unit 9 determines that recognition of the main part M1 has failed (step S23; NO), it determines whether or not to continue recognition of the main part M1 (step S24). When the control unit 9 determines to continue recognition of the main part M1 (step S24; YES), it changes the recognition parameters (step S25) and repeats the processes of steps S22 and S23. The recognition parameters include, for example, the brightness provided by the illumination unit 7, and the algorithm used in the process of step S22 or its parameters (for example, the binarization threshold). The control unit 9 counts the number of times the process of step S24 has been executed, and when this count reaches a specified number, it determines not to continue recognition of the main part M1 (step S24; NO). In this case, the control unit 9 treats recognition of the main part M1 as having failed and ends the recognition process for the main part M1.
 When the control unit 9 determines that the main part M1 has been successfully recognized (step S23; YES), it stores the recognized information (recognition information) in the storage unit 10, or updates it if recognition information is already stored in the storage unit 10 (step S26). Here, it is assumed that the main part M1 is a character code including, for example, numbers and symbols of a predetermined data amount (for example, 7 digits), and that one digit can be recognized per imaging in the recognition process for the main part M1. In this case, the control unit 9 performs the identification process the number of times necessary to recognize the main part M1 of the predetermined data amount (for example, 7 times). In step S26, the control unit 9 stores the information obtained by the first identification in the storage unit 10, adds the information obtained by each subsequent identification, and updates the identification information.
 The control unit 9 determines whether or not to continue recognition of the main part M1 (step S27). For example, when acquisition of the identification information of the predetermined data amount is not yet complete, the control unit 9 determines to continue recognition of the main part M1 (step S27; YES). In this case, the control unit 9 moves the processing head 3 by a predetermined amount in step S28 (see FIG. 6(e)) and repeats the processes of steps S22 to S27. When acquisition of the identification information of the predetermined data amount is complete, the control unit 9 determines not to continue recognition of the main part M1 (step S27; NO). In this case, the control unit 9 treats the main part M1 as having been successfully recognized and ends the recognition process for the main part M1.
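The one-character-per-imaging loop of steps S22 to S28, with the parameter-retry behavior of steps S24 and S25, can be sketched as follows. The callback names, the 7-digit code length, and the retry limit are illustrative assumptions.

```python
# Sketch of steps S22-S28: recognize the main part M1 one character per
# imaging; on a failed recognition, vary the recognition parameters up to a
# retry limit (steps S24-S25); on success, append to the stored identification
# information (step S26) and step the head to the next character (step S28).
def read_main_part(recognize, next_position, n_chars=7, max_retries=3):
    text = ""
    for i in range(n_chars):
        for attempt in range(max_retries):
            ch = recognize(i, attempt)   # step S22; attempt selects parameters
            if ch is not None:
                break                    # step S23; YES
        else:
            return None                  # step S24; NO: recognition failed
        text += ch                       # step S26: update stored information
        if i < n_chars - 1:
            next_position()              # step S28: move head a fixed amount
    return text                          # step S27; NO: all digits acquired
```

A recognizer that needs one parameter change on the third character, for example, still yields the full 7-character code.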
 Returning to FIG. 3, the control unit 9 compares the information of the main part M1 recognized in step S3 with registration information stored in advance in the storage unit 10, and determines the nozzle type (step S4). This registration information associates identification mark information with nozzle type information.
 The control unit 9 determines whether the nozzle type has been successfully recognized (step S5). When the control unit 9 determines that the nozzle type has been successfully recognized (step S5; YES), it associates the XY coordinate position (nozzle position) of the slot 26 being identified with the nozzle type and registers them in the nozzle management table D3 (step S6). After the process of step S6, or when it determines in step S5 that recognition of the nozzle type has failed (step S5; NO), the control unit 9 determines whether or not to set the next slot 26 as the recognition target (step S7). When the control unit 9 determines to set the next slot 26 as the recognition target (step S7; YES), it sets the reference position of the slot 26 to be recognized next (step S8) and repeats the processes of steps S1 to S7. When the control unit 9 determines not to set the next slot 26 as the recognition target (step S7; NO), it ends the series of processes.
 Next, other examples of the identification mark M will be described. FIGS. 7(a) to 7(e) show identification marks MA to ME, respectively, as other examples of the identification mark. The identification mark MA in FIG. 7(a) includes the main part M1 but no reference mark. Here, the main part M1 includes a three-digit number (001), a symbol (-), and a three-digit number (180). Thus, the identification mark MA need not include the reference mark M2, in which case the recognition process for the reference mark M2 can be omitted. The types of characters included in the main part M1 (for example, alphabet, kanji, hiragana, numbers) and their number, as well as the types and number of symbols, are not limited to the above example and can be set arbitrarily.
 In the identification mark MB in FIG. 7(b), the main part M1 is represented by figures. This main part M1 includes ten circles arranged in the circumferential direction of the nozzle 13, and the circles are colored in a plurality of colors (for example, white and black). When the circles use two colors, the main part M1 can express a 10-digit binary string, that is, 10 bits of information. The identification mark MC in FIG. 7(c) includes ten fan-shaped figures arranged in the circumferential direction of the nozzle 13. When the main part M1 includes figures, the shape, number, and colors of the figures are not limited to the above examples and can be set arbitrarily. The identification mark MD in FIG. 7(d) includes a plurality of figures (for example, fan-shaped) arranged on the tapered surface inside the base end portion 21. Thus, the position where the main part M1 is formed is not limited to the above examples and can be set arbitrarily. The identification mark ME in FIG. 7(e) takes the form of a so-called one-dimensional barcode. A two-dimensional barcode may be used instead of the one-dimensional barcode. Each of the identification marks MA to ME shown in FIGS. 7(a) to 7(e) may also be combined with the reference mark M2.
 Next, examples of nozzle types will be described. FIGS. 8(a) to 8(d) show examples of nozzles of different nozzle types. The nozzles 13A to 13C in FIGS. 8(a) to 8(c) each have one nozzle hole 23, with nozzle hole diameters φ of 3 mm, 2 mm, and 1.5 mm, respectively. The nozzle types of the nozzles 13A to 13C are represented, for example, as "normal nozzle, φ3 mm", "normal nozzle, φ2 mm", and "normal nozzle, φ1.5 mm". The nozzle 13D in FIG. 8(d) has a double nozzle hole 23, with an inner nozzle hole diameter φ of 2 mm. Although the outer nozzle hole of the nozzle 13D is not visible when viewed from above, the gas introduction portion 23a communicating with the outer nozzle hole can be observed, so the nozzle can be recognized as having a double nozzle hole. The nozzle type of such a nozzle 13D is represented, for example, as "double nozzle, φ2 mm". The nozzles shown in FIG. 8 are examples and do not limit the nozzle types. Nozzle types may also be classified by nozzle outer diameter dimensions, nozzle shape, nozzle surface treatment, and the like. Classification by nozzle outer diameter can be based, for example, on the outer diameter on the tip side or the base end side, or on the nozzle height.
Classification by nozzle shape can be based, for example, on whether the nozzle is flattened in its height direction, steeply inclined, or has a step formed on part of its outer periphery. Classification by nozzle surface treatment can be based, for example, on the presence or absence of plating on the nozzle surface, or on the type of plating. Nozzle types may also be classified by criteria other than those described above.
 As shown in FIG. 8, when the nozzles 13 are classified by nozzle type, each type has distinctive shape features. Therefore, if these shape features can be detected and the nozzle type recognized from them, the identification mark M becomes unnecessary, which improves handleability. FIG. 9 shows a method of recognizing the nozzle type from such shape features of the nozzle 13.
 FIG. 9 is a flowchart showing another example of identifying nozzles. In FIG. 9, processes identical to those in FIG. 3 are given the same reference numerals, and their description is omitted or simplified. As shown in FIG. 9, in step S31, the control unit 9 moves the processing head 3 to the center position of the target slot 26. For example, the control unit 9 controls the moving unit 4 to move the processing head 3 so that the image recognition area FV is positioned over the nozzle hole 23 of the nozzle 13. The image recognition area FV at this time may be set, for example, to a size that can capture the entire base end portion 21 of the nozzle 13, or an image may be acquired by scanning a small image recognition area FV as in FIG. 6.
 In step S32, the control unit 9 causes the imaging unit 6 to capture an image and causes the image processing unit 8 to process the captured image, thereby extracting feature quantities such as the edges of the nozzle hole 23. In step S33, the control unit 9 checks the feature quantities extracted in step S32 against a database that associates feature quantities with nozzle types. For example, the control unit 9 determines from the edges of the nozzle hole 23 detected in step S32 whether the nozzle is a normal nozzle (see FIGS. 8(a) to 8(c)) or a double nozzle (see FIG. 8(d)). The control unit 9 also calculates the nozzle hole diameter from the edges of the nozzle hole 23 detected in step S32. The control unit 9 searches the database using the information distinguishing normal nozzles from double nozzles and the information indicating the nozzle hole diameter, and acquires the corresponding nozzle type. Thereafter, the control unit 9 performs the processes of steps S5 to S8 described with reference to FIG. 3 and, for each nozzle determined in step S5 to have been successfully recognized, associates the slot position with the nozzle type to create the nozzle management table D3.
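The database matching of step S33 can be sketched as a lookup from extracted features to a nozzle type. The feature representation (double-hole flag and hole diameter), the table entries, and the tolerance are assumptions made for illustration; the patent leaves the database format open.

```python
# Sketch of step S33: classify a nozzle from image-derived features
# (double hole yes/no, hole diameter in mm) against a small feature database.
NOZZLE_DB = {
    (False, 3.0): "normal nozzle, \u03c63 mm",
    (False, 2.0): "normal nozzle, \u03c62 mm",
    (False, 1.5): "normal nozzle, \u03c61.5 mm",
    (True, 2.0):  "double nozzle, \u03c62 mm",
}

def classify_nozzle(has_gas_inlet, hole_diameter_mm, tol=0.1):
    """has_gas_inlet: True if the gas introduction portion 23a was detected,
    indicating a double nozzle; tol absorbs measurement error in the image."""
    for (double, dia), kind in NOZZLE_DB.items():
        if double == has_gas_inlet and abs(dia - hole_diameter_mm) <= tol:
            return kind
    return None  # step S5; NO: nozzle type recognition failed
```

A measured diameter within the tolerance of a registered entry returns that entry; anything else fails, matching the step S5 NO branch.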
 In the present embodiment, the control unit 9 may perform only one of the nozzle identification process based on the identification mark M (see FIGS. 4 and 5) and the nozzle identification process based on shape (see FIG. 9), or may perform both. For example, the nozzle determination unit 42 of the control unit 9 (see FIG. 1) may determine whether the result of the identification-mark-based process matches the result of the shape-based process, registering the nozzle position and nozzle type in the nozzle management table D3 when they match and not registering them when they do not.
 Next, a nozzle mounting method according to the present embodiment will be described. FIG. 10 is a flowchart showing the nozzle mounting method. FIG. 11(a) shows an example of the nozzle management table D3, and FIG. 11(b) shows an example of the processing table D4.
 As shown in FIG. 10, when mounting a nozzle 13 on the processing head 3, the control unit 9 creates the nozzle management table D3 using the nozzle identification process shown in FIG. 3 or FIG. 9 (step S41). The nozzle management table D3 in FIG. 11(a) includes a management number field, a nozzle position field, and a nozzle type field. The management number is an arbitrarily assigned ID, but may be, for example, the number of the slot 26 in the nozzle housing unit 5. The nozzle position is the position at which the nozzle 13 is placed in the nozzle housing unit 5, and may be, for example, the XY coordinates of the slot 26. In the nozzle management table D3, the nozzle position and the nozzle type are associated by the management number. For example, the nozzle 13 corresponding to management number 1 is placed at the position with X coordinate X1 and Y coordinate Y1, and is a normal nozzle with a nozzle hole diameter of 3 mm.
 In step S42 of FIG. 10, the control unit 9 acquires the machining conditions. The machining conditions may be input by an operator or, based on the workpiece W carried into the laser processing machine 1, may be sent to the control unit 9 by wire or wirelessly from a host controller (for example, a controller supervising the system). The control unit 9 acquires the machining conditions from the operator's input or the like.
 In step S43, the control unit 9 checks the machining conditions against the processing table D4, determines the nozzle type corresponding to the machining conditions, and identifies the nozzle 13 to be mounted. Step S43 is performed by the mounting nozzle specifying unit 43 of the control unit 9 (see FIG. 1). As shown in FIG. 11(b), the processing table D4 includes a management number field, a machining condition field, and a nozzle type field. The management number is an arbitrarily assigned ID. The machining conditions include items such as cutting speed, laser oscillation output, and assist gas pressure. The processing table D4 may be input by an operator in advance, or may be sent from the host controller to the control unit 9 as appropriate. A nozzle type suited to each set of machining conditions is selected and registered in advance. The mounting nozzle specifying unit 43 identifies the nozzle 13 of the nozzle type matching the given machining conditions as the nozzle to be mounted.
 In step S44, the control unit 9 checks the nozzle type of the mounting target nozzle 13 identified in step S43 against the nozzle management table D3 and acquires the position of the mounting target nozzle 13. Based on the position information (slot position or XY coordinates) of the mounting target nozzle 13, the control unit 9 controls the moving unit 4 to move the processing head 3 to the position of the mounting target nozzle 13 and mount the target nozzle on the processing head 3 (step S45). By performing steps S41 to S45 in this way, automatic mounting of the nozzle 13 becomes possible.
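The two-table lookup of steps S43 and S44 can be sketched as follows. The table contents, field names, and condition values are illustrative assumptions; FIG. 11 only fixes the general structure of D3 and D4.

```python
# Sketch of steps S43-S44: select the nozzle type from the processing table D4
# by machining conditions, then look up that type's position in the nozzle
# management table D3 to obtain the mounting target position.
D4 = [  # processing table: machining conditions -> nozzle type (assumed rows)
    {"speed": 2000, "power": 4000, "gas": 0.8, "type": "normal nozzle, \u03c63 mm"},
    {"speed": 4000, "power": 2000, "gas": 1.2, "type": "normal nozzle, \u03c62 mm"},
]
D3 = [  # nozzle management table: nozzle position -> nozzle type (assumed rows)
    {"pos": (10.0, 20.0), "type": "normal nozzle, \u03c63 mm"},
    {"pos": (10.0, 40.0), "type": "normal nozzle, \u03c62 mm"},
]

def nozzle_position_for(conditions):
    # step S43: find the nozzle type matching the given machining conditions
    kind = next((row["type"] for row in D4
                 if all(row[k] == v for k, v in conditions.items())), None)
    if kind is None:
        return None
    # step S44: find where a nozzle of that type is stored
    return next((row["pos"] for row in D3 if row["type"] == kind), None)
```

The returned position is what step S45 would pass to the moving unit; a condition set with no registered nozzle type yields no position.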
 FIG. 12 is a flowchart showing another example of the nozzle mounting method. In this method, the nozzle mounting process is performed in a state where the nozzle 13 to be mounted is not registered in the nozzle management table D3, or the nozzle management table D3 does not exist. In FIG. 12, processes identical to those in FIGS. 3 and 9 are given the same reference numerals, and their description is omitted or simplified.
 The control unit 9 recognizes the nozzle type of the nozzle 13 in the target slot through the processes of steps S31 to S33 in FIG. 9 and step S5 in FIG. 3. When the nozzle type is successfully recognized in step S5 (step S5; YES), the control unit 9 determines whether the recognized nozzle type is the nozzle type to be mounted (step S51). The nozzle type to be mounted is determined, for example, by the control unit 9 with reference to the processing table D4. When the control unit 9 determines that the recognized nozzle type is not the nozzle type to be mounted (step S51; NO), or when it determines in step S5 that recognition of the nozzle type has failed, it updates the reference position to the next slot 26 (step S53) and repeats the processes from step S31 to step S5.
 When the control unit 9 determines that the recognized nozzle type is the nozzle type to be mounted (step S51; YES), it mounts this nozzle 13 on the processing head 3 (step S54). That is, when the nozzle 13 to be mounted is found, the control unit 9 mounts it on the processing head 3 without proceeding to recognize the next nozzle 13. In this case, because the imaging unit 6 acquires images through the interior of the processing head 3, the nozzle 13 can be mounted on the processing head 3 without moving the processing head 3 significantly, shortening the time required for nozzle mounting. More specifically, when the imaging unit 6 captures images from outside the processing head 3, the processing head 3 and the imaging unit 6 are at different positions, so when the nozzle 13 to be mounted is found, the processing head 3 must be moved from the imaging position where the imaging unit 6 faces the nozzle 13 to the mounting position where the processing head 3 faces the nozzle 13. In contrast, when images are captured through the interior of the processing head 3, the imaging position and the mounting position coincide or are close to each other, so the nozzle 13 can be mounted on the processing head 3 without moving the processing head 3, or with only fine adjustment.
 The control unit 9 may register in the nozzle management table D3 the nozzle types identified before the nozzle 13 to be mounted was found. In that case, in subsequent nozzle replacements, when the nozzle 13 to be mounted is registered in the nozzle management table D3, the replacement can be performed using the table. Also, in subsequent nozzle replacements, when the nozzle 13 to be mounted is not in the nozzle management table D3, nozzle identification is performed starting from nozzle positions not yet in the table, which reduces the number of nozzles 13 to be identified. Furthermore, when the nozzle 13 to be mounted is registered in the nozzle management table D3, nozzle identification may be performed again at that nozzle position. If the result of the repeated identification matches the information in the nozzle management table D3, the nozzle 13 is mounted; if not, mounting of the nozzle 13 may be canceled and an abnormality or the like reported to the operator.
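The search-and-mount flow of FIG. 12, including the optional registration of types identified along the way, can be sketched as below. The function and parameter names are illustrative assumptions.

```python
# Sketch of FIG. 12: identify slots one by one (steps S31-S33, S5), record
# each recognized type in the management table D3, and stop at the first slot
# whose nozzle matches the target type (steps S51, S54).
def find_and_mount(slots, identify, target_type, table):
    """slots: iterable of slot positions; identify(pos) returns the recognized
    nozzle type or None on failure; table: dict acting as table D3."""
    for pos in slots:
        kind = identify(pos)
        if kind is not None:
            table[pos] = kind        # register the identified type in D3
        if kind == target_type:
            return pos               # step S54: mount at this slot
        # step S51; NO or step S5; NO: move on to the next slot (step S53)
    return None
```

Slots examined before the match are left registered in the table, so a later replacement with a registered type can skip the search entirely.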
 After the nozzle 13 is mounted on the processing head 3, the control unit 9 controls the laser oscillator 2 and processes the workpiece W by irradiating the machining position on the workpiece W with laser light from the processing head 3. At that time, the control unit 9 controls the imaging unit 6 to capture the state of machining of the workpiece W by the laser light. The machining state detection unit 45 of the control unit 9 detects the machining state of the workpiece W from the image of the workpiece W, captured by the imaging unit 6 during or after machining. As the detection of the machining state, the machining state detection unit 45 may judge whether the machining state of the workpiece W is good or bad. The machining state detection unit 45 may also extract numerical information, such as the kerf width of the laser cut, from the image of the workpiece W, and the control unit 9 may execute feedback control of the laser machining based on this numerical information. In this way, the imaging unit 6 can be shared between nozzle identification and machining state detection, suppressing an increase in equipment cost.
 FIG. 13 is a diagram showing another example of the illumination unit. In FIG. 13, members identical to those in FIGS. 1 and 2 are given the same reference numerals, and their description is omitted or simplified. As shown in FIG. 13, the illumination unit 7A is provided outside the body 11, below the processing head 3, and is supported by the lower end of the body 11. The illumination unit 7A includes a light source unit 37 in which a plurality of light sources (for example, LEDs) are arranged in a ring. Because the illumination unit 7A is externally attached to the processing head 3 in this way, its replacement and inspection are easy.
 Although embodiments have been described above, the technical scope of the present invention is not limited to them. For example, one or more of the features described in the embodiments may be omitted, and the features described in the embodiments may be combined as appropriate. The embodiments above do not mention the imaging magnification of the imaging unit 6; the image recognition area FV may, for example, be imaged while switching the imaging unit 6 between a low magnification and a high magnification. In that case, the image recognition area FV is first imaged at the low magnification; once the presence of the identification mark M is confirmed, the center of the image recognition area FV is moved to the position of the mark, imaging is switched to the high magnification, and the identification mark M is imaged in an enlarged state.
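The two-stage capture just described (low-magnification search, re-centering, high-magnification confirmation) can be sketched as follows. The camera/stage API (`set_magnification`, `move_center_to`, `capture`) is hypothetical; the patent only describes the sequence of operations, not an interface.

```python
# Illustrative sketch of the low-/high-magnification capture sequence;
# the camera and stage interfaces here are assumptions, not from the patent.

def locate_and_zoom(camera, stage, find_mark):
    """Image at low magnification first; if an identification mark is
    found, center the recognition area on it and re-image at high
    magnification so the mark appears enlarged."""
    camera.set_magnification("low")
    overview = camera.capture()
    pos = find_mark(overview)          # (x, y) in image coordinates, or None
    if pos is None:
        return None                    # no mark in this recognition area
    stage.move_center_to(pos)          # put the mark at the area's center
    camera.set_magnification("high")
    return camera.capture()            # enlarged image of the mark
```

The low-magnification pass covers a wide area cheaply, while the high-magnification pass provides the pixel density needed to read the mark reliably.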
 DESCRIPTION OF SYMBOLS: 1 ... Laser processing machine, 3 ... Machining head, 4 ... Moving unit, 5 ... Nozzle housing unit, 6 ... Imaging unit, 7, 7A ... Illumination unit, 13, 13A to 13D ... Nozzle, 23 ... Nozzle hole, 41 ... Nozzle identification unit, 42 ... Nozzle determination unit, 43 ... Mounting-nozzle specifying unit, 44 ... Nozzle mounting instruction unit, 45 ... Machining state detection unit, W ... Workpiece, D3 ... Nozzle management table, D4 ... Machining table, M, MA to ME ... Identification mark, M1 ... Main portion, M2 ... Reference mark, FV ... Image recognition area

Claims (15)

  1.  A laser processing machine comprising: a hollow machining head that has a replaceable nozzle and guides laser light for machining a workpiece so as to emit the laser light from the nozzle; an imaging unit that acquires, through the interior of the machining head, an image of a nozzle held in a nozzle housing unit in which a plurality of the nozzles are arranged; and a nozzle identification unit that recognizes a nozzle type from the image of the nozzle acquired by the imaging unit.
  2.  The laser processing machine according to claim 1, further comprising an illumination unit that irradiates, with illumination light, the nozzle whose image is to be acquired by the imaging unit.
  3.  The laser processing machine according to claim 2, wherein the illumination unit irradiates the nozzle with illumination light through the interior of the machining head.
  4.  The laser processing machine according to any one of claims 1 to 3, wherein the nozzle has an identification mark formed at a position that can be imaged by the imaging unit, and the nozzle identification unit recognizes the nozzle type by discriminating the identification mark in the image acquired by the imaging unit.
  5.  The laser processing machine according to claim 4, wherein the identification mark is formed around the nozzle hole of the nozzle as viewed in the through direction of the nozzle hole.
  6.  The laser processing machine according to claim 4 or 5, wherein the identification mark includes a main portion indicating the nozzle type and a reference mark formed at a position defined relative to the main portion.
  7.  The laser processing machine according to any one of claims 4 to 6, further comprising a nozzle determination unit that determines whether the nozzle type recognized from the identification mark matches the nozzle type recognized by extracting a characteristic portion of the nozzle shape from the image acquired by the imaging unit.
  8.  The laser processing machine according to any one of claims 1 to 3, wherein the nozzle identification unit recognizes the nozzle type by extracting a characteristic portion of the nozzle shape from the image acquired by the imaging unit.
  9.  The laser processing machine according to any one of claims 1 to 8, further comprising a moving unit that moves the machining head and the nozzle housing unit relative to each other.
  10.  The laser processing machine according to claim 9, wherein the nozzle identification unit recognizes a nozzle position based on the amount of relative movement between the machining head and the nozzle housing unit effected by the moving unit.
  11.  The laser processing machine according to claim 9 or 10, wherein the moving unit moves the machining head and the nozzle housing unit relative to each other in a circumferential direction about the through direction of the nozzle hole of the nozzle, thereby scanning the image recognition area of the imaging unit in the circumferential direction.
  12.  The laser processing machine according to claim 10, further comprising: a nozzle management table that associates, for the plurality of nozzles arranged in the nozzle housing unit, the nozzle position recognized by the nozzle identification unit with the nozzle type; a machining table that associates a plurality of machining conditions corresponding to the workpiece with nozzle types; and a mounting-nozzle specifying unit that specifies the nozzle to be mounted by referring to the machining table and the nozzle management table based on a machining condition.
  13.  The laser processing machine according to any one of claims 1 to 12, further comprising a nozzle mounting instruction unit that instructs mounting of a nozzle from the nozzle housing unit onto the machining head, wherein the nozzle mounting instruction unit instructs mounting of the nozzle to be mounted either after acquiring the nozzle types of all the nozzles in the nozzle housing unit, or upon finding the nozzle to be mounted while sequentially acquiring the nozzle types of the nozzles in the nozzle housing unit.
  14.  The laser processing machine according to any one of claims 1 to 13, wherein the imaging unit is capable of imaging the workpiece being machined or already machined by the laser light, and the laser processing machine further comprises a machining state detection unit that detects a machining state of the workpiece from an image, captured by the imaging unit, of the workpiece being machined or already machined.
  15.  A nozzle mounting method for a laser processing machine including a hollow machining head that has a replaceable nozzle and guides laser light for machining a workpiece so as to emit the laser light from the nozzle, and an imaging unit that acquires, through the interior of the machining head, an image of a nozzle held in a nozzle housing unit in which a plurality of the nozzles are arranged, the method comprising: recognizing a nozzle type from the image of the nozzle acquired by the imaging unit; and mounting, on the machining head, a nozzle to be mounted among the recognized nozzles.
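The two-table lookup of claims 12 and 13 (machining condition → nozzle type via the machining table, nozzle type → holder position via the nozzle management table) can be sketched as below. The table layouts and function names are assumptions for illustration only; the claims do not prescribe a data structure.

```python
# Minimal sketch of the mounting-nozzle selection in claims 12 and 13.
# Table layouts (plain dicts) and names are illustrative assumptions.

def specify_mounting_nozzle(machining_condition, machining_table, nozzle_table):
    """Resolve a machining condition to a nozzle type via the machining
    table, then to a holder position via the nozzle management table."""
    nozzle_type = machining_table[machining_condition]   # condition -> type
    for position, held_type in nozzle_table.items():     # position -> type
        if held_type == nozzle_type:
            return position, nozzle_type
    raise LookupError(f"no nozzle of type {nozzle_type!r} in the holder")
```

With the position known, the mounting instruction of claim 13 reduces to moving the machining head to that holder position, matching either the "scan everything first" or the "stop at the first match" strategy the claim describes.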
PCT/JP2015/082923 2014-12-08 2015-11-24 Laser processing machine and nozzle mounting method WO2016093053A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016563599A JP6269859B2 (en) 2014-12-08 2015-11-24 Laser processing machine and nozzle mounting method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-247766 2014-12-08
JP2014247766 2014-12-08

Publications (1)

Publication Number Publication Date
WO2016093053A1 true WO2016093053A1 (en) 2016-06-16

Family

ID=56107243

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/082923 WO2016093053A1 (en) 2014-12-08 2015-11-24 Laser processing machine and nozzle mounting method

Country Status (2)

Country Link
JP (1) JP6269859B2 (en)
WO (1) WO2016093053A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004322127A (en) * 2003-04-23 2004-11-18 Amada Co Ltd Method for controlling lens and nozzle in laser beam machine
JP2005334922A (en) * 2004-05-26 2005-12-08 Yamazaki Mazak Corp Nozzle checking device in laser beam machine
JP2008309590A (en) * 2007-06-13 2008-12-25 Mitsubishi Electric Corp Nozzle inspection apparatus and nozzle inspection method
JP2010207819A (en) * 2007-07-05 2010-09-24 Mitsubishi Electric Corp Laser processing apparatus and nozzle determination method employed in the same
EP2540432A1 (en) * 2011-06-27 2013-01-02 Trumpf Maschinen AG Laser processing machine with exchangeable components and operating method for the same


Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018139312A1 (en) * 2017-01-25 2018-08-02 株式会社アマダホールディングス Laser processing machine
JP2018118279A (en) * 2017-01-25 2018-08-02 株式会社アマダホールディングス Laser processing machine
US11052489B2 (en) 2017-10-06 2021-07-06 Amada Holdings Co., Ltd. Method for laser processing
WO2019070055A1 (en) * 2017-10-06 2019-04-11 株式会社アマダホールディングス Laser processing method and device
JP2019069470A (en) * 2017-10-06 2019-05-09 株式会社アマダホールディングス Laser processing method and laser processing device
CN111201108B (en) * 2017-10-06 2021-08-17 株式会社天田控股集团 Laser processing method and apparatus
JP2019217552A (en) * 2017-10-06 2019-12-26 株式会社アマダホールディングス Laser processing method and laser processing device
CN111201108A (en) * 2017-10-06 2020-05-26 株式会社天田控股集团 Laser processing method and apparatus
EP3693125A4 (en) * 2017-10-06 2020-12-02 Amada Holdings Co., Ltd. Laser processing method and device
CN110340503A (en) * 2018-04-06 2019-10-18 依赛彼集团公司 For welding and the automatic identification of the component of cutting torch
JP2019214072A (en) * 2018-04-06 2019-12-19 ザ・エサブ・グループ・インク Automatic recognition of components for welding and cutting torches
JP2021178365A (en) * 2018-04-06 2021-11-18 ザ・エサブ・グループ・インク Automatic identification of components for welding and cutting torches
US11267069B2 (en) 2018-04-06 2022-03-08 The Esab Group Inc. Recognition of components for welding and cutting torches
US11883896B2 (en) 2018-04-06 2024-01-30 The Esab Group, Inc. Recognition of components for welding and cutting torches
JP7435195B2 (en) 2020-04-15 2024-02-21 ウシオ電機株式会社 Extreme ultraviolet light source device and plasma position adjustment method
WO2023137507A1 (en) 2022-01-19 2023-07-27 Trotec Laser Gmbh Method for detecting a lens and/or nozzle on a focussing unit of a laser plotter for cutting, engraving, marking and/or inscribing a workpiece, and lens holder, nozzle holder, and laser plotter for engraving, marking and/or inscribing a workpiece therefor
AT525822A1 (en) * 2022-01-19 2023-08-15 Trotec Laser Gmbh Method for detecting a lens and/or nozzle on a focusing unit of a laser plotter for cutting, engraving, marking and/or inscribing a workpiece, and laser plotter for engraving, marking and/or inscribing a workpiece therefor

Also Published As

Publication number Publication date
JPWO2016093053A1 (en) 2017-04-27
JP6269859B2 (en) 2018-01-31

Similar Documents

Publication Publication Date Title
JP6269859B2 (en) Laser processing machine and nozzle mounting method
CN107803585B (en) Laser processing machine and laser processing method
JP7176966B2 (en) Image inspection device
WO2018066576A1 (en) Appearance inspecting method
JP2019214072A (en) Automatic recognition of components for welding and cutting torches
CN114311346A (en) Wafer and workbench alignment identification method
US11776110B2 (en) Image measuring apparatus
JP2012064171A (en) Code print quality evaluation device
JP2008241255A (en) Method of detecting alignment mark position and laser machining device using method thereof
JP2009258069A (en) Inspection apparatus and inspection method of foreign matters within hole of spinneret
KR20200010643A (en) Identification character recognition apparatus and method
US10846881B2 (en) Image measuring apparatus
JP6714729B2 (en) Surface mounter, component recognition device, component recognition method
JP2010120079A (en) Microfabrication device and microfabrication method
JP6291582B2 (en) Optical symbol reading apparatus and reading method
JP2017037055A (en) Defect measurement device
JP2017037006A (en) Defect measurement device
JP7281942B2 (en) Inspection device and inspection method
US11680911B2 (en) Marking inspection device, marking inspection method and article inspection apparatus
JP6723391B2 (en) Component mounter and production job creation method
KR20160003840U (en) Coordinate measuring machine having illumination wavelength conversion function
JP2022016156A (en) Processing device
JP4832586B2 (en) Substrate recognition processing method
KR20060063768A (en) Non-oriented optical character recognition of a wafer mark
JP2005166995A (en) Method and apparatus of recognition and component mounting apparatus using the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15867436

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016563599

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15867436

Country of ref document: EP

Kind code of ref document: A1