WO2022138617A1 - Laser machining device and control program of same, and detection method - Google Patents


Info

Publication number
WO2022138617A1
Authority
WO
WIPO (PCT)
Prior art keywords
inspection
inspection pattern
image
pattern
guide light
Application number
PCT/JP2021/047230
Other languages
French (fr)
Japanese (ja)
Inventor
洋祐 熊谷
政敏 神山
Original Assignee
ブラザー工業株式会社 (Brother Industries, Ltd.)
Application filed by ブラザー工業株式会社 (Brother Industries, Ltd.)
Publication of WO2022138617A1

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B23 — MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K — SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K26/00 — Working by laser beam, e.g. welding, cutting or boring

Definitions

  • The present disclosure relates to a laser processing device and a detection method for detecting stains on an optical system.
  • Conventionally, various techniques have been proposed for laser processing devices of this kind. For example, the technique described in Patent Document 1 relates to a laser processing apparatus that includes a scanning lens for condensing a laser beam deflected by a deflection mirror onto a workpiece and that performs processing by irradiating a plurality of positions within a deflection region with the laser beam by means of the deflection mirror and the scanning lens. The apparatus is characterized by comprising energy measuring means for measuring the energy of the laser beam deflected and irradiated by the deflection mirror at a plurality of different measurement positions within the deflection region, and means for detecting dirt on the scanning lens based on the measurements by the energy measuring means.
  • In the above laser processing apparatus, the energy measuring means must be moved to the irradiation position in order to measure the energy of the laser beam. To measure the energy at a plurality of different points, the energy measuring means must be moved each time, which becomes a very time-consuming operation depending on the number of measurement points.
  • The present disclosure has been made in view of the above points, and its object is to provide a laser processing apparatus and a detection method capable of quickly detecting stains on at least one of an optical system and an imaging unit by photographing, with the imaging unit, visible light that is scanned and condensed on a reference surface by the optical system.
  • The present specification discloses a laser processing apparatus comprising: a guide light emitting unit that emits guide light, which is visible light; an optical system arranged between the guide light emitting unit and a reference surface in order to condense the guide light while scanning it over the reference surface; an imaging unit that photographs the reference surface; and a control unit. The control unit executes a drawing process of drawing an inspection pattern on the reference surface with the guide light, an acquisition process of acquiring an inspection image in which the inspection pattern is projected by photographing the reference surface, an analysis process of performing image analysis on the inspection pattern based on the inspection image, and a diagnostic process of automatically diagnosing the presence or absence of stains on at least one of the optical system and the imaging unit based on the inspection result of the image analysis.
  • The present specification also discloses a stain detection method for detecting the presence or absence of stains on a laser processing apparatus that comprises a guide light emitting unit that emits guide light, which is visible light, an optical system arranged between the guide light emitting unit and a reference surface in order to condense the guide light while scanning it over the reference surface, and an imaging unit that photographs the reference surface. The method comprises a drawing step of drawing an inspection pattern on the reference surface with the guide light, an acquisition step of acquiring an inspection image in which the inspection pattern is projected by photographing the reference surface, an analysis step of performing image analysis on the inspection pattern based on the inspection image, and a diagnostic step of automatically diagnosing the presence or absence of stains on at least one of the optical system and the imaging unit based on the inspection result of the image analysis.
  • According to the present disclosure, the laser processing apparatus, its control program, and the detection method can quickly detect stains on at least one of the optical system and the imaging unit by photographing, with the imaging unit, the visible light scanned and condensed on the reference surface by the optical system.
  • FIG. 1 is a diagram showing the schematic configuration of the laser marker of the present embodiment. FIG. 2 is a block diagram showing the electrical configuration of the laser marker. FIG. 3 is a diagram showing the inspection pattern of the guide light in the inspection image acquired by the camera when the fθ lens and the camera of the laser marker are not dirty. FIG. 4A shows the inspection pattern when the fθ lens is contaminated with oil, and FIG. 4B shows the inspection pattern when the fθ lens is contaminated with dust. FIG. 5A shows the inspection pattern when the camera is contaminated with oil, and FIG. 5B shows the inspection pattern when the camera is contaminated with dust. FIG. 6 shows the inspection pattern when both the fθ lens and the camera are contaminated with oil. FIG. 7A shows an image in which the oil-stained portion of the fθ lens is emphasized, and FIG. 7B shows an image in which the oil-stained portion of the camera is emphasized. FIG. 8 shows the inspection pattern and the re-inspection pattern of the guide light in the inspection image and the re-inspection image acquired when the fθ lens is contaminated with oil. FIG. 9A shows the irradiation area of the pointer light when the pointer light emitter is not dirty, FIG. 9B shows the irradiation area when the pointer light emitter is contaminated with oil, and FIG. 9C shows the irradiation area when the pointer light emitter is contaminated with dust.
  • In FIGS. 1 and 2, which are used in the following description, part of the basic configuration is omitted, and the dimensional ratios of the drawn parts are not necessarily accurate. In the following description, the vertical direction is as shown in FIG. 1.
  • the laser marker 1 of this embodiment is composed of a print information creating unit 2 and a laser processing unit 3.
  • the print information creating unit 2 is composed of a personal computer or the like.
  • the laser processing unit 3 performs marking (printing) processing by two-dimensionally scanning the processing laser beam R on the processing surface 8 of the processing object 7.
  • the laser processing unit 3 includes a laser controller 6.
  • the laser controller 6 is composed of a computer and is connected to the print information creating unit 2 so as to be capable of bidirectional communication.
  • the laser controller 6 drives and controls the laser processing unit 3 based on the print information, control parameters, various instruction information, and the like transmitted from the print information creation unit 2.
  • The laser processing unit 3 includes a laser oscillation unit 12, a guide light unit 15, a dichroic mirror 101, a focal system 70, a pointer light emitter 105, a camera 103, a galvano scanner 18, an fθ lens 19, and the like, and is covered with a substantially rectangular housing cover (not shown).
  • the laser oscillation unit 12 is composed of a laser oscillator 21 and the like.
  • the laser oscillator 21 is composed of a CO2 laser, a YAG laser, or the like, and emits a processed laser beam R.
  • the light diameter of the processed laser light R is adjusted (for example, enlarged) by a beam expander (not shown).
  • the guide light unit 15 is composed of a visible semiconductor laser 28 or the like.
  • the visible semiconductor laser 28 emits a guide light Q, which is visible interference light, for example, a red laser light.
  • The guide light Q is made into parallel light by a lens group (not shown) and is then two-dimensionally scanned so that, for example, an image surrounding the print pattern to be marked (printed) by the processing laser light R, a rectangular image, an image of a predetermined shape, or the like is drawn on the machined surface 8 of the machining object 7 as a locus (time afterimage). That is, the guide light Q has no marking (printing) capability. In the present embodiment, the predetermined shape is a grid pattern, which will be described in detail later.
  • The wavelength of the guide light Q differs from that of the processing laser light R. In the present embodiment, the wavelength of the processing laser light R is 1064 nm and the wavelength of the guide light Q is 650 nm.
  • The dichroic mirror 101 transmits almost all of the incident processing laser light R. The guide light Q is incident on the dichroic mirror 101 at an incident angle of 45 degrees at a substantially central position through which the processing laser light R passes, and is reflected onto the optical path of the processing laser light R at a reflection angle of 45 degrees. The reflectance of the dichroic mirror 101 is wavelength-dependent: it is surface-treated with a multilayer structure of dielectric and metal layers, has a high reflectance at the wavelength of the guide light Q, and transmits almost all (99%) of light at other wavelengths.
  • the alternate long and short dash line in FIG. 1 indicates the optical axis 10 of the processed laser light R and the guide light Q. Further, the direction of the optical axis 10 indicates the path direction of the processed laser light R and the guide light Q.
  • the focal system 70 includes a first lens 72, a second lens 74, and a moving mechanism 76.
  • the processed laser light R and the guide light Q that have passed through the dichroic mirror 101 enter the first lens 72 and pass therethrough. At that time, the light diameters of the processed laser light R and the guide light Q are reduced by the first lens 72. Further, the processed laser light R and the guide light Q that have passed through the first lens 72 enter the second lens 74 and pass through. At that time, the processed laser light R and the guide light Q are made into parallel light by the second lens 74.
  • The moving mechanism 76 includes a focal system motor 80 and a rack and pinion (not shown) that converts the rotational motion of the focal system motor 80 into linear motion, and moves the second lens 74 in the path direction of the processing laser light R and the guide light Q by controlling the rotation of the focal system motor 80. The moving mechanism 76 may instead be configured to move the first lens 72, or to move both the first lens 72 and the second lens 74, so that the distance between the first lens 72 and the second lens 74 changes.
  • the galvano scanner 18 two-dimensionally scans the processed laser light R and the guide light Q that have passed through the focal system 70.
  • In the galvano scanner 18, the galvano X-axis motor 31 and the galvano Y-axis motor 32 are attached so that their motor shafts are orthogonal to each other, and the scanning mirrors 18X and 18Y attached to the tips of the motor shafts face each other inside.
  • the processing laser light R and the guide light Q are two-dimensionally scanned by rotating the scanning mirrors 18X and 18Y under the rotation control of the motors 31 and 32.
  • the two-dimensional scanning directions are the X direction and the Y direction.
  • the f ⁇ lens 19 collects the processed laser light R and the guide light Q two-dimensionally scanned by the galvano scanner 18 on the processed surface 8 of the processed object 7. Therefore, the processing laser light R and the guide light Q are two-dimensionally scanned in the X direction and the Y direction on the processing surface 8 of the processing object 7 by the rotation control of the motors 31 and 32, respectively.
  • The wavelengths of the processing laser light R and the guide light Q are different. Therefore, when the distance between the first lens 72 and the second lens 74 in the focal system 70 is constant, the positions where the processing laser light R and the guide light Q are focused (hereinafter referred to as the "focal position F") differ in the vertical direction. The focal position F of the processing laser light R and the guide light Q can therefore be adjusted to the machined surface 8 of the machining object 7 by adjusting the distance between the first lens 72 and the second lens 74 in the focal system 70.
  • the distance between the reference position related to the position of the f ⁇ lens 19 and the machined surface 8 of the machined object 7 is referred to as “working distance”.
  • the lower surface of the f ⁇ lens 19 is set as a reference position related to the position of the f ⁇ lens 19. That is, the working distance L of the present embodiment is the distance between the lower surface of the f ⁇ lens 19 and the machined surface 8 of the machined object 7.
  • the reference position related to the position of the f ⁇ lens 19 includes, for example, the upper surface of the f ⁇ lens 19 or the center of the f ⁇ lens 19 in the vertical direction.
  • the focal position F of the processed laser light R and the guide light Q is aligned with the machined surface 8 of the machined object 7.
  • the pointer light emitter 105 emits the pointer light P, which is visible light, toward the processing surface 8 of the processing object 7, and is provided in the vicinity of the f ⁇ lens 19. As a result, the pointer light P is irradiated on the machined surface 8 of the machined object 7 in a relatively small circular shape.
  • the pointer light P may be a laser light having no marking (printing) processing ability as long as it is visible light.
  • the camera 103 is provided near the f ⁇ lens 19 in a state of being directed to the machined surface 8 of the object to be machined 7.
  • The camera 103 captures an image of the machined surface 8 of the machining object 7 that includes, for example, the circular spot of the pointer light P irradiated on the surface, or the locus of the grid-shaped guide light Q drawn on the surface by repeated two-dimensional scanning with the galvano scanner 18. The working distance L is determined based on such an image taken by the camera 103 (that is, on the projection of the guide light Q and the pointer light P on the machined surface 8 of the machining object 7); since that technique is known, its description is omitted.
  • the laser processing unit 3 includes a laser controller 6, a galvano controller 35, a galvano driver 36, a laser driver 37, a semiconductor laser driver 38, a focal system driver 78, a pointer light emitter 105, and a camera 103. And so on.
  • the laser controller 6 controls the entire laser processing unit 3.
  • a galvano controller 35, a laser driver 37, a semiconductor laser driver 38, a focal system driver 78, and the like are electrically connected to the laser controller 6.
  • an external print information creating unit 2 is connected to the laser controller 6 and the camera 103 so as to be capable of bidirectional communication.
  • the pointer light emitter 105 is electrically connected to the laser controller 6.
  • the laser controller 6 is configured to be able to receive each information transmitted from the print information creation unit 2 (for example, print information, control parameters for the laser processing unit 3, various instruction information from the user, etc.).
  • the pointer light emitter 105 is configured to be able to receive each information (for example, lighting / extinguishing instruction information) transmitted from the print information creating unit 2.
  • the camera 103 is configured to be able to receive each information (for example, image pickup instruction information) transmitted from the print information creation unit 2, and is configured to be able to transmit the captured image to the print information creation unit 2.
  • the laser controller 6 includes a CPU 41, a RAM 42, a ROM 43, and the like.
  • the CPU 41 is an arithmetic unit and a control device that controls the entire laser processing unit 3.
  • the CPU 41, RAM 42, and ROM 43 are connected to each other by a bus line (not shown), and data is exchanged with each other.
  • the RAM 42 is for temporarily storing various calculation results calculated by the CPU 41, (XY coordinates) data of the print pattern, and the like.
  • The ROM 43 stores various programs, for example a program that calculates the XY coordinate data of the print pattern based on the print information transmitted from the print information creating unit 2 and stores it in the RAM 42, and a program that calculates the XY coordinate data of the grid-like locus drawn by the guide light Q and stores it in the RAM 42. In addition to the various programs, the ROM 43 also stores, for example, various delay values, the thickness, depth, and number of print patterns corresponding to the print information input from the print information creating unit 2, parameters relating to the laser oscillator 21, distortion-correction parameters for the galvano scanner 18, offset correction data of the pointer light P, and status information of the laser marker 1 (error information, number of times of processing, processing time, etc.).
  • the CPU 41 performs various operations and controls based on various programs stored in the ROM 43.
  • The CPU 41 outputs to the galvano controller 35 the XY coordinate data of the print pattern calculated based on the print information input from the print information creating unit 2, the XY coordinate data of the grid-like locus drawn by the guide light Q, and galvano scanning speed information indicating, for example, the speed at which the galvano scanner 18 scans the guide light Q and the processing laser light R. The CPU 41 also outputs to the laser driver 37 laser drive information indicating, for example, the laser output of the laser oscillator 21 set based on the print information input from the print information creating unit 2 and the laser pulse width of the processing laser light R.
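As an illustration of the kind of data involved, here is a minimal sketch, not the marker's actual firmware, of how XY coordinate data for a grid-like guide-light locus, such as the data the CPU 41 hands to the galvano controller 35, could be generated; the field size, pitch, step size, and point-list format are all assumptions.

```python
from typing import List, Tuple

def grid_locus(field_mm: float = 100.0, pitch_mm: float = 10.0,
               step_mm: float = 0.5) -> List[Tuple[float, float]]:
    """Return an ordered list of (x, y) points tracing a square grid of the given pitch."""
    pts: List[Tuple[float, float]] = []
    n_lines = int(field_mm / pitch_mm) + 1
    n_steps = int(field_mm / step_mm) + 1
    half = field_mm / 2.0
    # Horizontal grid lines, traversed alternately left/right so mirror motion stays smooth.
    for i in range(n_lines):
        y = -half + i * pitch_mm
        xs = [-half + j * step_mm for j in range(n_steps)]
        if i % 2:
            xs.reverse()
        pts.extend((x, y) for x in xs)
    # Vertical grid lines.
    for i in range(n_lines):
        x = -half + i * pitch_mm
        ys = [-half + j * step_mm for j in range(n_steps)]
        if i % 2:
            ys.reverse()
        pts.extend((x, y) for y in ys)
    return pts
```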
  • the CPU 41 outputs an on signal instructing the start of lighting of the visible semiconductor laser 28 or an off signal instructing the extinguishing of the visible semiconductor laser 28 to the semiconductor laser driver 38.
  • The galvano controller 35 calculates the drive angles, rotation speeds, and the like of the galvano X-axis motor 31 and the galvano Y-axis motor 32 based on the information input from the laser controller 6 (for example, the XY coordinate data of the print pattern, the XY coordinate data of the grid-like locus drawn by the guide light Q, and the galvano scanning speed information), and outputs motor drive information indicating the drive angles and rotation speeds to the galvano driver 36.
  • the galvano driver 36 drives and controls the galvano X-axis motor 31 and the galvano Y-axis motor 32 based on the motor drive information input from the galvano controller 35, and scans the processed laser light R and the guide light Q in two dimensions.
  • the laser driver 37 drives the laser oscillator 21 based on the laser output of the laser oscillator 21 input from the laser controller 6 and the laser drive information indicating the laser pulse width of the processed laser light R and the like.
  • the semiconductor laser driver 38 turns on or turns off the visible semiconductor laser 28 based on the on signal or off signal input from the laser controller 6.
  • the focus system driver 78 drives and controls the focus system motor 80 based on the information input from the laser controller 6 to move the second lens 74.
  • the print information creation unit 2 includes a control unit 51, an input operation unit 55, a liquid crystal display (LCD) 56, a CD-ROM drive 58, and the like.
  • An input operation unit 55, a liquid crystal display 56, a CD-ROM drive 58, and the like are connected to the control unit 51 via an input / output interface (not shown).
  • the input operation unit 55 is composed of a mouse, a keyboard, etc. (not shown), and is used, for example, when the user inputs various instruction information.
  • the CD-ROM drive 58 reads various data, various application software, and the like from the CD-ROM 57.
  • The control unit 51 controls the entire print information creating unit 2 and can execute the image processing described later by known techniques; it includes a CPU 61, a RAM 62, a ROM 63, a hard disk drive (hereinafter, "HDD") 66, and the like.
  • the CPU 61 is an arithmetic unit and a control device that controls the entire print information creation unit 2.
  • the CPU 61, RAM 62, and ROM 63 are connected to each other by a bus line (not shown), and data is exchanged with each other. Further, the CPU 61 and the HDD 66 are connected to each other via an input / output interface (not shown), and data is exchanged with each other.
  • the RAM 62 is for temporarily storing various calculation results and the like calculated by the CPU 61.
  • the ROM 63 stores various programs and the like. Further, the ROM 63 stores data such as a start point, an end point, a focal point, and a curvature of the font of each character composed of a straight line and an elliptical arc for each type of font.
  • the HDD 66 stores various application software programs, various data files, and the like.
  • the laser marker 1 of the present embodiment detects dirt on the f ⁇ lens 19 and the camera 103. At that time, a grid-like locus is drawn on the machined surface 8 of the machined object 7 by the guide light Q that is two-dimensionally scanned. Further, by photographing the processed surface 8 of the processed object 7 with the camera 103 exposed for a relatively long time, for example, the inspection image 110 shown in FIG. 3 is acquired. In the inspection image 110 of FIG. 3, the inspection pattern 112 is projected on the machined surface 8 of the machined object 7. The inspection pattern 112 is a grid-like pattern, and is a trajectory drawn on the machined surface 8 of the object to be machined 7 by the guide light Q when the f ⁇ lens 19 and the camera 103 are not dirty.
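In the laser marker the trajectory image is obtained simply by exposing the camera 103 for a relatively long time. As a purely software-side illustration, a comparable effect can be approximated by keeping the brightest value seen at each pixel over many ordinary frames; the camera index and frame count below are assumptions.

```python
import cv2
import numpy as np

def accumulate_trajectory(camera_index: int = 0, n_frames: int = 120) -> np.ndarray:
    """Approximate a long exposure by accumulating the per-pixel maximum over frames."""
    cap = cv2.VideoCapture(camera_index)
    acc = None
    for _ in range(n_frames):
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # The moving guide-light spot leaves a persistent trace, like a time afterimage.
        acc = gray if acc is None else np.maximum(acc, gray)
    cap.release()
    return acc
```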
  • When the fθ lens 19 is contaminated with oil, an inspection image 110 as shown in FIG. 4A is acquired. In this inspection image 110, the portion 114 of the inspection pattern 112 corresponding to the oil stain on the fθ lens 19 appears blurred. Oil stains on the fθ lens 19 cause refraction and scattering of light, and because the fθ lens 19 is flat and has a relatively large aperture, this influence presumably appears as blurring.
  • When the fθ lens 19 is contaminated with dust, an inspection image 110 as shown in FIG. 4B is acquired. In this inspection image 110, the portion 116 of the inspection pattern 112 corresponding to the dust stain on the fθ lens 19 appears chipped. This is presumably because the guide light Q is blocked by the dust.
  • When the camera 103 is contaminated with oil, an inspection image 110 as shown in FIG. 5A is acquired. In this inspection image 110, the portion 118 of the inspection pattern 112 corresponding to the oil stain on the camera 103 appears distorted. The cause of such distortion is refraction and scattering of light due to the oil stain on the camera 103; because the cover glass of the camera 103 is spherical and relatively small in diameter, this influence presumably appears as distortion.
  • When the camera 103 is contaminated with dust, an inspection image 110 as shown in FIG. 5B is acquired. In this inspection image 110, the portion 120 corresponding to the dust stain on the camera 103 is projected as a pattern different from the inspection pattern 112. This is presumably because the dust adhering to the cover glass of the camera 103 is photographed out of focus.
  • When both the fθ lens 19 and the camera 103 are contaminated with oil, an inspection image 110 as shown in FIG. 6 is acquired. In this inspection image 110, the portion 114 corresponding to the oil stain on the fθ lens 19 appears blurred and the portion 118 corresponding to the oil stain on the camera 103 appears distorted. Where the portions 114 and 118 overlap, the pattern may appear both blurred and distorted.
  • The laser marker 1 of the present embodiment detects stains on the fθ lens 19 and the camera 103 by comparing and collating the inspection pattern 112 in the inspection image 110 acquired by the camera 103 with the inspection pattern 112 obtained when the fθ lens 19 and the camera 103 are not dirty. The comparison and collation may be performed on the condition that the machined surface 8 of the machining object 7 is set at a predetermined position, or the inspection pattern 112 may be enlarged or reduced based on the working distance L at the time the inspection image 110 is acquired. The inspection pattern 112 may also be drawn on a surface other than the machined surface 8 of the machining object 7. Hereinafter, the inspection pattern 112 obtained when the fθ lens 19 and the camera 103 are not dirty is referred to as the "comparison pattern that serves as a reference for the inspection pattern 112".
  • the data of the comparison pattern which is the reference of the inspection pattern 112, is stored in advance in the ROM 63, HDD 66, CD-ROM 57, or the like of the print information creating unit 2 for the above-mentioned comparison and collation.
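The patent does not specify how the comparison and collation are implemented; the following is a minimal, hypothetical sketch of one way the inspection image could be compared pixel-wise against the stored comparison pattern. The file names, threshold, and minimum region area are assumptions.

```python
import cv2
import numpy as np

def find_differences(inspection_path: str, reference_path: str,
                     thresh: int = 40, min_area: float = 25.0):
    """Return bounding boxes of regions where the inspection pattern deviates from the reference."""
    insp = cv2.imread(inspection_path, cv2.IMREAD_GRAYSCALE)
    ref = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)
    # A mild blur suppresses sensor noise before differencing.
    insp = cv2.GaussianBlur(insp, (5, 5), 0)
    ref = cv2.GaussianBlur(ref, (5, 5), 0)
    diff = cv2.absdiff(insp, ref)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
```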
  • the laser marker 1 of the present embodiment displays an image on the liquid crystal display 56 that emphasizes the portion having the difference detected by the above comparison and collation.
  • When the inspection image 110 of FIG. 4A is acquired, for example, an image 125 is generated in which the portion 114 corresponding to the oil stain on the fθ lens 19, that is, the portion where the inspection pattern 112 appears blurred, is surrounded by a circle 126, as shown in FIG. 7A, and the generated image 125 is displayed on the liquid crystal display 56. When the inspection image 110 of FIG. 5A is acquired, for example, an image 125 is generated in which the frame 128 of the comparison pattern serving as the reference for the inspection pattern 112 is arranged so that its four corners overlap the four corners of the inspection pattern 112, as shown in FIG. 7B, and the generated image 125 is displayed on the liquid crystal display 56.
  • the liquid crystal display 56 displays an image 125 that can be visually recognized by distinguishing between the portion without the difference and the portion with the difference detected in the above comparison and collation.
  • Alternatively, an image 125 in which the portions with and without a difference detected by the above comparison and collation can be visually distinguished may be generated by changing the brightness or color of the display, or by superimposing the inspection pattern 112 and its reference comparison pattern, and then displayed on the liquid crystal display 56.
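For illustration only, one hypothetical way to produce such a highlighted image 125 is to draw a circle around each detected difference region (cf. the circle 126 in FIG. 7A); the colour and margin are arbitrary choices, and find_differences() refers to the hypothetical helper sketched earlier.

```python
import cv2

def highlight(inspection_bgr, boxes):
    """Draw a circle around each difference region, as in the highlighted image 125."""
    out = inspection_bgr.copy()
    for (x, y, w, h) in boxes:
        center = (x + w // 2, y + h // 2)
        radius = max(w, h) // 2 + 10  # a small margin around the stained portion
        cv2.circle(out, center, radius, (0, 0, 255), 2)
    return out

# Hypothetical usage with the sketch above:
# boxes = find_differences("inspection.png", "reference.png")
# image_125 = highlight(cv2.imread("inspection.png"), boxes)
```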
  • Further, the laser marker 1 of the present embodiment can first specify the range of a stain on the fθ lens 19 or the camera 103 with coarse accuracy and then detect the stain within the specified range.
  • Such detection will be described by taking the case of FIG. 4A (that is, the case where the f ⁇ lens 19 is contaminated with oil) as an example.
  • an inspection pattern 130 having a pitch a larger than that of the inspection pattern 112 is drawn on the machining surface 8 of the machining object 7 by the guide light Q that is two-dimensionally scanned.
  • the inspection image 132 is acquired by photographing the machined surface 8 of the machined object 7 with the camera 103 exposed for a relatively long time.
  • the portion where the inspection pattern 130 is blurred and projected is specified as the portion 114 corresponding to oil stain by the above comparison and collation.
  • Next, within a range including the position corresponding to the portion 114 specified in the inspection image 132, a re-inspection pattern 134 whose pitch a is smaller than that of the inspection pattern 112 is drawn by the two-dimensionally scanned guide light Q. In this way, on the machined surface 8 of the machining object 7, the re-inspection pattern 134, which is narrower than the drawing area 136 of the inspection pattern 130 and has a finer pitch a, is drawn with the guide light Q in the drawing area 136 instead of the inspection pattern 130.
  • the re-inspection image 138 is acquired by photographing the processed surface 8 of the processed object 7 with the camera 103 exposed for a relatively long time.
  • oil stains on the f ⁇ lens 19 are detected by the above comparison and collation.
  • the inspection image 132 and the re-inspection image 138 are superimposed and represented, and the inspection pattern 130 and the re-inspection pattern 134 are superimposed and represented.
  • the inspection pattern 130 is projected on the inspection image 132
  • the re-inspection pattern 134 is projected on the re-inspection image 138.
  • the inspection pattern 130 when the f ⁇ lens 19 and the camera 103 are not dirty is used as a reference comparison pattern of the inspection pattern 130.
  • the re-inspection pattern 134 when the f ⁇ lens 19 and the camera 103 are not dirty is used as a reference comparison pattern of the re-inspection pattern 134.
  • the data of the comparison patterns are also stored in advance in the ROM 63, HDD 66, CD-ROM 57, or the like of the print information creating unit 2 for comparison and collation.
  • When specifying the range of a stain on the fθ lens 19 or the camera 103, the laser marker 1 of the present embodiment may draw the inspection pattern 112 on the machined surface 8 of the machining object 7 instead of the inspection pattern 130, whose pitch a is larger than that of the inspection pattern 112.
  • the laser marker 1 of the present embodiment detects dirt on the pointer light emitter 105.
  • the pointer light P is emitted from the pointer light emitter 105 toward the machined surface 8 of the machined object 7.
  • When the pointer light emitter 105 is not dirty, the inspection image 110 shown in FIG. 9A is acquired. In this inspection image 110, the irradiation region 122 of the pointer light P is projected on the machined surface 8 of the machining object 7 and has a circular outer shape, as shown in FIG. 9A.
  • When the pointer light emitter 105 is contaminated with oil, an inspection image 110 as shown in FIG. 9B is acquired. In this inspection image 110, the irradiation region 122 of the pointer light P is projected with its shape changed to an ellipse. Besides such a shape change, the irradiation region 122 of the pointer light P may be projected with its position displaced or its area changed, and interference fringes or abnormal light may be projected outside the irradiation region 122 of the pointer light P.
  • When the pointer light emitter 105 is contaminated with dust, an inspection image 110 as shown in FIG. 9C is acquired. In this inspection image 110, the irradiation region 122 of the pointer light P is projected with the portion 124 corresponding to the dust stain missing.
  • The laser marker 1 of the present embodiment detects stains on the pointer light emitter 105 by comparing and collating the irradiation region 122 in the inspection image 110 acquired by the camera 103 with the irradiation region 122 obtained when the pointer light emitter 105 is not dirty.
  • As in the case described above (that is, when detecting stains on the fθ lens 19 and the camera 103), the comparison and collation may be performed on the condition that the machined surface 8 of the machining object 7 is set at a predetermined position, or the irradiation region 122 may be enlarged or reduced based on the working distance L at the time the inspection image 110 is acquired. Further, the pointer light P may be irradiated onto a surface other than the machined surface 8 of the machining object 7.
  • Hereinafter, the irradiation region 122 obtained when the pointer light emitter 105 is not dirty is referred to as the "comparison region that serves as a reference for the irradiation region 122".
  • the data of the reference area of the irradiation area 122 is stored in advance in the ROM 63, HDD 66, CD-ROM 57, or the like of the print information creating unit 2 for the above-mentioned comparison and collation.
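As an illustration of how such a comparison of the irradiation region 122 could be carried out, the hypothetical sketch below segments the bright spot, fits an ellipse, and compares its centre, area, and axis ratio against values recorded when the pointer light emitter was clean. All thresholds and tolerances here are assumptions.

```python
import cv2
import numpy as np

def spot_metrics(gray: np.ndarray, thresh: int = 200):
    """Measure the centre, area, and axis ratio of the brightest blob (the irradiation region)."""
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    spot = max(contours, key=cv2.contourArea)    # largest bright blob
    (cx, cy), (a, b), _ = cv2.fitEllipse(spot)   # needs at least 5 contour points
    ratio = max(a, b) / min(a, b) if min(a, b) else float("inf")
    return {"center": (cx, cy), "area": cv2.contourArea(spot), "axis_ratio": ratio}

def spot_is_dirty(metrics, ref, pos_tol=10.0, area_tol=0.2, ratio_tol=1.2):
    """Flag displacement, area change, or a circle deformed into an ellipse."""
    shifted = np.hypot(metrics["center"][0] - ref["center"][0],
                       metrics["center"][1] - ref["center"][1]) > pos_tol
    resized = abs(metrics["area"] - ref["area"]) > area_tol * ref["area"]
    deformed = metrics["axis_ratio"] > ratio_tol
    return shifted or resized or deformed
```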
  • The program of the detection method 200 shown in the flowcharts of FIGS. 10 to 13 is stored in the ROM 63 of the control unit 51 and is executed by the CPU 61 of the control unit 51 when detecting stains on the fθ lens 19 and the camera 103. In the processing described below, when the control target is a component of the laser processing unit 3, the control is performed via the laser controller 6, except for the camera 103.
  • This program may be stored in the CD-ROM 57, read by the CD-ROM drive 58, and executed by the CPU 61. This program will be described below.
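Structurally, the detection method 200 runs four steps in sequence (S10 to S16). The sketch below shows only that flow; the injected callables are hypothetical stand-ins for the hardware control and image-processing steps, not APIs of the laser marker.

```python
def detection_method_200(draw, capture, analyze, diagnose):
    """Run the four steps of detection method 200 in order and return the diagnosis."""
    draw()                     # S10: drawing process (scan the guide light Q in a grid)
    image = capture()          # S12: acquisition process (photograph the machined surface 8)
    findings = analyze(image)  # S14: analysis process (compare with the comparison pattern)
    return diagnose(findings)  # S16: diagnostic process (decide what, if anything, is dirty)
```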
  • First, the drawing process of step 10 (hereinafter, "step" is abbreviated as "S") is performed. In the drawing process S10, the inspection pattern 112 is drawn by the guide light Q on the machined surface 8 of the machining object 7. To that end, the guide light Q is emitted from the guide light unit 15, and the scanning mirrors 18X and 18Y of the galvano scanner 18 are rotated so that the swing angle of the galvano scanner 18 becomes a predetermined angle. As a result, a grid-like locus, that is, the inspection pattern 112, is drawn on the machined surface 8 of the machining object 7 by the two-dimensionally scanned guide light Q. Next, in the acquisition process S12, the camera 103 photographs the machined surface 8 of the machining object 7, whereby the inspection image 110 in which the inspection pattern 112 is projected is acquired. At that time, the inspection pattern 112 is photographed by the camera 103 with the exposure time value stored in the ROM 63.
  • In the analysis process S14, image analysis relating to the inspection pattern 112 is performed based on the inspection image 110. In the present embodiment, the above-described comparison and collation are performed as the image analysis, and the difference between the inspection pattern 112 projected in the inspection image 110 and the comparison pattern serving as its reference is detected. Specifically, information on the line width, position, and brightness of each part of the acquired inspection image 110 is extracted, compared with the appropriate values of the pattern drawing image stored in advance for the stain-free case, and any deviation found in this comparison is detected as a difference. Such differences include blurring of the inspection pattern 112 (see FIGS. 4A and 6), chipping (see FIG. 4B), distortion (see FIGS. 5A and 6), and a pattern different from the inspection pattern 112 (see FIG. 5B).
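As a purely illustrative example of how such differences might be told apart, the sketch below approximates blur by the variance of the Laplacian inside a flagged region and chipping by how little of the expected line area is actually lit. The thresholds and label mapping are assumptions, and distinguishing distortion or a foreign pattern would additionally require comparing line positions against the reference comparison pattern.

```python
import cv2
import numpy as np

def classify_region(inspection_gray, reference_gray, box, line_thresh=128):
    """Roughly label a flagged region as 'chip', 'blur', or 'other'."""
    x, y, w, h = box
    insp = inspection_gray[y:y + h, x:x + w]
    ref = reference_gray[y:y + h, x:x + w]
    expected = (ref > line_thresh).sum()   # pixels the reference line should cover
    observed = (insp > line_thresh).sum()  # pixels actually lit in the inspection image
    coverage = observed / expected if expected else 1.0
    sharpness = cv2.Laplacian(insp, cv2.CV_64F).var()
    if coverage < 0.5:
        return "chip"   # part of the pattern is missing (cf. FIG. 4B)
    if sharpness < 50.0:
        return "blur"   # edges washed out (cf. FIGS. 4A and 6)
    return "other"      # candidate distortion / foreign pattern (cf. FIGS. 5A and 5B)
```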
  • In the diagnostic process S16, the presence or absence of stains on at least one of the fθ lens 19 and the camera 103 is automatically diagnosed based on the inspection result of the image analysis in the analysis process S14. For this purpose, the program shown in the flowchart of FIG. 11 is executed by the CPU 61 of the control unit 51.
  • First, the determination process S20 is performed, in which it is determined whether the inspection result of the image analysis includes blurring or chipping of the inspection pattern 112. If it does (S20: YES), the determination process S22 is performed, in which it is determined whether the inspection result also includes distortion of the inspection pattern 112 or a pattern different from the inspection pattern 112. If it does (S22: YES), the specific process S24 is performed, in which both the fθ lens 19 and the camera 103 are identified as places where stains are present and this is displayed on the liquid crystal display 56; the process then returns to the flowchart of FIG. 10. If it does not (S22: NO), the specific process S26 is performed, in which the fθ lens 19 is identified as the place where the stain is present and this is displayed on the liquid crystal display 56; the process then returns to the flowchart of FIG. 10. If the inspection result includes neither blurring nor chipping of the inspection pattern 112 (S20: NO), the determination process S28 is performed, in which it is determined whether the inspection result includes distortion of the inspection pattern 112 or a pattern different from the inspection pattern 112. If it does (S28: YES), the specific process S30 is performed, in which the camera 103 is identified as the place where the stain is present and this is displayed on the liquid crystal display 56; the process then returns to the flowchart of FIG. 10. If it does not (S28: NO), the detection method 200 ends without any place being identified as stained, and the liquid crystal display 56 indicates that the fθ lens 19 and the camera 103 are clean.
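Expressed as code, the branching of FIG. 11 described above reduces to a small decision tree. The sketch assumes the analysis step returns a set of labels such as "blur", "chip", "distortion", and "foreign_pattern"; those label names are assumptions, not identifiers from the patent.

```python
def diagnose(findings: set) -> str:
    """Map the analysis findings to a stain location, following the logic of FIG. 11."""
    lens_signs = bool(findings & {"blur", "chip"})                      # S20
    camera_signs = bool(findings & {"distortion", "foreign_pattern"})   # S22 / S28
    if lens_signs and camera_signs:
        return "f-theta lens and camera dirty"   # S24
    if lens_signs:
        return "f-theta lens dirty"              # S26
    if camera_signs:
        return "camera dirty"                    # S30
    return "clean"
```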
  • Next, the program represented by the flowchart of FIG. 12 is executed by the CPU 61 of the control unit 51.
  • In the image generation process S40, an image 125 (see FIGS. 7A and 7B) is generated in which the portions with and without a difference can be visually distinguished, for example by the circle 126, the frame 128, or display in a different color. In the display process S42, the generated image 125 is displayed on the liquid crystal display 56. After that, the detection method 200 ends.
  • When detecting stains on the pointer light emitter 105, the program shown in the flowchart of FIG. 10 is likewise executed by the CPU 61 of the control unit 51. In that case, the irradiation region 122 of the pointer light P is used instead of the inspection pattern 112 of the guide light Q, and the comparison region serving as the reference for the irradiation region 122 is used instead of the comparison pattern serving as the reference for the inspection pattern 112. In the analysis process S14, a shape change, positional displacement, or area change of the irradiation region 122, or interference fringes or abnormal light projected outside the irradiation region 122, is detected as a difference between the irradiation region 122 projected in the inspection image 110 and the reference comparison region of the irradiation region 122. In the diagnostic process S16, when such a difference is detected, the liquid crystal display 56 indicates that the pointer light emitter 105 is dirty; when no difference is detected, the liquid crystal display 56 indicates that the pointer light emitter 105 is clean. After that, the detection method 200 ends.
  • When specifying the range of a stain on the fθ lens 19 or the camera 103 with coarse accuracy, the programs represented by the flowcharts of FIGS. 10 and 11 are executed by the CPU 61 of the control unit 51. In that case, the inspection pattern 130, whose pitch a is larger than that of the inspection pattern 112, is drawn instead of the inspection pattern 112 (S10), and the inspection image 132 in which the inspection pattern 130 is projected is acquired (S12). In the analysis, the comparison pattern serving as the reference for the inspection pattern 130 is used instead of the comparison pattern serving as the reference for the inspection pattern 112 (S14). In each of the specific processes S24, S26, and S30, a portion having blurring, chipping, distortion, or a pattern different from the inspection pattern 130 is specified in the inspection image 132, and the diagnostic process S16 is thereby performed. However, in each of the specific processes S24, S26, and S30, instead of identifying the fθ lens 19 or the camera 103 as the place where a stain is present, the liquid crystal display 56 indicates that the fθ lens 19 or the camera 103 may be dirty.
  • Next, the program shown in the flowchart of FIG. 13 is executed by the CPU 61 of the control unit 51. In the re-drawing process S50, a re-inspection pattern 134 whose pitch a is smaller than that of the inspection pattern 112 is drawn by the two-dimensionally scanned guide light Q on the machined surface 8 of the machining object 7, in a range including the positions corresponding to the places specified by the specific processes S24, S26, and S30. That is, within the drawing area 136 of the inspection pattern 130, the re-inspection pattern 134, which is narrower than the drawing area 136 and has a finer pitch a, is drawn with the guide light Q instead of the inspection pattern 130. Next, in the re-acquisition process S52, the camera 103 photographs the machined surface 8 of the machining object 7, whereby the re-inspection image 138 in which the re-inspection pattern 134 is projected is acquired. At that time, the re-inspection pattern 134 is photographed by the camera 103 with the exposure time value stored in the ROM 63. In the re-analysis process S54, image analysis relating to the re-inspection pattern 134 is performed based on the re-inspection image 138. In the present embodiment, the above-described comparison and collation are performed as the image analysis, and the difference between the re-inspection pattern 134 projected in the re-inspection image 138 and the comparison pattern serving as its reference is detected.
  • In the re-diagnosis process S56, the presence or absence of stains on at least one of the fθ lens 19 and the camera 103 is automatically diagnosed based on the inspection result of the image analysis in the re-analysis process S54. For this purpose, the program shown in the flowchart of FIG. 11 described above is executed by the CPU 61 of the control unit 51, and the program shown in the flowchart of FIG. 12 described above is also executed by the CPU 61 of the control unit 51. After that, the detection method 200 ends.
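The coarse-then-fine flow of FIGS. 10, 11, and 13 described above can be summarised as follows. The sketch is structural only: draw_pattern(), capture(), and analyze() are hypothetical stand-ins for the marker's drawing, acquisition, and analysis steps, and the refinement factor is an assumption.

```python
def reinspect(flagged_box, draw_pattern, capture, analyze, coarse_pitch=10.0):
    """After the coarse pass flags a region, redraw a finer pattern there and re-analyse."""
    fine_pitch = coarse_pitch / 4.0                     # assumed refinement factor
    draw_pattern(region=flagged_box, pitch=fine_pitch)  # S50: re-drawing process
    image = capture()                                   # S52: re-acquisition process
    return analyze(image, region=flagged_box)           # S54: re-analysis process
```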
  • As described above, in the laser marker 1 of the present embodiment, its control program, and the detection method 200, the inspection pattern 112 is drawn on the machined surface 8 of the machining object 7 by the guide light Q, which is two-dimensionally scanned by the galvano scanner 18 and condensed by the fθ lens 19 (S10); the machined surface 8 of the machining object 7 is photographed by the camera 103 to acquire the inspection image 110 in which the inspection pattern 112 is projected (S12); image analysis relating to the inspection pattern 112 is performed based on the inspection image 110 (S14); and the presence or absence of stains on at least one of the fθ lens 19 and the camera 103 is automatically diagnosed based on the inspection result of the image analysis (S16). In this way, the guide light Q scanned and condensed onto the machined surface 8 of the machining object 7 by the galvano scanner 18 and the fθ lens 19 is photographed by the camera 103, so that stains on at least one of the fθ lens 19 and the camera 103 can be detected quickly.
  • Since the laser marker 1 of the present embodiment performs the image analysis (S14) by detecting the difference between the inspection pattern 112 projected in the inspection image 110 and the comparison pattern serving as its reference, the automatic diagnosis (S16) can be performed relatively easily. Likewise, since the laser marker 1 of the present embodiment inspects for the presence or absence of distortion, blurring, or chipping of the inspection pattern 112 in the image analysis (S14) (S20, S22, S28), the automatic diagnosis (S16) can be performed relatively easily. Moreover, because the influence of a stain on the inspection pattern 112 differs between the fθ lens 19 and the camera 103, the laser marker 1 of the present embodiment identifies the camera 103 as the place where a stain is present when the inspection result of the image analysis shows distortion of the inspection pattern 112 or a pattern different from the inspection pattern 112 (S28: YES), and identifies the fθ lens 19 as the place where a stain is present when the inspection result shows blurring or chipping of the inspection pattern 112 (S20: YES). In this way, the laser marker 1 of the present embodiment identifies the place where a stain is present based on the inspection result of the image analysis, so that the stain can be dealt with easily.
  • Since the inspection pattern 112 of the laser marker 1 of the present embodiment is a grid pattern, it can also be used in combination with, for example, a pattern used in a scribed circle test.
  • When the diagnostic process S16 automatically diagnoses that a stain may be present (S24, S26, S30), the laser marker 1 of the present embodiment draws, on the machined surface 8 of the machining object 7, the re-inspection pattern 134, which is narrower than the drawing area 136 of the inspection pattern 130 and has a finer pitch a than the inspection pattern 112, with the guide light Q instead of the inspection pattern 130 (S50). The re-inspection image 138 in which the re-inspection pattern 134 is projected is then acquired by photographing the machined surface 8 of the machining object 7 (S52), image analysis is performed on the re-inspection pattern 134 based on the re-inspection image 138 (S54), and the presence or absence of stains on at least one of the fθ lens 19 and the camera 103 is automatically diagnosed based on the inspection result of the image analysis (S56). Thereby, the laser marker 1 of the present embodiment can detect stains on at least one of the fθ lens 19 and the camera 103 more quickly and accurately.
  • The laser marker 1 of the present embodiment generates an image 125 in which the portions with and without a difference are visually distinguished by the circle 126, the frame 128, or the like (S40), and displays the image 125 on the liquid crystal display 56 (S42). Thereby, the laser marker 1 of the present embodiment allows the position of a stain to be identified visually.
  • The laser marker 1 of the present embodiment also automatically diagnoses the presence or absence of stains on the pointer light emitter 105 by treating the irradiation region 122 of the pointer light P on the machined surface 8 of the machining object 7 in the same way as the inspection pattern 112 (S10 to S16). In this manner, the pointer light P irradiated onto the machined surface 8 of the machining object 7 by the pointer light emitter 105 is photographed by the camera 103, whereby stains on the pointer light emitter 105 can be detected.
  • Since the laser marker 1 of the present embodiment performs the image analysis (S14) by inspecting for displacement, chipping, shape change, or area change of the irradiation region 122 of the pointer light P, or for interference fringes or abnormal light outside the irradiation region 122 of the pointer light P, the automatic diagnosis (S16) can be performed relatively easily.
  • the laser marker 1 is an example of a "laser processing device”.
  • the machined surface 8 of the object to be machined 7 is an example of a “reference surface”.
  • the galvano scanner 18 and the f ⁇ lens 19 are examples of an “optical system”.
  • the visible semiconductor laser 28 is an example of a "guide light emitting unit".
  • the liquid crystal display 56 is an example of a "display”.
  • the camera 103 is an example of a “shooting unit”.
  • the pointer light emitter 105 is an example of a “pointer light irradiating unit”.
  • the drawing process S10 is an example of a “drawing step”.
  • the acquisition process S12 is an example of the “acquisition step”.
  • the analysis process S14 is an example of an “analysis step”.
  • the diagnostic process S16 is an example of a "diagnosis step”.
  • the inspection pattern 140 as shown in FIG. 14 may be drawn on the machined surface 8 of the machined object 7 instead of the grid-shaped inspection pattern 112. Since the inspection pattern 140 is a pattern in which a plurality of straight lines are arranged in parallel, it can be easily drawn on the machined surface 8 of the machined object 7 as compared with the grid-shaped inspection pattern 112.
  • the direction in which the plurality of straight lines are lined up in parallel may be any direction.
  • Although the liquid crystal display 56 is provided in the print information creating unit 2, which is composed of a personal computer or the like, it may instead be provided in the laser processing unit 3, or it may be independent of both the print information creating unit 2 and the laser processing unit 3.
  • The camera 103 is provided in the laser marker 1 in order to realize the function of aligning the focal position F of the processing laser light R and the guide light Q with the machined surface 8 of the machining object 7, but it may instead be provided in the laser marker 1 in order to realize functions such as confirming the processing finish or reading a one-dimensional or two-dimensional code, or it may be provided only for executing the detection method 200.

Abstract

To provide a detection method and a laser machining device, with which the deterioration of at least one among an optical system and an imaging unit can be detected by imaging, with an imaging unit, visible light that scans a reference surface with the optical system and is condensed. According to the present invention, a laser marker comprises: a visible semiconductor laser which emits visible light; a Galvano scanner and an fθ lens which are disposed between the visible semiconductor laser and a machining surface so that the visible light is condensed while scanning the machining surface of an object to be machined; and a camera which images the machining surface, wherein a drawing process S10 for drawing an inspection pattern on the machining surface by means of the visible light, an acquisition process S12 for acquiring an inspection image by projecting the inspection pattern by imaging the machining surface, an analysis process S14 for performing image analysis on the inspection pattern on the basis of the inspection image, and a diagnosis process S16 for automatically diagnosing whether at least one among the fθ lens and the camera is deteriorated are executed.

Description

Laser processing device and its control program, and detection method
The present disclosure relates to a laser processing device and a detection method for detecting stains on an optical system.
Conventionally, various techniques have been proposed for laser processing devices of this kind. For example, the technique described in Patent Document 1 below relates to a laser processing apparatus that includes a scanning lens for condensing a laser beam deflected by a deflection mirror onto a workpiece and that performs processing by irradiating a plurality of positions within a deflection region with the laser beam by means of the deflection mirror and the scanning lens. The apparatus is characterized by comprising energy measuring means for measuring the energy of the laser beam deflected and irradiated by the deflection mirror at a plurality of different measurement positions within the deflection region, and means for detecting dirt on the scanning lens based on the measurements by the energy measuring means.
Patent Document 1: Japanese Patent No. 3797327 (特許第3797327号公報)
In the above laser processing apparatus, the energy measuring means must be moved to the irradiation position in order to measure the energy of the laser beam. To measure the energy at a plurality of different points, the energy measuring means must be moved each time, which becomes a very time-consuming operation depending on the number of measurement points.
The present disclosure has been made in view of the above points, and its object is to provide a laser processing apparatus and a detection method capable of quickly detecting stains on at least one of an optical system and an imaging unit by photographing, with the imaging unit, visible light that is scanned and condensed on a reference surface by the optical system.
The present specification discloses a laser processing apparatus comprising: a guide light emitting unit that emits guide light, which is visible light; an optical system arranged between the guide light emitting unit and a reference surface in order to condense the guide light while scanning it over the reference surface; an imaging unit that photographs the reference surface; and a control unit. The control unit executes a drawing process of drawing an inspection pattern on the reference surface with the guide light, an acquisition process of acquiring an inspection image in which the inspection pattern is projected by photographing the reference surface, an analysis process of performing image analysis on the inspection pattern based on the inspection image, and a diagnostic process of automatically diagnosing the presence or absence of stains on at least one of the optical system and the imaging unit based on the inspection result of the image analysis.
The present specification also discloses a stain detection method for detecting the presence or absence of stains on a laser processing apparatus that comprises a guide light emitting unit that emits guide light, which is visible light, an optical system arranged between the guide light emitting unit and a reference surface in order to condense the guide light while scanning it over the reference surface, and an imaging unit that photographs the reference surface. The method comprises a drawing step of drawing an inspection pattern on the reference surface with the guide light, an acquisition step of acquiring an inspection image in which the inspection pattern is projected by photographing the reference surface, an analysis step of performing image analysis on the inspection pattern based on the inspection image, and a diagnostic step of automatically diagnosing the presence or absence of stains on at least one of the optical system and the imaging unit based on the inspection result of the image analysis.
According to the present disclosure, the laser processing apparatus, its control program, and the detection method can quickly detect contamination of at least one of the optical system and the imaging unit by using the imaging unit to photograph visible light that the optical system scans and focuses onto the reference surface.
FIG. 1 is a diagram showing the schematic configuration of the laser marker of the present embodiment.
FIG. 2 is a block diagram showing the electrical configuration of the laser marker.
FIG. 3 is a diagram showing the inspection pattern of the guide light in an inspection image acquired by the camera when neither the fθ lens of the laser marker nor the camera is contaminated.
FIG. 4(A) is a diagram showing the inspection pattern of the guide light in an inspection image acquired by the camera when the fθ lens of the laser marker is contaminated with oil, and FIG. 4(B) is a diagram showing the inspection pattern of the guide light in an inspection image acquired by the camera when the fθ lens is contaminated with dust.
FIG. 5(A) is a diagram showing the inspection pattern of the guide light in an inspection image acquired by the camera when the camera of the laser marker is contaminated with oil, and FIG. 5(B) is a diagram showing the inspection pattern of the guide light in an inspection image acquired by the camera when the camera is contaminated with dust.
FIG. 6 is a diagram showing the inspection pattern of the guide light in an inspection image acquired by the camera when both the fθ lens of the laser marker and the camera are contaminated with oil.
FIG. 7(A) is a diagram showing an image in which the oil-contaminated portion of the fθ lens is highlighted in the inspection pattern photographed by the camera when the fθ lens of the laser marker is contaminated with oil, and FIG. 7(B) is a diagram showing an image in which the oil-contaminated portion of the camera is highlighted in the inspection pattern photographed by the camera when the camera of the laser marker is contaminated with oil.
FIG. 8 is a diagram showing the inspection pattern and the re-inspection pattern of the guide light in the inspection image and the re-inspection image acquired by the camera when the fθ lens of the laser marker is contaminated with oil.
FIG. 9(A) is a diagram showing the irradiation region of the pointer light in an inspection image acquired by the camera when the pointer light emitter of the laser marker is not contaminated, FIG. 9(B) is a diagram showing the irradiation region of the pointer light in an inspection image acquired by the camera when the pointer light emitter is contaminated with oil, and FIG. 9(C) is a diagram showing the irradiation region of the pointer light in an inspection image acquired by the camera when the pointer light emitter is contaminated with dust.
FIGS. 10 to 13 are flowcharts showing the processes executed by the laser marker.
FIG. 14 is a plan view of the processing surface of a workpiece of the laser marker, showing a modified example of the inspection pattern drawn on the processing surface.
Hereinafter, the laser marker of the present disclosure will be described with reference to the drawings on the basis of a specific embodiment. In FIGS. 1 and 2, which are used in the following description, part of the basic configuration is omitted, and the dimensional proportions of the drawn parts are not necessarily accurate. In the following description, the up-down direction is as shown in FIG. 1.
[1. Schematic configuration of the laser marker]
First, the schematic configuration of the laser marker 1 of the present embodiment will be described with reference to FIGS. 1 and 2. The laser marker 1 of this embodiment is composed of a print information creating unit 2 and a laser processing unit 3. The print information creating unit 2 is composed of a personal computer or the like.
The laser processing unit 3 performs marking (printing) by two-dimensionally scanning a processing laser beam R over the processing surface 8 of a workpiece 7. The laser processing unit 3 includes a laser controller 6.
The laser controller 6 is composed of a computer and is connected to the print information creating unit 2 so as to be capable of bidirectional communication. The laser controller 6 drives and controls the laser processing unit 3 based on print information, control parameters, various instruction information, and the like transmitted from the print information creating unit 2.
The schematic configuration of the laser processing unit 3 will now be described. The laser processing unit 3 includes a laser oscillation unit 12, a guide light unit 15, a dichroic mirror 101, a focusing system 70, a pointer light emitter 105, a camera 103, a galvano scanner 18, an fθ lens 19, and the like, and is covered by a substantially rectangular-parallelepiped housing cover (not shown).
The laser oscillation unit 12 includes a laser oscillator 21 and the like. The laser oscillator 21 is a CO2 laser, a YAG laser, or the like, and emits the processing laser beam R. The beam diameter of the processing laser beam R is adjusted (for example, expanded) by a beam expander (not shown).
The guide light unit 15 includes a visible semiconductor laser 28 and the like. The visible semiconductor laser 28 emits guide light Q, which is visible coherent light, for example red laser light. The guide light Q is collimated by a lens group (not shown) and is then two-dimensionally scanned so that, for example, an image of the print pattern to be marked (printed) with the processing laser beam R, a rectangle surrounding that image, or an image of a predetermined shape is drawn as a trajectory (a temporal afterimage) on the processing surface 8 of the workpiece 7. In other words, the guide light Q has no marking (printing) capability. In the present embodiment the predetermined shape is a grid, which will be described in detail later.
The wavelength of the guide light Q differs from that of the processing laser beam R. In the present embodiment, for example, the wavelength of the processing laser beam R is 1064 nm and the wavelength of the guide light Q is 650 nm.
The dichroic mirror 101 transmits almost all of the incident processing laser beam R. The guide light Q enters the dichroic mirror 101 at an incidence angle of 45 degrees at the substantially central position where the processing laser beam R passes through, and is reflected at a reflection angle of 45 degrees onto the optical path of the processing laser beam R. The reflectance of the dichroic mirror 101 is wavelength-dependent. Specifically, the surface of the dichroic mirror 101 carries a multilayer film structure of dielectric layers and a metal layer, so that it has a high reflectance at the wavelength of the guide light Q and transmits almost all (99%) of light at other wavelengths.
The dashed-dotted line in FIG. 1 indicates the optical axis 10 of the processing laser beam R and the guide light Q. The direction of the optical axis 10 indicates the path direction of the processing laser beam R and the guide light Q.
The focusing system 70 includes a first lens 72, a second lens 74, and a moving mechanism 76. In the focusing system 70, the processing laser beam R and the guide light Q that have passed through the dichroic mirror 101 enter and pass through the first lens 72, which reduces their beam diameters. The processing laser beam R and the guide light Q that have passed through the first lens 72 then enter and pass through the second lens 74, which collimates them. The moving mechanism 76 includes a focusing-system motor 80 and a rack and pinion (not shown) that converts the rotary motion of the focusing-system motor 80 into linear motion; by controlling the rotation of the focusing-system motor 80, the second lens 74 is moved along the path direction of the processing laser beam R and the guide light Q.
The moving mechanism 76 may instead be configured to move the first lens 72 in place of the second lens 74, or to move both the first lens 72 and the second lens 74 so that the distance between them changes.
The galvano scanner 18 two-dimensionally scans the processing laser beam R and the guide light Q that have passed through the focusing system 70. In the galvano scanner 18, a galvano X-axis motor 31 and a galvano Y-axis motor 32 are mounted so that their motor shafts are orthogonal to each other, and scanning mirrors 18X and 18Y attached to the tips of the motor shafts face each other on the inside. By controlling the rotation of the motors 31 and 32 and thereby rotating the scanning mirrors 18X and 18Y, the processing laser beam R and the guide light Q are two-dimensionally scanned. The two scanning directions are the X direction and the Y direction.
The fθ lens 19 focuses the processing laser beam R and the guide light Q, two-dimensionally scanned by the galvano scanner 18, onto the processing surface 8 of the workpiece 7. The processing laser beam R and the guide light Q are therefore two-dimensionally scanned in the X and Y directions on the processing surface 8 of the workpiece 7 by the rotation control of the motors 31 and 32.
Because the processing laser beam R and the guide light Q have different wavelengths, if the distance between the first lens 72 and the second lens 74 of the focusing system 70 were fixed, the positions at which the processing laser beam R and the guide light Q come to a focus (hereinafter, "focal position F") would differ in the up-down direction. The focal positions F of the processing laser beam R and the guide light Q are therefore brought onto the processing surface 8 of the workpiece 7 by adjusting the distance between the first lens 72 and the second lens 74 of the focusing system 70.
Here, the distance between a reference position related to the position of the fθ lens 19 and the processing surface 8 of the workpiece 7 is referred to as the "working distance". In the present embodiment, the lower surface of the fθ lens 19 is used as the reference position related to the position of the fθ lens 19. That is, the working distance L of the present embodiment is the distance between the lower surface of the fθ lens 19 and the processing surface 8 of the workpiece 7. Besides the lower surface of the fθ lens 19, the reference position related to the position of the fθ lens 19 may be, for example, the upper surface of the fθ lens 19 or the vertical center of the fθ lens 19.
When the working distance L changes, the distance between the lower surface of the fθ lens 19 and the processing surface 8 of the workpiece 7 changes; in that case as well, the focal positions F of the processing laser beam R and the guide light Q are brought onto the processing surface 8 of the workpiece 7 by adjusting the distance between the first lens 72 and the second lens 74 of the focusing system 70.
The pointer light emitter 105 emits pointer light P, which is visible light, toward the processing surface 8 of the workpiece 7 and is provided near the fθ lens 19. As a result, the pointer light P is projected onto the processing surface 8 of the workpiece 7 as a relatively small circle. The pointer light P may be laser light without marking (printing) capability, as long as it is visible light.
The camera 103 is provided near the fθ lens 19 and faces the processing surface 8 of the workpiece 7. The camera 103 therefore captures, on the processing surface 8 of the workpiece 7, the projected circular pointer light P or the grid-shaped trajectory of the guide light Q drawn by the repeated two-dimensional scanning of the galvano scanner 18. In this way an image is captured that shows the processing surface 8 of the workpiece 7 and contains the circular pointer light P or the grid-shaped trajectory of the guide light Q. The working distance L is determined based on images captured by the camera 103 (images showing the guide light Q and the pointer light P on the processing surface 8 of the workpiece 7) and the like; this technique is publicly known and its description is omitted.
Next, the circuit configurations of the print information creating unit 2 and the laser processing unit 3 that constitute the laser marker 1 will be described with reference to FIG. 2. First, the circuit configuration of the laser processing unit 3 will be described.
As shown in FIG. 2, the laser processing unit 3 includes a laser controller 6, a galvano controller 35, a galvano driver 36, a laser driver 37, a semiconductor laser driver 38, a focusing-system driver 78, a pointer light emitter 105, a camera 103, and the like. The laser controller 6 controls the entire laser processing unit 3. The galvano controller 35, the laser driver 37, the semiconductor laser driver 38, the focusing-system driver 78, and the like are electrically connected to the laser controller 6. The external print information creating unit 2 is connected to the laser controller 6 and the camera 103 so as to be capable of bidirectional communication, and the pointer light emitter 105 is electrically connected to the laser controller 6. The laser controller 6 is configured to receive information transmitted from the print information creating unit 2 (for example, print information, control parameters for the laser processing unit 3, and various instruction information from the user). The pointer light emitter 105 is configured to receive information transmitted from the print information creating unit 2 (for example, on/off instruction information). The camera 103 is configured to receive information transmitted from the print information creating unit 2 (for example, imaging instruction information) and to transmit captured images to the print information creating unit 2.
The laser controller 6 includes a CPU 41, a RAM 42, a ROM 43, and the like. The CPU 41 is an arithmetic and control device that controls the entire laser processing unit 3. The CPU 41, the RAM 42, and the ROM 43 are connected to one another by a bus line (not shown) and exchange data with one another.
The RAM 42 temporarily stores various calculation results computed by the CPU 41, the (XY coordinate) data of print patterns, and the like.
The ROM 43 stores various programs, for example a program that calculates the XY coordinate data of a print pattern based on the print information transmitted from the print information creating unit 2 and stores it in the RAM 42, and a program that calculates the XY coordinate data of the grid-shaped trajectory drawn with the guide light Q and stores it in the RAM 42. In addition to the above, the programs include, for example, a program that stores in the RAM 42 various control parameters such as various delay values, the thickness, depth, and number of lines of the print pattern corresponding to the print information input from the print information creating unit 2, the laser output of the laser oscillator 21, the laser pulse width of the processing laser beam R, the speed at which the galvano scanner 18 scans the processing laser beam R, and the speed at which the galvano scanner 18 scans the guide light Q. The ROM 43 further stores parameters for distortion correction, offset correction data for the galvano scanner 18 and the pointer light P, and status information of the laser marker 1 (error information, number of processing operations, processing time, and the like).
The CPU 41 performs various calculations and controls based on the programs stored in the ROM 43.
The CPU 41 outputs, to the galvano controller 35, the XY coordinate data of the print pattern calculated from the print information input from the print information creating unit 2, the XY coordinate data of the grid-shaped trajectory of the guide light Q, and galvano scanning speed information indicating the speed at which the galvano scanner 18 scans the guide light Q and the speed at which it scans the processing laser beam R. The CPU 41 also outputs, to the laser driver 37, laser drive information indicating the laser output of the laser oscillator 21 and the laser pulse width of the processing laser beam R set based on the print information input from the print information creating unit 2.
The CPU 41 outputs to the semiconductor laser driver 38 an ON signal instructing the visible semiconductor laser 28 to start emitting or an OFF signal instructing it to stop.
Based on the information input from the laser controller 6 (for example, the XY coordinate data of the print pattern, the XY coordinate data of the grid-shaped trajectory of the guide light Q, and the galvano scanning speed information), the galvano controller 35 calculates the drive angles and rotation speeds of the galvano X-axis motor 31 and the galvano Y-axis motor 32, and outputs motor drive information indicating those drive angles and rotation speeds to the galvano driver 36. Based on the motor drive information input from the galvano controller 35, the galvano driver 36 drives and controls the galvano X-axis motor 31 and the galvano Y-axis motor 32 to two-dimensionally scan the processing laser beam R and the guide light Q.
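The document does not spell out how XY coordinate data is converted into drive angles, but for an ideal fθ lens the spot position is proportional to the beam deflection angle (x = f·θ). The following Python sketch shows one plausible form of that calculation under this assumption; the focal length value, the function names, and the omission of the factor of two between beam deflection and mirror rotation are illustrative, not taken from the document.

```python
import math

# Hypothetical focal length of the f-theta lens in millimetres (not stated in the text).
FTHETA_FOCAL_LENGTH_MM = 160.0

def xy_to_deflection_angles(x_mm, y_mm, f_mm=FTHETA_FOCAL_LENGTH_MM):
    """Ideal f-theta relation: position on the surface = focal length * deflection angle.
    Returns beam deflection angles in degrees; the factor of two between beam
    deflection and mirror rotation is deliberately ignored in this sketch."""
    return math.degrees(x_mm / f_mm), math.degrees(y_mm / f_mm)

def trajectory_to_drive_commands(points_mm, scan_speed_deg_s):
    """Turn a list of (x, y) target points into (angle_x, angle_y, speed) tuples,
    loosely analogous to the motor drive information sent to the galvano driver."""
    return [(*xy_to_deflection_angles(x, y), scan_speed_deg_s) for x, y in points_mm]
```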
The laser driver 37 drives the laser oscillator 21 based on the laser drive information input from the laser controller 6, which indicates the laser output of the laser oscillator 21, the laser pulse width of the processing laser beam R, and the like. The semiconductor laser driver 38 turns the visible semiconductor laser 28 on or off based on the ON signal or OFF signal input from the laser controller 6.
The focusing-system driver 78 drives and controls the focusing-system motor 80 based on the information input from the laser controller 6 to move the second lens 74.
Next, the circuit configuration of the print information creating unit 2 will be described. The print information creating unit 2 includes a control unit 51, an input operation unit 55, a liquid crystal display (LCD) 56, a CD-ROM drive 58, and the like. The input operation unit 55, the liquid crystal display 56, the CD-ROM drive 58, and the like are connected to the control unit 51 via an input/output interface (not shown).
The input operation unit 55 is composed of a mouse, a keyboard, and the like (not shown) and is used, for example, when the user inputs various instruction information.
The CD-ROM drive 58 reads various data, various application software, and the like from a CD-ROM 57.
The control unit 51 controls the entire print information creating unit 2 and can execute the image processing described later using known techniques. It includes a CPU 61, a RAM 62, a ROM 63, a hard disk drive (hereinafter "HDD") 66, and the like. The CPU 61 is an arithmetic and control device that controls the entire print information creating unit 2. The CPU 61, the RAM 62, and the ROM 63 are connected to one another by a bus line (not shown) and exchange data with one another. The CPU 61 and the HDD 66 are connected via an input/output interface (not shown) and exchange data with each other.
The RAM 62 temporarily stores various calculation results computed by the CPU 61. The ROM 63 stores various programs and the like. The ROM 63 further stores, for each font type, data such as the start point, end point, focal point, and curvature of each character font composed of straight lines and elliptical arcs.
The HDD 66 stores programs of various application software, various data files, and the like.
[2. Detection of contamination]
The laser marker 1 of the present embodiment detects contamination of the fθ lens 19 and the camera 103. To do so, a grid-shaped trajectory is drawn on the processing surface 8 of the workpiece 7 with the two-dimensionally scanned guide light Q. The processing surface 8 of the workpiece 7 is then photographed with the camera 103 using a relatively long exposure, whereby, for example, the inspection image 110 shown in FIG. 3 is acquired. In the inspection image 110 of FIG. 3, an inspection pattern 112 appears on the processing surface 8 of the workpiece 7. The inspection pattern 112 is a grid-shaped pattern, namely the trajectory drawn with the guide light Q on the processing surface 8 of the workpiece 7 when neither the fθ lens 19 nor the camera 103 is contaminated.
When the fθ lens 19 is contaminated with oil, however, an inspection image 110 such as that shown in FIG. 4(A) is acquired. In the inspection image 110 of FIG. 4(A), the portion 114 of the inspection pattern 112 corresponding to the oil contamination of the fθ lens 19 appears blurred. Oil on the fθ lens 19 refracts and scatters the light; because the fθ lens 19 is flat and has a relatively large aperture, this influence is considered to appear as blurring.
When the fθ lens 19 is contaminated with dust, an inspection image 110 such as that shown in FIG. 4(B) is acquired. In the inspection image 110 of FIG. 4(B), the portion 116 of the inspection pattern 112 corresponding to the dust on the fθ lens 19 appears missing. Such a gap is considered to arise, for example, because the guide light Q is blocked by the dust.
When the camera 103 is contaminated with oil, an inspection image 110 such as that shown in FIG. 5(A) is acquired. In the inspection image 110 of FIG. 5(A), the portion 118 of the inspection pattern 112 corresponding to the oil contamination of the camera 103 appears distorted. Oil on the camera 103 refracts and scatters the light; because the cover glass of the camera 103 is spherical and has a relatively small aperture, this influence is considered to appear as distortion.
When the camera 103 is contaminated with dust, an inspection image 110 such as that shown in FIG. 5(B) is acquired. In the inspection image 110 of FIG. 5(B), in addition to the inspection pattern 112, a portion 120 corresponding to the dust on the camera 103 appears as a pattern different from the inspection pattern 112. This is considered to occur, for example, because dust adhering to the cover glass of the camera 103 is photographed out of focus.
When both the fθ lens 19 and the camera 103 are contaminated with oil, an inspection image 110 such as that shown in FIG. 6 is acquired. In the inspection image 110 of FIG. 6, the portion 114 of the inspection pattern 112 corresponding to the oil contamination of the fθ lens 19 appears blurred, and the portion 118 corresponding to the oil contamination of the camera 103 appears distorted. As in the inspection pattern 112 of FIG. 6, when the portion 114 corresponding to the oil contamination of the fθ lens 19 and the portion 118 corresponding to the oil contamination of the camera 103 overlap, the overlapping portions 114 and 118 may appear both blurred and distorted.
Accordingly, the laser marker 1 of the present embodiment detects contamination of the fθ lens 19 and the camera 103 by comparing the inspection pattern 112 in the inspection image 110 acquired by the camera 103 against the inspection pattern 112 obtained when neither the fθ lens 19 nor the camera 103 is contaminated. This comparison may be performed on the condition that the processing surface 8 of the workpiece 7 is set at a predetermined position, or the inspection pattern 112 may be enlarged or reduced based on the working distance L at the time the inspection image 110 was acquired. The inspection pattern 112 may also be drawn on a surface other than the processing surface 8 of the workpiece 7.
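As an illustration of this comparison, the sketch below scales a stored reference pattern according to the working distance L and flags pixels where the captured inspection pattern deviates from it. It assumes Python with OpenCV and NumPy, a grayscale reference image, a simple inverse-proportional scaling law for the camera view, and an arbitrary intensity threshold; none of these details come from the document.

```python
import cv2
import numpy as np

REFERENCE_WD_MM = 200.0   # working distance at which the reference pattern was recorded (assumed)
DIFF_THRESHOLD = 40       # grey-level difference regarded as a deviation (assumed)

def compare_with_reference(inspection_img, reference_pattern, working_distance_mm):
    """Scale the stored reference pattern to the current working distance and
    return a binary mask of pixels where the captured pattern deviates from it."""
    gray = cv2.cvtColor(inspection_img, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape
    # Assume the pattern appears larger in the camera image when the surface is closer.
    scale = REFERENCE_WD_MM / working_distance_mm
    scaled = cv2.resize(reference_pattern, None, fx=scale, fy=scale)
    # Centre-crop or pad the scaled reference so it matches the camera frame size.
    canvas = np.zeros((h, w), dtype=np.uint8)
    sh, sw = scaled.shape[:2]
    y0, x0 = max((h - sh) // 2, 0), max((w - sw) // 2, 0)
    ys, xs = max((sh - h) // 2, 0), max((sw - w) // 2, 0)
    ch, cw = min(h, sh), min(w, sw)
    canvas[y0:y0 + ch, x0:x0 + cw] = scaled[ys:ys + ch, xs:xs + cw]
    diff = cv2.absdiff(gray, canvas)
    _, defect_mask = cv2.threshold(diff, DIFF_THRESHOLD, 255, cv2.THRESH_BINARY)
    return defect_mask
```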
In the following description, the inspection pattern 112 obtained when neither the fθ lens 19 nor the camera 103 is contaminated is referred to as the "comparison pattern serving as the reference for the inspection pattern 112". The data of this comparison pattern is stored in advance in the ROM 63, the HDD 66, the CD-ROM 57, or the like of the print information creating unit 2 for use in the comparison described above.
The laser marker 1 of the present embodiment also displays on the liquid crystal display 56 an image that emphasizes the portions where differences were detected in the comparison. Specifically, when the inspection image 110 of FIG. 4(A) is acquired, an image 125 is generated in which the portion 114 corresponding to the oil contamination of the fθ lens 19, that is, the portion where the inspection pattern 112 appears blurred, is enclosed by a circle 126, as shown in FIG. 7(A), and the generated image 125 is displayed on the liquid crystal display 56. When the inspection image 110 of FIG. 5(A) is acquired, an image 125 is generated in which the frame 128 of the comparison pattern serving as the reference for the inspection pattern 112 is placed so that its four corners coincide with the four corners of the inspection pattern 112, as shown in FIG. 7(B), and the generated image 125 is displayed on the liquid crystal display 56. In this way the liquid crystal display 56 displays an image 125 in which the portions with and without detected differences can be visually distinguished. Besides the circle 126 and the frame 128, the image 125 may also be generated by changing the display brightness or color, or by superimposing the inspection pattern 112 on its reference comparison pattern, so that the portions with and without detected differences can be visually distinguished, and then displayed on the liquid crystal display 56.
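A minimal sketch of how a highlighted image like that of FIG. 7(A) could be produced from such a deviation mask, continuing the OpenCV environment assumed above; the noise-area threshold and the extra margin added to the circle radius are arbitrary illustrative values.

```python
def highlight_defects(inspection_img, defect_mask, min_area_px=50):
    """Draw a circle around each connected deviation region, in the spirit of the
    circle 126 placed around the blurred portion in FIG. 7(A)."""
    overlay = inspection_img.copy()
    contours, _ = cv2.findContours(defect_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        if cv2.contourArea(contour) < min_area_px:
            continue  # ignore small noise blobs
        (cx, cy), radius = cv2.minEnclosingCircle(contour)
        cv2.circle(overlay, (int(cx), int(cy)), int(radius) + 10, (0, 0, 255), 2)
    return overlay
```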
The laser marker 1 of the present embodiment can also first identify, with coarse accuracy, the region in which the fθ lens 19 or the camera 103 is contaminated, and then detect the contamination of the fθ lens 19 and the camera 103 within that identified region. Such detection will be described taking the case of FIG. 4(A) (that is, the case where the fθ lens 19 is contaminated with oil) as an example. As shown in FIG. 8, an inspection pattern 130 whose pitch a is larger than that of the inspection pattern 112 described above is drawn on the processing surface 8 of the workpiece 7 with the two-dimensionally scanned guide light Q. The processing surface 8 of the workpiece 7 is then photographed with the camera 103 using a relatively long exposure, whereby an inspection image 132 is acquired.
In the inspection image 132, the portion where the inspection pattern 130 appears blurred is identified by the comparison described above as the portion 114 corresponding to the oil contamination. Then, on the processing surface 8 of the workpiece 7, a re-inspection pattern 134 whose pitch a is smaller than that of the inspection pattern 112 is drawn with the two-dimensionally scanned guide light Q in a range including the position corresponding to the portion 114 identified in the inspection image 132. In this way, within the drawing region 136 of the inspection pattern 130 on the processing surface 8 of the workpiece 7, the re-inspection pattern 134, which is narrower than the drawing region 136 of the inspection pattern 130 and has a finer pitch a, is drawn with the guide light Q in place of the inspection pattern 130.
The processing surface 8 of the workpiece 7 is then photographed with the camera 103 using a relatively long exposure, whereby a re-inspection image 138 is acquired. In the re-inspection image 138, the oil contamination of the fθ lens 19 is detected by the comparison described above.
In FIG. 8, for ease of explanation, the inspection image 132 and the re-inspection image 138 are shown superimposed, as are the inspection pattern 130 and the re-inspection pattern 134. In practice, however, the inspection pattern 130 appears in the inspection image 132 and the re-inspection pattern 134 appears in the re-inspection image 138.
In the comparison for the inspection image 132, the inspection pattern 130 obtained when neither the fθ lens 19 nor the camera 103 is contaminated is used as the comparison pattern serving as the reference for the inspection pattern 130. Likewise, in the comparison for the re-inspection image 138, the re-inspection pattern 134 obtained when neither the fθ lens 19 nor the camera 103 is contaminated is used as the comparison pattern serving as the reference for the re-inspection pattern 134. The data of these comparison patterns is also stored in advance in the ROM 63, the HDD 66, the CD-ROM 57, or the like of the print information creating unit 2.
When identifying the region in which the fθ lens 19 or the camera 103 is contaminated, the laser marker 1 of the present embodiment may draw the inspection pattern 112 on the processing surface 8 of the workpiece 7 instead of the inspection pattern 130, whose pitch a is larger than that of the inspection pattern 112.
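The coarse-to-fine procedure of FIG. 8 could be orchestrated roughly as follows, reusing compare_with_reference from the earlier sketch. The draw_pattern and capture_image callables stand in for the galvano drawing and camera acquisition steps, and the pitch values are placeholders, not values from the document.

```python
def coarse_to_fine_inspection(draw_pattern, capture_image, working_distance_mm,
                              coarse_reference, fine_reference):
    """Coarse-to-fine flow: draw a coarse-pitch grid, locate the suspect region,
    then redraw a fine-pitch grid only over that region and re-check it."""
    draw_pattern(pitch_mm=10.0, region=None)      # coarse pattern over the whole field
    mask = compare_with_reference(capture_image(), coarse_reference, working_distance_mm)
    if cv2.countNonZero(mask) == 0:
        return None                               # nothing suspicious was found
    x, y, w, h = cv2.boundingRect(cv2.findNonZero(mask))
    draw_pattern(pitch_mm=1.0, region=(x, y, w, h))   # finer pattern over the suspect region
    return compare_with_reference(capture_image(), fine_reference, working_distance_mm)
```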
The laser marker 1 of the present embodiment also detects contamination of the pointer light emitter 105. To do so, the pointer light P is emitted from the pointer light emitter 105 toward the processing surface 8 of the workpiece 7, and the processing surface 8 of the workpiece 7 is photographed with the camera 103, whereby, for example, the inspection image 110 shown in FIG. 9(A) is acquired. In the inspection image 110 of FIG. 9(A), the irradiation region 122 of the pointer light P appears on the processing surface 8 of the workpiece 7. When the pointer light emitter 105 is not contaminated, the outline of the irradiation region 122 of the pointer light P is circular, as shown in FIG. 9(A).
When the pointer light emitter 105 is contaminated with oil, however, an inspection image 110 such as that shown in FIG. 9(B) is acquired. In the inspection image 110 of FIG. 9(B), the shape of the irradiation region 122 of the pointer light P has changed and appears elliptical. When the pointer light emitter 105 is contaminated with oil, the irradiation region 122 of the pointer light P may also appear displaced or changed in area, in addition to the change in shape, and interference fringes or anomalous light may appear outside the irradiation region 122 of the pointer light P.
When the pointer light emitter 105 is contaminated with dust, an inspection image 110 such as that shown in FIG. 9(C) is acquired. In the inspection image 110 of FIG. 9(C), the portion 124 of the irradiation region 122 of the pointer light P corresponding to the dust appears missing.
Accordingly, the laser marker 1 of the present embodiment detects contamination of the pointer light emitter 105 by comparing the irradiation region 122 in the inspection image 110 acquired by the camera 103 against the irradiation region 122 obtained when the pointer light emitter 105 is not contaminated. As in the case described above (that is, when detecting contamination of the fθ lens 19 and the camera 103), this comparison may be performed on the condition that the processing surface 8 of the workpiece 7 is set at a predetermined position, or the irradiation region 122 may be enlarged or reduced based on the working distance L at the time the inspection image 110 was acquired. The pointer light P may also be projected onto a surface other than the processing surface 8 of the workpiece 7.
In the following description, the irradiation region 122 obtained when the pointer light emitter 105 is not contaminated is referred to as the "comparison region serving as the reference for the irradiation region 122". The data of this comparison region is stored in advance in the ROM 63, the HDD 66, the CD-ROM 57, or the like of the print information creating unit 2 for use in the comparison described above.
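For the pointer-light check, a sketch along the following lines could test the shape, position, and area of the spot against the stored comparison region, again assuming the OpenCV environment above; the circularity, shift, and area tolerances are illustrative assumptions rather than values from the document.

```python
def pointer_spot_is_clean(inspection_img, ref_center, ref_area,
                          area_tol=0.2, shift_tol_px=15.0, circularity_min=0.85):
    """Check that the pointer-light spot is roughly circular, near its reference
    position, and of similar area; any failure suggests contamination."""
    gray = cv2.cvtColor(inspection_img, cv2.COLOR_BGR2GRAY)
    _, bright = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(bright, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return False
    spot = max(contours, key=cv2.contourArea)     # largest bright blob = pointer spot
    area = cv2.contourArea(spot)
    perimeter = cv2.arcLength(spot, True)
    moments = cv2.moments(spot)
    if perimeter == 0 or moments["m00"] == 0:
        return False
    circularity = 4.0 * np.pi * area / (perimeter ** 2)   # 1.0 for a perfect circle
    cx, cy = moments["m10"] / moments["m00"], moments["m01"] / moments["m00"]
    shifted = np.hypot(cx - ref_center[0], cy - ref_center[1]) > shift_tol_px
    resized = abs(area - ref_area) / ref_area > area_tol
    return circularity >= circularity_min and not shifted and not resized
```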
[3. Control flow]
The program of the detection method 200 represented by the flowcharts of FIGS. 10 to 13 is stored in the ROM 63 of the control unit 51 and is executed by the CPU 61 of the control unit 51 when detecting contamination of the fθ lens 19 and the camera 103. Accordingly, in the processing described below, when the control target is a component of the laser processing unit 3, the control is performed via the laser controller 6, except for the camera 103. The program may also be stored on the CD-ROM 57, read by the CD-ROM drive 58, and executed by the CPU 61. The program is described below.
When the detection method 200 is executed, the drawing process of step (hereinafter simply "S") 10 is performed first in the program represented by the flowchart of FIG. 10. In this process, the inspection pattern 112 is drawn with the guide light Q on the processing surface 8 of the workpiece 7. To this end, the guide light Q is emitted from the guide light unit 15, and the scanning mirrors 18X and 18Y of the galvano scanner 18 are rotated so that the deflection angle of the galvano scanner 18 becomes a predetermined angle. As this rotation (scanning) is repeated, the grid-shaped trajectory (that is, the inspection pattern 112) is drawn on the processing surface 8 of the workpiece 7 by the guide light Q during two-dimensional scanning.
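One simple way to generate the XY coordinate data for such a grid-shaped trajectory is sketched below; the field size, pitch, and sampling step are placeholder values, and trajectory_to_drive_commands refers to the hypothetical helper from the earlier galvano sketch.

```python
import numpy as np

def grid_scan_points(field_mm=100.0, pitch_mm=10.0, step_mm=0.5):
    """XY points of a grid-shaped trajectory (horizontal then vertical lines)
    covering a square field centred on the optical axis."""
    half = field_mm / 2.0
    line_positions = np.arange(-half, half + 1e-9, pitch_mm)
    samples = np.arange(-half, half + 1e-9, step_mm)
    points = [(x, y) for y in line_positions for x in samples]   # horizontal lines
    points += [(x, y) for x in line_positions for y in samples]  # vertical lines
    return points

# Example: 100 mm field, 10 mm pitch, handed to the hypothetical angle converter above.
commands = trajectory_to_drive_commands(grid_scan_points(), scan_speed_deg_s=500.0)
```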
In the acquisition process S12, the camera 103 photographs the processing surface 8 of the workpiece 7, whereby the inspection image 110 in which the inspection pattern 112 appears is acquired. The inspection pattern 112 is photographed by the camera 103 using the exposure time value stored in the ROM 63.
In the analysis process S14, image analysis of the inspection pattern 112 is performed based on the inspection image 110. In this process, the comparison described above is performed as the image analysis, whereby differences between the inspection pattern 112 appearing in the inspection image 110 and the comparison pattern serving as its reference are detected. Specifically, information on the line width, position, and brightness of each part of the acquired inspection image 110 is extracted and compared with the appropriate values of the pattern drawing image stored in advance for the contamination-free case, and differences are detected based on the comparison results. Such differences include blurring of the inspection pattern 112 (see FIGS. 4(A) and 6), missing portions (see FIG. 4(B)), distortion (see FIGS. 5(A) and 6), and patterns different from the inspection pattern 112 (see FIG. 5(B)).
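In the spirit of S14, the following sketch extracts a crude per-cell measure of how much pattern is present (a proxy for line width and brightness) and compares it with the same measure on the stored reference; the cell size, the intensity threshold, and the ±30 % tolerances are illustrative assumptions rather than values from the document.

```python
def classify_deviations(inspection_img, reference_pattern, cell_px=64):
    """Per-cell comparison: count lit pixels in each cell of the captured image
    and compare with the same count on the stored reference pattern."""
    gray = cv2.cvtColor(inspection_img, cv2.COLOR_BGR2GRAY)
    ref = cv2.resize(reference_pattern, (gray.shape[1], gray.shape[0]))
    deviations = []
    for y in range(0, gray.shape[0], cell_px):
        for x in range(0, gray.shape[1], cell_px):
            got = int(np.count_nonzero(gray[y:y + cell_px, x:x + cell_px] > 128))
            exp = int(np.count_nonzero(ref[y:y + cell_px, x:x + cell_px] > 128))
            if exp == 0 and got > 0:
                deviations.append((x, y, "extra_pattern"))        # cf. FIG. 5(B)
            elif exp > 0 and got < 0.7 * exp:
                deviations.append((x, y, "missing"))              # cf. FIG. 4(B)
            elif exp > 0 and got > 1.3 * exp:
                deviations.append((x, y, "blur_or_distortion"))   # cf. FIGS. 4(A), 5(A)
    return deviations
```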
In the diagnosis process S16, the presence or absence of contamination of at least one of the fθ lens 19 and the camera 103 is automatically diagnosed based on the inspection result of the image analysis in the analysis process S14. To this end, the program represented by the flowchart of FIG. 11 is executed by the CPU 61 of the control unit 51.
In the program represented by the flowchart of FIG. 11, the determination process S20 is performed first. In this process, it is determined whether the inspection result of the image analysis shows blurring or missing portions of the inspection pattern 112. If the inspection result of the image analysis shows blurring or missing portions of the inspection pattern 112 (S20: YES), the determination process S22 is performed.
In this process, it is determined whether the inspection result of the image analysis also shows distortion of the inspection pattern 112 or a pattern different from the inspection pattern 112. If the inspection result of the image analysis also shows distortion of the inspection pattern 112 or a pattern different from the inspection pattern 112 (S22: YES), the identification process S24 is performed, which identifies both the fθ lens 19 and the camera 103 as locations where contamination is present and displays that fact on the liquid crystal display 56. The process then returns to the flowchart of FIG. 10. If, on the other hand, the inspection result of the image analysis shows neither distortion of the inspection pattern 112 nor a pattern different from the inspection pattern 112 (S22: NO), the identification process S26 is performed, which identifies the fθ lens 19 as the location where contamination is present and displays that fact on the liquid crystal display 56. The process then returns to the flowchart of FIG. 10.
If the inspection result of the image analysis shows neither blurring nor missing portions of the inspection pattern 112 (S20: NO), the determination process S28 is performed. In this process, it is determined whether the inspection result of the image analysis shows distortion of the inspection pattern 112 or a pattern different from the inspection pattern 112. If it does (S28: YES), the identification process S30 is performed, which identifies the camera 103 as the location where contamination is present and displays that fact on the liquid crystal display 56. The process then returns to the flowchart of FIG. 10. If, on the other hand, the inspection result of the image analysis shows neither distortion of the inspection pattern 112 nor a pattern different from the inspection pattern 112 (S28: NO), the detection method 200 ends without identifying any location of contamination, and the liquid crystal display 56 indicates that neither the fθ lens 19 nor the camera 103 is contaminated.
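The branching of S20 through S30 amounts to a small decision table, which the following sketch captures; the finding labels and return strings are illustrative, but the mapping of symptoms to components follows the description above.

```python
def diagnose(findings):
    """Decision logic in the spirit of S20 to S30: blurring or missing portions
    implicate the f-theta lens, distortion or an extra pattern implicates the
    camera, and both kinds of findings together implicate both components."""
    lens_symptoms = bool(findings & {"blur", "missing"})                 # S20
    camera_symptoms = bool(findings & {"distortion", "extra_pattern"})   # S22 / S28
    if lens_symptoms and camera_symptoms:
        return "fθ lens and camera contaminated"    # identification process S24
    if lens_symptoms:
        return "fθ lens contaminated"               # identification process S26
    if camera_symptoms:
        return "camera contaminated"                # identification process S30
    return "no contamination detected"

# Example: a blurred and distorted pattern implicates both the lens and the camera.
print(diagnose({"blur", "distortion"}))
```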
Returning to the flowchart of FIG. 10, the program represented by the flowchart of FIG. 12 is then executed by the CPU 61 of the control unit 51. In this program, the image generation process S40 is performed first. In this process, an image 125 is generated in which the portions with and without differences detected by the comparison performed as the image analysis in the analysis process S14 can be visually distinguished, for example by selectively adding the circle 126 or the frame 128 or by displaying them in different colors (see FIGS. 7(A) and 7(B)). In the display process S42, the generated image 125 is displayed on the liquid crystal display 56. The detection method 200 then ends.
When the detection method 200 is used to detect contamination of the pointer light emitter 105, the program represented by the flowchart of FIG. 10 is executed by the CPU 61 of the control unit 51. In that case, the irradiation region 122 of the pointer light P is used instead of the inspection pattern 112 of the guide light Q. Furthermore, as described above, the comparison performed as the image analysis in the analysis process S14 uses the comparison region serving as the reference for the irradiation region 122 instead of the comparison pattern serving as the reference for the inspection pattern 112. In this comparison, as described above, a change in the shape, position, or area of the irradiation region 122, or interference fringes or anomalous light appearing outside the irradiation region 122, is detected as a difference between the irradiation region 122 appearing in the inspection image 110 and the comparison region serving as its reference. In the diagnosis process S16, if a difference between the irradiation region 122 appearing in the inspection image 110 and the comparison region serving as its reference is detected, the liquid crystal display 56 indicates that the pointer light emitter 105 is contaminated; if no difference is detected, the liquid crystal display 56 indicates that the pointer light emitter 105 is not contaminated. The detection method 200 then ends.
When the detection method 200 is used to coarsely identify the region in which the fθ lens 19 or the camera 103 is contaminated and then to detect the contamination of the fθ lens 19 and the camera 103 within that identified region, the programs represented by the flowcharts of FIGS. 10 and 11 are executed first by the CPU 61 of the control unit 51. In that case, however, as shown in FIG. 8 above, the inspection pattern 130, whose pitch a is larger than that of the inspection pattern 112, is drawn instead of the inspection pattern 112 (S10), and the inspection image 132 in which the inspection pattern 130 appears is acquired (S12). In the comparison performed as the image analysis, the comparison pattern serving as the reference for the inspection pattern 130 is used instead of the comparison pattern serving as the reference for the inspection pattern 112 (S14).
 Further, in each of the identification processes S24, S26, and S30, a portion of the inspection image 132 that shows blurring, chipping, distortion, or a pattern different from the inspection pattern 130 is identified, and the diagnostic process S16 is then performed. In each of the identification processes S24, S26, and S30, however, instead of a message stating that the fθ lens 19 or the camera 103 has been identified as the location where dirt is present, a message stating that the fθ lens 19 or the camera 103 may be dirty is displayed on the liquid crystal display 56.
 Further, after the diagnostic process S16 is performed, the program represented by the flowchart of FIG. 13 is executed by the CPU 61 of the control unit 51. When the redrawing process S50 is performed, a re-inspection pattern 134 having a smaller pitch a than the inspection pattern 112 is drawn with the guide light Q during two-dimensional scanning on the machined surface 8 of the object 7, within a range that includes the positions corresponding to the portions identified in the identification processes S24, S26, and S30. As a result, as shown in FIG. 8, the re-inspection pattern 134, which is narrower than the drawing region 136 of the inspection pattern 130 and has a finer pitch a, is drawn with the guide light Q within the drawing region 136 on the machined surface 8 of the object 7, in place of the inspection pattern 130.
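 As an aid to understanding, a grid inspection pattern at a given pitch and within a given rectangle could be generated roughly as sketched below. The Segment type, the coordinate ranges, and the pitch values are placeholders for illustration and do not represent the device's actual scan commands.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Segment:
    start: Tuple[float, float]  # (x, y) in scan-field coordinates
    end: Tuple[float, float]

def grid_pattern(x0: float, y0: float, x1: float, y1: float,
                 pitch: float) -> List[Segment]:
    """Line segments of a grid inspection pattern with the given pitch
    inside the rectangle (x0, y0)-(x1, y1)."""
    segments: List[Segment] = []
    # Vertical lines spaced by the pitch
    x = x0
    while x <= x1:
        segments.append(Segment((x, y0), (x, y1)))
        x += pitch
    # Horizontal lines spaced by the pitch
    y = y0
    while y <= y1:
        segments.append(Segment((x0, y), (x1, y)))
        y += pitch
    return segments

# Coarse pattern over the full field, then a finer pattern restricted to
# a flagged sub-region (the numbers are placeholders, not device settings).
coarse = grid_pattern(0, 0, 100, 100, pitch=10.0)
fine = grid_pattern(30, 30, 60, 60, pitch=2.0)
```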
 In the re-acquisition process S52, the camera 103 photographs the machined surface 8 of the object 7, whereby a re-inspection image 138 showing the re-inspection pattern 134 is acquired. At that time, the re-inspection pattern 134 is photographed by the camera 103 using the exposure time value stored in the ROM 63.
 In the re-analysis process S54, image analysis of the re-inspection pattern 134 is performed on the basis of the re-inspection image 138. In this process, the comparison and collation described above is performed as the image analysis, whereby a difference between the re-inspection pattern 134 shown in the re-inspection image 138 and the comparison pattern serving as the reference for the re-inspection pattern 134 is detected.
 In the re-diagnosis process S56, the presence or absence of dirt on at least one of the fθ lens 19 and the camera 103 is automatically diagnosed on the basis of the result of the image analysis in the re-analysis process S54. For that purpose, the program represented by the flowchart of FIG. 11 described above is executed by the CPU 61 of the control unit 51, and the program represented by the flowchart of FIG. 12 described above is also executed by the CPU 61 of the control unit 51. The detection method 200 then ends.
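 The overall coarse-to-fine flow (S10 to S16 followed, when needed, by S50 to S56) can be summarized by the sketch below. All of the callables (draw_pattern, capture_image, analyze, diagnose) are caller-supplied stand-ins introduced only for this illustration; they are not APIs of the device or its control program.

```python
def detect_dirt(draw_pattern, capture_image, analyze, diagnose,
                full_field, coarse_pitch=10.0, fine_pitch=2.0):
    """Coarse-to-fine dirt detection mirroring S10-S16, then S50-S56.
    All dependencies are passed in as callables."""
    # S10-S12: draw a coarse pattern over the whole field and photograph it
    draw_pattern(region=full_field, pitch=coarse_pitch)
    coarse_image = capture_image()

    # S14-S16: compare against the reference and flag suspicious regions
    flagged_regions = analyze(coarse_image, pitch=coarse_pitch)
    if not flagged_regions:
        return ["no dirt detected"]

    # S50-S56: for each flagged region, redraw a finer pattern inside
    # that region only, photograph again, and re-diagnose
    results = []
    for region in flagged_regions:
        draw_pattern(region=region, pitch=fine_pitch)
        fine_image = capture_image()
        differences = analyze(fine_image, pitch=fine_pitch)
        results.append(diagnose(differences))
    return results
```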
[4. Summary]
 As described in detail above, the laser marker 1 of the present embodiment, its control program, and the detection method 200 draw the inspection pattern 112 on the machined surface 8 of the object 7 with the guide light Q, which is two-dimensionally scanned by the galvano scanner 18 and condensed by the fθ lens 19 (S10), acquire the inspection image 110 showing the inspection pattern 112 by photographing the machined surface 8 of the object 7 with the camera 103 (S12), perform image analysis of the inspection pattern 112 on the basis of the inspection image 110 (S14), and automatically diagnose the presence or absence of dirt on at least one of the fθ lens 19 and the camera 103 on the basis of the result of the image analysis (S16). As a result, by photographing with the camera 103 the guide light Q scanned and condensed onto the machined surface 8 of the object 7 by the galvano scanner 18 and the fθ lens 19, the laser marker 1 and the detection method 200 of the present embodiment can quickly detect dirt on at least one of the fθ lens 19 and the camera 103 without having to move a measuring instrument such as a power meter for the inspection.
 Further, since the laser marker 1 of the present embodiment performs the image analysis by detecting the difference between the inspection pattern 112 shown in the inspection image 110 and the comparison pattern serving as the reference for the inspection pattern 112 (S14), the automatic diagnosis (S16) can be performed relatively easily.
 That is, since the laser marker 1 of the present embodiment inspects, in the image analysis (S14), whether the inspection pattern 112 is distorted, blurred, or chipped (S20, S22, S28), the automatic diagnosis (S16) can be performed relatively easily.
 Further, because dirt affects the inspection pattern 112 differently depending on whether it is on the fθ lens 19 or on the camera 103, the laser marker 1 of the present embodiment identifies the camera 103 as the location where dirt is present when the result of the image analysis is that the inspection pattern 112 is distorted or that a pattern different from the inspection pattern 112 is present (S28: YES), and identifies the fθ lens 19 as the location where dirt is present when the image analysis finds that the inspection pattern 112 is blurred or chipped (S20: YES). In this way, the laser marker 1 of the present embodiment identifies the location where the dirt is present on the basis of the result of the image analysis, which makes the dirt easier to deal with.
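 The location-identification rule stated here amounts to a simple mapping from the kind of anomaly found in the inspection image to the suspected component. The sketch below only restates that rule; the enumeration and function are hypothetical helpers, not part of the disclosed program.

```python
from enum import Enum, auto

class Anomaly(Enum):
    DISTORTION = auto()       # pattern geometry is warped
    FOREIGN_PATTERN = auto()  # a pattern unlike the drawn one appears
    BLUR = auto()             # lines are smeared
    CHIP = auto()             # parts of lines are missing

def suspected_component(anomaly: Anomaly) -> str:
    """Map the anomaly type found by image analysis to the component
    suspected of being dirty, following the rule in the text."""
    if anomaly in (Anomaly.DISTORTION, Anomaly.FOREIGN_PATTERN):
        return "camera 103"    # corresponds to S28: YES
    if anomaly in (Anomaly.BLUR, Anomaly.CHIP):
        return "fθ lens 19"    # corresponds to S20: YES
    return "unknown"
```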
 Further, since the inspection pattern 112 of the laser marker 1 of the present embodiment is a grid pattern, it can also serve, for example, as the kind of pattern used in a scribed circle test.
 Further, when the diagnostic process S16 automatically diagnoses that dirt may be present (S24, S26, S30), the laser marker 1 of the present embodiment draws, with the guide light Q and in place of the inspection pattern 130, the re-inspection pattern 134, which is narrower than the drawing region 136 of the inspection pattern 130 drawn on the machined surface 8 of the object 7 and has a finer pitch a than the inspection pattern 112, within the drawing region 136 (S50). It then acquires the re-inspection image 138 showing the re-inspection pattern 134 by photographing the machined surface 8 of the object 7 (S52), performs image analysis of the re-inspection pattern 134 on the basis of the re-inspection image 138 (S54), and automatically diagnoses the presence or absence of dirt on at least one of the fθ lens 19 and the camera 103 on the basis of the result of that image analysis (S56). As a result, the laser marker 1 of the present embodiment can detect dirt on at least one of the fθ lens 19 and the camera 103 quickly and with higher accuracy.
 Further, the laser marker 1 of the present embodiment generates the image 125, in which the portions with no difference and the portions with a difference are made visually distinguishable by the circle 126, the frame 128, or the like (S40), and displays the image 125 on the liquid crystal display 56 (S42). This allows the position of the dirt to be identified visually.
 Further, the laser marker 1 of the present embodiment automatically diagnoses the presence or absence of dirt on the pointer light emitter 105 by treating the irradiation region 122 of the pointer light P on the machined surface 8 of the object 7 as the inspection pattern 112 (S10 to S16). Thus, the laser marker 1 of the present embodiment can detect dirt on the pointer light emitter 105 by photographing, with the camera 103, the pointer light P projected onto the machined surface 8 of the object 7 by the pointer light emitter 105.
 Further, since the laser marker 1 of the present embodiment performs the image analysis by inspecting whether the irradiation region 122 of the pointer light P is shifted, chipped, changed in shape, or changed in area, or whether interference fringes or abnormal light appear outside the irradiation region 122 of the pointer light P (S14), the automatic diagnosis (S16) can be performed relatively easily.
 Incidentally, the laser marker 1 is an example of a "laser processing device". The machined surface 8 of the object 7 is an example of a "reference surface". The galvano scanner 18 and the fθ lens 19 are an example of an "optical system". The visible semiconductor laser 28 is an example of a "guide light emitting unit". The liquid crystal display 56 is an example of a "display". The camera 103 is an example of a "photographing unit". The pointer light emitter 105 is an example of a "pointer light irradiation unit". The drawing process S10 is an example of a "drawing step". The acquisition process S12 is an example of an "acquisition step". The analysis process S14 is an example of an "analysis step". The diagnostic process S16 is an example of a "diagnosis step".
[5. Others]
 The present disclosure is not limited to the present embodiment, and various modifications are possible without departing from its spirit.
 For example, in the laser marker 1 of the present embodiment, an inspection pattern 140 as shown in FIG. 14 may be drawn on the machined surface 8 of the object 7 instead of the grid-shaped inspection pattern 112. Since the inspection pattern 140 is a pattern in which a plurality of straight lines are arranged in parallel, it can be drawn on the machined surface 8 of the object 7 more easily than the grid-shaped inspection pattern 112. The straight lines may be arranged in parallel in any direction.
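 For illustration, a parallel-line pattern like the one in FIG. 14 corresponds to the grid sketch shown earlier with one of its two scan passes dropped, which is why it is simpler to draw. The variant below is a hypothetical sketch, not part of the disclosed device.

```python
from typing import List, Tuple

Line = Tuple[Tuple[float, float], Tuple[float, float]]  # (start, end)

def parallel_line_pattern(x0: float, y0: float, x1: float, y1: float,
                          pitch: float, horizontal: bool = True) -> List[Line]:
    """Straight lines spaced by `pitch` inside (x0, y0)-(x1, y1); only a
    single scan direction is needed, unlike the grid pattern."""
    lines: List[Line] = []
    if horizontal:
        y = y0
        while y <= y1:
            lines.append(((x0, y), (x1, y)))
            y += pitch
    else:
        x = x0
        while x <= x1:
            lines.append(((x, y0), (x, y1)))
            x += pitch
    return lines
```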
 Further, although the liquid crystal display 56 is provided in the print information creation unit 2, which is composed of a personal computer or the like, it may instead be provided in the laser processing unit 3, or may be independent of both the print information creation unit 2 and the laser processing unit 3.
 Further, although the camera 103 is provided in the laser marker 1 in order to realize the function of aligning the focal position F of the processing laser light R and the guide light Q with the machined surface 8 of the object 7, it may instead be provided in the laser marker 1 to realize functions such as checking the finish of laser processing or reading a one-dimensional code or a two-dimensional code, or it may be provided in the laser marker 1 solely for executing the detection method 200.
1: laser processing device, 8: machined surface of the object to be processed, 28: visible semiconductor laser, 51: control unit, 56: liquid crystal display, 103: camera, 105: pointer light emitter, 110: inspection image, 112: inspection pattern, 122: irradiation region of pointer light, 125: image, 130: inspection pattern, 132: inspection image, 134: re-inspection pattern, 136: drawing region of inspection pattern, 138: re-inspection image, 140: inspection pattern, 200: detection method, a: pitch, P: pointer light, Q: guide light, S10: drawing process, S12: acquisition process, S14: analysis process, S16: diagnostic process, S40: image generation process, S42: display process, S50: redrawing process, S52: re-acquisition process, S54: re-analysis process, S56: re-diagnosis process

Claims (14)

  1.  A laser processing device comprising:
     a guide light emitting unit that emits guide light, which is visible light;
     an optical system arranged between the guide light emitting unit and a reference surface in order to condense the guide light while scanning it over the reference surface;
     a photographing unit that photographs the reference surface; and
     a control unit,
     wherein the control unit executes:
     a drawing process of drawing an inspection pattern on the reference surface with the guide light;
     an acquisition process of acquiring an inspection image in which the inspection pattern is shown, by photographing the reference surface;
     an analysis process of performing image analysis of the inspection pattern on the basis of the inspection image; and
     a diagnostic process of automatically diagnosing, on the basis of a result of the image analysis, the presence or absence of dirt on at least one of the optical system and the photographing unit.
  2.  The laser processing device according to claim 1, wherein the analysis process performs the image analysis by detecting a difference between the inspection pattern shown in the inspection image and a comparison pattern serving as a reference for the inspection pattern.
  3.  The laser processing device according to claim 1 or 2, wherein the analysis process inspects, in the image analysis, whether the inspection pattern is distorted, blurred, or chipped.
  4.  The laser processing device according to any one of claims 1 to 3, wherein the diagnostic process identifies a location where the dirt is present on the basis of the result of the image analysis.
  5.  The laser processing device according to claim 4, wherein the diagnostic process identifies the photographing unit as the location where the dirt is present when the result of the image analysis is that the inspection pattern is distorted or that a pattern different from the inspection pattern is present.
  6.  The laser processing device according to claim 4, wherein the diagnostic process identifies the optical system as the location where the dirt is present when the image analysis finds that the inspection pattern is blurred or chipped.
  7.  The laser processing device according to any one of claims 1 to 6, wherein the inspection pattern is a grid pattern.
  8.  The laser processing device according to any one of claims 1 to 6, wherein the inspection pattern is a pattern in which a plurality of straight lines are arranged in parallel.
  9.  The laser processing device according to claim 7 or 8, wherein, when the dirt is automatically diagnosed as being present, the control unit executes:
     a redrawing process of drawing, with the guide light and in place of the inspection pattern, a re-inspection pattern that is narrower than the drawing region of the inspection pattern drawn on the reference surface and has a finer pitch, within that drawing region;
     a re-acquisition process of acquiring a re-inspection image in which the re-inspection pattern is shown, by photographing the reference surface;
     a re-analysis process of performing the image analysis on the re-inspection pattern on the basis of the re-inspection image; and
     a re-diagnosis process of automatically diagnosing, on the basis of a result of the image analysis, the presence or absence of the dirt on at least one of the optical system and the photographing unit.
  10.  The laser processing device according to claim 2, further comprising a display,
     wherein the control unit executes:
     an image generation process of generating an image in which a portion with no difference and a portion with the difference are made visually distinguishable; and
     a display process of displaying the image on the display.
  11.  The laser processing device according to claim 1 or 2, further comprising a pointer light irradiation unit that irradiates the reference surface with pointer light, which is visible light,
     wherein the control unit automatically diagnoses the presence or absence of dirt on the pointer light irradiation unit by treating an irradiation region of the pointer light on the reference surface as the inspection pattern.
  12.  The laser processing device according to claim 11, wherein the analysis process inspects, in the image analysis, whether the irradiation region of the pointer light is shifted, chipped, changed in shape, or changed in area, or whether interference fringes or abnormal light appear outside the irradiation region of the pointer light.
  13.  A dirt detection method for detecting the presence or absence of dirt in a laser processing device that comprises a guide light emitting unit that emits guide light, which is visible light, an optical system arranged between the guide light emitting unit and a reference surface in order to condense the guide light while scanning it over the reference surface, and a photographing unit that photographs the reference surface, the method comprising:
     a drawing step of drawing an inspection pattern on the reference surface with the guide light;
     an acquisition step of acquiring an inspection image in which the inspection pattern is shown, by photographing the reference surface;
     an analysis step of performing image analysis of the inspection pattern on the basis of the inspection image; and
     a diagnosis step of automatically diagnosing, on the basis of a result of the image analysis, the presence or absence of dirt on at least one of the optical system and the photographing unit.
  14.  A control program for a laser processing device that comprises a guide light emitting unit that emits guide light, which is visible light, an optical system arranged between the guide light emitting unit and a reference surface in order to condense the guide light while scanning it over the reference surface, a photographing unit that photographs the reference surface, and a control unit, the control program causing the laser processing device to execute:
     a drawing process of drawing an inspection pattern on the reference surface with the guide light;
     an acquisition process of acquiring an inspection image in which the inspection pattern is shown, by photographing the reference surface;
     an analysis process of performing image analysis of the inspection pattern on the basis of the inspection image; and
     a diagnostic process of automatically diagnosing, on the basis of a result of the image analysis, the presence or absence of dirt on at least one of the optical system and the photographing unit.
PCT/JP2021/047230 2020-12-25 2021-12-21 Laser machining device and control program of same, and detection method WO2022138617A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020216677A JP2022102125A (en) 2020-12-25 2020-12-25 Laser processing device, control program of the same and detection method
JP2020-216677 2020-12-25

Publications (1)

Publication Number Publication Date
WO2022138617A1 true WO2022138617A1 (en) 2022-06-30

Family

ID=82159298

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/047230 WO2022138617A1 (en) 2020-12-25 2021-12-21 Laser machining device and control program of same, and detection method

Country Status (2)

Country Link
JP (1) JP2022102125A (en)
WO (1) WO2022138617A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0957479A (en) * 1995-08-28 1997-03-04 Amada Co Ltd Laser machining head
JP2010240674A (en) * 2009-04-02 2010-10-28 Ihi Corp Laser welding device and laser welding method
JP2013099783A (en) * 2011-10-17 2013-05-23 Toshiba Corp Laser irradiation device and method for diagnosing integrity of laser irradiation head
JP2016172261A (en) * 2015-03-17 2016-09-29 日立造船株式会社 Laser welding apparatus


Also Published As

Publication number Publication date
JP2022102125A (en) 2022-07-07

Similar Documents

Publication Publication Date Title
US9492889B2 (en) Laser processing machine
CN104416290B (en) Laser processing apparatus
JP5120625B2 (en) Inner surface measuring device
JP3878165B2 (en) 3D measuring device
JP5385703B2 (en) Inspection device, inspection method, and inspection program
JPH04105341A (en) Method and equipment for detecting bending and floating of lead of semiconductor device
WO2017110786A1 (en) Laser processing device
WO2011114407A1 (en) Method for measuring wavefront aberration and device of same
WO2022138617A1 (en) Laser machining device and control program of same, and detection method
JP5042503B2 (en) Defect detection method
WO2019176786A1 (en) Laser light centering method and laser processing device
JP3984683B2 (en) Laser processing apparatus and method for measuring position of workpiece
JP6911882B2 (en) Laser marker
JP5615660B2 (en) Machine tool with observation point focusing support function
US20230241710A1 (en) Method for Analyzing a Workpiece Surface for a Laser Machining Process and Analysis Device for Analyzing a Workpiece Surface
JP4091040B2 (en) Board inspection equipment
JP6476957B2 (en) Shape measuring apparatus and method of measuring structure
JP3056823B2 (en) Defect inspection equipment
JP2022151993A (en) Three-dimensional measurement device, laser machining device, and three-dimensional measurement method
JP6915640B2 (en) Laser marker
JP4206392B2 (en) Pattern defect inspection apparatus and pattern defect inspection method using the same
JP2008261829A (en) Surface measuring device
JP6994743B1 (en) Electronic component measuring device and electronic component measuring method using electronic component measuring device
JP2020056620A (en) Shape measuring device, structure manufacturing system, shape measuring method, fixing unit, and structure manufacturing method
JP2023020190A (en) Laser processing device and determination method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21910760

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21910760

Country of ref document: EP

Kind code of ref document: A1