CN107803585B - Laser processing machine and laser processing method - Google Patents


Info

Publication number
CN107803585B
Authority
CN
China
Prior art keywords
unit
laser
optical system
image
workpiece
Prior art date
Legal status
Active
Application number
CN201710761098.6A
Other languages
Chinese (zh)
Other versions
CN107803585A (en)
Inventor
中西启一
Current Assignee
Murata Machinery Ltd
Original Assignee
Murata Machinery Ltd
Priority date
Filing date
Publication date
Application filed by Murata Machinery Ltd filed Critical Murata Machinery Ltd
Publication of CN107803585A publication Critical patent/CN107803585A/en
Application granted granted Critical
Publication of CN107803585B publication Critical patent/CN107803585B/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K26/00 Working by laser beam, e.g. welding, cutting or boring
    • B23K26/02 Positioning or observing the workpiece, e.g. with respect to the point of impact; Aligning, aiming or focusing the laser beam
    • B23K26/03 Observing, e.g. monitoring, the workpiece

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Plasma & Fusion (AREA)
  • Mechanical Engineering (AREA)
  • Laser Beam Processing (AREA)

Abstract

The invention provides a laser processing machine and a laser processing method. The laser processing machine includes: a laser oscillator that generates a laser beam for processing; a processing head that irradiates a workpiece with the laser beam from the laser oscillator; an imaging unit that images, through the processing head, the workpiece irradiated with the laser beam; and an image processing unit that extracts a feature amount from the image captured by the imaging unit, with the irradiation position of the laser beam as a reference, in order to acquire information indicating the processing state of the workpiece.

Description

Laser processing machine and laser processing method
Technical Field
The present invention relates to a laser processing machine and a laser processing method.
Background
Laser processing machines are used for cutting, welding, and similar processing of workpieces. For laser processing machines, techniques have been proposed for grasping the processing state from an image of the workpiece being processed, with a view to improving processing quality (for example, see patent documents 1 and 2 below).
The laser beam machine of patent document 1 (Japanese Patent Application Laid-open No. 11-129083) includes a coaxial-observation microcamera at the machining head. In patent document 1, an image captured by the microcamera is binarized according to a threshold value, and the shape of a high-luminance portion (the molten pool) is acquired. An image processing unit then performs pattern matching on the acquired shape to determine abnormal combustion. In the pattern matching, the coordinates of each line of the high-luminance portion are sampled discretely, and a weighted sum of those coordinates is computed using weighting coefficients. The sum is converted by a sigmoid function into a single value, and if that value is equal to or less than a threshold value, abnormal combustion is determined to have occurred. The weighting coefficients are optimized by machine learning with a neural network.
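The weighted-sum determination of patent document 1 can be sketched as follows. The coordinates, weights, and bias below are invented for illustration; in the patent the weights come from neural-network training.

```python
import math

def abnormal_combustion_score(line_coords, weights, bias=0.0):
    """Weighted sum of per-line coordinates of the high-luminance region,
    squashed into the interval (0, 1) by a sigmoid function."""
    s = sum(w * c for w, c in zip(weights, line_coords)) + bias
    return 1.0 / (1.0 + math.exp(-s))

# Illustrative values only; real coordinates come from the binarized image.
score = abnormal_combustion_score([3.0, 4.0, 2.5], [0.2, -0.1, 0.3])
abnormal = score <= 0.5   # at or below the threshold -> abnormal combustion
```

The sigmoid collapses an arbitrary weighted sum into a bounded score, which makes a single fixed threshold usable regardless of how many lines were sampled.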
The laser beam machine of patent document 2 (Japanese Patent Application Laid-open No. 5-177374) has a camera coaxial with the laser machining optical axis. In patent document 2, the captured image is checked on a monitor, and laser processing is controlled by feeding back the processing result observed in the image. On the monitor screen, the region to be measured is enclosed between cursor lines by operating a knob that moves the cursor lines, and the distance between the cursor lines is measured. This manual dimension measurement is also automated by binarizing the captured image to separate the image of the processed portion from the other portions.
When pattern matching is performed as described above, the processing load is large, so processing may take a long time and the cost of the processing apparatus may increase. When dimensions are measured by setting cursor-line positions on the monitor screen, the measurement reference is arbitrary, and the result varies with, for example, the choice of the region to be measured and the processing direction, so there is a concern that the processing state cannot be grasped with high accuracy.
Disclosure of Invention
The present invention has been made in view of the above circumstances, and an object thereof is to provide a laser processing machine and a laser processing method that can grasp the processing state with high accuracy while reducing the processing load.
The laser processing machine of the present invention comprises: a laser oscillator for generating a laser beam for machining; a processing head configured to irradiate a workpiece with laser light from a laser oscillator; an imaging unit that images the workpiece irradiated with the laser beam via the processing head; and an image processing unit that extracts a feature amount from the image captured by the imaging unit with reference to the irradiation position of the laser beam in order to acquire information indicating the processing state of the workpiece.
The laser processing method of the present invention includes: generating a laser beam for machining; irradiating a workpiece with laser light from a processing head; shooting the workpiece irradiated with the laser through the processing head; and extracting a feature amount from the captured image with reference to the irradiation position of the laser light in order to acquire information indicating the processing state of the workpiece.
In addition, the following configuration is possible: the machining head moves relative to the workpiece in a direction parallel to the workpiece, and the machine includes an image rotating unit that rotates the image around the laser irradiation position so that the relative movement direction, in which the machining head and the workpiece move relative to each other in the image, becomes a reference direction.

In addition, the following configuration is possible: the image rotating unit acquires the relative movement direction from a control unit and determines the angle by which the image is rotated.

In addition, the following configuration is possible: the machine includes a binarizing unit that binarizes the image and a region specifying unit that specifies, from the binarized image, a region including the irradiation position of the laser beam, and the image processing unit extracts the feature amount based on the region specified by the region specifying unit.

In addition, the following configuration is possible: the machine includes an irradiation optical system that is provided inside the processing head and guides the laser light so as to irradiate the workpiece through an emission port of the processing head, and an imaging optical system that guides light entering the processing head from the workpiece through the emission port to the imaging unit, the imaging optical system and the irradiation optical system sharing at least a part of their optical members.

In addition, the following configuration is possible: the machine includes an optical system driving unit that drives an optical member common to the imaging optical system and the irradiation optical system in the optical-path direction, and a focus adjustment unit that adjusts the focus of the imaging unit by changing the relative distance between the imaging optical system and the imaging unit.
In addition, the following configuration is possible: when the workpiece is cut by the laser light generated by the laser oscillator, a piercing process that forms a through hole with the laser light while the processing head is stopped is performed first, and then a cutting process that cuts the workpiece with the laser light while moving the processing head is performed with the through hole formed by the piercing process as a starting point.

In addition, the following configuration is possible: the machine further includes a machining state determination unit that determines whether the machining state of the workpiece is appropriate by comparing the feature amount extracted by the image processing unit with a threshold value, and when the machining state determination unit determines that the machining state is inappropriate, the laser oscillator stops generating the laser light and machining of the workpiece is stopped.
Effects of the invention
According to the present invention, since the feature amount is extracted with the irradiation position of the laser beam as a reference, the feature amount can be extracted objectively, the processing state can be grasped with high accuracy, and the processing load can be reduced.
In addition, since a laser beam machine including the image rotating unit rotates the image around the irradiation position of the laser beam, the load required for selecting the rotation center can be reduced. Because the rotation center is fixed, variation of the feature amount due to the choice of rotation center is prevented, and the machining state can be grasped with high accuracy. A laser processing machine in which the image rotating unit acquires the relative movement direction from the control unit can obtain that direction with higher accuracy and lower processing load than if it were estimated from the image. Since the angle by which the image is rotated is determined using the relative movement direction from the control unit, the rotation angle is obtained as a high-precision value at low processing cost. A laser processing machine including the binarizing unit extracts the feature amount from the binarized image, which reduces the processing load. Although bright portions may be scattered across the binarized image, a laser processing machine including the region specifying unit specifies the region including the irradiation position of the laser light, and can therefore select the region from which the feature amount is extracted accurately and at low load. A laser beam machine in which the imaging optical system and the irradiation optical system share at least a part of their optical members reduces the number of components, saving space and cost.
Further, when the common optical member is driven, for example for focus adjustment of the processing laser light, the focus of the imaging unit shifts; since the focus adjustment unit compensates for this, the focus can be kept optimal for both the processing laser light and the imaging unit. Since the irradiation position of the laser light is acquired from the luminance distribution of an image captured while the machining head is stopped during the piercing process, a more accurate irradiation position can be obtained than if it were acquired while the machining head is moving. Furthermore, by comparing the feature amount with a threshold value, it can be determined whether the machining state is appropriate, and machining can be stopped when it is not, suppressing wasted machining.
Drawings
Fig. 1 is a diagram illustrating a laser beam machine according to embodiment 1.
Fig. 2 is an explanatory view of a machining state.
Fig. 3 is a diagram illustrating an image processing unit according to embodiment 1.
Fig. 4 is a diagram illustrating the processing of the image rotating unit.
Fig. 5 is a diagram showing the processing of the binarizing unit, the region specifying unit, and the feature amount extracting unit.
Fig. 6 is a flowchart illustrating a laser processing method according to an embodiment.
Fig. 7 is a diagram illustrating a laser processing machine according to embodiment 2.
Detailed Description
Hereinafter, embodiments will be described with reference to the drawings. In the following drawings, directions are described using an XYZ coordinate system. In this XYZ coordinate system, the vertical direction is the Z direction, and the horizontal directions are the X direction and the Y direction. In each direction (for example, the X direction), the direction of the arrow is referred to as the + side (for example, the +X side), and the opposite direction as the - side (for example, the -X side).
[ embodiment 1 ]
Fig. 1 is a diagram illustrating a laser processing machine 1 according to the present embodiment. The laser processing machine 1 includes a processing head 2, a processing head driving unit 3, a laser oscillator 4, an imaging unit 5, an image processing unit 6 (image processing device), a control unit 7, a processing state determination unit 8, a storage unit 9, and a focus adjustment unit 25. The laser processing machine 1 performs cutting processing on a workpiece W by Numerical Control (NC), for example. The control unit 7 comprehensively controls the respective units of the laser processing machine 1 in accordance with, for example, a numerical control program.
The machining head 2 includes a nozzle 11, and the laser beam for machining (hereinafter, the machining laser beam L1) is irradiated onto the workpiece W through an ejection port (a through hole penetrating the nozzle 11) formed in the nozzle 11. The machining head 2 is movable relative to the workpiece W in each of the X, Y, and Z directions. The machining head driving unit 3 includes a moving unit 12 and an optical system driving unit 13. Under control of the control unit 7, the moving unit 12 moves the machining head 2 in each of the X, Y, and Z directions, and the optical system driving unit 13 adjusts the focus of the light irradiated from the nozzle 11. The laser processing machine 1 performs cutting by irradiating the workpiece W with the machining laser light L1 from the nozzle 11 of the machining head 2 while moving the machining head 2 relative to the workpiece W.
The laser oscillator 4 generates, for example, an infrared laser beam as the processing laser beam L1. An irradiation optical system 15 is provided inside the machining head 2, and the irradiation optical system 15 guides the machining laser light L1 generated by the laser oscillator 4 to the workpiece W, thereby irradiating the workpiece W with the machining laser light L1 through an exit port of the nozzle 11. The irradiation optical system 15 includes an optical fiber 16, a collimator 17, a beam splitter 18, and a condenser lens 19. One end (end on the light incident side) of the optical fiber 16 is connected to the laser oscillator 4, and the other end (end on the light emitting side) thereof is connected to the machining head 2. The processing laser light L1 from the laser oscillator 4 is introduced into the processing head 2 through the optical fiber 16. The machining head 2 irradiates the workpiece W with the machining laser light L1 from the laser oscillator 4.
The collimator 17 converts the processing laser light L1 from the laser oscillator 4 into parallel or nearly parallel light. The beam splitter 18 is disposed at a position where the processing laser light L1 passing through the collimator 17 enters. The beam splitter 18 reflects at least a part of the processing laser light L1 and transmits at least a part of the light emitted from the workpiece W (hereinafter referred to as emitted light). The beam splitter 18 is, for example, a dichroic mirror or a half mirror. It is inclined at an angle of about 45° with respect to the optical axis 17a of the collimator 17, tilting toward the +X side as it goes toward the +Z side.
The condenser lens 19 is disposed at a position where the processing laser light L1 from the beam splitter 18 enters. The processing laser light L1 passing through the collimator 17 is reflected by the beam splitter 18, its optical path is bent by about 90° from the X direction to the Z direction (-Z side), and it enters the condenser lens 19. The condenser lens 19 condenses the processing laser light L1 from the collimator 17. The optical system driving unit 13 of the machining head driving unit 3 moves the condenser lens 19 along its optical axis 19a, thereby adjusting the workpiece-side focus of the irradiation optical system 15.
The imaging unit 5 images, through the processing head 2, the workpiece W irradiated with the processing laser light L1. The imaging unit 5 includes an imaging optical system 21 and an imaging element 22. Through the imaging optical system 21, the imaging element 22 detects the light (emitted light) traveling from the workpiece W side toward the machining head 2 as a result of irradiation with the machining laser light L1. The imaging optical system 21 guides light that enters the processing head 2 from the workpiece W through the ejection port in the nozzle 11 to the imaging element 22. The imaging optical system 21 includes the condenser lens 19, the beam splitter 18, a mirror 23, and an imaging lens 24. The imaging optical system 21 and the irradiation optical system 15 share at least a part of their optical members. In fig. 1, they share the condenser lens 19 and the beam splitter 18, so the workpiece W can be observed coaxially with the irradiation optical system 15.
The focus adjustment unit 25 is controlled by the control unit 7 and adjusts the focus of the imaging unit 5. Specifically, the focus of the imaging element 22 (the distance, in the optical-path direction, between the imaging point of the imaging lens 24 and the imaging element) is adjusted by changing the relative distance between the imaging optical system 21 and the imaging element 22 along the optical-path direction of the imaging optical system. To change this relative distance, the focus adjustment unit 25 moves at least one of the imaging optical system 21 and the imaging element 22: it may move only an optical member of the imaging optical system 21 (the imaging lens 24), only the imaging element 22, or both.
Even if the optical system driving unit 13 moves the condenser lens 19 and thereby shifts the imaging point of the imaging lens 24, the focus adjustment unit 25 readjusts the focus of the imaging unit 5, so the machining state can still be grasped accurately. Specifically, when the optical system driving unit 13 moves the condenser lens 19 to adjust the irradiation point of the processing laser light L1, the imaging point of the imaging lens 24 changes because the imaging optical system 21 and the irradiation optical system 15 share the condenser lens 19. The focus adjustment unit 25 then adjusts the focus of the imaging element 22 in accordance with the new position of the condenser lens 19, maintaining a sharp captured image. Therefore, the workpiece can be laser-processed with high accuracy, and its processing state can be grasped with high accuracy.
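One simple way to realize this coordinated adjustment is a compensation function that maps the condenser-lens displacement to an imaging-element offset. The linear model and coefficient below are purely illustrative assumptions, not values from the patent; a real machine would use a relation calibrated for its optics.

```python
def imaging_focus_offset(condenser_shift_mm, k=0.8):
    """Offset (mm) to apply to the imaging element 22 when the shared
    condenser lens 19 is shifted by condenser_shift_mm. The linear
    factor k stands in for a value calibrated on the actual optics."""
    return k * condenser_shift_mm

# When the optical system driving unit moves the condenser lens by 0.5 mm,
# the focus adjustment unit would move the imaging element by this amount.
offset = imaging_focus_offset(0.5)
```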
The emitted light from the workpiece W is incident on the beam splitter 18 through the condenser lens 19. The emitted light includes, for example, light radiated by the molten metal under irradiation with the processing laser light L1, light generated by plasma, and the portion of the processing laser light L1 reflected by the workpiece W. At least a part of the emitted light passes through the beam splitter 18 and is incident on the mirror 23, which reflects it into the imaging lens 24. The imaging lens 24 condenses the light from the mirror 23 onto the imaging element 22; together, the imaging lens 24 and the condenser lens 19 project an image of the workpiece W onto the imaging element 22.
The imaging element 22 is, for example, a CCD or CMOS image sensor, and captures the image formed by the imaging optical system 21. The imaging element 22 has a plurality of two-dimensionally arranged pixels, each provided with a light receiving element such as a photodiode. The imaging element 22 sequentially reads out the electric charges (signals) generated in the pixels when light strikes the light receiving elements, amplifies the signals, A/D-converts them, and arranges them into an image format, thereby generating digital data (hereinafter, captured image data) of the captured image. The imaging element 22 outputs the generated captured image data to the image processing unit 6.
The image processing unit 6 and the imaging element 22 are connected, by wire or wirelessly, so as to be able to communicate. The image processing unit 6 also serves as a control unit for the imaging element 22. The image processing unit 6 is likewise communicably connected to the control unit 7, receives instructions from the control unit 7 to execute imaging, and causes the imaging element 22 to capture images accordingly.
To acquire information indicating the processing state of the workpiece W, the image processing unit 6 extracts a feature amount from the image captured by the imaging unit 5, with the irradiation position of the laser beam as a reference. The image processing unit 6 acquires the captured image data from the imaging element 22 and generates the machining state information (feature amount) by image processing on that data. The feature amount is extracted, with reference to the irradiation position of the processing laser light L1, from a part of the captured image or from a part of an image obtained by processing the captured image. Examples of such processed images include the rotated image Im2, the binarized image Im3, and the specified image Im4 shown in figs. 4 and 5 below. The individual processes performed by the image processing unit 6 are described in more detail later with reference to figs. 3 to 5. The image processing unit 6 supplies the extracted feature amount to the control unit 7.
The machining state determination unit 8 is provided in the control unit 7, for example, and acquires the feature amount supplied from the image processing unit 6 to the control unit 7. The machining state determination unit 8 compares the feature amount extracted by the image processing unit 6 with a threshold value to determine the machining state.
Fig. 2 is an explanatory view of machining states: fig. 2 (A) shows an appropriate machining state, and fig. 2 (B) an inappropriate one. The portion of the workpiece W on which the machining laser light L1 is incident from the machining head 2 melts, forming a cut groove. The machining head 2 blows assist gas around this portion, and the assist gas expels the molten metal M downward out of the cut groove.
As shown in fig. 2 (A), when the machining state is appropriate, the molten metal M flows rearward with respect to the movement direction of the machining head 2 and downward out of the workpiece W. As shown in fig. 2 (B), when the machining state is inappropriate, the molten metal M may spread more widely as it flows downward than in fig. 2 (A). The molten metal on the upper surface of the workpiece W (not shown) also varies with the machining state; in an inappropriate machining state, it may overflow from the cut groove and spread over the workpiece W. Differences in processing state such as those in figs. 2 (A) and 2 (B) are reflected in the feature amount extracted by the image processing unit 6, and the machining state determination unit 8 determines, for example, whether the machining state is appropriate by comparing the feature amount with a threshold value.
The machining state determination unit 8 can also classify the machining state into three or more levels according to the value of the feature amount, evaluating how appropriate or inappropriate the state is. For example, it may evaluate the machining state in three stages, "optimum", "appropriate", and "inappropriate", or express the machining state numerically as 1, 2, 3, and so on.
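Such a multi-stage evaluation reduces to threshold comparison. The sketch below uses invented threshold values and assumes that a larger feature amount (e.g. a wider melt spread) means a worse state; the patent does not fix these specifics.

```python
def classify_machining_state(feature, t_optimum=10.0, t_appropriate=20.0):
    """Map a feature amount to a numeric grade: 1 = optimum,
    2 = appropriate, 3 = inappropriate. Thresholds are illustrative."""
    if feature <= t_optimum:
        return 1
    if feature <= t_appropriate:
        return 2
    return 3
```

The two-stage appropriate/inappropriate decision made by the machining state determination unit 8 is the special case with a single threshold.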
Fig. 3 is a diagram illustrating the image processing unit (image processing apparatus) according to the present embodiment. The image processing unit 6 includes an image rotating unit 31, a binarizing unit 32, a region specifying unit 33, and a feature amount extracting unit 34. In the following description, figs. 4 and 5 are referred to as appropriate. Fig. 4 illustrates the processing of the image rotating unit, and fig. 5 shows the processing of the binarizing unit, the region specifying unit, and the feature amount extracting unit.
As shown in fig. 3, the image processing unit 6 performs image processing using information on the position (irradiation position) on the workpiece W irradiated with the processing laser light L1. The irradiation position is the position of the center of the processing laser light L1. On the captured image, the irradiation position is, for example, the position of the center of gravity of the light intensity distribution of the processing laser light L1 in the image. This position can be obtained, for example, as follows: with the workpiece W and the processing head 2 held stationary relative to each other, the workpiece W is melted by the processing laser light L1 from the processing head 2, an image is captured by the imaging element 22 in this state, and the luminance distribution of the captured image is evaluated. The image processing unit can thus obtain the irradiation position from the luminance distribution of the image captured when the through hole is formed at the start of cutting. When the laser processing machine 1 cuts a workpiece, a through hole is first formed in the workpiece by the processing laser light L1 while the processing head 2 is stopped (piercing process), after which the workpiece is cut by the processing laser light L1 while the processing head 2 is moved, with the through hole as the starting point (cutting process).
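The center of gravity of the luminance distribution can be computed directly from the pixel intensities. A minimal sketch on a 2-D list of gray values (no camera specifics assumed):

```python
def irradiation_position(image):
    """Centre of gravity (x, y) of the luminance distribution of an
    image captured while the machining head is stopped (piercing).
    image is a 2-D list of non-negative pixel intensities."""
    total = sx = sy = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            total += v
            sx += x * v
            sy += y * v
    return (sx / total, sy / total)

# A bright one-pixel spot at (2, 1) on an otherwise dark frame:
img = [[0, 0, 0, 0],
       [0, 0, 9, 0],
       [0, 0, 0, 0]]
lp = irradiation_position(img)   # -> (2.0, 1.0)
```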
The irradiation position in the captured image may also be taken as, for example, the position where the optical axis 24a of the imaging lens 24 (imaging optical system 21) intersects the light receiving surface of the imaging element 22 in fig. 1. Since the processing laser light L1 is set to pass through the center of the ejection port of the processing head 2, the irradiation position on the captured image may also be the position corresponding to the center of the ejection port. The irradiation position information is stored, for example, in the storage unit 9 of fig. 1; the control unit 7 reads it from the storage unit 9 and supplies it to the image processing unit 6. Alternatively, the information may be stored in a storage unit (not shown) of the image processing unit 6, in which case the control unit 7 need not supply it.
The image rotating unit 31 rotates the image around the laser irradiation position so that the relative movement direction, in which the processing head 2 and the workpiece W move relative to each other in the image, becomes the reference direction. The relative movement direction is parallel to the cut groove formed by laser machining (the machining direction). The control unit 7, which controls the relative movement between the processing head 2 and the workpiece W, holds information on the relative movement direction; the image rotating unit 31 acquires this information from the control unit 7 and determines the angle by which to rotate the image.
In fig. 4, the reference direction is a predetermined direction, for example a direction along which pixels are arranged in the captured image Im1 (the horizontal or vertical scanning direction). The relative movement direction and the irradiation position LP in the captured image Im1 are known from the information supplied by the control unit 7 (see fig. 3). The image rotating unit 31 computes the inner product of the unit vector along the reference direction and the unit vector along the relative movement direction; since this inner product is the cosine of the angle between them, the angle (rotation angle) between the reference direction and the relative movement direction is obtained from it. The image rotating unit 31 then rotates the captured image Im1 by this rotation angle about the irradiation position LP, generating the rotated image Im2 shown in fig. 4 (B).
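A sketch of the angle computation: the inner product of the two unit vectors gives the cosine of the rotation angle. Combining it with the 2-D cross product to recover the sign of the angle is an assumption beyond the text, but is needed to know the direction of rotation.

```python
import math

def rotation_angle(move_dir, ref_dir=(1.0, 0.0)):
    """Signed angle (radians) between the relative movement direction
    and the reference direction (here the horizontal pixel direction).
    Both arguments are unit vectors in image coordinates."""
    mx, my = move_dir
    rx, ry = ref_dir
    dot = mx * rx + my * ry      # cos(theta) for unit vectors
    cross = mx * ry - my * rx    # sin(theta); fixes the rotation sense
    return math.atan2(cross, dot)

# Movement straight "up" in the image requires a -90 degree rotation
# about the irradiation position LP to align with the reference direction.
angle = rotation_angle((0.0, 1.0))
```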
As shown in fig. 3, the image rotating unit 31 supplies the data of the rotated image to the binarizing unit 32. The binarizing unit 32 binarizes the image and supplies the resulting data (hereinafter, the binarized image) to the region specifying unit 33. The binarizing unit 32 compares each pixel value of the rotated image with a threshold value, marking pixels at or above the threshold as "white" and pixels below it as "black". In such a binarized image, "white" regions may appear as discrete islands.
For example, the binarized image Im3 shown in fig. 5(A) includes island-shaped regions AR1 to AR4. In the binarized image Im3, the regions AR1 to AR4 are "white", and the remaining area is "black". The region determination unit 33 in fig. 3 determines (selects, specifies, extracts) from the binarized image Im3 the region AR1 that includes the irradiation position LP of the processing laser light (hereinafter, determination region). The region determination unit 33 supplies information identifying the determination region AR1 to the feature amount extraction unit 34. This information is, for example, the data of a determination image obtained by replacing the regions AR2 to AR4, other than the determination region AR1, with "black".
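One way the region determination unit 33 could isolate the island containing the irradiation position LP is a connected-component search (flood fill). The sketch below uses 4-connectivity; the patent specifies neither the connectivity nor the algorithm, so both are assumptions:

```python
import numpy as np
from collections import deque

def determine_region(binary, irradiation_xy):
    """Return an image in which every white island except the one
    containing the irradiation position is replaced with black,
    mirroring the "determination image" of the embodiment.

    binary: 2-D uint8 array of 0/255 values; irradiation_xy: (x, y).
    """
    x0, y0 = irradiation_xy
    out = np.zeros_like(binary)
    if binary[y0, x0] == 0:
        return out  # LP does not fall inside any white region
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    queue = deque([(y0, x0)])
    seen[y0, x0] = True
    while queue:
        y, x = queue.popleft()
        out[y, x] = 255
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # 4-neighbours
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w
                    and not seen[ny, nx] and binary[ny, nx] == 255):
                seen[ny, nx] = True
                queue.append((ny, nx))
    return out
```

A production implementation would more likely use a library connected-component labelling routine rather than an explicit queue.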
The feature amount extraction unit 34 of fig. 3 extracts feature amounts based on the region determined by the region determination unit 33. For example, the feature amount extraction unit 34 extracts feature amounts from the determination region AR1; that is, the determination region AR1 is the region from which the feature amounts are extracted. The feature amount extraction unit 34 extracts, for example, the dimensions of the determination region AR1 shown in fig. 5(B) as feature amounts. In fig. 5, the "width direction" is the direction perpendicular to the "relative movement direction" and corresponds to the width direction of the kerf. The feature amount extraction unit 34 sets, in the determination image Im4, a coordinate system along the relative movement direction and the width direction with the irradiation position LP of the processing laser light as a reference (for example, the origin), and extracts various dimensions of the determination region AR1.
For example, the feature amount extraction unit 34 extracts the dimension X1 (length) of the determination region AR1 in the relative movement direction and the dimension Y1 (width) of the determination region AR1 in the width direction. The feature amount extraction unit 34 also extracts the dimension X2 (length), in the relative movement direction, of the portion ahead of the irradiation position LP (the head HD), and the dimension Y2 of the head HD in the width direction. Further, the feature amount extraction unit 34 extracts the dimension obtained by subtracting the dimension X2 of the head HD from the dimension X1 of the determination region AR1 in the relative movement direction.
The feature amount extraction unit 34 performs the four basic arithmetic operations on the dimensions of the respective portions of the determination region AR1 described above, and uses the calculation results as feature amounts. These operations may involve only one of sum, difference, product, and quotient, or two or more of them. For example, the feature amount extraction unit 34 calculates, for each portion of the determination region AR1, at least one of the sum, difference, product, and quotient of the dimension in the relative movement direction and the dimension in the width direction as a feature amount. The feature amount extraction unit 34 may also calculate a feature amount as a function having the dimensions of the respective portions of the determination region AR1 as variables. The function may be linear or non-linear.
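The dimension-style feature amounts of fig. 5(B), together with combinations by the four basic arithmetic operations, can be sketched as follows. The image is assumed already rotated so that the relative movement direction is the +x axis, and the head HD is taken as the part of the region ahead of (larger x than) the irradiation position — an orientation the text does not fix; the particular derived quantities are an illustrative selection:

```python
import numpy as np

def extract_features(region, irradiation_xy):
    """Dimensions of the determination region plus a few combinations
    by the four basic arithmetic operations (illustrative selection)."""
    ys, xs = np.nonzero(region)
    x0, _ = irradiation_xy
    X1 = int(xs.max() - xs.min() + 1)   # length along the movement direction
    Y1 = int(ys.max() - ys.min() + 1)   # width of the whole region
    head = xs > x0                      # pixels ahead of the irradiation position
    X2 = int(xs[head].max() - x0) if head.any() else 0   # head length
    Y2 = int(ys[head].max() - ys[head].min() + 1) if head.any() else 0
    return {
        "X1": X1, "Y1": Y1, "X2": X2, "Y2": Y2,
        "X1_minus_X2": X1 - X2,         # the difference named in the text
        "aspect": X1 / Y1,              # one possible quotient feature
    }
```

Any of the returned values (or further sums, products, and quotients of them) could serve as the feature amounts that are later compared with thresholds.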
Further, instead of extracting the feature amounts from the binarized image (from the region determined by the region determination unit 33), the feature amount extraction unit 34 may extract them from the image before binarization. For example, the feature amount extraction unit 34 may determine, based on the region determined by the region determination unit 33 from the binarized image (see fig. 4B), the range of the image from which the feature amounts should be extracted, and then extract the feature amounts from that range in the image before binarization.
As shown in fig. 3, the feature amount extraction unit 34 supplies the extracted feature amounts to the control unit 7, and the machining state determination unit 8 determines the machining state by comparing the feature amounts with threshold values. The items of feature amount described above are selected appropriately so that, for example, the appropriate and inappropriate machining states shown in fig. 2 can be distinguished; one item or two or more items may be used. The laser processing machine 1 need not include the machining state determination unit 8. For example, the feature amount extraction unit 34 may display the feature amounts on a display unit or the like, and an operator may determine the machining state by referring to the displayed values. Furthermore, instead of determining whether the machining state is good or bad, the laser processing machine 1 may adjust the machining conditions by feedback control or the like using the feature amounts.
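The comparison the machining state determination unit 8 performs reduces to checking selected feature amounts against limits. A minimal sketch — the per-feature limit table and the all-must-pass rule are assumptions, since the text only says that a feature amount is compared with a threshold:

```python
def judge_machining_state(features, limits):
    """Return True ("appropriate") when every selected feature amount
    lies within its (lower, upper) limits, else False ("inappropriate").

    features: dict of feature name -> value
    limits:   dict of feature name -> (lower, upper), the hypothetical
              threshold table (not specified in the patent)
    """
    return all(lo <= features[name] <= hi
               for name, (lo, hi) in limits.items())
```

For example, `judge_machining_state({"X1": 10, "aspect": 3.3}, {"aspect": (2.0, 5.0)})` checks only the aspect-ratio feature and returns `True`.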
Next, a laser processing method according to the embodiment will be described based on the configuration of the laser processing machine 1. Fig. 6 is a flowchart illustrating the laser processing method according to the embodiment. The following description focuses on the method of determining the machining state within the laser machining method. Refer to fig. 1 as appropriate for the parts of the laser processing machine 1, and to fig. 3 for the parts of the image processing unit 6.
The imaging unit 5 in fig. 1 images the workpiece W via the machining head 2 while the machining laser light L1 is irradiated from the machining head 2. In step S1, the image processing unit 6 performs image cropping and noise removal as preprocessing of the image captured by the imaging unit 5. For example, the image processing unit 6 cuts out the central portion of the captured image, removes noise from the cut-out image with a median filter or the like, and uses the processed image for the subsequent image processing.
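The preprocessing of step S1 — a centre crop followed by median-filter noise removal — can be sketched in pure NumPy. The crop ratio is an assumption, and a production system would more likely use a library median blur:

```python
import numpy as np

def preprocess(image, crop_ratio=0.5):
    """Cut out the centre of the captured image, then apply a 3x3
    median filter for noise removal, as in step S1."""
    h, w = image.shape
    ch, cw = max(1, int(h * crop_ratio)), max(1, int(w * crop_ratio))
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    c = image[y0:y0 + ch, x0:x0 + cw]
    p = np.pad(c, 1, mode="edge")
    # Stack the nine shifted views of the 3x3 neighbourhood and take
    # the per-pixel median across them.
    stack = np.stack([p[dy:dy + ch, dx:dx + cw]
                      for dy in range(3) for dx in range(3)])
    return np.median(stack, axis=0)
```

A 3x3 median removes isolated hot pixels (e.g. sensor noise or spatter flashes) while preserving the edges of the bright molten region.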
In step S2, the image processing unit 6 acquires the irradiation position and the relative movement direction of the processing laser light L1. For example, the image processing unit 6 acquires information on the irradiation position and the relative movement direction from the control unit 7. In step S3, the image rotating unit 31 of the image processing unit 6 rotates the image around the irradiation position using the information acquired in step S2 (see fig. 4). In step S4, the binarizing unit 32 of the image processing unit 6 binarizes the image processed by the image rotating unit 31 (see fig. 5(A)). In step S5, the region determination unit 33 of the image processing unit 6 determines the determination region AR1 including the irradiation position LP from the binarized image Im3 (see fig. 5(A)), using the irradiation position information acquired in step S2.
In step S6, the feature amount extraction unit 34 extracts feature amounts from the determination region AR1 determined by the region determination unit 33 (see fig. 5(B)). For example, the feature amount extraction unit 34 extracts the dimensions of the respective portions of the determination region AR1 and performs the four basic arithmetic operations on the extracted dimensions. The feature amount extraction unit 34 uses the dimensions of the respective portions of the determination region AR1 and at least part of the results of these operations as feature amounts. In step S7, the machining state determination unit 8 of the control unit 7 compares the feature amounts extracted by the feature amount extraction unit 34 in step S6 with threshold values, and determines the machining state. The control unit 7, for example, adjusts the processing conditions using the determination result of step S7 and executes laser processing under the adjusted conditions. Alternatively, the control unit 7 may not perform feedback control (adjustment of the machining conditions) using the determination result of step S7, and the processing from step S1 to step S7 may be performed as an inspection. In that case, when the machining is determined to be inappropriate in step S7, the laser oscillator 4 stops generating the machining laser light L1, and the laser processing machine 1 stops processing.
The order of the processing from step S3 to step S5 can be changed as appropriate. For example, the binarizing unit 32 may binarize the image before rotation, and the image rotating unit 31 may rotate the binarized image. The region determination unit 33 may determine the region from the binarized image before rotation, and the image rotating unit 31 may rotate the determined region. The image processing unit 6 need not include the image rotating unit 31, in which case the feature amount extraction unit 34 extracts the feature amounts from the unrotated image. The image processing unit 6 also need not include the binarizing unit 32 or the region determination unit 33. In the present embodiment, the machining state determination unit 8 is provided in the control unit 7, but it may be provided elsewhere (for example, in the image processing unit 6).
[ 2 nd embodiment ]
Embodiment 2 will be described with reference to fig. 7. In the present embodiment, components identical to those in the above embodiment are denoted by the same reference numerals, and their description is omitted or simplified. In the present embodiment, the laser processing machine 1 includes an illumination light source 41 and an illumination optical system 42. The illumination light source 41 emits light (for example, visible light) of a wavelength different from that of the processing laser light L1 as illumination light L2. The illumination optical system 42 is provided inside the processing head 2. The illumination optical system 42 guides the illumination light L2 generated by the illumination light source 41 toward the workpiece W, irradiating the workpiece W through the exit of the nozzle 11.
The illumination optical system 42 includes a collimator 43, a half mirror 44, the beam splitter 18, and the condenser lens 19. Here, the illumination optical system 42 shares the beam splitter 18 and the condenser lens 19 with the irradiation optical system 15, and performs reflected illumination via the condenser lens 19. The optical axis on the light exit side of the illumination optical system 42 is coaxial with that of the irradiation optical system 15, and the illumination light L2 is irradiated onto the workpiece W through the same optical path as the processing laser light L1 (a part of the optical path is common to the processing laser light L1).
The collimator 43 is disposed at a position where the illumination light L2 from the illumination light source 41 is incident. The collimator 43 converts the illumination light L2 from the illumination light source 41 into parallel or nearly parallel light. To match the focal point of the illumination optical system 42 with the target position on the workpiece, the collimator 43 is arranged, for example, such that its focal point coincides with the position of the illumination light source 41. The half mirror 44 is disposed at a position where the illumination light L2 that has passed through the collimator 43 is incident. The half mirror 44 is a reflective-transmissive member that reflects part of the illumination light L2 and transmits part of it.
Part of the illumination light L2 that has passed through the collimator 43 is reflected by the half mirror 44, its optical path being bent by about 90° from the X direction to the Z direction (-Z side), and is incident on the beam splitter 18. The condenser lens 19 is disposed at a position where the illumination light L2 from the beam splitter 18 is incident, and condenses the illumination light L2 from the beam splitter 18 onto the workpiece W. The emitted light radiated from the workpiece W under the irradiation of the illumination light L2 is imaged onto the imaging element 22 through the same optical path as the emitted light caused by the processing laser light L1 illustrated in fig. 1. The imaging unit 5 images the workpiece W in a state where the illumination light L2 is irradiated onto it. The image processing unit 6 detects the edges of the kerf by performing image processing such as edge detection on the captured image, and determines the direction along the kerf as the relative movement direction. In the present embodiment, instead of receiving the information on the relative movement direction from the control unit 7, the image rotating unit 31 (see fig. 3) of the image processing unit 6 rotates the image using the relative movement direction determined by the above processing.
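Embodiment 2's determination of the relative movement direction from the imaged kerf could, for instance, use the structure tensor of the image gradients: the dominant gradient direction is the edge normal, and the kerf runs perpendicular to it. This gradient-based variant is a stand-in for the unspecified edge-detection processing, offered under that assumption:

```python
import numpy as np

def kerf_direction(gray):
    """Estimate the unit direction along the cut (kerf) in the image.

    The strongest-gradient direction (principal eigenvector of the
    structure tensor) is normal to the kerf edges; the direction along
    the kerf is its perpendicular. The sign of the result is ambiguous.
    """
    gy, gx = np.gradient(np.asarray(gray, dtype=float))
    J = np.array([[np.sum(gx * gx), np.sum(gx * gy)],
                  [np.sum(gx * gy), np.sum(gy * gy)]])
    eigvals, eigvecs = np.linalg.eigh(J)
    normal = eigvecs[:, np.argmax(eigvals)]    # (x, y) edge-normal direction
    return np.array([-normal[1], normal[0]])   # perpendicular: along the kerf
```

The returned vector (or its negation) would then play the role of the relative movement direction handed to the image rotating unit 31.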
In the above-described embodiment, the control unit 7 includes, for example, a computer system. The control unit 7 reads a program stored in the storage unit 9 and executes various processes in accordance with it. The program is an image processing program installed in a computer of a laser processing machine that includes a laser oscillator generating a processing laser beam, a processing head irradiating a workpiece with the laser beam from the laser oscillator, and an imaging unit imaging, via the processing head, the workpiece irradiated with the laser beam. The program causes the computer to execute the following: extracting, in order to acquire information indicating the processing state of the workpiece, a feature amount from the image captured by the imaging unit with the irradiation position of the laser beam as a reference. The image processing program may be provided recorded on a computer-readable storage medium.
In the above-described embodiment, the image processing device (image processing unit 6) is an image processing device of a laser processing machine that includes a laser oscillator generating a processing laser beam, a processing head irradiating a workpiece with the laser beam from the laser oscillator, and an imaging unit imaging, via the processing head, the workpiece irradiated with the laser beam; in order to acquire information indicating the processing state of the workpiece, the image processing device extracts a feature amount from the image captured by the imaging unit with the irradiation position of the laser beam as a reference.
The technical scope of the present invention is not limited to the embodiments described above. One or more of the elements described in the above embodiments may be omitted, and the elements described in the above embodiments may be combined as appropriate. The disclosures of all documents cited in the above embodiments are incorporated herein by reference to the extent permitted by law.

Claims (16)

1. A laser processing machine is provided with:
a laser oscillator for generating a laser beam for machining;
a processing head configured to irradiate a workpiece with the laser light from the laser oscillator;
an imaging unit that images the workpiece irradiated with the laser beam via the processing head; and
an image processing unit for extracting a feature amount from the image captured by the imaging unit based on the irradiation position of the laser beam in order to acquire information indicating the processing state of the workpiece,
the machining head is moved relative to the workpiece in a direction parallel to the workpiece,
the laser processing machine includes an image rotating unit that rotates the image around an irradiation position of the laser beam such that a relative movement direction in which the processing head and the workpiece move relative to each other in the image becomes a reference direction.
2. The laser processing machine of claim 1,
the laser processing machine includes a control unit for controlling relative movement between the processing head and the workpiece,
the image rotating unit acquires the relative movement direction from the control unit and determines an angle at which the image is rotated.
3. The laser processing machine of claim 1,
the laser processing machine includes:
a binarization section for binarizing the image; and
an area specifying unit that specifies an area including an irradiation position of the laser beam from the image binarized by the binarizing unit,
the image processing unit extracts the feature amount based on the region specified by the region specifying unit.
4. The laser processing machine of claim 2,
the laser processing machine includes:
a binarization section for binarizing the image; and
an area specifying unit that specifies an area including an irradiation position of the laser beam from the image binarized by the binarizing unit,
the image processing unit extracts the feature amount based on the region specified by the region specifying unit.
5. The laser processing machine of claim 1,
the laser processing machine includes:
an irradiation optical system which is provided inside the processing head and guides the laser beam so as to irradiate the workpiece through an ejection opening of the processing head; and
an imaging optical system for guiding light entering the processing head from the workpiece through the exit to the imaging unit,
the imaging optical system and the irradiation optical system share at least a part of their optical components.
6. The laser processing machine of claim 2,
the laser processing machine includes:
an irradiation optical system which is provided inside the processing head and guides the laser beam so as to irradiate the workpiece through an ejection opening of the processing head; and
an imaging optical system for guiding light entering the processing head from the workpiece through the exit to the imaging unit,
the imaging optical system and the irradiation optical system share at least a part of their optical components.
7. The laser processing machine of claim 3,
the laser processing machine includes:
an irradiation optical system which is provided inside the processing head and guides the laser beam so as to irradiate the workpiece through an ejection opening of the processing head; and
an imaging optical system for guiding light entering the processing head from the workpiece through the exit to the imaging unit,
the imaging optical system and the irradiation optical system share at least a part of their optical components.
8. The laser processing machine of claim 4,
the laser processing machine includes:
an irradiation optical system which is provided inside the processing head and guides the laser beam so as to irradiate the workpiece through an ejection opening of the processing head; and
an imaging optical system for guiding light entering the processing head from the workpiece through the exit to the imaging unit,
the imaging optical system and the irradiation optical system share at least a part of their optical components.
9. The laser processing machine of claim 5,
the laser processing machine includes:
an optical system driving unit that drives an optical member common to the imaging optical system and the irradiation optical system in an optical path direction; and
a focus adjustment unit that adjusts the focus of the imaging unit by changing a relative distance between the imaging optical system and the imaging unit,
the focus adjustment unit adjusts the focus of the imaging unit when the optical member is driven by the optical system driving unit.
10. The laser processing machine of claim 6,
the laser processing machine includes:
an optical system driving unit that drives an optical member common to the imaging optical system and the irradiation optical system in an optical path direction; and
a focus adjustment unit that adjusts the focus of the imaging unit by changing a relative distance between the imaging optical system and the imaging unit,
the focus adjustment unit adjusts the focus of the imaging unit when the optical member is driven by the optical system driving unit.
11. The laser processing machine of claim 7,
the laser processing machine includes:
an optical system driving unit that drives an optical member common to the imaging optical system and the irradiation optical system in an optical path direction; and
a focus adjustment unit that adjusts the focus of the imaging unit by changing a relative distance between the imaging optical system and the imaging unit,
the focus adjustment unit adjusts the focus of the imaging unit when the optical member is driven by the optical system driving unit.
12. The laser processing machine of claim 8,
the laser processing machine includes:
an optical system driving unit that drives an optical member common to the imaging optical system and the irradiation optical system in an optical path direction; and
a focus adjustment unit that adjusts the focus of the imaging unit by changing a relative distance between the imaging optical system and the imaging unit,
the focus adjustment unit adjusts the focus of the imaging unit when the optical member is driven by the optical system driving unit.
13. The laser processing machine according to any one of claims 1 to 12,
in the case where the cutting processing of the workpiece is performed by the laser light generated by the laser oscillator,
performing a piercing process for forming a through hole in the workpiece with the laser beam generated by the laser oscillator in a state where the processing head is stopped, and then performing a cutting process for cutting the workpiece with the laser beam generated by the laser oscillator while moving the processing head with the through hole formed by the piercing process as a starting point,
when the piercing process is performed, the imaging unit images the workpiece, and the image processing unit acquires the irradiation position of the laser beam based on a luminance distribution of the image captured by the imaging unit.
14. The laser processing machine according to any one of claims 1 to 12,
the laser processing machine further includes a processing state determination unit that determines whether or not the processing state of the workpiece by the laser beam generated by the laser oscillator is appropriate by comparing the feature amount extracted by the image processing unit with a threshold value,
when the machining state determination unit determines that the machining state is inappropriate, the laser oscillator stops the generation of the laser beam, and the machining of the workpiece is stopped.
15. The laser processing machine of claim 13,
the laser processing machine further includes a processing state determination unit that determines whether or not the processing state of the workpiece by the laser beam generated by the laser oscillator is appropriate by comparing the feature amount extracted by the image processing unit with a threshold value,
when the machining state determination unit determines that the machining state is inappropriate, the laser oscillator stops the generation of the laser beam, and the machining of the workpiece is stopped.
16. A laser processing method, comprising:
generating a laser beam for machining;
irradiating the laser beam from the processing head to the workpiece;
imaging the workpiece irradiated with the laser beam through the processing head; and
in order to acquire information indicating a processing state of the workpiece, a feature amount is extracted from the captured image with reference to an irradiation position of the laser beam,
the machining head is moved relative to the workpiece in a direction parallel to the workpiece,
the laser processing method further includes: the image is rotated around the laser irradiation position so that a relative movement direction in which the processing head and the workpiece move relative to each other in the image becomes a reference direction.
CN201710761098.6A 2016-09-07 2017-08-30 Laser processing machine and laser processing method Active CN107803585B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016174399A JP6801312B2 (en) 2016-09-07 2016-09-07 Laser machining machine and laser machining method
JP2016-174399 2016-09-07

Publications (2)

Publication Number Publication Date
CN107803585A CN107803585A (en) 2018-03-16
CN107803585B true CN107803585B (en) 2021-06-25

Family

ID=61569820

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710761098.6A Active CN107803585B (en) 2016-09-07 2017-08-30 Laser processing machine and laser processing method

Country Status (2)

Country Link
JP (1) JP6801312B2 (en)
CN (1) CN107803585B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6767416B2 (en) * 2018-03-26 2020-10-14 ファナック株式会社 Machining condition adjustment device and machine learning device
US12048970B2 (en) * 2018-06-22 2024-07-30 Mitsubishi Electric Corporation Laser processing apparatus
JP6792880B2 (en) * 2018-09-18 2020-12-02 前田工業株式会社 Laser welding condition judgment device
CN109175704A (en) * 2018-11-13 2019-01-11 岗春激光科技(江苏)有限公司 Laser welding head mechanism and laser soldering device
JP6810973B2 (en) * 2019-03-28 2021-01-13 前田工業株式会社 Laser welding control device
JP7357463B2 (en) 2019-05-08 2023-10-06 株式会社東芝 Judgment device, judgment system, welding system, judgment method, program, and storage medium
JP6824355B1 (en) * 2019-09-25 2021-02-03 株式会社アマダウエルドテック Laser machining monitoring method and laser machining monitoring device
JP7356381B2 (en) * 2020-03-11 2023-10-04 株式会社アマダ Laser processing machine and processing method
CN115427187A (en) * 2020-04-22 2022-12-02 株式会社尼康 Machining system
JP6775791B1 (en) * 2020-05-27 2020-10-28 上野精機株式会社 Parts delivery device
CN111901569A (en) * 2020-08-11 2020-11-06 上海柏楚电子科技股份有限公司 Monitoring processing method, device, equipment and medium for dynamically tracking cutting head
CN118010725A (en) * 2024-02-05 2024-05-10 深圳市九州智焊未来科技有限公司 Detection system and method for laser welding process

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01138083A (en) * 1987-11-25 1989-05-30 Amada Co Ltd Detector for reflected and returned beam of carbon dioxide gas laser beam machine
JPH05220590A (en) * 1992-02-10 1993-08-31 Nitto Denko Corp Etching method
JP3162254B2 (en) * 1995-01-17 2001-04-25 三菱電機株式会社 Laser processing equipment
JPH10180685A (en) * 1996-12-25 1998-07-07 Murata Mach Ltd Cutter control circuit and facsimile device
JP2000061671A (en) * 1998-08-19 2000-02-29 Kubota Corp Laser beam machining method and device therefor
JP2009095876A (en) * 2007-10-18 2009-05-07 Olympus Corp Laser machining apparatus, laser machining method, and laser machining program
JP5749623B2 (en) * 2011-10-04 2015-07-15 小池酸素工業株式会社 Plasma cutting monitoring device
JP6064519B2 (en) * 2012-10-29 2017-01-25 三星ダイヤモンド工業株式会社 Laser processing apparatus and processing condition setting method for patterned substrate
CN105102170A (en) * 2013-04-17 2015-11-25 村田机械株式会社 Laser processor and laser processing method
JP6299111B2 (en) * 2013-08-28 2018-03-28 オムロン株式会社 Laser processing equipment
DE102013218421A1 (en) * 2013-09-13 2015-04-02 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Apparatus and method for monitoring, in particular for controlling, a cutting process
CN103817431A (en) * 2013-12-11 2014-05-28 乐清市佳速激光有限公司 Laser real-time feedback control system and control method thereof
JP6220718B2 (en) * 2014-03-31 2017-10-25 日立オートモティブシステムズ株式会社 Laser welding quality determination method and laser welding quality determination device

Also Published As

Publication number Publication date
CN107803585A (en) 2018-03-16
JP6801312B2 (en) 2020-12-16
JP2018039028A (en) 2018-03-15

Similar Documents

Publication Publication Date Title
CN107803585B (en) Laser processing machine and laser processing method
Dworkin et al. Image processing for machine vision measurement of hot formed parts
JP5338890B2 (en) Laser welding welding position detection apparatus and welding position detection method
JP5875630B2 (en) Method and apparatus for identifying incomplete cuts
CN109952171A (en) Method and apparatus for determining and adjusting the focal position of processing beam
US9279973B2 (en) Image processing apparatus, fluorescence microscope apparatus, and image processing program
US9739715B2 (en) Laser scanning microscope system and method of setting laser-light intensity value
KR101875980B1 (en) High speed acquisition vision system and method for selectively viewing object features
CN106796343B (en) Microscope
CN105993033A (en) Method for identifying an edge contour of an opening on a machining head, and machining tool
EP2071381A1 (en) Scanning microscope
JP2023508765A (en) Quality Control of Laser Machining Process Using Machine Learning
CN111479648B (en) Device, method and use for monitoring beam processing of a workpiece
JP2008262100A (en) Sample scanner device, and sample position detecting method using device
US20210290050A1 (en) Ocular fundus image processing apparatus
JP2018032005A (en) Autofocus system, method and image inspection device
CN116615302A (en) Method for detecting the suspension position of a support bar and flat machine tool
CN113001036A (en) Laser processing method
JP2012141233A (en) Detector
US20170069110A1 (en) Shape measuring method
KR20210136005A (en) laser repair method, laser repair device
JP2017131931A (en) Laser marking device
US20230241710A1 (en) Method for Analyzing a Workpiece Surface for a Laser Machining Process and Analysis Device for Analyzing a Workpiece Surface
CN107250873B (en) Microscope
JP2019155402A (en) Centering method for laser beam and lase processing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant