WO2020158201A1 - Processing condition analysis device, laser processing device, laser processing system, and processing condition analysis method - Google Patents
Processing condition analysis device, laser processing device, laser processing system, and processing condition analysis method
- Publication number: WO2020158201A1 (application PCT/JP2019/048498)
- Authority
- WO
- WIPO (PCT)
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B23—MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
- B23K—SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
- B23K26/00—Working by laser beam, e.g. welding, cutting or boring
- B23K26/02—Positioning or observing the workpiece, e.g. with respect to the point of impact; Aligning, aiming or focusing the laser beam
- B23K26/03—Observing, e.g. monitoring, the workpiece
- B23K26/36—Removing material
- B23K26/38—Removing material by boring or cutting
Definitions
- the present invention relates to a processing condition analysis device, a laser processing device, a laser processing system and a processing condition analysis method for analyzing processing conditions in laser processing.
- Patent Document 1 discloses a machine learning device for machine learning laser processing condition data.
- The machine learning device described in Patent Document 1 includes a state quantity observation section that observes state quantities of the laser processing system and an operation result acquisition section that acquires processing results from the laser processing system, and learns laser processing condition data by associating it with the observed state quantities and processing results. The machine learning device described in Patent Document 1 then refers to the learned laser processing condition data and determines and outputs the laser processing condition data with which the optimum processing result is obtained.
- In Patent Document 1, the learning model is updated so that the error in the processing result becomes small.
- Although various methods of modifying the processing conditions to reduce the error in the processing result are conceivable, Patent Document 1 does not show in detail the correspondence between processing results and methods of modifying the processing conditions. For this reason, the machine learning device described in Patent Document 1 must perform trials under many processing conditions before it learns appropriate processing conditions and obtains a processing quality of a certain level or higher, and consequently requires a large number of trials and a long time to adjust the processing conditions.
- The present invention has been made in view of the above, and an object thereof is to obtain a processing condition analysis device capable of adjusting the processing conditions in laser processing so as to suppress the adjustment time and obtain a processing quality of a certain level or higher regardless of the skill level of the operator.
- In order to solve the above problems, the processing condition analysis apparatus according to the present invention includes an evaluation unit that, using cut surface information based on an image of a cut surface cut by laser cutting, generates an evaluation value indicating the processing quality corresponding to each of a plurality of processing failure modes and outputs a combination pattern, which is the plurality of evaluation values corresponding to the plurality of processing failure modes.
- The processing condition analysis apparatus further includes a correction amount calculation unit that calculates, based on the combination pattern, a correction amount of a processing parameter indicating a processing condition of the laser cutting process.
- The processing condition analysis apparatus according to the present invention has the effect that the processing conditions in laser processing can be adjusted so as to suppress the adjustment time and obtain a processing quality of a certain level or higher regardless of the skill level of the operator.
- FIG. 3 is a diagram showing the definition of the vertical direction of the work in the first embodiment.
- FIG. 4 is a diagram showing an example of an image of the cut surface photographed in a state where roughness has occurred.
- FIG. 5 is a diagram showing an example of an image of the cut surface photographed in a state where a scratch has occurred.
- FIG. 6 is a diagram showing an example of an image of the cut surface photographed in a state where oxide film peeling has occurred.
- FIG. 10 is a diagram showing a configuration example of the processing circuit of Embodiment 1.
- FIG. 13 is a diagram showing a configuration example of the neural network model according to the second embodiment.
- A diagram showing a configuration example of a processing condition analysis device according to a third embodiment, and a diagram showing an example of the display screen displayed on the display unit of Embodiment 3.
- a processing condition analysis device, a laser processing device, a laser processing system, and a processing condition analysis method according to an embodiment of the present invention will be described below in detail with reference to the drawings.
- the present invention is not limited to the embodiments.
- FIG. 1 is a diagram showing a configuration example of a laser processing system including a processing condition analyzing apparatus according to the first embodiment of the present invention.
- the laser processing system includes a processing condition analysis device 10 according to the present invention and a laser processing device 20.
- the laser processing apparatus 20 of the present embodiment performs laser cutting processing for cutting the work 30 that is the workpiece by focusing the laser light.
- the processing condition analysis device 10 adjusts the processing conditions of the laser processing device 20, that is, the laser cutting process.
- the laser processing device 20 includes a control unit 21, a laser oscillator 22, and a processing head 23, as shown in FIG.
- the laser oscillator 22 oscillates and emits laser light.
- the wavelength of the laser light of the laser processing device 20 used for the processing can be appropriately selected in consideration of the absorptance of the laser light to the processing target, the reflectance, and the like. For example, it can be set to 0.193 ⁇ m to 11 ⁇ m.
- the laser light emitted from the laser oscillator 22 is supplied to the processing head 23 via the optical path.
- the processing gas is supplied to the inside of the processing head 23, and the processing gas is supplied to the work 30 when the work 30 is irradiated with the laser light.
- the processing head 23 has a condensing lens (not shown) that condenses the laser light onto the work 30.
- the processing head 23 cuts the work 30 by focusing the laser light and irradiating the work 30 with the laser light.
- the processing head 23 has a nozzle (not shown).
- the nozzle has an opening on the optical path of the laser beam between the condenser lens and the work 30, and the laser light and the processing gas pass through this opening.
- A motor and a motor drive device are provided for the shaft on which the machining head 23 is installed, for the machining table on which the work 30 is placed, or for both the shaft and the machining table.
- the relative position between the processing head 23 and the work 30 can be changed by controlling the motor by the motor drive device under the control of the control unit 21.
- the type of laser oscillator 22 is not limited.
- An example of the laser oscillator 22 is a fiber laser oscillator, but a carbon dioxide gas laser, a copper vapor laser, various ion lasers, or a solid-state laser having a YAG crystal or the like as an excitation medium may be used, or a direct diode laser that uses light from a laser diode as it is may be used. Further, a wavelength conversion unit for converting the wavelength of the laser light generated by the laser oscillator 22 may be provided.
- the control unit 21 controls the laser oscillator 22 and the motor drive device so that the laser beam scans the machining path on the work 30 according to the machining program and the machining parameters indicating the machining conditions.
- Examples of the processing parameters used by the control unit 21 include the laser output, the processing gas pressure, the processing speed, the focal position of the condensing optical system, the condensing diameter, the laser pulse frequency, the duty ratio of the laser pulse, the nozzle diameter, the distance between the work 30 and the nozzle, the type of laser beam mode, and the positional relationship between the center of the nozzle hole and the laser beam.
- the processing parameter is not limited to the above example. What is used as a processing parameter is appropriately determined depending on the type of laser used, the function of the laser processing apparatus 20, and the like.
- the processing parameters used by the control unit 21 can be changed according to the correction amount calculated by the processing condition analysis device 10 as described later. That is, the processing parameters can be corrected by the processing condition analysis device 10.
- the processing parameters before being corrected by the processing condition analysis device 10 are predetermined according to the processing content, for example.
- the laser processing device 20 may include an input means for receiving an input from an operator, and the processing parameters before being corrected by the processing condition analysis device 10 may be changeable by the input from the operator. Further, the processing parameters before being corrected by the processing condition analyzing apparatus 10 may be transmitted to the laser processing apparatus 20 from another device such as a computer (not shown).
- the processing condition analysis device 10 includes a photographing device 11, a feature amount extraction unit 12, an evaluation unit 13, and a correction amount calculation unit 14.
- the photographing device 11 photographs the cut surface 31 of the work 30 processed by the laser processing device 20, and outputs the photographed image to the feature amount extraction unit 12.
- the captured image is not limited to a still image, and includes an image captured as a moving image.
- The feature amount extraction unit 12 extracts feature amounts from the image output from the imaging device 11 and outputs the extracted feature amounts to the evaluation unit 13.
- the feature amount is an example of cutting plane information based on an image obtained by photographing a cutting plane cut by laser cutting processing.
- the evaluation of the processing by the processing condition analysis device 10 and the calculation of the correction amount of the processing parameter are performed, for example, before shifting to the production processing which is the processing for producing the product.
- The evaluation unit 13 determines the quality of processing of the cut surface 31 using the feature amounts output from the feature amount extraction unit 12, and outputs the determination result to the correction amount calculation unit 14. Specifically, the evaluation unit 13 uses the cut surface information based on the image of the cut surface 31 cut by laser cutting to generate an evaluation value indicating the processing quality corresponding to each of a plurality of processing failure modes.
- the cutting plane information may be the feature amount output from the feature amount extraction unit 12 or the image itself. Then, the evaluation unit 13 outputs a combination pattern, which is a plurality of evaluation values corresponding to each of the plurality of processing failure modes, as a determination result.
- When the determination result output from the evaluation unit 13 is not good, the correction amount calculation unit 14 calculates a correction amount of the processing parameters for obtaining the desired processing quality and sends it to the laser processing device 20.
- The desired processing quality is a processing quality of a certain level or higher; a specific example will be described later.
- When the determination result is good, the correction amount calculation unit 14 does not calculate a correction amount and notifies the laser processing device 20 that the determination result is good.
- After the processing parameters have been corrected by the correction amount calculated by the correction amount calculation unit 14, the control unit 21 controls the processing head 23 and the laser oscillator 22 to execute the cutting process again.
- When the correction amount calculation unit 14 notifies the laser processing device 20 that the determination result is good, the laser processing device 20 shifts to production processing.
- the processing condition analysis device 10 and the laser processing device 20 may be connected by wire or wirelessly, or may be connected via a network.
- the processing condition analysis device 10 and the laser processing device 20 are separate devices, but the processing condition analysis device 10 may be included in the laser processing device 20. That is, the laser processing device 20 may include the imaging device 11, the feature amount extraction unit 12, the evaluation unit 13, and the correction amount calculation unit 14.
- FIG. 2 is a flow chart showing an example of the operation of adjusting the processing parameters in the laser processing system according to the present embodiment.
- the laser processing system executes, for example, the operation shown in FIG. 2 for adjusting processing parameters before performing production processing. Note that the timing of executing the operation illustrated in FIG. 2 is not limited to before the production processing is performed, and may be performed during the production processing, for example.
- Step S1 in FIG. 2 is performed by the laser processing device 20 shown in FIG. 1, and step S2 is performed by the imaging device 11 shown in FIG. 1. Step S3 of FIG. 2 is performed by the feature amount extraction unit 12 shown in FIG. 1, and steps S4 to S8 of FIG. 2 are performed by the evaluation unit 13 shown in FIG. 1. Further, steps S9 and S10 of FIG. 2 are executed by the correction amount calculation unit 14 shown in FIG. 1, and step S11 of FIG. 2 is executed by the laser processing device 20. The details of each step are described below.
- the laser processing device 20 performs cutting processing (step S1).
- the image capturing device 11 of the processing condition analyzing apparatus 10 captures an image of the cut surface 31 formed by the cutting processing (step S2).
- The image of the cut surface 31 may be acquired by photographing, during laser processing, the portion of the cut surface 31 where cutting has been completed, or by photographing the cut surface 31 of the workpiece after cutting is completed.
- the photographing device 11 may be a digital camera or a video camera.
- the image used for extracting the feature amount may be a still image or a moving image as long as the image can extract the feature amount.
- a device that acquires data obtained by measuring the in-plane distribution of the unevenness of the cut surface 31 with a three-dimensional shape measuring machine may be used.
- the feature quantity extraction unit 12 extracts the feature quantity from the image photographed by the photographing device 11 (step S3).
- Using image processing techniques, the feature amount extraction unit 12 calculates, as feature amounts, for example, the hue, saturation, and lightness of the cut surface 31, the lightness correlation between pixels, and higher-order local autocorrelation (HLAC) features.
- The feature amount extraction unit 12 may also calculate, as feature amounts, the result of smoothing (for example, with a Gaussian filter), the result of edge extraction (for example, with a Sobel filter), the result of an affine transformation, histograms of oriented gradients of the brightness in local regions (HoG features), SIFT (Scale Invariant Feature Transform) features, analysis results obtained with the phase-only correlation method, analysis results obtained with the Fourier transform, and the like.
- For the extraction of feature amounts, the feature amount extraction unit 12 may use pre-processing such as image exposure correction; color tone correction such as color temperature and white balance correction; aberration correction such as axial chromatic aberration correction and magnification chromatic aberration correction; smoothing filters such as a Gaussian filter, a median filter, a bilateral filter, or a guided filter; histogram equalization such as adaptive histogram equalization (Contrast Limited Adaptive Histogram Equalization: CLAHE); differential filters such as a Sobel filter or a Laplacian filter; color space conversions such as RGB, luminance, HSV (Hue, Saturation, Value), HLS (Hue, Lightness, Saturation), CIE (Commission Internationale de l'Eclairage) L*a*b* (CIELAB), or CIE L*u*v* (CIELUV); affine transformation; and the like.
- As feature amounts of the pixel value information, the feature amount extraction unit 12 may calculate a histogram, an average, and a variance, or may calculate frequency information obtained by a Fourier transform, a wavelet transform, or the like.
- As feature amounts, the feature amount extraction unit 12 may also use higher-order local autocorrelation features, CILAC (Color Index Local Auto-Correlation), NLAC (Normal Local Auto-Correlation), GLAC (Gradient Local Auto-Correlation), SIFT, HOG (Histograms of Oriented Gradients), SURF (Speeded-Up Robust Features), and the like.
- the feature amount extraction unit 12 may use the image itself as the feature amount. Further, the number of feature quantities extracted by the feature quantity extraction unit 12 may be one or more.
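- As an illustration of the kind of feature extraction described above, the following is a minimal Python sketch (not part of the patent text) that computes HSV statistics and a brightness histogram from a cut-surface image; the library choice (OpenCV/NumPy), the function name, and the exact feature set are assumptions for illustration only.

```python
# Minimal illustrative sketch (assumed libraries: OpenCV, NumPy) of simple
# cut-surface feature extraction; the feature set in the patent is broader.
import cv2
import numpy as np

def extract_features(image_bgr: np.ndarray) -> np.ndarray:
    """Return HSV statistics plus a normalized brightness histogram."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    h, s, v = cv2.split(hsv)
    # Mean and variance of hue, saturation, and value (lightness).
    stats = np.array([h.mean(), h.var(), s.mean(), s.var(), v.mean(), v.var()])
    # 16-bin brightness histogram as additional pixel-value information.
    hist = cv2.calcHist([v], [0], None, [16], [0, 256]).flatten()
    hist /= hist.sum() + 1e-9
    return np.concatenate([stats, hist])
```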
- the evaluation unit 13 uses the feature amount output from the feature amount extraction unit 12 to make a pass/fail determination for each processing failure mode.
- FIG. 2 shows an example in which the occurrence of roughness, scratches, oxide film peeling, and dross is considered as the processing failure modes. Therefore, as shown in FIG. 2, the evaluation unit 13 performs roughness determination (step S4), scratch determination (step S5), oxide film peeling determination (step S6), and dross determination (step S7) as the quality determinations corresponding to these processing failure modes.
- In FIG. 2, the roughness determination, the scratch determination, the oxide film peeling determination, and the dross determination are shown in parallel, but the evaluation unit 13 may perform these determinations in parallel or sequentially in time series.
- FIG. 3 is a diagram showing the vertical definition of the work 30 in the present embodiment.
- the laser beam 40 is condensed by the condenser lens 231 which is a part of the processing head 23.
- FIG. 3 schematically shows a cross section of the work 30 and the condenser lens 231 on the cut surface 31.
- the direction in which the processing head 23 is present is defined as the upper side and the opposite side is defined as the lower side as viewed from the work 30.
- FIG. 4 is a diagram showing an example of an image of the cut surface 31 taken in a state where the roughness occurs.
- the part surrounded by the dotted line is the characteristic part of the roughness.
- Roughness is periodically generated in the upper part of the cut surface 31.
- When roughness occurs, the depth of the unevenness of the streaks becomes deeper than when it does not occur.
- For example, the evaluation unit 13 can measure the depth of the unevenness by stereo photography, or can illuminate the cut surface from the direction of the periodicity of the roughness so that the peaks appear bright and the valleys appear dark in shadow.
- The surface roughness of the cut surface 31 is then obtained by estimating the depth of the unevenness from the degree of brightness, the length of the shadows, the width of the dark portions, and the like, and it can be determined that roughness has occurred when the surface roughness is equal to or greater than a certain value.
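- One hypothetical reading of this shadow-based check is sketched below: assuming side illumination so that valleys appear as dark runs, the mean width of dark runs in the upper band of the image serves as a proxy for the depth of the unevenness. All thresholds are arbitrary illustrative values.

```python
# Hypothetical sketch of the shadow-based roughness determination: deeper
# valleys cast wider dark runs under side illumination, so the mean width
# of dark runs in the upper band of the image is used as a roughness proxy.
import numpy as np

def roughness_occurred(gray: np.ndarray, dark_thresh: int = 80,
                       width_limit: float = 5.0) -> bool:
    upper = gray[: gray.shape[0] // 3]     # roughness appears near the top
    dark = upper < dark_thresh             # shadowed valleys
    run_widths = []
    for row in dark:
        # Boundaries of consecutive runs of dark pixels in this row.
        edges = np.flatnonzero(
            np.diff(np.concatenate(([0], row.astype(np.int8), [0]))))
        run_widths.extend((edges[1::2] - edges[0::2]).tolist())
    mean_width = float(np.mean(run_widths)) if run_widths else 0.0
    return mean_width >= width_limit       # "a certain value or more" -> rough
```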
- FIG. 5 is a diagram showing an example of an image of the cut surface 31 taken in a state where a scratch has occurred.
- the part surrounded by the dotted line is the characteristic part of the scratch.
- the scratch is locally generated on the cut surface 31 from the upper surface to the lower surface.
- As a criterion for determining the presence or absence of a scratch, the cut surface 31 may be divided into a plurality of sections, a PV (Peak to Valley) value of the unevenness may be calculated from the brightness within each section, and whether or not the PV value of a section is equal to or greater than a certain value may be used.
- For the sections, for example, sections obtained by dividing the cut surface 31 in the left-right direction may be used.
- Alternatively, the evaluation unit 13 may divide the image into a plurality of sections vertically and horizontally, define as a scratch candidate portion any portion whose brightness differs from the average value of the other pixels in the section by a threshold value or more, obtain the length of the scratch candidate portion, and determine that a scratch has occurred when the length is equal to or greater than a certain value.
- The method for obtaining the scratch candidate portion is not limited to this example.
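- A minimal sketch of the section-wise PV determination described above follows, assuming brightness is used as a proxy for unevenness; the section count and the PV threshold are hypothetical.

```python
# Sketch of the section-wise PV (peak-to-valley) scratch determination,
# using brightness as a proxy for unevenness; constants are hypothetical.
import numpy as np

def scratch_occurred(gray: np.ndarray, n_sections: int = 16,
                     pv_limit: float = 120.0) -> bool:
    # Divide the cut-surface image into left-right sections.
    for strip in np.array_split(gray, n_sections, axis=1):
        pv = float(strip.max()) - float(strip.min())  # PV value of brightness
        if pv >= pv_limit:                            # "a certain value or more"
            return True
    return False
```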
- FIG. 6 is a diagram showing an example of an image of the cut surface 31 taken in a state where the oxide film is peeled off.
- the part surrounded by the dotted line is the characteristic part of the oxide film peeling.
- Oxide film peeling is a phenomenon in which the oxide film formed on the cut surface 31 peels off when the processing gas used for cutting is oxygen; it occurs at the lower part of the cut surface 31.
- For example, the evaluation unit 13 may obtain, based on feature amounts such as pixel brightness, the area of the portion where the oxide film is assumed to have peeled, and determine that oxide film peeling has occurred when the area is equal to or greater than a certain value.
- The location where the oxide film has peeled can be defined, for example, as a portion whose brightness differs from the average pixel brightness by a threshold value or more.
- the calculation method is not limited to this.
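- The area-based check described above can be sketched as follows, assuming the candidate pixels are those whose brightness deviates from the mean by a threshold or more; all constants are hypothetical.

```python
# Sketch of the area-based oxide film peeling determination: pixels in the
# lower part whose brightness deviates from the mean by a threshold or more
# are counted as peeling candidates. All constants are hypothetical.
import numpy as np

def oxide_peeling_occurred(gray: np.ndarray, diff_thresh: float = 40.0,
                           area_limit: int = 500) -> bool:
    lower = gray[2 * gray.shape[0] // 3:]   # peeling occurs at the lower part
    deviation = np.abs(lower.astype(float) - lower.mean())
    candidate_area = int((deviation >= diff_thresh).sum())
    return candidate_area >= area_limit     # "a certain value or more"
```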
- FIG. 7 is a diagram showing an example of an image of the cut surface 31 taken in a state where dross has occurred.
- the part surrounded by the dotted line is the characteristic part of the dross.
- Dross is a phenomenon in which metal or the like melted during laser cutting adheres to the cut surface 31; it forms at the lower end of the cut surface 31.
- For example, the evaluation unit 13 may obtain, based on feature amounts such as pixel brightness, the length of a portion assumed to be a dross candidate in the lower part of the cut surface 31, and determine that dross has occurred when this length is equal to or greater than a certain value.
- the method for calculating the dross location is not limited to this.
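- A sketch of the length-based dross check follows, assuming the dross candidate is the longest dark run along the lower edge of the cut surface; constants are hypothetical.

```python
# Sketch of the length-based dross determination: the longest run of dark
# candidate columns along the lower edge is compared with a limit.
# Constants are hypothetical.
import numpy as np

def dross_occurred(gray: np.ndarray, dark_thresh: int = 60,
                   length_limit: int = 50) -> bool:
    bottom = gray[-5:]                                  # band at the lower edge
    candidate_cols = (bottom < dark_thresh).any(axis=0)
    longest, run = 0, 0
    for is_candidate in candidate_cols:
        run = run + 1 if is_candidate else 0
        longest = max(longest, run)
    return longest >= length_limit
```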
- the processing failure mode is not limited to these.
- the determination may be performed by including other processing failure modes such as the discoloration of the workpiece 30 and the presence or absence of a vibrating surface.
- Alternatively, some of the processing failure modes described above may be replaced with other processing failure modes.
- the evaluation unit 13 may change the processing failure mode to be determined according to the processing parameters such as the combination of the laser output, the processing speed, the processing plate thickness, the type of processing gas, and the like.
- the evaluation unit 13 may omit the oxide film peeling determination when the type of processing gas is nitrogen.
- The evaluation unit 13 then uses the results of the roughness determination (step S4), the scratch determination (step S5), the oxide film peeling determination (step S6), and the dross determination (step S7) to determine whether the cutting process is good or bad (step S8).
- The evaluation value, which is the result of determining the quality of the cutting process, may represent the degree of goodness or badness, that is, the processing quality, as one of a plurality of predetermined stepwise values of two or more steps, or as a continuous value.
- In other words, the evaluation value is a value indicating the processing quality.
- When the evaluation value is expressed in stages, it may be a two-stage value indicating either good or bad, or it may indicate the degree of failure in three or more steps. Each stepwise value may also be combined with a probability; for example, the evaluation unit 13 may calculate evaluation values such that the probability of being good is 90% and the probability of being bad is 10%. The evaluation unit 13 may also output the determination result for each processing failure mode as an evaluation value. For example, when the determinations for the processing failure modes are the four determinations described above (roughness, scratch, oxide film peeling, and dross), the evaluation unit 13 outputs, as the evaluation value corresponding to each, one of the two values good or bad.
- the evaluation unit 13 may determine the quality of the cutting process based on the sum of the evaluation values of the determination corresponding to each process failure mode. Further, the evaluation unit 13 may weight the determination result for each processing failure mode and determine the quality of the cutting processing based on the weighted total. For example, the evaluation unit 13 may determine that the cut processing quality determination result is negative when the number of determination results corresponding to the processing failure mode determined to be negative is equal to or greater than a threshold value. Alternatively, if at least one of the determinations corresponding to each of the processing failure modes is determined to be defective, the evaluation unit 13 may determine the quality of the cut surface 31 as negative.
- In the determination corresponding to each processing failure mode, the evaluation unit 13 need not make a binary good/bad decision; it may instead obtain a continuous value that approaches 0 when the likelihood of good is high and approaches 1 when the likelihood of defective is high.
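- The aggregation of per-mode evaluation values into an overall pass/fail decision, including the weighted-sum variant described above, might look like the following sketch; the mode names, weights, and threshold are hypothetical.

```python
# Hypothetical sketch of aggregating per-failure-mode evaluation values
# (0 = good .. 1 = defective) into an overall decision by a weighted sum.
FAILURE_MODES = ("roughness", "scratch", "oxide_peeling", "dross")
WEIGHTS = {"roughness": 1.0, "scratch": 2.0, "oxide_peeling": 1.0, "dross": 1.5}

def overall_is_good(evaluation: dict, limit: float = 2.0) -> bool:
    score = sum(WEIGHTS[m] * evaluation[m] for m in FAILURE_MODES)
    return score < limit

# Example: a combination pattern in which only dross is defective.
pattern = {"roughness": 0.0, "scratch": 0.0, "oxide_peeling": 0.0, "dross": 1.0}
print(overall_is_good(pattern))  # True, since 1.5 < 2.0
```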
- the evaluation unit 13 may display the determination result on a display unit inside or outside the processing condition analysis apparatus 10.
- Alternatively, the evaluation unit 13 may display the determination result on the display unit inside or outside the processing condition analysis device 10 only when the quality determination result of the cutting process is bad.
- the evaluation unit 13 may make a pass/fail judgment by using not only the feature amount output from the feature amount extraction unit 12 but also other information.
- FIG. 8 is a diagram showing input to the evaluation unit 13 in the present embodiment when other information is used.
- In FIG. 8, the imaging device 11 is omitted from the illustration.
- the processed plate thickness and the processed material are used as other information.
- The processing plate thickness is the thickness of the work 30 in the laser beam incidence direction, and the processing material is the material of the work 30.
- The other information may be input by an operator if the processing condition analysis device 10 is provided with an input unit, or may be acquired by the processing condition analysis device 10 from the laser processing device 20.
- The correction amount calculation unit 14 determines whether the determination result output from the evaluation unit 13 is good (step S9), and when the determination result is good (step S9: Yes), ends the adjustment of the processing parameters. When the adjustment of the processing parameters is completed, production processing is executed. When the evaluation unit 13 outputs a value of three or more stages or a continuous value as the evaluation value, the correction amount calculation unit 14 judges pass/fail in step S9 according to whether the evaluation value satisfies a predetermined criterion. For example, when the evaluation value has five levels from level 1 to level 5, with level 1 the best processing state and level 5 the worst, the result may be judged good when the level is 3 or lower. When defining evaluation values in a plurality of stages in this way, for example in the scratch determination, the evaluation value can be defined in stages according to the length of the scratch candidate.
- When the determination result is not good (step S9: No), the correction amount calculation unit 14 calculates the correction amount of the processing parameters based on the determination result output from the evaluation unit 13 (step S10).
- the correction amount calculation unit 14 outputs the calculated correction amount to the laser processing device 20.
- The correction amount calculation unit 14 can acquire the processing parameters currently set in the laser processing device 20 from the laser processing device 20, and may calculate the correction amount based on the determination result output from the evaluation unit 13 and the currently set processing parameters.
- the control unit 21 of the laser processing apparatus 20 corrects the processing parameters based on the correction amount received from the correction amount calculation unit 14 (step S11), and executes step S1 again.
- the processing condition analysis device 10 again executes the processing from step S2.
- The correction of the processing parameters and the cutting process are repeated until the determination result output from the evaluation unit 13 becomes good. If it is desired to confirm the stability of the processing, the cutting process using the same processing parameters may be performed a plurality of times, steps S2 to S9 may be executed for each of the cutting processes, and the correction of the processing parameters may be ended when all of the corresponding determinations in step S9 are judged good.
- Processing parameters to be corrected include the laser output, the processing gas pressure, the processing speed, the focal position of the condensing optical system, the condensing diameter, the laser pulse frequency, the duty ratio of the laser pulse, the nozzle diameter, the distance between the work 30 and the nozzle, the type of laser beam mode, and the positional relationship between the center of the nozzle hole and the laser beam.
- The correction amount calculation unit 14 may decide the processing parameters to be corrected and their correction amounts based on the combination pattern of the quality determination results for the respective processing failure modes.
- The combination pattern is a combination of four data values, for example 0, 0, 0, 1, where an evaluation value of 1 indicates good and 0 indicates bad, and the four values are the evaluation values output by the evaluation unit 13 for the roughness determination, the scratch determination, the oxide film peeling determination, and the dross determination, respectively.
- For such a pattern, for example, the laser output and the processing gas pressure among the processing parameters are the targets of correction amount calculation, and the correction amount is determined so as to increase the laser output and decrease the processing gas pressure. In this way, the processing parameters to be corrected and their correction amounts can be set for each combination pattern.
- Further, according to the evaluation value of each processing failure mode, the correction amount of the processing parameter to be corrected may be weighted and changed, or the processing parameter itself to be corrected may be changed.
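- A combination-pattern lookup of the kind described above can be sketched as a simple table; the entries below are hypothetical, except that the dross-only row follows the laser-output/gas-pressure example in the text.

```python
# Hypothetical lookup from a combination pattern (1 = good, 0 = defective,
# in the order roughness, scratch, oxide peeling, dross) to the parameters
# to correct and their correction amounts.
CORRECTION_TABLE = {
    # dross only defective: raise laser output, lower gas pressure (per text)
    (1, 1, 1, 0): [("laser_output_kW", +0.2), ("gas_pressure_MPa", -0.01)],
    # roughness only defective: lower the focus position (hypothetical entry)
    (0, 1, 1, 1): [("focus_position_mm", -0.5)],
}

def corrections_for(pattern):
    return CORRECTION_TABLE.get(tuple(pattern), [])

print(corrections_for((1, 1, 1, 0)))
```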
- For example, suppose the evaluation unit 13 outputs an evaluation value for each processing failure mode as a numerical value from 0 to 1 in three or more stages. The evaluation value of the dross determination is defined in four stages of 0, 0.3, 0.6, and 1.0, and the correction amounts of the laser output and the processing gas pressure are determined in advance according to the evaluation value of the dross determination: when the evaluation value is 0.3, the correction amount of the laser output is +0.2 [kW] and the correction amount of the processing gas pressure is -0.01 [MPa]; when the evaluation value is 0.6, the correction amount of the laser output is +0.5 [kW] and the correction amount of the processing gas pressure is -0.02 [MPa].
- The correction amount calculation unit 14 calculates the correction amount according to the correspondence between the evaluation value and the correction amount determined in this way. When the evaluation value of the dross determination is 0.3, the laser output is increased by 0.2 [kW] and the processing gas pressure is decreased by 0.01 [MPa]; when the evaluation value is 0.6, the laser output is increased by 0.5 [kW] and the processing gas pressure is decreased by 0.02 [MPa].
- The correction amounts described above are examples; the correction amount may be set according to the evaluation value, and may also be set as a value that depends on the value of the processing parameter before correction.
- the method of determining the correction amount is not limited to the above example.
- the correction amount of each processing parameter may be calculated by extrapolation or interpolation.
- a polynomial curve such as spline interpolation or Lagrange interpolation may be used, or various functions such as a trigonometric function and a conic curve may be used.
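- The staged dross corrections above, together with interpolation between stages as just mentioned, can be sketched as follows; the 0.3 and 0.6 stage values follow the text, while the values at stages 0 and 1.0 are assumptions for illustration.

```python
# Sketch of staged dross corrections with linear interpolation between the
# stages; the 0.3 and 0.6 stage values follow the text, the 0 and 1.0 stage
# values are assumptions.
import numpy as np

STAGES = np.array([0.0, 0.3, 0.6, 1.0])          # dross evaluation stages
LASER_CORR = np.array([0.0, 0.2, 0.5, 0.8])      # laser output correction [kW]
GAS_CORR = np.array([0.0, -0.01, -0.02, -0.03])  # gas pressure correction [MPa]

def dross_correction(evaluation: float):
    laser = float(np.interp(evaluation, STAGES, LASER_CORR))
    gas = float(np.interp(evaluation, STAGES, GAS_CORR))
    return laser, gas

print(dross_correction(0.45))  # interpolated between the 0.3 and 0.6 stages
```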
- In the above, an example has been described in which the evaluation unit 13 handles defects related to the quality of the cut surface 31 as the processing failure modes.
- However, depending on the situation, the improvement items to be prioritized, such as the quality of the cut surface 31, productivity, and processing stability, may differ; if the productivity, that is, the processing speed, is extremely low, the processing conditions may not be appropriate even if the quality of the cut surface 31 is good.
- the processing condition analysis device 10 may be provided with an input means so as to receive the input of the priority for each improvement item from the operator.
- the correction amount calculation unit 14 calculates the correction amount of the processing parameter based on the priority of each improvement item.
- The correction amount calculation unit 14 may determine the correction amount of the processing parameters based on the priorities of a plurality of improvement items including productivity, the combination pattern, and processing stability. For example, depending on the improvement item, the positive and negative signs of the correction amount of the same processing parameter may be reversed; in such a case, the correction amount calculation unit 14 selects the correction amount corresponding to the improvement item to be prioritized.
- Further, the correction amount calculation unit 14 may obtain the correction amount by weighting according to the priority. For example, the correction amount of each processing parameter is predetermined for each improvement item, and the correction amount calculation unit 14 multiplies each predetermined correction amount by the weight corresponding to the priority of its improvement item and outputs the total of the weighted correction amounts.
- If the weights are determined such that items with higher priority have larger weights, the contribution of an item to the output correction amount becomes larger as its priority becomes higher. In this way, the correction amount calculation unit 14 may weight according to the priority and calculate a correction amount that balances the items.
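- The priority weighting just described might look like the following sketch, in which each improvement item proposes correction amounts per parameter and the output is their priority-weighted total; the items, proposals, and weights are hypothetical.

```python
# Hypothetical sketch of priority-weighted blending: each improvement item
# proposes a correction per parameter; the output is the weighted total.
PROPOSALS = {
    "cut_quality":  {"laser_output_kW": +0.2, "speed_m_min": -0.5},
    "productivity": {"laser_output_kW": +0.4, "speed_m_min": +1.0},
}

def blended_correction(priority: dict) -> dict:
    total = {}
    for item, corrections in PROPOSALS.items():
        weight = priority.get(item, 0.0)
        for param, amount in corrections.items():
            total[param] = total.get(param, 0.0) + weight * amount
    return total

# A higher productivity weight flips the sign of the speed correction.
print(blended_correction({"cut_quality": 0.3, "productivity": 0.7}))
```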
- FIG. 9 is a diagram showing a configuration example of the processing condition analyzing apparatus of the present embodiment when the correction amount is determined by reflecting the results of a plurality of trials.
- The processing condition analysis device 10a shown in FIG. 9 is the processing condition analysis device 10 shown in FIG. 1 with a processing condition storage unit 15 added.
- In the processing condition analysis device 10a, the evaluation results output from the evaluation unit 13 in one or more previous trials and the processing parameters corresponding to those evaluation results are stored as sets in the processing condition storage unit 15.
- The correction amount calculation unit 14 calculates the correction amount of the processing parameters based on the evaluation result output from the evaluation unit 13 and the past evaluation results and processing parameters stored in the processing condition storage unit 15. By calculating the correction amount based not only on the current information but also on past information in this way, the calculation accuracy of the correction amount can be improved.
- the correction amount can be calculated using a set of evaluation results and processing parameters for a plurality of times.
- For example, due to the influence of unobserved or unobservable state quantities, the same defective pattern may occur under a plurality of processing conditions.
- In such a case, a plurality of candidate correction conditions are possible.
- A more appropriate correction condition can be found by selecting one of the candidates and determining the correction amount in consideration of how the defective pattern changes in the next trial machining.
- Suppose, for example, that the processing condition analysis device 10a calculates the correction amount so as to lower the focus position, which is one of the processing parameters, and the cutting process is performed.
- The processing condition storage unit 15 stores the processing parameters set in this processing and the evaluation result corresponding to the result of the cutting process. When the evaluation result output from the evaluation unit 13 is not good, the correction amount calculation unit 14 lowers the focus position once more and a laser processing trial is performed.
- If the evaluation result is still not good, the correction amount calculation unit 14 may, based on the sets of processing parameters and evaluation values stored in the processing condition storage unit 15, calculate the correction amount so as to raise the focus position by an amount equal to or greater than the amount by which it was lowered over the two trials.
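- The focus-position example above can be sketched as follows, assuming the history of (processing parameter, evaluation) pairs is kept as in the processing condition storage unit 15; the step size and data structure are hypothetical.

```python
# Hypothetical sketch of the focus-position example: keep past (focus
# position, result) pairs, lower the focus twice, then back off upward by
# more than the total amount lowered if trials keep failing.
history = []   # list of (focus position [mm], result was good) tuples

def next_focus_correction(current_focus: float, is_good: bool,
                          step: float = 0.5) -> float:
    history.append((current_focus, is_good))
    if is_good:
        return 0.0
    failures = sum(1 for _, ok in history if not ok)
    if failures < 3:
        return -step                          # lower the focus position again
    total_lowered = history[0][0] - current_focus
    return total_lowered + step               # raise past the starting point
```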
- The processing condition analysis device 10a may include an input unit and accept, from the operator, input of the threshold values used to determine the stages when the evaluation unit 13 outputs stepwise evaluation values corresponding to each processing failure mode or evaluation values consisting of two determination results. The evaluation unit 13 then determines the evaluation value using the input threshold values. This allows a threshold value to be set according to each worker's requirements, the evaluation stages to be set finely or coarsely for each processing failure mode, and the evaluation criteria to be made strict or loose by the operator.
- the feature amount extraction unit 12, the evaluation unit 13, and the correction amount calculation unit 14 of the processing condition analysis device 10 are realized by a processing circuit.
- the processing circuit may be a circuit including a processor or may be dedicated hardware.
- the processing condition storage unit 15 is realized by a memory.
- FIG. 10 is a diagram showing a configuration example of the processing circuit according to the present embodiment.
- the processing circuit 100 shown in FIG. 10 includes a processor 101 and a memory 102.
- The functions of the feature amount extraction unit 12, the evaluation unit 13, and the correction amount calculation unit 14 are realized by the processor 101 reading out and executing a program stored in the memory 102. That is, when the feature amount extraction unit 12, the evaluation unit 13, and the correction amount calculation unit 14 are realized by the processing circuit 100 shown in FIG. 10, these functions are realized using a program, which is software.
- the memory 102 is also used as a work area of the processor 101.
- the processor 101 is a CPU (Central Processing Unit) or the like.
- the memory 102 corresponds to, for example, a RAM (Random Access Memory), a ROM (Read Only Memory), a nonvolatile or volatile semiconductor memory such as a flash memory, or a magnetic disk.
- When the processing circuit is dedicated hardware, the processing circuit is, for example, an FPGA (Field Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit).
- the feature amount extraction unit 12, the evaluation unit 13, and the correction amount calculation unit 14 may be realized by combining a processing circuit including a processor and dedicated hardware.
- the feature amount extraction unit 12, the evaluation unit 13, and the correction amount calculation unit 14 may be realized by a plurality of processing circuits.
- Adjusting the processing conditions for laser processing is a multi-dimensional parameter search, and it takes an extremely large number of trials to search for the processing conditions to obtain the processing quality above the desired level.
- According to the processing condition analysis devices 10 and 10a of the present embodiment, the feature amounts of the cut surface are extracted, the quality of the cut surface is determined for each of a plurality of processing failure modes, and the correction amounts of the processing parameters associated with the combination pattern of the quality determinations for the plurality of processing failure modes are determined, so that the processing conditions can be corrected with a small number of trials.
- Conventionally, skilled workers have adjusted the processing conditions based on experience and know-how.
- With the processing condition analysis device 10 or 10a of the present embodiment, the processing conditions can be adjusted without requiring the knowledge and know-how of a skilled worker.
- Further, production processing may be performed in the laser processing apparatus 20 using the processing conditions adjusted with the processing condition analysis device 10 or 10a of the present embodiment.
- In the technique described in Patent Document 1, measurement values such as the smoothness or surface roughness of the laser-cut surface are used as the observation data in the state quantity observation unit, so that the measurement takes time. Therefore, one cutting process trial takes a long time.
- On the other hand, in the present embodiment, the feature amount is extracted from the image of the cut surface 31, and the quality determination is performed for each processing failure mode of the cut surface 31 using the feature amount.
- Therefore, the time required for one trial can be reduced.
- FIG. 11 is a diagram showing a configuration example of a processing condition analysis device according to the second exemplary embodiment of the present invention.
- the laser processing system according to the present embodiment is the same as the laser processing system according to the first embodiment except that a processing condition analyzing apparatus 10b shown in FIG. 11 is provided instead of the processing condition analyzing apparatus 10.
- the constituent elements having the same functions as those in the first embodiment are designated by the same reference numerals as those in the first embodiment, and the duplicated description will be omitted.
- differences from the first embodiment will be mainly described.
- The processing condition analysis device 10b is the processing condition analysis device 10 of the first embodiment with a machine learning device 16 added and the evaluation unit 13 removed.
- the machine learning device 16 learns by associating the feature amount extracted by the feature amount extraction unit 12 with the evaluation value created by the worker (evaluation value by the worker).
- the evaluation value by the operator may be input, for example, from an input unit (not shown) or may be received from another device.
- the machine learning device 16 is realized by a processing circuit like the feature amount extraction unit 12, the evaluation unit 13, and the correction amount calculation unit 14 of the first embodiment.
- the machine learning device 16 includes a learning unit 161 and a data acquisition unit 162.
- The learning unit 161 learns sets of input and result data by machine learning. Any machine learning algorithm may be used in the learning unit 161; for example, a supervised learning algorithm can be used.
- the data acquisition unit 162 acquires the feature amount from the feature amount extraction unit 12 as an input to the learning unit 161, and inputs the acquired feature amount to the learning unit 161. Further, the evaluation value by the worker is also input to the learning unit 161.
- The evaluation value by the operator is the result of judging the quality of the cut surface 31 for each processing failure mode, and, like the evaluation value in the determination result by the evaluation unit 13 described in the first embodiment, it may be a stepwise value indicating a level or a continuous numerical value.
- the data acquisition unit 162 may acquire the image output from the image capturing device 11 as an input to the learning unit 161. In this way, the data acquisition unit 162 acquires the image output from the image capturing device 11 or the feature amount output from the feature amount extraction unit 12 as a state variable and supplies the state variable to the learning unit 161.
- the learning unit 161 can machine-learn whether the cut surface 31 is good or bad by using a data set including state variables and evaluation values.
- the data set is data in which state variables and evaluation data are associated with each other.
- the learning unit 161 outputs the evaluation value according to the feature amount by using the learned model by machine learning, so that the processing parameter can be corrected with higher accuracy.
- In the present embodiment, the learning unit 161 has both the function of machine learning the quality of the cut surface 31 and the function of the learned model, but an inference unit that outputs the evaluation value using the learned model may be provided separately from the learning unit 161. That is, the processing condition analysis device 10b may include an inference unit that calculates the combination pattern based on the cut surface information using the learned model learned by the learning unit 161.
- the machine learning device 16 is provided in the processing condition analysis device 10b, but may be a device different from the processing condition analysis device 10b.
- the processing condition analysis device 10b and the machine learning device 16 may be connected via a network.
- the machine learning device 16 may exist on the cloud server.
- FIG. 12 is a diagram showing a configuration example of the processing condition analysis device 10b of the present embodiment including the evaluation unit 13.
- In this configuration, the evaluation unit 13 calculates, for example, the combination pattern, that is, the plurality of evaluation values corresponding to each of the plurality of processing failure modes described in the first embodiment; the operator corrects the evaluation values, and the corrected result is input to the machine learning device 16.
- the algorithm for determining the evaluation value in the evaluation unit 13 and the threshold value for the determination may be appropriately changeable by the operator.
- the learning unit 161 learns the pass/fail evaluation result of the cut surface 31 by so-called supervised learning using, for example, a neural network model.
- Supervised learning is machine learning in which a large number of data sets, each a pair of an input and a result (label), are given to a learning device so that it learns the features contained in those data sets and estimates the result from an input.
- a neural network is composed of an input layer composed of multiple neurons, an intermediate layer composed of multiple neurons, also called a hidden layer, and an output layer composed of multiple neurons.
- the intermediate layer may be one layer or two or more layers.
- FIG. 13 is a diagram showing a configuration example of the neural network model according to the second embodiment.
- In the neural network model shown in FIG. 13, X1, X2, and X3 are neurons in the input layer, Y1 and Y2 are neurons in the intermediate layer, and Z1, Z2, and Z3 are neurons in the output layer.
- Each input value is multiplied by the corresponding weight w11 to w16 and input to the intermediate layer neurons Y1 and Y2. The output values from Y1 and Y2 are multiplied by the corresponding weights w21 to w26 and input to the output layer neurons Z1, Z2, and Z3, each of which adds its input values and outputs the result.
- The results output from Z1, Z2, and Z3 can be associated with the evaluation results corresponding to the respective processing failure modes. The output results change depending on the values of the weights w11 to w16 and w21 to w26.
- In the present embodiment, learning is performed by adjusting the weights w11 to w16 and w21 to w26 using the above-described data sets so that the output result of the neural network approaches the evaluation result of the quality of the cut surface 31, which is the correct answer.
- FIG. 13 is an example, and the number of layers of the neural network model and the number of neurons belonging to each layer are not limited to the example of FIG. 13.
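- The 3-2-3 network of FIG. 13 can be written out directly as follows; the tanh activation in the intermediate layer is an assumption, since the text specifies only the weight multiplications and the summation at the output layer.

```python
# Direct transcription of the 3-2-3 network of FIG. 13 (NumPy assumed).
# The intermediate-layer activation is an assumption for illustration.
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2))   # w11..w16: input layer -> intermediate layer
W2 = rng.normal(size=(2, 3))   # w21..w26: intermediate layer -> output layer

def forward(x: np.ndarray) -> np.ndarray:
    hidden = np.tanh(x @ W1)   # Y1, Y2
    return hidden @ W2         # Z1, Z2, Z3: per-failure-mode evaluation results

print(forward(np.array([0.2, 0.7, 0.1])))
# Training adjusts W1 and W2 so the outputs approach the labeled evaluations.
```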
- the learning unit 161 can also learn the evaluation result of the quality of the cut surface 31 by so-called unsupervised learning using the neural network model.
- Unsupervised learning is a method in which only a large amount of input data is given to the learning unit 161, without corresponding teacher output data, and the learning unit learns how the input data is distributed and how the input data should be compressed, classified, shaped, and so on. In unsupervised learning, for example, items having similar features in the input data set can be clustered. Prediction of the evaluation result can then be realized by setting some criterion so that the clustering result is optimized and assigning evaluation results to the clusters.
- the learning unit 161 of the present embodiment may realize machine learning by semi-supervised learning.
- the machine learning device 16 may acquire data sets from a plurality of processing condition analysis devices and learn the evaluation result of the quality of the cut surface 31.
- The plurality of processing condition analysis devices may be the processing condition analysis device 10b of the present embodiment, the processing condition analysis devices 10 and 10a of the first embodiment, or a combination thereof.
- The machine learning device 16 may acquire the data sets from a plurality of processing condition analysis devices used at the same site, or from a plurality of processing condition analysis devices operating at different sites. Furthermore, a processing condition analysis device from which data sets are acquired may be added or removed partway through.
- The machine learning device 16 may also be provided separately from the processing condition analysis device 10b; after learning from data sets acquired from one processing condition analysis device 10b, it may be connected to another processing condition analysis device 10b, acquire data sets from that device, and re-learn.
- In the above, the example was described in which the relationship between the image output from the image capturing device 11, or the feature amount output from the feature amount extraction unit 12, and the evaluation result of the quality of the cut surface 31 is learned.
- Instead, the relationship between the output image, or the feature amount output from the feature amount extraction unit 12, and the correction amount of the processing parameter may be learned.
- In this case, the data acquisition unit 162 acquires the image output from the imaging device 11, or the feature amount output from the feature amount extraction unit 12, together with the correction amount output from the correction amount calculation unit 14.
- The machine learning device 16 can then output the correction amount of each processing parameter based on the image output from the image capturing device 11 or the feature amount output from the feature amount extraction unit 12.
- That is, the processing condition analysis device 10b has an inference unit that calculates the correction amount of the processing parameter from the cut surface information, using the learned model trained by the learning unit 161.
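- The inference step can be pictured as below; the scikit-learn-style predict() interface and the parameter names are assumptions for illustration, not the patent's interface.

```python
# Hypothetical processing parameters whose corrections the learned model outputs.
PARAMS = ("laser_power", "feed_rate", "gas_pressure", "focus_position")

def infer_corrections(model, cut_surface_features):
    """Map cut-surface information to one correction amount per processing
    parameter, using a trained model with a scikit-learn-like predict()."""
    raw = model.predict([cut_surface_features])[0]
    return dict(zip(PARAMS, raw))

# Usage (assumed): corrections = infer_corrections(trained_model, features)
```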
- The data acquisition unit 162 may receive, as input to the learning unit 161, not only the image output from the imaging device 11 or the feature amount output from the feature amount extraction unit 12, but also the plate thickness at the cut surface 31, the material of the work 30, and the like.
- As the learning algorithm used in the learning unit 161, a neural network that learns the extraction of the feature amount itself, such as deep learning, can be used, and machine learning may also be executed according to other known methods such as genetic programming, functional logic programming, support vector machines, the Fisher discriminant method, the subspace method, discriminant analysis using Mahalanobis space, and the like.
- As learning algorithms used in the learning unit 161, decision trees, random forests, logistic regression, k-nearest neighbors (kNN), subspace methods, CLAFIC (CLAss-Featuring Information Compression method), Isolation Forest, LOF (Local Outlier Factor), boosting, AdaBoost, LogitBoost, One-Class SVM (Support Vector Machine), Gaussian mixture models, discriminant analysis, naive Bayes classifiers, and the like may also be used. When learning that automatically extracts the feature amount from an image is performed, such as deep learning or a convolutional neural network, the feature amount extraction unit 12 may be omitted. Further, a machine learning device 16 may be provided for each processing failure mode, or one machine learning device 16 may handle a plurality of processing failure modes. The operation of this embodiment other than that described above is the same as that of the first embodiment.
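- As one concrete instance from the list above, an Isolation Forest can be fitted to feature vectors of cut surfaces known to be good and then used to flag outliers as candidate failures. This sketch uses scikit-learn; the placeholder data stands in for real extracted features.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Placeholder training data: feature vectors of cut surfaces judged good.
good_features = np.random.default_rng(0).normal(size=(200, 5))
detector = IsolationForest(random_state=0).fit(good_features)

new_surface = np.zeros((1, 5))  # feature vector of a freshly cut surface
is_candidate_failure = detector.predict(new_surface)[0] == -1  # -1 marks an outlier
```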
- As described above, in the present embodiment, the pass/fail judgment of the cut surface 31 is machine-learned using the image output from the image capturing device 11, or the feature amount output from the feature amount extraction unit 12, together with the evaluation result of the quality of the cut surface 31. As a result, the same effect as that of the first embodiment can be obtained, and the correction amount of the processing parameter can be obtained more accurately than in the first embodiment.
- FIG. 14 is a diagram showing a configuration example of a processing condition analysis device according to the third embodiment of the present invention.
- The laser processing system according to the present embodiment is the same as the laser processing system according to the first embodiment, except that the processing condition analysis device 10c shown in FIG. 14 is provided instead of the processing condition analysis device 10.
- the constituent elements having the same functions as those in the first embodiment are designated by the same reference numerals as those in the first embodiment, and the duplicated description will be omitted.
- differences from the first embodiment will be mainly described.
- The processing condition analysis device 10c of the present embodiment has a display unit 17 and a change reception unit 18 added to the processing condition analysis device 10 of the first embodiment.
- the display unit 17 is realized by a display, a monitor, a touch panel, or the like.
- the change receiving unit 18 is an input means, and is realized by a keyboard, a mouse, buttons, a touch panel, or the like.
- The display unit 17 displays the image of the cut surface 31. That is, when an evaluation value calculated by the evaluation unit 13 indicates a defect, the display unit 17 highlights the part of the image on which the determination of the corresponding processing failure mode is based. Thereby, the operator can grasp from the image the cut surface 31 whose processing quality has been determined to be poor. In addition, by displaying the part on which a defect determination is based for each processing failure mode, the operator can see which part has a problem.
- The operator confirms the image of the cut surface 31 displayed on the display unit 17, and can use the change reception unit 18 to input changes to the determination result of the evaluation unit 13, to the threshold values used in that determination, and the like. Regarding the evaluation criteria of processing quality, the quality judgment may differ depending on the operator.
- the operator himself/herself can change the determination result by confirming the cut surface 31 displayed on the display unit 17.
- the evaluation unit 13 outputs an evaluation value that reflects the determination result input by the worker to the correction amount calculation unit 14.
- FIG. 15 is a diagram showing an example of a display screen displayed on the display unit 17 according to the present embodiment.
- an evaluation value and an image showing the determination result for each processing failure mode are displayed at the top.
- the correction amount of the processing parameter corresponding to the determination result is shown as the correction condition.
- In the example of FIG. 15, the evaluation unit 13 has determined that dross, scratches, and roughness have occurred, and the corresponding portion of each image is surrounded by a dotted line.
- The evaluation value, expressed as a numerical value such as level 1.0, is shown at the bottom of each image.
- For example, when the level of scratches as judged by the worker differs from the evaluation value determined by the evaluation unit 13, the worker changes the evaluation value via the change reception unit 18.
- the worker may change the threshold value for determining each level of the evaluation value, instead of changing the evaluation value itself. As a result, the correction amount of the processing parameter is changed so as to approach the processing quality desired by the operator.
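- That threshold mechanism can be sketched as follows; the number of levels and the threshold values are hypothetical.

```python
def to_level(value, thresholds=(0.5, 1.5, 2.5)):
    """Map a continuous evaluation value to a discrete level 0-3."""
    level = 0
    for t in thresholds:
        if value >= t:
            level += 1
    return level

# An operator who finds the scratch judgment too strict can raise the first
# threshold so that the same raw value now maps to a lower level:
assert to_level(0.6) == 1
assert to_level(0.6, thresholds=(0.7, 1.5, 2.5)) == 0
```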
- A machine learning device 16 may be provided in the processing condition analysis device 10c so that it learns the image output from the image capturing device 11, or the feature amount output from the feature amount extraction unit 12, together with the content of the changes made by the operator. Further, the display unit 17 and the change reception unit 18 may be added to the processing condition analysis device 10b of the second embodiment so that the machine learning device 16 learns the evaluation result after the change by the operator.
- As described above, in the present embodiment, the display unit 17 displays the image of the cut surface 31 and accepts changes to the evaluation result from the operator. As a result, the same effect as that of the first embodiment can be obtained, and a processing state that meets the operator's requirements can be realized.
- 10, 10a, 10b, 10c processing condition analysis device, 11 image capturing device, 12 feature amount extraction unit, 13 evaluation unit, 14 correction amount calculation unit, 15 processing condition storage unit, 16 machine learning device, 17 display unit, 18 change reception unit, 20 laser processing device, 21 control unit, 22 laser oscillator, 23 processing head, 30 workpiece, 161 learning unit, 162 data acquisition unit.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Plasma & Fusion (AREA)
- Mechanical Engineering (AREA)
- Laser Beam Processing (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019016102A JP6972047B2 (ja) | 2019-01-31 | 2019-01-31 | 加工条件解析装置、レーザ加工装置、レーザ加工システムおよび加工条件解析方法 |
JP2019-016102 | 2019-01-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020158201A1 (ja) | 2020-08-06
Family
ID=71841336
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/048498 WO2020158201A1 (ja) | 2019-01-31 | 2019-12-11 | 加工条件解析装置、レーザ加工装置、レーザ加工システムおよび加工条件解析方法 |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6972047B2 (ja)
WO (1) | WO2020158201A1 (ja)
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102020212510A1 (de) | 2020-10-02 | 2022-04-07 | Trumpf Werkzeugmaschinen Gmbh + Co. Kg | Verfahren und Vorrichtung zum Aufzeigen des Einflusses von Schneidparametern auf eine Schnittkante |
DE112021004692T5 (de) * | 2020-10-13 | 2023-07-06 | Fanuc Corporation | Maschinelle Lernvorrichtung, eine Steuervorrichtung und ein maschinelles Lernverfahren |
CN112894126B (zh) * | 2021-02-26 | 2023-01-06 | 广州德擎光学科技有限公司 | 激光加工过程检测参数调整方法和系统 |
US12079107B2 (en) | 2021-03-02 | 2024-09-03 | Mitsubishi Electric Corporation | Computer readable storage medium, debugging support device, debugging support method, and machine learning device |
EP4119284A1 (de) * | 2021-07-12 | 2023-01-18 | Bystronic Laser AG | Kalibrierung eines qualitätsschätzers für ein laserschneidverfahren |
JP7733420B2 (ja) * | 2021-10-20 | 2025-09-03 | 川崎重工業株式会社 | 作業監視装置及び作業監視方法 |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002239760A (ja) * | 2001-02-13 | 2002-08-28 | Amada Eng Center Co Ltd | レーザ加工機の加工条件決定方法およびその装置 |
JP2015027681A (ja) * | 2013-07-30 | 2015-02-12 | ブラザー工業株式会社 | レーザ加工システム、レーザ加工装置及びプログラム |
JP2016135492A (ja) * | 2015-01-23 | 2016-07-28 | パナソニックIpマネジメント株式会社 | レーザ切断部位の観察装置及びその方法 |
JP2017164801A (ja) * | 2016-03-17 | 2017-09-21 | ファナック株式会社 | 機械学習装置、レーザ加工システムおよび機械学習方法 |
JP2018190241A (ja) * | 2017-05-09 | 2018-11-29 | オムロン株式会社 | タスク実行システム、タスク実行方法、並びにその学習装置及び学習方法 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116157223A (zh) * | 2020-08-27 | 2023-05-23 | 三菱电机株式会社 | 激光加工装置 |
WO2022263207A1 (de) * | 2021-06-18 | 2022-12-22 | Trumpf Werkzeugmaschinen Gmbh + Co. Kg | Verfahren zur laserbearbeitung und laserbearbeitungsanlage sowie steuerungseinrichtung hierfür |
CN114769898A (zh) * | 2022-03-11 | 2022-07-22 | 大族激光科技产业集团股份有限公司 | 激光加工控制方法、装置及可读存储介质 |
CN114769898B (zh) * | 2022-03-11 | 2024-06-07 | 大族激光科技产业集团股份有限公司 | 激光加工控制方法、装置及可读存储介质 |
Also Published As
Publication number | Publication date |
---|---|
JP6972047B2 (ja) | 2021-11-24 |
JP2020121338A (ja) | 2020-08-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020158201A1 (ja) | 加工条件解析装置、レーザ加工装置、レーザ加工システムおよび加工条件解析方法 | |
CN113870172B (zh) | 指示用于训练的缺陷图像的数目的工件检查和缺陷检测系统 | |
JP5546317B2 (ja) | 外観検査装置、外観検査用識別器の生成装置及び外観検査用識別器生成方法ならびに外観検査用識別器生成用コンピュータプログラム | |
KR102388831B1 (ko) | 지능형 다중 초점 영상 융합 장치 및 방법 | |
US11367225B2 (en) | Image inspection apparatus | |
JP6731370B2 (ja) | 画像処理システム及び画像処理を行うためのコンピュータープログラム | |
JP2016114592A (ja) | 情報処理装置、情報処理方法、プログラム | |
WO2021153633A1 (ja) | 金属組織の相の分類方法、金属組織の相の分類装置、金属組織の相の学習方法、金属組織の相の学習装置、金属材料の材料特性予測方法および金属材料の材料特性予測装置 | |
US11501517B2 (en) | Individual identifying device | |
JP7393313B2 (ja) | 欠陥分類装置、欠陥分類方法及びプログラム | |
JP2022091270A (ja) | 方法、システム、および、コンピュータプログラム | |
CN114764189A (zh) | 用于评估图像处理结果的显微镜系统和方法 | |
CN113888459A (zh) | 一种基于自适应纹理特征的纹理图像疵点检测系统及方法 | |
WO2022044673A1 (ja) | 画像処理装置、検査システムおよび検査方法 | |
US20100091125A1 (en) | Template matching device, camera with template matching device, and program for allowing computer to carry out template matching | |
JP2022091269A (ja) | 方法、システム、および、コンピュータプログラム | |
JP7438311B2 (ja) | 画像処理システムおよび画像処理方法 | |
US12299870B2 (en) | Image inspection apparatus and image inspection method | |
CN116934734A (zh) | 基于图像的零件缺陷多路并行检测方法、装置及相关介质 | |
JP2011112810A (ja) | 画像処理方法および画像処理装置 | |
JP2003065969A (ja) | パターン検査装置および方法 | |
TWI884598B (zh) | 影像處理裝置、特徵提取器的學習方法、識別器的更新方法以及影像處理方法 | |
JP2009064100A (ja) | 画像処理装置およびゲイン調整方法 | |
JP4483038B2 (ja) | 検査装置 | |
JP2014142213A (ja) | 撮影パラメータ決定装置及びその制御方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19912853 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19912853 Country of ref document: EP Kind code of ref document: A1 |