WO2010057661A1 - Method and device for monitoring a laser processing operation to be performed on a workpiece, and laser processing head with such a device - Google Patents

Method and device for monitoring a laser processing operation to be performed on a workpiece, and laser processing head with such a device

Info

Publication number
WO2010057661A1
WO2010057661A1 (PCT/EP2009/008293)
Authority
WO
WIPO (PCT)
Prior art keywords
characteristic
current
control
space
workpiece
Prior art date
Application number
PCT/EP2009/008293
Other languages
German (de)
English (en)
French (fr)
Inventor
Ingo Stork Genannt Wersborg
Original Assignee
Precitec Kg
Precitec Itm Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from DE200810058422 external-priority patent/DE102008058422A1/de
Priority to CN200980147217.2A priority Critical patent/CN102281984B/zh
Priority to RU2011125317/02A priority patent/RU2529135C2/ru
Priority to MX2011005335A priority patent/MX2011005335A/es
Priority to CA2743518A priority patent/CA2743518C/en
Priority to BRPI0921514A priority patent/BRPI0921514A2/pt
Application filed by Precitec Kg, Precitec Itm Gmbh filed Critical Precitec Kg
Priority to KR1020117014138A priority patent/KR101787510B1/ko
Priority to US13/130,426 priority patent/US9492886B2/en
Priority to EP09760109.0A priority patent/EP2365889B1/de
Priority to JP2011536788A priority patent/JP5762968B2/ja
Publication of WO2010057661A1 publication Critical patent/WO2010057661A1/de
Priority to ZA2011/04527A priority patent/ZA201104527B/en
Priority to HK12105813.0A priority patent/HK1164786A1/zh

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B23: MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K: SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K26/00: Working by laser beam, e.g. welding, cutting or boring
    • B23K26/02: Positioning or observing the workpiece, e.g. with respect to the point of impact; Aligning, aiming or focusing the laser beam
    • B23K26/03: Observing, e.g. monitoring, the workpiece
    • B23K26/04: Automatically aligning, aiming or focusing the laser beam, e.g. using the back-scattered light
    • B23K26/046: Automatically focusing the laser beam

Definitions

  • the invention relates to a method and a device for monitoring a laser processing operation to be performed on a workpiece as well as to a laser processing head with such a device.
  • process monitoring systems and sensors are used both in a laser cutting process and in a laser welding process.
  • sensors are used to detect the radiation coming from a working or interaction zone determined by the working focus.
  • Radiation sensors for observing a plasma forming above the interaction zone and a back-reflection sensor are provided as standard and detect the reflection of the laser from the interaction zone between the laser beam and a workpiece to be machined.
  • temperature sensors or infrared sensors are used to monitor the laser processing operation, by which an edge melting and the temperature profile during processing can be monitored.
  • the laser processing operation is further monitored by cameras, which may also be sensitive in predetermined wavelength ranges. Due to the image processing of the images taken by the cameras, characteristic values for monitoring the laser processing process, such as parameters with regard to the melted workpiece area, can likewise be obtained.
  • the first objective of the monitoring systems is to first classify the processing quality according to the process specification.
  • the second goal is to improve the processing quality by controlling and regulating the processes.
  • the sensors and cameras used for process monitoring serve, by means of the acquired sensor data and methods of image processing and data analysis, to classify the current state of the machining process.
  • the processes used are individually adjusted to the machining processes.
  • the current machining process is classified as insufficient, with appropriate control mechanisms being used to remedy this condition.
  • the regulation of the process parameters with respect to the recorded sensor data relates only to the respective measurement data of the corresponding sensors.
  • An object of the present invention is to provide a method and apparatus for monitoring a laser processing operation to be performed on a workpiece by which the classification of a laser processing state and thereby the processing quality of a laser processing operation to be performed on a workpiece are improved.
  • a method for monitoring, controlling or regulating a laser processing operation to be carried out on a workpiece comprising the following steps: detecting at least two current measured values by means of at least one sensor monitoring the laser processing operation, determining at least two current characteristic values from the at least two current measured values, wherein the at least two current characteristic values jointly represent a current fingerprint in a characteristic space, providing a predetermined point quantity in the characteristic value space, and classifying the laser processing operation by detecting the position of the current fingerprint relative to the predetermined point quantity in the characteristic value space.
  • the inventive method additionally comprises a step of regulating at least one process parameter of an associated actuator such that, when the current fingerprint leaves the predetermined point set of the characteristic space, at least one actuator is activated so that the change of the associated process parameter corresponds to a gradient in the characteristic space which extends from the fingerprint in the direction of the predetermined point set.
  • the determination of a current characteristic value from at least one current measured value comprises a method for data reduction or dimension reduction such as a principal component analysis, multidimensional scaling, support vector machines or a support vector classification. Owing to the dimension reduction of the sensor data, the classification can be performed much faster by a computer, so that, for example, fast control of a laser processing operation becomes possible.
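  • As an illustration of such a dimension reduction, the following sketch (hypothetical, using scikit-learn; the placeholder sensor array, the channel count and the number of kept components are assumptions, not values from the patent) reduces high-dimensional sensor samples to a few characteristic values that together form a fingerprint:

```python
# Minimal sketch: reduce high-dimensional sensor samples to a low-dimensional
# "fingerprint" with a principal component analysis (PCA).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
raw_measurements = rng.normal(size=(500, 64))   # 500 samples of a 64-channel sensor vector (placeholder data)

pca = PCA(n_components=3)                       # keep the 3 strongest principal components
training_fingerprints = pca.fit_transform(raw_measurements)

# During processing, a newly acquired measurement vector is projected into the
# same characteristic space to obtain the current fingerprint.
current_measurement = rng.normal(size=(1, 64))
current_fingerprint = pca.transform(current_measurement)
print(current_fingerprint.shape)                # -> (1, 3): three characteristic values
```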
  • the determination of a current characteristic value from at least one current measured value takes place with the aid of a neural network.
  • since the measurement data often do not directly allow inferences about a processing situation, it is advantageous if the predetermined point set within the characteristic space is determined by means of a learning process.
  • the gradient field of the characteristic space is determined as a function of the process parameters in different regions, wherein the gradient of the characteristic space with respect to a process parameter is determined by varying that process parameter at a predetermined point of the characteristic value space.
  • the at least one sensor is selected from a group which comprises at least one photodiode with filters for specific wavelengths, structure-borne and airborne sound sensors, and at least one camera unit with corresponding surface illumination.
  • the camera images acquired by the at least one camera unit are recorded with different exposure times and combined with one another via a high-dynamic-range (HDR) method.
  • the at least one actuator is selected from a group which comprises a control of the laser power, a control of the speed of the machining head relative to the workpiece, a control of the focal position of the machining laser beam, a control of the distance of the machining head to the workpiece, and a control of the lateral offset.
  • an apparatus for carrying out the method according to the invention is provided, which has at least one sensor for monitoring the laser processing operation, suitable for detecting at least two current measured values, a data processing unit for determining at least two characteristic values from the at least two current measured values in order to create a current fingerprint in a characteristic space, a storage unit for storing a predetermined point set within the characteristic space, and a classification unit suitable for evaluating the laser processing operation by detecting the position of the current fingerprint relative to the predetermined point set in the characteristic space.
  • the device further comprises a control unit for controlling at least one process parameter of an associated actuator such that, when the current fingerprint leaves the point set of the characteristic space, the at least one actuator is activated so that the change of the associated process parameter corresponds to a gradient in the characteristic space which extends from the fingerprint in the direction of the predetermined point set.
  • the at least one sensor is selected from a group which comprises at least one photodiode with filters for specific wavelengths, structure-borne and airborne sound sensors, and at least one camera unit with corresponding surface illumination.
  • the at least one actuator is advantageously selected from a group which comprises a control of the laser power, a speed control of the machining head relative to the workpiece, a control of the focus position of the machining laser beam, a control of the distance of the machining head to the workpiece, and a control of the lateral offset.
  • a laser processing head for processing a workpiece by means of a laser beam is provided according to the invention, which comprises the device according to the invention.
  • FIG. 1 shows a flowchart with the essential components of the machining process of a workpiece according to the method according to the invention
  • FIG. 2 shows an overview of the sensors used in the method according to the invention for monitoring and detecting the laser processing process
  • FIG. 3 shows a greatly simplified schematic view of the components used in a machining process according to the invention
  • FIG. 4A shows a greatly simplified schematic view of part of the actuators used in the method according to the invention in a laser beam welding process
  • FIG. 4B shows a greatly simplified schematic view of part of the actuators used in the method according to the invention in a laser-cutting process
  • FIG. 5A shows a flowchart of the generation of a fingerprint according to the method according to the invention using linear as well as non-linear dimension reducers
  • FIG. 5B shows a flow chart of the generation of a fingerprint according to the method according to the invention using a neural network
  • FIG. 6A is a flowchart of the classification process according to the method of the invention using linear and nonlinear dimensional reducers.
  • FIG. 6B shows a flow chart of the classification process according to the method according to the invention using a neural network
  • FIG. 7 is a schematic diagram illustrating an error detection method
  • FIG. 8 shows a flow diagram which illustrates the learning of the fingerprints or the characteristic values or features according to the invention
  • FIG. 9 is a flow chart of the dimensional reduction method according to the invention.
  • FIG. 10 shows a flow chart of the assessment of the current machining process according to the invention
  • FIG. 11 is a flow chart of the estimation of new control parameters according to the invention.
  • FIG. 12 is a schematic view of a camera image processed with an HDR method according to the invention
  • FIG. 13 shows a block diagram of an HDR image sequence processing according to the invention.
  • FIG. 14 shows a flow chart of a classification process using a reinforcement learning method in a laser processing operation according to the invention
  • FIG. 15 is a flow chart of a classification process using a discriminant analysis method in a laser processing operation according to the invention.
  • FIG. 16 is a flowchart of a control operation by means of set values obtained by dimensional reduction in a laser processing operation according to the invention.
  • a cognitive laser material processing system which has cognitive abilities through the use of machine learning and self-learning algorithms.
  • the associated method according to the invention can be used in laser material processing for process observation, process control, and process control.
  • a system can have two types of cognitive abilities: First, it appears to an external observer as if the observed system had cognitive abilities, such as the ability to learn and improve independently. Second, the system realizes cognitive abilities in a similar way to a natural organism, such as the human brain.
  • the system according to the invention possesses cognitive abilities such as learning as well as the independent recognition and improvement of errors that are used in laser material processing.
  • the use of cognitive skills is particularly advantageous in the field of laser material processing. Machining processes such as cutting or joining workpieces are very different from process to process. So far, it is known to set each process manually individually first. After setting the process parameters, the process is only observed and adjusted accordingly manually. For example, if a next batch of workpieces is dirty or the workpiece thickness deviates from the previous batch of workpieces, the process often has to be readjusted manually. An automatic adaptation to process changes was either not at all possible or only possible to a very limited extent. In fact, the requirements of vehicle manufacturers who want to produce several vehicles in a production line are such that the production systems can adapt quickly and adaptively to the machining processes.
  • the fast learning of machining processes and the recognition, improvement and avoidance of errors during the machining are requirements that are met by the cognitive capabilities of the machining system according to the invention.
  • in Fig. 1, a flowchart of the method according to the invention with its essential components is shown schematically; these are explained below step by step.
  • all relevant information of the machining process is detected according to the invention with a sensor system having at least one sensor.
  • the sensors used are used to obtain a multiplicity of measured values and information about the process in order to be able to determine features, process images, process characteristics or clear fingerprints of the process, which are referred to below as characteristic values, from the measurement data of the sensors monitoring the processing process.
  • the determination is carried out in particular by calculating or another suitable, preferably electronic processing of the measured values.
  • an overview of the sensors used according to the invention is shown in FIG. 2, and a structure of a laser processing system according to the invention with the corresponding sensors is shown in FIG. 3.
  • in addition to already known sensors for monitoring a laser processing process, sensors for detecting structure-borne and airborne sound are additionally used according to the invention.
  • for sound recording, it is expedient to use at least two sensors each for structure-borne and airborne sound.
  • the sensor signals for structure-borne and airborne sound are filtered, amplified and sampled in a process-dependent preprocessing. For airborne sound recording, different directional characteristics are suitable.
  • from the signals of several sound sensors, the locations of the sound sources and the direction of propagation can be calculated, as sketched below. This also makes it possible to reduce noise from non-relevant sources and background noise, or to apply methods such as active noise cancellation.
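  • A minimal sketch of how a propagation direction could be estimated from two airborne-sound sensors (the sampling rate, microphone spacing and synthetic signals are illustrative assumptions, not values from the patent): the time delay between the two channels is estimated by cross-correlation and converted into an angle of incidence.

```python
# Sketch: estimate the time delay of arrival (TDOA) between two microphones by
# cross-correlation and convert it into an angle of incidence.
import numpy as np

fs = 48_000          # sampling rate in Hz (assumed)
mic_distance = 0.05  # microphone spacing in metres (assumed)
c = 343.0            # speed of sound in air, m/s

def angle_of_incidence(sig_a: np.ndarray, sig_b: np.ndarray) -> float:
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)        # lag in samples
    delay = lag / fs                                # lag in seconds
    # clip to the physically possible range before taking the arcsine
    s = np.clip(delay * c / mic_distance, -1.0, 1.0)
    return float(np.degrees(np.arcsin(s)))

# synthetic test: the same acoustic burst arrives 3 samples later at microphone B
rng = np.random.default_rng(1)
burst = rng.normal(size=2_000)
sig_a = burst
sig_b = np.roll(burst, 3)
print(f"estimated angle of incidence: {angle_of_incidence(sig_a, sig_b):.1f} deg")
```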
  • Sensors for emission detection of specific wavelengths which are preferably photodiodes that are sensitive for a specific wavelength range are also provided in the laser processing head.
  • optical bandpass filters for selecting specific wavelength ranges may additionally be arranged in front of the corresponding photodiodes. The measured values of these sensors are also recorded and sampled.
  • cameras which observe the laser processing operation and in particular the laser processing zone are used for data acquisition.
  • an in-process camera can be used, whose observation beam path is coaxially coupled into the beam path of the working laser in the processing head, so as to image the laser processing zone.
  • a camera can also record the processing process outside the processing head.
  • a leading camera, called a pre-process camera, and a trailing camera, called a post-process camera, can also capture the laser processing process.
  • Various workpiece lighting concepts are suitable for camera acquisition, depending on the machining process.
  • suitable as illumination are light-emitting diodes, which are inexpensive and can radiate in a wide wavelength range, or lasers of different wavelengths with appropriate optics for focusing on the camera detail on the workpiece surface.
  • data processing methods such as "region of interest”, “qualas”, or geometry data evaluation are particularly suitable and preferred.
  • a high dynamic range (HDR) method is used, which advantageously increases the contrast ratio of the captured camera images.
  • the inventive method is not limited to the use of a plurality of sensors, but can already be carried out using only one sensor, such as the in-process camera.
  • in laser material processing, a control program is normally designed manually for all involved actuators. During the process, this control program is only monitored via process monitoring or adjusted with firmly defined control circuits, such as capacitive distance sensors during laser cutting.
  • the laser beam power, the distance between the machining head and the workpiece, the speed of the machining head relative to the workpiece, and the position of the focal point of the machining laser beam are controlled.
  • in the case of laser cutting, the processing method additionally controls or regulates the supply of process gas.
  • control signals can be modulated in their intensity with a certain frequency, for example a modulation of the laser radiation intensity between 90 and 100 percent. Since the control signal is known, the system response contained in the sensor data can provide insights into the process, for example how to recover a gradient field of the characteristic value space as a function of the process parameters in different measuring ranges; a sketch of this idea follows below.
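  • One conceivable reading of this idea (a sketch only; the modulation frequency, the toy process model and the demodulation by correlation are assumptions) modulates a process parameter sinusoidally and correlates the observed characteristic value with the modulation in order to estimate the local gradient of the characteristic value with respect to that parameter:

```python
# Sketch: estimate d(feature)/d(laser_power) by modulating the laser power with a
# small sine wave and demodulating the observed characteristic value (lock-in idea).
import numpy as np

fs, f_mod = 1_000.0, 10.0            # sampling rate and modulation frequency in Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
p0, dp = 2_000.0, 50.0               # nominal laser power and modulation amplitude in W (assumed)
power = p0 + dp * np.sin(2 * np.pi * f_mod * t)

def observed_feature(p):
    # placeholder process model: the feature grows by about 0.002 per watt, plus noise
    return 0.002 * p + 0.05 * np.random.default_rng(2).normal(size=p.shape)

feature = observed_feature(power)

# correlate the mean-free feature with the known modulation to recover the local slope
ref = np.sin(2 * np.pi * f_mod * t)
amplitude = 2.0 * np.mean((feature - feature.mean()) * ref)   # in-phase amplitude of the response
gradient = amplitude / dp                                     # d(feature)/d(power)
print(f"estimated gradient: {gradient:.4f} per W")            # ~0.002
```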
  • the controls can be realized via corresponding linear axes, robot control or other control interfaces.
  • the method according to the invention is not limited to the use of the multiplicity of actuators, but can already be carried out using only one actuator, for example a laser power controller for laser welding or a process gas controller for laser cutting.
  • the technical cognition has to be abstracted from the sensor data, so that the system according to the invention can autonomously make decisions for the actuation of the actuators.
  • the system can be trained by an operator of the system and is self-learning.
  • the invention provides that the system independently already knows or acquires and learns the essential characteristic values from all the sensors used and then makes decisions for the process control.
  • three stages of the method according to the invention will be illustrated, namely the learning of the process environment, the classification of the current process result and the control or regulation of the process.
  • for learning the process environment, reference runs or test processing operations are necessary. Each machining process has a desired result and an undesired one.
  • the test processing or reference run must contain both results and ideally also the transitions, as well as the reaction of the system to the process control. If, for example, a weld with a defined weld width of X mm and a length of Y cm is to be achieved in a lap joint of stainless steel, then at least one reference run must be carried out in which at least one process parameter is varied such that both compliance with the definition and the violation of the definition in both directions of the process parameter are contained.
  • the human system operator can, for example, carry out a reference run with an increasing laser power as the process parameter, in the course of which the upper and lower definition limits are reached and exceeded.
  • a reference run can start with a laser power which does not yet cause any welding.
  • the laser power is controlled to increase continuously until a weld seam forms. This process is monitored and used to learn the process environment using the described process sensors, which record the corresponding measured values.
  • Another example concerns production problems between two batches of greasy and non-greasy workpieces.
  • the definition limits must be included for learning during the reference run.
  • the operator tells the cognitive laser material processing system where the definition limits lie, so that the system according to the invention can learn to distinguish between the areas.
  • linear and non-linear dimension reducers and manifold learning methods such as Principal Component Analysis (PCA), MDS (Multidimensional Scaling), LLE (Locally Linear Embedding), and SVM (Support Vector Machines) can be used to learn the process environment. These methods can be used both in combination and alone. A discriminant analysis can furthermore be used to learn the process environment, as described below.
  • PCA Principal Component Analysis
  • MDS Multidimensional Scaling
  • LLE Locally Linear Embedding
  • SVM Support Vector Machines
  • KNN artificial neural network
  • with an artificial neural network, the procedure is different, since the network is trained and the learned information is then available in the network, which can subsequently classify the result.
  • the output neurons thus initially provide a classification based on the trained data. Control can then be based on this classification, as sketched below.
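  • A minimal sketch of such a trained network (hypothetical; scikit-learn's MLPClassifier and the toy fingerprints and labels stand in for the trained network and the recorded process data):

```python
# Sketch: train a small neural network to classify fingerprints as inside /
# outside the desired range; the output can then be used to drive the control.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)
fingerprints = rng.normal(size=(400, 3))                  # learned 3-dimensional fingerprints (placeholder)
labels = (np.linalg.norm(fingerprints, axis=1) < 1.5)     # True = inside the desired range (toy rule)

net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(fingerprints, labels)

current = rng.normal(size=(1, 3))
print("inside desired range:", bool(net.predict(current)[0]))
print("class probabilities:", net.predict_proba(current)[0])
```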
  • the current process result must be detected, compared with the previously learned target range, which can be regarded as a point set in the characteristic space, and, if necessary, the process parameters must be adapted, as shown in FIGS. 6A and 6B.
  • the process parameter adaptation can and should already take place before exiting the target range.
  • the predetermined point quantity which is used for the regulation of the system can be adapted such that in a control case the current fingerprint of the sensor system already leaves the predetermined point set at a time when the fingerprint enters an edge region of the desired range.
  • the cognitive laser material processing system has already stored in the database of a memory the learned process environment, the learned features or fingerprints in the form of a vector or a matrix.
  • the measured values currently acquired by the process sensors must first be reduced in data volume and brought into the same data space, i.e. the characteristic space, as the learned feature vectors or fingerprints; a current fingerprint is thus obtained as a reduced sensor data vector or matrix in the characteristic space, which is compared with the learned point set in the characteristic value space.
  • in this way, the probability that the currently detected data point lies closest to a certain learned feature point can be obtained. For this feature point it is known whether it still lies within the desired range, and furthermore the probably necessary correction of the process parameter is known.
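  • One simple way to obtain such a probability (a sketch under the assumption that each learned feature point is stored together with a class label and a parameter correction; a softmax over negative distances serves as the probability model, which is an illustrative choice, not the patent's):

```python
# Sketch: match the current fingerprint against learned feature points and turn
# the distances into a probability (softmax over negative distances).
import numpy as np

rng = np.random.default_rng(4)
learned_points = rng.normal(size=(50, 3))            # learned fingerprints (placeholder)
in_desired_range = rng.random(50) > 0.3              # stored label per learned point
power_correction = rng.normal(scale=20.0, size=50)   # stored parameter correction in W (placeholder)

def classify(current, scale=1.0):
    d = np.linalg.norm(learned_points - current, axis=1)
    p = np.exp(-d / scale)
    p /= p.sum()                                     # probability that each learned point is the closest match
    best = int(np.argmax(p))
    return p[best], in_desired_range[best], power_correction[best]

prob, ok, corr = classify(rng.normal(size=3))
print(f"best match probability {prob:.2f}, inside range: {ok}, suggested correction: {corr:+.1f} W")
```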
  • the classification of the current process result by means of neural networks is carried out by the trained network.
  • the classification result is whether the process is still within the target range and with which trend the process parameter is to be adapted.
  • the control of the process according to the inventive method is carried out in the following manner.
  • the control unit already knows the direction and the strength with which the corresponding actuators must be activated.
  • Various control methods can be used. For example, the minimization of the geodesic distance between desired feature vector and result vector or a control method with Kalman filter and minimization of the mean square error can be used.
  • the control tendency can be determined from the multidimensional feature spaces or characteristic value spaces via the support vector classification. The controller must not exceed the previously defined safety range; a sketch of the basic control step follows below.
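  • The following sketch illustrates the basic control idea only (the learned gradient, the target centroid, the gain and the power limits are assumptions): when the current fingerprint leaves the target point set, one process parameter is changed along the learned gradient so that the fingerprint is driven back towards the target region, limited to a safety range.

```python
# Sketch: proportional correction of one process parameter (laser power) that
# drives the current fingerprint back towards the learned target point set.
import numpy as np

target_centroid = np.array([0.0, 0.0, 0.0])      # centre of the learned target point set (assumed)
target_radius = 1.0                              # fingerprints within this distance count as "in range"
grad_power = np.array([0.01, -0.02, 0.005])      # learned d(fingerprint)/d(laser power) per W (assumed)
gain = 0.5                                       # controller gain
power_limits = (1_500.0, 2_500.0)                # safety range for the laser power in W (assumed)

def control_step(fingerprint: np.ndarray, power: float) -> float:
    error = target_centroid - fingerprint                    # desired movement in the characteristic space
    if np.linalg.norm(error) <= target_radius:
        return power                                         # still inside the target point set: no action
    # project the desired movement onto the direction achievable with this actuator
    delta = gain * float(error @ grad_power) / float(grad_power @ grad_power)
    return float(np.clip(power + delta, *power_limits))

print(control_step(np.array([1.2, -0.8, 0.3]), 2_000.0))     # corrected laser power
```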
  • the invention can be applied in several process variants, some of which are presented here.
  • the cognitive laser material processing system uses the PCA, Principal Components Analysis or a combination of the other methods to calculate the proposed dimension reducers from the sensor data.
  • the operator now tells the system at which points on the workpiece a burr has arisen and where no burr has arisen.
  • the cognitive system can then calculate, from the information about where burr formation has occurred at the cutting edge and at which points corresponding principal components were calculated from the sensor data, the corresponding component, a vector or matrix, which contains the collected features or the fingerprint for burr formation. From the current sensor data in the further operation of the system, it can then be calculated with matrix-vector algebra during the process, and shown to the operator, whether the learned error has occurred.
  • the same procedure can be used to detect, for example, the following effects in laser beam welding or laser cutting: false friend, seam collapse, penetration, loss of cut, cut-through state, cut edge roughness, burning effects, weld width Y, partial penetration state, full penetration state, joint cross-section state, gap in the lap joint, gap in the butt joint, lateral offset, ejections, pores, holes.
  • the invention can also be used to simplify a batch change which previously required an adaptation of the laser material processing system.
  • the workpieces of the new batch have slightly changed properties, eg material thickness or degree of contamination.
  • again, first a learning phase and then a classification phase are carried out. After the classification phase, a control process can already be realized.
  • in this way, new control parameters can be estimated for a process change that occurs, for example, as a result of a batch change.
  • in the learning phase according to FIG. 8, the measured values of the process sensor system are acquired during a reference run.
  • constant process control parameters are again set, except for a control parameter that is varied.
  • the laser power can be increased steadily during reference travel.
  • the recorded data are processed by the cognitive laser material processing system with dimension reducers, cf. FIG. 9.
  • the output data of each sensor used are first filtered with a corresponding low-pass filter. Then the n principal components are output via the principal component analysis. The data are then normalized and mean-centered; a sketch of this chain follows below.
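  • A sketch of this preprocessing chain (illustrative only; the filter order, cutoff frequency, sampling rate and the number of principal components are assumptions):

```python
# Sketch of the preprocessing chain: low-pass filter each sensor channel, extract
# the n strongest principal components, then mean-centre and normalise them.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import PCA

fs, cutoff, n_components = 10_000.0, 500.0, 5     # sampling rate, cutoff (Hz), kept components (assumed)
rng = np.random.default_rng(5)
sensor_data = rng.normal(size=(4_096, 32))        # samples x sensor channels (placeholder)

b, a = butter(4, cutoff / (fs / 2), btype="low")  # 4th-order Butterworth low-pass
filtered = filtfilt(b, a, sensor_data, axis=0)    # zero-phase filtering per channel

features = PCA(n_components=n_components).fit_transform(filtered)
features -= features.mean(axis=0)                 # remove the mean
features /= features.std(axis=0)                  # normalise to unit variance
print(features.shape)                             # -> (4096, 5)
```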
  • the corresponding features or fingerprints and their mapping rule are stored in a database of feature mapping rules.
  • the operator of the system now defines an area on the workpiece that corresponds to the desired result. This definition is transformed into a vector with which a classifier can be trained. In order to be able to perform a classification, support vector machines are used in this process.
  • specifically, a support vector classification method is used. This is a mathematical method for distinguishing desired and undesired process results, performed by a multi-dimensional separation of the feature space based on the specifications of the operator; a sketch follows below.
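  • A minimal sketch of such a support vector classification (scikit-learn is used as a stand-in; the labelled fingerprints are placeholders for the operator-defined good and bad regions):

```python
# Sketch: train a support vector classifier that separates the feature space into
# desired and undesired process results, based on operator-labelled fingerprints.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(6)
fingerprints = rng.normal(size=(300, 5))                       # reduced sensor data (placeholder)
desired = (fingerprints[:, 0] + 0.5 * fingerprints[:, 1]) > 0  # operator labels (toy rule)

clf = SVC(kernel="rbf", probability=True).fit(fingerprints, desired)

current = rng.normal(size=(1, 5))
print("desired result:", bool(clf.predict(current)[0]))
print("P(desired):", round(float(clf.predict_proba(current)[0, 1]), 3))
```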
  • the database of feature mapping rules describes the mapping rule, and the classification database describes the separation of the feature spaces.
  • the cognitive laser material processing system monitors the machining process according to the previously learned operator wishes.
  • the sensor data is dimensionally reduced based on the specifications of the particular feature mapping rules.
  • the output data are located in the predetermined feature space or feature value space.
  • the classification data learned from the operator through the support vector classification procedure are used to assess the current machining process. It can be judged whether the current process is within the user-defined desired range and, via a probability, with which tendency the process control parameter should be adjusted in order to control the process.
  • the estimation of new control parameters or process parameters for small process changes caused by a batch change will now be described. If the machining process is changed for a certain duration, e.g. by slightly changed workpiece characteristics after a batch change, the new control parameters can be estimated. For this purpose, in addition to the previous reference run 1, a new reference run 2 must be performed. Reference runs 1 and 2 use the same control parameters.
  • the sensor data or the measured values of the sensors of reference run 2 are again reduced in dimension.
  • the mapping rules are now applied to the recorded sensor data of reference run 1.
  • the occurrence probabilities of the features from reference run 1 during reference run 2 are calculated.
  • the cognitive laser material processing system can thus calculate, from the position on the workpiece or from the control parameters used here and the occurrence probabilities of the features, which control parameters in the new process will produce a result very similar to or almost the same as in the previous machining process.
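  • One simplified way to read this estimation (a sketch; the toy feature model, the nearest-neighbour matching and the power values are assumptions, not the patent's exact procedure): for a desired old operating point, the most similar feature in reference run 2 is looked up, and the control parameter that was set there in the new run is taken as the estimate.

```python
# Sketch: find the control parameter of the new batch (run 2) whose features best
# match the features of the desired operating point of the old batch (run 1).
import numpy as np

power_ramp = np.linspace(1_500, 2_500, 201)                    # laser power ramp used in both runs (assumed, W)

def process_features(power, batch_offset=0.0):
    """Toy model of the reduced sensor features as a function of laser power."""
    p = (power - batch_offset) / 1_000.0
    return np.stack([p, p ** 2, np.sin(p)], axis=1)

features_run1 = process_features(power_ramp)                   # old batch
features_run2 = process_features(power_ramp, batch_offset=80)  # new batch needs ~80 W more for the same result

def estimate_new_power(desired_old_power: float) -> float:
    old_feature = process_features(np.array([desired_old_power]))[0]
    idx = int(np.argmin(np.linalg.norm(features_run2 - old_feature, axis=1)))
    return float(power_ramp[idx])

print(estimate_new_power(2_000.0))   # ~2080 W reproduces the old 2000 W result on the new batch
```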
  • features are obtained from the process data as in the previously described methods. These features are classified through initial and periodic reference runs assessed by the operator, with an appropriate judgement of whether the control parameter should be adjusted.
  • the corresponding characteristics and the associated classifications are stored in a database, possibly with an adjustment proposal.
  • the operator therefore assesses the system at regular intervals and thus trains it.
  • the system can thus first determine whether the current process result is still in the specified feature space and whether the system should carry out an adaptation of the control parameters.
  • the learned features and adaptation proposals thus become more and more numerous over time, and the system becomes better and better at processing. Similar features and adaptation proposals can be recalculated and merged in order to avoid a flood of features.
  • HDR High Dynamic Range
  • an imaging sensor is either read out several times, i.e. at least twice, at different times per image, or multiple images, i.e. two, three or more images, are recorded with different exposure times or with multiple cameras and then combined with one another into at least one image.
  • This procedure enables a picture, image sequence or video recording, which at the same time makes the surrounding processing area, the process lighting and the vapor capillary or keyhole visible in an image.
  • in an image acquisition of laser processing processes, the intensity values of the regions mentioned are distributed over a wide range, which can be made visible in a single image by said method.
  • a picture or image sequence thus created is displayed adapted via a gray value or tone mapping method.
  • a plurality of images or pixel arrays are offset with one another.
  • the different images can be created by multiple scanning of an imaging sensor or by simultaneous image acquisition with multiple cameras or by sequential image acquisition with a camera, but different exposure times, called multi-exposure technique.
  • the combination of the individual images can be carried out using different types of procedures. In the simplest case, this includes adding up and averaging the individual image values of several images of an image sequence from at least two image recordings. For a better image result, the image values or pixels of an image sequence from at least two image recordings can be averaged in a weighted manner. Either an entropy method can be used as the weighting method, for weighting according to the information content, or a weighted averaging can be carried out taking into account the camera response function. For this purpose, the real or realistic radiation energy per area has to be estimated; with the quantities defined below, this corresponds to the weighted average
  • $x_j = \dfrac{\sum_i w_{ij}\, \frac{f^{-1}(y_{ij})}{t_i}}{\sum_i w_{ij}}$
  • the weighting of the individual radiation energies is given by the weighting function w_ij of the reliability model, where
  • i is the image index of an image sequence of several image recordings,
  • j is the pixel position,
  • t_i is the exposure time or scan time of image recording i,
  • y_ij is the intensity value of the pixel of image recording i at position j,
  • f^{-1} is the inverse camera response function,
  • x_j is the estimated radiation energy per area at pixel position j, and
  • w_ij is the weighting function of the reliability model.
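  • A compact sketch of this weighted HDR fusion (the linear camera response, the hat-shaped weighting function and the synthetic exposure series are assumptions made for the example; only the weighted-average structure follows the formula above):

```python
# Sketch: fuse an exposure series into one radiance map by weighted averaging of
# f^-1(y_ij)/t_i, as in the formula above (linear camera response assumed).
import numpy as np

exposure_times = np.array([0.5e-3, 2e-3, 8e-3])          # t_i in seconds (assumed)
rng = np.random.default_rng(8)
true_radiance = rng.uniform(0.0, 2_000.0, size=(64, 64)) # synthetic scene

# simulate the exposure series of an 8-bit camera with a linear response
images = [np.clip(true_radiance * t * 255.0 / 4.0, 0, 255).astype(np.uint8) for t in exposure_times]

def weight(y):                                           # reliability model: trust mid-range pixels most
    return np.maximum(1e-3, 1.0 - np.abs(y / 255.0 - 0.5) * 2.0)

num = np.zeros_like(true_radiance)
den = np.zeros_like(true_radiance)
for img, t in zip(images, exposure_times):
    y = img.astype(np.float64)
    f_inv = y * 4.0 / 255.0                              # inverse of the (linear) camera response
    w = weight(y)
    num += w * f_inv / t                                 # w_ij * f^-1(y_ij) / t_i
    den += w
hdr = num / den                                          # estimated radiation energy per area x_j
print(hdr.shape, float(np.corrcoef(hdr.ravel(), true_radiance.ravel())[0, 1]))  # correlation close to 1
```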
  • the invention explicitly relates to the use of these illustrated HDR image processing methods in processing methods such as cutting or joining of materials, in particular with laser processing heads and / or the process monitoring system according to the invention connected thereto.
  • any sensor that enables a sensor data output can be used as the sensor system.
  • these are, for example, microphones or structure-borne sound pickups, cameras, photodiodes, tactile probes, technical evaluation and monitoring signals, and actuator parameters such as the laser power.
  • PCA Principal Component Analysis
  • ICA Independent Component Analysis
  • Wavelet analysis
  • Fourier and fast Fourier analysis
  • Laplace analysis
  • feature and object recognition methods, locally linear embedding, artificial neural networks, multidimensional scaling, and much more.
  • the reduced amount of data can be interpreted as a point cloud of a multi-dimensional space obtained from a higher-dimensional space.
  • by reducing the data, it is possible to compare them in finite time with previously recorded and classified or learned data sets. In this classification, it can be determined whether the new sensor data are similar to already recorded sensor data, and this similarity is assigned a probability. If a defined threshold for the likelihood of similarity to a previously recorded data set is exceeded, the solution or control or regulation approach previously stored for it can be followed. If the threshold for the likelihood of similarity to the previously learned data sets is not reached, the system is facing a new situation.
  • the behavior for a new situation can either be learned by inquiring from a human operator or can be tried out of the previous data and solution strategies according to the similarity principle.
  • self-learning algorithms are used, which then check after a target, after trying out a self-developed approach, whether a goal has been achieved and evaluate the chosen approach accordingly.
  • Support Vector Machines, Support Vector Classification, Fuzzy Logic, Information Fuzzy Networks, Fuzzy K-Nearest Neighbor Classifier, K-Nearest Neighbor Classifier, Reinforcement Learning, Bayesian Networks and Bayesian Knowledge Databases, Naive Bayes Classifiers, Hidden Markov Chains, Artificial Neural Networks and Backpropagation, Regression Analysis, Genetic Programming or Decision Trees.
  • the solution strategy resulting from the classification, or a controller or actuator control, can simply be executed, but it can also control the type of data acquisition. If, for example, no threshold is reached for a known data set, then the type of data acquisition can be changed. For example, this can be done by adapting a wavelet analysis to new frequency ranges or by switching from PCA to ICA.
  • High Dynamic Range method (HDR method)
  • An HDR method may be used to calculate a higher contrast ratio from multiple captured images or image matrices and vectors having different contrast ratios. For this purpose, when taking a picture or observing a scene, several pictures with different exposure times can be taken, from which subsequently a picture or a picture sequence with improved contrast ratio can be calculated. In order to produce a sequence of images with different contrast ratios, several images with different exposure time can be recorded, according to the so-called multi-exposure method.
  • the pixel values can also be scanned repeatedly during an exposure time. In this way, an image sequence with different contrast ratios during an exposure time is created.
  • an imaging sensor can also be scanned repeatedly during an exposure time.
  • Charges representing the pixels are retrieved once and thereafter can not be retrieved a second time.
  • however, there are techniques such as non-destructive readout (NDRO), multi-slope or single-slope readout, cooled imagers, charge injection imaging (CIS), thin-film on CMOS (TFC), active pixel sensors (APS) or correlated double sampling (CDS) that allow a charge, for example on a CMOS chip, to be interrogated several times during a single exposure period without destroying the interrogated charge value.
  • these techniques can be used for observing a laser processing process in order to realize an observation or control method, whereby the HDR method makes it possible to observe simultaneously the process emissions, the vapor capillary, the weld pool and the weld seam geometry during a laser welding operation to be carried out.
  • RL Reinforcement Learning
  • RL refers to a field of machine learning. It describes procedures in which systems or agents apply actions to environments in order to maximize a reward. RL finds mapping rules or strategies (policies) that map one or more system states onto system actions. The methods of RL can be used according to the invention for self-improving control and observation of laser processing processes.
  • FIG. 14 shows a possible procedure for how RL can be integrated in a laser processing process.
  • the values to be learned are symbolized by the matrix Q.
  • the Q matrix consists of the components QS1, ..., QSn, QDA, QDR, QR1, ..., QRm; these can each contain one or more values. These components are initialized with a start value and optimized according to an RL procedure. This optimization takes place by performing an action, evaluating it with a reward function, and modifying the values of the Q matrix accordingly, comparable to a theater where an actor is judged by a critic and the actor adjusts his actions. As described above, a point cloud with an appropriate classification by a human expert can be obtained in a reference run or a learning phase.
  • the characteristics or point clouds or characteristics or fingerprints or sensor measured values which represent the desired process result are therefore stored in this.
  • This can be realized by a support vector machine or another classification method.
  • this can serve as a reward function towards which the RL method works.
  • the Q matrix is thus optimized according to this human-taught reward function.
  • in this way, weighting values or setting parameters can be learned and optimized, such as the weighting of the different sensors relative to one another (QS1, ..., QSn), the selection of special features used for control or observation (QDA), the selection of setpoints for various control methods (QDR), or controller setting parameters such as the proportional (P), integral (I) and derivative (D) components (QR1, ..., QRm); a small sketch follows below.
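  • A very small sketch of this idea (illustrative only; the discretised states, the toy process model, the reward rule and the power-step actions are assumptions): a tabular Q-learning update that learns which laser-power adjustment to take in a coarsely discretised process state, rewarded when the process lands in the desired range.

```python
# Sketch: tabular Q-learning of laser-power adjustments; the reward is +1 when the
# (toy) process state ends up in the desired range, otherwise -1.
import numpy as np

rng = np.random.default_rng(9)
n_states, actions = 11, np.array([-50.0, 0.0, +50.0])      # discretised deviation states, power steps in W
Q = np.zeros((n_states, len(actions)))                     # the Q matrix to be optimised
alpha, gamma, eps = 0.2, 0.9, 0.1                          # learning rate, discount, exploration

def step(state, action_idx):
    """Toy process model: state 5 is 'in the desired range'; actions push the state."""
    drift = int(np.sign(actions[action_idx]))
    new_state = int(np.clip(state + drift + rng.integers(-1, 2), 0, n_states - 1))
    reward = 1.0 if new_state == 5 else -1.0               # critic: learned classification acts as reward
    return new_state, reward

state = 0
for _ in range(5_000):
    a = rng.integers(len(actions)) if rng.random() < eps else int(np.argmax(Q[state]))
    new_state, r = step(state, a)
    Q[state, a] += alpha * (r + gamma * np.max(Q[new_state]) - Q[state, a])
    state = new_state

print("preferred power step per state:", actions[np.argmax(Q, axis=1)])
```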
  • MDP Markov Decision Process
  • Q-Learning
  • AHC Adaptive Heuristic Critic
  • SARSA State-Action-Reward-State-Action
  • ART Adaptive Resonance Theory
  • MCA Multivariate Analysis
  • EM Expectation-Maximization Algorithm
  • Radial Basis Function Network Time Series Prediction
  • ATR Automatic Target Recognition
  • RBF Radial Basis Function
  • a discriminant analysis (DA), also called linear discriminant analysis (LDA) or Fisher's linear discriminant, is a statistical analysis method whose principle is similar to that of the principal component analysis already described. In contrast to the principal component analysis, the DA also takes the class membership of a classification into account. The DA can alternatively be used for dimension reduction in the method according to the invention, but at the same time represents a combination of dimension reduction and classification method.
  • sensor data can be acquired, reduced in dimension and classified with previously learned data using a method as already described.
  • the classification result can then be used as the basis for an actual value calculation for one or more controllers with learned setpoint values for the control of one or more actuators or control parameters.
  • the DA can be combined with other dimension reduction methods in laser material processing; for example, a principal component analysis can be carried out first and then a DA, as sketched below.
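  • A short sketch of such a combination (scikit-learn stands in for the described methods; the data, dimensions and labels are placeholders): a principal component analysis first reduces the raw sensor vector, and a linear discriminant analysis then projects onto the direction that best separates the operator-defined classes and classifies new data.

```python
# Sketch: chain a PCA (unsupervised dimension reduction) with an LDA, which also
# uses the class labels and simultaneously acts as a classifier.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(10)
X_good = rng.normal(loc=0.0, size=(200, 40))         # desired process results (placeholder)
X_bad = rng.normal(loc=0.6, size=(200, 40))          # undesired process results, slightly shifted
X = np.vstack([X_good, X_bad])
y = np.array([1] * 200 + [0] * 200)                  # 1 = desired, 0 = undesired (operator labels)

model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
model.fit(X, y)

current = rng.normal(loc=0.1, size=(1, 40))
print("classified as desired:", bool(model.predict(current)[0]))
```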
  • this also applies to the other dimension reduction methods already described, which reduce a sensor data input vector of dimension Y to a dimension X with X < Y.
  • the combinations may differ for the respective sensors.
  • the already mentioned Independent Component Analysis which extracts features for statistical independence, is particularly suitable for acoustic sensors and the principal component analysis for imaging sensors.
  • further dimension reduction methods can be used according to the invention in the described laser material processing system: kernel principal component analysis (kernel PCA), locally linear embedding (LLE), Hessian LLE, Laplacian eigenmaps, local tangent space alignment (LTSA), semidefinite embedding (SDE), maximum variance unfolding (MVU), curvilinear component analysis (CCA), data-driven high-dimensional scaling (DD-HDS), autoencoders as a special variant of a feed-forward artificial neural network, Boltzmann machines, as well as all methods based on a similar principle.
  • a principal component analysis or other dimensional reduction methods or a feature extraction or an HDR method can also be carried out on a cellular neural network (CNN) integrated in an image acquisition unit in a laser processing system for particularly fast data processing.
  • CNN is a parallel calculation method similar to an artificial neural network.
  • for faster data processing, a laser processing process can also be controlled directly with set values obtained from a dimension reduction; a classification can then be used to determine the best set values with an optimization of the signal-to-noise ratio. In this way, very high control rates can be realized, while high adaptability is retained by taking the learned classification results into account.

Landscapes

  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Plasma & Fusion (AREA)
  • Mechanical Engineering (AREA)
  • Laser Beam Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
PCT/EP2009/008293 2008-11-21 2009-11-20 Verfahren und vorrichtung zur überwachung eines an einem werkstück durchzuführenden laserbearbeitungsvorgangs sowie laserbearbeitungskopf mit einer derartigen vorrichtung WO2010057661A1 (de)

Priority Applications (11)

Application Number Priority Date Filing Date Title
JP2011536788A JP5762968B2 (ja) 2008-11-21 2009-11-20 工作物に対して実施されるべきレーザ加工作業をモニタリングするための方法および装置、ならびにかかる装置を有するレーザ加工ヘッド
RU2011125317/02A RU2529135C2 (ru) 2008-11-21 2009-11-20 Способ и устройство для контроля проводимого на обрабатываемой детали процесса лазерной обработки, а также лазерная обрабатывающая головка с подобным устройством
MX2011005335A MX2011005335A (es) 2008-11-21 2009-11-20 Metodo y dispositivo para monitorear una operacion de procesamiento laser a ser realizada sobre una pieza de trabajo y cabeza de procesamiento laser que tiene dicho dispositivo.
CA2743518A CA2743518C (en) 2008-11-21 2009-11-20 Method and device for monitoring a laser processing operation to be performed on a workpiece and laser processing head having such a device
BRPI0921514A BRPI0921514A2 (pt) 2008-11-21 2009-11-20 método e dispositivo para monitorar uma operação de processamento a laser, a ser executada em uma peça a trabalhar e cabeçote de processamento a laser tendo tal dispositivo.
CN200980147217.2A CN102281984B (zh) 2008-11-21 2009-11-20 用于监视要在工件上实施的激光加工过程的方法和装置以及具有这种装置的激光加工头
KR1020117014138A KR101787510B1 (ko) 2008-11-21 2009-11-20 공작물 레이저 가공 작업 모니터 방법 및 장치와 그 장치를 구비한 레이저 가공 헤드
US13/130,426 US9492886B2 (en) 2008-11-21 2009-11-20 Method and device for monitoring a laser processing operation to be performed on a workpiece and laser processing head having such a device
EP09760109.0A EP2365889B1 (de) 2008-11-21 2009-11-20 Verfahren und vorrichtung zur überwachung eines an einem werkstück durchzuführenden laserbearbeitungsvorgangs sowie laserbearbeitungskopf mit einer derartigen vorrichtung
ZA2011/04527A ZA201104527B (en) 2008-11-21 2011-06-20 Method and device for monitoring a laser processing operation to be performed on a workpiece and laser processing head having such a device
HK12105813.0A HK1164786A1 (zh) 2008-11-21 2012-06-14 用於監視要在工件上實施的激光加工過程的方法和裝置以及具有這種裝置的激光加工頭

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
DE102008058422.3 2008-11-21
DE200810058422 DE102008058422A1 (de) 2008-11-21 2008-11-21 Verfahren und Vorrichtung zur Überwachung eines an einem Werkstück durchzuführenden Laserbearbeitungsvorgangs sowie Laserbearbeitungskopf mit einer derartigen Vorrichtung
DE102009033881 2009-07-20
DE102009033881.0 2009-07-20
EP09010230.2 2009-08-07
EP09010230 2009-08-07
EP09011375.4 2009-09-04
EP09011375 2009-09-04

Publications (1)

Publication Number Publication Date
WO2010057661A1 true WO2010057661A1 (de) 2010-05-27

Family

ID=41666804

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/EP2009/008293 WO2010057661A1 (de) 2008-11-21 2009-11-20 Verfahren und vorrichtung zur überwachung eines an einem werkstück durchzuführenden laserbearbeitungsvorgangs sowie laserbearbeitungskopf mit einer derartigen vorrichtung
PCT/EP2009/008294 WO2010057662A1 (de) 2008-11-21 2009-11-20 Verfahren und vorrichtung zur überwachung eines an einem werkstück durchzuführenden laserbearbeitungsvorgangs sowie laserbearbeitungskopf mit einer derartigen vorrichtung

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/EP2009/008294 WO2010057662A1 (de) 2008-11-21 2009-11-20 Verfahren und vorrichtung zur überwachung eines an einem werkstück durchzuführenden laserbearbeitungsvorgangs sowie laserbearbeitungskopf mit einer derartigen vorrichtung

Country Status (11)

Country Link
US (2) US9492886B2 (zh)
EP (2) EP2365889B1 (zh)
JP (2) JP5763542B2 (zh)
KR (2) KR101700896B1 (zh)
CN (2) CN102281984B (zh)
CA (2) CA2743522C (zh)
HK (1) HK1164786A1 (zh)
MX (2) MX2011005335A (zh)
RU (2) RU2529135C2 (zh)
WO (2) WO2010057661A1 (zh)
ZA (2) ZA201104527B (zh)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012056913A1 (en) * 2010-10-26 2012-05-03 Sintokogio, Ltd. Evaluation method and evaluation system for impact force of laser irradiation during laser peening and laser peening method and laser peening system
WO2012107331A1 (de) * 2011-02-07 2012-08-16 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Vorrichtung und verfahren zur überwachung und insbesondere zur regelung eines laserschneidprozesses
WO2012163545A1 (de) 2011-06-03 2012-12-06 Lessmüller Lasertechnik GmbH Verfahren zum überwachen der bearbeitung sowie vorrichtung zum bearbeiten eines werkstücks mit einem hochenergetischen bearbeitungsstrahl
WO2016001234A1 (de) * 2014-07-01 2016-01-07 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Verfahren und vorrichtung zum bestimmen einer werkstoffart und/oder einer oberflächenbeschaffenheit eines werkstücks
US10241324B2 (en) 2016-12-19 2019-03-26 Fanuc Corporation Machine learning device for learning procedure for aligning optical part of light source unit, and light-source unit manufacturing apparatus
WO2020104103A1 (de) * 2018-11-22 2020-05-28 Precitec Gmbh & Co. Kg Überwachung eines laserarbeitungsprozesses mithilfe von tiefen faltenden neuronalen netzen
WO2020239328A1 (de) * 2019-05-29 2020-12-03 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Automatische materialerkennung mit laser
WO2021105364A3 (de) * 2019-11-29 2021-07-22 Sms Group Gmbh Steuerungssystem für eine industrielle anlage, insbesondere für eine anlage zur herstellung oder verarbeitung von metallischen bändern oder blechen und verfahren zum steuern einer industriellen anlage, insbesondere einer anlage zur herstellung oder verarbeitung von metallischen bändern oder blechen
WO2022263207A1 (de) * 2021-06-18 2022-12-22 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Verfahren zur laserbearbeitung und laserbearbeitungsanlage sowie steuerungseinrichtung hierfür
DE102011122991B3 (de) 2011-06-03 2023-06-22 Lessmüller Lasertechnik GmbH Verfahren zum Überwachen der Bearbeitung sowie Vorrichtung zum Bearbeiten eines Werkstücks mit einem hochenergetischen Bearbeitungsstrahl
CN117283143A (zh) * 2023-10-08 2023-12-26 广东省源天工程有限公司 用于海洋内水下作业机器人的防腐蚀控制系统及方法
EP4368330A1 (de) * 2022-11-14 2024-05-15 Bystronic Laser AG Steuerung einer laserschneidmaschine mittels luftschallsignalen

Families Citing this family (85)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012000650A1 (en) * 2010-06-28 2012-01-05 Precitec Kg A method for classifying a multitude of images recorded by a camera observing a processing area and laser material processing head using the same
WO2012040916A1 (zh) * 2010-09-29 2012-04-05 东北大学 基于递归核主元分析的连续退火过程故障监测方法
JP2012148316A (ja) * 2011-01-19 2012-08-09 Keyence Corp レーザー加工装置
DE102012202279B4 (de) * 2012-02-15 2014-06-05 Siemens Aktiengesellschaft Sicherstellung einer Prüfabdeckung bei einer manuellen Inspektion
US10315275B2 (en) * 2013-01-24 2019-06-11 Wisconsin Alumni Research Foundation Reducing surface asperities
US10464172B2 (en) 2013-02-21 2019-11-05 Nlight, Inc. Patterning conductive films using variable focal plane to control feature size
US9842665B2 (en) 2013-02-21 2017-12-12 Nlight, Inc. Optimization of high resolution digitally encoded laser scanners for fine feature marking
CN105144346B (zh) 2013-02-21 2017-12-15 恩耐公司 多层结构的激光刻图
US20140255620A1 (en) * 2013-03-06 2014-09-11 Rolls-Royce Corporation Sonic grain refinement of laser deposits
CN103264227B (zh) * 2013-04-11 2015-05-13 温州大学 一种激光直接刻蚀聚合物基体表面覆盖金属膜的方法
DE102013218421A1 (de) * 2013-09-13 2015-04-02 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Vorrichtung und Verfahren zur Überwachung, insbesondere zur Regelung, eines Schneidprozesses
US11440141B2 (en) * 2013-09-13 2022-09-13 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Devices and methods for monitoring, in particular for regulating, a cutting process
KR102380352B1 (ko) * 2013-11-08 2022-03-30 써머툴 코포레이션 용접 프로세스를 위한 열 에너지 감지 및 분석
US9764469B1 (en) * 2013-12-13 2017-09-19 University Of South Florida Generating robotic trajectories with motion harmonics
DE102014202176B4 (de) * 2014-02-06 2015-10-22 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Verfahren zum Identifizieren einer Randkontur einer an einem Bearbeitungskopf gebildeten Öffnung und Bearbeitungsmaschine
US10069271B2 (en) 2014-06-02 2018-09-04 Nlight, Inc. Scalable high power fiber laser
US10618131B2 (en) 2014-06-05 2020-04-14 Nlight, Inc. Laser patterning skew correction
CN104007689B (zh) * 2014-06-09 2017-02-15 益阳市和天电子有限公司 一种电容生产现场用的智能监控系统
CN105720463B (zh) 2014-08-01 2021-05-14 恩耐公司 光纤和光纤传输的激光器中的背向反射保护与监控
JP2016078044A (ja) * 2014-10-14 2016-05-16 三菱電機株式会社 加工状態判定装置及び加工機
US10112262B2 (en) 2014-10-28 2018-10-30 General Electric Company System and methods for real-time enhancement of build parameters of a component
RU2599920C2 (ru) * 2014-11-20 2016-10-20 Федеральное государственное бюджетное образовательное учреждение высшего профессионального образования "Владимирский государственный университет имени Александра Григорьевича и Николая Григорьевича Столетовых" Устройство управления технологическим процессом лазерного термоупрочнения
CN104880992B (zh) * 2014-12-23 2018-08-17 湘潭大学 基于Kriging代理模型的磁控电弧传感器参数优化方法
US9837783B2 (en) 2015-01-26 2017-12-05 Nlight, Inc. High-power, single-mode fiber sources
US10050404B2 (en) 2015-03-26 2018-08-14 Nlight, Inc. Fiber source with cascaded gain stages and/or multimode delivery fiber with low splice loss
JP6675420B2 (ja) 2015-05-13 2020-04-01 バイストロニック レーザー アクチェンゲゼルシャフト レーザ加工装置
CN104966088B (zh) * 2015-06-02 2018-10-23 南昌航空大学 基于成组小波-变分关联向量机断口图像识别方法
US10520671B2 (en) 2015-07-08 2019-12-31 Nlight, Inc. Fiber with depressed central index for increased beam parameter product
JP2017030067A (ja) * 2015-07-30 2017-02-09 ファナック株式会社 加工時間測定機能とオンマシン測定機能を有する制御装置付き加工装置
JP2017030088A (ja) * 2015-07-31 2017-02-09 ファナック株式会社 機械学習装置、ネジ締付システムおよびその制御装置
JP6438366B2 (ja) * 2015-08-28 2018-12-12 ファナック株式会社 電動機に対する動作指令を学習する機械学習方法および機械学習装置並びに該機械学習装置を備えた制御装置および電動機装置
US9989495B2 (en) 2015-11-19 2018-06-05 General Electric Company Acoustic monitoring method for additive manufacturing processes
US10073060B2 (en) 2015-11-19 2018-09-11 General Electric Company Non-contact acoustic inspection method for additive manufacturing processes
US10232439B2 (en) 2015-11-20 2019-03-19 General Electric Company Gas flow monitoring in additive manufacturing
US9989396B2 (en) 2015-11-20 2018-06-05 General Electric Company Gas flow characterization in additive manufacturing
EP3380266B1 (en) 2015-11-23 2021-08-11 NLIGHT, Inc. Fine-scale temporal control for laser material processing
US11179807B2 (en) 2015-11-23 2021-11-23 Nlight, Inc. Fine-scale temporal control for laser material processing
US10839302B2 (en) 2015-11-24 2020-11-17 The Research Foundation For The State University Of New York Approximate value iteration with complex returns by bounding
US10295820B2 (en) 2016-01-19 2019-05-21 Nlight, Inc. Method of processing calibration data in 3D laser scanner systems
JP6339603B2 (ja) * 2016-01-28 2018-06-06 ファナック株式会社 レーザ加工開始条件を学習する機械学習装置、レーザ装置および機械学習方法
US10831180B2 (en) * 2016-02-25 2020-11-10 General Electric Company Multivariate statistical process control of laser powder bed additive manufacturing
JP6625914B2 (ja) * 2016-03-17 2019-12-25 ファナック株式会社 Machine learning device, laser machining system, and machine learning method
DE102016206402A1 (de) * 2016-04-15 2017-10-19 Bühler Motor GmbH Centrifugal pump motor
CN106041296B (zh) * 2016-07-14 2017-07-28 武汉大音科技有限责任公司 Online dynamic-vision laser precision machining method
US10295845B2 (en) 2016-09-29 2019-05-21 Nlight, Inc. Adjustable beam characteristics
US10732439B2 (en) 2016-09-29 2020-08-04 Nlight, Inc. Fiber-coupled device for varying beam characteristics
US10730785B2 (en) 2016-09-29 2020-08-04 Nlight, Inc. Optical fiber bending mechanisms
JP6438450B2 (ja) * 2016-11-29 2018-12-12 ファナック株式会社 Machine learning device for learning the machining sequence of a laser machining robot, robot system, and machine learning method
US11205103B2 (en) 2016-12-09 2021-12-21 The Research Foundation for the State University Semisupervised autoencoder for sentiment analysis
RU2638267C1 (ru) * 2017-01-09 2017-12-12 Федеральное государственное бюджетное образовательное учреждение высшего образования "Владимирский Государственный Университет имени Александра Григорьевича и Николая Григорьевича Столетовых" (ВлГУ) Method of laser lap welding of structural steel sheets and aluminium alloys
RU2646515C1 (ru) * 2017-02-02 2018-03-05 Федеральное государственное бюджетное образовательное учреждение высшего образования "Казанский национальный исследовательский технический университет им. А.Н. Туполева-КАИ" (КНИТУ-КАИ) Universal laser optical head
CN110651218B (zh) 2017-04-04 2022-03-01 恩耐公司 Apparatus, systems and methods for galvanometer scanner calibration
DE102017208630B4 (de) * 2017-05-22 2022-03-31 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Method for determining the efficiency or efficiency changes of a slag extraction system, and associated laser processing machine
DE102017210098B4 (de) * 2017-06-16 2024-03-21 Jenoptik Optical Systems Gmbh Scanning device with a scan head unit for reflecting or transmitting beams for a scanner, and method for reflecting or transmitting beams for a scanner
CN110020648B (zh) * 2018-01-10 2022-03-22 上银科技股份有限公司 Workpiece measurement and positioning method
JP6767416B2 (ja) * 2018-03-26 2020-10-14 ファナック株式会社 Machining condition adjustment device and machine learning device
JP7017985B2 (ja) * 2018-06-05 2022-02-09 株式会社日立製作所 System and method for determining processing conditions
EP3586973B1 (en) * 2018-06-18 2024-02-14 Rolls-Royce Corporation System control based on acoustic and image signals
CN109031954B (zh) * 2018-08-03 2021-06-25 北京深度奇点科技有限公司 Reinforcement-learning-based welding parameter determination method, welding method and device
US11534961B2 (en) 2018-11-09 2022-12-27 General Electric Company Melt pool monitoring system and method for detecting errors in a multi-laser additive manufacturing process
US11020907B2 (en) 2018-12-13 2021-06-01 General Electric Company Method for melt pool monitoring using fractal dimensions
US11285671B2 (en) 2018-12-13 2022-03-29 General Electric Company Method for melt pool monitoring using Green's theorem
US10894364B2 (en) 2018-12-13 2021-01-19 General Electric Company Method for melt pool monitoring using geometric length
US10828837B2 (en) 2018-12-13 2020-11-10 General Electric Company Method for melt pool monitoring using algebraic connectivity
US10828836B2 (en) 2018-12-13 2020-11-10 General Electric Company Method for melt pool monitoring
CN110132975B (zh) * 2019-03-28 2022-04-12 中核建中核燃料元件有限公司 Method and device for inspecting the cladding surface of nuclear fuel rods
US11447935B2 (en) 2019-04-30 2022-09-20 Deere & Company Camera-based boom control
US11653101B2 (en) 2019-05-17 2023-05-16 Samsung Electronics Co., Ltd. Imaging system for generating high dynamic range image
CN110133643B (zh) * 2019-05-22 2021-08-20 北京林业大学 Plant root system detection method and device
DE102019208036A1 (de) * 2019-06-03 2020-12-03 Robert Bosch Gmbh Method and device for monitoring a welding process
DE112020002341B4 (de) 2019-06-13 2024-06-20 Mitsubishi Electric Corporation Machining defect detection device, laser cutting device, and electrical discharge machining device
RU2723493C1 (ru) * 2019-07-15 2020-06-11 федеральное государственное бюджетное образовательное учреждение высшего образования "Пермский национальный исследовательский политехнический университет" Laser welding method with monitoring of the weld seam formation process
CN114341754B (zh) * 2019-08-28 2023-05-26 百超激光有限公司 Method, device and medium for controlling the movement of a laser cutting head in a cutting process
EP3786736A1 (en) * 2019-08-28 2021-03-03 Bystronic Laser AG Control for a laser cutting head movement in a cutting process
US11878365B2 (en) 2019-11-20 2024-01-23 Concept Laser Gmbh Focus adjustment and laser beam caustic estimation via frequency analysis of time traces and 2D raster scan data
DE102019220485A1 (de) * 2019-12-20 2021-06-24 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Method for determining and correcting the machine state of a machine tool, and diagnostic system
US12017300B2 (en) 2020-03-12 2024-06-25 Concept Laser Gmbh Cross stitching control by QMM3D
US12017301B2 (en) 2020-03-13 2024-06-25 General Electric Company Systems and methods for compression, management, and analysis of downbeam camera data for an additive machine
EP3885069A1 (de) 2020-03-25 2021-09-29 Bystronic Laser AG Quality control of a laser machining process by means of machine learning
CN111968072B (zh) * 2020-07-07 2024-04-02 南昌大学 Autonomous decision-making method, based on a Bayesian network, for the welding position of thick-plate T-joints
DE102020210988A1 (de) 2020-08-31 2022-03-03 Fronius International Gmbh Laser hybrid welding method and laser hybrid welding apparatus for welding workpieces
CN112149725B (zh) * 2020-09-18 2023-08-22 南京信息工程大学 Spectral-domain graph convolution 3D point cloud classification method based on the Fourier transform
CN112044874B (zh) * 2020-09-27 2022-02-25 厦门理工学院 Real-time monitoring system for laser cleaning and monitoring method thereof
RU2763706C1 (ru) * 2021-03-16 2021-12-30 федеральное государственное бюджетное образовательное учреждение высшего образования "Казанский национальный исследовательский технический университет им.А.Н. Туполева - КАИ" Method of laser welding of dissimilar metal alloys
CN117532279B (zh) * 2024-01-08 2024-03-19 山西金鼎泰金属制品股份有限公司 Method for machining connecting flanges for high-pressure pipelines

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5768137A (en) * 1995-04-12 1998-06-16 American Research Corporation Of Virginia Laser aligned robotic machining system for use in rebuilding heavy machinery
JPH08309578A (ja) 1995-05-23 1996-11-26 Toyota Motor Corp Method for judging weld bead quality
US5940296A (en) * 1995-11-06 1999-08-17 Medar Inc. Method and system for interactively developing a graphical control-flow structure and associated application software for use in a machine vision system
DE59709824D1 (de) * 1996-07-29 2003-05-22 Elpatronic Ag Bergdietikon Method and device for edge tracking and edge inspection
DE19716293C2 (de) 1997-04-18 2000-07-13 Daimler Chrysler Ag Device for regulating the focus position in laser beam welding
RU2155653C2 (ru) * 1998-06-08 2000-09-10 Курский государственный технический университет Video sensor device
US6532454B1 (en) * 1998-09-24 2003-03-11 Paul J. Werbos Stable adaptive control using critic designs
JP2001276980A (ja) 2000-03-30 2001-10-09 Matsushita Electric Ind Co Ltd Joining device
JP2002064304A (ja) * 2000-08-22 2002-02-28 Taiyo Yuden Co Ltd Laminated dielectric filter
US7173245B2 (en) * 2001-01-04 2007-02-06 The Regents Of The University Of California Submicron thermal imaging method and enhanced resolution (super-resolved) AC-coupled imaging for thermal inspection of integrated circuits
TW565684B (en) 2001-02-14 2003-12-11 Honda Motor Co Ltd Welding state monitoring device
JP3512388B2 (ja) 2001-02-15 2004-03-29 川崎重工業株式会社 Laser machining monitoring device
JP2002346783A (ja) * 2001-05-30 2002-12-04 Denso Corp Laser welding control method and device therefor
PT1448334E (pt) * 2001-11-15 2011-06-28 Precitec Vision Gmbh & Co Kg Method and device for detecting the quality of a weld seam during the welding of workpieces
SE521787C2 (sv) * 2002-04-05 2003-12-09 Volvo Aero Corp Device and method for monitoring a weld area, arrangement and method for controlling a welding operation, computer program and computer program product
JP2006088163A (ja) * 2004-09-21 2006-04-06 Fanuc Ltd Laser device
US7364306B2 (en) * 2005-06-20 2008-04-29 Digital Display Innovations, Llc Field sequential light source modulation for a digital display system
US8615374B1 (en) * 2006-06-09 2013-12-24 Rockwell Automation Technologies, Inc. Modular, configurable, intelligent sensor system
US8084706B2 (en) * 2006-07-20 2011-12-27 Gsi Group Corporation System and method for laser processing at non-constant velocities
US8242426B2 (en) * 2006-12-12 2012-08-14 Dolby Laboratories Licensing Corporation Electronic camera having multiple sensors for capturing high dynamic range images and related methods
CN101636745A (zh) 2006-12-29 2010-01-27 格斯图尔泰克股份有限公司 Manipulating virtual objects using an enhanced interactive system
DE502007002697D1 (de) * 2007-11-20 2010-03-11 Trumpf Werkzeugmaschinen Gmbh Method for determining a characteristic value for the accuracy of seam position control
DE102008038332A1 (de) 2008-08-19 2009-01-22 Daimler Ag Method and device for regulating a welding process

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5517420A (en) * 1993-10-22 1996-05-14 Powerlasers Ltd. Method and apparatus for real-time control of laser processing of materials
WO1999014640A2 (en) * 1997-09-12 1999-03-25 Powerlasers Ltd. Self-adapting neural-fuzzy network for real-time process control
WO2001039919A2 (de) * 1999-11-27 2001-06-07 Thyssen Krupp Stahl Ag Method and device for quality control of the seam on laser butt-welded sheets or strips
DE10103255A1 (de) * 2001-01-25 2002-08-14 Bosch Gmbh Robert Method for the automatic assessment of laser machining processes
EP1415755A2 (en) * 2002-07-31 2004-05-06 Unitek Miyachi International, Ltd. Laser weld monitor

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
FARSON D F ET AL: "INTELLIGENT LASER WELDING CONTROL", PROCEEDINGS OF LASER MATERIALS PROCESSING CONFERENCE ICALEO. PROCEEDINGS OF INTERNATIONAL CONGRESS ON THE APPLICATIONS OF LASERS AND ELECTRO-OPTICS, XX, XX, vol. 1722, 1 January 1991 (1991-01-01), pages 104 - 112, XP000490956 *

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9383276B2 (en) 2010-10-26 2016-07-05 Sintokogio, Ltd. Evaluation method and evaluation system for impact force of laser irradiation during laser peening and laser peening method and laser peening system
CN103052880A (zh) * 2010-10-26 2013-04-17 新东工业株式会社 Evaluation method and evaluation system for the impact force of laser irradiation during laser peening, and laser peening method and laser peening system
WO2012056913A1 (en) * 2010-10-26 2012-05-03 Sintokogio, Ltd. Evaluation method and evaluation system for impact force of laser irradiation during laser peening and laser peening method and laser peening system
EP3189926A1 (de) * 2011-02-07 2017-07-12 TRUMPF Werkzeugmaschinen GmbH + Co. KG Device and method for monitoring and in particular for regulating a laser cutting process
US10888954B2 (en) 2011-02-07 2021-01-12 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Method for monitoring and controlling a laser cutting process
CN103347642A (zh) * 2011-02-07 2013-10-09 通快机床两合公司 Device and method for monitoring and in particular for regulating a laser cutting process
WO2012107331A1 (de) * 2011-02-07 2012-08-16 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Device and method for monitoring and in particular for regulating a laser cutting process
EP3189927A1 (de) * 2011-02-07 2017-07-12 TRUMPF Werkzeugmaschinen GmbH + Co. KG Device and method for monitoring and in particular for regulating a laser cutting process
US10058953B2 (en) 2011-02-07 2018-08-28 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Method for monitoring and controlling a laser cutting process
EP3581323A1 (de) * 2011-02-07 2019-12-18 TRUMPF Werkzeugmaschinen GmbH + Co. KG Device and method for monitoring and in particular for regulating a laser cutting process
DE102011103282B4 (de) * 2011-06-03 2015-09-03 Lessmüller Lasertechnik GmbH Method for monitoring the machining process and device for machining a workpiece with a high-energy machining beam
DE102011103282A1 (de) * 2011-06-03 2012-12-06 Lessmüller Lasertechnik GmbH Method for monitoring the machining process and device for machining a workpiece with a high-energy machining beam
DE102011122991B3 (de) 2011-06-03 2023-06-22 Lessmüller Lasertechnik GmbH Method for monitoring the machining process and device for machining a workpiece with a high-energy machining beam
WO2012163545A1 (de) 2011-06-03 2012-12-06 Lessmüller Lasertechnik GmbH Method for monitoring the machining process and device for machining a workpiece with a high-energy machining beam
WO2016001234A1 (de) * 2014-07-01 2016-01-07 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Method and device for determining a material type and/or a surface condition of a workpiece
US10115190B2 (en) 2014-07-01 2018-10-30 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Determining a material type and/or a surface condition of a workpiece
DE102017011463B4 (de) * 2016-12-19 2019-10-24 Fanuc Corporation Machine learning device for learning a procedure for adjusting an optical part of a light source unit, and light source unit manufacturing apparatus
US10241324B2 (en) 2016-12-19 2019-03-26 Fanuc Corporation Machine learning device for learning procedure for aligning optical part of light source unit, and light-source unit manufacturing apparatus
WO2020104103A1 (de) * 2018-11-22 2020-05-28 Precitec Gmbh & Co. Kg Monitoring a laser machining process using deep convolutional neural networks
CN113329836A (zh) * 2018-11-22 2021-08-31 普雷茨特两合公司 Monitoring a laser machining process by means of deep convolutional neural networks
US12013670B2 (en) 2018-11-22 2024-06-18 Precitec Gmbh & Co. Kg Monitoring a laser machining process using deep folding neural networks
WO2020239328A1 (de) * 2019-05-29 2020-12-03 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Automatic material detection with laser
WO2021105364A3 (de) * 2019-11-29 2021-07-22 Sms Group Gmbh Control system for an industrial plant, in particular for a plant for producing or processing metal strips or sheets, and method for controlling an industrial plant, in particular a plant for producing or processing metal strips or sheets
WO2022263207A1 (de) * 2021-06-18 2022-12-22 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Method for laser machining, laser machining system, and control device therefor
EP4368330A1 (de) * 2022-11-14 2024-05-15 Bystronic Laser AG Control of a laser cutting machine by means of airborne sound signals
WO2024104690A1 (en) * 2022-11-14 2024-05-23 Bystronic Laser Ag Control of a laser cutting machine by means of airborne sound signals
CN117283143A (zh) * 2023-10-08 2023-12-26 广东省源天工程有限公司 Anti-corrosion control system and method for underwater operation robots in the ocean
CN117283143B (zh) * 2023-10-08 2024-02-09 广东省源天工程有限公司 Anti-corrosion control system and method for underwater operation robots in the ocean

Also Published As

Publication number Publication date
CA2743518C (en) 2015-02-10
CN102292187B (zh) 2015-12-09
JP5763542B2 (ja) 2015-08-12
EP2365889A1 (de) 2011-09-21
US9056368B2 (en) 2015-06-16
KR20110095381A (ko) 2011-08-24
US20110284512A1 (en) 2011-11-24
CA2743522C (en) 2015-05-26
CN102281984A (zh) 2011-12-14
MX2011005336A (es) 2011-10-14
ZA201104525B (en) 2012-03-28
JP2012509190A (ja) 2012-04-19
KR20110099250A (ko) 2011-09-07
US20110278277A1 (en) 2011-11-17
EP2365889B1 (de) 2016-07-06
KR101787510B1 (ko) 2017-10-18
EP2365890B1 (de) 2017-03-22
WO2010057662A1 (de) 2010-05-27
JP5762968B2 (ja) 2015-08-12
KR101700896B1 (ko) 2017-01-31
CN102292187A (zh) 2011-12-21
RU2529135C2 (ru) 2014-09-27
US9492886B2 (en) 2016-11-15
CA2743518A1 (en) 2010-05-27
ZA201104527B (en) 2012-03-28
JP2012509189A (ja) 2012-04-19
CA2743522A1 (en) 2010-05-27
CN102281984B (zh) 2015-12-16
RU2011125317A (ru) 2012-12-27
EP2365890A1 (de) 2011-09-21
RU2529136C2 (ru) 2014-09-27
HK1164786A1 (zh) 2012-09-28
RU2011125341A (ru) 2012-12-27
MX2011005335A (es) 2011-10-11

Similar Documents

Publication Publication Date Title
EP2365890B1 (de) Method and device for monitoring a laser machining operation to be performed on a workpiece, and laser machining head with such a device
EP2456592B1 (de) Laser machining head and method for compensating for the change in focus position in a laser machining head
DE102008058422A1 (de) Method and device for monitoring a laser machining operation to be performed on a workpiece, and laser machining head with such a device
DE102018129441B4 (de) System for monitoring a laser machining process, laser machining system, and method for monitoring a laser machining process
EP2883647B1 (de) Method for configuring a laser machining device
WO2018069308A1 (de) Method and device for determining and regulating a focus position of a machining beam
EP0353409A1 (de) Automatic brightness and contrast control of a video camera for industrial/military purposes
DE102019209376A1 (de) Device and method for monitoring a laser machining process, use of an event-based camera, computer program and storage medium
DE202011110730U1 (de) Cognitive machining head for machining workpieces
EP3885069A1 (de) Quality control of a laser machining process by means of machine learning
DE102020110087A1 (de) Method for process control in laser material processing
DE102019101222A1 (de) Method for selecting camera image sections
EP2808843B1 (de) Method for parameterizing an image processing system for monitoring a machine tool
EP4119284A1 (de) Calibration of a quality estimator for a laser cutting process
DE102011085677A1 (de) Method and device for monitoring a laser machining operation to be carried out on a workpiece
WO2022037899A1 (de) Method and device for the additive manufacturing of a workpiece
DE102020211702A1 (de) Method for monitoring an immersion liquid in a microscope
DE102023002181B3 (de) Adaptive filter chain for displaying an environment model in a vehicle
EP4230341A1 (de) Method and device for laser cutting a workpiece
WO2021139914A1 (de) Monitoring a laser machining process using a neuromorphic image sensor
WO2023186693A1 (de) Method, computer program product, parking assistance system and building
DE102021206302A1 (de) Method for laser machining, laser machining system, and control device therefor
EP4368330A1 (de) Control of a laser cutting machine by means of airborne sound signals
WO2023104767A1 (de) Method and device for detecting objects at an industrial workplace
DE102022104083A1 (de) Line optics system

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase; Ref document number: 200980147217.2; Country of ref document: CN
121 Ep: the epo has been informed by wipo that ep was designated in this application; Ref document number: 09760109; Country of ref document: EP; Kind code of ref document: A1
WWE Wipo information: entry into national phase; Ref document number: 2743518; Country of ref document: CA
WWE Wipo information: entry into national phase; Ref document number: 3703/DELNP/2011; Country of ref document: IN
WWE Wipo information: entry into national phase; Ref document number: MX/A/2011/005335; Country of ref document: MX
NENP Non-entry into the national phase; Ref country code: DE
WWE Wipo information: entry into national phase; Ref document number: 2011536788; Country of ref document: JP
REEP Request for entry into the european phase; Ref document number: 2009760109; Country of ref document: EP
WWE Wipo information: entry into national phase; Ref document number: 2009760109; Country of ref document: EP
ENP Entry into the national phase; Ref document number: 20117014138; Country of ref document: KR; Kind code of ref document: A
WWE Wipo information: entry into national phase; Ref document number: 2011125317; Country of ref document: RU
WWE Wipo information: entry into national phase; Ref document number: 13130426; Country of ref document: US
ENP Entry into the national phase; Ref document number: PI0921514; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20110523