WO2022117207A1 - Method, device and machining system for monitoring a machining process of a workpiece by means of a high-energy machining beam - Google Patents


Info

Publication number
WO2022117207A1
Authority
WO
WIPO (PCT)
Prior art keywords
representation
processing
machining
workpiece
measurement
Prior art date
Application number
PCT/EP2020/084629
Other languages
German (de)
English (en)
Inventor
Eckhard Lessmueller
Christian Truckenbrodt
Original Assignee
Lessmueller Lasertechnik Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lessmueller Lasertechnik Gmbh filed Critical Lessmueller Lasertechnik Gmbh
Priority to PCT/EP2020/084629 priority Critical patent/WO2022117207A1/fr
Publication of WO2022117207A1 publication Critical patent/WO2022117207A1/fr

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K26/00 Working by laser beam, e.g. welding, cutting or boring
    • B23K26/02 Positioning or observing the workpiece, e.g. with respect to the point of impact; Aligning, aiming or focusing the laser beam
    • B23K26/03 Observing, e.g. monitoring, the workpiece
    • B23K26/032 Observing, e.g. monitoring, the workpiece using optical means
    • B23K26/06 Shaping the laser beam, e.g. by masks or multi-focusing
    • B23K26/064 Shaping the laser beam by means of optical elements, e.g. lenses, mirrors or prisms
    • B23K26/0643 Shaping the laser beam by means of optical elements comprising mirrors
    • B23K26/0648 Shaping the laser beam by means of optical elements comprising lenses
    • B23K26/08 Devices involving relative movement between laser beam and workpiece
    • B23K26/082 Scanning systems, i.e. devices involving movement of the laser beam relative to the laser head
    • B23K26/20 Bonding
    • B23K26/21 Bonding by welding
    • B23K26/24 Seam welding
    • B23K31/00 Processes relevant to this subclass, specially adapted for particular articles or purposes, but not covered by only one of the preceding main groups
    • B23K31/003 Processes relating to controlling of welding distortion
    • B23K31/12 Processes relating to investigating the properties, e.g. the weldability, of materials
    • B23K31/125 Weld quality monitoring
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements for measuring contours or curvatures
    • G01B11/2441 Measuring contours or curvatures using interferometry
    • G01B9/00 Measuring instruments characterised by the use of optical techniques
    • G01B9/02 Interferometers
    • G01B9/0209 Low-coherence interferometers
    • G01B9/02091 Tomographic interferometers, e.g. based on optical coherence

Definitions

  • the invention relates to a method, a control unit, a device and a machining system for monitoring a machining process of a workpiece using a high-energy machining beam.
  • the invention also relates to an associated control unit and an associated computer program product.
  • Methods for machining workpieces using a high-energy machining beam, such as a laser beam, are known from the prior art.
  • Laser welding and/or laser beam cutting represent common applications.
  • the processing beam is moved along a main processing path relative to the workpiece or relative to a plurality of workpieces to be connected, as a result of which a weld seam can be formed, for example.
  • WO 2020/069266 A1 discloses a method in which a local energy density of the machining beam is visualized on the workpiece.
  • Optical coherence tomography (OCT) can be used to monitor such machining processes.
  • OCT measurements allow the actual machining process to be observed and evaluated.
  • the measuring beam is shifted independently of the processing beam and is moved, for example, along one or more measuring lines.
  • a height profile can then be determined along the measuring line.
  • a corresponding measuring method is known from DE 10 2015 007 142 A1, in which the measuring beam is displaced transversely to the processing direction both in front of and behind a current processing point.
  • a processing system is known from DE 10 2019 210 618 A1 that enables OCT monitoring of wobble laser welding.
  • the system is set up to move a measuring beam to different measuring locations relative to a processing beam. This movement follows the wobble pattern, i.e. the relative position of the measuring beam is independent of the periodic wobble movement.
  • the diverse options for process monitoring using OCT mean that a user of a processing system is often confronted with just as diverse information. It can be challenging and time-consuming to link the different measurement results and to draw conclusions about possible incorrect settings, suitable process parameters and/or the quality of a machining result.
  • the object of the present invention is therefore to improve the manageability of an OCT-based monitoring system.
  • the invention provides a method for monitoring a machining process of a workpiece, in particular laser welding, using a high-energy machining beam, comprising the following steps: determining a machining figure that defines a plurality of machining positions with associated multidimensional machining position coordinates; generating a high-energy processing beam and projecting and/or focusing the processing beam onto the workpiece according to the processing figure; Generating a measurement beam by means of an optical coherence tomograph, it being possible for the measurement beam to be coupled into the processing beam; determining measurement positions on the workpiece at least at some of the machining positions and/or in the vicinity thereof; determining measurement values at the measurement positions by directing the measurement beam at the measurement positions; and generating at least one representation based on multi-dimensional information relating to the machining position coordinates, the measured values and/or the measured positions.
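The claimed sequence of steps can be illustrated with a minimal data-flow sketch. All names here (`Sample`, `build_representation`, the lambda standing in for the OCT read-out) are illustrative assumptions, not part of the patent; a real system would obtain values from the coherence tomograph:

```python
from dataclasses import dataclass

@dataclass
class Sample:
    x: float      # machining position coordinate (mm)
    y: float
    value: float  # OCT measurement taken at the associated measurement position

def build_representation(machining_positions, measure):
    # Pair each machining position with the value the measuring beam
    # returns at (or near) that position; the result is the raw material
    # for the multi-dimensional representation.
    return [Sample(x, y, measure(x, y)) for (x, y) in machining_positions]

# Toy stand-in for the OCT read-out: value grows linearly with x.
samples = build_representation([(0.0, 0.0), (1.0, 0.5)], lambda x, y: 0.2 + 0.1 * x)
```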
  • the invention provides a device for monitoring a machining process of a workpiece, in particular laser welding, by means of a high-energy machining beam, the device being set up for: generating a high-energy machining beam and projecting and/or focusing the machining beam onto the workpiece according to a machining figure defining a plurality of machining positions with associated multi-dimensional machining position coordinates; generating a measurement beam by means of an optical coherence tomograph, it being possible for the measurement beam to be coupled into the processing beam; determining measurement positions on the workpiece at least at some of the machining positions and/or in the vicinity thereof; determining measurement values at the measurement positions by directing the measurement beam at the measurement positions; and generating at least one representation based on multi-dimensional information relating to the machining position coordinates, the measured values and/or the measured positions.
  • the device according to the invention can be set up to monitor the machining process according to a method according to the invention.
  • the device can be set up to carry out the method in a semi-automated or automated manner.
  • the device can have the corresponding components, units and/or devices.
  • the device can be set up to implement the corresponding method features.
  • the method can have corresponding additional and/or correspondingly adapted method steps.
  • the device can include a processing unit with a processing beam source for generating the high-energy processing beam and with processing beam optics for projecting and/or focusing the processing beam onto a processing position on the workpiece. Furthermore, the device can include an optical coherence tomograph for generating the measuring beam.
  • by means of the at least one representation, the accessibility of the machining position coordinates, the measured values and/or the measurement positions can be improved. The representation enables a user to understand whether process parameters are set correctly or whether they need to be adjusted. The quality of the machining can also be assessed from it. For example, a weld seam quality can be checked precisely and/or with spatial resolution. Furthermore, the position of the measuring beam relative to the machining beam and/or relative to a weld seam and/or relative to the machined workpiece can easily be checked using a multidimensional representation. As a result, usability is improved, since the effects of the settings made can be assessed easily and clearly.
  • the workpiece can be a single workpiece or multiple workpieces. In the latter case, the workpieces can be arranged one above the other and/or next to one another, for example in order to weld them together.
  • the workpiece may include one or more metal parts and/or other parts and/or components.
  • the processing figure can define a main processing path on which an additional movement is superimposed, in particular a wobble movement. In particular, the processing figure comprises the totality of processing positions onto which the processing beam is to be directed.
  • the processing figure can be based on a rectilinear main processing path. Alternatively or additionally, the machining figure can be based on a curved and/or polygonal and/or circular and/or elliptical main machining path.
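Such a machining figure, here a straight main machining path with a superimposed sinusoidal wobble, can be sketched as follows (a minimal illustration; the function and parameter names are assumptions):

```python
import math

def wobble_figure(length_mm, feed_mm_s, wobble_hz, amplitude_mm, n=200):
    # Machining positions for a straight main path along x with a sinusoidal
    # wobble superimposed transversely in y.
    duration = length_mm / feed_mm_s
    points = []
    for i in range(n):
        t = duration * i / (n - 1)
        x = feed_mm_s * t
        y = amplitude_mm * math.sin(2 * math.pi * wobble_hz * t)
        points.append((x, y))
    return points

figure = wobble_figure(length_mm=10.0, feed_mm_s=50.0, wobble_hz=200.0, amplitude_mm=0.5)
```

Other main paths (circular, elliptical, polygonal) would only change the parametrization of `x` and the wobble direction.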
  • a processing head can be provided, which can be carried by an industrial robot, for example.
  • the machining head can be movable relative to the workpiece.
  • the processing beam optics can be provided in the processing head.
  • the processing beam can be a processing laser beam.
  • the measuring beam can be a laser beam, which can be generated in particular independently of the processing beam.
  • the measurement beam and/or the processing beam can be coupled into the processing head and preferably into the processing beam optics.
  • the optical coherence tomograph can include a beam generation unit for generating the measurement beam and a reference beam.
  • the optical coherence tomograph can have a measuring arm that extends from the beam generation unit and in which the measuring beam can be guided optically so that it can be projected onto a measurement object such as the workpiece, and a reference arm that extends from the beam generation unit and in which the reference beam can be guided optically and which simulates the measuring arm at least in its optical path length and/or in its other optical properties. After passing through the measuring arm or the reference arm, the measuring beam and the reference beam can be superimposed to generate an interference signal.
  • the reference arm can be designed to be fixed or adjustable.
  • optical properties of the reference arm such as its optical path length and/or its total dispersion can be adjustable, for example in order to adapt them to a changed path length of the processing beam and/or to different optical components guiding the measuring beam.
  • several reference arms are provided which can be operated in parallel, alternatively and/or in series and which differ in terms of their optical properties. It can thus be made possible, for example when changing optical components of the beam guidance, to switch manually or automatically to another reference arm in order to bring about the optical correspondence of reference arm and measuring arm.
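The automatic switching between reference arms could, in the simplest case, pick the arm whose optical path length best matches the current measuring arm (a sketch under that assumption; the data layout is invented):

```python
def select_reference_arm(measuring_path_mm, reference_arms):
    # Pick the reference arm whose optical path length is closest to the
    # current measuring-arm length, so the OCT interference signal stays
    # within the coherence window.
    return min(reference_arms, key=lambda arm: abs(arm["path_mm"] - measuring_path_mm))

arms = [{"name": "A", "path_mm": 1500.0}, {"name": "B", "path_mm": 2100.0}]
best = select_reference_arm(2080.0, arms)
```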
  • the optical coherence tomograph is not arranged on the industrial robot.
  • the optical coherence tomograph can be stationary.
  • the optical coherence tomograph can be connected to the processing head via light guides such as fibers.
  • the processing head can be movable relative to the coherence tomograph.
  • the optical coherence tomograph and in particular its sensor system is arranged on the industrial robot, for example inside and/or near the processing head.
  • the optical coherence tomograph can be moved with the processing head and/or with moving components of the industrial robot.
  • a measuring beam deflection device can be provided, by means of which the measuring beam can be deflected in such a way that it can be guided relative to a current machining position on the workpiece.
  • the measuring beam can be displaceable independently of the processing beam.
  • the measuring beam can be directed, for example, into a keyhole created during processing and/or directly onto a point of impact of the processing beam on the workpiece.
  • the measuring beam can be guided in the main processing direction in front of or behind the processing position and/or transversely, for example perpendicularly and/or obliquely and/or longitudinally to the main processing direction.
  • measurement lines can be traced along which, for example, a height profile can be determined.
  • the measuring beam deflection device can comprise at least one movable mirror which can be rotated and/or pivoted in particular in at least one spatial direction.
  • the measuring beam deflection device comprises two mirrors, each of which can be moved in one spatial direction, or one movable mirror for each degree of freedom of displacement.
  • a processing beam deflection device can be present, by means of which the processing beam can be displaced.
  • the processing beam deflection device can comprise at least one movable mirror which can be rotated and/or pivoted in particular in at least one spatial direction.
  • the processing beam deflection device comprises two mirrors, each of which can be moved in one spatial direction, or one movable mirror for each degree of freedom of displacement.
  • the measuring beam and the processing beam can preferably be displaced together by means of the processing beam deflection device. For example, by moving the processing beam deflection device, a wobbling movement can be generated that both the measuring beam and the processing beam follow.
  • the measuring beam can be shifted relative to the processing beam by suitable, preferably synchronous, movement of the measuring beam deflection device, while both beams are deflected by the processing beam deflection device.
  • the device can additionally comprise an image acquisition device.
  • the image capturing device can include a camera, for example a 2D camera or a 3D camera.
  • the image acquisition device is preferably set up to deliver images in real time.
  • the image capturing device can, for example, be arranged and set up in such a way that it captures images of the workpiece in an area onto which the processing beam and/or the measuring beam can be directed.
  • the image acquisition device can be arranged behind the measuring beam deflection device.
  • the movable mirror of the measuring beam deflection device can be partially transparent.
  • the device can include an illumination device which is arranged and set up to illuminate the workpiece in an area onto which the processing beam and/or the measuring beam can be directed.
  • monitoring of the machining process may be limited to generating the representation.
  • the monitoring can also include, in particular, partially automated or automated control and/or regulation of at least one process parameter.
  • the monitoring can include the display of further settings, parameters, representations, textual and/or acoustic notifications and/or information and the like.
  • the monitoring can include querying at least one user input.
  • the multi-dimensional, location-dependent information can be based on multi-dimensional location coordinates and can include at least one additional property, for example at least one value and/or at least one marking and/or at least one piece of information regarding the presence or absence of a condition at at least one of a number of specific locations.
  • At least one display unit can be provided, by means of which the representation can be displayed.
  • the display unit can include a screen and/or data glasses or the like.
  • the display unit can be part of the device according to the invention.
  • the display unit can also be part of an external computer and/or of a mobile device such as a smartphone, tablet PC, etc.
  • the at least one representation is preferably intended to be reproduced on a two-dimensional display screen.
  • the at least one representation can also be intended to be reproduced as a three-dimensional static and/or moving image, for example as a hologram, as a 3D screen image or the like.
  • for some or all of the representations included, the at least one representation can comprise two partial representations, each assigned to one of the user's two eyes, preferably a left partial representation and a right partial representation.
  • a 3D rendering can be achieved as a result.
  • the at least one representation may include multiple representations that may be displayed sequentially and/or simultaneously.
  • a way of presentation can be adjustable by the user.
  • the at least one representation may include a sequence of representations.
  • the at least one representation can comprise a plurality of representations displayed next to one another and/or one above the other.
  • a representation within the meaning of this disclosure can include static and/or moving unprocessed captured images, processed captured images, simulated images, calculated images and/or previously stored images, in particular in any combination.
  • a representation can include an image that is based on a graphic processing of measured values and that is superimposed on a captured image, for example of the workpiece.
  • captured images and images generated in some other way can be included in a common representation, for example next to one another or one above the other.
  • the machining figure causes an oscillating movement of the machining beam, optionally with a frequency of at least 50 Hz and advantageously of at least 100 Hz.
  • This oscillating movement is advantageously superimposed on a movement along a main machining path.
  • processing can be carried out in which so-called wobbling takes place, which advantageously leads to a high quality of the weld seam.
  • the oscillating movement can include sinusoidal, sawtooth-shaped, zigzag-shaped, circular segment-shaped, spiral-shaped and/or rectangular partial movements.
  • the type and/or the amplitude and/or the period of the partial movements and/or of the oscillating movement can be specified directly or indirectly by the user. Suitable parameters can, for example, be selected automatically depending on the machining process to be carried out, by specifying the workpiece material, the course of the weld seam and/or a desired penetration depth during welding, etc. Alternatively or additionally, at least one of the parameters mentioned can be specified directly by the user.
  • a precise adjustment of processing parameters and an associated control over a processing result can be achieved in particular if at least one additional beam property is changed periodically during the oscillating movement, in particular a beam diameter and/or a beam shape and/or a power and/or a wavelength.
  • the method also includes determining penetration depth values for a respective penetration depth of the machining beam into the workpiece at at least some of the machining positions during the machining process using the optical coherence tomograph by directing the measuring beam to suitable measuring positions on the workpiece; and assigning the penetration depth values to the respective machining position coordinates.
  • the at least one representation includes a representation, also referred to below as a penetration depth representation, which is based on the associated penetration depth values and machining position coordinates.
  • the device is also set up to: determine penetration depth values for a respective penetration depth of the machining beam into the workpiece at at least some of the machining positions during the machining process using the optical coherence tomograph by directing the measuring beam to suitable measuring positions on the workpiece; and for associating the penetration depth values with the respective machining position coordinates.
  • the at least one representation includes a representation, also referred to below as a penetration depth representation, which is based on the associated penetration depth values and machining position coordinates.
  • Penetration depth values can be determined by directing the measuring beam into a keyhole. It can be provided to first determine an actual position of the keyhole for a certain predetermined processing position by creating a two-dimensional and/or a three-dimensional height profile in the area of the processing position. Provision can also be made to direct the measuring beam during processing to a position of the keyhole determined in this way.
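Locating the keyhole from a scanned height profile can be as simple as finding the deepest point of the profile (an illustrative sketch; real profiles would be noisy and need smoothing first):

```python
def keyhole_position(profile):
    # The keyhole shows up as the deepest point of a height profile scanned
    # across the machining position; profile = [(lateral_mm, height_mm), ...].
    position, _ = min(profile, key=lambda p: p[1])
    return position

profile = [(-0.2, 0.00), (-0.1, -0.05), (0.0, -0.40), (0.1, -0.35), (0.2, -0.02)]
```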
  • the penetration depth values can be temporally and/or spatially averaged values.
  • the processing positions can be continuously distributed.
  • a penetration depth value can be assigned to a specific machining position, which results from a suitable averaging of several individual measured values, which are taken into account, for example, using a suitable weighting function.
  • the weighting function can take into account known parameters of the keyhole, such as its depth distribution parallel, transverse or oblique to the main processing direction. Alternatively or additionally, the weighting function can take into account certain processing parameters such as a feed rate, a processing beam power, a material and/or another property of the workpiece and the like.
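A weighted averaging of individual readings as described above might look like this (the triangular weighting is just one example of a suitable weighting function; all names are illustrative):

```python
def weighted_depth(readings, weight):
    # Combine several individual OCT depth readings into one penetration-depth
    # value; `readings` holds (offset_from_nominal_position, depth) pairs.
    total = sum(weight(offset) for offset, _ in readings)
    return sum(weight(offset) * depth for offset, depth in readings) / total

# Triangular weighting: readings near the nominal position count more.
triangle = lambda offset: max(0.0, 1.0 - abs(offset))
depth = weighted_depth([(-0.5, 1.0), (0.0, 2.0), (0.5, 1.0)], triangle)
```

An asymmetric weighting function reflecting the keyhole's depth distribution would simply replace `triangle`.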
  • assigning the penetration depth values to the respective machining position coordinates means that, for a specific predefined machining position of the machining beam, the penetration depth into the keyhole is recorded that was measured while the machining beam was aimed at this predefined machining position.
  • the actual position of the keyhole for this given edit position may differ from the edit position itself.
  • the keyhole typically forms behind the current machining position in the main machining direction.
  • the assignment to the respective machining position coordinates establishes the connection to the machining program that has been executed.
  • the penetration depth values can change along the machining figure, for example periodically. If, for example, machining is carried out at a constant feed rate in the main machining direction, the local speed along the machining path can change due to a wobble movement, since the machining path can meander and/or follow a curved, jagged, angled or otherwise oscillating course compared to the main machining path along the main machining direction. At reversal points of the corresponding machining path, the speed at which the machining beam moves relative to the surface of the workpiece can therefore be lower than in areas in which the machining path intersects the main machining path. In general, the local speed of the machining beam can vary, which is why the machining beam acts on different areas of the workpiece for different lengths of time. This can result in a locally variable penetration depth.
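The speed variation along a wobble path can be made concrete with a finite-difference sketch (feed rate, wobble frequency and amplitude below are arbitrary example values):

```python
import math

def local_speeds(path, dt):
    # Finite-difference speed between consecutive sampled beam positions.
    return [math.hypot(x1 - x0, y1 - y0) / dt
            for (x0, y0), (x1, y1) in zip(path, path[1:])]

# 50 mm/s feed along x with a 200 Hz, 0.5 mm sinusoidal wobble in y.
dt = 1e-4
path = [(50.0 * i * dt, 0.5 * math.sin(2 * math.pi * 200.0 * i * dt))
        for i in range(200)]
speeds = local_speeds(path, dt)
# Near the wobble reversal points the speed drops towards the pure feed rate;
# it peaks where the path crosses the main machining path.
```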
  • the at least one representation, for example the penetration depth representation, includes a three-dimensional representation in which the penetration depth values are plotted against the machining position coordinates.
  • This three-dimensional representation can be a perspective representation and/or a heat map representation, for example.
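The data behind such a heat-map representation can be produced by binning the assigned values onto a grid and averaging per cell (a minimal sketch; rendering the grid with a plotting library is omitted):

```python
def depth_heatmap(samples, nx, ny, x_range, y_range):
    # Average (x, y, depth) samples per grid cell; empty cells stay None.
    sums = [[0.0] * nx for _ in range(ny)]
    counts = [[0] * nx for _ in range(ny)]
    (x0, x1), (y0, y1) = x_range, y_range
    for x, y, depth in samples:
        i = min(int((x - x0) / (x1 - x0) * nx), nx - 1)
        j = min(int((y - y0) / (y1 - y0) * ny), ny - 1)
        sums[j][i] += depth
        counts[j][i] += 1
    return [[sums[j][i] / counts[j][i] if counts[j][i] else None
             for i in range(nx)] for j in range(ny)]

grid = depth_heatmap([(0.1, 0.1, 1.0), (0.2, 0.1, 3.0), (0.9, 0.9, 5.0)],
                     nx=2, ny=2, x_range=(0.0, 1.0), y_range=(0.0, 1.0))
```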
  • at least one further beam property, for example a beam diameter and/or a beam shape and/or a power and/or a wavelength, as mentioned above, can be changed periodically.
  • a suitable choice of periodically changing additional beam properties can be aimed at achieving penetration depths that are as uniform as possible along the processing path.
  • specific parameters for such a setting can be specified, and at least one beam property can be controlled in order to use the penetration depth or its progression along the processing path as a controlled variable.
  • One or more of the periodically changing beam properties can serve as a manipulated variable.
  • the power of the processing beam can be changed periodically. This can serve to prevent too deep penetration into the workpiece in sections of the machining path in which the machining beam moves comparatively slowly over the workpiece. In an analogous way, further or other beam properties can be changed periodically, depending on the desired result.
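One conceivable compensation rule is to scale the beam power with the local path speed, clamped to the laser's operating range (the linear law and all names are assumptions for illustration, not the patent's prescribed control law):

```python
def compensated_power(nominal_w, local_speed, nominal_speed, p_min, p_max):
    # Scale beam power with the local path speed so that slow sections
    # (e.g. wobble reversal points) do not receive too much energy per
    # unit length; clamp to the admissible power range.
    p = nominal_w * local_speed / nominal_speed
    return min(max(p, p_min), p_max)

slow = compensated_power(1000.0, 50.0, 100.0, p_min=300.0, p_max=1200.0)
fast = compensated_power(1000.0, 400.0, 100.0, p_min=300.0, p_max=1200.0)
```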
  • the at least one display, in particular the penetration depth display, includes a moving display that is based on a moving viewing position.
  • the moving representation can be based, for example, on a virtual flight over the machining position coordinates, in which the associated welding depths can be visually recorded.
  • the moving representation can include a rotation of the three-dimensional representation, as a result of which, for example, positions can be alternately recognized more precisely in a plan view, whereas penetration depth values can be recognized more precisely in a side view.
  • the three-dimensional representation can be rotated in order to assume a specific viewing position. A user can then precisely match the three-dimensional profile of the penetration depths with the processing figure used in order to be able to estimate its influence on the penetration depth distribution.
  • the complex relationships behind the penetration depth distribution can be analyzed easily and comprehensively by the user, in particular if the moving display is reproduced using a 3D system and/or virtual reality system and/or augmented reality system, such as 3D glasses, a 3D monitor, a head-mounted display or the like.
  • in an augmented reality system, it can be provided that the penetration depth display is superimposed on a real image of the workpiece.
  • the user can view the workpiece from different viewing positions and/or from different angles while wearing suitable 3D glasses or a head-mounted display, whereby the penetration depth representation can be displayed and assessed virtually directly on the workpiece.
  • An augmented reality system can also be implemented using a user's portable device, for example in the form of an app on a smartphone or tablet PC. If the device is held appropriately in space, an image of the workpiece can be generated via its integrated camera and the penetration depth display can be superimposed on this image.
  • the selection of at least one display color and/or of blinking and/or of highlighting may be a user selection. This allows the display to be provided with easily understandable indications of limit value violations, for example by highlighting areas in which the measured penetration depth exceeds or falls below a limit value.
  • the false color display can be used, for example, to generate a color gradient that illustrates increasing penetration depth values. The penetration depth distribution can thus be grasped intuitively by the user even without looking closely at a scale or axis labeling, which leads to improved usability when processing parameters are to be assessed or adjusted.
  • a particularly efficient finding of suitable parameters can be facilitated by a user specifying at least one limit value, with the at least one display, in particular the penetration depth display, including a display based on a comparison of the assigned penetration depth values with the predefined limit value. This comparison-based display can in turn depend on the mentioned selection of at least one display color and/or blinking and/or highlighting.
  • the limit value can be a minimum penetration depth value, a maximum penetration depth value, a frequency for the occurrence of specific minimum or maximum penetration depth values and/or a one-dimensional or two-dimensional spatial coordinate range to be observed.
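The comparison against user-specified limit values reduces to flagging samples that leave the admissible band; flagged positions would then be highlighted in the display (a sketch with invented names):

```python
def flag_violations(samples, min_depth, max_depth):
    # Mark each (x, y, depth) sample whose penetration depth leaves the
    # user-specified band [min_depth, max_depth].
    return [(x, y, depth, not (min_depth <= depth <= max_depth))
            for x, y, depth in samples]

flags = flag_violations([(0.0, 0.0, 0.8), (0.1, 0.0, 1.2), (0.2, 0.0, 2.4)],
                        min_depth=1.0, max_depth=2.0)
```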
  • a high level of operating convenience can be achieved as a result.
  • the user can work close to the data, but still grasp the data situation intuitively.
  • a user moves a line and/or a curve in the display in order to specify the limit value.
  • Such a line or curve can be used, for example, to draw a boundary between acceptable and unacceptable penetration depth values.
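A minimal sketch of such a limit check, assuming simple scalar minimum and maximum limit values, might look as follows; the flag names are hypothetical and would drive the highlighting or blinking in a display:

```python
def flag_limit_violations(depths, d_min=None, d_max=None):
    """Return one flag per measurement: 'below' if the penetration depth
    falls short of the minimum limit, 'above' if it exceeds the maximum,
    otherwise 'ok'."""
    flags = []
    for d in depths:
        if d_min is not None and d < d_min:
            flags.append("below")
        elif d_max is not None and d > d_max:
            flags.append("above")
        else:
            flags.append("ok")
    return flags

flags = flag_limit_violations([0.4, 0.8, 1.6], d_min=0.5, d_max=1.5)
```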
  • the possibilities that OCT measurements offer for process monitoring can be fully exploited in particular if the penetration depth values and/or the processing position coordinates are subjected to at least one data analysis after assignment, for example a statistical evaluation, and the at least one representation, in particular the penetration depth representation, includes a representation based on the data analysis.
  • the data analysis can include smoothing, for example in the form of a moving average with a suitable kernel function such as a rectangle, a triangle, a Gaussian curve or an asymmetric, for example skewed, distribution function adapted to the shape of the keyhole.
  • the data analysis can include an exclusion of outliers.
  • Statistical evaluation may include averaging, determining specific quantiles, performing automated statistical tests, and the like.
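The smoothing and outlier exclusion described above can be sketched as follows; the Gaussian kernel, the MAD-based outlier rule and the window size are illustrative assumptions:

```python
import numpy as np

def smooth_depths(depths, kernel_half_width=2, sigma=1.0, outlier_z=3.0):
    """Replace gross outliers by the median (robust MAD threshold; note it
    degenerates to flagging any deviation when the profile is flat) and
    smooth the penetration depth values with a normalized Gaussian kernel
    (moving weighted average)."""
    d = np.asarray(depths, float)
    med = np.median(d)
    mad = np.median(np.abs(d - med))
    d = np.where(np.abs(d - med) > outlier_z * 1.4826 * mad, med, d)
    x = np.arange(-kernel_half_width, kernel_half_width + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    k /= k.sum()  # normalize so constant signals stay constant
    return np.convolve(d, k, mode="same")
```

Any of the other kernel shapes mentioned (rectangle, triangle, skewed keyhole-adapted function) could be substituted for `k`.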
  • a distribution of penetration depth values can be determined depending on a relative position of the assigned machining position, in particular depending on a position relative to the main machining path.
  • the statistical evaluation can include the determination of statistical parameters of such a distribution.
  • the representation can include a graphical representation of determined distributions and/or mean values, as well as a display of determined values.
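The position-dependent distribution described above could, under an assumed binning of the transverse offset from the main machining path, be sketched as:

```python
import numpy as np

def depth_stats_by_offset(offsets, depths, bin_edges):
    """Group penetration depth values by their transverse offset from the
    main machining path and compute (count, mean, std) per offset bin."""
    offsets = np.asarray(offsets, float)
    depths = np.asarray(depths, float)
    stats = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        sel = depths[(offsets >= lo) & (offsets < hi)]
        stats.append((len(sel),
                      sel.mean() if len(sel) else float("nan"),
                      sel.std() if len(sel) else float("nan")))
    return stats

stats = depth_stats_by_offset(
    offsets=[-0.1, -0.05, 0.0, 0.05, 0.1],
    depths=[0.8, 1.0, 1.2, 1.0, 0.8],
    bin_edges=[-0.15, -0.025, 0.025, 0.15])
```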
  • the influence of certain machining parameters on a result and a quality of the machining can be checked quickly and reliably in particular if the at least one representation, in particular the penetration depth representation, includes at least a first representation that relates to a first set of associated penetration depth values, and at least a second representation relating to a second set of penetration depth values.
  • the penetration depth display can include the simultaneous display of a display based on stored measurement values from a previous measurement and a current display based on current measurement values.
  • the at least one representation, in particular the penetration depth representation, can comprise a representation that is based on a comparison of at least two sets of assigned penetration depth values and machining position coordinates.
  • weld seams produced one after the other and the associated penetration depth values can be visually compared with one another.
  • measurement curves or three-dimensional representations of measured values which are based on the two sets can be superimposed on one another and/or displayed in a common coordinate system.
  • a difference of penetration depth values can be formed and the formed difference displayed in order to carry out the comparison.
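Forming such a difference between two weld seams measured one after the other can be sketched by interpolating both profiles onto a common position axis; the axis resolution is an illustrative assumption:

```python
import numpy as np

def depth_difference(s1, d1, s2, d2):
    """Interpolate two penetration depth profiles (position along the
    seam s, depth d) onto a common position axis and return the
    difference, e.g. for comparing two successively produced seams."""
    lo = max(s1[0], s2[0])
    hi = min(s1[-1], s2[-1])
    s = np.linspace(lo, hi, 5)  # common axis; resolution is illustrative
    return s, np.interp(s, s1, d1) - np.interp(s, s2, d2)

s, diff = depth_difference([0, 1, 2], [1.0, 1.2, 1.0],
                           [0, 1, 2], [0.9, 1.0, 0.9])
```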
  • data related to current machining position coordinates and data related to current penetration depth values are synchronized in real time.
  • control can advantageously be carried out directly via a control unit of the optical coherence tomograph.
  • the penetration depth values can easily be assigned to the correct machining position coordinates.
  • the control unit of the optical coherence tomograph advantageously processes a predetermined position of the deflection device.
  • the control unit of the optical coherence tomograph can process a reported actual position of the deflection device.
  • the projection and/or focusing of the processing beam onto the processing positions and the directing of the measuring beam onto the measuring positions are based on a common time specification, and the assignment of the penetration depth values to the respective processing position coordinates is carried out on the basis of the common time specification.
  • the assignment is particularly simple, since a common time control can be used for the movement of the processing beam and for the measurement by means of the optical coherence tomograph or for the movement of the measuring beam.
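With a common time base, the assignment can be sketched by interpolating the machining position coordinates at each depth-measurement timestamp; linear interpolation between position samples is an assumption here:

```python
import numpy as np

def assign_depths_to_positions(t_pos, xy_pos, t_depth, depths):
    """Given position samples (t_pos, xy_pos) and depth samples
    (t_depth, depths) on a common time base, interpolate the machining
    position coordinates at each depth timestamp and pair them up."""
    t_pos = np.asarray(t_pos, float)
    xy = np.asarray(xy_pos, float)
    x = np.interp(t_depth, t_pos, xy[:, 0])
    y = np.interp(t_depth, t_pos, xy[:, 1])
    return list(zip(x, y, depths))

pairs = assign_depths_to_positions(
    t_pos=[0.0, 1.0, 2.0],
    xy_pos=[(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)],
    t_depth=[0.5, 1.5],
    depths=[1.1, 1.3])
```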
  • data related to current machining position coordinates and data related to current penetration depth values are synchronized based on at least one trigger signal.
  • This embodiment allows the processing beam deflection device to be controlled independently of the optical coherence tomograph, which has advantages with regard to the design of the components used independently of one another and leads to a high degree of flexibility.
  • a particularly precise assignment of measured values to machining position coordinates can be achieved in particular if the method also includes determining a current machining position and a current penetration depth value, the assignment including determining the machining position coordinates belonging to the current machining position and assigning them to the current penetration depth value.
  • a control unit of the processing beam deflection device and a control unit of the optical coherence tomograph can be provided which work independently of one another and which in particular can be physically separated from one another.
  • the control unit of the machining beam deflection device preferably has an interface that allows control signals to be clocked in the range of 10 µs or less, such as an RS485 interface, for example.
  • the control unit of the optical coherence tomograph can have an interface that can read in data synchronously with the penetration depth measurements. Furthermore, the control unit of the optical coherence tomograph can be integrated in such a way that it synchronously receives the control signals of the processing beam deflection device and/or position signals of the processing beam deflection device. A synchronization with respect to the set or determined position of the machining beam deflection device can thus take place. Furthermore, in the case of a periodic wobbling movement of the processing beam deflection device, it can be provided that the control unit of the optical coherence tomograph queries the period of the wobbling movement and/or receives it from the control unit of the processing beam deflection device.
  • the control unit of the processing beam deflection device can send a trigger signal to the control unit of the optical coherence tomograph.
  • This trigger signal can be used together with the known period to assign the measured penetration depth values to the correct machining position coordinates by assigning OCT measured values and positions of the machining beam deflection device and thus of the machining beam on the workpiece to one another.
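The phase-based reconstruction from trigger signal and known period can be sketched as follows; a circular wobble figure and the parameter names are illustrative assumptions:

```python
import math

def wobble_position(t, t_trigger, period, radius):
    """Reconstruct the deflection-device offset at measurement time t from
    the phase of a periodic wobble, given the trigger timestamp and the
    known period (circular wobble figure assumed for illustration)."""
    phase = 2.0 * math.pi * (((t - t_trigger) % period) / period)
    return radius * math.cos(phase), radius * math.sin(phase)

# a quarter period after the trigger, the beam sits at the top of the circle
dx, dy = wobble_position(t=1.25, t_trigger=1.0, period=1.0, radius=0.5)
```

The OCT measured value taken at time `t` would then be assigned to the machining position offset `(dx, dy)`.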
  • the measuring beam and the processing beam are deflected via a common deflection device.
  • the deflection device can generate a position feedback signal that is used to determine an actual machining position and/or an actual measurement position.
  • the deflection device can comprise the machining beam deflection device.
  • the position feedback signal can be used to synchronize the data related to current machining position coordinates and the data related to current penetration depth values.
  • the position feedback signal can be received, for example, by the control unit of the optical coherence tomograph and used as a basis for the control of the deflection device, in particular the processing beam deflection device, which proceeds therefrom.
  • the machining figure is related to a geometrical configuration of the workpiece.
  • the method includes the additional step of estimating the geometry of the workpiece at the measurement positions based on the associated measurement values.
  • the at least one representation includes a representation, also referred to below as a geometric representation, which is based on a two-dimensional representation of at least part of the machining figure and the associated estimated geometric condition of the workpiece.
  • the machining figure is related to a geometric configuration of the workpiece.
  • the device is also set up to estimate the geometric nature of the workpiece at the measurement positions on the basis of the associated measurement values.
  • the at least one representation includes a representation, also referred to below as a geometric representation, which is based on a two-dimensional representation of at least part of the machining figure and the associated estimated geometric condition of the workpiece.
  • the display of at least a part of the processing figure together with the associated estimated geometric nature of the workpiece makes it possible to assess the quality of the underlying estimation or recognition method. For example, it can be checked whether a recognition of certain workpiece properties such as edges, gaps, surface structures and the like works with sufficient accuracy. If necessary, parameters to be specified on which a corresponding estimation or recognition is based can be easily adapted by the user and the effect of such an adaptation can be understood intuitively, as a result of which simple operability is achieved.
  • the geometric condition can be at least one location-dependent structural property of the workpiece.
  • the processing figure can be adapted to the geometric condition.
  • the estimation of the geometric structure can include a semi-automated or automated data analysis, which is based on several measured values, in particular on an elevation profile.
  • the geometric condition can be estimated by evaluating several measuring lines that are recorded at different positions along the processing figure, in particular transverse to a main machining direction at the different positions.
  • the geometric representation is based on a representation of the entire machining figure.
  • a particularly intuitive ability to check the suitability of a processing figure and/or the quality of set recognition parameters can be achieved in particular when the estimated geometric condition of the workpiece includes the presence and/or course of at least one edge of the workpiece.
  • the estimation can be based on edge detection using measured values that originate from at least one measuring line, for example from a measuring line perpendicular to the main processing direction.
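Edge detection on such a transverse measuring line can be sketched as locating the steepest step in the OCT height profile; taking the maximum absolute gradient is one simple assumed criterion:

```python
import numpy as np

def detect_edge(y, height):
    """Estimate the edge position along a transverse measuring line as the
    midpoint of the segment with the largest absolute height gradient in
    the OCT elevation profile."""
    y = np.asarray(y, float)
    h = np.asarray(height, float)
    grad = np.abs(np.diff(h) / np.diff(y))
    i = int(np.argmax(grad))
    return 0.5 * (y[i] + y[i + 1])  # midpoint of the steepest segment

# profile with a step between y = 0.2 and y = 0.3
edge = detect_edge([0.0, 0.1, 0.2, 0.3, 0.4], [1.0, 1.0, 1.0, 0.2, 0.2])
```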
  • the at least one representation, in particular the geometry representation, includes a representation in which a current machining position is shown together with the machining figure and the estimated geometric condition of the workpiece.
  • the processing figure can be represented by a line.
  • the estimated geometry may be displayed superimposed on the machining figure. If the estimated geometry is a position of an edge, this can be represented, for example, in the form of a number of markings such as dots or crosses and/or in the form of a line.
  • the respective estimated position of an edge of the workpiece can be assessed relative to a processing figure.
  • the current processing position can be represented as a marking, for example as a point, as a cross, as a box or the like.
  • the processing figure and/or the estimated geometric structure and/or the current processing position can each be displayed in different colors, flashing or otherwise distinguishable from one another.
  • the method according to the invention further comprises the steps: assigning a number of different measurement positions to the machining positions; and capturing an image of the workpiece in a region of a current machining position using at least one image capturing device.
  • the at least one representation includes a representation, also referred to below as a measurement position representation, which is based on the image of the workpiece and on which the current machining position and optical information can be seen that allow conclusions to be drawn about the respective measurement positions.
  • the device is also set up for: assigning a plurality of different measurement positions to the processing positions; and for capturing an image of the workpiece in an area of a current machining position by means of at least one image capturing device.
  • the at least one representation includes a representation, also referred to below as a measurement position representation, which is based on the image of the workpiece and from which the current machining position and optical information can be seen that allow conclusions to be drawn about the different measurement positions assigned to the current machining position.
  • the method and the device according to this aspect permit simple handling, since a user can easily understand at which point on the workpiece measurements are taking place, for example in front of a keyhole, directly at the keyhole or behind it in the main machining direction.
  • its structure, geometry and nature can be recognized on the basis of an actual image of the workpiece and can be intuitively related to selected and/or to be selected measurement positions.
  • it can be quickly checked whether the corresponding measurement positions are suitable for reliably monitoring the machining process.
  • the optical information can allow conclusions to be drawn about the actual measurement positions on the workpiece, for example through a geometric mapping, in particular through an elongation and/or compression and/or rotation.
  • the visual information can be used to identify whether the measurement positions are set in the desired order.
  • the optical information can include, for example, light reflections and/or superimposed markings that can be seen in the representation and that lie on a geometrically stretched and/or compressed and/or rotated measurement figure. Even if the optical information to be recognized, for example corresponding light reflections and/or markings, do not occur at the actual measurement positions, the user can nevertheless intuitively determine whether the current measurement position is consistent with the previously selected measurement position(s).
  • the optical information can be used to identify how the measurement positions are arranged relative to one another.
  • the measurement position display is based on the image of the workpiece, with the current machining position and the measurement positions assigned to the current machining position being able to be seen.
  • the measurement positions themselves can be directly represented by the visual information.
  • the measurement positions can include points to which the measurement beam is directed and/or should be directed and/or was directed.
  • the measurement positions can also be midpoints, starting points and/or endpoints of a number of measurement lines that are to be scanned using the measurement beam.
  • the measurement positions can include a position in the area of the current processing position and one or more positions in the main processing direction in front of and/or behind the current processing position.
  • the at least one representation, in particular the measurement position representation, includes a reproduction of the captured image of the workpiece, the measurement positions being recognizable in the captured image as light reflections of the measuring beam on the workpiece.
  • the light reflections form the optical information discussed above.
  • the image capturing device can have a filter with which ambient light, process light and/or illumination light is attenuated. This can be used in particular to make the light reflections of the measuring beam visible.
  • the image capturing device can have a suitable light sensitivity at least in the wavelength range of the measuring beam.
  • the at least one representation is based on the acquisition of an image by the image acquisition device, in which a light reflection of the measuring beam can be seen, which is caused by reflection of the measuring beam on a surface different from the workpiece.
  • the light reflections can originate from a reflection of a portion of the measuring beam transmitted through the measuring beam deflection device. This allows actual measurement positions to be determined even though reflections of the measuring beam on the workpiece cannot be detected.
  • Handling can be improved in particular if marking coordinates are calculated as optical information, which describe the position on the captured image at which the measurement positions are located relative to the workpiece.
  • the measurement positions can be identified by markings as recognizable optical information that is located at the calculated marking coordinates. A user can thus assess whether the measuring beam is correctly positioned.
  • the marking coordinates can be calculated using control signals for the measuring beam deflection device and/or the deflection device and/or the processing beam deflection device.
  • the marking coordinates can theoretically be specific coordinates at which the measuring beam is to be expected to strike. Alternatively, the marking coordinates can be based on actual impact points of the measuring beam.
  • the marking coordinates can be determined on the basis of reflections of the measuring beam on the workpiece, for example if these cannot be seen with the naked eye but can be determined by evaluating image information from the image acquisition device.
  • the markings can be superimposed on an image generated by the image acquisition device, in particular a real-time image.
  • An assessment as to whether selected measurement positions are suitable can be carried out particularly reliably if the measurement beam is directed at at least one alignment measurement position and at least one image is captured by means of the image acquisition device in which a light reflection of the measuring beam belonging to the at least one alignment measurement position can be seen, which is caused by the reflection of the measuring beam on a surface that is different from the workpiece.
  • a geometric mapping can be determined that describes a relationship between coordinates of the light reflection on the recorded image and coordinates of the alignment measurement position on the recorded image. In this example, too, the light reflections form the optical information recognizable in the representation.
  • the geometric mapping can be used to calculate the marker coordinates.
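One way to determine such a geometric mapping from a few reflex/position correspondences is a least-squares affine fit; modelling the mapping as affine (stretch, compression, rotation, translation) is an assumption consistent with the transformations mentioned above:

```python
import numpy as np

def fit_affine(src, dst):
    """Fit a 2D affine map (2x3 matrix) from reflex coordinates on the
    captured image (src) to alignment measurement positions (dst) by least
    squares; at least three non-collinear point pairs are required."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    A = np.hstack([src, np.ones((len(src), 1))])  # homogeneous coordinates
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return M.T  # 2x3: rows produce (x', y')

def apply_affine(M, pts):
    """Apply the fitted mapping, e.g. to compute marking coordinates."""
    pts = np.asarray(pts, float)
    return (M[:, :2] @ pts.T).T + M[:, 2]

# example: uniform scale by 2 plus translation (2, 3)
M = fit_affine([(0, 0), (1, 0), (0, 1)], [(2, 3), (4, 3), (2, 5)])
```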
  • reflections of the measuring beam that do not originate from reflections on the workpiece can be detected by means of the image detection device.
  • a position of such a reflection on an image of the image acquisition device can be clearly related to a setting of the measuring beam deflection device and/or the deflection device and/or the processing beam deflection device, and in particular to the position of the measuring beam on the workpiece.
  • the markings are then displayed at positions where the measurement beam is actually located. This can be the case even though the measuring beam cannot be detected in the image of the image capturing device at its actual impingement positions.
  • Occurring optical reflections can be avoided in a targeted manner and/or recognizable optical information can be used in a targeted manner within the scope of a calibration measurement before the actual process monitoring if at least one optical filter is used during processing to prevent the image capture device from detecting reflections of the measuring beam on a surface that is different from the workpiece.
  • the optical filter can be arranged between the measuring beam deflection device and the image acquisition device.
  • the optical filter is preferably at least essentially impermeable to light from the measuring beam, in particular to light in a corresponding wavelength range.
  • the optical filter can have an optical density OD of at least 2, at least 3, at least 4 or even at least 5 for the measuring beam.
  • the optical filter can be automatically moved into and out of a beam path and/or can be designed to be manually removable.
  • the at least one representation, in particular the penetration depth representation and/or the geometry representation and/or the measurement position representation, can include an interactive representation.
  • a user interface can be provided, by means of which user inputs can be entered which relate to the interactive display.
  • the user interface can include a keyboard and/or a computer mouse and/or a trackball and/or a touchpad and/or a touch display.
  • the interactive display can depend, for example, on a so-called mouse-over. According to one embodiment, numerical values for specific data points are displayed if a mouse pointer is located over the specific data point for at least a predetermined period of time. According to a further embodiment, only penetration depth values that meet a specific condition specified by the user are displayed.
  • penetration depth values can be shown or hidden that are assigned to machining position coordinates which exceed or fall below a certain distance from the main machining path.
  • penetration depth values that exceed or fall below a specific depth value can be faded in or faded out.
  • the interactive representation is generated, for example, by taking into account a magnification specified by the user and/or a viewing angle specified by the user and/or a viewing position specified by the user.
  • the viewing position can be a position in three-dimensional space.
  • the interactive representation can include a perspective three-dimensional representation, the perspective of which depends in particular on the viewing position specified by the user. A high degree of user-friendliness can be achieved in this way, since the user can view data in different ways and can therefore intuitively understand their meaning.
  • the machining position coordinates can lie in a machining plane.
  • Relevant measurement results can be taken into account particularly efficiently if the at least one representation, in particular the penetration depth representation, also includes a representation that is based on a selection of machining position coordinates that are plotted on a single axis.
  • This representation can be a measurement curve, for example.
  • the individual axis can result from a projection, for example a projection along an axis of a coordinate system of the machining position coordinates or a projection in any direction.
  • the representation based on a selection of machining position coordinates plotted on a single axis may correspond to a 2D slice.
  • the single axis can also be defined by a specified path, for example a user-specified path in the processing plane.
  • the specified path can be the machining path and/or the main machining path, which provides an intuitively comprehensible overview of the effects of set process parameters.
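Mapping measurement points onto a single axis defined by such a path can be sketched via the arc length along the path; the nearest-vertex projection used here is a simplification of a true perpendicular projection:

```python
import numpy as np

def arc_length_coordinate(path, points):
    """Project measurement points onto a polyline path and return, for
    each point, the arc length of the nearest path vertex, so that the
    associated penetration depth values can be plotted over one axis."""
    path = np.asarray(path, float)
    pts = np.asarray(points, float)
    seg = np.linalg.norm(np.diff(path, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])  # arc length per vertex
    d = np.linalg.norm(pts[:, None, :] - path[None, :, :], axis=2)
    return s[np.argmin(d, axis=1)]

s_vals = arc_length_coordinate([(0, 0), (1, 0), (1, 1)],
                               [(0.1, 0.0), (1.0, 0.9)])
```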
  • the specified path can also be freely selectable and/or specifiable by the user, as a result of which a user can specifically and individually analyze specific areas.
  • the at least one display, in particular the penetration depth display, includes a display that relates to monitoring and/or real-time monitoring and/or regulation of an ongoing machining process.
  • the representation relating to monitoring and/or real-time monitoring and/or regulation may be based on a user-specified limit. This can be the limit value mentioned above, which is specified in particular by drawing.
  • parameters of a monitoring and/or a regulation can be adapted quickly and reliably. The user can easily understand from the illustration how to select a suitable limit value and enter it directly.
  • the at least one representation, in particular the penetration depth representation or the measurement position representation, can include a representation that is based on the deviation between the current measurement position and the current processing position. This makes it easy for the user to understand whether any parameters, for example PID values for position control, or other process parameters need to be adjusted in order to correctly position the measuring beam relative to the processing beam.
  • the measurement position can be actively regulated, for example on the basis of a real-time image from the image acquisition device, a feed speed along the main machining direction, a parameter describing the above-mentioned oscillating movement or the like.
  • the current measurement position and the current processing position can be displayed in the at least one display, for example as two markings.
  • the at least one display can also include an area in which a current distance between the current measurement position and the current processing position is displayed, for example as a numerical value and/or using a graphic illustration.
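The displayed distance between measurement and processing position reduces to a simple deviation check; the tolerance parameter is an illustrative assumption:

```python
import math

def position_deviation(meas_pos, proc_pos, tolerance):
    """Compute the distance between the current measurement position and
    the current processing position and report whether it stays within a
    user-defined tolerance (for display as a numerical value)."""
    dx = meas_pos[0] - proc_pos[0]
    dy = meas_pos[1] - proc_pos[1]
    dist = math.hypot(dx, dy)
    return dist, dist <= tolerance

dist, ok = position_deviation((1.0, 2.0), (1.3, 2.4), tolerance=0.6)
```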
  • One aspect of the invention relates to a control unit for monitoring a machining process of a workpiece using a high-energy machining beam, the control unit being set up and programmed to carry out the method steps of a method according to the invention.
  • the control unit can have at least one processor. Furthermore, the control unit can have at least one storage medium in which instructions are stored which can be executed by the processor. The instructions may be such that their execution by the processor causes the method steps to be performed.
  • the device according to the invention can include the control unit according to the invention.
  • One aspect of the invention relates to a computer program product, comprising instructions which, when executed by means of a data processing device, cause the method steps of a method according to the invention to be carried out.
  • the computer program product can include a volatile and/or a non-volatile storage medium on which the instructions are stored.
  • the computer program product can be part of the control unit according to the invention.
  • the data processing device can be part of the control unit, in particular the processor of the control unit.
  • One aspect of the invention relates to a machining system for carrying out and monitoring a machining process, in particular laser welding, of a workpiece using a high-energy machining beam, comprising: a machining unit with a machining beam source for generating the high-energy machining beam and with machining beam optics for projecting and/or focusing the machining beam onto a machining position on the workpiece; an optical coherence tomograph for generating a measuring beam, wherein the measuring beam can be coupled into the processing beam; and a control unit that is set up to carry out a method according to the invention.
  • the processing system can comprise a device according to the invention.
  • FIG. 1 shows a machining system with a device for monitoring a machining process of a workpiece in a schematic representation
  • FIG. 2 shows a schematic flow chart of a method for monitoring a machining process of a workpiece
  • FIG. 3 shows a schematic flowchart of a method according to a first development
  • FIG. 5 shows a further exemplary representation that can be generated in the method according to the first development
  • FIG. 7 shows a schematic representation of a head-mounted display
  • Figure 8 is a schematic representation of part of an alternative processing system
  • FIG. 9 shows an exemplary representation that can be generated in a method according to a second development
  • Figure 11 is a schematic representation of part of another alternative processing system
  • FIG. 13 shows an exemplary representation that can be generated by means of the device and/or in the method according to the third development.
  • the device 12 includes a measuring unit 14.
  • the measuring unit 14 includes an optical coherence tomograph 16 with an OCT measuring device 18, which is connected via a beam splitter 20 to a measuring arm 22 and a reference arm 24.
  • the OCT measuring device 18 is designed with a measuring beam source 26 for generating a measuring beam 28 and a spectrometer 30 for detecting a superimposed measuring beam. Furthermore, the OCT measuring device 18 can include a circulator 32, which connects the beam splitter 20 to the measuring beam source 26 or the spectrometer 30 via a transport fiber or an optical fiber 34, depending on the direction of travel of the incident light.
  • the measuring arm 22 of the optical coherence tomograph 16 extends through a measuring optics 36 of the measuring unit 14 to the workpiece W. This will be discussed in more detail below.
  • the OCT measuring beam 28 is coupled into the measuring optics 36 via an interface 38 of the measuring optics 36 and passes through a displaceable collimation lens 40, which can be adjusted in the direction of the arrow 42 in the case shown.
  • the measuring unit 14 includes a measuring beam deflection device 44.
  • the measuring beam deflection device 44 includes two movable mirrors that can be pivoted in two different spatial directions. As a result, the measuring beam can be shifted in a targeted manner.
  • the processing system also includes a processing unit 46 with a processing beam source 48 for generating a processing beam 50.
  • the processing unit 46, like the measuring unit 14, can have an adjustable collimation lens 52 at its disposal.
  • the processing unit 46 includes a processing head 54 .
  • the processing beam 50 can be coupled into the processing head 54 via an interface 56 of the processing head 54 .
  • the processing head 54 can have a further interface via which the measuring unit 14 is connected in such a way that the measuring beam 28 can also be coupled into the processing head 54 .
  • the measuring optics 36 can also be provided directly in the processing head 54 .
  • the machining head 54 is arranged on an industrial robot (not shown), by means of which the machining head 54 can be moved relative to the workpiece W. In other embodiments, the workpiece W can alternatively or additionally be movable.
  • the device 12 includes a deflection device 56.
  • the deflection device 56 is part of the processing unit 46.
  • the deflection device 56 can include a processing beam scanner.
  • the deflection device 56 allows displacement in two spatial directions.
  • the deflection device 56 expediently comprises two mirrors which are arranged one behind the other along the optical processing beam and which can each be pivoted in one spatial direction. By moving both mirrors, a beam can be shifted in one plane.
  • the processing beam 50 can be displaced by means of the deflection device 56 so that a point of impact of the processing beam 50 on the workpiece W can be changed.
  • the processing unit 46 comprises a partially transparent mirror 25. This directs the processing beam 50 onto the deflection device 56.
  • the measuring beam 28 passes through the partially transparent mirror 25 and also strikes the deflecting device 56.
  • the measuring beam 28 can thus be coupled into the processing beam 50.
  • the measuring beam 28 and the processing beam 50 can thus be displaced together by means of the deflection device 56 . Since the measuring beam 28 can also be displaced by means of the measuring beam deflection device 44, the point of impingement of the measuring beam 28 on the workpiece W can be displaced relative to the point of impingement of the machining beam 50 and independently of the machining beam 50 on the workpiece. As a result, measurements are possible both at a current machining position and at measurement positions that are different therefrom.
  • the processing unit 46 also has focusing optics 32 .
  • the focusing optics 32 comprise an f-theta lens which is arranged behind the deflection device 56 .
  • a pre-focusing optic can be provided immediately in front of the deflection device 56 .
  • the processing beam 50 and the measuring beam 28 both pass through the focusing optics 32 .
  • the measuring arm 22 extends from the beam splitter 20 to the point of impingement of the measuring beam 28 on the workpiece W.
  • the reference arm 24 is adapted to the measuring arm 22 in terms of its optical properties, in particular with regard to its optical path and/or its dispersion. In this way, distance values can be determined by means of OCT in a known manner by superimposing the light running in the measuring arm 22 or in the reference arm 24 .
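• the known manner of determining distance values by superimposing the light from the measuring arm 22 and the reference arm 24 can be sketched as follows: in spectral-domain OCT, the spectrometer records an interferogram whose modulation frequency over the wavenumber encodes the optical path difference between the two arms, which a Fourier transform makes directly readable. The sketch below is a minimal illustration under simplifying assumptions (uniform wavenumber sampling, a single dominant reflection); the function and parameter names are hypothetical and not part of this description:

```python
import numpy as np

def oct_distance(spectrum, k):
    """Estimate the optical path difference between measuring arm and
    reference arm from a spectral-domain OCT interferogram.

    spectrum : intensity samples over wavenumber (uniform spacing assumed)
    k        : wavenumber axis in rad/m
    Returns the one-sided optical path difference in metres."""
    signal = spectrum - np.mean(spectrum)   # remove the DC background
    amplitude = np.abs(np.fft.rfft(signal))
    amplitude[0] = 0.0                      # suppress residual DC
    peak_bin = int(np.argmax(amplitude))
    dk = k[1] - k[0]                        # wavenumber step
    # A path difference z modulates the spectrum as cos(2*z*k), i.e. with
    # angular frequency 2*z per unit wavenumber; rfft bin m corresponds to
    # an angular frequency of 2*pi*m/(N*dk), hence z = pi*m/(N*dk).
    return np.pi * peak_bin / (len(signal) * dk)
```

A simulated interferogram with a known path difference reproduces that difference from the peak bin of the Fourier-transformed spectrum.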
  • the processing system 10 includes a control unit 62 (see FIG. 6).
  • the control unit 62 can be a central control unit that controls both the measuring unit 14 and the processing unit 46 .
  • the processing unit 46 and the measuring unit 14 can have their own control units.
  • the control unit 62 or control units comprise at least one processor and a memory for temporary and/or permanent storage of raw data, pre-processed data, evaluated data, program code and the like.
  • the functions of the control unit 62 are discussed again below. It goes without saying that the mentioned functions of the processing system 10 and its components as well as the mentioned methods are carried out by the control unit.
  • a computer program product (not shown) is also provided, which includes instructions that can be executed by the control unit in order to carry out the control functions mentioned.
  • the computer program product can include an in particular non-volatile storage medium on which the instructions are stored.
  • the workpiece W is machined according to a machining figure 58, which defines a plurality of machining positions 64, 66, 68 with associated machining position coordinates.
  • the processing figure 58 defines a processing path along which the processing beam 50 is moved during processing.
  • the machining position coordinates are, for example, in a machining plane and are two-dimensional machining position coordinates; however, three-dimensional machining position coordinates can also be used, depending on the geometry of the workpiece W or the machining path to be traversed.
  • the machining path basically runs in a main machining direction 60. This can define a main machining path. In the example shown, the main machining direction 60 is constant and the main machining path is a straight line.
  • the processing figure 58 consequently defines a wavy processing path on which the different processing positions 64, 66, 68 lie.
  • a high weld seam quality can be achieved by guiding the processing beam 50 according to this processing figure 58 over a connection point of the workpiece W', W".
• the oscillating movement of the processing beam 50 which follows the processing figure 58 can, for example, have a frequency of at least 50 Hz.
  • the processing figure 58 can be specified by the control unit 62 as a figure without an oscillating movement, on which said oscillation is superimposed. Knowing the oscillation amplitude and frequency, the respective machining position coordinates can then be determined starting from the figure without oscillation movement.
  • the oscillating movement is sinusoidal in the example shown.
  • the oscillating movement of the processing beam 50 comprises, for example, sawtooth-shaped, zigzag-shaped, circle-segment-shaped, spiral-shaped and/or rectangular partial movements.
  • beam properties such as a beam diameter and/or a beam shape and/or a power and/or a wavelength can be changed periodically, in particular with the same frequency at which the oscillating movement takes place.
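• the determination of machining position coordinates starting from the figure without oscillation movement, with known oscillation amplitude and frequency, can be sketched as follows (a minimal illustration with a straight main path and a sinusoidal oscillation; names and parameters are hypothetical):

```python
import math

def wobble_positions(feed_rate, amplitude, frequency, duration, dt):
    """Derive machining position coordinates from a figure without
    oscillation (here: a straight main path along x) by superimposing a
    sinusoidal transverse oscillation of known amplitude and frequency.
    Real machining figures may instead use sawtooth, zigzag, circular or
    spiral partial movements.

    Returns a list of (x, y) machining position coordinates."""
    positions = []
    t = 0.0
    while t <= duration:
        x = feed_rate * t                                        # main path
        y = amplitude * math.sin(2.0 * math.pi * frequency * t)  # oscillation
        positions.append((x, y))
        t += dt
    return positions
```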
  • measuring positions 70, 72, 74 are defined at at least some of the processing positions and/or in their vicinity.
  • measurement positions are shown as an example, which are assigned to the respective processing positions 64, 66, 68.
• the measuring beam 28 is also displaced during processing with the processing beam 50 and is directed into a keyhole which forms at the current processing position 64, 66, 68 or directly behind it in the main processing direction. Such a procedure is chosen when a penetration depth is to be determined.
  • the device 12 can also be operated in such a way that height information can be obtained at locations other than at a region of the workpiece W that is currently being machined.
  • the measuring beam 28 can be displaced independently of the processing beam 50 for this purpose.
  • the measuring beam 28 can be directed to measuring positions that run, for example, along one or more measuring lines transversely and/or obliquely and/or parallel to the main processing direction 60 .
  • the measurement lines can be positioned in the main processing direction 60 in front of or behind the keyhole and/or a current processing position 64, 66, 68 and, in particular, can also run through the keyhole.
  • Measuring lines can also run through the keyhole, for example transversely and/or obliquely and/or parallel to the main machining direction 60. Measuring lines can be straight or curved in any way.
  • a brightness of the reference arm 24 can be changeable.
• a reference arm with adjustable brightness is known, for example, from DE 10 2017 218 494 A1.
  • the brightness of the reference arm 24 can be adjusted to a reflectivity of an object to be measured such as the workpiece W, for example.
  • the power of the measuring beam source 26 can be adjusted.
• the spectrometer 30 can include a sensor, in particular a line sensor, whose exposure time can be adjusted. This can also result in an adaptation to a reflectivity.
  • reference is also made to DE 10 2015 007 142 A1.
  • the division ratio between the reference arm 24 and the measuring arm 22 is adjusted and/or changed, for example by the beam splitter 20 being selected and/or changed accordingly.
  • a reference arm that can be adjusted by a motor can be advantageous, particularly in applications with a processing scanner, in order to be able to compensate for the beam lengthening caused by deflection of the processing scanner, which is described in DE 10 2013 008 269 A1, for example.
  • the reference arm is located inside a housing of the optical coherence tomograph 16 or outside of the processing head 54.
  • a measurement result can be changed due to temperature and/or mechanical influences.
• if the reference arm 24 is designed as an optical fiber, as is usually the case, temperature fluctuations can be determined by measuring the temperature of the fiber, for example with copper wires running in parallel, and can be compensated for by adjusting the reference arm 24.
  • the measuring arm and the reference arm run at least in sections in a common fiber.
  • DE 10 2019 002 942 A1 describes how, in this case, thermal and mechanical disturbances affect the reference arm and measuring arm in equal measure.
• the reference arm is mounted on or near the processing head. If the processing head is mounted on a robot, it is advantageous if the optical structure is robust and light at the same time. If an adjustable reference arm is to be used, it can be advantageous in this case in particular to use a fiber stretcher, as described in DE 10 2019 001 858 B3, instead of a reference arm with a free-beam section and adjustable mirrors.
  • Reference arm and measuring arm can differ in terms of their dispersion, for example due to different densities of the materials used. This can have a negative effect on the measurement result.
  • DE 10 2015 015 112 A1 describes how hardware-based dispersion compensation can be carried out. In general, dispersion compensation can also be carried out using software. Software algorithms can be executed, for example, by one or more graphics cards or by an FPGA (Field Programmable Gate Array).
  • the measurements using the optical coherence tomograph can be subject to a certain drift. From time to time it can therefore make sense to re-reference the system.
  • a thin pane of glass or another suitable optical element can be used for this purpose, which is inserted perpendicularly into the measuring beam 28, for example between the measuring beam deflection device 44 and the partially transparent mirror 25.
  • the wobbling mentioned can be carried out in such a way that the measuring beam deflection device 44 and the deflection device 56, for example a measuring beam scanner and a processing beam scanner, are synchronized so that an oscillating movement of the deflection device 56 is compensated for by the measuring beam deflection device 44.
  • Such a procedure is described in DE 10 2015 015 330 A1, for example.
  • the OCT measurement is then unaffected by the oscillating movements.
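• the synchronization described above, in which the measuring beam deflection device compensates for the oscillating movement of the processing scanner, can be sketched as a simple subtraction of the processing scanner's current oscillation offset from the desired measuring position (a sketch under the assumption of a common workpiece coordinate frame; names are hypothetical):

```python
def compensated_measuring_command(desired_meas_pos, wobble_offset):
    """Command for the measuring beam deflection device: because the
    measuring beam also passes through the processing scanner, the
    scanner's current oscillation offset must be subtracted so that the
    measuring spot stays at the desired position on the workpiece.

    Both arguments and the result are (x, y) tuples."""
    return (desired_meas_pos[0] - wobble_offset[0],
            desired_meas_pos[1] - wobble_offset[1])
```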
• In real operation, it must be checked from time to time whether the measuring beam 28 and the processing beam 50 are still aligned coaxially. If necessary, this must be corrected by applying an offset to the measuring beam deflection device 44. Different methods are known for this measurement. A hole can be shot into a metal sheet or foil with the processing beam 50. This hole is then searched for by changing the position of the measuring beam 28. Furthermore, a camera sensor can be attached to the processing head 54 or generally to a processing beam optics, or an optical position sensor can be provided. As described in DE 10 2015 012 565 B3, the points of impact of the processing beam 50 and the measuring beam 28 are observed and a distance between them is measured.
• An offset between the two beams 28, 50 can also be determined by means of a photodiode with an aperture, as is known from DE 10 2018 219 129 B3. First, it is determined at which position of the deflection device 56 the processing beam 50 delivers a maximum signal to the photodiode. The aperture is then scanned using the measuring beam 28. The corresponding positions of the scanners can then be linked. Another possibility is described in DE 10 2016 106 648 A1. A checkerboard pattern is engraved in a metal sheet using the processing beam. This checkerboard pattern is then traversed again, but without the processing beam, and measured using the measuring beam. The measured pattern is laid over the specified pattern. The offset between the processing beam and the measuring beam is then obtained and can be taken into account for a correction.
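• the photodiode-and-aperture procedure can be sketched as two scans that each search for the position of maximum signal, the offset being the difference of the two maxima (a sketch with hypothetical callbacks; the concrete scan pattern and hardware interface are not prescribed here):

```python
def find_max_signal_position(positions, read_signal):
    """Return the scanner position at which the photodiode behind the
    aperture delivers the maximum signal.  `read_signal(pos)` is a
    hypothetical callback that commands the scanner to `pos` and
    returns one photodiode reading."""
    best_pos, best_val = None, float("-inf")
    for pos in positions:
        val = read_signal(pos)
        if val > best_val:
            best_pos, best_val = pos, val
    return best_pos

def beam_offset(positions, read_processing_signal, read_measuring_signal):
    """Offset between processing and measuring beam, determined as the
    difference of the scanner positions that maximise the aperture
    signal for each beam."""
    p = find_max_signal_position(positions, read_processing_signal)
    m = find_max_signal_position(positions, read_measuring_signal)
    return (m[0] - p[0], m[1] - p[1])
```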
  • a repeatable start of the OCT measurement is of crucial importance, especially when measuring penetration depths.
  • a measurement can be started by different trigger signals, for example by a signal that represents switching on the processing beam, a signal from a photodiode or a signal that indicates a change in position of the processing beam.
  • Suitable envelope curves can be used to improve the quality of the measurement data generated by means of the optical coherence tomograph 16 in order to be able to classify the measured distance values, as is described in DE 10 2019 002 942 A1, for example.
  • the advantage of OCT in laser beam welding is the coaxial course of the measuring beam 28 and the processing beam 50.
• interfering contours can thus be better avoided. The free deflection of the measuring beam 28 by a scanner enables a highly dynamic variation of the measuring position. This allows a measurement that is independent of the feed direction.
  • a typical offline measurement is, for example, the position measurement of a hairpin pair of an electric motor before subsequent welding with the processing beam 50.
• measuring lines are often used that are aligned transversely or obliquely to the processing direction. This has already been mentioned above and is described, for example, in DE 10 2015 007 142 A1.
  • the keyhole follows the processing beam 50 with a slight delay.
• lines, spirals, meanders, etc. are scanned over the suspected keyhole position. The measurement data of several lines can be averaged or added up for a better determination of the keyhole position.
• with an oscillating processing beam 50, it depends on the oscillation amplitude whether it makes sense for the in-process measuring position to track the oscillating movement or the resulting processing direction.
  • Small oscillating movements for example in the range of +/- 0.1 mm, tend to create a position-stable, wide keyhole, whereas larger oscillating movements change the keyhole size less, but can have an influence on the keyhole position.
  • the oscillation frequency also plays an important role. This can be limited by the speed of the deflection device 56 .
  • the online position tracking can, for example, either take place via a fast communication interface between deflection device 56 and measuring beam deflection device 44, or be stored in advance in the form of a file in measuring beam deflection device 44 or its control unit and played back with little synchronization to deflection device 56 (replay mode).
• the depth values of the welding depth measurement are usually given in µm relative to the sheet metal surface.
  • the sheet metal surface can be measured offline and then made available as a saved value during welding.
  • the OCT scanner can occasionally direct the measuring beam not into the keyhole, but onto the sheet metal surface in order to record the reference. If the in-process measurement is more or less synchronous with an online pre- or post-process measurement, these height values can also be used as a reference for the in-process measurement.
  • Gaps between two workpieces lying one on top of the other can be detected if the measuring beam 28 does not hit the keyhole during the in-process measurement, but at least partially hits the upper sheet metal surface of the lower workpiece.
• the movement of the processing beam 50 may be composed of the movements of all robot, gantry and processing scanner axes. The same applies to the measuring beam; in addition, there are the movements of those axes that define the independent mobility of the measuring beam. Special controls can be designed to synchronize all axes with each other (on-the-fly movement).
  • the above-mentioned axes for the processing beam 50 can usually be used as an actuator for position correction.
  • Offline pre/post measurements can require additional cycle time, since no processing usually takes place during the measurement. If the distance between two processing points is small, one processing point could be measured offline pre/post while the other processing point is being actively processed. Since the measurement beam 28 is also deflected by the deflection device 56, in some embodiments the measurement beam deflection device 44 must compensate for the movement of the deflection device 56 in addition to the measurement figure. For this purpose, the same synchronization of the two deflection devices or scanners can be used as for processing with an oscillating processing beam (see above).
  • Another way to shorten the cycle time can be to increase the sampling frequency of the OCT system.
  • the limitation usually lies in the speed of the sensors used (e.g. photodiode, line scan camera).
  • a number of OCT sensors can be connected in parallel and, in particular, can be synchronized by a common measuring beam deflection device.
  • Methods are described below that can be carried out with the device 12 or with modifications or developments of the same. It goes without saying that, depending on the configuration of the method, the device 12 or its at least one control unit 62 and its other components are suitably set up to carry out the respective method. Components that are not required for the particular method may be omitted in some embodiments.
  • the methods are described with method steps that are carried out in a specific order by way of example. Process steps that can be carried out independently of one another can also be carried out in a different order than that described.
  • FIG. 2 shows a schematic flowchart of a method for monitoring a machining process of a workpiece W.
  • the workpiece W is machined using the machining beam 50 .
  • the machining process is laser welding, for example.
  • a processing figure 58 is defined, as is shown by way of example in FIG. This defines several machining positions 64, 66, 68 with associated machining position coordinates.
  • the processing beam 50 is generated and projected and/or focused onto the workpiece according to the processing figure 58 .
  • the measuring beam 28 is generated by means of the optical coherence tomograph 16.
  • measurement positions 70, 72, 74 on the workpiece W are defined or set at and/or in the vicinity of specific machining positions 64, 66, 68.
  • the measurement positions 70, 72, 74 can generally lie on a measurement figure.
  • measured values are determined at the measurement positions 70, 72, 74 by directing the measurement beam 28 at the measurement positions 70, 72, 74.
  • the measured values are determined in a known manner from the available raw data. Suitable classification algorithms, filter algorithms and/or correction algorithms can be used for this.
  • a representation is generated that is based on multi-dimensional, location-dependent information that relates to the machining position coordinates, the measured values and/or the measured positions 70, 72, 74.
  • multi-dimensional, location-dependent information can in particular be taken from the representation.
• the representation includes, for example, a chart with values that are each assigned to at least one two-dimensional coordinate, for example an xyz diagram.
  • FIG. 3 shows a schematic flowchart according to a first development of the method.
  • the method according to the first development includes additional method steps.
• in a step S11, penetration depth values of the processing beam 50 into the workpiece W at the processing positions 64, 66, 68 are determined by means of the measuring beam 28 by directing it at corresponding measuring positions 70, 72, 74.
  • the penetration depth values are assigned to the respective machining position coordinates.
  • control unit 62 is set up to control both the deflection device 56 and the optical coherence tomograph 16 .
  • the control unit 62 can also process encoder positions reported back by the hardware used. The assignment then takes place directly via the control unit 62. The movement of the measuring beam 28 and the processing beam 50 is therefore based on a common time specification.
• the assignment can take place by using a suitable trigger signal and/or control commands for the deflection device 56 for synchronization.
  • separate control units can be used for the processing unit 46 and in particular the deflection device 56 on the one hand and the measuring unit 14 and in particular the optical coherence tomograph 16 on the other hand.
  • the deflection device 56 is then controlled independently of the OCT control.
  • the controller of the scanner 56 can transmit its position commands to the OCT controller via a high-speed interface.
  • the measuring unit 14 can have a component inserted in a control line to the deflection device 56, by means of which the currently requested position of the deflection device 56 is picked up.
• the control unit of the measuring unit 14 knows the current period, and a trigger signal is only sent from the controller of the deflection device 56 to the controller of the measuring unit 14 at suitable points in time.
  • the at least one representation is generated in such a way that the at least one representation includes a penetration depth representation that is based on the associated penetration depth values and machining position coordinates. Examples of representations are explained below.
  • FIG. 4 is an exemplary penetration depth representation 76 that can be generated in the method according to the first development.
  • part of the processing figure 58 is represented in a plan view.
  • An oscillating course of the processing figure 58 can be seen due to a wobbling movement of the processing beam 50.
• the processing beam 50 moves locally at a changing speed, depending on how large a portion of the movement is parallel to the main processing direction 60.
• the speed is lowest at the outer extreme points, at which the tangential movement takes place completely parallel to the main machining direction 60, and is highest in the region of the inflection point in the middle of the oscillating movement. Consequently, the penetration depth of the processing beam 50 varies periodically along the processing figure 58.
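• the described speed variation can be made explicit: for a straight main path with a superimposed sinusoidal oscillation, the tangential speed follows from the feed component and the transverse velocity component (a sketch with hypothetical names and parameters):

```python
import math

def local_speed(t, feed_rate, amplitude, frequency):
    """Tangential speed of the processing beam at time t for a straight
    main path with a superimposed sinusoidal oscillation.  The speed
    equals the feed rate at the outer extreme points (where the
    transverse velocity is zero) and is highest where the beam crosses
    the centre line."""
    omega = 2.0 * math.pi * frequency
    vx = feed_rate                                 # constant feed component
    vy = amplitude * omega * math.cos(omega * t)   # transverse component
    return math.hypot(vx, vy)
```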
  • the keyhole formed at the corresponding processing positions thus has different height profiles 78, 80, 82. These can be determined, for example, by moving the measuring beam 28 in a known manner transversely to the main processing direction 60 or also transversely to the current processing direction over the keyhole that forms during processing.
• The penetration depth profile 84 is thus based on an averaging over a plurality of penetration depth values at corresponding positions transverse to the main machining direction 60, which are traversed multiple times due to the wobbling.
  • FIG. 5 shows another exemplary penetration depth display 86 that can be generated in the method according to the first development.
  • the measured penetration depth values are illustrated as a three-dimensional curve 87 in the penetration depth representation 86 .
  • the penetration depth values are plotted against the assigned machining position coordinates of the machining figure 58 . This makes it intuitively recognizable how the penetration depth changes along the processing figure 58 .
  • the penetration depth values and/or the machining position coordinates can be subjected to at least one data analysis.
  • the penetration depth display is then based on this data analysis.
  • This can be a statistical analysis, for example. Basically, it is possible to determine mean values and, if necessary, to sort out outliers that are implausible and, for example, result from reflections of the measuring beam on the flanks of the keyhole, on the workpiece surface or on other objects in the area or are due to other measurement artifacts.
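• such a statistical analysis with outlier rejection can be sketched, for example, with a median-based criterion (one possible choice among many; the description does not prescribe a concrete classification or filter algorithm):

```python
import statistics

def robust_mean_depth(depth_values, k=3.0):
    """Average penetration depth values after sorting out implausible
    outliers (e.g. reflections of the measuring beam on keyhole flanks
    or on the workpiece surface) using a median/MAD criterion.

    Values farther than k median-absolute-deviations from the median
    are discarded before averaging."""
    med = statistics.median(depth_values)
    mad = statistics.median(abs(v - med) for v in depth_values)
    if mad == 0:
        kept = [v for v in depth_values if v == med]
    else:
        kept = [v for v in depth_values if abs(v - med) <= k * mad]
    return statistics.mean(kept)
```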
  • the penetration depth display 86 can be interactive.
• a user interface can be provided for this purpose, which includes, for example, a keyboard and/or a mouse or the like, by means of which the user can influence the display. For example, if the user clicks on or hovers over a specific processing position on the processing figure, a display field 88 is generated and displayed that contains information about the processing position, such as its processing position coordinates and the penetration depth value measured there.
  • the user can also create two-dimensional slices or projections, such as by spanning a two-dimensional surface 90 that intersects curve 87, as shown.
  • the corresponding penetration depth profile is then displayed, for example, in a display field 92 that is then additionally displayed.
  • the user can also specify a projection direction and/or select an area over which an average is to be taken.
  • Such an averaging can, for example, form the basis of a penetration depth profile 84, as shown in FIG. Averaging can take place along the main machining direction, perpendicularly to it, or at an angle to it.
  • the user can define an area, such as a cuboid area 94 .
  • a part of the curve 87 contained therein can then be shown enlarged and possibly modified according to further functions.
  • the corresponding enlarged representation is then displayed, for example, in a display field 96 that is then additionally displayed.
  • a false color representation can also be used. For example, a bar with a color gradient and associated penetration depths is displayed. The curve 87 is then colored according to this color gradient.
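• the false color representation can be sketched as a linear mapping from penetration depth to a color gradient (the gradient below is an arbitrary choice for illustration, not one prescribed by the description):

```python
def depth_to_color(depth, d_min, d_max,
                   c_min=(0, 0, 255), c_max=(255, 0, 0)):
    """Map a penetration depth to an RGB colour on a linear gradient
    (here blue = shallow, red = deep).  Depths outside [d_min, d_max]
    are clamped to the endpoint colours of the colour bar."""
    if d_max == d_min:
        return c_min
    t = (depth - d_min) / (d_max - d_min)
    t = max(0.0, min(1.0, t))           # clamp to the colour bar range
    return tuple(round(a + t * (b - a)) for a, b in zip(c_min, c_max))
```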
  • the graphical user interface 98 is presented on a display device 100 of the device 12 and/or the processing system 10, for example. This can be controlled by the control unit 62 . In particular, the representations described herein can be generated by the control unit 62 .
  • Graphical user interface 98 may include multiple areas.
  • the generated representation can thus comprise several representations such as one or more penetration depth representations 86 .
  • a representation 102 may also be included whose content is based on a user selection, such as magnification.
  • a selection area 104 by means of which the user can make various settings and select parameters and presettings.
  • a representation 106 is generated that includes at least a first representation 107 that relates to a first set of associated penetration depth values, and at least a second representation 109 that relates to a second set of penetration depth values.
  • the representation 106 can also be based on a comparison of the two sets of penetration depth values in another way. For example, a difference in penetration depth values between the two sets can be plotted against the machining position coordinates. For example, for two machining processes with an identical machining pattern, it can be determined how different machining parameters affect a local penetration depth.
  • a penetration depth plot can also be used to monitor a process.
  • the graphical user interface 98 has a monitoring area 108 . This allows the user to view an elevation profile at a specific location or an average elevation profile in a displayed display area 110 .
  • the user can draw a limit value 112 in the height profile, for example by vertically moving a straight line that shows the current limit value. In this way, the user can specify a threshold value for a maximum permitted penetration depth.
  • the limit value 112 can be a curved curve or surface whose curvature can be parameterized or freely definable. Different threshold values can thus be specified for different coordinates, for example lower threshold values in a central area and larger threshold values in an outer area of the wobbling movement.
  • a section of the processing figure 58 is also shown in the monitored area 108 . If the current penetration depth exceeds the specified limit value, a position at which the excess occurs is indicated by flashing and/or a colored highlight 114, for example. In addition, an alarm can be issued, for example in the form of a text message, a warning tone or a voice message.
• a limit value can also be specified as a numerical value. Irrespective of how the limit value is specified, it can also be used to control and/or regulate at least one process parameter. For example, the power of the processing beam 50 can be reduced automatically and/or the processing beam 50 can be widened more if a penetration depth exceeds a permissible maximum value. As mentioned above, this can also take place periodically, so that the power is reduced, for example, where large penetration depth values occur and/or are expected. A corresponding regulation can relate to an interval within which the power of the processing beam 50 is varied.
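• such a regulation of the beam power based on the limit value can be sketched, for example, as a simple proportional reduction (one possible scheme among many; parameter names and the gain are hypothetical, and the description leaves the concrete regulation open):

```python
def adjust_power(current_power, depth, depth_limit, min_power, gain=0.5):
    """Reduce the processing beam power when the measured penetration
    depth exceeds the specified limit value; a minimal proportional
    scheme.  The power is never reduced below min_power, i.e. it stays
    within a permissible interval."""
    if depth <= depth_limit:
        return current_power
    excess = depth - depth_limit
    new_power = current_power * (1.0 - gain * excess / depth_limit)
    return max(min_power, new_power)
```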
  • a result of an arbitrarily designed regulation can in turn be illustrated in the representation.
  • a location-dependent penetration depth can be illustrated for an ongoing machining process, which makes it possible to understand that the specified limit value is initially exceeded and later adhered to, or vice versa.
  • the keyhole does not always form exactly at the current processing position.
  • the keyhole can also be designed and/or occur in a laterally distorted manner.
  • the at least one representation generated can therefore also visualize a deviation between a current measurement position and a current processing position.
  • a slider 115 is displayed, which, depending on the deviation, moves back and forth parallel to the main processing direction 60 . This can be provided with a scale.
  • a two-dimensional indicator 117 is provided, from which a two-dimensional deviation can be read. This can be displayed in the form of a crosshair.
  • FIG. 7 shows a schematic representation of a head-mounted display 116.
  • the head-mounted display can be used to view presentations.
  • representations on standard flat or curved 2D or 3D monitors or on other display devices are also possible.
  • the presentation may include a moving presentation.
  • the user can use the head-mounted display to take a virtual flight or walk along or over the curve shown in FIG. 5 .
  • FIG. 8 shows a schematic representation of part of an alternative machining system 210.
  • the machining system 210 has a device 212 for monitoring a machining process of a workpiece W.
• this is basically designed like the device 12, which is why reference is made to the above description.
  • the device 212 has a measuring unit 214 for generating a measuring beam 228 and a processing unit 246 for generating a processing beam 250.
  • a deflection device 256 and a measuring beam deflection device 244 are also present.
  • the measuring beam 228 is coupled into the processing beam 250 via a partially transparent mirror 225 .
  • the device 212 also includes an image acquisition device 227, which is embodied as a camera, for example.
  • the device 212 comprises a partially transparent mirror 239 through which light can fall into the image acquisition device 227 .
  • the image acquisition device 227 can be arranged behind a deflection mirror 241 .
  • the device 212 comprises an illumination device 229 which is arranged and set up to illuminate the workpiece W.
  • the lighting device 229 can comprise a light source which is arranged separately from the beam path of the processing beam 250 .
  • an illumination device 229′ can be arranged and designed in such a way that illumination light is guided partially coaxially to the processing beam 250 by being fed in at the deflection mirror 241, for example.
  • an exposure duration of the image capturing device 227 can be suitably increased and the measuring beam 228 can be directed at different measuring positions, while the processing beam 250 is deactivated.
• the measuring beam 228 or its reflection is then clearly visible to the image acquisition device 227 on the target object, for example a calibration workpiece.
• Its pixel coordinates can then be determined, for example by looking for a centroid or the brightest pixel of the recorded reflection. This is compared with the associated measurement position. If the captured image area is scanned in this way, the coordinate pairs described above can be collected and the mapping can be determined.
• the mapping can include stretching, scaling, rotation or other distortions.
• OCT coordinates can also be inferred from the image data of the image acquisition device 227 by applying the mapping in the reverse direction.
  • a calibration can also be carried out in order to coordinate the underlying coordinate systems of the measuring unit 214 and the processing unit 246 with one another. Based on this, processing position coordinates can then also be inferred from the image data.
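• determining the mapping from the collected coordinate pairs can be sketched, for example, as a least-squares fit of an affine transformation (one possible model; stronger distortions would require a higher-order fit, and the names below are hypothetical):

```python
import numpy as np

def fit_affine(oct_points, pixel_points):
    """Fit an affine mapping pixel = A @ oct + b from coordinate pairs
    collected during calibration.  An affine model covers stretching,
    scaling and rotation.

    Both inputs have shape (N, 2) with N >= 3.
    Returns (A, b) with A of shape (2, 2) and b of shape (2,)."""
    oct_points = np.asarray(oct_points, dtype=float)
    pixel_points = np.asarray(pixel_points, dtype=float)
    # Homogeneous design matrix [x, y, 1] for a least-squares solution.
    design = np.hstack([oct_points, np.ones((len(oct_points), 1))])
    params, *_ = np.linalg.lstsq(design, pixel_points, rcond=None)
    return params[:2].T, params[2]      # linear part A, translation b

def oct_to_pixel(A, b, point):
    """Apply the fitted mapping to a measuring position."""
    return A @ np.asarray(point, dtype=float) + b
```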
  • the image data from the image acquisition device 227 can be used to determine the position of the keyhole.
  • the measurement beam 228 can then be directed precisely into the keyhole on the basis of the mapping described above, since the corresponding measurement position coordinates can be determined using the image data and the determined mapping.
  • part of the measuring beam 228 is deflected by a rear side of the partially transparent mirror 225 and reflected on a surface of a component 231 of the device 212, the processing system 210 or the environment. A certain proportion of the measurement beam 228 is deflected back into the image acquisition device 227 and imaged there.
  • These mappings can be used to create a representation that has proven extremely useful for monitoring machining processes.
  • FIG. 9 is an example representation 233 that can be generated in a method according to a second development, which is described below with reference to FIG. 10 .
  • the representation 233 is based on an image that is captured by the image capture device 227 .
  • a currently generated keyhole 235 and a generated weld seam lying behind it in the processing direction can be seen thereon, for example.
  • the measurement beam 228 is directed to different measurement positions in front of, in and behind the keyhole 235 for different measurements. As mentioned above, the measurement beam 228 can also be guided along suitable measurement lines at these positions. Points of impingement of the measuring beam 228 on the workpiece W may in many cases not be perceptible, depending on the prevailing light conditions.
  • the image acquisition device 227 clearly acquires light reflections 243, 245, 247, 249, 251 as optical information, which originate from reflections on the component 231 mentioned.
  • the light reflections 243, 245, 247, 249, 251 are illustrated in FIG. 9 as dots. These light reflections 243, 245, 247, 249, 251 appear in the image of the image acquisition device 227 at positions other than the impingement points of the measuring beam 228 on the workpiece W.
  • the connection between the impact points and the positions of the light reflections 243, 245, 247, 249, 251 is unambiguous and can be described as a geometric mapping.
  • if the measurement positions are processed in a sequence in which light reflection 243 appears first, then light reflection 245, then light reflection 247 and so on, the user can easily see that the light reflections 243, 245, 247, 249, 251 lie on a line at approximately equal distances. If this behavior of the light reflections 243, 245, 247, 249, 251 matches the behavior expected for the intended measurement positions, the light reflections 243, 245, 247, 249, 251 can be observed for process monitoring. Although they do not appear at the measurement positions themselves, the occurrence and positions of the light reflections 243, 245, 247, 249, 251 contain usable information recognizable by the user.
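The visual check described here — that the reflections lie on a line at approximately equal distances — could also be automated roughly as follows (function name and tolerance are assumptions for illustration):

```python
import numpy as np

def reflections_plausible(points, tol=0.1):
    """Check that reflection positions lie roughly on a line with
    roughly equal spacing.

    points: (N, 2) array of pixel coordinates, in the order in which
    the reflections appeared.
    """
    points = np.asarray(points, float)
    steps = np.diff(points, axis=0)              # successive offsets
    lengths = np.linalg.norm(steps, axis=1)
    mean_len = lengths.mean()
    # Equal spacing: no step deviates strongly from the mean length.
    if np.any(np.abs(lengths - mean_len) > tol * mean_len):
        return False
    # Collinearity: every step points in (almost) the same direction.
    dirs = steps / lengths[:, None]
    return bool(np.all(dirs @ dirs[0] > 1.0 - tol))
```

A failed check could then be flagged to the user instead of relying on visual inspection alone.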
  • the method can also provide for the actual measurement positions to be determined on the basis of the light reflections and for markings 253, 255, 257, 259, 261 to be superimposed on the captured image in order to generate the representation 233.
  • Marks 253, 255, 257, 259, 261 are illustrated in Figure 9 as crosses.
  • the underlying geometric mapping is determined by directing the measurement beam 228 to a plurality of adjustment measurement positions.
  • An image is then captured by means of the image capturing device 227 in which a light reflection can be recognized as optical information, which is caused by the reflection of the measuring beam 228 on the component 231 .
  • a stretching, compression and/or distortion can be determined.
  • adjustment measurement positions are used that lie on a grid, for example on a rectangular, hexagonal or spiral grid.
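Generating adjustment measurement positions on a rectangular grid, and decomposing the linear part of the fitted mapping into rotation, scales and shear, might look roughly like this (an illustrative sketch; a hexagonal or spiral grid would work analogously, and the decomposition assumes a non-degenerate 2×2 linear part):

```python
import numpy as np

def grid_positions(nx=5, ny=5, pitch=1.0):
    """Rectangular grid of adjustment measurement positions."""
    xs, ys = np.meshgrid(np.arange(nx) * pitch, np.arange(ny) * pitch)
    return np.column_stack([xs.ravel(), ys.ravel()])

def decompose(L):
    """Split the 2x2 linear part L of a fitted mapping into a rotation
    angle, two scale factors and a shear term (QR-style decomposition)."""
    angle = np.arctan2(L[1, 0], L[0, 0])
    R = np.array([[np.cos(angle), -np.sin(angle)],
                  [np.sin(angle),  np.cos(angle)]])
    S = R.T @ L                     # upper-triangular: scales + shear
    return angle, S[0, 0], S[1, 1], S[0, 1] / S[0, 0]
```

The decomposed quantities correspond to the stretching, compression, rotation and distortion components mentioned above.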
  • the representation 233 can therefore also be referred to as a measurement position representation.
  • the resulting representation corresponds to the representation 233 shown in FIG. 9, except that the markings 253, 255, 257, 259, 261 are not present.
  • FIG. 10 shows a schematic flow chart of the underlying method according to the second development of the method.
  • the method has additional method steps.
  • In a step S21, different measurement positions are assigned to the processing positions. For example, for a specific processing position, at least one measurement is carried out in the processing direction in front of the keyhole 235, at least one in the keyhole 235, and at least one in the processing direction behind the keyhole 235.
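The assignment of pre-process, in-keyhole and post-process measurement positions to one processing position can be illustrated as follows (function name and the single offset parameter are assumptions):

```python
import math

def keyhole_measure_positions(proc_xy, direction_rad, offset=0.5):
    """Return measurement positions in front of, in and behind the
    keyhole for one machining position, offset along the local
    processing direction (given as an angle in radians)."""
    px, py = proc_xy
    dx, dy = math.cos(direction_rad), math.sin(direction_rad)
    return [
        (px + offset * dx, py + offset * dy),   # ahead of the keyhole
        (px, py),                               # in the keyhole
        (px - offset * dx, py - offset * dy),   # behind the keyhole
    ]
```

Calling this once per machining position yields the set of measurement positions that the following steps capture and visualize.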
  • In a step S22, an image of the workpiece W in the area of a current machining position is captured by the image capturing device 227.
  • the at least one representation includes a representation 233 that is based on the image of the workpiece W and from which the current machining position and the recorded optical information regarding the measurement positions that are assigned to the current machining position can be seen. In the case shown, the measurement positions themselves can be seen and, as explained, are identified by markings.
  • the lighting conditions and/or the components of the image capturing device 227 can allow points of impingement of the measuring beam 228 on the workpiece W to be observed directly.
  • the measuring positions can be recognized as light reflections of the measuring beam on the workpiece. The light reflections are then already at that position in the image at which the measuring beam 228 actually falls on the workpiece W.
  • the variants described allow the user to assess, during the ongoing process and on the basis of a real image of the machined workpiece W, whether suitable measurement positions have been selected.
  • FIG. 11 shows a schematic representation of part of another alternative processing system 310 with an apparatus 312.
  • the processing system 310 is largely identical to the processing system 210 described above. Identical components therefore bear the same reference symbols; with regard to their description, reference is made to the statements above.
  • the device 312 comprises an optical filter 339.
  • the optical filter 339 can optionally be introduced into a light path which leads to the image acquisition device 227.
  • a motor can be provided for this purpose, which enables the optical filter 339 to be moved automatically.
  • the optical filter 339 is set up to prevent detection by the image acquisition device 227 of reflections of the measuring beam 228 on surfaces other than the workpiece W, such as the component 231. In this way it can be achieved that the light reflections 243, 245, 247, 249, 251 shown in FIG. 9 do not appear in the generated representation.
  • the optical filter 339 can be removed from the light path to make the adjustment described above. Once the geometric mapping is known, it can be reused. Only the added markings of the measurement positions can then be recognized in the representation 233 .
  • the optical filter 339 is a bandpass filter which is essentially opaque in the wavelength range of the measuring beam 228.
  • FIG. 12 shows a schematic flow chart of a method according to a third development of the method.
  • the method can be carried out, for example, using the device 12 from FIG. 1, to which reference is made below.
  • the method includes at least one additional method step.
  • a geometric condition of the workpiece W at the measurement positions is estimated on the basis of the associated measurement values.
  • the machining figure 458 (see FIG. 13) is selected in such a way that it is related to the geometric nature of the workpiece W. For example, welding may be desired along a path that describes a rectangle with rounded corners; this path corresponds to an existing edge.
  • the processing figure 458 can additionally describe a wobble movement.
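A wobble movement superimposed on a machining path, as mentioned here, can be sketched as follows (a minimal illustration; the function name, the sampling and the circular wobble shape are assumptions, not the patent's specification):

```python
import math

def wobble_path(p0, p1, wobble_radius, wobble_freq, n=200):
    """Sample a straight seam from p0 to p1 with a circular wobble
    superimposed (wobble_freq full oscillations over the segment)."""
    pts = []
    for i in range(n):
        t = i / (n - 1)
        # Point on the underlying straight machining path.
        x = p0[0] + t * (p1[0] - p0[0])
        y = p0[1] + t * (p1[1] - p0[1])
        # Circular wobble offset around that point.
        phase = 2 * math.pi * wobble_freq * t
        pts.append((x + wobble_radius * math.cos(phase),
                    y + wobble_radius * math.sin(phase)))
    return pts
```

The same idea extends to a rectangular machining figure by applying the wobble offset segment by segment.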
  • the at least one representation comprises a representation that is based on a two-dimensional representation of at least a part of the processing figure 458 and the associated estimated geometric condition of the workpiece W.
  • FIG. 13 is an exemplary representation 433 that can be generated in the method according to the third development.
  • the machining figure 458 traces an edge present in the workpiece W .
  • the geometric nature of the workpiece W is the presence and course of this edge of the workpiece W.
  • the measuring beam 28 is, as already mentioned, guided over the workpiece W transversely or at an angle to the respective main processing direction and/or transversely or at an angle to the local processing direction. This can be done during processing, for example in the pre-process in front of the keyhole.
  • measurements can be carried out according to a measurement figure which follows the course of the processing figure.
  • the measurement data obtained are evaluated automatically in order to obtain a height profile along the respective measurement line.
  • a position of the edge can be estimated automatically by suitable automated evaluation, for example by fitting straight lines, or functions curved according to a known workpiece surface, to the measurement data.
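The fitting of straight lines to a height profile in order to estimate an edge position could be sketched as follows (a minimal piecewise-linear fit; the exhaustive search over break points is an illustrative choice, not necessarily the patent's evaluation method):

```python
import numpy as np

def estimate_edge(x, z):
    """Estimate an edge position in a height profile z(x) by fitting
    one straight line on each side of every candidate break point and
    keeping the split with the smallest total squared residual."""
    x, z = np.asarray(x, float), np.asarray(z, float)
    best, best_err = None, np.inf
    for k in range(2, len(x) - 2):
        err = 0.0
        for xs, zs in ((x[:k], z[:k]), (x[k:], z[k:])):
            coef = np.polyfit(xs, zs, 1)
            err += np.sum((np.polyval(coef, xs) - zs) ** 2)
        if err < best_err:
            best_err, best = err, x[k]
    return best
```

Running this per scan line along the machining figure yields the estimated edge positions that are plotted as open circles in the representation.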
  • the estimated respective edge positions along the machining figure are shown in the representation 433 as open circles.
  • In area 453 there is a considerable deviation between the processing figure 458, which corresponds to the expected edge profile, and the estimated edge profile. This can be grasped intuitively thanks to the selected type of representation.
  • this embodiment provides for a current processing position 451 to be displayed together with the processing figure 458 and the estimated geometric condition, in this case the estimated edge profile.
  • the representation 433 is thus based on measurement data recorded in advance, on the basis of which the course of the edge is estimated, and on live processing position data.
  • a measurement along the measurement figure or the processing figure 458 is carried out before processing.
  • edge detection can be used to automatically make minor adjustments to the machining figure in order to match machining positions to the actual geometry of the workpiece. Using this visualization of estimated edges, a user can easily check whether edge detection is working reliably and whether parameters of the estimation may need to be adjusted.

Abstract

The invention relates to a method for monitoring a machining process of a workpiece (W, W', W"), in particular laser welding, by means of a high-energy machining beam (50; 250), the method comprising the steps of: defining a machining figure (58; 258) which defines a plurality of machining positions (64, 66, 68) with associated multidimensional machining position coordinates; generating a high-energy machining beam (50; 250) and projecting and/or focusing the machining beam (50; 250) onto the workpiece (W, W', W") according to the machining figure (58; 458); generating a measuring beam (28; 228) by means of an optical coherence tomograph (16), the measuring beam (28; 228) being couplable into the machining beam (50; 250); defining measurement positions (70, 72, 74) on the workpiece (W, W', W") at and/or in the vicinity of at least some of the machining positions (64, 66, 68); determining measurement values at the measurement positions (70, 72, 74) by directing the measuring beam (28; 228) onto the measurement positions (70, 72, 74); and generating at least one representation (76, 86, 102, 106, 110; 233; 433) which is based on multidimensional location-dependent information relating to the machining position coordinates, the measurement values and/or the measurement positions (70, 72, 74). The invention also relates to an associated device, a control unit and a computer program product.
PCT/EP2020/084629 2020-12-04 2020-12-04 Method, device and machining system for monitoring a machining process of a workpiece by means of a high-energy machining beam WO2022117207A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2020/084629 WO2022117207A1 (fr) 2020-12-04 2020-12-04 Method, device and machining system for monitoring a machining process of a workpiece by means of a high-energy machining beam

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2020/084629 WO2022117207A1 (fr) 2020-12-04 2020-12-04 Method, device and machining system for monitoring a machining process of a workpiece by means of a high-energy machining beam

Publications (1)

Publication Number Publication Date
WO2022117207A1 (fr)

Family

ID=73740400

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/084629 WO2022117207A1 (fr) Method, device and machining system for monitoring a machining process of a workpiece by means of a high-energy machining beam

Country Status (1)

Country Link
WO (1) WO2022117207A1 (fr)


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20820384

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20820384

Country of ref document: EP

Kind code of ref document: A1