EP4200123A1 - Method and device for the additive manufacturing of a workpiece (Verfahren und Vorrichtung zur additiven Herstellung eines Werkstücks) - Google Patents

Method and device for the additive manufacturing of a workpiece (Verfahren und Vorrichtung zur additiven Herstellung eines Werkstücks)

Info

Publication number
EP4200123A1
Authority
EP
European Patent Office
Prior art keywords
layer
material layer
workpiece
individual
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP21746757.0A
Other languages
German (de)
English (en)
French (fr)
Inventor
Alexander Freytag
Thomas Milde
Ghazal GHAZAEI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Carl Zeiss Industrielle Messtechnik GmbH
Original Assignee
Carl Zeiss Industrielle Messtechnik GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from DE102020121760.9A external-priority patent/DE102020121760A1/de
Application filed by Carl Zeiss Industrielle Messtechnik GmbH filed Critical Carl Zeiss Industrielle Messtechnik GmbH
Publication of EP4200123A1 publication Critical patent/EP4200123A1/de
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B29WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29CSHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C64/00Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
    • B29C64/30Auxiliary operations or equipment
    • B29C64/386Data acquisition or data processing for additive manufacturing
    • B29C64/393Data acquisition or data processing for additive manufacturing for controlling or regulating additive manufacturing processes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B22CASTING; POWDER METALLURGY
    • B22FWORKING METALLIC POWDER; MANUFACTURE OF ARTICLES FROM METALLIC POWDER; MAKING METALLIC POWDER; APPARATUS OR DEVICES SPECIALLY ADAPTED FOR METALLIC POWDER
    • B22F10/00Additive manufacturing of workpieces or articles from metallic powder
    • B22F10/20Direct sintering or melting
    • B22F10/28Powder bed fusion, e.g. selective laser melting [SLM] or electron beam melting [EBM]
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B22CASTING; POWDER METALLURGY
    • B22FWORKING METALLIC POWDER; MANUFACTURE OF ARTICLES FROM METALLIC POWDER; MAKING METALLIC POWDER; APPARATUS OR DEVICES SPECIALLY ADAPTED FOR METALLIC POWDER
    • B22F10/00Additive manufacturing of workpieces or articles from metallic powder
    • B22F10/30Process control
    • B22F10/38Process control to achieve specific product aspects, e.g. surface smoothness, density, porosity or hollow structures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B22CASTING; POWDER METALLURGY
    • B22FWORKING METALLIC POWDER; MANUFACTURE OF ARTICLES FROM METALLIC POWDER; MAKING METALLIC POWDER; APPARATUS OR DEVICES SPECIALLY ADAPTED FOR METALLIC POWDER
    • B22F10/00Additive manufacturing of workpieces or articles from metallic powder
    • B22F10/80Data acquisition or data processing
    • B22F10/85Data acquisition or data processing for controlling or regulating additive manufacturing processes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B22CASTING; POWDER METALLURGY
    • B22FWORKING METALLIC POWDER; MANUFACTURE OF ARTICLES FROM METALLIC POWDER; MAKING METALLIC POWDER; APPARATUS OR DEVICES SPECIALLY ADAPTED FOR METALLIC POWDER
    • B22F12/00Apparatus or devices specially adapted for additive manufacturing; Auxiliary means for additive manufacturing; Combinations of additive manufacturing apparatus or devices with other processing apparatus or devices
    • B22F12/90Means for process control, e.g. cameras or sensors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B29WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29CSHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C64/00Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
    • B29C64/10Processes of additive manufacturing
    • B29C64/141Processes of additive manufacturing using only solid materials
    • B29C64/153Processes of additive manufacturing using only solid materials using layers of powder being selectively joined, e.g. by selective laser sintering or melting
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B33ADDITIVE MANUFACTURING TECHNOLOGY
    • B33YADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y10/00Processes of additive manufacturing
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B33ADDITIVE MANUFACTURING TECHNOLOGY
    • B33YADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y30/00Apparatus for additive manufacturing; Details thereof or accessories therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B33ADDITIVE MANUFACTURING TECHNOLOGY
    • B33YADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y50/00Data acquisition or data processing for additive manufacturing
    • B33Y50/02Data acquisition or data processing for additive manufacturing for controlling or regulating additive manufacturing processes
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B23/00Testing or monitoring of control systems or parts thereof
    • G05B23/02Electric testing or monitoring
    • G05B23/0205Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults
    • G05B23/0218Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterised by the fault detection method dealing with either existing or incipient faults
    • G05B23/0243Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterised by the fault detection method dealing with either existing or incipient faults model based detection method, e.g. first-principles knowledge model
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/044Recurrent networks, e.g. Hopfield networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P10/00Technologies related to metal processing
    • Y02P10/25Process efficiency

Definitions

  • the present invention relates to a method for the additive manufacturing of a workpiece, with the steps of a) receiving a data set that defines the workpiece in a plurality of workpiece layers arranged one on top of the other, b) generating a material layer with a defined surface formed from a particulate material, c) taking at least one image of the material layer and inspecting the material layer using the at least one image in order to determine individual properties of the material layer, d) selectively solidifying the particulate material on the defined surface with a structuring tool using the data set and depending on the individual properties of the material layer, wherein a defined workpiece layer from the plurality of workpiece layers arranged on top of one another is produced from the material layer, and e) repeating steps b) to d), whereby further defined workpiece layers from the plurality of workpiece layers arranged on top of one another are produced.
  • the invention also relates to a device for the additive manufacturing of a workpiece, with a memory for receiving a data set that defines the workpiece in a plurality of workpiece layers arranged one on top of the other, with a manufacturing platform, with a layer formation tool, with a structuring tool, with a camera which is aimed at the production platform, and with an evaluation and control unit which is set up to produce a material layer with a defined surface from a particulate material on the production platform using the layer formation tool, to take at least one image of the material layer using the camera, and to selectively solidify the particulate material on the defined surface with the aid of the structuring tool, wherein a defined workpiece layer from the plurality of workpiece layers arranged one on top of the other is produced from the material layer.
  • Additive methods for producing workpieces are sometimes referred to as 3D printing.
  • there are various additive manufacturing processes.
  • in selective laser sintering (SLS) and selective laser melting (SLM), a so-called powder bed made of a particulate material is used.
  • the particulate material may be a metallic material, but particulate plastic materials, in particular polymers, can also be used.
  • selected powder particles on the upper side of the powder bed are locally and selectively melted or at least partially melted with the help of a laser beam or electron beam and are in this way selectively solidified during cooling.
  • a new powder layer is then spread over the workpiece structure and the unmelted residual powder, and the workpiece is thus produced layer by layer.
  • the individual workpiece layers are produced from bottom to top on a production platform, which is lowered by the corresponding layer height after each workpiece layer.
  • US 2015/0061170A1 discloses, among other things, an optical measurement sensor with a camera that can be set up to enable a 3D coordinate measurement on the respective uppermost material layer.
  • a difficulty here is posed by reflections and shadows, which can be caused in particular by metallic powder particles, but also by other objects in the working space of the device.
  • DE 10 2016 201 289 A1 discloses a method for the additive manufacturing of a workpiece, with first measurement data being recorded during the additive build-up using a thermographic material test or using an eddy current material test. After the additive build-up, second measurement data is recorded using computed tomography and compared with the first measurement data. Material testing results are to be classified using an unspecified algorithm from the field of supervised machine learning.
  • EP 3 459 715 A1 discloses a method for the additive manufacturing of a workpiece in which a classification function, trained using a technique from the field of machine learning that is not described in detail, is used to predict defects in a current layer or in subsequent further layers.
  • WO 2015/020939 A1 discloses a process for the additive manufacturing of a workpiece based on the processing of filaments, in which algorithms from the field of machine learning are used to correlate a CAD input data set with parameters of the device used in order to determine the properties of the workpiece produced by the device and the time required for production. The actual quality control of the manufactured workpiece takes place after it has been manufactured, using a 3D scanner and calibration patterns on the manufacturing platform.
  • One task in particular is to efficiently monitor the quality of the material layers close to the process in order to be able to correct any layer defects that occur or are imminent at an early stage.
  • this object is achieved by a method of the type mentioned at the outset, wherein the at least one image of the material layer is inspected in step c) using a previously trained statistical learning model, wherein at least one defect vector is determined using the previously trained statistical learning model, which represents a plurality of individual defect probabilities, each individual defect probability from the plurality of individual defect probabilities being an individual indicator of whether a defined layer defect from a plurality of possible layer defects is present in the material layer, and wherein step d) is executed depending on the at least one defect vector.
  • this object is achieved by a device of the type mentioned at the outset, wherein the evaluation and control unit is also set up to inspect the at least one image of the material layer using a statistical learning model trained in advance, wherein at least one error vector is determined using the previously trained statistical learning model, which represents a multiplicity of individual error probabilities, each individual error probability from the multiplicity of individual error probabilities being an individual indicator of whether a defined layer error from a multiplicity of possible layer errors is present in the material layer, and wherein the evaluation and control unit controls the layer formation tool and the structuring tool depending on the error vector.
  • a termination criterion, in particular an error and/or time criterion, is preferably defined based on the large number of annotated versions of the training images.
  • the large number of training images preferably includes a large number of groups of at least three and in particular four training images each, the training images of each group each showing a defined material layer with at least one layer defect, and the training images of each group showing the defined material layer with different lighting directions.
  • the number and lighting of the training images preferably correspond to those images that are inspected according to the new method for additive manufacturing of a workpiece using the previously trained statistical learning model, so that the following explanations apply equally to the training images.
  • the training images are recorded during the manufacture of a defined workpiece, and the termination criterion is defined as a function of the defined workpiece.
  • a learning model that has already been trained can advantageously be “post-trained” in a current production process for producing an individual workpiece, so that the learning model is optimized in relation to the individual workpiece. Further workpieces of the same type can then very advantageously be produced in subsequent further production processes using the learning model retrained in this way.
  • the learning model can be retrained iteratively using a plurality of manufacturing processes in which workpieces of the same type are produced in each case until a specified workpiece property or specified workpiece properties are achieved.
  • the termination criterion can advantageously be defined as a function of the specified workpiece properties.
  • the marking of the multiplicity of possible layer defects can be carried out by a person who is familiar with the layer defects to be detected.
  • the statistical learning model of the new method and the new device comes from the field of machine learning.
  • it implements a statistical evaluation of the at least one image of the material layer based on previously trained parameters and delivers individual probability values, each of which represents an individual probability of the presence of a defined layer defect from a large number of possible layer defects.
  • the multitude of possible layer defects includes grooves in the surface of the material layer made of the particulate material, local accumulations of particulate material on the defined surface, non-uniform grain sizes of the particulate material, holes or depressions in the surface of the material layer made of the particulate material, as well as undesired adhesion or fusion of material particles. Such anomalies/inhomogeneities can lead to the workpiece defects mentioned above.
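Purely as an illustration of such a defect vector, the individual defect probabilities can be thought of as one value per possible layer defect. The class names, threshold and probability values in the following sketch are assumptions made for demonstration and are not prescribed by the patent text:

```python
# Illustrative sketch only: class names, values and threshold are assumed.
import numpy as np

DEFECT_CLASSES = [
    "groove",            # grooves in the surface of the material layer
    "accumulation",      # local accumulations of particulate material
    "grain_size",        # non-uniform grain sizes of the particulate material
    "hole_depression",   # holes or depressions in the surface
    "adhesion_fusion",   # undesired adhesion or fusion of material particles
]

# One defect vector: an individual probability per possible layer defect.
defect_vector = np.array([0.02, 0.91, 0.05, 0.10, 0.01])

# A simple decision rule: flag every defect whose probability exceeds a threshold.
THRESHOLD = 0.5
detected = {c: float(p) for c, p in zip(DEFECT_CLASSES, defect_vector) if p > THRESHOLD}
print(detected)  # {'accumulation': 0.91}
```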
  • the development of the defects can be detected at an early stage.
  • the propagation of the defects can be avoided or corrected, or alternatively the printing process can be stopped in order to save material and time.
  • the selective solidification of the particulate material in step d) is carried out as a function of the at least one error vector.
  • it is possible that step d) is no longer carried out at all due to a detected or emerging defect, that step d) is carried out in a modified manner, for example with modified CAD data, or that the material layer made of the particulate material produced according to step b) is first post-processed and the selective solidification according to step d) is subsequently carried out on the post-processed layer of material.
  • the use of the pre-trained statistical learning model enables an efficient, process-near inspection of the material layer both before and after the selective solidification of the particulate material. Inspection of the layer of material prior to selective solidification makes it possible to correct any anomalies beforehand, for example by smoothing the surface of the layer of material again, spreading additional particulate material, and/or replacing existing particulate material. The inspection of the material layer after selective solidification also makes it possible to correct detected defects in the only partially produced workpiece by post-processing a workpiece layer that has already been produced, for example by re-melting it, and/or by modifying subsequent workpiece layers, for example by making them thicker or thinner.
  • the statistical learning model provides individual error probabilities for the various layer defects and allows a surface inspection based on empirical knowledge without each individual layer defect having to be precisely known in advance with regard to its precise appearance in at least one image of the material layer.
  • the statistical learning model preferably provides an individual error probability for a large number of different layer errors, so that an individual error probability, which represents the presence or absence of the respective layer error, is obtained for each possible layer error.
  • an algorithm from the field of machine learning is used here specifically for the inspection of the powder bed with the material layer made of the particulate material, and not, or not only, in relation to the behavior and properties of the device used for additive manufacturing and/or in relation to workpiece layers that have already been produced.
  • the new process and the corresponding device make a very efficient contribution to avoiding defective workpieces and workpiece layers as early as possible.
  • the error vector with the individual error probabilities for the various layer errors enables a qualitative and - at least in some exemplary embodiments - even a quantitative statement (such as the size or distribution of detected layer errors) in relation to the quality properties of the manufactured workpiece in a very efficient manner.
  • the selective solidification of the particulate material can be stopped or postponed depending on the error vector until the material layer of the particulate material has a desired homogeneity through suitable post-processing.
  • the production process can be terminated prematurely if a defect-free workpiece cannot be expected due to layer defects that are detected with a high degree of probability. Since different layer defects can occur at the same time or at different times, depending on a specific process sequence, locally distant from each other or in the immediate vicinity, or to different extents, an inspection of the powder bed based on the machine learning method using the trained statistical learning model is particularly suitable. If process parameters change, the statistical learning model can be "post-trained" in an efficient way. The above object is therefore completely achieved.
  • the layer of material is illuminated in step c) from a plurality of different directions and a multiplicity of images of the layer of material are recorded, with each image from the multiplicity of images showing the layer of material with a different direction of illumination, and wherein the individual characteristics are determined using the plurality of images.
  • the configuration is particularly advantageous for inspecting a powder bed of a metallic particulate material. However, it can equally be used to inspect a powder bed of plastic material or mixed materials.
  • the multitude of images show the material layer with different light reflections and different shadows. Therefore, the individual properties of the material layer can be recorded more reliably and in more detail.
  • the multiplicity of images are recorded with a single camera which is arranged in a fixed position relative to the production platform and/or the material layer. This enables quick image acquisition and easy assignment of the various illumination images to one another.
  • the configuration facilitates a semantic differentiation of different layer defects from one another, since it allows a detailed inspection of the material surface.
  • the material layer is preferably illuminated from at least three, in particular four, different directions, and the plurality of images in step c) accordingly consists of at least three, in particular four, images, which each show the material layer with a different illumination direction.
  • the images show the layer of material as captured by the camera under the particular lighting.
  • the images can be advantageously rectified and/or distortion-corrected. Inhomogeneities in the lighting, for example due to manufacturing tolerances in the light sources used, can also be corrected.
  • the images show the material layer as such, not a filtered or otherwise modified view of the material layer with regard to the pixel information, because this enables a particularly fast and efficient inspection in the ongoing manufacturing process.
  • the plurality of images are supplied together as input data to the previously trained statistical learning model.
  • each image from the plurality of images can form an inspection channel.
  • the statistical learning model can process the image information from the different illumination images together and can therefore correlate the different image information. For example, when illuminated from a first direction, an edge in the material layer can be visible that is not visible when illuminated from a different direction. On the other hand, a reflection that occurs in one direction of illumination can resemble a layer defect that does not actually exist.
  • the configuration advantageously contributes to recognizing as many layer defects as possible that are actually present and, moreover, to distinguishing between layer defects that are actually present and layer defects that are only apparently present.
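A minimal sketch of this multi-channel idea, assuming PyTorch as the framework and four illumination images of the same material layer (neither the framework nor the channel count is prescribed by the patent):

```python
# Sketch under the assumption that four grayscale images of the same material
# layer, one per illumination direction, are available as 2D arrays.
import numpy as np
import torch

h, w = 512, 512
images = [np.random.rand(h, w).astype(np.float32) for _ in range(4)]  # placeholder data

# Stack the illumination images along the channel axis: shape (1, 4, H, W).
x = torch.from_numpy(np.stack(images, axis=0)).unsqueeze(0)

# A convolutional layer can then correlate the information of all channels,
# e.g. an edge visible under one illumination but hidden under another.
conv = torch.nn.Conv2d(in_channels=4, out_channels=16, kernel_size=3, padding=1)
features = conv(x)
print(features.shape)  # torch.Size([1, 16, 512, 512])
```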
  • in step c), a height map of the material layer is determined using the at least one image, the height map being supplied to the trained statistical learning model as an input data set.
  • the height map is determined using a plurality of images showing the layer of material, each with a different direction of illumination.
  • the height map is determined as a 2.5D height map using a method as described in DE 10 2017 108 874 A1 and US 2020/158499 A1 of equal priority, which are incorporated herein by reference.
  • the height map could be determined in further exemplary embodiments according to the triangulation principle, for example according to the principle of strip light projection.
  • the configuration has the advantage that the statistical learning model receives currently measured height information. As a result, the scope of the training data and the training time required in advance for the statistical learning model can be reduced.
  • very critical layer defects, such as missing material or deep grooves as a result of a damaged layer formation tool, can be recognized very quickly using the height map, as illustrated in the sketch below.
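As a rough illustration of such a quick check on the height map, the following sketch flags regions that lie markedly below the nominal layer height; the tolerance, the minimum region size and the helper function itself are assumptions, not values from the patent:

```python
import numpy as np
from scipy import ndimage

def critical_defect_in_height_map(height_map_mm, nominal_mm,
                                  depth_tol_mm=0.05, min_area_px=50):
    """Flag connected regions that lie markedly below the nominal layer height."""
    too_deep = (nominal_mm - height_map_mm) > depth_tol_mm
    labels, n = ndimage.label(too_deep)
    sizes = ndimage.sum(too_deep, labels, index=range(1, n + 1))
    return any(s >= min_area_px for s in sizes)

# Placeholder data: a flat 0.10 mm layer with one artificial groove.
hm = np.full((200, 200), 0.10)
hm[100:103, 20:180] -= 0.08
print(critical_defect_in_height_map(hm, nominal_mm=0.10))  # True
```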
  • in step c), a multiplicity of error vectors are determined which each represent a multiplicity of individual error probabilities, with each error vector from the multiplicity of error vectors representing the individual error probabilities in relation to a selected pixel region in the at least one image.
  • the selected pixel areas are at least partially different from one another, so that the error vectors are representative of pixel areas of the at least one image that are different from one another.
  • each error vector from the plurality of error vectors represents the individual error probabilities in relation to another individual pixel in the at least one image.
  • the error vectors indicate the probabilities for defined layer errors in locally delimited areas and in particular at the pixel level.
  • the configurations make it possible to determine the extent and/or shape of any layer defects that may be present and/or their spatial progression.
  • error vectors that relate to individual pixels in the at least one image enable the dimensions of detected layer errors to be determined very precisely, so that, for example, a pore size in the layer sequence of the workpiece can be estimated.
  • the design contributes to a particularly efficient implementation of the new method and the corresponding device, since, depending on the individual requirements of the workpiece produced, layer defects that do not exceed a certain size, extent or shape can be specifically tolerated. An advantageous classification of individual layer defects and of workpiece defects that may result therefrom is also facilitated with this configuration.
  • morphological and/or dimensional properties of a defined defect in the material layer are determined using the multiplicity of defect vectors.
  • Morphological properties contain information about the structure and/or type of a defined defect. Dimensional properties include information about extent and/or shape. Determining the morphological and/or dimensional properties makes it easier to classify any layer defects and to make a targeted decision as to whether the workpiece produced can meet defined specifications in terms of strength, durability, shape and/or dimensions. The configuration is therefore particularly advantageous for an efficient process analysis.
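The following sketch shows, with an assumed pixel size and threshold, how per-pixel defect probabilities can be turned into dimensional estimates such as an equivalent pore diameter; the helper function is illustrative and not part of the patent:

```python
import numpy as np
from scipy import ndimage

def defect_regions(prob_map, threshold=0.5, pixel_size_mm=0.05):
    """Estimate extent of detected layer defects from a per-pixel probability map.

    prob_map: 2D array with the individual probability of one defect class per pixel.
    Returns a list of (area_mm2, equivalent_diameter_mm) per connected defect region.
    """
    mask = prob_map > threshold
    labels, n = ndimage.label(mask)
    results = []
    for region in range(1, n + 1):
        area_px = int((labels == region).sum())
        area_mm2 = area_px * pixel_size_mm ** 2
        eq_diameter_mm = 2.0 * np.sqrt(area_mm2 / np.pi)
        results.append((area_mm2, eq_diameter_mm))
    return results

# Placeholder probability map with one roughly circular "pore".
yy, xx = np.mgrid[0:100, 0:100]
p = ((xx - 50) ** 2 + (yy - 50) ** 2 < 36).astype(float) * 0.9
print(defect_regions(p))
```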
  • the pre-trained statistical learning model includes a convolutional neural network (CNN).
  • CNN convolutional neural network
  • the trained statistical learning model particularly preferably contains a convolutional neural network with an encoder-decoder architecture, for example a so-called U-net.
  • a convolutional neural network uses, among other things, the mathematical operation "convolution" to analyze an input data set.
  • Filter matrices are convolved with the at least one image in sections in several steps.
  • the result of this convolution is a data set from which the presence or absence of a feature represented by the filter can be estimated.
  • a number of such convolution operations are preferably carried out sequentially one after the other. Filter matrices are advantageously used for each relevant layer defect.
  • one or more filter matrices can be provided for one or more of the following layer defects: grooves in the surface of the material layer made of the particulate material, local accumulations of particulate material on the defined surface, non-uniform grain sizes of the particulate material, holes or indentations in the surface of the material layer made of particulate material, adhesion or fusion of material particles.
  • the large number of convolution operations can include classic convolution operations and/or modified convolution operations such as strided convolution, atrous convolution or transposed convolution and, together with other operations of the convolutional neural network, such as in particular normalization operations, pooling operations and delinearization (e.g. using a rectified linear unit, ReLU), lead to the error vector.
  • a convolutional neural network enables an error vector which is representative of the layer defects mentioned to be determined in a very efficient manner.
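A heavily reduced sketch of such a network (PyTorch is assumed; layer sizes, the number of defect classes and the sigmoid output are illustrative choices, while the patent itself mentions, among others, the softmax function):

```python
import torch
import torch.nn as nn

class LayerDefectCNN(nn.Module):
    """Minimal sketch: convolution, ReLU and pooling steps followed by a
    classification head that outputs one probability per layer-defect class."""

    def __init__(self, in_channels=4, num_defect_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, num_defect_classes),
        )

    def forward(self, x):
        logits = self.head(self.features(x))
        # Sigmoid: an individual, independent probability per defect class.
        return torch.sigmoid(logits)

model = LayerDefectCNN()
x = torch.rand(1, 4, 256, 256)   # four illumination images as input channels
print(model(x))                  # one error vector with five defect probabilities
```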
  • a convolutional neural network with an encoder-decoder architecture, for example based on the U-net model, supplements the determination of the error vector in subsequent convolution steps with information from the first convolution steps (so-called upsampling) and thus provides a very accurate segmentation of the at least one image into different layer defect areas and defect-free image background.
  • such convolutional networks are traditionally used in the medical field. The investigations have shown that such networks are very well suited, especially when inspecting the material layer made of particulate material, to detect layer defects very early and accurately.
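A strongly simplified encoder-decoder sketch in the spirit of the U-net model (PyTorch assumed; depth, channel counts and the number of output classes are illustrative assumptions):

```python
import torch
import torch.nn as nn

def conv_block(c_in, c_out):
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(),
    )

class MiniUNet(nn.Module):
    """Reduced U-net-style sketch: one encoder level, one decoder level and a
    skip connection by concatenation."""

    def __init__(self, in_channels=4, num_classes=6):  # 5 defect classes + background
        super().__init__()
        self.enc1 = conv_block(in_channels, 16)
        self.enc2 = conv_block(16, 32)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2)
        self.dec1 = conv_block(32, 16)   # 16 upsampled + 16 skip channels
        self.out = nn.Conv2d(16, num_classes, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)                 # full-resolution features
        e2 = self.enc2(self.pool(e1))     # half-resolution features
        d1 = self.up(e2)                  # back to full resolution (upsampling)
        d1 = self.dec1(torch.cat([d1, e1], dim=1))  # skip connection
        return self.out(d1)               # per-pixel logits, one map per class

net = MiniUNet()
y = net(torch.rand(1, 4, 128, 128))
print(y.shape)  # torch.Size([1, 6, 128, 128]) -> spatially resolved error map
```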
  • the at least one image of the material layer is normalized using a reference image, the reference image showing a homogeneous, diffusely reflecting surface or a defect-free material layer.
  • the homogeneous, diffusely reflecting surface of the reference image can be a white sheet of paper.
  • the reflection properties of the diffusely reflecting surface correspond here to those of a Lambertian radiator.
  • each image from the plurality of images is normalized with a reference image which shows the defect-free material layer with that direction of illumination with which the corresponding image from the plurality of images was recorded.
  • the image of the defect-free material layer can advantageously be low-pass filtered. Such a normalization has enabled particularly good inspection results.
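A minimal sketch of such a normalization, assuming a division by the low-pass-filtered reference image; the filter width and the division-based scheme are assumptions, since the patent does not spell out the arithmetic:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def normalize_with_reference(image, reference, sigma=15.0, eps=1e-6):
    """Normalize a recorded image with a reference image of the same
    illumination direction; the reference is low-pass filtered first."""
    smoothed_ref = gaussian_filter(reference.astype(np.float64), sigma=sigma)
    return image.astype(np.float64) / (smoothed_ref + eps)

# Placeholder data: a recorded image and a reference of a defect-free layer.
img = np.random.rand(512, 512)
ref = 0.8 + 0.1 * np.random.rand(512, 512)
print(normalize_with_reference(img, ref).mean())
```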
  • the at least one error vector is stored as a historical error vector together with a time stamp which identifies the material layer, with the further defined workpiece layers being generated as a function of the historical error vector.
  • this makes it possible, in the repeated steps c), i.e. when inspecting subsequent material layers, to check whether a layer defect detected in the current material layer is persistent over a number of layers.
  • the design helps to reduce "false alarms" and to discontinue and/or modify the selective solidification of the particulate material of a current layer of material only when actually necessary to achieve a required part quality. For example, isolated layer defects that are limited to one material layer or apparent layer defects that are actually not present and appear as such due to light reflections, for example, can be eliminated in an efficient manner.
  • the current error vector can be compared with one or more historical error vectors in a deterministic manner after the inspection using the statistical learning model, i.e. as part of a post-processing. For example, a potential layer failure can be ignored if the historical failure vectors do not indicate the same layer failure in previous layers of material.
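A small sketch of such a deterministic post-processing step; the threshold and the persistence criterion are assumed values:

```python
import numpy as np

def persistent_defects(current, history, threshold=0.5, min_layers=2):
    """Keep only defect indications that also appear in at least `min_layers`
    of the stored historical error vectors."""
    current = np.asarray(current)
    history = np.asarray(history)                     # shape: (n_layers, n_classes)
    seen_before = (history > threshold).sum(axis=0) >= min_layers
    return (current > threshold) & seen_before

current_vec = [0.1, 0.8, 0.2, 0.7, 0.0]
historical = [[0.0, 0.9, 0.1, 0.1, 0.0],
              [0.1, 0.7, 0.0, 0.2, 0.0]]
print(persistent_defects(current_vec, historical))
# [False  True False False False] -> only the second defect class is persistent
```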
  • the at least one image of the material layer is stored as a historical image together with a time stamp which identifies the material layer, with the further defined workpiece layers being generated as a function of the historical image.
  • the history of detected layer defects is provided using the recorded images.
  • the design makes it possible to supply one or more historical images together with a respective current image to the trained learning model as a common multi-channel input data set.
  • the statistical learning model can then advantageously carry out the inspection of the material surface taking into account the history and recognize temporal correlations.
  • the at least one current image and one or more historical images can each form a channel of a convolutional neural network.
  • the statistical learning model can have a short-term memory, such as that implemented by an LSTM (Long Short Term Memory) network. Accordingly, in some embodiments, the statistical learning model may be an LSTM convolutional neural network.
  • This configuration can be combined particularly advantageously with the determination of a height map of the material layer, with a convolutional neural network being used, which performs the convolution operations three-dimensionally.
  • the first two dimensions can be the spatial pixel information along the X and Y axes of the height map and the third dimension of the convolution operations can be time using the current height map and one or more historical height maps.
  • the input data set can be a tensor whose dimensions correspond to the width and height of the height maps and the number of historical and current height maps.
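A minimal sketch of this idea, assuming PyTorch and an input tensor built from one current and three historical height maps (the counts and channel sizes are illustrative): the maps are stacked along a depth axis and processed with a three-dimensional convolution, so that each kernel covers the two spatial axes and the layer/time axis simultaneously.

```python
import torch
import torch.nn as nn

# One current and three historical 256 x 256 height maps form the input tensor.
n_maps, height, width = 4, 256, 256
height_maps = torch.rand(1, 1, n_maps, height, width)  # (batch, channel, depth, H, W)

# A 3D convolution correlates spatial and layer-history information at once.
conv3d = nn.Conv3d(in_channels=1, out_channels=8, kernel_size=3, padding=1)
features = conv3d(height_maps)
print(features.shape)  # torch.Size([1, 8, 4, 256, 256])
```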
  • the individual properties of the material layers from the repeated steps c) are each stored together with a time stamp that identifies the respective material layers, the workpiece being released for use depending on the stored individual properties.
  • the process-accompanying inspection of the material layers is advantageously used to assess the suitability of the workpiece produced for its intended use after the end of the manufacturing process.
  • the stored individual properties can advantageously also be used to document a quality assurance process. The design contributes in a very efficient manner to achieving high product quality in an additive manufacturing process.
  • FIG. 1 shows a schematic representation of an exemplary embodiment of the new device
  • FIG. 2 shows a flow chart for explaining an exemplary embodiment of the new method
  • FIG. 3 shows a flowchart to explain the inspection of the material surface according to exemplary embodiments of the method from FIG. 2,
  • FIG. 4 shows a simplified representation for explaining the functioning of a convolutional neural network, which can be advantageously used in exemplary embodiments of the new method and the new device,
  • FIG. 5 shows a simplified representation of an exemplary embodiment with an encoder-decoder architecture
  • FIG. 6 shows a simplified representation of a further exemplary embodiment with an encoder-decoder architecture.
  • an embodiment of the new device is denoted by the reference numeral 10 in its entirety.
  • the device 10 has a manufacturing platform 12 on which a workpiece 14 is produced additively according to an exemplary embodiment of the new method.
  • the workpiece 14 is built up in layers from bottom to top from a material stack in sequential steps.
  • the reference number 16 indicates a currently uppermost workpiece contour or uppermost workpiece layer.
  • An uppermost layer of material from which the workpiece layer 16 is produced is denoted by the reference number 18 .
  • the production platform 12 is typically lowered in the direction of the arrow 24 by the height of the next material layer, and the particulate material 20 is removed from a reservoir 26 and distributed over the existing layer stack with the aid of the doctor blade 22.
  • a structuring tool is shown here in simplified form at reference number 28 .
  • the structuring tool 28 generates a laser beam 30 and moves it relative to the production platform 12 and the material layer 18 to be structured.
  • the material particles are selectively melted and/or partially melted with the laser beam 30 so that they solidify as they cool.
  • the structuring tool 28 can generate an electron beam in order to structure a workpiece layer on the manufacturing platform 12 .
  • the apparatus 10 may include more than one patterning tool 28, such as using two or more laser and/or electron beams to create a workpiece layer.
  • the structuring tool 28, hereinafter referred to simply as the writing laser, is connected to an evaluation and control unit, hereinafter referred to as controller 32, which controls the movement of the laser beam 30 along the material surface.
  • controller 32 has an interface 34 here, via which a data set 36 can be read in, which defines the workpiece 14 to be produced in a multiplicity of layers arranged one on top of the other.
  • the controller 32 controls the movement of the laser beam 30 relative to the material stack as a function of the data set 36, the laser beam 30 describing a trajectory in each workpiece layer 16 to be produced, which results from the data set 36.
  • the controller 32 is implemented using one or more commercially available personal computers running an operating system, such as Microsoft Windows, MacOS or Linux, and one or more control programs with which exemplary embodiments of the new method are implemented.
  • the controller 32 can be implemented as a soft PLC on a commercially available PC.
  • the controller 32 can be implemented using dedicated control hardware with one or more ASICs, FPGAs, microcontrollers, microprocessors or comparable logic circuits.
  • the device 10 also has a measuring arrangement 38, 40 which is set up to inspect the surface of the layer stack.
  • the measuring arrangement here includes an illumination arrangement 38 and a camera 40, each of which is connected to the controller 32 (or to a separate controller for the measuring arrangement, not shown here).
  • the camera 40 is set up to record a plurality of images of the surface of the material stack, with the surface being illuminated in different directions.
  • the lighting arrangement 38 here includes a multiplicity of lighting modules 38a - 38f which are arranged at different positions relative to the production platform 12 .
  • the lighting arrangement 38 can be movable relative to the production platform 12 in order to illuminate the material surface from different directions.
  • the manufacturing platform 12 could be arranged on a turntable.
  • Fig. 1 three lighting modules 38a, 38b, 38c are arranged side by side.
  • the lighting modules 38a, 38b, 38c are thus able to generate three largely parallel illumination directions 42a, 42b (not shown here), 42c.
  • Three further lighting modules 38d, 38e, 38f are arranged here transversely to the lighting modules 38a, 38b, 38c and parallel to one another on a second side of the production platform 12.
  • the lighting modules 38d, 38e, 38f can generate three further lighting directions 42d (not shown here), 42e, 42f.
  • the device 10 has six further lighting modules (not shown here), of which three can be arranged opposite the lighting modules 38a-38c and another three can be arranged opposite the lighting modules 38d-38f.
  • the device 10 is able to illuminate the material surface from at least three different main directions.
  • the main directions can each contain three partial lighting directions that are offset parallel to one another, as is illustrated in FIG. 1. This enables a very advantageous determination of a 2.5D height map of the material surface using a method such as that described in DE 10 2017 108 874 A1 mentioned at the outset or in US 2020/158499 A1 of equal priority.
  • the device 10 could have a ring light with a large number of individually and/or segmentally controllable light sources, with the ring light (not shown) being arranged above the production platform 12 and preferably around the material stack in order to allow illumination of the material surface 18 from different directions.
  • a data record 36 is read into the controller 32, which defines the workpiece 14 in a multiplicity of workpiece layers 16 arranged one on top of the other.
  • the controller 32 could first receive a data set via the interface 34 that defines the workpiece to be produced “as a whole”, such as a CAD data set, and based on this determine the multiplicity of workpiece layers 16 arranged one on top of the other. In this case, too, the controller 32 ultimately receives a data set defining the workpiece 14 in a plurality of stacked workpiece layers 16.
  • a material layer 18 is created on the layer stack.
  • the controller 32 can remove particulate material 20 from the reservoir 26 with the aid of the doctor blade 22 and distribute it on the layer stack.
  • the distribution of the particulate material 20 should be as uniform and homogeneous as possible.
  • the surface of the material layer 18 is then inspected using the measuring device 38, 40 in order to detect any anomalies, such as in particular grooves, holes, depressions, waves, accumulations of material, density variations and/or particle inhomogeneities (e.g. clumping) in the material layer 18. If the surface of the new material layer 18 meets all desired criteria, the method branches according to step 56 to step 58, according to which a new workpiece layer 16 is produced in the uppermost material layer 18 using the writing laser 28.
  • the writing laser 28 selectively melts material particles along the defined trajectory and in this way connects the melted or partially melted particles to one another.
  • if the surface of the new material layer 18 does not, or not sufficiently, meet the desired criteria, the method can return to step 52 according to loop 60 in order to rework the surface of the material layer 18 or to create it completely from scratch. According to step 62, steps 52-58 are repeated until the workpiece 14 corresponding to the data set 36 is completed.
  • a freshly produced workpiece layer 16 can be specifically inspected using the measuring device 38, 40, which is indicated by reference number 64. Depending on this, a subsequent workpiece layer can then be modified in order to correct a deviation in shape or size, for example.
  • the workpiece produced can be released for an intended use based on the history of the inspections from the repeated steps 52 and/or 64.
  • FIG. 3 shows an exemplary embodiment for the inspection of the material layer 18 according to step 54 and, if necessary, step 64 from FIG. 2.
  • the surface of the material layer 18 is preferably illuminated here with short-wave light.
  • the material surface is illuminated here from a first direction.
  • the first direction can be the illumination direction 42a according to FIG. 1, for example.
  • the camera 40 is used to record a first image I of the illuminated material surface.
  • the material surface is illuminated from a further direction, for example from the illumination direction 42c according to FIG. 1.
  • a further image Jk of the material surface is recorded according to step 74, while the material surface is illuminated from the further direction 42c.
  • in step 76, a decision is made as to whether further images of the material surface should be taken with illumination from other directions, for example from the illumination directions 42e, 42f according to FIG. 1. In some exemplary embodiments, the different illumination directions can be generated with different wavelengths/light colors, and the different images can be separated from one another on the basis of the different wavelengths/light colors.
  • the recorded images are normalized using a respectively selected reference image in order to obtain a uniform illumination level regardless of the arrangement of the illumination modules in the working space of the device 10 and any manufacturing tolerances.
  • the reference image can show a white sheet of paper that was captured by the camera 40 .
  • the recorded images are each normalized with an associated reference image that shows a defect-free material layer with the particulate material and the same illumination as the illumination of the recorded image that is normalized.
  • a 2.5D height map of the material surface 18 can be determined using the recorded and possibly normalized images, as has already been mentioned above with reference to DE 10 2017 108 874 A1 or US 2020/158499 A1.
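The method of DE 10 2017 108 874 A1 and US 2020/158499 A1 is not reproduced here. Purely as an illustration of the general idea of recovering surface shape from images taken under different illumination directions, the following classical photometric-stereo-style sketch estimates surface gradients under a Lambertian assumption; a complete 2.5D height map would additionally require integrating these gradients:

```python
import numpy as np

def surface_gradients(images, light_dirs):
    """Estimate per-pixel surface gradients from images taken under known
    illumination directions, assuming a diffusely reflecting surface.
    This is a generic photometric-stereo sketch, not the patented method."""
    h, w = images[0].shape
    I = np.stack([im.reshape(-1) for im in images], axis=0)  # (n_lights, h*w)
    L = np.asarray(light_dirs, dtype=float)                  # (n_lights, 3)
    G, *_ = np.linalg.lstsq(L, I, rcond=None)                # G = albedo * normal
    nz = np.where(np.abs(G[2]) < 1e-6, 1e-6, G[2])
    p = (-G[0] / nz).reshape(h, w)   # dz/dx
    q = (-G[1] / nz).reshape(h, w)   # dz/dy
    return p, q

# Placeholder data: four images and four known illumination directions.
imgs = [np.random.rand(64, 64) for _ in range(4)]
dirs = [(1, 0, 1), (-1, 0, 1), (0, 1, 1), (0, -1, 1)]
p, q = surface_gradients(imgs, dirs)
print(p.shape, q.shape)
```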
  • the elevation map and/or the normalized images are each provided with a time stamp that is representative of the material layer 18 currently recorded.
  • the time stamp can be a serial number that is incremented with each new material layer 18 .
  • the height map and/or the normalized images from step 78 are supplied to a previously trained statistical learning model, which is preferably implemented here as a convolutional neural network.
  • the basic functioning of such a network is explained in more detail with reference to FIGS. 4 to 6.
  • image 90a includes a current height map of the top layer of material 18 and image 90b includes the height map of a previous, underlying layer of material.
  • Several such historical elevation maps of previous layers of material in the sequence of layers can be supplied to the statistical learning model together with the current elevation map. For example, height maps of two, three, four, or five previous layers of material may be included in the stack of input images 90a, 90b.
  • the stack of input images 90a, 90b can each contain current--preferably normalized--images of the uppermost material layer 18, each with a different direction of illumination. Accordingly, determination of a height map can be omitted in these exemplary embodiments.
  • the stack of input images 90a, 90b can contain current and historical images, each with different directions of illumination.
  • the stack of input images 90a, 90b can contain current and/or historical images, each with different illumination directions, as well as one or more height maps (current and/or historical).
  • the input images 90a, 90b are now each convolved with one of a plurality of filter masks 92a, 92b.
  • the convolution operations lead to an image stack 94 in which the respective convolution result is contained for each input image 90a, 90b and each filter mask 92, 92b.
  • the stack 94 is then subjected to a step known to those skilled in the art as the ReLU step (step 84 according to FIG. 3), in which the stack 94 is purposefully de-linearized.
  • what is known as a pooling step can then follow, which can be implemented in particular as what is known as max pooling.
  • in the pooling step 86, a stack 96 is generated in which redundant image information is reduced.
  • Steps 82, 84, 86 can be repeated several times, as indicated by reference number 88 in FIG. 3, with the stack from the previous steps serving as the input data set for the further convolution and pooling steps. In some embodiments, between three and ten convolution steps 82, ReLU steps 84, and pooling steps 86 can be cascaded one after the other. It is also possible to carry out sequences of several convolution steps and/or de-linearization steps before a pooling step is carried out in each case.
  • one or more error vectors 100 can be determined in a further step 98, in particular using the so-called softmax function, which provides a probability distribution depending on the preceding stack 96'.
  • Each error vector 100 therefore contains a multiplicity of individual error probabilities 102a, 102b, each individual error probability 102a, 102b being an individual indicator of whether a defined layer error from the multiplicity of layer errors mentioned above is present in the inspected material layer 18.
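For illustration, a numerically stable softmax as it could be applied per pixel over the class scores; the shapes and the class count are assumptions:

```python
import numpy as np

def softmax(logits, axis=0):
    """Turn per-class scores into a probability distribution."""
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Illustrative logits for six classes (five defect classes plus defect-free)
# at every pixel of a 4 x 4 image patch; the values are placeholder data.
logits = np.random.randn(6, 4, 4)
probs = softmax(logits, axis=0)
print(probs[:, 0, 0], probs[:, 0, 0].sum())  # one error vector per pixel, sums to 1.0
```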
  • the training data includes height maps and/or normalized images of material surfaces that have one or more of the layer defects mentioned above and, in addition, at least one height map and/or images of a material surface that is free of defects.
  • a manufacturing process can be disrupted in a targeted manner, for example by overfilling with powder material, introducing grooves or waves in the material surface, mechanical shocks, deliberate incorrect control of the printing process, such as moving the writing beam too fast and/or setting the writing intensity too high, and other disturbances.
  • images or elevation maps recorded in this way can be mirrored or manipulated during image processing to obtain a wide range of training data.
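A tiny sketch of such an augmentation by mirroring and rotation; which specific manipulations are used is not prescribed by the patent:

```python
import numpy as np

def augment(image):
    """Generate mirrored and rotated variants of a training image or height map."""
    variants = [image, np.fliplr(image), np.flipud(image)]
    variants += [np.rot90(image, k) for k in (1, 2, 3)]
    return variants

print(len(augment(np.random.rand(128, 128))))  # 6 variants from one recorded image
```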
  • the training data are preferably supplied to the convolutional neural network 112 in advance in several training cycles.
  • the resulting error vectors are analyzed for each training data set supplied.
  • various error measures that can be optimized during the learning steps are conceivable for the analysis, such as the average error across all pixels of the defect map (usually cross entropy) or perceptual error measures (e.g. adversarial losses).
  • the parameters of the filter masks 92a, 92b are modified using a suitable optimization method until the selected target functions are sufficiently optimized, for example until the resulting error vectors correctly represent the layer errors in the training data.
  • such an optimization method preferably includes a gradient descent variant (for example stochastic gradient descent with or without momentum, Adam, RMSProp, etc.).
  • the backpropagation algorithm is preferably used for the efficient calculation of the gradients. Training can also be terminated earlier, for example if a previously defined training time budget has been used up or if the error of the model during training, evaluated on a separate validation data set, does not decrease any further. The statistical learning model is then sufficiently trained.
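A condensed training-loop sketch combining the cross-entropy objective, the Adam optimizer, backpropagation and early stopping on a validation set; `model`, the data loaders, the learning rate and the patience value are placeholders and assumptions, not values from the patent:

```python
import torch
import torch.nn as nn

def train(model, train_loader, val_loader, max_epochs=50, patience=5, lr=1e-3):
    """Assumes `model` maps image stacks to per-pixel class logits and that the
    loaders yield (images, labels) batches with integer class labels."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()        # average cross entropy over all pixels
    best_val, stale = float("inf"), 0
    for epoch in range(max_epochs):
        model.train()
        for images, labels in train_loader:
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()                  # gradients via backpropagation
            optimizer.step()
        model.eval()
        with torch.no_grad():
            val = sum(criterion(model(x), y).item() for x, y in val_loader) / len(val_loader)
        if val < best_val:
            best_val, stale = val, 0
        else:
            stale += 1
            if stale >= patience:            # validation error no longer decreasing
                break
    return model
```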
  • Shi et al., "Convolutional LSTM Network: A Machine Learning Approach for Precipitation Nowcasting", in Advances in Neural Information Processing Systems, 2015, pp. 802-810.
  • the error vectors are determined according to step 98 for each pixel of the height map and/or for each pixel of the normalized images, resulting in the probability of the presence of layer errors at the pixel level.
  • an individual error vector can be determined for the material layer 18 to be inspected, or partial areas can be defined within the material layer 18, for each of which an individual error vector is determined using the convolutional neural network.
  • the error vectors are each provided with a time stamp that identifies the material layer currently being inspected.
  • the time-stamped error vectors can be stored as historical error vectors in a memory 106 (see Figure 1).
  • the respective current error vectors can be compared with the historical error vectors from the memory 106 in order to identify layer errors that are persistent over several layers of material. Conversely, layer defects which, for example, are only detected in a current material layer and which do not appear in the following material layers either, can be discarded as "outliers".
  • the error vectors can be compared according to step 108 as part of a deterministic comparison.
  • historical layer defect information can be supplied to the statistical learning model in further exemplary embodiments as a stack of height maps or input images 90a, 90b, so that the resulting error vectors 100 already represent the individual error probabilities taking into account the persistence of any layer defects.
  • the individual properties of the material layer 18, in particular the presence or absence of defined layer defects, are determined using the defect probabilities 102a, 102b of the defect vectors.
  • the extent and/or shape of the detected layer defects are determined by comparing the defect vectors/probabilities of defects for different pixels or pixel areas.
  • the inspected material surface is advantageously segmented and classified in step 110 into defect-free sub-areas and defective sub-areas, with defective sub-areas being differentiated from one another depending on the type of layer defect detected in each case. Based on these properties, according to step 56 from FIG. 2, a decision is made as to whether and how the method for manufacturing the workpiece is continued.
  • the training of the statistical learning model 112′ can take place using training data, which in particular contain groups of normalized images of material surfaces, each of which has one or more of the layer defects to be detected.
  • the training data contain a large number of training images 120a, 120b and, in addition, annotated versions 122a, 122b of the training images 120a, 120b.
  • the layer defects to be detected are marked in the annotated versions 122a, 122b.
  • the annotated versions 122a, 122b represent a target result that the statistical learning model 112' is intended to deliver after completion of the training when the training images 120a, 120b are supplied to it again as input data.
  • filter masks 92a, 92b for the convolution operations of the statistical learning model 112' are modified, for example with stochastic gradient descent with or without momentum, Adam, RMSProp, etc., until the statistical learning model 112' reproduces the annotated versions 122a, 122b within the framework of a defined error and/or termination criterion.
  • the annotated versions 122a, 122b can advantageously be generated by an experienced expert with the aid of a visual inspection of the training images 120a, 120b and an individual marking of the detected layer defects.
  • after the statistical learning model 112' has been adequately trained, in preferred exemplary embodiments of the method for additively manufacturing a workpiece it is supplied with the respective current images 92 and preferably also with historical images of previous material layers and/or current height maps and/or historical height maps.
  • the training images 120a, 120b can be images which are recorded during the layer-by-layer production of a specified workpiece using the new method and which thus show a large number of material layers which were produced in the course of the production process.
  • the statistical learning model can be retrained using these training images and optimized in relation to the manufactured workpiece.
  • FIG. 5 shows a statistical learning model 112' which has an encoder-decoder architecture.
  • an error map 124 can be created by deconvolution steps, which represents detected layer defects in a spatially resolved manner.
  • the error map 124 can contain an individual error probability or an error vector with a large number of individual error probabilities for each pixel, the error probabilities being representative of whether the respective pixel shows a layer error to be detected.
  • FIG. 6 shows another statistical learning model 112'' with an encoder-decoder architecture.
  • the statistical learning model 112′′ determines a multiplicity of increasingly filtered image stacks 94, 94′ in numerous successive convolution steps, delinearization steps and pooling steps. The image stacks are then successively expanded again in an upsampling path.
  • the respective stacks are combined here with the respective intermediate results from the previous convolution steps, for example concatenated, as indicated by the arrows 126.
  • such a statistical learning model 112′′ is advantageously based on the U-Net model mentioned above.
  • the statistical learning model 112′′ supplies an error map 124 with spatially resolved error probabilities that locally represent the presence or absence of layer errors.
  • the error map 124 here contains a multiplicity of error vectors, each with a multiplicity of individual error probabilities, preferably one error vector for each pixel of the images 92a, 92b, 92c, 92d.
  • the spatial dimensions of detected layer defects can advantageously be determined on the basis of such a defect map 124.
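
The encoder-decoder structure with skip connections described above can be illustrated by a minimal sketch. This is not the patented implementation: it assumes PyTorch, and the class name `SmallUNet`, the channel counts, the number of input channels (e.g. images 92a-92d stacked as channels) and the number of defect classes are all illustrative assumptions.

```python
# Minimal U-Net-style encoder-decoder sketch (illustrative only): the encoder produces
# increasingly filtered image stacks by convolution, non-linear activation and pooling;
# the decoder upsamples them again and concatenates the intermediate results of the
# encoder path (corresponding to the skip connections indicated by the arrows 126).
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )

class SmallUNet(nn.Module):
    def __init__(self, in_channels=4, n_defect_classes=3):
        super().__init__()
        self.enc1 = conv_block(in_channels, 16)
        self.enc2 = conv_block(16, 32)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(32, 64)
        self.up2 = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.dec2 = conv_block(64, 32)           # 32 upsampled + 32 skip channels
        self.up1 = nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2)
        self.dec1 = conv_block(32, 16)           # 16 upsampled + 16 skip channels
        self.head = nn.Conv2d(16, n_defect_classes, kernel_size=1)

    def forward(self, x):                        # x: stacked input images, e.g. 92a-92d
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))   # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
        return self.head(d1)                     # per-pixel defect logits ("error vectors")

# Per-pixel error probabilities (the error map 124) are obtained with a sigmoid/softmax:
# probs = torch.sigmoid(SmallUNet()(torch.randn(1, 4, 128, 128)))
```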
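The training procedure referred to above (adjusting the filter masks with a gradient-based optimizer until the model reproduces the annotated versions within a termination criterion) can be sketched as follows. The function name, the loss function, the hyperparameters and the data loader are assumptions for illustration and are not taken from the patent.

```python
# Minimal training sketch (illustrative only): the convolution weights (filter masks)
# are updated with a gradient-based optimizer such as Adam until the model output
# matches the annotated defect masks within a chosen error/termination criterion.
import torch
import torch.nn as nn

def train_defect_model(model, loader, epochs=50, lr=1e-3, target_loss=1e-2):
    # Per-pixel binary cross-entropy compares the predicted error map with the
    # annotated (ground-truth) defect mask of each training image.
    criterion = nn.BCEWithLogitsLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)  # SGD with momentum or RMSProp would also work
    for epoch in range(epochs):
        running = 0.0
        for images, annotated_masks in loader:   # training images 120a/120b, annotations 122a/122b
            optimizer.zero_grad()
            logits = model(images)               # spatially resolved defect logits
            loss = criterion(logits, annotated_masks)
            loss.backward()                      # gradients w.r.t. the filter masks
            optimizer.step()                     # update the convolution weights
            running += loss.item() * images.size(0)
        mean_loss = running / len(loader.dataset)
        if mean_loss < target_loss:              # defined error / termination criterion
            break
    return model
```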
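Determining the extent and shape of detected layer defects from the error map, and segmenting the material surface into defect-free and defective sub-areas, can be approximated by simple post-processing of the per-pixel probabilities. The threshold value, the dictionary keys and the use of SciPy are illustrative assumptions, not part of the patented method.

```python
# Minimal post-processing sketch (illustrative only): the per-pixel error probabilities
# of the error map 124 are thresholded and grouped into connected sub-areas, whose pixel
# count and bounding box approximate the extent and shape of each detected layer defect.
import numpy as np
from scipy import ndimage

def segment_defects(error_map, threshold=0.5):
    """error_map: array of shape (n_defect_classes, H, W) with per-pixel probabilities."""
    regions = []
    for defect_class in range(error_map.shape[0]):
        mask = error_map[defect_class] >= threshold          # defective vs. defect-free pixels
        labels, n = ndimage.label(mask)                      # connected defective sub-areas
        for region_id in range(1, n + 1):
            ys, xs = np.nonzero(labels == region_id)
            regions.append({
                "defect_class": defect_class,                # type of layer defect
                "area_px": ys.size,                          # extent in pixels
                "bbox": (ys.min(), xs.min(), ys.max(), xs.max()),  # rough shape/extent
            })
    return regions
```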

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Materials Engineering (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Analytical Chemistry (AREA)
  • Plasma & Fusion (AREA)
  • Biophysics (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Molecular Biology (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Linguistics (AREA)
  • Biomedical Technology (AREA)
  • Mechanical Engineering (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)
EP21746757.0A 2020-08-19 2021-07-23 Verfahren und vorrichtung zur additiven herstellung eines werkstücks Pending EP4200123A1 (de)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
DE102020121760.9A DE102020121760A1 (de) 2020-08-19 2020-08-19 Verfahren und Vorrichtung zur additiven Herstellung eines Werkstücks
DE102020126571 2020-10-09
DE102021102125 2021-01-29
PCT/EP2021/070751 WO2022037899A1 (de) 2020-08-19 2021-07-23 Verfahren und vorrichtung zur additiven herstellung eines werkstücks

Publications (1)

Publication Number Publication Date
EP4200123A1 true EP4200123A1 (de) 2023-06-28

Family

ID=77104089

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21746757.0A Pending EP4200123A1 (de) 2020-08-19 2021-07-23 Verfahren und vorrichtung zur additiven herstellung eines werkstücks

Country Status (4)

Country Link
US (1) US20230294173A1 (zh)
EP (1) EP4200123A1 (zh)
CN (1) CN115867403A (zh)
WO (1) WO2022037899A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021041422A1 (en) * 2019-08-27 2021-03-04 The Regents Of The University Of California Ai-powered autonomous 3d printer

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9855698B2 (en) 2013-08-07 2018-01-02 Massachusetts Institute Of Technology Automatic process control of additive manufacturing device
DE102013217422A1 (de) 2013-09-02 2015-03-05 Carl Zeiss Industrielle Messtechnik Gmbh Koordinatenmessgerät und Verfahren zur Vermessung und mindestens teilweisen Erzeugung eines Werkstücks
DE102016201289A1 (de) 2016-01-28 2017-08-03 Siemens Aktiengesellschaft Verfahren zur additiven Herstellung und Vorrichtung
US20180095450A1 (en) * 2016-09-30 2018-04-05 Velo3D, Inc. Three-dimensional objects and their formation
US20180099333A1 (en) * 2016-10-11 2018-04-12 General Electric Company Method and system for topographical based inspection and process control for additive manufactured parts
GB2559579B (en) * 2017-02-08 2021-08-11 Reliance Prec Limited Method of and apparatus for additive layer manufacture
DE102017108874A1 (de) 2017-04-26 2018-10-31 Carl Zeiss Ag Materialprüfung mit strukturierter Beleuchtung
WO2018234331A1 (de) * 2017-06-20 2018-12-27 Carl Zeiss Ag Verfahren und vorrichtung zur additiven fertigung
EP3459715A1 (en) * 2017-09-26 2019-03-27 Siemens Aktiengesellschaft Method and apparatus for predicting the occurrence and type of defects in an additive manufacturing process
US10857738B2 (en) * 2018-03-19 2020-12-08 Tytus3D System Inc. Systems and methods for real-time defect detection, and automatic correction in additive manufacturing environment
US11042992B2 (en) * 2018-08-03 2021-06-22 Logitech Europe S.A. Method and system for detecting peripheral device displacement
EP3646968A1 (en) * 2018-10-30 2020-05-06 Siemens Aktiengesellschaft Method for automatically preventing defects potentially arising during an additive manufacturing process and manufacturing device
CN110163858A (zh) * 2019-05-27 2019-08-23 成都数之联科技有限公司 一种铝型材表面缺陷检测与分类方法及系统
CN111007073B (zh) * 2019-12-23 2021-02-05 华中科技大学 一种増材制造过程中零件缺陷在线检测的方法及检测系统
CN111151751B (zh) * 2020-01-02 2022-03-22 汕头大学 一种三激光束智能增减材复合制造系统

Also Published As

Publication number Publication date
CN115867403A (zh) 2023-03-28
WO2022037899A1 (de) 2022-02-24
US20230294173A1 (en) 2023-09-21

Similar Documents

Publication Publication Date Title
DE102018129425B4 (de) System zur Erkennung eines Bearbeitungsfehlers für ein Laserbearbeitungssystem zur Bearbeitung eines Werkstücks, Laserbearbeitungssystem zur Bearbeitung eines Werkstücks mittels eines Laserstrahls umfassend dasselbe und Verfahren zur Erkennung eines Bearbeitungsfehlers eines Laserbearbeitungssystems zur Bearbeitung eines Werkstücks
DE102018129441B4 (de) System zur Überwachung eines Laserbearbeitungsprozesses, Laserbearbeitungssystem sowie Verfahren zur Überwachung eines Laserbearbeitungsprozesses
EP3401411B1 (de) Verfahren und vorrichtung zur fehlerstellenerkennung bei biegeschlaffen körpern, insbesondere tierhäuten
DE102021114967A1 (de) Werkstückinspektions- und fehlererkennungssystem, das die anzahl der fehlerbilder für das training angibt
DE69403574T2 (de) Automatische Inspektion von Druckplatten oder Zylindern
DE102019114012A1 (de) Mikroskopieverfahren, Mikroskop und Computerprogramm mit Verifikationsalgorithmus für Bildverarbeitungsergebnisse
EP1132732A2 (de) Verfahren zur Bewertung von strukturfehlern auf einer Waferoberfläche
EP3767403B1 (de) Machine-learning gestützte form- und oberflächenmessung zur produktionsüberwachung
DE102018214063A1 (de) Maschinenlernvorrichtung, Maschinenlernsystem und Maschinenlernverfahren
DE102018133196A1 (de) Bildbasierte wartungsvorhersage und detektion von fehlbedienungen
DE102020126554A1 (de) Mikroskopiesystem und verfahren zum überprüfen von eingabedaten
DE102021115002A1 (de) Werkstückinspektions- und fehlererkennungssystem einschliesslich überwachung der werkstückbilder
DE102019211656A1 (de) Bestimmung eines Verschleißgrades eines Werkzeugs
WO2021219515A1 (de) Verfahren, vorrichtung und computerprogramm zum erzeugen von qualitätsinformation über ein beschichtungsprofil, verfahren, vorrichtung und computerprogramm zum erzeugen einer datenbank, überwachungsgerät
WO2020212489A1 (de) Computer-implementiertes verfahren zur bestimmung von defekten eines mittels eines additiven fertigungsprozesses hergestellten objekts
EP4200123A1 (de) Verfahren und vorrichtung zur additiven herstellung eines werkstücks
DE102019131678A1 (de) Mikroskop und Verfahren zum Verarbeiten von Mikroskopbildern
EP3482348A1 (de) Verfahren und einrichtung zur kategorisierung einer bruchfläche eines bauteils
DE102022123355A1 (de) Werkstückinspektion und fehlererkennungssystem unter verwendung von farbkanälen
DE102020121760A1 (de) Verfahren und Vorrichtung zur additiven Herstellung eines Werkstücks
DE102022205312A1 (de) Überwachungssystem und additives fertigungssystem
DE102018133092B3 (de) Computer-implementiertes Verfahren zur Analyse von Messdaten aus einer Messung eines Objektes
DE102020214803A1 (de) Spritzguss-System
DE102020212510A1 (de) Verfahren und Vorrichtung zum Aufzeigen des Einflusses von Schneidparametern auf eine Schnittkante
DE102019009301B4 (de) Verfahren und Vorrichtung zur additiven Herstellung eines Werkstücks

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230317

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)