WO2023280679A1 - Automatic seam detection for a welding process - Google Patents


Info

Publication number
WO2023280679A1
Authority
WO
WIPO (PCT)
Prior art keywords
computer
welding
input data
visual input
welding seam
Application number
PCT/EP2022/068122
Other languages
French (fr)
Inventor
Muhammad Sarmad
Jawad Tayyub
Rabia ALI
Alexander Vogel
Elmar Wosnitza
Philipp Leufke
Original Assignee
Endress+Hauser SE+Co. KG
Application filed by Endress+Hauser SE+Co. KG filed Critical Endress+Hauser SE+Co. KG
Publication of WO2023280679A1 publication Critical patent/WO2023280679A1/en

Classifications

    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T 7/00 Image analysis
            • G06T 7/10 Segmentation; Edge detection
              • G06T 7/12 Edge-based segmentation
            • G06T 7/70 Determining position or orientation of objects or cameras
          • G06T 2207/00 Indexing scheme for image analysis or image enhancement
            • G06T 2207/20 Special algorithmic details
              • G06T 2207/20084 Artificial neural networks [ANN]
            • G06T 2207/30 Subject of image; Context of image processing
              • G06T 2207/30108 Industrial image inspection
                • G06T 2207/30152 Solder
        • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V 10/00 Arrangements for image or video recognition or understanding
            • G06V 10/40 Extraction of image or video features
              • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
                • G06V 10/443 Local feature extraction by analysis of parts of the pattern, by matching or filtering
            • G06V 10/70 Arrangements using pattern recognition or machine learning
              • G06V 10/82 Arrangements using neural networks
          • G06V 2201/00 Indexing scheme relating to image or video recognition or understanding
            • G06V 2201/06 Recognition of objects for industrial automation
        • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
          • G06N 3/00 Computing arrangements based on biological models
            • G06N 3/02 Neural networks
              • G06N 3/04 Architecture, e.g. interconnection topology
                • G06N 3/045 Combinations of networks
                  • G06N 3/0455 Auto-encoder networks; Encoder-decoder networks
                • G06N 3/0464 Convolutional networks [CNN, ConvNet]
              • G06N 3/08 Learning methods
                • G06N 3/09 Supervised learning
    • B PERFORMING OPERATIONS; TRANSPORTING
      • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
        • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
          • B23K 26/00 Working by laser beam, e.g. welding, cutting or boring
            • B23K 26/02 Positioning or observing the workpiece, e.g. with respect to the point of impact; Aligning, aiming or focusing the laser beam
              • B23K 26/04 Automatically aligning, aiming or focusing the laser beam, e.g. using the back-scattered light
                • B23K 26/044 Seam tracking

Definitions

  • the present invention relates to a computer-implemented method for detecting a welding seam position in a welding process, a data processing system comprising means for carrying out the method, a welding apparatus comprising such data processing system, a computer program and a computer-readable medium.
  • Welding is widely applied in various fields of technology, in particular for joining or repairing metal components. Frequently, welding processes are carried out in an at least partially automated manner, e.g. using welding robots or computer-guided welding processes.
  • the different welding technologies available such as tungsten inert gas welding, metal inert gas welding or spot welding, are in principle known from various state of the art documents and will not be described in detail in this application.
  • Another welding technology constantly gaining popularity is laser welding, owing to several advantages, e.g. high precision, low heat-affected zones, a high energy density, a high welding speed, and low shape distortions.
  • the efficiency and accuracy of a weld highly depend on a precise detection of the welding seam position, in particular in laser welding processes, for which stringent precision requirements need to be fulfilled.
  • the welding seam position is defined as the edge to be joined, e.g. between two components to be welded.
  • Seam position detection and control is usually carried out either completely manually or by using various types of sensors, e.g. displacement or ultrasonic sensors.
  • ultrasonic sensors necessitate a stable non-interference state which is difficult to achieve in an industrial environment.
  • the seam position is frequently detected by means of an applied algorithm based on visual input about the joint. Also these systems usually require human intervention on a regular basis to correct and verify detected seam positions, e.g. the Hough edge detector algorithm (Qian-Qian Wu et al.
  • EP2792447B1 relates to a welding position detecting apparatus and method, for which an imaging device captures, at a predetermined time interval, images of an irradiated portion of a material to be welded and a surrounding area thereof, an image processing device that identifies a position of the irradiated portion by performing an image processing calculation from two or more images acquired by the imaging device, in which a direction and an amount of parallel movement of points in the images is calculated, and a display device that displays the position of the irradiated portion, wherein the position is identified by the image processing device.
  • the objective technical problem underlying the present invention is to provide an improved possibility for edge detection in a welding process.
  • the object is achieved by the method according to claim 1 , the data processing system according to claim 10, the welding apparatus according to claim 11 , the computer program according to claim 12 and the computer-readable medium according to claim 13.
  • the objective technical problem is solved by a computer- implemented method for detecting a welding seam position in a welding process, in particular a laser welding process, the method comprising the steps of: receiving visual input data of a welding edge, providing the visual input data as an input to a neural network configured to determine the welding seam position based on the visual input data, and outputting the welding seam position.
  • Neural networks, belonging to the field of machine learning, are based on a plurality of interconnected units called artificial neurons, which are typically organized in various layers and are able to detect complex and nonlinear relationships.
  • the use of a neural network for automatically detecting a welding seam position in a welding process results in very high robustness and high resolution. The required resolution, in particular down to pixel level or to the millimeter, especially submillimeter, range, can be achieved without any need for human intervention. Such high precision is not achievable in a fully automated manner in the state of the art.
  • the present invention provides superior precision and a full automation level of the welding seam detection.
  • the high precision also ensures a correct and precise positioning of a welding zone mandatory for a high quality of the welding.
  • the visual input data can e.g. be an image, a series of images or a video.
  • the input data can be collected by means of a visual sensing device, e.g. a camera. It is either possible to use the entire input data, e.g. an image, as input or to cut out a certain region of interest which isolates the welding seam area prior to processing the input data.
  • the input data can also be subjected to any data processing means e.g. for cropping out shadow and glare background effects, prior to providing the visual input data as an input to the neural network.
  • the neural network preferably is trained before it is employed in the method according to the present invention.
  • a trained neural network is employed for the method.
  • the training is preferably performed by an individually created and/or labeled data set of input data of welding seams, in particular relating to the particular welding apparatus used for welding.
  • the neural network is a convolutional neural network (CNN).
  • CNNs have the advantage that less pre-processing of the input data is required compared to other classification algorithms. CNNs are capable of capturing spatial and temporal dependencies in an image and thus improve the precision of the method according to the present invention.
  • the convolutional neural network is a, especially modified, U-net convolutional neural network. U-net networks have been proposed by Olaf Ronneberger et al. in “U-net: Convolutional networks for biomedical image segmentation”, CoRR, abs/1505.04597, 2015. Compared to other convolutional networks, U-net convolutional networks can operate based on smaller training data sets while still yielding highly precise segmentation. While U-net convolutional networks were originally developed mainly for biomedical applications, it is a finding of the present invention that, by modification of such an architecture, increased precision regarding the automatic seam position detection can be achieved.
  • the method according to the present invention comprises the step of providing the visual input data to a preprocessing module including a binary classifier configured to classify the visual input data based on at least one predeterminable criterion relating to a quality parameter of the visual input data.
  • the quality parameter can e.g. be an image quality parameter.
  • the predeterminable criterion is related to a clarity of the visual input data, the presence of lighting artefacts, or the presence of a joint in the input data.
  • the classifier can either be based on any standard data processing algorithm, e.g. a filtering algorithm, e.g. regarding the brightness of the visual input data, or any other filtering or processing means.
  • the classifier can also be embodied as a neural network, in particular a trained neural network, or as part of a neural network architecture, the classifier being configured to classify the visual input data based on the predeterminable criterion. Only visual input data which fulfills the predeterminable criterion is forwarded as input to the neural network configured to determine the welding seam position.
  • the preprocessing module serves for further improving the precision and accuracy of the welding seam position detection, because only visual input data of sufficient quality is forwarded towards the neural network configured to determine the position of the welding seam.
  • the security of the detection process of the welding seam position is increased, because in an unknown situation, e.g. where the joint cannot be clearly recognized in the visual input data, carrying out the welding process can be prevented.
  • the classifier in principle is trained to detect normal and abnormal input data by distinguishing visual input data falling under the at least one predeterminable criterion from such data which do not fulfill the criterion. Any input data that deviates from the, especially predeterminable, normal conditions defined by the criterion will be flagged as abnormal and not forwarded as input to the neural network configured to determine the position of the welding seam.
  • the preprocessing module further includes an alert operator configured to output an alert in case the visual input data does not fulfill the predeterminable criterion.
  • assistance from an operator of the welding apparatus for which the method of the present invention is used can be sought, further increasing security of the welding process.
  • a warning, in particular a visual and/or sound signal may be output and/or the welding process can be stopped until the process is checked by the operator.
  • the welding seam position is output in form of a mathematical function representing the welding seam, in particular by outputting at least one parameter characterizing the mathematical function.
  • the mathematical function can be specified in advance and be adopted to the welding apparatus in use. For example, there are welding machines where the work pieces can be moved horizontally and/or vertically for producing the weld along the welding edge, but also machines where the welding system is moved while the work pieces remain in a fixed position, or where both the work pieces and the welding system are movably arranged. In each type different mathematical functions need to be chosen for defining or describing the welding seam position.
  • the mathematical function can e.g. be any, especially continuous, function. However, it is of advantage if the mathematical function is a line, in particular a vertical or horizontal line. This allows a characterization of the function by outputting one or two parameters and thus provides a straightforward output of the welding seam position.
  • the welding seam position is output in form of an edge mask.
  • the edge mask defines the welding seam position and may be related to the mathematical function describing the welding seam position.
  • the welding seam position can be defined as a boundary between two different areas of the edge mask.
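As an illustrative sketch (not part of the patent disclosure; the function name and the convention that the upper area of the mask is labeled 0 and the lower area 1 are assumptions), reading the seam position off a binary edge mask as the boundary between its two areas, and thereby recovering the single parameter of a horizontal line, can look like this:

```python
import numpy as np

def seam_row_from_edge_mask(mask):
    """Given a binary edge mask whose two areas (0 above, 1 below) meet at the
    seam, return the row index of the boundary, i.e. the single parameter of
    a horizontal line describing the welding seam position."""
    mask = np.asarray(mask)
    # First row in which the lower area (label 1) appears, per column.
    first_one_per_column = np.argmax(mask, axis=0)
    # A straight horizontal seam gives the same row in every column; the
    # median makes the estimate robust to single-pixel noise.
    return int(np.median(first_one_per_column))

# 6x8 mask: rows 0-2 belong to the upper component, rows 3-5 to the lower one.
mask = np.zeros((6, 8), dtype=np.uint8)
mask[3:, :] = 1
print(seam_row_from_edge_mask(mask))  # -> 3
```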
  • the objective technical problem is also achieved by means of a data processing system comprising means for carrying out the method according to the present invention.
  • the objective technical problem underlying the present invention is achieved by means of a welding apparatus, in particular a laser welding apparatus comprising the data processing system according to the present invention.
  • the objective technical problem is also achieved by a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method according to the present invention and finally, by a computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the method according to the present invention.
  • Fig. 1 illustrates the advantages of a detection of the welding seam position with pixel-level accuracy
  • Fig. 2 shows a block diagram of a first preferred embodiment of the method according to the present invention
  • Fig. 3 shows a block diagram of a second preferred embodiment of the method according to the present invention including a pre-processing module, and
  • Fig. 4 shows a scheme of a welding apparatus according to the present invention.
  • identical elements are each provided with the same reference signs.
  • Fig. 1 illustrates the importance of a highly accurate and precise welding seam detection.
  • Two components 2 and 3, especially metal components, are to be welded along welding seam 5. If the welding seam position 1 is correctly and precisely detected (Fig. 1a) at the welding seam 5, the welding zone 4 is also precisely provided along the welding edge 5. The result is a high quality and stable weld.
  • In case of a very small offset, e.g. an offset of less than one millimeter, the welding zone 4 where the weld is produced is no longer centered at the position of the welding edge.
  • FIG. 2 a first embodiment of the method according to the present invention is shown.
  • Visual input data 6 is exemplarily provided in the form of an image of the joint, i.e. here an image of the welding edge of the two metal components 2 and 3 to be welded.
  • the visual input data 6 may be processed by any processing means [not shown] prior to using the input data 6 as input for the neural network 8 of the welding seam position detection block 7, which e.g. may be a computer program or be part of a data processing system 9 for carrying out the method.
  • the processing means may include choosing a region of interest in the input data 6 or cropping out shadow and glare background effects.
  • the input data 6 is provided as input to neural network 8 which is configured to determine the welding seam position 5 based on the visual input data 6.
  • the welding seam position 5 may be output in the form of an edge mask 10a produced in the optional block 10, as illustrated in the output 12 image of the welding seam of the two components 2 and 3.
  • the welding seam position 5 can also be output in the form of a mathematical function, preferably defined by at least one relevant parameter describing the mathematical function of the welding seam position 5.
  • the welding seam position 5 is a mathematical function in the form of a horizontal line 11 defined by a single parameter section as also illustrated in the output 12 image of the welding seam of the two components 2 and 3.
  • neural network 8 is embodied in the form of a modified U-net convolutional network.
  • a binary edge mask is provided as output, as is the case for the example shown in Figs. 2 and 3, by providing an architecture that is converted to this one-dimensional output.
  • a pixel-wise binary cross-entropy loss L_ED can be used to train the U-net architecture, which binary cross-entropy loss may be defined as L_ED = -Σ_i [y_i log(p(y_i)) + (1 - y_i) log(1 - p(y_i))], wherein the loss is calculated for each pixel i, and where y_i is the class label of pixel i and p(y_i) is the probability of the predicted class.
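A minimal numpy sketch of such a pixel-wise binary cross-entropy (the function name, the clipping constant and the test values are illustrative assumptions):

```python
import numpy as np

def pixelwise_bce(p, y, eps=1e-7):
    """Pixel-wise binary cross-entropy: the mean over all pixels i of
    -[y_i*log(p(y_i)) + (1 - y_i)*log(1 - p(y_i))], where p holds the
    predicted seam probabilities and y the pixel labels.  Clipping avoids
    log(0) for saturated predictions."""
    p = np.clip(np.asarray(p, dtype=float), eps, 1.0 - eps)
    y = np.asarray(y, dtype=float)
    return float(np.mean(-(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))))

y = np.array([[1.0, 0.0], [1.0, 0.0]])      # ground-truth seam labels
good = np.array([[0.9, 0.1], [0.9, 0.1]])   # confident, correct prediction
bad = np.array([[0.1, 0.9], [0.1, 0.9]])    # confident, wrong prediction
print(pixelwise_bce(good, y) < pixelwise_bce(bad, y))  # -> True
```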
  • the input data 6 is first provided to a preprocessing module 13 including a binary classifier 14 to classify the visual input data 6 based on at least one predeterminable criterion, e.g. dividing the input data 6 into normal and abnormal images based on at least one predeterminable criterion relating to a quality parameter of the visual input data 6.
  • the preprocessing module further includes an alert operator 15 configured to output an alert in case the visual input data 6 does not fulfill the predeterminable criterion.
  • the alert operator 15 can e.g. be embodied so as to provide an alarm stopping the welding process. If, on the other hand, the predeterminable criterion is fulfilled, the input data 6 is transferred to block 7 and provided as input to neural network 8.
  • the binary classifier 14 may be embodied in the form of any standard data processing algorithm, e.g. a filtering algorithm. However, it can also be embodied in the form of a neural network, which can be trained together with or separate from neural network 8.
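A toy sketch of the filtering-algorithm variant of the classifier, gating on image brightness (thresholds, names and values are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def brightness_classifier(image, low=0.05, high=0.95):
    """Stand-in for a simple filtering-based binary classifier: label an
    image "1" (normal) if its mean brightness lies inside a predeterminable
    range, "0" (abnormal, e.g. saturated by glare or almost black) otherwise."""
    mean = float(np.mean(image))
    return 1 if low <= mean <= high else 0

normal = np.full((4, 4), 0.5)   # well-exposed image
glare = np.full((4, 4), 0.99)   # saturated by glare
print(brightness_classifier(normal), brightness_classifier(glare))  # -> 1 0
```

A trained neural classifier would replace the threshold test while keeping the same "1"/"0" output contract.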
  • the visual input data 6 serves as input and the output can be a label specifying e.g. either a “1” (normal image) or a “0” (abnormal image). Abnormal may correspond to cases where the input data 6 lacks clarity or where a welding seam is already present in the input data 6.
  • a binary cross-entropy loss L_EC, where EC refers to the edge classification, which is used to train the binary classifier based on N training images, may then be calculated as L_EC = -(1/N) Σ_{i=1..N} [y_i log(p(y_i)) + (1 - y_i) log(1 - p(y_i))], where y_i is the class label (here: “1” or “0”) and p(y_i) is the probability of the predicted class.
  • the welding seam position detection block 7 only operates if the preprocessing module 13 outputs a label “1” for the corresponding visual input data 6, i.e. a label corresponding to fulfillment of the predeterminable criterion.
  • the output of block 7 and module 13 may be written as
  • preprocessing module 13 further increases the detection accuracy.
  • the preprocessing module ensures that spurious input data 6 caused e.g. by misplacement of parts, blurring, machine errors or the like, are filtered out.
  • Fig. 4 finally shows a schematic drawing of a welding apparatus 16 which for the present example is a laser welding apparatus.
  • the apparatus 16 includes a visual sensing system in the form of a camera for producing the visual input data 6 of the joint of the two components 2 and 3 located on a work piece platform which can be turned around a central axis.
  • the apparatus 16 further includes a welding system, here in the form of a laser system 18, with a laser used to produce welds of the work pieces 2 and 3.
  • the laser system is arranged such that it can be horizontally and vertically moved and turned around a central axis in order to follow any seam position to be welded.
  • the welding apparatus 16 may further comprise a computer or any data processing system which is embodied to carry out an embodiment of the method according to the present invention.
  • Reference symbols:
    1 detected welding seam position
    2 component to be welded
    3 component to be welded
    4 welding zone
    5 welding seam position
    6 visual input data
    7 welding seam position detection block
    8 neural network configured to determine the welding seam position
    9 data processing system
    10 block to produce edge mask
    10a edge mask
    11 mathematical function, esp. in the form of a line
    12 output image
    13 preprocessing module
    14 binary classifier
    15 alert operator
    16 welding apparatus
    17 camera
    18 laser system
    19 work piece platform

Abstract

The present invention relates to a computer-implemented method for detecting a welding seam position in a welding process, in particular a laser welding process, the method comprising the steps of: receiving visual input data (6) of a welding edge; providing the visual input data (6) as an input to a neural network (8) configured to determine the welding seam position based on the visual input data (6); and outputting the welding seam position. The present invention further relates to a data processing system (9), a welding apparatus, a computer program, and a computer-readable medium.

Description

Automatic Seam Detection for a Welding Process
The present invention relates to a computer-implemented method for detecting a welding seam position in a welding process, a data processing system comprising means for carrying out the method, a welding apparatus comprising such data processing system, a computer program and a computer-readable medium.
Welding is widely applied in various fields of technology, in particular for joining or repairing metal components. Frequently, welding processes are carried out in an at least partially automated manner, e.g. using welding robots or computer-guided welding processes. The different welding technologies available, such as tungsten inert gas welding, metal inert gas welding or spot welding, are in principle known from various state-of-the-art documents and will not be described in detail in this application. Another welding technology constantly gaining popularity is laser welding, owing to several advantages, e.g. high precision, low heat-affected zones, a high energy density, a high welding speed, and low shape distortions.
The efficiency and accuracy of a weld highly depend on a precise detection of the welding seam position, in particular in laser welding processes, for which stringent precision requirements need to be fulfilled. The welding seam position is defined as the edge to be joined, e.g. between two components to be welded. Seam position detection and control is usually carried out either completely manually or by using various types of sensors, e.g. displacement or ultrasonic sensors. However, most sensors are associated with specific problems. For instance, ultrasonic sensors necessitate a stable, interference-free state, which is difficult to achieve in an industrial environment.
In case of visual sensing systems used for the seam detection, in particular sensing systems with auxiliary light sources, the seam position is frequently detected by means of an algorithm applied to visual input about the joint. These systems also usually require human intervention on a regular basis to correct and verify detected seam positions, e.g. the Hough edge detector algorithm (Qian-Qian Wu et al., “A study on the modified Hough algorithm for image processing in weld seam tracking”, Journal of Mechanical Science and Technology, 29(11): 4859-4865, 2015) or the Canny edge detection algorithm (Jin-Yun Lu et al., “The weld image edge-detection algorithm combined with Canny operator and mathematical morphology”, Proceedings of the 32nd Chinese Control Conference, pp. 4467-4470, 2013).
Involving human intervention in the seam detection process in welding systems is not only time-intensive, but also results in varying quality and results, due to deteriorating human performance with increasing working hours and the subjective judgment of different persons.
EP2792447B1 relates to a welding position detecting apparatus and method comprising an imaging device that captures, at a predetermined time interval, images of an irradiated portion of a material to be welded and a surrounding area thereof, an image processing device that identifies a position of the irradiated portion by performing an image processing calculation on two or more images acquired by the imaging device, in which a direction and an amount of parallel movement of points in the images are calculated, and a display device that displays the position of the irradiated portion identified by the image processing device.
Several further approaches to provide a high level of automation in seam detection processes using visual input data such as images rely on structured light methods based on optical triangulation, as e.g. described by Yanbiao Zou and Tao Chen in “Laser vision seam tracking system based on image processing and continuous convolution operator tracker”, Optics and Lasers in Engineering, 105: 141-149, 2018. Similar methods were also described to detect and control the motion of a welding torch in real time (Xinde Li et al., “Automatic welding seam tracking and identification”, IEEE Transactions on Industrial Electronics, 64(9): 7261-7271, 2017). However, such methods, which typically rely on hand-crafted algorithms using surface patterns, often become problematic in case of joints with relatively narrow gaps or in case of components and materials to be welded which feature relatively poor contrast with respect to each other or against a background. To reduce the influence of background artefacts and to help focus on the welding seam, it was suggested (Yanbiao Zou and Weilin Zhou in “Automatic seam detection and tracking system for robots based on laser vision”, Mechatronics, 63: 102261, 2019; Mitchell Dinham et al. in “Experiments on automatic seam detection for a MIG welding robot”, International Conference on Artificial Intelligence and Computational Intelligence, pp. 390-397, Springer, 2011; Andres Ryberg et al. in “Stereo vision for path correction in off-line programmed robot welding”, 2010 IEEE International Conference on Industrial Technology, pp. 170-1705, IEEE, 2010) to consider only certain regions of interest as input. However, the procedures suggested are still limited in terms of achievable resolution and precision.
Therefore, the objective technical problem underlying the present invention is to provide an improved possibility for edge detection in a welding process.
The object is achieved by the method according to claim 1, the data processing system according to claim 10, the welding apparatus according to claim 11, the computer program according to claim 12 and the computer-readable medium according to claim 13.
With respect to the method, the objective technical problem is solved by a computer-implemented method for detecting a welding seam position in a welding process, in particular a laser welding process, the method comprising the steps of: receiving visual input data of a welding edge, providing the visual input data as an input to a neural network configured to determine the welding seam position based on the visual input data, and outputting the welding seam position.
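By way of illustration only, the three claimed steps (receive visual input data, determine the seam position with a neural network, output the position) can be sketched as a minimal pipeline. All names, the optional classifier gate, and the output format are assumptions for this sketch; the patent does not prescribe an API:

```python
def detect_welding_seam(visual_input, classifier, network):
    """Skeleton of the claimed method: receive visual input data, optionally
    gate it through a binary quality classifier (cf. the preprocessing module),
    feed it to a neural network that determines the welding seam position,
    and output that position."""
    if classifier is not None and classifier(visual_input) == 0:
        # Abnormal input data: alert the operator instead of welding.
        raise ValueError("abnormal input data: alert operator, stop welding")
    return network(visual_input)

# Stubs standing in for the trained classifier and seam-detection network.
always_normal = lambda image: 1
fixed_seam = lambda image: {"line": "horizontal", "row": 3}

print(detect_welding_seam([[0, 1], [0, 1]], always_normal, fixed_seam))
```

The classifier and network arguments are deliberately plain callables, so a trained model can be substituted without changing the surrounding control flow.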
Neural networks (NNs), belonging to the field of machine learning, are based on a plurality of interconnected units called artificial neurons, which are typically organized in various layers and are able to detect complex and nonlinear relationships. With respect to the present invention, the use of a neural network for automatically detecting a welding seam position in a welding process results in very high robustness and high resolution. The required resolution, in particular down to pixel level or to the millimeter, especially submillimeter, range, can be achieved without any need for human intervention. Such high precision is not achievable in a fully automated manner in the state of the art.
Thus, the present invention provides superior precision and a full automation level of the welding seam detection. The high precision also ensures a correct and precise positioning of the welding zone, which is mandatory for a high quality of the weld. The visual input data can e.g. be an image, a series of images or a video. The input data can be collected by means of a visual sensing device, e.g. a camera. It is either possible to use the entire input data, e.g. an image, as input or to cut out a certain region of interest which isolates the welding seam area prior to processing the input data. The input data can also be subjected to any data processing means, e.g. for cropping out shadow and glare background effects, prior to providing the visual input data as an input to the neural network.
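Cutting out a region of interest that isolates the welding seam area can be sketched as follows (a minimal numpy sketch; the function name and index ranges are illustrative assumptions, not from the patent):

```python
import numpy as np

def crop_region_of_interest(image, row_range, col_range):
    """Cut out the region of interest that isolates the welding seam area
    before the data is passed to the neural network."""
    r0, r1 = row_range
    c0, c1 = col_range
    return image[r0:r1, c0:c1]

# A dummy 10x10 "camera frame"; the ROI keeps rows 2-7 and columns 3-6.
frame = np.arange(100).reshape(10, 10)
roi = crop_region_of_interest(frame, (2, 8), (3, 7))
print(roi.shape)  # -> (6, 4)
```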
The neural network preferably is trained before it is employed in the method according to the present invention. In this case, a trained neural network is employed for the method. The training is preferably performed by an individually created and/or labeled data set of input data of welding seams, in particular relating to the particular welding apparatus used for welding.
In one embodiment, the neural network is a convolutional neural network (CNN). CNNs have the advantage that less pre-processing of the input data is required compared to other classification algorithms. CNNs are capable of capturing spatial and temporal dependencies in an image and thus improve the precision of the method according to the present invention.
It is further of advantage, if the convolutional neural network is a, especially modified, U-net convolutional neural network. U-nets have been proposed by Olaf Ronneberger et al. in "U-net: Convolutional networks for biomedical image segmentation", CoRR, abs/1505.04597, 2015. Compared to other convolutional networks, U-net convolutional networks can operate on smaller training data sets while still yielding a highly precise segmentation. While U-net convolutional networks were originally developed mainly for biomedical applications, it is a finding of the present invention that, by modifying such an architecture, increased precision regarding the automatic seam position detection can be achieved.
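The characteristic U-net idea, a contracting path, an expanding path and skip connections that carry encoder features into the decoder, can be illustrated in miniature. The sketch below is a structural toy, not a trained network: real U-nets interleave learned convolutions at every level, which are replaced here by fixed pooling/upsampling to keep the example self-contained:

```python
import numpy as np

def max_pool2x2(x):
    """Downsample by taking the max over non-overlapping 2x2 blocks (contracting path)."""
    h, w = x.shape
    return x[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def upsample2x(x):
    """Nearest-neighbour upsampling (expanding path)."""
    return x.repeat(2, axis=0).repeat(2, axis=1)

def tiny_unet_pass(x):
    """One encoder/decoder level with a skip connection, U-net style."""
    skip = x                              # feature map saved for the skip connection
    down = max_pool2x2(x)                 # contracting path
    up = upsample2x(down)                 # expanding path
    return np.stack([skip, up], axis=0)   # channel-wise concatenation with the skip

out = tiny_unet_pass(np.arange(16.0).reshape(4, 4))
print(out.shape)  # (2, 4, 4): skip channel plus upsampled channel
```

The skip connection is what lets a U-net combine coarse context (from the downsampled path) with fine spatial detail (from the skip), which is why such architectures suit pixel-accurate seam segmentation.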
In a preferred embodiment, the method according to the present invention comprises the step of providing the visual input data to a preprocessing module including a binary classifier configured to classify the visual input data based on at least one predeterminable criterion relating to a quality parameter of the visual input data. In case the visual input data are provided as images, the quality parameter can e.g. be an image quality parameter.

It is of advantage, if the predeterminable criterion is related to a clarity of the visual input data, the presence of lighting artefacts, or the presence of a joint in the input data.

The classifier can be based on any standard data processing algorithm, e.g. a filtering algorithm regarding the brightness of the visual input data, or any other filtering or processing means. However, the classifier can also be embodied as a neural network, in particular a trained neural network, or as a part of a neural network architecture, the classifier being configured to classify the visual input data based on the predeterminable criterion. Only visual input data which fulfills the predeterminable criterion is forwarded as input towards the neural network configured to determine the welding seam position.
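A classifier based on such a standard filtering algorithm, here brightness, as one example of a quality parameter, can be sketched as follows; the function name and the threshold band are hypothetical placeholders:

```python
import numpy as np

def brightness_classifier(image, low=30.0, high=220.0):
    """Label an image 1 (normal) if its mean brightness lies inside a plausible
    band, 0 (abnormal) otherwise. Thresholds are illustrative, not calibrated."""
    mean_brightness = float(np.mean(image))
    return 1 if low <= mean_brightness <= high else 0

dark = np.full((8, 8), 5.0)       # e.g. lighting failure -> abnormal
normal = np.full((8, 8), 128.0)   # plausible exposure -> normal
print(brightness_classifier(dark), brightness_classifier(normal))  # 0 1
```

A learned classifier would replace the fixed thresholds with a trained decision boundary, but the normal/abnormal gating role is the same.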
The preprocessing module serves for further improving the precision and accuracy of the welding seam position detection, because only visual input data of sufficient quality is forwarded towards the neural network configured to determine the position of the welding seam.
Furthermore, the security of the detection process of the welding seam position is increased, because in an unknown situation, e.g. where the joint cannot be clearly recognized in the visual input data, carrying out the welding process can be prevented.
The classifier in principle is trained to detect normal and abnormal input data by distinguishing visual input data falling under the at least one predeterminable criterion from such data which do not fulfill the criterion. Any input data that deviates from the, especially predeterminable, normal conditions defined by the criterion will be flagged as abnormal and not forwarded as input towards the neural network configured to determine the position of the welding seam.

It is also of advantage, if the preprocessing module further includes an alert operator configured to output an alert in case the visual input data does not fulfill the predeterminable criterion. In case the criterion is not fulfilled, assistance from an operator of the welding apparatus for which the method of the present invention is used can e.g. be sought, further increasing the security of the welding process. In such cases, a warning, in particular a visual and/or sound signal, may be output and/or the welding process can be stopped until the process is checked by the operator.
In another preferred embodiment of the present invention, the welding seam position is output in form of a mathematical function representing the welding seam, in particular by outputting at least one parameter characterizing the mathematical function. The mathematical function can be specified in advance and be adapted to the welding apparatus in use. For example, there are welding machines where the work pieces can be moved horizontally and/or vertically for producing the weld along the welding edge, but also machines where the welding system is moved while the work pieces remain in a fixed position, or where both the work pieces and the welding system are movably arranged. For each type, a different mathematical function needs to be chosen for defining or describing the welding seam position.
The mathematical function can e.g. be any, especially continuous, function. However, it is of advantage, if the mathematical function is a line, in particular a vertical or horizontal line. This allows a characterization of the function by outputting one or two parameters and thus provides a straightforward output of the welding seam position.
Another preferred embodiment includes that the welding seam position is output in form of an edge mask. The edge mask defines the welding seam position and may be related to the mathematical function describing the welding seam position. For example, the welding seam position can be defined as a boundary between two different areas of the edge mask.
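Deriving the single line parameter from such an edge mask can be sketched as follows, assuming, hypothetically, a binary mask in which the area above the seam is 0 and the area below it is 1, so that the seam position is the boundary between the two areas:

```python
import numpy as np

def seam_row_from_mask(mask):
    """Given a binary edge mask (area above the seam = 0, below = 1),
    take the first 1-entry in each column as the boundary and characterize
    the horizontal line y = c by the mean boundary row c."""
    boundary_rows = np.argmax(mask, axis=0)  # index of first 1 per column
    return float(boundary_rows.mean())

mask = np.zeros((6, 4), dtype=int)
mask[3:, :] = 1                              # boundary at row 3 in every column
print(seam_row_from_mask(mask))  # 3.0
```

Averaging over the columns makes the single output parameter robust to isolated noisy pixels along the boundary; a vertical-line variant would average over rows instead.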
The objective technical problem is also achieved by means of a data processing system comprising means for carrying out the method according to the present invention.
Further, the objective technical problem underlying the present invention is achieved by means of a welding apparatus, in particular a laser welding apparatus comprising the data processing system according to the present invention. The objective technical problem is also achieved by a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method according to the present invention and finally, by a computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the method according to the present invention.
It shall be noted that the embodiments described in connection with the method are mutatis mutandis applicable to the data processing system, the welding apparatus, the computer program and the computer-readable medium.
The invention will be explained in more detail with reference to the following figures.
Fig. 1 illustrates the advantages of a detection of the welding seam position with pixel-level accuracy,
Fig. 2 shows a block diagram of a first preferred embodiment of the method according to the present invention,
Fig. 3 shows a block diagram of a second preferred embodiment of the method according to the present invention including a pre-processing module, and
Fig. 4 shows a scheme of a welding apparatus according to the present invention. In the figures, identical elements are each provided with the same reference signs.
Fig. 1 illustrates the importance of a highly accurate and precise welding seam detection. Two components 2 and 3, especially metal components, are to be welded along welding seam 5. If the welding seam position 1 is correctly and precisely detected (Fig. 1a) at the welding seam 5, the welding zone 4 is also precisely provided along the welding edge 5. The result is a high-quality and stable weld. Even a very small offset (e.g. an offset of less than one millimeter) between the detected seam position 1 and the welding seam position 5 results in a highly reduced weld quality between the two components 2 and 3, as illustrated in Fig. 1b, because the welding zone 4, where the weld is produced, is no longer centered at the position of the welding edge. A greater offset, even if still in the millimeter range, can even prevent a joint of the two components 2 and 3 from being produced at all, as shown in Fig. 1c. Thus, it is of utmost importance to detect the welding seam position 1 accurately and precisely prior to joining the components 2 and 3 to be welded.
Such welding seam position detection can automatically and precisely be achieved by means of the method according to the present invention. In Fig. 2 a first embodiment of the method according to the present invention is shown. Visual input data 6 is exemplarily provided in the form of an image of the joint, i.e. here an image of the welding edge of the two metal components 2 and 3 to be welded. The visual input data 6 may be processed by any processing means [not shown] prior to using the input data 6 as input for the neural network 8 of the welding seam position detection block 7, which e.g. may be a computer program or be part of a data processing system 9 for carrying out the method. The processing means may include choosing a region of interest in the input data 6 or cropping out shadow and glare background effects.
In order to detect the welding seam position 5, the input data 6 is provided as input to neural network 8, which is configured to determine the welding seam position 5 based on the visual input data 6. There are different possibilities to output the welding seam position 5. Optionally, the welding seam position 5 may be output in the form of an edge mask 10a produced in the optional block 10, as illustrated in the output image 12 of the welding seam of the two components 2 and 3. However, the welding seam position 5 can also be output in the form of a mathematical function, preferably defined by at least one relevant parameter describing the mathematical function of the welding seam position 5. For the present case, the welding seam position 5 is a mathematical function in the form of a horizontal line 11 defined by a single parameter, as also illustrated in the output image 12 of the welding seam of the two components 2 and 3.
Preferably, neural network 8 is embodied in the form of a modified U-net convolutional network. Assuming that a binary edge mask 10a is provided as output, as is the case for the example shown in Figs. 2 and 3, and providing an architecture that is converted to this one-dimensional output, a pixel-wise binary cross-entropy loss $L_{ED}$, where ED refers to the edge detection, can be used to train the U-net architecture, which binary cross-entropy loss may be defined as

$$L_{ED} = -\frac{1}{W}\sum_{i=1}^{W}\left[y_i \log(p(y_i)) + (1 - y_i)\log(1 - p(y_i))\right],$$

wherein the loss is averaged over all $W$ pixels, $y_i$ is the class label of the $i$th pixel, and $p(y_i)$ is the probability of the predicted class.
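The pixel-wise binary cross-entropy loss described above can be computed directly; a minimal NumPy sketch with illustrative label and probability arrays (the clipping constant is an implementation detail added for numerical stability, not part of the application):

```python
import numpy as np

def pixelwise_bce(y, p, eps=1e-7):
    """Binary cross-entropy averaged over all W pixels, as used to train
    the edge-detection (ED) network on binary edge masks."""
    p = np.clip(p, eps, 1.0 - eps)  # avoid log(0)
    return float(-np.mean(y * np.log(p) + (1.0 - y) * np.log(1.0 - p)))

y = np.array([1.0, 0.0, 1.0, 0.0])       # ground-truth edge labels per pixel
p_good = np.array([0.9, 0.1, 0.8, 0.2])  # confident, mostly correct predictions
p_bad = np.array([0.1, 0.9, 0.2, 0.8])   # confident but wrong predictions
print(pixelwise_bce(y, p_good) < pixelwise_bce(y, p_bad))  # True
```

The loss shrinks toward zero as the predicted edge probabilities approach the ground-truth mask, which is exactly the behaviour gradient descent exploits during training.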
In another preferred embodiment, the input data 6 is first provided to a preprocessing module 13 including a binary classifier 14 to classify the visual input data 6 based on at least one predeterminable criterion, e.g. dividing the input data 6 into normal and abnormal images based on at least one predeterminable criterion relating to a quality parameter of the visual input data 6. For the example shown in Fig. 3, the preprocessing module further includes an alert operator 15 configured to output an alert in case the visual input data 6 does not fulfill the predeterminable criterion. The alert operator 15 can e.g. be embodied so as to provide an alarm stopping the welding process. If, on the other hand, the predeterminable criterion is fulfilled, the input data 6 is transferred to block 7 and provided as input to neural network 8.
The binary classifier 14 may be embodied in the form of any standard data processing algorithm, e.g. a filtering algorithm. However, it can also be embodied in the form of a neural network, which can be trained together with or separately from neural network 8. For the example of a binary classifier 14 in the form of a neural network, the visual input data 6 serves as input and the output can be a label specifying e.g. either a "1" (normal image) or a "0" (abnormal image). Abnormal may correspond to cases where the input data 6 lacks clarity or where a welding seam is already present in the input data 6. A binary cross-entropy loss $L_{EC}$, where EC refers to the edge classification, which is used to train the binary classifier based on $N$ training images, may then be calculated as

$$L_{EC} = -\frac{1}{N}\sum_{i=1}^{N}\left[y_i \log(p(y_i)) + (1 - y_i)\log(1 - p(y_i))\right],$$

where $y_i$ is the class label (here: "1" or "0"), and $p(y_i)$ is the probability of the predicted class. The output of preprocessing module 13 is then given as
$$Label_i = EC(I_i),$$

EC being the edge classification module 14, $I_i$ the $i$th image and $Label_i$ the normal/abnormal label of the corresponding $i$th input image. When using a preprocessing module 13, the welding seam position detection block 7 only operates if the preprocessing module 13 outputs a label "1" for the corresponding visual input data 6, i.e. a label corresponding to fulfillment of the predeterminable criterion. The output of block 7 and module 13 may be written as

$$Output_i = \begin{cases} ED(I_i) & \text{if } Label_i = 1,\\ \text{alert, no detection} & \text{if } Label_i = 0. \end{cases}$$
The additional use of preprocessing module 13 further increases the detection accuracy. The preprocessing module ensures that spurious input data 6, caused e.g. by misplacement of parts, blurring, machine errors or the like, are filtered out.
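The gating of detection block 7 by preprocessing module 13 can be sketched as plain control flow; `gated_seam_detection` and the toy stand-ins for EC, ED and the alert operator below are hypothetical names for illustration only:

```python
def gated_seam_detection(image, ec, ed, alert):
    """Run the edge classifier EC first; only a 'normal' image (label 1) is
    forwarded to the edge-detection network ED, otherwise the alert operator
    is triggered and no seam position is output."""
    if ec(image) == 1:
        return ed(image)
    alert(image)
    return None

# toy stand-ins for the trained modules (illustrative only)
ec = lambda img: 1 if img.get("clear", False) else 0   # classifier 14
ed = lambda img: 3.0                                   # block 7: pretend seam row
alerts = []                                            # alert operator 15 log

print(gated_seam_detection({"clear": True}, ec, ed, alerts.append))   # 3.0
print(gated_seam_detection({"clear": False}, ec, ed, alerts.append))  # None
print(len(alerts))  # 1
```

Injecting EC, ED and the alert operator as callables mirrors the modular structure of Fig. 3: each block can be swapped (e.g. a brightness filter for a trained classifier) without changing the gating logic.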
Fig. 4 finally shows a schematic drawing of a welding apparatus 16 which for the present example is a laser welding apparatus. The apparatus 16 includes a visual sensing system in the form of a camera for producing the visual input data 6 of the joint of the two components 2 and 3 located on a work piece platform which can be turned around a central axis. The apparatus 16 further includes a welding system, here in the form of a laser system 18, with a laser used to produce welds of the work pieces 2 and 3. The laser system is arranged such that it can be horizontally and vertically moved and turned around a central axis in order to follow any seam position to be welded. The welding apparatus 16 may further comprise a computer or any data processing system which is embodied to carry out an embodiment of the method according to the present invention.
Reference symbols

1 detected welding seam position
2 component to be welded
3 component to be welded
4 welding zone
5 welding seam position
6 visual input data
7 welding seam position detection block
8 neural network configured to determine the welding seam position
9 data processing system
10 block to produce edge mask 10a
11 mathematical function, esp. in the form of a line
12 output image
13 preprocessing module
14 binary classifier
15 alert operator
16 welding apparatus
17 camera
18 laser system
19 work piece platform

Claims

1. A computer-implemented method for detecting a welding seam position (5) in a welding process, in particular a laser welding process, the method comprising the steps of: receiving visual input data (6) of a welding edge; providing the visual input data (6) as an input to a neural network (8) configured to determine the welding seam position (5) based on the visual input data (6); and outputting the welding seam position (5).
2. The computer-implemented method according to claim 1, wherein the neural network (8) is a convolutional neural network.
3. The computer-implemented method according to claim 2, wherein the convolutional neural network is a, especially modified, U-net convolutional neural network.
4. The computer-implemented method according to any of the preceding claims, comprising the step of providing the visual input data to a preprocessing module (13) including a binary classifier (14) configured to classify the visual input data (6) based on at least one predeterminable criterion relating to a quality parameter of the visual input data (6).

5. The computer-implemented method according to claim 4, wherein the predeterminable criterion is related to a clarity of the visual input data (6), the presence of lighting artefacts, or the presence of a joint in the input data (6).

6. The computer-implemented method according to claim 4 or 5, wherein the preprocessing module (13) further includes an alert operator (15) configured to output an alert in case the visual input data (6) does not fulfill the predeterminable criterion.
7. The computer-implemented method according to any of the preceding claims, wherein the welding seam position (5) is output in form of a mathematical function representing the welding seam, in particular by outputting at least one parameter characterizing the mathematical function.
8. The computer-implemented method according to claim 7, wherein the mathematical function is a line (11 ), in particular a vertical or horizontal line.
9. The computer-implemented method according to any of the preceding claims, wherein the welding seam position (5) is output in form of an edge mask (10a).
10. A data processing system (9) comprising means for carrying out the method according to any of the preceding claims.

11. A welding apparatus (16) comprising the data processing system (9) according to claim 10.
12. A computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method according to any of the preceding claims.
13. Computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the method according to any of the preceding claims.
PCT/EP2022/068122 2021-07-08 2022-06-30 Automatic seam detection for a welding process WO2023280679A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021117714.6A DE102021117714A1 (en) 2021-07-08 2021-07-08 Automatic seam detection for a welding process
DE102021117714.6 2021-07-08

Publications (1)

Publication Number Publication Date
WO2023280679A1 true WO2023280679A1 (en) 2023-01-12

Family

ID=82608505

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/068122 WO2023280679A1 (en) 2021-07-08 2022-06-30 Automatic seam detection for a welding process

Country Status (2)

Country Link
DE (1) DE102021117714A1 (en)
WO (1) WO2023280679A1 (en)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2792447B1 (en) 2011-12-15 2018-01-24 JFE Steel Corporation Laser welding position-detecting device and welding position-detecting method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101334807A (en) 2008-07-28 2008-12-31 中国航空工业第一集团公司北京航空制造工程研究所 Electro-beam welding joint melting-coagulation area shape factor modeling and solving method
CN104636760B (en) 2015-03-11 2017-09-05 西安科技大学 A kind of localization method of weld seam
CN109702293B (en) 2019-01-22 2020-01-14 清华大学 Welding penetration quality real-time control method based on visual detection
CN113012217A (en) 2019-12-22 2021-06-22 李新春 Image processing-based construction method of convolution neural network for welding seam positioning
CN112548273A (en) 2020-12-14 2021-03-26 天津科技大学 Weld joint automatic tracking method based on deep neural network
CN113034512B (en) 2021-03-15 2022-11-11 南京理工大学 Weld joint tracking method based on feature segmentation


Non-Patent Citations (9)

* Cited by examiner, † Cited by third party
Title
HE FENG ET AL: "Research on Weld Recognition Method Based on Mask R-CNN", 2021 IEEE ASIA-PACIFIC CONFERENCE ON IMAGE PROCESSING, ELECTRONICS AND COMPUTERS (IPEC), IEEE, 14 April 2021 (2021-04-14), pages 545 - 551, XP033911792, DOI: 10.1109/IPEC51340.2021.9421157 *
JIN-YUN LU ET AL.: "The weld image edge-detection algorithm combined with canny operator and mathematical morphology", PROCEEDINGS OF THE 32ND CHINESE CONTROL CONFERENCE, 2013, pages 4467 - 4470, XP032510881
QIAN-QIAN WU ET AL.: "A study on the modified hough algorithm for image processing in weld seam tracking", JOURNAL OF MECHANICAL SCIENCE AND TECHNOLOGY, vol. 29, no. 11, 2015, pages 4859 - 4865, XP035567758, DOI: 10.1007/s12206-015-1033-x
XINDE LI ET AL.: "Automatic welding seam tracking and identification", IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, vol. 64, no. 9, 2017, pages 7261 - 7271
YANBIAO ZOUTAO CHEN: "laser vision seam tracking system based on image processing and continuous convolution operator tracker", OPTICS AND LASERS IN ENGINEERING, vol. 105, 2018, pages 141 - 149
YANBIAO ZOUWEILIN ZHOU: "Automatic seam detection and tracking system for robots based on laser vision", MECHATRONICS, vol. 63, pages 102261
ZHANG WENBIN: "Semi-Supervised Training for Positioning of Welding Seams", 7 June 2021 (2021-06-07), pages 1 - 1, XP055972641, Retrieved from the Internet <URL:https://ruor.uottawa.ca/handle/10393/42257> [retrieved on 20221018] *
ZHANG WENBIN: "Semi-Supervised Training for Positioning of Welding Seams", 7 June 2021 (2021-06-07), XP055971927, Retrieved from the Internet <URL:https://ruor.uottawa.ca/bitstream/10393/42257/1/Zhang_Wenbin_2021_thesis.pdf> [retrieved on 20221018] *
ZOU YANBIAO ET AL: "Robust seam tracking via a deep learning framework combining tracking and detection", APPLIED OPTICS, vol. 59, no. 14, 10 May 2020 (2020-05-10), US, pages 4321, XP055972410, ISSN: 1559-128X, Retrieved from the Internet <URL:https://opg.optica.org/DirectPDFAccess/2011CBB5-9619-4EDE-A93800A1A0CE8904_431503/ao-59-14-4321.pdf?da=1&id=431503&seq=0&mobile=no> [retrieved on 20221018], DOI: 10.1364/AO.389730 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116503417A (en) * 2023-06-29 2023-07-28 武汉纺织大学 Automatic recognition, positioning and size calculation method for ultra-long weld joint and typical defect
CN116503417B (en) * 2023-06-29 2023-09-08 武汉纺织大学 Automatic recognition, positioning and size calculation method for ultra-long weld joint and typical defect

Also Published As

Publication number Publication date
DE102021117714A1 (en) 2023-01-12

Similar Documents

Publication Publication Date Title
Xu et al. Visual sensing technologies in robotic welding: Recent research developments and future interests
KR20210091789A (en) Laser Machining Process Monitoring System and Method Using Deep Convolutional Neural Network
Ma et al. Binocular vision system for both weld pool and root gap in robot welding process
Ye et al. A robust algorithm for weld seam extraction based on prior knowledge of weld seam
KR20210090264A (en) Processing error detection system and method of laser processing system using deep convolutional neural network
Xu et al. Real‐time image capturing and processing of seam and pool during robotic welding process
CN113269762A (en) Screen defect detection method, system and computer storage medium
Reisgen et al. Machine vision system for online weld pool observation of gas metal arc welding processes
WO2023280679A1 (en) Automatic seam detection for a welding process
CN106493495A (en) High-accuracy machine vision alignment system
Heber et al. Weld seam tracking and panorama image generation for on-line quality assurance
CN109108518A (en) A kind of online test method and device of Laser Welding hump defect
Soares et al. Seam tracking and welding bead geometry analysis for autonomous welding robot
Wang et al. Groove-center detection in gas metal arc welding using a template-matching method
JP2005014027A (en) Weld zone image processing method, welding management system, feedback system for welding machine, and butt line detection system
Liu et al. A real-time passive vision system for robotic arc welding
Lin et al. Intelligent seam tracking of an ultranarrow gap during K-TIG welding: a hybrid CNN and adaptive ROI operation algorithm
Krämer et al. Seam tracking with texture based image processing for laser materials processing
CN110238520B (en) Automatic precise laser welding method based on CCD visual detection
Ye et al. Weld seam tracking based on laser imaging binary image preprocessing
Li et al. A modified welding image feature extraction algorithm for rotating arc narrow gap MAG welding
CN113240629B (en) Edge-based image matching narrow-gap weld initial point positioning device and method
Pasinetti et al. In-line monitoring of laser welding using a smart vision system
Soares et al. Computer vision system for weld bead geometric analysis
US11320258B2 (en) Apparatus and method for edge detection when machining workpieces

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22743785

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE