WO2024057100A1 - System for detecting and correcting welding defects in real-time and method thereof - Google Patents

System for detecting and correcting welding defects in real-time and method thereof

Info

Publication number
WO2024057100A1
Authority
WO
WIPO (PCT)
Prior art keywords
welding
model
workpiece
trained
defect
Prior art date
Application number
PCT/IB2023/054293
Other languages
French (fr)
Inventor
Mrunal Vemuganti
Original Assignee
L&T Technology Services Limited
Priority date
Filing date
Publication date
Application filed by L&T Technology Services Limited filed Critical L&T Technology Services Limited
Publication of WO2024057100A1 publication Critical patent/WO2024057100A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G05B19/41875Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM] characterised by quality surveillance of production
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23KSOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K31/00Processes relevant to this subclass, specially adapted for particular articles or purposes, but not covered by only one of the preceding main groups
    • B23K31/006Processes relevant to this subclass, specially adapted for particular articles or purposes, but not covered by only one of the preceding main groups relating to using of neural networks
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23KSOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K31/00Processes relevant to this subclass, specially adapted for particular articles or purposes, but not covered by only one of the preceding main groups
    • B23K31/12Processes relevant to this subclass, specially adapted for particular articles or purposes, but not covered by only one of the preceding main groups relating to investigating the properties, e.g. the weldability, of materials
    • B23K31/125Weld quality monitoring
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/32Operator till task planning
    • G05B2219/32179Quality control, monitor production tool with multiple sensors
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/32Operator till task planning
    • G05B2219/32237Repair and rework of defect, out of tolerance parts, reschedule
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40307Two, dual arm robot, arm used synchronously, or each separately, asynchronously
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40584Camera, non-contact sensor mounted on wrist, indep from gripper
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/45Nc applications
    • G05B2219/45104Lasrobot, welding robot

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Theoretical Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Automation & Control Theory (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure relates to an AI-based system and method for detecting and correcting welding defects in real time. The method comprises generating a 3D model of a workpiece using a plurality of images of the workpiece. The method further comprises performing specific welding on the workpiece based on a plurality of welding parameters extracted from analysis of the 3D model of the workpiece using a trained AI model. The method further comprises monitoring the welding in real-time and detecting defect(s) in the welding. In response to detecting the defect(s) in the welding, the welding is discontinued and the defect(s) corrected. Thereafter, the welding is continued and monitored until the welding of the workpiece is completed. Further, non-destructive testing of the welded workpiece is performed to identify any remaining defects in the welded workpiece, and the remaining defects are corrected.

Description

SYSTEM FOR DETECTING AND CORRECTING WELDING DEFECTS IN REAL-TIME AND METHOD THEREOF
[001] TECHNICAL FIELD
[002] The present disclosure generally relates to the field of quality control for system-driven welding processes. More particularly, the present disclosure relates to an Artificial Intelligence (AI) based system for detecting and correcting defects in welding in real-time.
[003] BACKGROUND
[004] The following description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[005] The quality of welding plays a vital role in metal fabrication industries. The quality of the welding may be controlled in many different ways. One such way is to identify defects by visual inspection and, once the defects are identified, send the defective workpiece back for rectification. However, such manual visual inspection is a time-consuming, error-prone, and inefficient process.
[006] Another way to control the quality of the welding is to apply testing techniques to the welded workpiece. Such testing may be applied using various testing instruments. However, such testing is performed after the workpiece is welded, which increases the overall manufacturing time. The manufacturing time also depends on the availability of the testing instrument and may increase when the instrument is unavailable. Further, such quality control mechanisms require a different apparatus for each of performing the welding, detecting the defects, and rectifying the identified defects. Thus, these mechanisms are time consuming, cost prohibitive, and require human intervention to a large extent.
[007] Thus, there exists a need in the art for an intelligent solution to automatically detect and correct defects both while performing the welding and in the resultant welded workpiece.
[008] SUMMARY
[009] The present disclosure overcomes one or more shortcomings of the prior art and provides additional advantages. Embodiments and aspects of the disclosure described in detail herein are considered a part of the claimed disclosure.
[0010] A system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions. One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions.
[0011] In one non-limiting embodiment of the present disclosure, an AI based method for detecting and correcting welding defects in real time is disclosed. The method discloses generating a 3D model of a workpiece subjected to welding, from a plurality of images received from a first image capturing unit. The method further discloses guiding a first robotic arm to perform specific welding on the workpiece based on a plurality of welding parameters extracted from analysis of the 3D model of the workpiece using a trained AI model. The method further discloses monitoring the welding in real-time and detecting at least one defect in the welding using the trained AI model. The method further discloses discontinuing the welding in response to detecting the at least one defect in the welding and initiating a welding correction process to correct the at least one defect identified in the workpiece. The method further discloses continuing the welding upon completion of the welding correction process, and the real-time monitoring, until the welding of the workpiece is completed. The method furthermore discloses guiding a second robotic arm to perform non-destructive testing of the welded workpiece to identify, using the trained AI model, one or more remaining defects in the welded workpiece. Lastly, the method discloses correcting the one or more remaining defects in the welded workpiece.
[0012] In another non-limiting embodiment of the present disclosure, for determining the plurality of welding parameters, the method further comprises determining one or more of welding process, welding type, welding current, welding temperature, arc voltage, welding speed, electrode feed speed, electrode extension, electrode diameter, electrode orientation, electrode polarity, weld size, throat thickness, leg length, and shielding gas composition. The AI model is trained to determine the welding type, welding current, and welding temperature in real-time, using a supervised machine learning technique, based on a large dataset of welding processes, welding types, welding currents, welding temperatures, arc voltages, welding speeds, electrode feed speeds, electrode extensions, electrode diameters, electrode orientations, electrode polarities, weld sizes, throat thicknesses, leg lengths, and shielding gas compositions.
[0013] In yet another non-limiting embodiment of the present disclosure for monitoring the welding in the real-time and detecting the at least one defect in the welding the method discloses capturing a video stream of the welding and processing the captured video stream in the real-time by the trained Al model to detect at least one defect in the welding. The Al model is trained to determine at least one defect in the welding, using supervised machine learning technique, based on a large set of images having plurality of defects in the welding.
[0014] In yet another non-limiting embodiment of the present disclosure, the welding correction process includes determining location(s) of the at least one defect in the welding by analysing the captured video stream using the trained AI model. The welding correction process further includes selectively guiding the first robotic arm to perform re-welding at the determined location(s) to remove the at least one defect, and selectively training the AI model based on the detected at least one defect.
[0015] In yet another non-limiting embodiment of the present disclosure, for identifying the one or more remaining defects in the welded workpiece, the method further discloses capturing one or more images of the welded workpiece while performing the non-destructive testing using a second image capturing unit. The method further discloses detecting the one or more remaining defects by processing the one or more images of the workpiece using the trained AI model. Specifically, the AI model is trained for determining the one or more remaining defects in the welded workpiece, using a supervised machine learning technique, based on a large set of images of a plurality of defects in welded workpieces.
[0016] In yet another non-limiting embodiment of the present disclosure, for correcting the one or more remaining defects in the welded workpiece, the method further discloses determining location(s) of the one or more remaining defects in the welded workpiece using the trained AI model. The method further discloses selectively guiding the first robotic arm to perform re-welding at the determined location(s) to remove the one or more remaining defects, and selectively training the AI model based on the one or more remaining defects.
[0017] In one non-limiting embodiment of the present disclosure, an Artificial Intelligence (AI) based robotic system to detect and correct welding defects in real-time is disclosed. The system comprises a first robotic arm having a first image capturing unit, configured to capture a plurality of images of a workpiece subjected to welding, and a welding unit mounted thereon, and a second robotic arm having a testing unit and a second image capturing unit mounted thereon. The system further comprises a system control unit comprising a trained AI model and operatively coupled to the first robotic arm and the second robotic arm. The system control unit is configured to generate a 3D model of the workpiece, using the trained AI model, from the plurality of images received from the first image capturing unit. The system control unit is further configured to guide the first robotic arm to perform specific welding on the workpiece based on a plurality of welding parameters extracted from analysis of the 3D model of the workpiece using the trained AI model, and to monitor the welding in real-time using the first image capturing unit and detect at least one defect in the welding using the trained AI model. In response to detecting the at least one defect in the welding, the system control unit is configured to discontinue the welding and initiate a welding correction process to correct the at least one defect identified in the workpiece. The system control unit is further configured to continue the welding upon completion of the welding correction process, and the real-time monitoring, until the welding of the workpiece is completed, and to guide the second robotic arm to perform non-destructive testing of the welded workpiece to identify, using the trained AI model, one or more remaining defects in the welded workpiece. The system control unit is further configured to correct the one or more remaining defects in the welded workpiece.
[0018] In another non-limiting embodiment of the present disclosure, the plurality of welding parameters include one or more of welding process, welding types, welding current, welding temperature, arc voltage, welding speed, electrode feed speed, electrode extension, electrode diameter, electrode orientation, electrode polarity, weld size, throat thickness, leg length, and shielding gas composition. The Al model is trained to determine the welding type, welding current, and welding temperature in the real-time, using supervised machine learning technique, based on a large dataset of welding process, welding types, welding current, welding temperature, arc voltage, welding speed, electrode feed speed, electrode extension, electrode diameter, electrode orientation, electrode polarity, weld size, throat thickness, leg length, and shielding gas composition.
[0019] In yet another non-limiting embodiment of the present disclosure, to monitor the welding in the real-time and detect the at least one defect in the welding, the system control unit is configured to capture a video stream of the welding using the first image capturing unit and process the captured video stream in the real-time, using the trained Al model, to detect the at least one defect in the welding. The Al model is trained to determine the at least one defect in the welding, using supervised machine learning technique, based on a large set of images having plurality of defects in the welding.
[0020] In yet another non-limiting embodiment of the present disclosure, to initiate the welding correction process to correct the at least one defect, the system control unit is configured to determine location(s) of the at least one defect in the welding by analysing the captured video stream using the trained Al model and guide the first robotic arm to perform rewelding at the determined location(s) to remove the at least one defect. The system control unit is further configured to train the Al model based on the detected at least one defect to avoid such defect.
[0021] In yet another non-limiting embodiment of the present disclosure, to identify the one or more remaining defects in the welded workpiece, the system control unit is configured to capture one or more images of the welded workpiece while performing the non-destructive testing using the second image capturing unit. The system control unit is further configured to detect the one or more remaining defects by processing the one or more images of the workpiece using the trained AI model. The AI model is trained to determine the one or more remaining defects in the welded workpiece, using a supervised machine learning technique, based on a large set of images of a plurality of defects in welded workpieces.
[0022] In yet another non-limiting embodiment of the present disclosure, to correct the one or more remaining defects in the welded workpiece, the system control unit is configured to determine location(s) of the one or more remaining defects by processing the captured one or more images of the welded workpiece using the trained AI model. The system control unit is further configured to guide the first robotic arm to perform re-welding at the determined location(s) to remove the one or more remaining defects. Additionally, the system control unit is further configured to train the AI model based on the one or more remaining defects.
[0023] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
[0024] BRIEF DESCRIPTION OF DRAWINGS
[0025] The features, nature, and advantages of the present disclosure will become more apparent from the detailed description set forth below when taken in conjunction with the drawings in which like reference characters identify correspondingly throughout. Some embodiments of system and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying Figs., in which:
[0026] Fig. 1 shows an exemplary environment of the AI based robotic system to detect and correct the welding defects in real-time, in accordance with an embodiment of the present disclosure.
[0027] Fig. 2 shows a block diagram of the AI based robotic system to detect and correct the welding defects in real-time, in accordance with an embodiment of the present disclosure.
[0028] Fig. 3 shows an exemplary setup for training the AI model to determine welding parameters, in accordance with an embodiment of the present disclosure.
[0029] Fig. 4 shows an exemplary setup to train the AI model for detecting various defects in welding, in accordance with an embodiment of the present disclosure.
[0030] Fig. 5 shows a flowchart of an exemplary AI based method for detecting and correcting welding defects in real-time, in accordance with an embodiment of the present disclosure.
[0031] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in a computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
[0032] DETAILED DESCRIPTION
[0033] The foregoing has broadly outlined the features and technical advantages of the present disclosure in order that the detailed description of the disclosure that follows may be better understood. It should be appreciated by those skilled in the art that the conception and specific embodiment disclosed may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present disclosure.
[0034] The novel features which are believed to be characteristic of the disclosure, both as to its organization and method of operation, together with further objects and advantages will be better understood from the following description when considered in connection with the accompanying figures. It is to be expressly understood, however, that each of the figures is provided for the purpose of illustration and description only and is not intended as a definition of the limits of the present disclosure.
[0035] Various embodiments of the present invention now will be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the invention are shown. Indeed, the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will satisfy applicable requirements. The term “or” is used herein in both the alternative and conjunctive sense, unless otherwise indicated. The terms “illustrative,” “example,” and “exemplary” are used to be examples with no indication of quality level. Like numbers refer to like elements throughout.
[0036] The phrases “in an embodiment,” “in one embodiment,” “according to one embodiment,” and the like generally mean that the particular feature, structure, or characteristic following the phrase may be included in at least one embodiment of the present disclosure and may be included in more than one embodiment of the present disclosure (importantly, such phrases do not necessarily refer to the same embodiment).
[0037] The word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any implementation described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other implementations.
[0038] If the specification states a component or feature “can,” “may,” “could,” “should,” “would,” “preferably,” “possibly,” “typically,” “optionally,” “for example,” “often,” or “might” (or other such language) be included or have a characteristic, that particular component or feature is not required to be included or to have the characteristic. Such component or feature may be optionally included in some embodiments, or it may be excluded.
[0039] There exist various techniques and apparatus to perform welding. However, such techniques and/or apparatus need manual intervention to perform the welding. Although recent developments in the industry have provided some automated tools to perform the welding, these tools still require manual intervention and monitoring. Further, the welding tools are not able to detect if any defects arise in the welding. Thus, different testing tools are required to detect the defects in the welding. However, these testing tools examine the welding only once the welding of the entire workpiece is completed. Also, these tools are incapable of detecting defects in the welding in real-time, i.e., while performing the welding. Since the welding tools and the testing tools are different, and the testing is not performed in real-time, the overall welding and testing process becomes a time-consuming and inefficient approach.
[0040] The present disclosure relates to an Artificial Intelligence (AI) based method and robotic system to detect and correct welding defects in real-time. The AI based robotic system and method are able to detect welding defects while performing the welding and correct the detected defects in real-time; thus, the overall manufacturing time is reduced, and the need for a separate device to detect the defects in the welding is also eliminated.
[0041] Further, the disclosed AI based robotic system and method are capable of predicting various welding parameters, specific to the workpiece subjected to the welding, to perform the welding and thereby automate the entire welding process. Furthermore, the system and method may provide a second layer of quality control that includes applying a non-destructive testing technique on the welded workpiece to detect whether there are any remaining defects in the welded workpiece. Upon detection of the remaining defects in the welded workpiece, a defect correction process may be initiated to rectify the remaining defects.
[0042] Figure 1 shows an exemplary environment 100 for an artificial intelligence based robotic system 102 to detect and correct defects in welding in real-time, in accordance with an embodiment of the present disclosure. The AI based robotic system 102 may comprise at least a system control unit 104, a first robotic arm 106, and a second robotic arm 108, but not limited thereto. The system control unit 104 may be operatively coupled to the first robotic arm 106 and the second robotic arm 108. The system control unit 104 may be configured to control and guide the first robotic arm 106 and the second robotic arm 108 to perform various operations. Although the system control unit 104 is shown as a separate entity outside the robotic system 102, a skilled person would appreciate that the system control unit 104 may be an integral part of the robotic system 102. In an exemplary embodiment, the system control unit 104 may be a teach pendant.
[0043] Moving on, as shown in figure 1, the robotic system 102 may be communicatively coupled with a user device 110 and a power supply 112. The user device 110 may be a computer, desktop, laptop, mobile device, user terminal, tablet, display, etc., which can be used to monitor the welding and to provide one or more inputs to control various welding operations. The power supply 112 may provide operating power to the robotic system 102.
[0044] The robotic system 102 is configured to perform the welding of the workpiece and simultaneously monitor the welding to identify any defect(s) in the welding. Upon detection of any defect(s) in the welding, the robotic system 102 may temporarily discontinue the welding and may start rectification of the detected defect(s). After rectifying the defect(s), the system may continue performing the welding and monitoring the welding to detect further defect(s), if any. Additionally, upon completion of the welding, the system 102 may be further configured to apply non-destructive testing (NDT) techniques to detect whether any defect(s) remains in the welded workpiece. If any remaining defect(s) is identified in the welded workpiece, the robotic system 102 may rectify the remaining defect(s). The structure and working of the robotic system 102, to achieve the desired objectives of the present disclosure, is explained in detail in the forthcoming paragraphs, in conjunction with figures 1 and 2.
[0045] Referring to figure 2, an exemplary Al based robotic system 200 by way of block diagram is disclosed for detecting and correcting defects in welding in accordance with an embodiment of the present disclosure. The robotic system 200 may comprise a system control unit 202, a first robotic arm 204, a second robotic arm 206, but not limited thereto. The system control unit 202 may be operatively coupled with the first robotic arm 204 and the second robotic arm 206 to control and guide the first robotic arm 204 and the second robotic arm 206 to perform various desired operations according to various embodiments of the present disclosure.
[0046] According to an embodiment, the system control unit 202 may comprise various entities such as a trained Artificial Intelligence (AI) model 208, a processor 210, a memory 212, and an input/output interface 214, but not limited thereto. All these entities remain operatively and communicatively coupled with each other. The trained AI model 208 plays an important role in the robotic system 200. In a specific embodiment, the AI model 208 of the system 200 is pre-trained to detect various defects in the welding in real-time and any defect(s) remaining in the welded workpiece when the welding is completed.
[0047] In another specific embodiment, the AI model 208 is also pre-trained to predict the type of welding required for the subjected workpiece (not shown) based on analysis of various parameters obtained from the analysis of a multidimensional model of the workpiece subjected to welding. In an embodiment, the multidimensional model may be a 2D or 3D model, but not limited thereto. Those skilled in the art will appreciate that the various parameters may comprise one or more of: welding type, welding current, welding temperature, arc voltage, welding speed, electrode feed speed, electrode extension, electrode diameter, electrode orientation, electrode polarity, weld size, throat thickness, leg length, and shielding gas composition, but not limited thereto. Further, a skilled person would appreciate the fact that the mentioned list of parameters is not exhaustive, and there may be various other parameters which are not listed here. The scope of the present disclosure should not be considered to be limited only to the listed parameters. The description of figure 3 in the forthcoming paragraphs explains in detail the training of the AI model to predict the type of welding required for the subjected workpiece.
[0048] In figure 3, at block 302, a large dataset of images of different types of workpieces having different welding requirements is provided for training the AI model 308. The AI model 308 may be trained by iteratively processing the plurality of 3D models with various welding parameters using a Graphics Processing Unit (GPU) 304 that uses one or more known machine learning algorithms to generate a trained AI model 308. For example, supervised ML techniques may be used for training the AI model 308 by iteratively processing the plurality of annotated images. For training the model, the plurality of annotated 3D models may be divided into training 3D models and testing 3D models. In an exemplary aspect, the training 3D models may comprise 70-90% of the annotated 3D models and the testing 3D models may comprise 10-30% of the 3D models, such that no 3D model is part of both the training and the testing 3D models.
[0049] In an embodiment, for training the AI model, one or more welding parameters of the training 3D models, such as welding process, welding type, welding current, welding temperature, arc voltage, welding speed, electrode feed speed, electrode extension, electrode diameter, electrode orientation, electrode polarity, weld size, throat thickness, leg length, and shielding gas composition, may be used. The number of training iterations to train the model may be based on a learning score indicative of valid detection of welding parameters. The learning score is indicative of a learning loss encountered when training the model. If it is identified during training that the model is subject to low learning loss (i.e., the learning score is high), then such a model may be deployed in testing. In some examples, the model may be trained iteratively until the learning loss is less than a threshold value indicative of a high learning score. Thus, the model may be trained iteratively until the learning loss is less than the threshold value. The trained model 308 may be stored in a memory. In real-time, the generated 3D models 306 may be provided to the trained model 308 to detect the welding parameters. Further, the detected parameters 310 may be used to further train the model 308, as shown in figure 3.
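As an illustration of the training procedure described for figure 3, the sketch below divides an annotated dataset into training and testing subsets and iterates until the learning loss falls below a threshold value. This is a minimal sketch only: the 80/20 split, the train_one_epoch and evaluate_loss helpers, and the dictionary used as model state are assumptions made for illustration, not elements of the disclosure.

```python
import random

def train_until_converged(annotated_models, loss_threshold=0.05,
                          max_iterations=100, train_fraction=0.8):
    """Sketch of the iterative training described for the AI model (Fig. 3).

    `annotated_models` is assumed to be a list of (3D model, welding parameters)
    pairs; the helpers below are hypothetical stand-ins for a real ML framework.
    """
    random.shuffle(annotated_models)
    split = int(len(annotated_models) * train_fraction)        # e.g. 70-90% for training
    training_set, testing_set = annotated_models[:split], annotated_models[split:]

    model_state = {}                                           # placeholder for learned weights
    for _ in range(max_iterations):
        train_one_epoch(model_state, training_set)             # hypothetical GPU-backed update
        loss = evaluate_loss(model_state, testing_set)         # learning loss on held-out models
        if loss < loss_threshold:                              # high learning score -> deploy
            break
    return model_state

# Hypothetical helpers, defined only so the sketch runs on its own.
def train_one_epoch(model_state, training_set):
    model_state["epochs"] = model_state.get("epochs", 0) + 1

def evaluate_loss(model_state, testing_set):
    return 1.0 / (1 + model_state.get("epochs", 0))            # loss shrinks each iteration
```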
[0050] Coming back to figure 2, the first robotic arm 204 may have a first image capturing unit 204a and a welding unit 204b mounted thereon, but not limited thereto. The first image capturing unit 204a may be any device which is capable of capturing high quality images and/or video stream of the workpiece not only prior to welding process but also during the welding process. In an embodiment, the first image capturing unit 204a may be configured to provide a plurality of images and/or a video stream of the entire workpiece subjected to welding in order to generate the multidimensional model of the workpiece. The first image capturing unit 204a is mounted on the first robotic arm 204 in such a manner that it may take images and/or video of the workpiece from 360°, which is helpful in generating the 3D model of the workpiece. The first image capturing unit 204a may also be used to continuously monitor the welding in real-time to detect any defect in the welding. Further, the welding unit 204b may be used to perform the welding as guided by the system control unit 202 according to the generated multidimensional model.
[0051] In another embodiment, the second robotic arm 206 may have a second image capturing unit 206a and a testing unit 206b mounted thereon, but not limited thereto. The second image capturing unit 206a may be any suitable device capable of capturing images and/or a video stream of the welded workpiece. The second image capturing unit 206a is mounted on the second robotic arm 206 in such a manner that it may take images and/or video of the workpiece from 360°, which is helpful in capturing a plurality of images of the welded workpiece while applying any weld testing technique to detect any remaining welding defects in the workpiece. The testing unit 206b may be used to apply any suitable testing technique to detect the remaining defects. In one embodiment, the testing unit 206b may apply a non-destructive testing (NDT) technique to detect whether any defect remains in the welded workpiece.
[0052] Referring again to figure 2, the user may initiate a welding process to weld a workpiece by providing one or more inputs via the user device 110 to the robotic system 102, 200. When the user initiates the welding process to weld the workpiece, the system control unit 202 may trigger a first signal to the first image capturing unit 204a to capture a plurality of images, from different angles, of the workpiece subjected to the welding. In response, the first image capturing unit 204a may capture the plurality of images of the workpiece and send the captured images to the system control unit 202. The system control unit 202 may receive and store the captured images in the memory 212. Further, the system control unit 202 may generate the multidimensional model such as 2D or 3D model of the workpiece by processing the received plurality of images using the trained Al model 208. The system control unit 202 may utilize any suitable image processing technique to generate the multidimensional model of the workpiece.
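To make the image-capture and model-generation step concrete, a minimal control-flow sketch is given below. The camera and control_unit objects, their method names, and the 30° angle step are illustrative assumptions; the disclosure leaves the reconstruction technique open ("any suitable image processing technique").

```python
def build_workpiece_model(camera, control_unit, angle_step_deg=30):
    """Sketch: capture images around the workpiece and build its 3D model.

    `camera` stands in for the first image capturing unit 204a and
    `control_unit` for the system control unit 202; both interfaces are
    hypothetical.
    """
    images = []
    for angle in range(0, 360, angle_step_deg):       # 360-degree coverage of the workpiece
        images.append(camera.capture_image(angle))    # assumed camera API
    control_unit.store_images(images)                 # e.g. persisted in memory 212
    # The reconstruction itself may use any suitable image processing
    # technique; it is represented here by a single hypothetical call.
    return control_unit.reconstruct_3d_model(images)
```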
[0053] After generating the multidimensional model, such as the 3D model of the workpiece, the system control unit 202 may be configured to analyse the multidimensional model using the trained AI model 208 to extract a plurality of welding parameters that may guide the first robotic arm 204 on the welding process to be followed. In an embodiment, the plurality of welding parameters may comprise one or more of: welding process, welding type, welding current, welding temperature, arc voltage, welding speed, electrode feed speed, electrode extension, electrode diameter, electrode orientation, electrode polarity, and shielding gas composition, but not limited thereto. The AI model 208 may be trained to determine the plurality of welding parameters based on the 3D model of the workpiece in real-time, using a suitable machine learning technique such as a supervised machine learning technique, as discussed in the description of figure 3 in the foregoing paragraphs. The AI model 208 may be trained based on a large dataset of welding processes, welding types, welding currents, welding temperatures, arc voltages, welding speeds, electrode feed speeds, electrode extensions, electrode diameters, electrode orientations, electrode polarities, and shielding gas compositions for various 3D models of workpieces. Thus, when the 3D model of the workpiece is generated, the trained AI model 208 may process the 3D model and extract the plurality of welding parameters.
[0054] After extracting the plurality of welding parameters, the system control unit 202 may guide the first robotic arm 204 to perform specific welding on the workpiece based on the plurality of welding parameters. Particularly, the system control unit 202 may control and guide the welding unit 204b mounted on the first robotic arm 204 to perform the specific welding. While performing the welding, the first image capturing unit 204a may monitor the welding in real-time. Thus, the system control unit 202 may control the first image capturing unit 204a to monitor the welding to detect if any defect(s) arises in the welding. The system control unit 202 may guide the welding unit 204b and the first image capturing unit 204a to simultaneously perform and monitor the welding. Such monitoring of the welding in real-time enables the detection of defects as soon as they occur in the welding. Since the defects may be detected in real-time, the overall manufacturing time is reduced compared to scenarios where defects are detected only after completion of the welding process.
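The parameter-extraction step can be pictured as the trained model mapping the 3D model to a structured set of welding parameters. The sketch below groups the parameters named in the disclosure into a record; the field names, units, and the predict_parameters inference call are assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class WeldingParameters:
    """Welding parameters the trained AI model is described as extracting."""
    welding_process: str
    welding_type: str
    welding_current_a: float
    welding_temperature_c: float
    arc_voltage_v: float
    welding_speed_mm_s: float
    electrode_feed_speed_mm_s: float
    electrode_extension_mm: float
    electrode_diameter_mm: float
    electrode_orientation_deg: float
    electrode_polarity: str
    shielding_gas_composition: str

def extract_welding_parameters(trained_model, workpiece_3d_model) -> WeldingParameters:
    """Sketch: run the trained model on the generated 3D model (hypothetical API)."""
    raw = trained_model.predict_parameters(workpiece_3d_model)   # assumed inference call
    return WeldingParameters(**raw)                              # raw is assumed to be a dict
```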
[0055] In order to detect the at least one defect in the welding, the system control unit 202 may be configured to send a second signal to the first image capturing unit 204a to capture a video stream of the welding, immediately after the welding process has been initiated. In response to receiving the second signal, the first image capturing unit 204a may start capturing the images/video of the welding process and simultaneously share the images/video, in real time, with the system control unit 202. Once the captured stream is received by the system control unit 202, the system control unit 202 may process the captured video stream in real-time using the trained AI model 208 to detect whether any defect exists in the welding. The AI model 208 may be trained to determine whether any defect(s) exists in the welding using a suitable machine learning technique such as a supervised machine learning technique, but not limited thereto. A large set of images having a plurality of defects in the welding may be used to train the AI model 208. The training of the AI model to identify defects in the welded workpiece is shown in figure 4 in more detail, in the below paragraphs.
[0056] As shown in figure 4, at block 402, a large dataset of images of different types of welding defects is provided for training the AI model 408. The AI model 408 may be trained by iteratively processing the plurality of annotated images of defects using a Graphics Processing Unit (GPU) 304 that uses one or more known machine learning algorithms to generate a trained AI model 408. For example, supervised AI and ML techniques may be used for training the AI model by iteratively processing the plurality of annotated images. For training the model, the plurality of annotated images may be divided into training images and testing images. In an exemplary aspect, the training images may comprise 70-90% of the annotated images and the testing images may comprise 10-30% of the annotated images, such that no image is part of both the training and the testing images.
[0057] In an embodiment, for training the AI model, one or more defects of the training images, such as weld crack, porosity, undercut, incomplete fusion, incomplete penetration, slag inclusion, and spatter, but not limited thereto, may be used. The number of training iterations to train the model may be based on a learning score indicative of valid detection of defects. The learning score is indicative of a learning loss encountered when training the model. If it is identified during training that the model is subject to low learning loss (i.e., the learning score is high), then such a model may be deployed in testing. In some examples, the model may be trained iteratively until the learning loss is less than a threshold value indicative of a high learning score. Thus, the model may be trained iteratively until the learning loss is less than the threshold value. The trained model 408 may be stored in a memory. In real-time, the captured images/video 406 may be provided to the model 408 to detect one or more defects. Further, the detected defects 410 may be used to further train the model 408, as shown in the figure.
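The defect classes listed above can be treated as a small label taxonomy, with each annotated training image carrying zero or more labels. The sketch below captures that structure together with the train/test split; the record layout and split helper are assumptions, and only the defect names themselves come from the disclosure.

```python
from dataclasses import dataclass
from enum import Enum
from typing import List

class WeldDefect(Enum):
    CRACK = "weld crack"
    POROSITY = "porosity"
    UNDERCUT = "undercut"
    INCOMPLETE_FUSION = "incomplete fusion"
    INCOMPLETE_PENETRATION = "incomplete penetration"
    SLAG_INCLUSION = "slag inclusion"
    SPATTER = "spatter"

@dataclass
class AnnotatedWeldImage:
    """One annotated training image for the defect-detection model (Fig. 4)."""
    image_path: str
    defects: List[WeldDefect]                  # an empty list means a defect-free weld

def split_dataset(images: List[AnnotatedWeldImage], train_fraction: float = 0.8):
    """Split annotated images so that no image appears in both subsets."""
    cut = int(len(images) * train_fraction)    # 70-90% training, remainder testing
    return images[:cut], images[cut:]
```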
[0058] When the AI model 208 detects that at least one defect exists in the welding, the system control unit 202 may discontinue the welding and may initiate a welding correction process to correct the at least one defect identified in the workpiece. In order to correct the identified defect, the system control unit 202 may determine the location(s) of the identified defect in the welding by analysing the captured video stream using the trained AI model 208. In response to detecting the location(s), the system control unit 202 may guide the first robotic arm 204 to remove the at least one defect. Particularly, the system control unit 202 may control the welding unit 204b to remove/correct the at least one defect. In order to remove the defect, the system 200 may perform re-welding at the location of the identified defect.
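A minimal sketch of this correction flow is given below: welding is paused, the defect location(s) are estimated from the live frame, and the first robotic arm is guided to re-weld each location before welding resumes. All object interfaces and method names are hypothetical stand-ins for the units described above, not an API defined by the disclosure.

```python
def handle_detected_defect(control_unit, trained_model, first_arm, video_frame):
    """Sketch of the welding correction process of paragraph [0058]."""
    control_unit.discontinue_welding()                       # pause the welding immediately
    locations = trained_model.locate_defects(video_frame)    # assumed defect-localisation call
    for location in locations:
        first_arm.move_to(location)                          # guide welding unit 204b to the spot
        first_arm.perform_rewelding(location)                # remove the detected defect
    control_unit.continue_welding()                          # resume once corrected
```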
[0059] According to an embodiment, the system control unit 202 may identify possible reasons for the identified one or more defects by analysing the identified defect(s). In another embodiment, the system control unit 202 may send image(s) of the identified defect(s) to a user device (not shown in figure 2) to display the identified defect(s) to the user. The user may further analyse the defect(s) and understand possible reasons for occurrence of the defect(s). The user may provide one or more inputs to the system control unit 202 via the I/O unit, which may indicate the one or more reasons for the occurrence of the defect(s).
[0060] Based on the determined reasons, the system control unit 202 may decide whether the AI model 208 has to be further trained to avoid such defect(s) in successive welding operations. There may be scenarios where the defects are due to some other factors which are not controlled by the robotic system 200, and in such a scenario the user may take appropriate action rather than training the AI model 208. In cases where the AI model is trained based on detected defects, such further training of the AI model may be performed periodically based on a time period, a number of welding operations, or a number of detected defects, etc., but not limited thereto. For example, the AI model 208 may be trained based on the one or more detected defects in real-time or on an hourly, daily, weekly, or monthly basis.
[0061] In another example, the AI model 208 may be further trained based on the one or more detected defects after completion of a number of welding operations. The information related to the defects detected during these operations may be stored in the memory and may be accessed to perform the training of the AI model 208. In yet another example, the system control unit 202 may store a plurality of detected defects, and when the number of stored defects reaches a predefined number, the AI model 208 may be trained based on the detected defects. In this manner, training the pre-trained AI model 208 based on the defects detected in real-time may increase the accuracy of the AI model 208, thereby improving the accuracy of the robotic system 200 and also reducing the manufacturing time significantly.
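The retraining policy sketched in the two paragraphs above (retrain in real-time, on a schedule, after a number of welding operations, or once stored defects reach a predefined number) can be expressed as a simple trigger. The class layout and threshold values below are illustrative assumptions.

```python
import time

class RetrainingTrigger:
    """Sketch of a periodic retraining policy for the pre-trained AI model."""

    def __init__(self, max_defects=50, max_operations=100, max_age_s=7 * 24 * 3600):
        self.stored_defects = []              # defects logged since the last training run
        self.operations_done = 0              # completed welding operations since then
        self.last_trained = time.time()
        self.max_defects = max_defects        # predefined number of stored defects
        self.max_operations = max_operations  # number of welding operations
        self.max_age_s = max_age_s            # time period, e.g. weekly retraining

    def log_defect(self, defect):
        self.stored_defects.append(defect)

    def log_operation(self):
        self.operations_done += 1

    def should_retrain(self):
        return (len(self.stored_defects) >= self.max_defects
                or self.operations_done >= self.max_operations
                or time.time() - self.last_trained >= self.max_age_s)

    def reset(self):
        """Called after the AI model has been further trained."""
        self.stored_defects.clear()
        self.operations_done = 0
        self.last_trained = time.time()
```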
[0062] Referring again to figure 2, after correcting the defect, the system control unit 202 may continue welding the workpiece. The system control unit 202 may keep monitoring the welding in real-time to keep track of any defect which may further arise in the welding. If any further defect arises in the welding, the system control unit 202 may immediately discontinue the welding and start the defect correction process. This process is repeated until the welding is completed.
[0063] When the welding is completed, the system control unit 202 may be further configured to control and guide the second robotic arm to perform a second layer of testing of the welded workpiece. The system control unit 202 may control the testing unit 206b to perform the testing of the welded workpiece. In an embodiment, the testing unit 206b may be configured to apply a non-destructive testing (NDT) technique to test the welded workpiece to detect whether any defect remains in the welded workpiece. In an embodiment, to apply the NDT technique, the system control unit 202 may be configured to control the second image capturing unit 206a to capture one or more images of the welded workpiece during the non-destructive testing. The system control unit 202 may process the one or more images using the pre-trained AI model to detect one or more remaining defects. In an exemplary embodiment, the AI model 208 may be trained to determine the one or more remaining defects in the welded workpiece. In an embodiment, the AI model 208 may be trained using a machine learning technique, such as a supervised machine learning technique, based on a large set of images of a plurality of defects in welded workpieces.
[0064] In order to detect the one or more remaining defects, the one or more captured images may be processed by the system control unit 202 using the trained Al model 208 to determine location(s) of the one or more remaining defects. Upon detecting the location(s) of the one or more remaining defects, the system control unit 202 may guide the welding unit 204b mounted on the first robotic arm 204 to perform re-welding at the determined location(s) to remove the one or more remaining defects.
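The second layer of quality control can be summarised as a single pass: the second robotic arm performs non-destructive testing while its camera captures images, the trained model locates any remaining defects, and the first arm is guided to re-weld them. The sketch below assumes hypothetical interfaces for the arms and the model.

```python
def ndt_quality_pass(trained_model, first_arm, second_arm):
    """Sketch of the post-weld NDT pass of paragraphs [0063]-[0064].

    `second_arm` carries the testing unit 206b and the second image capturing
    unit 206a; all method names are illustrative assumptions.
    """
    images = second_arm.perform_ndt_and_capture_images()        # non-destructive testing
    remaining = trained_model.locate_defects_in_images(images)  # assumed detection call
    for location in remaining:
        first_arm.move_to(location)
        first_arm.perform_rewelding(location)                   # correct each remaining defect
    return len(remaining)                                       # number of defects corrected
```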
[0065] According to an embodiment, the system control unit 202 may identify possible reasons for the one or more remaining defects by analysing the identified one or more remaining defects. In another embodiment, the system control unit 202 may send one or more images of the one or more remaining defects to the user device (not shown in figure 2) to display the remaining defects to the user. In one embodiment, the user may further analyse the remaining defects and understand possible reasons for occurrence of the remaining defects. The user may provide one or more inputs to the system control unit 202 which may indicate the one or more reasons for the remaining defects. Based on the determined reasons, the system control unit 202 may decide whether the AI model 208 has to be further trained to avoid such defect(s) in successive welding operations. There may be scenarios where the defects are due to some factors which are not controlled by the robotic system 200, and in such a scenario the user may take appropriate action rather than training the AI model 208. In cases where the AI model 208 is trained based on the remaining defects, such further training of the AI model 208 may be performed periodically based on a time period, a number of welding operations, or a number of detected defects, etc., but not limited thereto, as explained in previous embodiments. This second level of quality check ensures that the welded workpiece is defect free and enhances the accuracy of the overall welding operation.
[0066] In this manner, the robotic system 200 may perform welding in an automated manner. The robotic system 200 may also detect and correct one or more defects in real-time. The robotic system 200 may detect and correct the defects while performing the welding in real-time and even after completion of the welding. The robotic system 200 may apply the NDT technique to ensure the final welded workpiece does not comprise any kind of defect. Thus, the robotic system 200 may reduce the manufacturing time and improve yield and efficiency.
[0067] Fig. 5 illustrates a flow chart of a method 500 for detecting and correcting welding defects in real time according to an embodiment of the present disclosure. The method 500 may also be described in the general context of computer executable instructions. Generally, computer executable instructions may include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform specific functions or implement specific abstract data types.
[0068] The order in which the method 500 is described is not intended to be construed as a limitation, and any number of the described method blocks may be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the spirit and scope of the subject matter described.
[0069] The method 500 may start when a user initiates a welding process to weld a workpiece. The user may initiate the welding process to weld the workpiece by providing one or more inputs via the user device 110 to the robotic system 102, 200. At step 502, the method 500 may comprise generating a 3D model of a workpiece subjected to welding, from a plurality of images received from a first image capturing unit. In order to generate the 3D model, the first image capturing unit 204a may capture a plurality of images of the workpiece subjected to the welding and may send the captured images to the system control unit 202. The system control unit 202 may receive and store the captured images in the memory 212. Further, the 3D model of the workpiece may be generated by processing the received plurality of images using the trained AI model 208. Any suitable image processing technique may be used to generate the 3D model of the workpiece.
[0070] After generating the 3D model of the workpiece, the 3D model may be analysed using the trained AI model 208 to extract a plurality of welding parameters, to perform the welding, that may guide the first robotic arm 204 on the welding process to be followed. In an embodiment, the plurality of welding parameters may comprise one or more of: welding process, welding type, welding current, welding temperature, arc voltage, welding speed, electrode feed speed, electrode extension, electrode diameter, electrode orientation, electrode polarity, and shielding gas composition, but not limited thereto. The AI model 208 may be trained to determine the plurality of welding parameters based on the 3D model of the workpiece in real-time, using a suitable machine learning technique such as a supervised machine learning technique. The AI model 208 may be trained based on a large dataset of welding processes, welding types, welding currents, welding temperatures, arc voltages, welding speeds, electrode feed speeds, electrode extensions, electrode diameters, electrode orientations, electrode polarities, and shielding gas compositions for various 3D models of workpieces. Thus, when the 3D model of the workpiece is generated, the trained AI model 208 may process the 3D model and extract the plurality of welding parameters.
[0071] At step 504, the method 500 may comprise guiding the first robotic arm 204 to perform specific welding on the workpiece based on a plurality of welding parameters extracted from analysis of the 3D model of the workpiece using the trained AI model 208. Particularly, the welding unit 204b mounted on the first robotic arm 204 may be guided and controlled to perform the specific welding.
[0072] At step 506, the method 500 may comprise monitoring the welding in real-time and detecting at least one defect in the welding using the trained AI model 208. While performing the welding, the first image capturing unit 204a may monitor the welding in real-time. Thus, the system control unit 202 may be configured to send a signal to control the first image capturing unit 204a to monitor the welding, immediately after the welding process has been initiated, to detect if any defect(s) arises in the welding. In response to receiving the signal, the first image capturing unit 204a may start capturing the images/video of the welding process and simultaneously share the images/video, in real time, with the system control unit 202. The system control unit 202 may guide the welding unit 204b and the first image capturing unit 204a to simultaneously perform and monitor the welding. Such monitoring of the welding in real-time enables the detection of defects as soon as they occur in the welding. This reduces the overall manufacturing time.
[0073] At step 508, the method 500 may comprise, in response to detecting the at least one defect in the welding, discontinuing the welding and initiating a welding correction process to correct the at least one defect identified in the workpiece. In order to detect the at least one defect in the welding, a video stream of the welding may be captured by the first image capturing unit 204a. The captured stream may be received by the system control unit 202 for further processing. The captured video stream may be processed in real-time using the trained AI model 208 to detect whether any defect exists in the welding. The AI model 208 may be trained to determine whether any defect(s) exists in the welding using a suitable machine learning technique such as a supervised machine learning technique, but not limited thereto. A large set of images having a plurality of defects in the welding may be used to train the AI model 208.
[0074] When the AI model 208 detects that at least one defect exists in the welding, the system control unit 202 may discontinue the welding and may initiate a welding correction process to correct the at least one defect identified in the workpiece. In order to correct the identified defect, the location(s) of the identified defect in the welding may be determined by analysing the captured video stream using the trained AI model 208. In response to detecting the location(s), the first robotic arm 204 may be guided to remove the at least one defect. Particularly, the system control unit 202 may control the welding unit 204b to remove/correct the at least one defect. In order to remove the defect, the system 200 may perform re-welding at the location of the identified defect.
[0075] According to an embodiment, possible reasons for the identified one or more defects may be determined by analysing the identified defect(s). In another embodiment, the system control unit 202 may send image(s) of the identified defect(s) to a user device to display the identified defect(s) to the user. The user may further analyse the defect(s) and understand possible reasons for occurrence of the defect(s). The user may provide one or more inputs to the system control unit 202 via the I/O unit, which may indicate the one or more reasons for the occurrence of the defect(s).
[0076] Based on the determined reasons, the system control unit 202 may decide whether the AI model 208 has to be further trained to avoid such defect(s) in successive welding operations. There may be scenarios where the defects are due to other factors which are not controlled by the robotic system 200, and in such scenarios the user may take appropriate action rather than training the AI model 208. In cases where the AI model 208 is trained based on detected defects, such further training of the AI model 208 may be performed periodically, based on, but not limited to, a time period, a number of welding operations, or a number of detected defects. For example, the AI model 208 may be trained based on the one or more detected defects in real-time, or on an hourly, daily, weekly, or monthly basis.
[0077] In another example, the AI model 208 may be further trained based on the one or more detected defects after completion of a number of welding operations. The information related to the defects detected during these operations may be stored in the memory and may be accessed to perform the training of the AI model 208. In yet another example, the system control unit 202 may store a plurality of detected defects, and when the number of stored defects reaches a predefined number, the AI model 208 may be trained based on the detected defects. In this manner, training the pre-trained AI model 208 based on the defects detected in real-time may increase the accuracy of the AI model 208, thereby improving the accuracy of the robotic system 200 and significantly reducing the manufacturing time.
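One possible realisation of the retraining policy outlined in paragraphs [0076] and [0077] is sketched below: detected defects are buffered, and retraining is triggered once the buffer reaches a predefined size or once a time interval has elapsed. The thresholds and the retraining callable are illustrative assumptions.

```python
# Illustrative retraining trigger: retrain when enough defects have been
# buffered or when a fixed time interval has passed since the last training.
import time
from typing import Callable, List


class RetrainingScheduler:
    def __init__(self, retrain: Callable[[List[dict]], None],
                 max_buffered_defects: int = 50,
                 max_interval_s: float = 24 * 3600):
        self._retrain = retrain
        self._buffer: List[dict] = []
        self._max_defects = max_buffered_defects
        self._max_interval = max_interval_s
        self._last_trained = time.monotonic()

    def record_defect(self, defect: dict) -> None:
        """Store a detected defect and retrain if a trigger condition is met."""
        self._buffer.append(defect)
        due_by_count = len(self._buffer) >= self._max_defects
        due_by_time = time.monotonic() - self._last_trained >= self._max_interval
        if due_by_count or due_by_time:
            self._retrain(self._buffer)
            self._buffer.clear()
            self._last_trained = time.monotonic()
```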
[0078] At step 510, the method 500 may comprise continuing the welding upon completion of the welding correction process, and the real-time monitoring, until the welding of the workpiece is completed. The system control unit 202 may keep monitoring the welding in real-time to keep track of any defect which may further arise in the welding. If any further defect arises in the welding, the system control unit 202 may immediately discontinue the welding and start the defect correction process. This process is repeated until the welding is completed.
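The weld/monitor/correct cycle of step 510 may be summarised, purely for illustration, by the loop below; the controller interface is hypothetical and merely stands in for the functions of the system control unit 202.

```python
# Simplified weld / monitor / correct cycle, repeated until the seam is done.
def weld_with_realtime_correction(controller) -> None:
    """Run one weld job, pausing for correction whenever a defect is found."""
    controller.start_welding()
    while not controller.weld_complete():
        defect = controller.check_latest_frames()  # trained-model inference
        if defect is not None:
            controller.stop_welding()               # step 508: discontinue
            controller.correct(defect)              # re-weld at the defect site
            controller.start_welding()              # step 510: continue welding
```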
[0079] At step 512, the method 500 may comprise guiding the second robotic arm 206 to perform a second layer of testing, i.e., non-destructive testing of the welded workpiece, to identify, using the trained AI model 208, one or more remaining defects in the welded workpiece. The system control unit 202 may be configured to control the testing unit 206b to perform the testing of the welded workpiece. The testing unit 206b may be configured to apply a non-destructive testing (NDT) technique to test the welded workpiece and detect whether any defect remains in the welded workpiece. While applying the NDT technique, the system control unit 202 may also be configured to control the second image capturing unit 206a to capture one or more images of the welded workpiece during the non-destructive testing. The system control unit 202 may process the one or more images using the pre-trained AI model to detect one or more remaining defects. In an exemplary embodiment, the AI model 208 may be trained to determine the one or more remaining defects in the welded workpiece. In an embodiment, the AI model 208 may be trained using a machine learning technique, such as a supervised machine learning technique, based on a large set of images of a plurality of defects in welded workpieces.
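As an illustrative sketch of this second-level check, the routine below screens NDT images with a trained classifier and returns the image regions flagged as defective. The fixed tiling of each image and the patch-level classifier (trained on tile-sized flattened patches) are assumptions made for the example.

```python
# Illustrative NDT screening: slide a fixed-size tile over each image and
# classify each patch; assumes the classifier was trained on such patches.
from typing import List, Tuple

import numpy as np


def inspect_ndt_images(images: List[np.ndarray], clf, tile: int = 64) -> List[Tuple[int, int, int]]:
    """Return (image_index, row_px, col_px) for every tile classified as defective."""
    findings = []
    for idx, img in enumerate(images):
        h, w = img.shape[:2]
        for r in range(0, h - tile + 1, tile):
            for c in range(0, w - tile + 1, tile):
                patch = img[r:r + tile, c:c + tile].reshape(1, -1).astype(np.float32) / 255.0
                if clf.predict(patch)[0] == 1:
                    findings.append((idx, r, c))
    return findings
```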
[0080] At step 514, the method 500 may comprise correcting the one or more remaining defects in the welded workpiece. In order to detect the one or more remaining defects, the one or more captured images may be processed by the system control unit 202 using the trained AI model 208 to determine the location(s) of the one or more remaining defects. Upon detecting the location(s) of the one or more remaining defects, the system control unit 202 may guide the welding unit 204b mounted on the first robotic arm 204 to perform re-welding at the determined location(s) to remove the one or more remaining defects.
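To guide the re-welding just described, a defect location found in an NDT image must be expressed in workpiece coordinates. The sketch below assumes a known 2x3 affine camera-to-workpiece calibration; that calibration is an assumption for illustration and is not part of this disclosure.

```python
# Hypothetical image-to-workpiece mapping using an affine calibration matrix.
import numpy as np


def image_to_workpiece_xy(pixel_rc: tuple, affine_2x3: np.ndarray) -> tuple:
    """Apply an affine calibration to convert (row, col) pixels to (x, y) mm."""
    row, col = pixel_rc
    x, y = affine_2x3 @ np.array([col, row, 1.0])
    return float(x), float(y)


# Example: 0.2 mm per pixel, workpiece origin offset by (15 mm, 40 mm).
calibration = np.array([[0.2, 0.0, 15.0],
                        [0.0, 0.2, 40.0]])
print(image_to_workpiece_xy((120, 300), calibration))  # -> (75.0, 64.0)
```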
[0081] According to an embodiment, the system control unit 202 may identify possible reasons for the one or more remaining defects by analysing the identified one or more remaining defects. In another embodiment, the system control unit 202 may send one or more images of the one or more remaining defects to the user device (not shown in figure 2) to display the remaining defects to the user. In one embodiment, the user may further analyse the remaining defects and understand possible reasons for the occurrence of the remaining defects. The user may provide one or more inputs to the system control unit 202 indicating the one or more reasons for the remaining defects. Based on the determined reasons, the system control unit 202 may decide whether the AI model 208 has to be further trained to avoid such defect(s) in successive welding operations. There may be scenarios where the defects are due to factors which are not controlled by the robotic system 200, and in such scenarios the user may take appropriate action rather than training the AI model 208. In cases where the AI model 208 is trained based on the remaining defects, such further training of the AI model 208 may be performed periodically, based on, but not limited to, a time period, a number of welding operations, or a number of detected defects, as explained in the previous embodiments. This second level of quality check ensures that the welded workpiece is defect-free and enhances the accuracy of the overall welding operation.
[0082] In this manner, the method 500 may perform welding in an automated manner. The method 500 may also detect and correct one or more defects in real-time, both while the welding is being performed and after the welding is completed. The method 500 may apply the NDT technique to ensure that the final welded workpiece does not contain any defects. Thus, the method 500 may reduce the manufacturing time and improve yield and efficiency.
[0083] The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed.
[0084] Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.

[0085] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term "computer-readable medium" should be understood to include tangible items and to exclude carrier waves and transient signals, i.e., to be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, non-volatile memory, hard drives, CD-ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[0086] Suitable processors include, by way of example, a general-purpose processor, a special-purpose processor, a conventional processor, a digital signal processor (DSP), a graphics processing unit (GPU), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGA) circuits, any other type of integrated circuit (IC), and/or a state machine.

Claims

WE CLAIM:
1. An AI based method for detecting and correcting welding defects in real time, the method comprising:
generating a 3D model of a workpiece subjected to welding, from a plurality of images received from a first image capturing unit;
guiding a first robotic arm to perform specific welding on the workpiece based on a plurality of welding parameters extracted from analysis of the 3D model of the workpiece using a trained AI model;
monitoring the welding in real-time and detecting at least one defect in the welding using the trained AI model;
in response to detecting the at least one defect in the welding, discontinuing the welding and initiating a welding correction process to correct the at least one defect identified in the workpiece;
continuing the welding upon completion of the welding correction process, and the real-time monitoring, until the welding of the workpiece is completed;
guiding a second robotic arm to perform non-destructive testing of the welded workpiece to identify, using the trained AI model, one or more remaining defects in the welded workpiece; and
correcting the one or more remaining defects in the welded workpiece.
2. The method of claim 1, wherein determining the plurality of welding parameters comprises determining one or more of: welding process, welding types, welding current, welding temperature, arc voltage, welding speed, electrode feed speed, electrode extension, electrode diameter, electrode orientation, electrode polarity, weld size, throat thickness, leg length, and shielding gas composition, and
wherein the AI model is trained to determine the welding type, welding current, and welding temperature in the real-time, using supervised machine learning technique, based on a large dataset of welding process, welding types, welding current, welding temperature, arc voltage, welding speed, electrode feed speed, electrode extension, electrode diameter, electrode orientation, electrode polarity, weld size, throat thickness, leg length, and shielding gas composition.
3. The method of claim 1, wherein monitoring the welding in the real-time and detecting the at least one defect in the welding comprises:
capturing a video stream of the welding; and
processing the captured video stream in the real-time by the trained AI model to detect at least one defect in the welding,
wherein the AI model is trained to determine at least one defect in the welding, using supervised machine learning technique, based on a large set of images having plurality of defects in the welding.
4. The method of claim 3, wherein the welding correction process comprises:
determining location(s) of the at least one defect in the welding by analysing the captured video stream using the trained AI model;
selectively guiding the first robotic arm to perform re-welding at the determined location(s) to remove the at least one defect; and
selectively training the AI model based on the detected at least one defect.
5. The method of claim 1, wherein for identifying the one or more remaining defects in the welded workpiece, the method further comprises:
capturing one or more images of the welded workpiece while performing the non-destructive testing using a second image capturing unit; and
detecting the one or more remaining defects by processing the one or more images of workpiece using the trained AI model, and
wherein the AI model is trained for determining the one or more remaining defects in the welded workpiece, using supervised machine learning technique, based on a large set of images of plurality of defects in welded workpieces.
6. The method of claim 5, wherein correcting the one or more remaining defects in the welded workpiece comprises:
determining location(s) of the one or more remaining defects in the welded workpiece using the trained AI model;
selectively guiding the first robotic arm to perform re-welding at the determined location(s) to remove the one or more remaining defects; and
selectively training the AI model based on the one or more remaining defects.
7. An Artificial Intelligence (AI) based robotic system to detect and correct welding defects in real-time, the system comprising:
a first robotic arm having a first image capturing unit configured to capture a plurality of images of a workpiece subjected to welding and a welding unit mounted thereon;
a second robotic arm having a testing unit and a second image capturing unit mounted thereon; and
a system control unit comprising a trained AI model, and operatively coupled to the first robotic arm and the second robotic arm, the system control unit is configured to:
generate a 3D model of the workpiece, using the trained AI model, from the plurality of images received from the first image capturing unit;
guide the first robotic arm to perform specific welding on the workpiece based on a plurality of welding parameters extracted from analysis of the 3D model of the workpiece using the trained AI model;
monitor the welding in real-time using the first image capturing unit and detect at least one defect in the welding using the trained AI model;
in response to detecting the at least one defect in the welding, discontinue the welding and initiate a welding correction process to correct the at least one defect identified in the workpiece;
continue the welding upon completion of the welding correction process, and the real-time monitoring, until the welding of the workpiece is completed;
guide the second robotic arm to perform non-destructive testing of the welded workpiece to identify, using the trained AI model, one or more remaining defects in the welded workpiece; and
correct the one or more remaining defects in the welded workpiece.
8. The system of claim 7, wherein the plurality of welding parameters comprises one or more of: welding process, welding types, welding current, welding temperature, arc voltage, welding speed, electrode feed speed, electrode extension, electrode diameter, electrode orientation, electrode polarity, weld size, throat thickness, leg length, and shielding gas composition, and
wherein the AI model is trained to determine the welding type, welding current, and welding temperature in the real-time, using supervised machine learning technique, based on a large dataset of welding process, welding types, welding current, welding temperature, arc voltage, welding speed, electrode feed speed, electrode extension, electrode diameter, electrode orientation, electrode polarity, weld size, throat thickness, leg length, and shielding gas composition.
9. The system of claim 7, wherein to monitor the welding in the real-time and detect the at least one defect in the welding, the system control unit is configured to:
capture a video stream of the welding using the first image capturing unit; and
process the captured video stream in the real-time, using the trained AI model, to detect the at least one defect in the welding,
wherein the AI model is trained to determine the at least one defect in the welding, using supervised machine learning technique, based on a large set of images having plurality of defects in the welding.
10. The system of claim 9, wherein to initiate the welding correction process to correct the at least one defect, the system control unit is configured to:
determine location(s) of the at least one defect in the welding by analysing the captured video stream using the trained AI model;
guide the first robotic arm to perform re-welding at the determined location(s) to remove the at least one defect; and
train the AI model based on the detected at least one defect to avoid such defect.
11. The system of claim 7, wherein to identify the one or more remaining defects in the welded workpiece, the system control unit is configured to:
capture one or more images of the welded workpiece while performing the non-destructive testing using the second image capturing unit; and
detect the one or more remaining defects by processing the one or more images of workpiece using the trained AI model, and
wherein the AI model is trained to determine the one or more remaining defects in the welded workpiece, using supervised machine learning technique, based on a large set of images of plurality of defects in welded workpieces.
12. The system of claim 11, wherein to correct the one or more remaining defects in the welded workpiece, the system control unit is configured to:
determine location(s) of the one or more remaining defects by processing the captured one or more images of the welded workpiece using the trained AI model;
guide the first robotic arm to perform re-welding at the determined location(s) to remove the one or more remaining defects; and
train the AI model based on the one or more remaining defects.
PCT/IB2023/054293 2022-09-12 2023-04-26 System for detecting and correcting welding defects in real-time and method thereof WO2024057100A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN202241052042 2022-09-12
IN202241052042 2022-09-12

Publications (1)

Publication Number Publication Date
WO2024057100A1 true WO2024057100A1 (en) 2024-03-21

Family

ID=90274359

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2023/054293 WO2024057100A1 (en) 2022-09-12 2023-04-26 System for detecting and correcting welding defects in real-time and method thereof

Country Status (1)

Country Link
WO (1) WO2024057100A1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180341248A1 (en) * 2017-05-24 2018-11-29 Relativity Space, Inc. Real-time adaptive control of additive manufacturing processes using machine learning

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23864857

Country of ref document: EP

Kind code of ref document: A1