CN112589232B - Weld joint tracking method and device based on independent deviation correction type deep learning - Google Patents

Weld joint tracking method and device based on independent deviation correction type deep learning

Info

Publication number
CN112589232B
CN112589232B
Authority
CN
China
Prior art keywords
welding
weld
welding seam
tracking
guide rail
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011477237.0A
Other languages
Chinese (zh)
Other versions
CN112589232A (en)
Inventor
高向东
杜健准
张艳喜
梁添汾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong University of Technology filed Critical Guangdong University of Technology
Priority to CN202011477237.0A priority Critical patent/CN112589232B/en
Publication of CN112589232A publication Critical patent/CN112589232A/en
Application granted granted Critical
Publication of CN112589232B publication Critical patent/CN112589232B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K 9/00 Arc welding or cutting
    • B23K 9/12 Automatic feeding or moving of electrodes or work for spot or seam welding or cutting
    • B23K 9/127 Means for tracking lines during arc welding or cutting
    • B23K 9/1272 Geometry oriented, e.g. beam optical tracking
    • B23K 9/1274 Using non-contact, optical means, e.g. laser means
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Plasma & Fusion (AREA)
  • Mechanical Engineering (AREA)
  • Optics & Photonics (AREA)
  • Geometry (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a weld seam tracking method and device based on independent deviation correction type deep learning. The device comprises a working platform, a controller, a vertical guide rail, a horizontal guide rail, a welding gun clamp, a line structured light vision sensor and a servo motor. The method comprises the following steps: S1: collect a structured-light image of the weld seam; S2: read the structured-light image; S3: identify the current weld type and locate the weld center position using YOLOV3; S4: initialize a KCF tracker with the weld center position located in S3 as the tracking target; S5: start welding and track the weld seam in real time with the KCF-YOLOV3 algorithm until welding is finished. The device and method can automatically identify various weld types and accurately locate the weld center region, effectively improving weld detection speed and recognition accuracy and thereby improving welding efficiency.

Description

Weld joint tracking method and device based on independent deviation correction type deep learning
Technical Field
The invention relates to the technical field of welding, in particular to a welding seam tracking method and device based on independent deviation correction type deep learning.
Background
Weld seam tracking based on structured-light vision combines the advantages of computer vision and laser three-dimensional measurement: data acquisition is simple, weld features are distinct and the anti-interference capability is strong. Laser is widely used as the structured-light source because of its good directivity, monochromaticity, coherence and energy concentration. At present, weld seam tracking under line structured light vision sensing is relatively mature. To reduce the interference of intense arc light during welding, a filter lens with a certain bandwidth is usually installed in front of the camera lens so that a clear structured-light image can still be obtained, and the weld center position is then obtained through signal acquisition and image processing.
Such schemes have the disadvantage that a complex communication protocol must be established between the tracking device and the welding equipment, the transmission of sensing and control signals must be considered, the software is difficult to develop, the product development cycle is long, and the method can only be applied to specific industrial scenarios.
Existing weld seam tracking systems based on visual sensing generally consist of a computer and a vision sensor. In the upper-computer software, the computer displays the images acquired by the vision sensor in real time, obtains the weld center position through a specific image-processing algorithm, then calculates the weld deviation and finally sends a control command. Although powerful, the computer has the following disadvantages: it is bulky and occupies a large space, making it unsuitable for some industrial sites; it is expensive, with poor cost-effectiveness; and its installation is neither flexible nor convenient. The computer is therefore unsuitable for large-scale application in the field.
In welding production, depending on the groove form and joint type, the main weld types are flat butt welds, lap welds, V-shaped welds, fillet welds and circumferential welds. The structured-light stripe image of each weld type is different, so a specific image-feature-extraction algorithm must be developed for each type, and in automatic robot welding parameters such as welding speed, welding current and voltage must also be adjusted according to the weld type. However, conventional seam tracking systems require the weld type to be entered manually before welding, which seriously lowers the automation level of the welding robot. With the marked improvement of computer hardware and the large increase in available data samples, deep networks based on convolutional neural networks (CNN) have become the mainstream of target detection. Deep-learning-based target detection can now perform real-time target classification and localization simultaneously, with good accuracy and interference resistance. Applying this technology to weld seam tracking makes it possible to identify the weld type and locate the weld center before welding, to detect the weld in real time during welding, and to continuously correct the tracking result.
In recent years, target tracking algorithms have begun to be applied to seam tracking, and many scholars at home and abroad have studied seam tracking based on them intensively. Target tracking algorithms fall into two categories, generative models and discriminative models. Generative models describe the tracked target by building a model, with representative algorithms including optical flow and Kalman filtering; however, a model of the weld tracking system is difficult to build, so this approach is generally hard to realize in engineering. Discriminative models introduce an online-learned classifier into the tracking process, with a sample set formed from positive samples containing the target and negative samples containing the background; their tracking performance is good, and representative algorithms include TLD (Tracking-Learning-Detection), KCF (Kernelized Correlation Filter) and deep-learning methods. The KCF algorithm, which uses correlation filtering, performs well in tracking accuracy and robustness and is suitable for weld tracking. Its drawback is that tracking errors accumulate easily during long-term tracking and noise interference causes model drift; noise such as arc light, scattered laser light and metal spatter during tracking seriously affects KCF tracking of the weld center and lowers the weld tracking accuracy.
At present, most weld seam tracking systems at home and abroad are machine vision systems based on conventional PCs, consisting mainly of a computer, a vision sensing system, an image acquisition card and a motion control card. They are bulky, occupy a large space and are difficult to apply in some harsh industrial environments; they are expensive, consume much power and offer poor cost-effectiveness; and they are difficult to install and debug. In addition, a conventional weld tracking system must establish a complex communication protocol with the welding equipment and consider the transmission of sensing and control signals, which limits its range of application to a certain extent.
A conventional weld tracking system also requires the weld type to be entered manually before welding so that the corresponding weld-image-processing method can be selected, which seriously lowers the level of welding automation.
During on-site welding, the weld image information detected by the line structured light vision sensor suffers strong noise interference, which lowers the weld tracking accuracy. In recent years, target tracking algorithms have been applied to weld tracking with good results; however, under strong noise interference the problem remains that tracking drift leads to tracking failure.
Disclosure of Invention
To overcome the defects of low weld detection speed and low accuracy in the prior art, the invention provides a weld seam tracking method and device based on independent deviation correction type deep learning.
The method comprises the following steps:
S1: collect a structured-light image of the weld seam;
S2: read the weld seam structured-light image;
S3: since automatic weld tracking requires the weld type to be determined before welding starts, identify the current weld type and locate the weld center position using YOLOV3;
S4: initialize a KCF tracker with the weld center position located in S3 as the tracking target;
the initialization of the KCF tracker comprises selecting the bounding box as a positive sample, building a circulant matrix through cyclic shifts, training a classifier, and introducing a Gaussian kernel function to improve the classifier's performance;
S5: start welding and track the weld seam in real time with the KCF-YOLOV3 algorithm until welding is finished.
Preferably, the method for positioning the center position of the weld seam comprises the following steps:
YOLOV3 detects a weld in the weld image, and since a target bounding box appears at the weld center position, the center coordinates of the bounding box are used as the weld center position.
Preferably, S5 comprises the following steps:
S5.1: track the weld seam with KCF while YOLOV3 detects the weld and continuously corrects the tracker to prevent tracking drift;
S5.2: judge whether the YOLOV3 classification score is higher than a fixed value; if so, execute S5.3, otherwise execute S5.5;
S5.3: calculate the offset error rate P_o between the weld center positions output by KCF and YOLOV3 in the x-axis direction, and judge whether P_o is greater than a fixed threshold α or equal to zero; if so, execute S5.4, otherwise execute S5.5;
S5.4: take the YOLOV3 detection result as the weld center position and re-initialize the KCF tracker from it;
S5.5: take the KCF tracking result as the weld center position and update the KCF tracker in real time;
S5.6: with the KCF tracker initialized in S5.4 or updated in S5.5, judge whether the current frame is the last frame of the weld image; if not, return to S5.2; if so, welding is finished.
Preferably, the weld center position offset error rate P_o is calculated as follows:
[Formula given only as an image in the original, expressing P_o in terms of x(k) and x*(k).]
where x(k) is the x-axis coordinate of the weld center position predicted by the KCF algorithm at time k, and x*(k) is the x-axis coordinate of the weld center position detected by the YOLO algorithm at time k.
The invention also provides a welding device that can implement the above welding method, the device comprising: a working platform, a controller, a vertical guide rail, a horizontal guide rail, a welding gun clamp, a line structured light vision sensor and a servo motor;
the working platform is used for placing a workpiece to be welded;
the controller, the vertical guide rail, the horizontal guide rail, the welding gun clamp, the line structured light vision sensor and the servo motor are arranged above the working platform;
the horizontal guide rail is arranged on the vertical guide rail and can slide up and down along the vertical guide rail;
the motor is arranged at one end of the guide rail in the horizontal direction;
the welding gun clamp and the line structured light vision sensor are arranged on the horizontal guide rail;
the welding gun clamp and the line structured optical vision sensor can be controlled to move along the horizontal direction on the horizontal direction guide rail through the transmission of the motor;
the welding gun clamp is used for clamping a welding gun; the linear structure light vision sensor is used for collecting a welding seam structure light picture of a welding point and sending the collected welding seam structure light picture to the controller;
the controller controls the motor to rotate.
Preferably, the controller comprises an upper computer and a lower computer;
the upper computer is connected with the line structured light vision sensor;
the line structure light vision sensor collects a welding line structure light picture of a welding point and sends the welding line structure light picture to the upper computer;
the upper computer analyzes the weld image, converts the weld deviation information into an instruction and sends the instruction to the lower computer;
the lower computer is used for receiving the instruction from the upper computer, analyzing the instruction content, converting the instruction content into a corresponding control function and controlling the servo motor by sending a control signal to the servo motor driver.
Preferably, the upper computer is a Raspberry Pi 4b development board, and the lower computer is a MINI-STM32 single-chip microcomputer.
Preferably, the motor is a servo motor.
Preferably, the vertical guide rail comprises a first fixed bracket, a first lead screw and a manual pulley;
the first lead screw comprises a first screw rod and a first nut;
the first screw rod is rotatably arranged on the first fixing support, and one end of the first screw rod is coaxially connected with the manual pulley; the first screw rod can be driven to rotate by rotating the manual pulley;
the first nut is connected with the horizontal guide rail.
Preferably, the horizontal guide rail comprises a second fixed bracket and a second lead screw;
the second fixing bracket is connected with the first nut;
the second lead screw comprises a second screw rod and a second nut;
the second screw rod is rotatably arranged on the second fixed support, and one end of the second screw rod is coaxially connected with the motor transmission shaft; the second screw rod can be driven to rotate by the motor;
the second nut is connected with the welding gun clamp and the line structured light vision sensor.
Compared with the prior art, the technical scheme of the invention has the beneficial effects that:
the welding device and the welding method can automatically identify various welding seams and accurately position the central area of the welding seam. In addition, according to the weld tracking method (KCF-YOLOV3) based on the combination of the nuclear correlation filtering (KCF) and the deep learning detection (YOLOV3), under the condition that the KCF algorithm is likely to be subjected to noise interference to cause tracking drift, YOLOV3 can position the center of a weld in real time and continuously correct a KCF tracker, so that the tracking drift is effectively inhibited, the robustness of the weld tracking algorithm is greatly improved, and the weld tracking accuracy is further improved. The welding seam detection speed and the identification accuracy can be effectively improved, and therefore the welding efficiency is improved.
Drawings
Fig. 1 is a flowchart of a weld tracking method based on independent deviation correction type deep learning according to embodiment 1.
FIG. 2 is a flow chart of the KCF-YOLOV3 algorithm.
Fig. 3 is a schematic view of a weld tracking device based on independent deviation correction type deep learning according to embodiment 1.
Fig. 4 is a schematic view of a vertical guide rail and a horizontal guide rail.
In the figures: controller 1, servo motor 2, vertical guide rail 3, horizontal guide rail 4, line structured light vision sensor 5, welding gun clamp 6, welding gun 7, arc welding machine 8, working platform 9, workpiece to be welded 10, first fixed support 3.1, first lead screw 3.2, manual pulley 3.3, second lead screw 4.1, second fixed support 4.2.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the patent;
for the purpose of better illustrating the embodiments, certain features of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product;
it will be understood by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted.
The technical solution of the present invention is further described below with reference to the accompanying drawings and examples.
Example 1:
the embodiment provides a weld joint tracking method based on independent correction type deep learning.
The welding method described in this example is based on the KCF-YOLOV3 seam tracking method, which combines kernelized correlation filtering (KCF) with deep-learning detection (YOLOV3). During real-time welding, the KCF algorithm constructs a large number of positive and negative samples through cyclic shifts and trains a classifier online using ridge regression. By exploiting the diagonalizability of circulant matrices in the Fourier domain, matrix operations are converted into element-wise (Hadamard) products of vectors, which greatly reduces the amount of computation so that the algorithm meets real-time requirements, while Gaussian kernel mapping is used to improve tracking accuracy.
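As a rough illustration of this mechanism (a simplified sketch, not the patent's implementation: single-channel grayscale patches, no cosine window or feature extraction, and assumed function and variable names), the Fourier-domain ridge regression with a Gaussian kernel can be written in Python as follows:

    import numpy as np

    def gaussian_kernel_correlation(x, z, sigma=0.5):
        """Kernel correlation between patch x and all cyclic shifts of patch z.
        x, z: 2-D grayscale patches of identical shape (e.g. the region around the
        weld center); FFTs reduce every shift to element-wise (Hadamard) work."""
        xf, zf = np.fft.fft2(x), np.fft.fft2(z)
        cross = np.fft.ifft2(np.conj(xf) * zf).real          # circular cross-correlation
        d2 = (np.sum(x ** 2) + np.sum(z ** 2) - 2.0 * cross) / x.size
        return np.exp(-np.maximum(d2, 0.0) / (sigma ** 2))

    def train(x, y, sigma=0.5, lam=1e-4):
        """Ridge regression in the Fourier domain: alpha_f = F(y) / (F(k_xx) + lambda).
        y is a Gaussian-shaped regression label peaking at the target center."""
        k_xx = gaussian_kernel_correlation(x, x, sigma)
        return np.fft.fft2(y) / (np.fft.fft2(k_xx) + lam)

    def detect(alpha_f, x, z, sigma=0.5):
        """Return the (row, col) shift of search patch z giving the highest response."""
        k_xz = gaussian_kernel_correlation(x, z, sigma)
        response = np.fft.ifft2(alpha_f * np.fft.fft2(k_xz)).real
        return np.unravel_index(np.argmax(response), response.shape)

In a complete tracker the patches would additionally be multiplied by a cosine window, multi-channel features would replace raw grayscale, and the model (x, alpha_f) would be updated by linear interpolation every frame.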
Conventional kernelized correlation filter tracking is a short-term tracking method; its limitation is that tracking errors accumulate easily during long-term tracking and noise interference causes model drift. During welding, noise such as strong arc light and spatter interferes with recognition of the weld, and because the target model keeps being updated in this noisy environment, tracking errors accumulate over long tracking runs and cause tracking drift. Drift during welding accumulates errors and can even cause the workpiece to be scrapped, so keeping the target model accurate and avoiding drift is very important for weld tracking. By detecting the weld center position with a deep-learning method, the target model of the KCF tracker can be corrected periodically so that KCF always tracks the weld accurately and stably.
YOLOV3 is a deep network based on convolutional neural networks, currently a mainstream approach to target detection. It treats target classification and localization as a single regression problem and is an end-to-end detection algorithm. YOLOV3 is characterized by high detection speed and recognition accuracy, so real-time detection on an embedded platform is feasible. To detect the weld center position with YOLOV3, a large number of weld images must be collected and the network model trained offline. To improve detector performance, the collected images can include weld images taken before welding as well as images containing a certain amount of noise taken during welding, covering common weld types such as flat butt welds, lap welds, V-shaped welds, fillet welds and circumferential welds, thereby increasing the diversity and complexity of the training samples.
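One possible way to run such an offline-trained YOLOV3 model on an embedded platform is OpenCV's DNN module. The sketch below is illustrative only; the model file names and the class list are assumptions, not details taken from the patent:

    import cv2
    import numpy as np

    net = cv2.dnn.readNetFromDarknet("weld_yolov3.cfg", "weld_yolov3.weights")  # assumed files
    CLASSES = ["butt", "lap", "V", "fillet", "circumferential"]                 # assumed labels

    def detect_weld(frame, conf_thresh=0.5):
        """Run YOLOV3 on one structured-light frame; return (class, score, box) tuples."""
        h, w = frame.shape[:2]
        blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
        net.setInput(blob)
        detections = []
        for out in net.forward(net.getUnconnectedOutLayersNames()):
            for row in out:                  # row = [cx, cy, bw, bh, objectness, class scores...]
                scores = row[5:]
                cls = int(np.argmax(scores))
                score = float(scores[cls])
                if score < conf_thresh:
                    continue
                cx, cy, bw, bh = row[0] * w, row[1] * h, row[2] * w, row[3] * h
                detections.append((CLASSES[cls], score, (cx - bw / 2, cy - bh / 2, bw, bh)))
        return detections                    # non-maximum suppression omitted for brevity

Running the detector through cv2.dnn keeps the dependencies on the Raspberry Pi light, since no full deep-learning framework has to be installed.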
Before welding starts, the YOLOV3 weld detection function is activated to identify the current weld type and locate the weld center position; next, the KCF tracker is initialized with the located weld center as the tracking target. After welding begins, KCF is responsible for tracking the weld, while YOLOV3 is responsible for detecting the weld and continuously correcting the tracker's target model to prevent tracking drift. During welding, when the YOLOV3 classification module gives a sufficiently high classification score for the current weld, the detected weld center position is considered to be accurately located. The offset error rate P_o between the weld center positions output by KCF and YOLOV3 in the x-axis direction is then calculated; since the weld deviation occurs mainly along the x axis, only the x-direction offset error is considered. A fixed threshold α is set: when P_o > α, KCF tracking is considered to have drifted, and the current YOLO target box is assigned to the KCF algorithm for re-tracking; when 0 < P_o ≤ α, the weld is considered to be tracked correctly and the KCF algorithm continues tracking; when P_o = 0, the KCF algorithm is considered to have lost the target and is re-initialized by the YOLO algorithm to resume tracking.
When tracking drift occurs, the target must be re-detected and recovered with the YOLOV3 detection algorithm: the YOLOV3 detection result at the current moment is taken as the weld center position and the KCF tracker is re-initialized at the same time. When no tracking drift occurs, i.e. the KCF tracking result is sufficiently reliable, the KCF tracking result at the current moment is taken as the weld center position and the KCF tracker keeps being updated online. After the weld center at the current moment has been tracked, tracking continues with the next frame, and the algorithm loops in this way until the last frame, at which point welding is finished.
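Putting these rules together, one iteration of the tracking loop might look like the sketch below; the tracker and detector interfaces, the threshold values and the relative-error form of P_o are illustrative assumptions, not the patent's API or parameters:

    SCORE_THRESH = 0.8   # YOLOV3 classification-score gate (assumed value)
    ALPHA = 0.1          # drift threshold alpha on P_o (assumed value)

    def track_one_frame(frame, kcf, yolo_detect):
        """One iteration of the KCF-YOLOV3 loop sketched from S5.1 to S5.5.
        kcf is assumed to expose update(frame) -> (x, y) and init(frame, (x, y));
        yolo_detect(frame) returns (score, (x, y)) or None."""
        x_kcf, y_kcf = kcf.update(frame)                    # S5.1: KCF prediction
        det = yolo_detect(frame)
        if det is not None and det[0] > SCORE_THRESH:       # S5.2: trust the detection
            score, (x_yolo, y_yolo) = det
            p_o = abs(x_kcf - x_yolo) / x_yolo if x_yolo else 0.0   # S5.3 (assumed formula)
            if p_o > ALPHA or p_o == 0:                     # drift detected, or target lost
                kcf.init(frame, (x_yolo, y_yolo))           # S5.4: re-initialize from YOLOV3
                return x_yolo, y_yolo
        return x_kcf, y_kcf                                 # S5.5: keep the KCF result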
The method of this embodiment is described in detail below with reference to Fig. 1. The welding method of this embodiment comprises the following steps:
S1: collect a structured-light image of the weld seam with the line structured light vision sensor;
S2: read the image acquired by the line structured light vision sensor on the Raspberry Pi 4b;
S3: process the weld image on the Raspberry Pi 4b: YOLOV3 detects the weld in the weld image, a bounding box appears at the weld center, and the current weld type is identified at the same time;
S4: initialize a KCF tracker with the weld center position located in S3 as the tracking target; the initialization comprises selecting the bounding box as a positive sample, building a circulant matrix through cyclic shifts, training a classifier, and introducing a Gaussian kernel function to improve the classifier's performance;
S5: start welding and track the weld seam in real time with the KCF-YOLOV3 algorithm (shown in Fig. 2) until welding is finished.
Specifically, the weld feature point position is the center coordinate of the tracked bounding box; the weld deviation at the current moment is calculated from it, and a control command is transmitted to the MINI-STM32 according to the deviation.
To prevent the welding gun from moving too frequently, a control dead zone is set in the MINI-STM32: when the deviation is smaller than a fixed threshold, the motor is not driven; when the deviation is larger than the threshold, the MINI-STM32 sends the corresponding control signal to the servo motor driver and the motor moves the specified distance to correct the deviation in real time, until welding is finished.
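A sketch of the corresponding upper-computer logic is given below; the serial port name, baud rate, pixel-to-millimetre scale, dead-zone width and command format are all assumptions, since the patent only states that a dead zone is used and that commands are sent to the MINI-STM32 over a serial link:

    import serial  # pyserial

    DEAD_ZONE_PX = 3        # deviations smaller than this are ignored (assumed value)
    PX_TO_MM = 0.05         # assumed image-to-workpiece scale factor

    ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=0.1)   # assumed port and baud rate

    def send_correction(deviation_px):
        """Convert the weld deviation (pixels) into a correction command for the MINI-STM32."""
        if abs(deviation_px) <= DEAD_ZONE_PX:
            return                                      # inside the dead zone: leave the motor alone
        shift_mm = deviation_px * PX_TO_MM
        ser.write(f"MOVE {shift_mm:+.2f}\n".encode())   # hypothetical command format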
The method for positioning the center position of the welding seam comprises the following steps: YOLOV3 detects a weld in the weld image, and since a target bounding box appears at the weld center position, the center coordinates of the bounding box are used as the weld center position.
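As a minimal sketch of this step (the detection tuple format below is an assumption, not the patent's data structure), taking the bounding-box center of the best detection as the weld center could look like this:

    def weld_center_from_detections(detections):
        """detections: list of (class_name, score, (x, y, w, h)) boxes from YOLOV3.
        Returns the center of the highest-scoring box, used as the weld center position."""
        if not detections:
            return None                              # no weld detected in this frame
        _, _, (x, y, w, h) = max(detections, key=lambda d: d[1])
        return x + w / 2.0, y + h / 2.0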
Further, S5 comprises the following steps:
S5.1: track the weld seam with KCF while YOLOV3 detects the weld and continuously corrects the tracker to prevent tracking drift;
S5.2: judge whether the YOLOV3 classification score is higher than a fixed value; if so, execute S5.3, otherwise execute S5.5;
S5.3: calculate the offset error rate P_o between the weld center positions output by KCF and YOLOV3 in the x-axis direction, and judge whether P_o is greater than a fixed threshold α or equal to zero; if so, execute S5.4, otherwise execute S5.5;
S5.4: take the YOLOV3 detection result as the weld center position and re-initialize the KCF tracker from it;
S5.5: take the KCF tracking result as the weld center position and update the KCF tracker in real time;
S5.6: with the KCF tracker initialized in S5.4 or updated in S5.5, judge whether the current frame is the last frame of the weld image; if not, return to S5.2; if so, welding is finished.
The offset error rate P_o of the weld center position is calculated as follows:
[Formula given only as an image in the original, expressing P_o in terms of x(k) and x*(k).]
where x(k) is the x-axis coordinate of the weld center position predicted by the KCF algorithm at time k, and x*(k) is the x-axis coordinate of the weld center position detected by the YOLO algorithm at time k.
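Since the formula itself appears only as an image in the source, the following helper assumes, purely for illustration, a relative-error definition of P_o:

    def offset_error_rate(x_kcf, x_yolo):
        """Offset error rate P_o between the KCF prediction x(k) and the YOLO detection
        x*(k) for the same frame; the form |x(k) - x*(k)| / x*(k) is an assumption."""
        if x_yolo == 0:
            return 0.0
        return abs(x_kcf - x_yolo) / x_yolo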
The KCF-YOLOV3 weld tracking method, based on combining kernelized correlation filtering (KCF) with deep-learning detection (YOLOV3), is a target tracking algorithm with strong anti-interference capability and can effectively solve the tracking-drift problem.
The weld detection based on the deep-learning detection algorithm YOLOV3 adopted in this embodiment can automatically identify various weld types and accurately locate the weld center region. The YOLOV3 algorithm is characterized by high detection speed and recognition accuracy, so real-time detection on an embedded platform is feasible.
Example 2:
This embodiment provides a weld tracking device based on independent deviation correction type deep learning, as shown in Figs. 3 and 4. The device comprises: a working platform 9, a controller 1, a vertical guide rail 3, a horizontal guide rail 4, a welding gun clamp 6, a line structured light vision sensor 5 and a servo motor 2;
the working platform 9 is used for placing a workpiece 10 to be welded;
the controller 1, the vertical guide rail 3, the horizontal guide rail 4, the welding gun clamp 6, the line structured light vision sensor 5 and the servo motor 2 are arranged above the working platform 9;
the horizontal guide rail 4 is arranged on the vertical guide rail 3 and can slide up and down along the vertical guide rail 3;
the servo motor 2 is arranged at one end of the horizontal guide rail 4;
the welding gun clamp 6 and the line structured light vision sensor 5 are arranged on the horizontal guide rail 4;
the welding gun clamp 6 and the line structured light vision sensor 5 can be controlled to move on the horizontal guide rail 4 along the horizontal direction through the transmission of the servo motor 2;
the welding gun clamp 6 is used for clamping a welding gun 7; the linear structure light vision sensor 5 is used for collecting a welding seam structure light picture of a welding point and sending the collected welding seam structure light picture to the controller 1;
the controller 1 controls the servo motor 2 to rotate.
The controller 1 comprises an upper computer and a lower computer;
the upper computer is connected with the line structured light vision sensor 5;
the line structure light vision sensor 5 collects the welding line structure light picture of the welding point and sends the welding line structure light picture to the upper computer;
the upper computer analyzes the weld image, converts the weld deviation information into an instruction and sends the instruction to the lower computer;
the lower computer is used for receiving the instruction from the upper computer, analyzing the instruction content, converting the instruction content into a corresponding control function and controlling the servo motor 2 by sending a control signal to the servo motor driver.
The upper computer is a Raspberry Pi 4b development board, and the lower computer is a MINI-STM32 single-chip microcomputer.
In this embodiment the upper computer uses a Raspberry Pi 4b development board, whose CPU is a 64-bit 1.5 GHz quad-core ARM Cortex-A72, with 4 GB of LPDDR4 SDRAM and an SD/Micro SD card as storage; the Raspberry Pi board provides four USB ports, an Ethernet port and an HDMI high-definition video output, which can be connected to peripherals such as a keyboard, mouse, network cable and display to form a fully functional computer running a Linux-based operating system. The lower computer is a MINI-STM32 single-chip microcomputer whose CPU is a 32-bit ARM Cortex-M3, a high-performance, low-cost, low-power microcontroller. The upper-computer board analyzes the acquired weld images, converts the weld deviation information into an instruction and sends it to the lower computer; the two are connected by serial communication, and the lower computer receives the instruction, parses its content, converts it into the corresponding control function and controls the servo motor 2 by sending control signals to the driver of the servo motor 2.
The vertical guide rail 3 comprises a first fixed bracket 3.1, a first lead screw 3.2 and a manual pulley 3.3;
the first lead screw 3.2 comprises a first screw rod and a first nut;
the first screw rod is rotatably arranged on the first fixing support 3.1, and one end of the first screw rod is coaxially connected with the manual pulley 3.3; the first screw rod can be driven to rotate by rotating the manual pulley 3.3;
the first nut is connected to the horizontal direction guide rail 4.
The horizontal guide rail 4 comprises a second fixed bracket 4.2 and a second lead screw 4.1;
the second fixing bracket 4.2 is connected with the first nut;
the second lead screw 4.1 comprises a second screw rod and a second nut;
the second screw rod is rotatably arranged on the second fixed support 4.2, and one end of the second screw rod is coaxially connected with a transmission shaft of the servo motor 2; the servo motor 2 can drive the second screw rod to rotate;
the second nut is connected to the welding gun holder 6 and the line structured optical vision sensor 5.
The welding device can be conveniently mounted on and removed from an industrial robot arm: during weld tracking the robot arm moves along the welding path to provide the relative motion with respect to the workpiece, while the horizontal guide rail 4 allows the welding gun to reciprocate horizontally to correct the deviation. The welding device of this embodiment can also be installed on various special-purpose welding machines, with deviation correction performed by moving along the horizontal guide rail 4.
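For illustration only, converting a requested horizontal correction into servo pulses for the second lead screw might look like the sketch below; the 5 mm screw lead and 10000 pulses per revolution are assumed values, not taken from the patent:

    SCREW_LEAD_MM = 5.0      # assumed lead of the second lead screw (mm per revolution)
    PULSES_PER_REV = 10000   # assumed servo-driver pulses per motor revolution

    def correction_to_pulses(shift_mm):
        """Number of pulses needed to move the welding gun horizontally by shift_mm."""
        return int(round(shift_mm / SCREW_LEAD_MM * PULSES_PER_REV))

    # Example: a 0.25 mm correction corresponds to correction_to_pulses(0.25) == 500 pulses.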
In this embodiment, a Raspberry Pi 4b development board based on an embedded system replaces the conventional PC-based industrial computer, and the MINI-STM32 single-chip microcomputer replaces the conventional motion control card. The system is small, inexpensive and easy to assemble and disassemble, which improves the flexibility, practicality and ease of adoption of the machine-vision monitoring equipment.
In addition, since the Raspberry Pi 4b is a microcomputer running a Linux-based operating system, the algorithms provided by the invention are written in Python.
Furthermore, the welding device can be installed on most welding equipment and used as an independent weld deviation-correction device, which avoids having to formulate a complex communication protocol with the welding equipment; its application range is therefore wide and it is easy to install and debug.
The same or similar reference numerals correspond to the same or similar parts;
the terms describing positional relationships in the drawings are for illustrative purposes only and are not to be construed as limiting the patent;
it should be understood that the above-described embodiments of the present invention are merely examples for clearly illustrating the present invention, and are not intended to limit the embodiments of the present invention. Other variations and modifications will be apparent to persons skilled in the art in light of the above description. And are neither required nor exhaustive of all embodiments. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the claims of the present invention.

Claims (8)

1. A welding seam tracking method based on independent deviation correction type deep learning is characterized by comprising the following steps:
s1: collecting a welding seam structure light picture;
s2: reading a welding seam structure light picture;
s3: identifying the current weld type and positioning the center position of the weld by using YOLOV 3; the method for positioning the center position of the welding seam comprises the following steps: the YOLOV3 detects the weld in the weld image, and a target boundary frame appears at the center position of the weld, so the center coordinate of the boundary frame is taken as the center position of the weld;
s4: taking the welding seam center position positioned by S3 as a tracking target, and initializing a KCF tracker;
s5: the welding is started, and the KCF-YOLOV3 algorithm is used for tracking the welding seam in real time; until the welding is finished;
s5 includes the steps of:
s5.1: the KCF is utilized to track the welding seam, and the YOLOV3 detects the welding seam and continuously corrects the tracker to prevent tracking drift;
s5.2: judging whether the classification score of the YOLOV3 is higher than a fixed value, if so, executing S5.3, and if not, executing S5.5;
s5.3: calculating the offset error rate P_o of KCF and YOLOV3 in the x-axis direction of the output weld center position, and judging whether P_o is greater than a fixed threshold α or equal to zero; if so, executing S5.4; if not, executing S5.5;
s5.4: taking the detection result of YOLOV3 as the center position of the weld joint; reinitializing the KCF tracker according to a detection result of YOLOV 3;
s5.5: taking a KCF tracking result as a welding center position, and updating a KCF tracker in real time;
s5.6: judging whether the frame is the last frame of the welding seam image according to the KCF tracker initialized in the S5.4 or the KCF tracker updated in the S5.5, and if not, returning to the S5.2; if yes, welding is finished.
2. The weld joint tracking method based on the independent correction type deep learning according to claim 1, characterized in that the calculation formula of the weld center position offset error rate P_o is as follows:
[Formula given only as an image in the original, expressing P_o in terms of x(k) and x*(k).]
wherein x(k) is the x-axis coordinate of the weld center position predicted by the KCF algorithm at time k, and x*(k) is the x-axis coordinate of the weld center position detected by the YOLO algorithm at time k.
3. A weld tracking device based on independent deviation correction type deep learning, characterized in that the device comprises: the welding gun comprises a working platform, a controller, a vertical guide rail, a horizontal guide rail, a welding gun clamp, a line structured light vision sensor and a servo motor;
the working platform is used for placing a workpiece to be welded;
the controller, the vertical guide rail, the horizontal guide rail, the welding gun clamp, the line structured light vision sensor and the servo motor are arranged above the working platform;
the horizontal guide rail is arranged on the vertical guide rail and can slide up and down along the vertical guide rail;
the motor is arranged at one end of the guide rail in the horizontal direction;
the welding gun clamp and the line structured light vision sensor are arranged on the horizontal guide rail;
the welding gun clamp and the line structured optical vision sensor can be controlled to move along the horizontal direction on the horizontal direction guide rail through the transmission of the motor;
the welding gun clamp is used for clamping a welding gun; the linear structure light vision sensor is used for collecting a welding seam structure light picture of a welding point and sending the collected welding seam structure light picture to the controller;
the controller tracks the welding seam on the workpiece to be welded by adopting the welding seam tracking method based on the independent correction type deep learning of claim 1 or 2, and controls the motor to rotate according to the welding seam tracking result.
4. The weld joint tracking device based on the independent correction type deep learning according to claim 3, wherein the controller comprises an upper computer and a lower computer;
the upper computer is connected with the line structured light vision sensor;
the line structure light vision sensor collects a welding line structure light picture of a welding point and sends the welding line structure light picture to the upper computer;
analyzing the weld image of the upper computer, converting the weld deviation information into an instruction and sending the instruction to the lower computer;
the lower computer is used for receiving the instruction from the upper computer, analyzing the instruction content, converting the instruction content into a corresponding control function and controlling the servo motor by sending a control signal to the servo motor driver.
5. The weld joint tracking device based on the independent deviation correction type deep learning of claim 4, wherein the upper computer is a Raspberry Pi 4b development board, and the lower computer is a MINI-STM32 single chip microcomputer.
6. The weld tracking device based on the independent correction type deep learning of claim 5, wherein the motor is a servo motor.
7. The weld joint tracking device based on the independent correction type deep learning according to any one of claims 4 to 6, wherein the vertical direction guide rail comprises a first fixed bracket, a first lead screw and a manual pulley;
the first lead screw comprises a first screw rod and a first nut;
the first screw rod is rotatably arranged on the first fixing support, and one end of the first screw rod is coaxially connected with the manual pulley; the first screw rod can be driven to rotate by rotating the manual pulley;
the first nut is connected with the horizontal guide rail.
8. The weld joint tracking device based on the independent correction type deep learning of claim 7, wherein the horizontal guide rail comprises a second fixed bracket and a second lead screw;
the second fixing bracket is connected with the first nut;
the second lead screw comprises a second screw rod and a second nut;
the second screw rod is rotatably arranged on the second fixed support, and one end of the second screw rod is coaxially connected with the motor transmission shaft; the second screw rod can be driven to rotate by the motor;
the second nut is connected with the welding gun clamp and the line structured light vision sensor.
CN202011477237.0A 2020-12-15 2020-12-15 Weld joint tracking method and device based on independent deviation correction type deep learning Active CN112589232B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011477237.0A CN112589232B (en) 2020-12-15 2020-12-15 Weld joint tracking method and device based on independent deviation correction type deep learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011477237.0A CN112589232B (en) 2020-12-15 2020-12-15 Weld joint tracking method and device based on independent deviation correction type deep learning

Publications (2)

Publication Number Publication Date
CN112589232A CN112589232A (en) 2021-04-02
CN112589232B true CN112589232B (en) 2022-05-20

Family

ID=75195709

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011477237.0A Active CN112589232B (en) 2020-12-15 2020-12-15 Weld joint tracking method and device based on independent deviation correction type deep learning

Country Status (1)

Country Link
CN (1) CN112589232B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113313106A (en) * 2021-04-14 2021-08-27 深圳市睿达科技有限公司 Feeding deviation rectifying method and device, computer equipment and storage medium
CN114043081B (en) * 2021-11-24 2023-12-22 苏州全视智能光电有限公司 Multi-weld-joint type feature point identification method and system for laser welding
CN114029588A (en) * 2021-11-26 2022-02-11 江苏永大化工设备有限公司 Automatic adjusting system for gas shielded welding process parameters
CN114266974A (en) * 2021-12-23 2022-04-01 福州大学 Automatic positioning welding method based on deep learning
CN114260547B (en) * 2021-12-23 2022-07-01 山东大学 Narrow-gap rotating arc GTAW tungsten electrode position correction method based on deep learning algorithm
CN114932292B (en) * 2022-05-27 2023-09-26 华南理工大学 Narrow-gap passive vision weld joint tracking method and system
CN114749849B (en) * 2022-06-01 2023-09-01 江苏徐工工程机械研究院有限公司 Welding control method, device and system
CN116117273B (en) * 2023-03-01 2024-04-19 广东工业大学 Self-adaptive feedforward welding deviation correcting device and method
CN117840757A (en) * 2024-02-22 2024-04-09 江苏凯乐金属科技有限公司 Arc welding equipment for aluminum alloy profile

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3743776A (en) * 1970-05-05 1973-07-03 J Angely Device for tracing seams for welding by electron bombardment
CN106271081A (en) * 2016-09-30 2017-01-04 华南理工大学 Three coordinate rectangular robot line laser seam tracking system and trackings thereof
CN109492688A (en) * 2018-11-05 2019-03-19 深圳步智造科技有限公司 Welding seam tracking method, device and computer readable storage medium
CN109604777A (en) * 2017-12-07 2019-04-12 淮安信息职业技术学院 Welding seam traking system and method based on laser structure light
CN110480127A (en) * 2019-08-12 2019-11-22 广东工业大学 A kind of seam tracking system and method based on structured light visual sensing
CN110706266A (en) * 2019-12-11 2020-01-17 北京中星时代科技有限公司 Aerial target tracking method based on YOLOv3
CN210908467U (en) * 2019-07-16 2020-07-03 常州坤达焊接技术有限公司 Welding device for realizing welding seam tracking

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3743776A (en) * 1970-05-05 1973-07-03 J Angely Device for tracing seams for welding by electron bombardment
CN106271081A (en) * 2016-09-30 2017-01-04 华南理工大学 Three coordinate rectangular robot line laser seam tracking system and trackings thereof
CN109604777A (en) * 2017-12-07 2019-04-12 淮安信息职业技术学院 Welding seam traking system and method based on laser structure light
CN109492688A (en) * 2018-11-05 2019-03-19 深圳步智造科技有限公司 Welding seam tracking method, device and computer readable storage medium
CN210908467U (en) * 2019-07-16 2020-07-03 常州坤达焊接技术有限公司 Welding device for realizing welding seam tracking
CN110480127A (en) * 2019-08-12 2019-11-22 广东工业大学 A kind of seam tracking system and method based on structured light visual sensing
CN110706266A (en) * 2019-12-11 2020-01-17 北京中星时代科技有限公司 Aerial target tracking method based on YOLOv3

Also Published As

Publication number Publication date
CN112589232A (en) 2021-04-02


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant