CN113034512B - Weld joint tracking method based on feature segmentation - Google Patents

Weld joint tracking method based on feature segmentation

Info

Publication number
CN113034512B
CN113034512B
Authority
CN
China
Prior art keywords
feature
layer
segmentation
welding
erfnet
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110277763.0A
Other languages
Chinese (zh)
Other versions
CN113034512A (en)
Inventor
柏连发
王业宇
赵壮
韩静
张毅
罗隽
郭卓然
杨傲东
王兴国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Science and Technology filed Critical Nanjing University of Science and Technology
Priority to CN202110277763.0A priority Critical patent/CN113034512B/en
Publication of CN113034512A publication Critical patent/CN113034512A/en
Application granted granted Critical
Publication of CN113034512B publication Critical patent/CN113034512B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20092Interactive image processing based on input by user
    • G06T2207/20104Interactive definition of region of interest [ROI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30152Solder

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Laser Beam Processing (AREA)

Abstract

The invention relates to a weld joint tracking method based on feature segmentation, which comprises the following steps: 1. Collect a molten pool image and perform ROI selection on it. 2. Segment the molten pool image with the Encoder-Decoder segmentation network structure of ERFNet, using the cross entropy loss function commonly used for semantic segmentation, fusing global feature information, and adding a pyramid pooling module to the backbone network. The invention improves the network structure and the loss function of the existing ERFNet: on the network structure, high-level and low-level features are fused at multiple scales with reference to UNet; on the loss function, Focal Loss replaces the cross entropy loss function. Problems such as breaks in the laser stripe center line and excessive deviation of the weld feature points are thereby avoided. The performance of the feature extraction algorithm is effectively improved, the laser stripe center line and the weld feature points of every weld of every layer are accurately extracted, the efficiency of the algorithm is not lost, and the real-time requirement of weld tracking is guaranteed.

Description

Weld joint tracking method based on feature segmentation
Technical Field
The invention relates to a weld joint tracking method based on feature segmentation, and belongs to the technical field of welding automation.
Background
Additive welding is applied not only to groove-filling tasks but also, by welding layer upon layer, to the rapid forming of solid parts, enabling rapid manufacture of parts with complex geometries and structures by exploiting its short processing cycle, high forming speed and high material utilization. With the growing application of additive manufacturing technology in important fields such as aerospace, national defense and military industry, automobile manufacturing, electronic products and biomedicine, the demand for rapid forming of metal parts is increasing, so the application of weld tracking technology to the rapid forming of metal parts is of important strategic significance.
As learned from the large-groove additive weld tracking experiment, only the laser stripe contour line is effective information in the laser stripe images collected by the weld tracking system, so a positive/negative sample imbalance arises during ERFNet training. Moreover, because the numbers of welding layers and weld passes in flat plate additive welding are both far greater than in the large-groove filling experiment, the laser stripe features of high-layer welds are even less distinct. Directly applying ERFNet to the flat plate additive weld feature extraction experiment therefore causes problems such as breaks in the laser stripe center line and deviation of the weld feature points. The network structure and loss function of ERFNet need to be improved: multi-scale fusion of high-level and low-level features is performed on the network structure with reference to UNet, and Focal Loss replaces the cross entropy loss function, so that these problems are avoided.
In a deep convolutional neural network, according to receptive field theory, a low-level feature map has higher resolution and contains more position and detail information, but, having undergone fewer convolutions, it carries weaker semantic information and more noise; a high-level feature map carries stronger semantic information, but, after more convolutions, its resolution is very low and its perception of image detail is poor. Given this characteristic of feature maps, adopting multi-scale feature fusion can improve the network's ability to acquire feature information from the image.
Multi-scale feature fusion fuses a low-level feature map rich in spatial features with a high-level feature map rich in semantic information to obtain a new feature map that combines high resolution with strong semantic information; the approach is widely applied in the fields of target detection and semantic segmentation.
The invention improves the ERFNet network structure with reference to the UNet network, adopting multi-scale feature fusion to improve the network's extraction of the laser stripe center line and the weld feature points. Like ERFNet, UNet is an algorithm based on an Encoder-Decoder structure, but UNet splices feature maps of the same size in the Encoder and the Decoder with a Concat operation, so that the network acquires more spatial and semantic information during up-sampling and the segmentation precision is improved.
The present method instead realizes multi-scale feature fusion by directly adding the corresponding feature maps: superimposing features with a Concat operation increases the number of feature channels, which increases the computation and reduces the speed at which the algorithm extracts the laser center line and the weld feature points.
Disclosure of Invention
In order to solve the technical problems, the invention provides a weld joint tracking method based on feature segmentation, which has the following specific technical scheme:
a weld joint tracking method based on feature segmentation is characterized by comprising the following steps: the method comprises the following steps:
step S1: collecting a molten pool image, and carrying out ROI selection on the molten pool image;
step S2: adopting the Encoder-Decoder segmentation network structure of ERFNet to segment the molten pool image, using the cross entropy loss function commonly used for semantic segmentation, fusing global feature information, and adding a pyramid pooling module to the backbone network; the cross entropy loss function with balance factor $\alpha_t$ is given by:
$$L_{CE} = -\sum_{c=1}^{M} \alpha_t \, y_c \log(p_c)$$
where $\alpha_t = \alpha \in [0,1]$ for a positive sample and $\alpha_t = 1-\alpha$ for a negative sample, $y_c$ is the label of sample $c$, $p_c$ is the probability that sample $c$ is predicted as the positive class, and $M$ is the number of classes;
further, the ROI region in step S1 is a fixed region including the molten pool profile, and the size of the ROI region is 512 pixels × 512 pixels.
Further, in step S2 the network structure is based on SegNet and ENet; the whole model comprises 23 layers, of which layers 1-16 form the Encoder and layers 17-23 form the Decoder.
Further, in step S2 the network structure adopts multi-scale fusion of high-level and low-level features, fusing the feature maps of the 8th and 16th layers, of the 3rd and 17th layers, and of the 2nd and 20th layers in ERFNet.
Further, the frame rate of ERFNet in step S2 is 120 FPS or more.
Further, in step S2 the RMSE and PDE means of higher-layer welds rise relative to lower-layer welds, and the offset between the weld feature points and the manually marked feature points is 1 to 2 pixels.
The invention has the beneficial effects that: the weld feature extraction algorithm is optimized and improved, a multi-scale feature fusion strategy and Focal Loss are introduced into ERFNet, and a flat plate additive weld feature extraction experiment is designed to verify the feasibility of the improved algorithm. The improvements effectively raise the performance of the feature extraction algorithm, the laser stripe center line and the weld feature points of every weld of every layer are accurately extracted, the efficiency of the algorithm is not lost, and the real-time requirement of weld tracking is ensured. The adaptability of the improved weld tracking scheme under actual working conditions is verified by a flat plate additive welding experiment; comparative analysis of the manually welded cube and the cube formed by system welding shows, over repeated experiments, that their forming effects differ very little, proving that the improved weld tracking scheme has high reliability in the task of rapidly forming solid parts.
Drawings
FIG. 1 shows additive manufacturing part forming according to the present invention;
FIG. 2 is a schematic diagram of the UNet network structure used in the present invention;
FIG. 3 is a network structure diagram of the improved ERFNet of the present invention;
FIG. 4 shows the flat plate additive weld data set of the present invention,
in the figure, (a1) third laser stripe pattern of the first layer; (a2) third weld feature map of the first layer;
(b1) second laser stripe pattern of the second layer; (b2) second weld feature map of the second layer;
(c1) fourth laser stripe pattern of the second layer; (c2) fourth weld feature map of the second layer;
(d1) third laser stripe pattern of the third layer; (d2) third weld feature map of the third layer;
(e1) second laser stripe pattern of the fourth layer; (e2) second weld feature map of the fourth layer;
(f1) second laser stripe pattern of the fifth layer; (f2) second weld feature map of the fifth layer;
FIG. 5 shows the segmentation results for each layer according to the present invention,
in the figure, a denotes the input test image; b denotes the output result image;
(1-1)-(1-4) laser stripe patterns and feature segmentation maps of the four welding seams of the first layer;
(2-1)-(2-4) laser stripe patterns and feature segmentation maps of the four welding seams of the second layer;
(3-1)-(3-4) laser stripe patterns and feature segmentation maps of the four welding seams of the third layer;
(4-1)-(4-4) laser stripe patterns and feature segmentation maps of the four welding seams of the fourth layer;
(5-1)-(5-4) laser stripe patterns and feature segmentation maps of the four welding seams of the fifth layer;
FIG. 6 is a diagram of the weld pass planning and weld pass numbering for the flat plate additive welding experiment of the present invention;
FIG. 7 is a comparison of the cube surface topography after welding of each layer according to the present invention,
in the figure, (a1) first-layer surface topography of the standard workpiece; (b1) first-layer surface topography of the experimental workpiece;
(a2) second-layer surface topography of the standard workpiece; (b2) second-layer surface topography of the experimental workpiece;
(a3) third-layer surface topography of the standard workpiece; (b3) third-layer surface topography of the experimental workpiece;
(a4) fourth-layer surface topography of the standard workpiece; (b4) fourth-layer surface topography of the experimental workpiece;
(a5) fifth-layer surface topography of the standard workpiece; (b5) fifth-layer surface topography of the experimental workpiece.
Detailed Description
The present invention will now be described in further detail with reference to the accompanying drawings. These drawings are simplified schematic views illustrating only the basic structure of the present invention in a schematic manner, and thus show only the constitution related to the present invention.
As shown in fig. 1 to 3, the weld tracking method based on feature segmentation of the present invention is specifically as follows:
a weld joint tracking method based on feature segmentation comprises the following steps:
step S1: collecting a molten pool image, and carrying out ROI (region of interest) selection on the molten pool image; the ROI region is a fixed region containing the molten pool contour, and its size is 512 pixels by 512 pixels.
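As an illustration of step S1, a minimal ROI-cropping sketch in Python follows. The ROI position `top_left` is a hypothetical calibration parameter: the patent fixes only the ROI size, not its coordinates in the camera frame.

```python
import numpy as np

def crop_roi(image: np.ndarray, top_left: tuple = (0, 0), size: int = 512) -> np.ndarray:
    """Crop the fixed 512x512 ROI that contains the molten pool contour.

    `top_left` (row, col) is an assumed calibration value: the patent
    specifies a fixed 512x512 ROI but not where it sits in the frame.
    """
    row, col = top_left
    roi = image[row:row + size, col:col + size]
    if roi.shape[:2] != (size, size):
        raise ValueError("ROI exceeds image bounds; adjust top_left")
    return roi
```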
Step S2: adopting the Encoder-Decoder segmentation network structure of ERFNet to segment the molten pool image, using the cross entropy loss function commonly used for semantic segmentation, fusing global feature information, and adding a pyramid pooling module to the backbone network; the cross entropy loss function with balance factor $\alpha_t$ is:
$$L_{CE} = -\sum_{c=1}^{M} \alpha_t \, y_c \log(p_c)$$
where $\alpha_t = \alpha \in [0,1]$ for a positive sample and $\alpha_t = 1-\alpha$ for a negative sample, $y_c$ is the label of sample $c$, $p_c$ is the probability that sample $c$ is predicted as the positive class, and $M$ is the number of classes. The network structure is based on SegNet and ENet; the whole model comprises 23 layers, of which layers 1-16 form the Encoder and layers 17-23 form the Decoder. The network structure adopts multi-scale fusion of high-level and low-level features: the feature maps of the 8th and 16th layers, of the 3rd and 17th layers, and of the 2nd and 20th layers in ERFNet are fused. The frame rate of ERFNet is above 120 FPS. The RMSE and PDE means of higher-layer welds rise relative to lower-layer welds, and the offset between the weld feature points and the manually marked feature points is 1 to 2 pixels.
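For reference, a sketch of a pyramid pooling module of the kind added to the backbone is given below in PyTorch. The bin sizes (1, 2, 3, 6) follow the well-known PSPNet design and are an assumption; the patent states only that a pyramid pooling module fusing global feature information is inserted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PyramidPooling(nn.Module):
    """Pyramid pooling: pool the backbone feature map at several grid
    sizes, project each pooled map with a 1x1 convolution, upsample back
    to the input size, and concatenate, fusing global context into the
    local features."""

    def __init__(self, in_channels: int, bins=(1, 2, 3, 6)):
        super().__init__()
        out_channels = in_channels // len(bins)
        self.stages = nn.ModuleList([
            nn.Sequential(
                nn.AdaptiveAvgPool2d(b),
                nn.Conv2d(in_channels, out_channels, kernel_size=1, bias=False),
                nn.BatchNorm2d(out_channels),
                nn.ReLU(inplace=True),
            )
            for b in bins
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h, w = x.shape[2:]
        pooled = [
            F.interpolate(stage(x), size=(h, w), mode="bilinear", align_corners=False)
            for stage in self.stages
        ]
        # Output channels: in_channels plus one projected map per bin.
        return torch.cat([x] + pooled, dim=1)
```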
The technical effects of the present invention are verified by the following examples:
plate additive weld joint feature extraction algorithm based on improved ERFNet
As learned from the large-groove additive weld tracking experiment, only the laser stripe contour line is effective information in the laser stripe images collected by the weld tracking system, so a positive/negative sample imbalance arises during ERFNet training. Moreover, because the numbers of welding layers and weld passes in flat plate additive welding are both far greater than in the large-groove filling experiment, the laser stripe features of high-layer welds are even less distinct, so directly applying ERFNet to the flat plate additive weld feature extraction experiment causes problems such as breaks in the laser stripe center line and deviation of the weld feature points. To avoid these problems, the invention improves the network structure and the loss function of ERFNet: multi-scale fusion of high-level and low-level features is performed on the network structure with reference to UNet, and Focal Loss replaces the cross entropy loss function.
(1) Multi-scale feature fusion
In a deep convolutional neural network, according to receptive field theory, a low-level feature map has higher resolution and contains more position and detail information, but, having undergone fewer convolutions, it carries weaker semantic information and more noise; a high-level feature map carries stronger semantic information, but, after more convolutions, its resolution is very low and its perception of image detail is poor. Given this characteristic of feature maps, adopting multi-scale feature fusion can improve the network's ability to acquire feature information from the image.
Multi-scale feature fusion fuses a low-level feature map rich in spatial features with a high-level feature map rich in semantic information to obtain a new feature map that combines high resolution with strong semantic information; the approach is widely applied in the fields of target detection and semantic segmentation.
The invention improves the ERFNet network structure with reference to the UNet network, adopting multi-scale feature fusion to improve the network's extraction of the laser stripe center line and the weld feature points. The UNet network structure is shown in fig. 2. Like ERFNet, UNet is an algorithm based on an Encoder-Decoder structure, but UNet splices feature maps of the same size in the Encoder and the Decoder with a Concat operation, so that the network acquires more spatial and semantic information during up-sampling and the segmentation precision is improved.
Following UNet's fusion of corresponding-size feature maps, the feature maps of the 8th and 16th layers, of the 3rd and 17th layers, and of the 2nd and 20th layers in ERFNet are fused; the network structure is shown in fig. 3. Unlike UNet, which fuses feature maps with a Concat operation, the present method realizes multi-scale feature fusion by directly adding the corresponding feature maps: superimposing features with Concat increases the number of feature channels, which increases the computation and reduces the speed at which the algorithm extracts the laser center line and the weld feature points. The corresponding feature maps are therefore added directly in order to guarantee the real-time performance and reliability of the algorithm.
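A minimal sketch contrasting the two fusion modes, assuming PyTorch tensors in NCHW layout; the function names are illustrative:

```python
import torch

def fuse_add(decoder_feat: torch.Tensor, encoder_feat: torch.Tensor) -> torch.Tensor:
    # Element-wise addition: the channel count is unchanged, so the
    # convolutions that follow keep their original cost.
    assert decoder_feat.shape == encoder_feat.shape
    return decoder_feat + encoder_feat

def fuse_concat(decoder_feat: torch.Tensor, encoder_feat: torch.Tensor) -> torch.Tensor:
    # UNet-style Concat: the channel count doubles, so the convolutions
    # that follow see twice the input width and cost more compute.
    return torch.cat([decoder_feat, encoder_feat], dim=1)
```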
(2)Focal Loss
Focal Loss was proposed in the field of target detection, mainly to solve the extreme imbalance of positive and negative samples there. In the seam tracking task of the present method, only the laser stripes in the images collected by the laser vision system belong to the positive sample, so the positive/negative imbalance exists here as well. To improve the algorithm's ability to extract laser stripe features, the cross entropy loss function adopted in ERFNet can therefore be replaced by Focal Loss.
Focal Loss is an improvement on the traditional cross entropy loss function. With the conventional cross entropy loss, the larger the output probability for a positive sample the smaller the loss, and the smaller the output probability for a negative sample the smaller the loss; but under extreme positive/negative imbalance the cross entropy loss may fail to optimize in the iterative process. To solve this problem, a balance factor $\alpha_t$ is first introduced: $\alpha_t = \alpha \in [0,1]$ for positive samples and $\alpha_t = 1-\alpha$ for negative samples. The cross entropy loss function with balance factor $\alpha_t$ is as follows:
$$L_{CE} = -\sum_{c=1}^{M} \alpha_t \, y_c \log(p_c)$$
The balance factor $\alpha_t$ balances the uneven proportion of positive and negative samples well, but it cannot address the problem of easy versus hard samples. A parameter $\gamma$ is therefore introduced on top of the balanced cross entropy loss, making the model concentrate more on hard, misclassified samples. The mathematical expression of Focal Loss is thus as follows:
$$L_{FL} = -\sum_{c=1}^{M} \alpha_t \, (1 - p_c)^{\gamma} \, y_c \log(p_c)$$
the parameter γ is an over parameter for adjusting the rate of weight reduction of simple samples, when γ =0, focalLoss is converted into a conventional cross entropy loss function, and when the value of γ increases, the influence of the modulation factor increases, so that the loss of easily classified samples decreases.
Training ERFNet with Focal Loss reduces the weight of the large number of easy negative samples during training, which is very important for the application scenario of the present invention.
(II) Flat plate additive weld feature extraction experiment
The experiment uses the ERFNet-based laser vision weld tracking system and its equipment and scheme to collect and produce the flat plate additive weld laser data set. Part of the data set is shown in fig. 4.
Compared with the large-groove additive weld data set, the laser stripes in flat plate additive welding are flatter in shape; as the numbers of welding layers and weld passes increase, the laser stripe intensity weakens at the image edges, and the weld feature points become fewer and less distinct, placing higher demands on the feature extraction algorithm. To test the performance of the improved ERFNet, the laser stripe patterns of all welds of every layer were verified, with the same network test environment configuration as the ERFNet-based laser vision weld tracking system. Part of the test set results are shown in fig. 5.
Consider first the network's segmentation of the laser stripe center line: as can be seen from fig. 5, the improved ERFNet achieves an ideal segmentation effect on the weld laser stripes of every layer, with no breaks at key positions. Consider next the core task in weld tracking, extraction of the weld feature point regions: although this experiment has fewer feature points and a more unbalanced positive/negative sample distribution than the large-groove experiment, the improved ERFNet achieves a good segmentation effect for every layer of welds, and the extracted feature point region positions are consistent with the actual results. The improvements to the ERFNet network structure and loss function thus raise the performance of the feature extraction algorithm; the frame rate of the improved ERFNet reaches more than 120 FPS, the introduction of multi-scale feature fusion costs no efficiency, and the real-time requirement of weld tracking is still met. The improved feature extraction algorithm therefore fully meets the precision and time requirements of intelligent welding.
In order to accurately evaluate the improved ERFNet performance, the RMSE is adopted to evaluate the extraction effect of the laser stripe center line, and the PDE is adopted to evaluate the accuracy of the weld characteristic points. The error values for each weld of each layer are shown in table 1.
Layer    Mean RMSE    Mean PDE
1        0.6181       2.2035
2        0.6273       2.6789
3        0.6199       2.2414
4        0.6623       2.5516
5        0.6557       2.4439
TABLE 1 RMSE and PDE between segmentation and label maps for the improved ERFNet (layer-wise means)
As can be seen from Table 1, the layer-wise mean RMSE values are 0.6181, 0.6273, 0.6199, 0.6623 and 0.6557, and the mean PDE values are 2.2035, 2.6789, 2.2414, 2.5516 and 2.4439. Compared with the first-layer welds, the mean RMSE and PDE of the higher-layer welds increase to different degrees, showing that the segmentation error of the laser stripe pattern grows as the number of welding layers increases. The mean PDE of the first-layer welds is 2.2035, indicating that the offset between the weld feature points obtained by the improved algorithm and the manually marked feature points is about two pixels, so the improved feature extraction algorithm accurately obtains the feature points of the first-layer welds. Meanwhile, the table shows that the RMSE and PDE of the higher-layer welds increase by less than 0.0442 and 0.4754 respectively relative to the first layer, indicating that although the segmentation accuracy decreases with the number of welding layers, the additional offset of the weld feature points is less than one pixel, so the improved algorithm achieves ideal segmentation accuracy for the higher-layer welds as well. In conclusion, the improvements to ERFNet raise the performance of the feature extraction algorithm.
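For concreteness, an assumed computation of the two metrics is sketched below; the patent does not give formulas for RMSE or PDE, so the column-wise centerline sampling and the Euclidean point distance here are assumptions.

```python
import numpy as np

def centerline_rmse(pred_rows: np.ndarray, label_rows: np.ndarray) -> float:
    """RMSE between predicted and hand-labeled centerline row
    coordinates, sampled column by column (assumed definition)."""
    return float(np.sqrt(np.mean((pred_rows - label_rows) ** 2)))

def feature_point_pde(pred_pt: tuple, label_pt: tuple) -> float:
    """Assumed PDE: Euclidean pixel distance between the extracted weld
    feature point and the manually marked feature point."""
    return float(np.hypot(pred_pt[0] - label_pt[0], pred_pt[1] - label_pt[1]))
```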
In order to verify the feasibility of the improved weld tracking scheme in the field of rapid forming of solid parts, a flat plate additive welding experiment is performed below, and whether the weld tracking scheme meets the requirements of intelligent additive manufacturing is judged by analyzing the deviation between the actual welding points and the manually taught welding points and by comparing the final forming effect.
(III) Flat plate additive weld tracking experiment
The experimental platform is based on the weld tracking system and weld tracking software, and the selected flat plate material is Q235 steel. Since the feasibility of the weld tracking system in the field of rapid forming of solid parts needs to be verified, part forming is simulated by welding a cube additively. In this experiment a flat plate is used as the base plate, the number of welding layers is 5, and the number of weld passes per layer is 5, for 25 passes in total. The specific bead layout and bead numbering are shown in fig. 6.
According to the experimental scheme, the weld tracking system and weld tracking software are combined to carry out the flat plate additive weld tracking experiment. As in the large-groove welding experiment, the reliability of the improved algorithm and system in the rapid part forming task is verified by comparing a standard workpiece with an experimental workpiece.
First, consider the error between the taught welding points and the welding points calculated by the welding system. According to the welding scheme shown in fig. 6, the three-dimensional coordinates of part of the bead start and end welding points were selected and subjected to error analysis. The measurement results are shown in Tables 2 and 3.
TABLE 2 Three-dimensional coordinates and errors of the initial welding points in flat plate additive welding
TABLE 3 Three-dimensional coordinates and errors of the end welding points in flat plate additive welding
According to the experimental data recorded in the tables, although flat plate additive welding involves more welding layers and weld passes, the deviation in each coordinate dimension between the taught welding points and the welding points obtained by the algorithm does not exceed 1.00 mm, showing that the improved weld tracking scheme meets the high-precision requirement of the rapid part forming task.
Second, consider the forming effect of the cube during welding. According to the welding scheme shown in fig. 6, the height of the cube was measured after each layer was welded and the height values of the experimental workpiece and the standard workpiece were compared; the measurement results are shown in Table 4.
TABLE 4 Height of the welded workpiece after each layer in flat plate additive welding
The measured data in the table show that the error values accumulate to some extent as the numbers of welding layers and weld beads increase, but the maximum error does not exceed 0.90 mm, indicating that an ideal forming effect is achieved in the actual welding process.
While recording the cube height after each layer was welded, the surface topography of each layer of the experimental and standard workpieces was compared, as shown in fig. 7. The per-layer appearance comparison shows intuitively that there is no obvious difference between the forming effect of the experimental workpiece and that of the standard workpiece; combined with the data and analysis of Table 4, the improved weld tracking system meets the high-precision and real-time requirements of the field of rapid forming of solid parts.
To verify the feasibility of the weld tracking scheme in the field of rapid forming of solid parts, a flat plate additive weld tracking experiment was carried out. Addressing the problems in this experiment of more complex laser stripe profiles, unbalanced proportions of positive and negative samples, and larger numbers of welding layers and weld passes, the invention optimizes and improves the weld feature extraction algorithm by introducing a multi-scale feature fusion strategy and Focal Loss into ERFNet, and designs a flat plate additive weld feature extraction experiment to verify the feasibility of the improved algorithm. The results show that the adopted improvements not only effectively raise the performance of the feature extraction algorithm and accurately extract the laser stripe center line and the weld feature points of every weld of every layer, but also lose no algorithmic efficiency and guarantee the real-time requirement of weld tracking. A subsequent flat plate additive welding experiment verifies the adaptability of the improved weld tracking scheme under actual working conditions; comparative analysis of the manually welded cube and the cube formed by system welding shows, over repeated experiments, that their forming effects differ very little, proving that the improved weld tracking scheme has high reliability in the task of rapidly forming solid parts.
In light of the foregoing description of the preferred embodiment of the present invention, many modifications and variations will be apparent to those skilled in the art without departing from the spirit and scope of the invention. The technical scope of the present invention is not limited to the content of the specification, and must be determined according to the scope of the claims.

Claims (3)

1. A weld joint tracking method based on feature segmentation, characterized by comprising the following steps:
step S1: collecting a molten pool image, and carrying out ROI (region of interest) selection on the molten pool image;
step S2: adopting the Encoder-Decoder segmentation network structure of ERFNet to segment the molten pool image, the frame rate of ERFNet being more than 120 FPS, the network structure adopting multi-scale fusion of high-level and low-level features, the RMSE and PDE means of higher-layer welds rising relative to lower-layer welds, and the offset between the weld feature points and the manually marked feature points being 1 to 2 pixels; using the cross entropy loss function commonly used for semantic segmentation; fusing the feature maps of the 8th and 16th layers, of the 3rd and 17th layers, and of the 2nd and 20th layers in ERFNet; and adding a pyramid pooling module to the backbone network; the cross entropy loss function with balance factor $\alpha_t$ is given by:
$$L_{CE} = -\sum_{c=1}^{M} \alpha_t \, y_c \log(p_c)$$
where $\alpha_t = \alpha \in [0,1]$ for a positive sample, $\alpha_t = 1-\alpha$ for a negative sample, $y_c$ is the label of sample $c$, $p_c$ is the probability that sample $c$ is predicted as the positive class, and $M$ is the number of classes.
2. The feature segmentation based seam tracking method according to claim 1, wherein: the ROI area in step S1 is a fixed area including the molten pool contour, and the size of the ROI area is 512 pixels × 512 pixels.
3. The feature segmentation based seam tracking method according to claim 1, wherein: in step S2 the network structure is based on SegNet and ENet, and the whole model comprises 23 layers, of which layers 1-16 form the Encoder and layers 17-23 form the Decoder.
CN202110277763.0A 2021-03-15 2021-03-15 Weld joint tracking method based on feature segmentation Active CN113034512B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110277763.0A CN113034512B (en) 2021-03-15 2021-03-15 Weld joint tracking method based on feature segmentation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110277763.0A CN113034512B (en) 2021-03-15 2021-03-15 Weld joint tracking method based on feature segmentation

Publications (2)

Publication Number Publication Date
CN113034512A CN113034512A (en) 2021-06-25
CN113034512B true CN113034512B (en) 2022-11-11

Family

ID=76470691

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110277763.0A Active CN113034512B (en) 2021-03-15 2021-03-15 Weld joint tracking method based on feature segmentation

Country Status (1)

Country Link
CN (1) CN113034512B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102021117714A1 (en) 2021-07-08 2023-01-12 Endress+Hauser SE+Co. KG Automatic seam detection for a welding process
CN115121913B (en) * 2022-08-30 2023-01-10 北京博清科技有限公司 Method for extracting laser central line

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10452980B1 (en) * 2019-01-25 2019-10-22 StradVision, Inc. Learning method and learning device for extracting feature from input image by using convolutional layers in multiple blocks in CNN, resulting in hardware optimization which allows key performance index to be satisfied, and testing method and testing device using the same
CN111985274A (en) * 2019-05-23 2020-11-24 中国科学院沈阳自动化研究所 Remote sensing image segmentation algorithm based on convolutional neural network
CN112381095A (en) * 2021-01-15 2021-02-19 南京理工大学 Electric arc additive manufacturing layer width active disturbance rejection control method based on deep learning

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10452980B1 (en) * 2019-01-25 2019-10-22 StradVision, Inc. Learning method and learning device for extracting feature from input image by using convolutional layers in multiple blocks in CNN, resulting in hardware optimization which allows key performance index to be satisfied, and testing method and testing device using the same
CN111985274A (en) * 2019-05-23 2020-11-24 中国科学院沈阳自动化研究所 Remote sensing image segmentation algorithm based on convolutional neural network
CN112381095A (en) * 2021-01-15 2021-02-19 南京理工大学 Electric arc additive manufacturing layer width active disturbance rejection control method based on deep learning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Multi-scale adversarial network image semantic segmentation algorithm based on a weighted loss function; Zhang Hongzhao et al.; Computer Applications and Software; 2020-01-12 (No. 01); full text *

Also Published As

Publication number Publication date
CN113034512A (en) 2021-06-25

Similar Documents

Publication Publication Date Title
CN113034512B (en) Weld joint tracking method based on feature segmentation
CN110992317B (en) PCB defect detection method based on semantic segmentation
CN104990926B (en) A kind of TR elements positioning of view-based access control model and defect inspection method
CN111922483B (en) Line structure light welding seam tracking and material adding path deviation rectifying device and method based on learning
CN103870623B (en) Preprocessing simulated analysis template for vehicle model
CN107358636B (en) Loose defect image generation method based on texture synthesis
CN102279190A (en) Image detection method for weld seam surface defects of laser welded plates of unequal thickness
CN110084124A (en) Feature based on feature pyramid network enhances object detection method
CN109118500A (en) A kind of dividing method of the Point Cloud Data from Three Dimension Laser Scanning based on image
CN110728667A (en) Automatic and accurate cutter wear loss measuring method based on gray level image probability
CN106887020A (en) A kind of road vertical and horizontal section acquisition methods based on LiDAR point cloud
CN103383775B (en) A kind of Remote Sensing Image Fusion effect evaluation method
CN104766333A (en) Vehicle door point welding robot path correction method based on stereoscopic vision
CN115018827B (en) Automatic detection method for quality of building material weld joint
CN112819066A (en) Res-UNet single tree species classification technology
CN112419237B (en) Deep learning-based automobile clutch master cylinder groove surface defect detection method
CN110440761B (en) Processing method of aerial photogrammetry data of unmanned aerial vehicle
CN111080621A (en) Method for identifying railway wagon floor damage fault image
CN104715109B (en) A kind of automobile body-in-white solder joint Automated Partition Method interfered based on ball
CN115546125A (en) Method for error detection and track deviation correction of additive manufacturing cladding layer based on point cloud information
CN104050640A (en) Multi-view dense point cloud data fusion method
Sun et al. Geographic, geometrical and semantic reconstruction of urban scene from high resolution oblique aerial images.
CN106446475A (en) Method and device for extracting welding point information of vehicle body in white
CN110153582A (en) Welding scheme generation method, device and welding system
CN101976452A (en) Integrated filtering method of airborne laser scanning spot clouds based on contour line cluster analysis

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant