CN111637847A - Welding seam parameter measuring method and device - Google Patents
Welding seam parameter measuring method and device
- Publication number
- CN111637847A (application number CN202010446559.2A)
- Authority
- CN
- China
- Prior art keywords
- edge point
- coordinate data
- welding seam
- point
- weld
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30152—Solder
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
Abstract
The invention provides a weld seam parameter measuring method and device, relating to welding technology. Weld seam data are acquired, and the shape of the weld seam is obtained from the weld seam data; a first labeled picture is acquired, the first labeled picture being obtained by labeling the left-end edge point, the lowest point and the right-end edge point of the weld seam shape; keypoint recognition is performed on the first labeled picture according to a preset CNN recognition model to acquire the coordinate data corresponding to the left-end edge point, the lowest point and the right-end edge point; and the weld seam parameter data are acquired from the coordinate data of the left-end edge point, the lowest point and the right-end edge point. By identifying and measuring the key points of the weld seam to acquire the weld seam parameter data, the method and device keep the measurement procedure simple and the cost low, and improve measurement efficiency.
Description
Technical Field
The invention relates to welding technology, and in particular to a weld seam parameter measuring method and device.
Background
With the development of technology, automated operation is increasingly common. For the automatic welding of round pipes, a welding robot may be used, and the welding robot must identify the weld seam in order to complete the welding operation accurately.
At present, weld seam parameters are commonly measured with laser measurement technology: a laser is scanned from one end of the pipe to the other, and the parameter data of the weld seam, such as its width and depth, are calculated from differences in the laser-measured data in the image. However, this laser measurement method generally requires long scanning times, which reduces production efficiency; the measurable height range of the laser is limited, so the laser position usually has to be adjusted for different pipe diameters, which requires additional external mechanics and makes the measuring device structurally complex; and the laser itself is expensive, so the overall cost is high.
The prior art therefore suffers from a complex weld seam parameter measurement process and high cost.
Disclosure of Invention
Embodiments of the invention provide a weld seam parameter measuring method and device, which improve measurement efficiency, reduce the complexity of weld seam parameter measurement and lower the cost.
In a first aspect of the embodiments of the invention, a weld seam parameter measuring method is provided, including:
acquiring weld seam data, and obtaining the shape of a weld seam from the weld seam data;
acquiring a first labeled picture, the first labeled picture being obtained by labeling a left-end edge point, a lowest point and a right-end edge point of the weld seam shape;
performing keypoint recognition on the first labeled picture according to a preset CNN recognition model to acquire coordinate data corresponding to the left-end edge point, the lowest point and the right-end edge point;
and acquiring weld seam parameter data from the coordinate data of the left-end edge point, the lowest point and the right-end edge point.
Optionally, in a possible implementation of the first aspect, after the acquiring of the first labeled picture, the method further includes:
expanding the first labeled picture to obtain a plurality of second labeled pictures;
and the performing of keypoint recognition on the first labeled picture according to the preset CNN recognition model to acquire the coordinate data corresponding to the left-end edge point, the lowest point and the right-end edge point correspondingly includes:
performing keypoint recognition on the second labeled pictures according to the preset CNN recognition model to obtain the coordinate data corresponding to the left-end edge point, the lowest point and the right-end edge point.
Optionally, in a possible implementation of the first aspect, after the expanding of the first labeled picture to obtain the plurality of second labeled pictures, the method further includes:
performing pixel scaling on the plurality of second labeled pictures according to a preset pixel value to obtain a plurality of scaled images;
performing grayscale conversion on the plurality of scaled images to obtain a plurality of third labeled pictures;
and correspondingly, the performing of keypoint recognition on the first labeled picture according to the preset CNN recognition model to acquire the coordinate data corresponding to the left-end edge point, the lowest point and the right-end edge point includes:
performing keypoint recognition on the third labeled pictures according to the preset CNN recognition model to obtain the coordinate data corresponding to the left-end edge point, the lowest point and the right-end edge point.
Optionally, in a possible implementation of the first aspect, the preset CNN recognition model includes an input layer, a hidden layer and an output layer, the input layer being adjacent to the hidden layer and the hidden layer being adjacent to the output layer;
wherein the number of input neurons of the input layer is 1, and the number of output neurons of the output layer is 6.
Optionally, in a possible implementation of the first aspect, the performing of keypoint recognition on the first labeled picture according to the preset CNN recognition model to acquire the coordinate data corresponding to the left-end edge point, the lowest point and the right-end edge point includes:
receiving the first labeled picture at the input layer, and outputting 6 data values at the output layer;
and acquiring the 3 coordinate data pairs corresponding to the left-end edge point, the lowest point and the right-end edge point from the 6 data values.
Optionally, in a possible implementation of the first aspect, the weld seam parameter data include a weld groove depth;
and the acquiring of the weld seam parameter data from the coordinate data of the left-end edge point, the lowest point and the right-end edge point includes:
processing the coordinate data of the left-end edge point, the lowest point and the right-end edge point according to a first strategy to obtain the weld groove depth, the first strategy being:
T1=(Z2+Z3-2*Z1)/2
wherein T1 represents the weld groove depth, Z1 represents the ordinate of the lowest-point coordinate data, Z2 represents the ordinate of the left-end edge point coordinate data, and Z3 represents the ordinate of the right-end edge point coordinate data.
Optionally, in a possible implementation of the first aspect, the weld seam parameter data include a weld groove width;
and the acquiring of the weld seam parameter data from the coordinate data of the left-end edge point, the lowest point and the right-end edge point includes:
processing the coordinate data of the left-end edge point, the lowest point and the right-end edge point according to a second strategy to obtain the weld groove width, the second strategy being:
T2=Z3-Z2
wherein T2 represents the weld groove width, Z2 represents the ordinate of the left-end edge point coordinate data, and Z3 represents the ordinate of the right-end edge point coordinate data.
Optionally, in a possible implementation of the first aspect, the weld seam parameter data include a left groove angle of the weld seam;
and the acquiring of the weld seam parameter data from the coordinate data of the left-end edge point, the lowest point and the right-end edge point includes:
processing the coordinate data of the left-end edge point, the lowest point and the right-end edge point according to a third strategy to obtain the left groove angle of the weld seam, the third strategy being:
T3=atan((Z1-Z2)/(X1-X2))
wherein T3 represents the left groove angle of the weld seam, Z1 represents the ordinate of the lowest-point coordinate data, Z2 represents the ordinate of the left-end edge point coordinate data, X1 represents the abscissa of the lowest-point coordinate data, and X2 represents the abscissa of the left-end edge point coordinate data.
Optionally, in a possible implementation of the first aspect, the weld seam parameter data include a right groove angle of the weld seam;
and the acquiring of the weld seam parameter data from the coordinate data of the left-end edge point, the lowest point and the right-end edge point includes:
processing the coordinate data of the left-end edge point, the lowest point and the right-end edge point according to a fourth strategy to obtain the right groove angle of the weld seam, the fourth strategy being:
T4=atan((Z1-Z3)/(X1-X3))
wherein T4 represents the right groove angle of the weld seam, Z1 represents the ordinate of the lowest-point coordinate data, Z3 represents the ordinate of the right-end edge point coordinate data, X1 represents the abscissa of the lowest-point coordinate data, and X3 represents the abscissa of the right-end edge point coordinate data.
In a second aspect of the embodiments of the invention, a weld seam parameter measuring device is provided, including:
a scanning module, configured to acquire weld seam data and obtain the shape of a weld seam from the weld seam data;
a labeling module, configured to acquire a first labeled picture, the first labeled picture being obtained by labeling the left-end edge point, the lowest point and the right-end edge point of the weld seam shape;
a recognition module, configured to perform keypoint recognition on the first labeled picture according to a preset CNN recognition model and acquire coordinate data corresponding to the left-end edge point, the lowest point and the right-end edge point;
and a parameter module, configured to acquire weld seam parameter data from the coordinate data of the left-end edge point, the lowest point and the right-end edge point.
In a third aspect of the embodiments of the invention, a weld seam parameter measuring device is provided, including a memory, a processor and a computer program, the computer program being stored in the memory, and the processor running the computer program to perform the method of the first aspect of the invention and its various possible implementations.
A fourth aspect of the embodiments of the invention provides a readable storage medium in which a computer program is stored, the computer program, when executed by a processor, implementing the method of the first aspect of the invention and its various possible implementations.
According to the weld seam parameter measuring method, weld seam data are acquired, and the shape of the weld seam is obtained from the weld seam data; a first labeled picture is acquired, the first labeled picture being obtained by labeling the left-end edge point, the lowest point and the right-end edge point of the weld seam shape; keypoint recognition is performed on the first labeled picture according to a preset CNN recognition model to acquire the coordinate data corresponding to the left-end edge point, the lowest point and the right-end edge point; and the weld seam parameter data are acquired from the coordinate data of the left-end edge point, the lowest point and the right-end edge point. By identifying and measuring the key points of the weld seam to acquire the weld seam parameter data, the method and device keep the measurement procedure simple and the cost low, and improve measurement efficiency.
Drawings
FIG. 1 is a schematic flowchart of a weld seam parameter measuring method according to an embodiment of the invention;
FIG. 2 is a schematic structural diagram of a shooting device according to an embodiment of the invention;
FIG. 3 is a schematic view of a V-shaped weld seam according to an embodiment of the invention;
FIG. 4 is a schematic view of a labeled V-shaped weld seam according to an embodiment of the invention;
FIG. 5 is a schematic structural diagram of a weld seam parameter measuring device according to an embodiment of the invention;
FIG. 6 is a schematic hardware structure diagram of a weld seam parameter measuring device according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein.
It should be understood that, in various embodiments of the present invention, the sequence numbers of the processes do not mean the execution sequence, and the execution sequence of the processes should be determined by the functions and the internal logic of the processes, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
It should be understood that in the present application, "comprising" and "having" and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
As used herein, "if" may, depending on the context, be interpreted as "upon", "when", "in response to determining" or "in response to detecting".
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Referring to FIG. 1, a schematic flowchart of a weld seam parameter measuring method according to an embodiment of the invention, the execution subject of the method shown in FIG. 1 may be a software and/or hardware device, including but not limited to at least one of: user equipment, network equipment, and the like. The user equipment may include, but is not limited to, a computer, a smart phone, a personal digital assistant (PDA), the electronic equipment mentioned above, and the like. The network equipment may include, but is not limited to, a single network server, a server group of multiple network servers, or a cloud of numerous computers or network servers based on cloud computing, where cloud computing is a type of distributed computing in which a super virtual computer consists of a cluster of loosely coupled computers. This embodiment places no limitation on the execution subject. The weld seam parameter measuring method includes the following steps S101 to S104:
S101, acquiring weld seam data, and obtaining the shape of the weld seam from the weld seam data.
Specifically, in order to acquire data on the key points of the weld seam, the shape data of the weld seam need to be acquired first. For example, by processing the weld seam data, the obtained weld seam shape is a V shape. It should be noted that this scheme is suited to measuring V-shaped weld seams.
In practical application, reference is made to FIG. 2, a schematic structural diagram of a shooting device according to an embodiment of the invention, in which a line laser sensor 22 is arranged at the front end of a shooting robot 21; the line laser sensor 22 may be a mainstream red line-scanning laser transmitter available on the market. The workpiece may be placed on the material table 23. The device can be used to scan the weld seam of the workpiece; it scans the workpiece simply and covers a wide scannable range.
In practical application, a laser sensing device can be used to acquire the weld seam data. Mainstream line-scanning laser data on the market consist of point pairs (X, Z) of the horizontal offset value X and the vertical offset value Z between the object surface and the laser center point, and the resolution of a line-scanning laser is usually 500 to 1000 points. Referring to FIG. 3, a schematic view of a V-shaped weld seam according to an embodiment of the invention is shown.
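To make the data format above concrete, the sketch below synthesizes such a profile as a list of (X, Z) point pairs for an idealized V-groove. This is an illustration of the data representation only, not part of the patent; the point count and groove dimensions are arbitrary assumptions.

```python
# Illustrative only: a line-scan laser profile as (X, Z) point pairs, where X is the
# horizontal offset and Z the vertical offset from the laser center point.

def synthesize_v_profile(n_points=500, half_width=5.0, depth=3.0):
    """Generate (x, z) pairs for an idealized V-shaped weld groove."""
    profile = []
    for i in range(n_points):
        x = -half_width + 2 * half_width * i / (n_points - 1)
        z = depth * abs(x) / half_width   # z = 0 at the nadir, rising toward both edges
        profile.append((x, z))
    return profile

profile = synthesize_v_profile()
nadir = min(profile, key=lambda p: p[1])  # sanity check: the groove's lowest point
```

A resolution of 500 points, as used here, is at the low end of the 500-1000 range mentioned above.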
S102, acquiring a first labeled picture, the first labeled picture being obtained by labeling the left-end edge point, the lowest point and the right-end edge point of the weld seam shape.
Specifically, after the picture of the round pipe is acquired, the first labeled picture may be obtained by processing the V-shaped weld seam shape acquired in step S101, for example by labeling its key points.
The key points may be the left-end edge point, the lowest point and the right-end edge point of the V-shaped weld seam shape; once the measurement data of these three points are obtained, the key points of the V-shaped weld seam shape can be located, so that its specific shape can be determined.
Exemplarily, reference is made to FIG. 4, a schematic view of a labeled V-shaped weld seam according to an embodiment of the invention. The left-end edge point 41, the lowest point 42 and the right-end edge point 43 in the figure are the key points of the V-shaped weld seam. Labeling of the left-end edge point 41, the lowest point 42 and the right-end edge point 43 may be carried out with labeling software such as labelme or VIA.
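A keypoint annotation of the kind such tools produce can be sketched as a small JSON record. The field names below loosely follow labelme's point annotations, but the exact schema varies by tool and version, and the file name, label names and pixel coordinates are hypothetical:

```python
# Hypothetical keypoint annotation for one weld profile image (schema and values
# are illustrative assumptions, not taken from the patent or a specific tool version).
import json

annotation = {
    "imagePath": "weld_profile_001.png",   # assumed file name
    "shapes": [
        {"label": "left_edge",  "shape_type": "point", "points": [[12.0, 40.0]]},
        {"label": "nadir",      "shape_type": "point", "points": [[120.0, 180.0]]},
        {"label": "right_edge", "shape_type": "point", "points": [[228.0, 38.0]]},
    ],
}

# Round-trip through JSON, as a labeling tool would when saving/loading the file.
restored = json.loads(json.dumps(annotation))
```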
S103, performing keypoint recognition on the first labeled picture according to a preset CNN recognition model, and acquiring the coordinate data corresponding to the left-end edge point, the lowest point and the right-end edge point.
Specifically, after the first labeled picture is received, it may be processed with the preset CNN recognition model to obtain the coordinate data of the left-end edge point 41, the lowest point 42 and the right-end edge point 43 in the first labeled picture, so that the weld seam parameters can be measured from the coordinate data.
The preset CNN recognition model includes an input layer, a hidden layer and an output layer, the input layer being adjacent to the hidden layer and the hidden layer being adjacent to the output layer; the number of input neurons of the input layer is 1, and the number of output neurons of the output layer is 6.
Specifically, the preset CNN recognition model has 6 layers; the initial input of the input layer is a 240 x 240 grayscale image, and the final output of the output layer is 3 coordinate point pairs (x, y), 6 values in total. The first layer is the convolutional layer conv2d_1 with 32 convolution kernels of size 3 x 3, followed by the activation layer activation_1 with the relu activation function, and then the max-pooling layer max_pooling2d_1 with pooling size 2 x 2. The second layer is the convolutional layer conv2d_2 with 64 convolution kernels of size 3 x 3, followed by the activation layer activation_2 with the relu activation function, and then the max-pooling layer max_pooling2d_2 with pooling size 2 x 2. The third layer is the convolutional layer conv2d_3 with 128 convolution kernels of size 3 x 3, followed by the activation layer activation_3 with the relu activation function, and then the max-pooling layer max_pooling2d_3 with pooling size 2 x 2. After the third layer, when all convolutional layers have been computed, the data are unfolded with a Flatten layer. The fourth layer is the fully connected layer dense_1 with 256 output neurons and the relu activation function, followed by Dropout with a rate of 0.5. The fifth layer is the fully connected layer dense_2 with 6 output neurons and the relu activation function. The sixth layer is a fully connected layer with 6 output neurons, whose output is the final model output.
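The spatial shapes implied by that description can be checked with a short calculation. The sketch below assumes "same" convolution padding (the text does not state the padding), so only the three 2 x 2 max-pooling steps change the spatial size:

```python
# Shape propagation through the described network: 240x240x1 input, three
# conv+pool stages (32, 64, 128 filters), then Flatten. Padding is an assumption.

def feature_shape(size=240):
    channels = 1
    for filters in (32, 64, 128):   # conv2d_1..conv2d_3, each followed by 2x2 pooling
        channels = filters
        size //= 2                  # each max-pooling halves height and width
    return size, size, channels

h, w, c = feature_shape()
flatten_units = h * w * c           # size of the Flatten output feeding dense_1
```

Under these assumptions the Flatten layer emits 30 x 30 x 128 = 115200 values into the 256-unit dense_1 layer.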
Illustratively, obtaining the coordinate data of the left-end edge point 41, the lowest point 42 and the right-end edge point 43 includes receiving the first labeled picture at the input layer, outputting 6 data values at the output layer, and then obtaining the coordinate data corresponding to the left-end edge point 41, the lowest point 42 and the right-end edge point 43 from the 6 data values.
It can be understood that one two-dimensional coordinate requires two data values, so the 6 data values correspond to 3 coordinates, from which the 3 coordinate data pairs can be obtained.
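That pairing step can be sketched as follows; the ordering of the 6 outputs (consecutive pairs, left-end edge point first) is an assumption, since the text does not fix it:

```python
# Pair the network's 6 output values into 3 (x, y) keypoint coordinates.

def pair_outputs(outputs):
    if len(outputs) != 6:
        raise ValueError("expected exactly 6 network outputs")
    # Assumed ordering: (left-end edge, lowest point, right-end edge), consecutive pairs.
    return [(outputs[i], outputs[i + 1]) for i in range(0, 6, 2)]

coords = pair_outputs([12.0, 40.0, 120.0, 180.0, 228.0, 38.0])
```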
In practical application, before keypoint recognition is performed on the first labeled picture according to the preset CNN recognition model, an initial CNN recognition model must be trained by gradient descent to obtain the preset CNN recognition model.
It will be appreciated that the initial model is trained before it is used. In this scheme a gradient descent method is used to train the network, with the following parameters: the loss function is the mean squared error (MSE), the optimizer is Adam, the learning rate is 0.002, and the number of iterations is 500. After training is completed, the preset CNN recognition model is obtained.
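The training setup (MSE loss, Adam optimizer, learning rate 0.002, 500 iterations) can be illustrated on a toy one-parameter problem. This is a generic textbook Adam update, not the patent's training code, and the toy loss is an arbitrary assumption:

```python
# Minimal Adam optimizer demonstration with the stated hyperparameters
# (lr=0.002, 500 steps) on the squared-error loss (x - 0.5)^2.
import math

def adam_minimize(grad, x0, lr=0.002, steps=500, b1=0.9, b2=0.999, eps=1e-8):
    x, m, v = x0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = b1 * m + (1 - b1) * g        # first-moment (momentum) estimate
        v = b2 * v + (1 - b2) * g * g    # second-moment estimate
        m_hat = m / (1 - b1 ** t)        # bias correction
        v_hat = v / (1 - b2 ** t)
        x -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

x_final = adam_minimize(lambda x: 2 * (x - 0.5), x0=0.0)
```

After 500 steps of size at most roughly the learning rate, the parameter has converged near the minimum at 0.5, which gives a feel for why 500 iterations suffices for a well-scaled problem.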
S104, acquiring the weld seam parameter data from the coordinate data of the left-end edge point, the lowest point and the right-end edge point.
Specifically, after the coordinate data are acquired, they may be processed to acquire the corresponding weld seam parameter data.
Taking the weld groove depth, the weld groove width, the left groove angle and the right groove angle as examples of the weld seam parameter data to be obtained, the details are as follows:
obtaining the depth of a weld groove:
processing the coordinate data of the left edge point, the lowest point and the right edge point according to a first strategy to obtain the groove depth of the weld joint, wherein the first strategy is as follows:
T1=(Z2+Z3–2*Z1)/2
wherein T1 represents the weld groove depth, Z1 represents the ordinate of the nadir coordinate data, Z2 represents the ordinate of the left-end edge point coordinate data, and Z3 represents the ordinate of the right-end edge point coordinate data.
Obtaining the width of a welding seam groove:
processing the coordinate data of the left edge point, the lowest point and the right edge point according to a second strategy to obtain the width of the weld groove, wherein the second strategy is as follows:
T2=Z3–Z2
wherein T2 represents the weld groove width, Z2 represents the ordinate of the left edge point coordinate data, and Z3 represents the ordinate of the right edge point coordinate data.
Obtaining a left groove angle of a welding seam:
processing the coordinate data of the left edge point, the lowest point and the right edge point according to a third strategy to obtain the left groove angle of the welding seam, wherein the third strategy is as follows:
T3=atan((Z1-Z2)/(X1-X2))
wherein T3 represents the weld left groove angle, Z1 represents the ordinate of the nadir coordinate data, Z2 represents the ordinate of the left edge point coordinate data, X1 represents the abscissa of the nadir coordinate data, and X2 represents the abscissa of the left edge point coordinate data.
Obtaining the right groove angle of the welding seam:
processing the coordinate data of the left edge point, the lowest point and the right edge point according to a fourth strategy to obtain the right groove angle of the welding seam, wherein the fourth strategy is as follows:
T4=atan((Z1-Z3)/(X1-X3))
wherein T4 represents the weld right groove angle, Z1 represents the ordinate of the nadir coordinate data, Z3 represents the ordinate of the right end edge point coordinate data, X1 represents the abscissa of the nadir coordinate data, and X3 represents the abscissa of the right end edge point coordinate data.
It can be understood that the coordinates are processed according to the first, second, third and fourth strategies to obtain the corresponding weld seam parameter data.
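The four strategies can be collected into one helper function. This is a minimal sketch, not the patent's code; the formulas are implemented exactly as written above, including the groove-width strategy T2, which the text defines from the ordinates Z2 and Z3:

```python
# Compute T1..T4 from the three keypoints, each given as an (X, Z) pair.
import math

def weld_parameters(left, nadir, right):
    x1, z1 = nadir   # lowest point
    x2, z2 = left    # left-end edge point
    x3, z3 = right   # right-end edge point
    t1 = (z2 + z3 - 2 * z1) / 2              # first strategy: groove depth
    t2 = z3 - z2                             # second strategy: groove width, as written
    t3 = math.atan((z1 - z2) / (x1 - x2))    # third strategy: left groove angle
    t4 = math.atan((z1 - z3) / (x1 - x3))    # fourth strategy: right groove angle
    return t1, t2, t3, t4

# Symmetric V-groove: edges at height 3, nadir at the origin.
t1, t2, t3, t4 = weld_parameters((-4.0, 3.0), (0.0, 0.0), (4.0, 3.0))
```

On this symmetric example the depth is 3 and the two groove angles are equal in magnitude and opposite in sign, as expected.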
According to the weld seam parameter measuring method provided by this embodiment, weld seam data are acquired, and the shape of the weld seam is obtained from the weld seam data; a first labeled picture is acquired, the first labeled picture being obtained by labeling the left-end edge point, the lowest point and the right-end edge point of the weld seam shape; keypoint recognition is performed on the first labeled picture according to a preset CNN recognition model to acquire the coordinate data corresponding to the left-end edge point, the lowest point and the right-end edge point; and the weld seam parameter data are acquired from the coordinate data of the left-end edge point, the lowest point and the right-end edge point. By identifying and measuring the key points of the weld seam to acquire the weld seam parameter data, the method keeps the measurement procedure simple and the cost low, and improves measurement efficiency.
On the basis of the above embodiment, the processed data set may be enlarged to improve accuracy. After the first labeled picture is obtained, the method further comprises: expanding the first labeled picture to obtain a plurality of second labeled pictures; and, correspondingly, performing label recognition processing on the second labeled pictures according to the preset CNN recognition model to obtain the coordinate data corresponding to the left end edge point, the lowest point and the right end edge point.
For example, random rotation, scaling and mirroring (symmetric transformation) may be applied to the first labeled pictures to obtain a plurality of second labeled pictures. In practical applications, the data set can be expanded to 2000 pictures, i.e. 2000 second labeled pictures are available.
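One of the symmetric changes can be sketched as follows. The sketch assumes the labeled picture is held as a grayscale array with its three key points as pixel coordinates; note that under a left-right mirror the left and right edge points swap roles. All names here are illustrative assumptions, not from the patent.

```python
import numpy as np

def hflip_with_labels(img, pts):
    """Mirror a labeled picture left-right and remap its key points.

    img: H x W grayscale array.
    pts: dict with keys 'left', 'lowest', 'right', each an (x, z)
    pixel coordinate. The patent only states that rotation, scaling
    and symmetric changes are used to expand the data set; this
    representation is an assumption for illustration."""
    h, w = img.shape
    flipped = img[:, ::-1]                  # mirror the pixel columns
    remap = lambda p: (w - 1 - p[0], p[1])  # mirror an x coordinate
    # after mirroring, the left and right edge points swap roles
    new_pts = {
        "left": remap(pts["right"]),
        "lowest": remap(pts["lowest"]),
        "right": remap(pts["left"]),
    }
    return flipped, new_pts
```

The key point is that the labels must be transformed together with the pixels, otherwise the augmented pictures would train the model on wrong coordinates.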
In some embodiments, the second labeled pictures may additionally be pixel-scaled. Accordingly, pixel scaling is performed on the plurality of second labeled pictures according to a preset pixel value to obtain a plurality of scaled images; gray-scale processing is performed on the plurality of scaled images to obtain a plurality of third labeled pictures; and, correspondingly, label recognition processing is performed on the third labeled pictures according to the preset CNN recognition model to obtain the coordinate data corresponding to the left end edge point, the lowest point and the right end edge point.
For example, the pictures are uniformly scaled to a 240 × 240 pixel size to match the input expected by the model. To further improve processing accuracy, the scaled images may be converted to grayscale so that all inputs are uniform single-channel images.
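A library-free sketch of this preprocessing, assuming the picture is an H × W RGB array. The patent names no particular tool; nearest-neighbour scaling and the BT.601 luma weights are assumptions chosen for illustration.

```python
import numpy as np

def to_gray_240(img, size=240):
    """Rescale an RGB image array to size x size by nearest-neighbour
    sampling, then convert it to a single-channel grayscale image."""
    h, w, _ = img.shape
    rows = np.arange(size) * h // size   # source row for each output row
    cols = np.arange(size) * w // size   # source column for each output column
    resized = img[rows][:, cols]         # (size, size, 3) nearest-neighbour rescale
    # ITU-R BT.601 luma weights for the grayscale conversion
    gray = resized @ np.array([0.299, 0.587, 0.114])
    return gray.astype(np.uint8)
```

In practice a library such as Pillow or OpenCV would perform both steps; the sketch only shows that the result is a uniform 240 × 240 single-channel input for the model.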
Referring to fig. 5, a schematic structural diagram of a weld parameter measuring apparatus according to an embodiment of the present invention is shown, where the weld parameter measuring apparatus 50 includes:
the scanning module 51 is used for acquiring welding seam data and acquiring the shape of a welding seam according to the welding seam data;
the labeling module 52 is configured to obtain a first labeled picture, where the first labeled picture is obtained by labeling a left edge point, a lowest point, and a right edge point of the weld shape;
an identifying module 53, configured to perform label identification processing on the first labeled picture according to a preset CNN identification model, and obtain coordinate data corresponding to the left edge point, the lowest point, and the right edge point;
a parameter module 54, configured to obtain the weld parameter data according to the coordinate data of the left edge point, the lowest point, and the right edge point.
The apparatus in the embodiment shown in fig. 5 can be correspondingly used to perform the steps in the method embodiment shown in fig. 1, and the implementation principle and technical effect are similar, which are not described herein again.
Referring to fig. 6, it is a schematic diagram of a hardware structure of a weld parameter measuring apparatus provided in an embodiment of the present invention, where the weld parameter measuring apparatus includes: a processor 61, memory 62 and computer programs; wherein
A memory 62 for storing the computer program; the memory may be, for example, a flash memory. The computer program is, for example, an application program or a functional module that implements the above method.
A processor 61 for executing the computer program stored in the memory to implement the steps performed by the apparatus in the above method. Reference may be made in particular to the description relating to the preceding method embodiment.
Alternatively, the memory 62 may be separate or integrated with the processor 61.
When the memory 62 is a device separate from the processor 61, the apparatus may further include:
a bus 63 for connecting the memory 62 and the processor 61.
The present invention also provides a readable storage medium, in which a computer program is stored, which, when being executed by a processor, is adapted to implement the methods provided by the various embodiments described above.
The readable storage medium may be a computer storage medium or a communication medium. Communication media include any medium that facilitates transfer of a computer program from one place to another. Computer storage media may be any available media that can be accessed by a general-purpose or special-purpose computer. For example, a readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. Of course, the readable storage medium may also be an integral part of the processor. The processor and the readable storage medium may reside in an application-specific integrated circuit (ASIC). Additionally, the ASIC may reside in user equipment. Of course, the processor and the readable storage medium may also reside as discrete components in a communication device. The readable storage medium may be a read-only memory (ROM), a random-access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
The present invention also provides a program product comprising execution instructions stored in a readable storage medium. The at least one processor of the device may read the execution instructions from the readable storage medium, and the execution of the execution instructions by the at least one processor causes the device to implement the methods provided by the various embodiments described above.
In the above apparatus embodiments, it should be understood that the processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of a method disclosed in connection with the present invention may be embodied directly in a hardware processor, or in a combination of hardware and software modules within the processor.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, without such modifications or substitutions departing from the scope of the technical solutions of the embodiments of the present invention.
Claims (10)
1. A weld parameter measurement method is characterized by comprising the following steps:
acquiring weld data, and acquiring the shape of a weld according to the weld data;
acquiring a first labeling picture, wherein the first labeling picture is obtained by labeling a left end edge point, a lowest point and a right end edge point of the welding seam shape;
performing labeling identification processing on the first labeled picture according to a preset CNN identification model to acquire coordinate data corresponding to the left edge point, the lowest point and the right edge point;
and acquiring the welding seam parameter data according to the coordinate data of the left end edge point, the lowest point and the right end edge point.
2. The method of claim 1, further comprising, after said obtaining the first annotation picture:
expanding the first labeled picture to obtain a plurality of second labeled pictures;
the labeling, identifying and processing the first labeled picture according to a preset CNN identification model to obtain coordinate data corresponding to the left edge point, the lowest point and the right edge point includes:
and performing label identification processing on the second label pictures according to a preset CNN identification model to obtain coordinate data corresponding to the left edge point, the lowest point and the right edge point.
3. The method according to claim 2, wherein after the expanding the first annotated picture to obtain a plurality of second annotated pictures, the method further comprises:
performing pixel zooming processing on the plurality of second labeled pictures according to a preset pixel value to obtain a plurality of zoomed images;
carrying out gray processing on the plurality of zoomed images to obtain a plurality of third labeled pictures;
correspondingly, the labeling and identifying the first labeled picture according to a preset CNN identification model to obtain coordinate data corresponding to the left edge point, the lowest point, and the right edge point includes:
and performing label identification processing on the third label pictures according to a preset CNN identification model to obtain coordinate data corresponding to the left edge point, the lowest point and the right edge point.
4. The method of claim 1, wherein the pre-defined CNN recognition model comprises an input layer, a hidden layer and an output layer, wherein the input layer is adjacent to the hidden layer, and the hidden layer is adjacent to the output layer;
wherein the number of input neurons of the input layer is 1, and the number of output neurons of the output layer is 6.
5. The method according to claim 4, wherein the labeling recognition processing on the first labeled picture according to a preset CNN recognition model to obtain coordinate data corresponding to the left edge point, the lowest point and the right edge point comprises:
receiving the first labeled picture according to the input layer, and outputting 6 data according to the output layer;
and acquiring 3 coordinate data corresponding to the left edge point, the lowest point and the right edge point according to the 6 data.
6. The method of claim 1, wherein the weld parameter data comprises weld groove depth;
acquiring the welding seam parameter data according to the coordinate data of the left end edge point, the lowest point and the right end edge point, wherein the welding seam parameter data comprises the following steps:
processing the coordinate data of the left edge point, the lowest point and the right edge point according to a first strategy to obtain the groove depth of the weld joint, wherein the first strategy is as follows:
T1=(Z2+Z3-2*Z1)/2
wherein T1 represents the weld groove depth, Z1 represents the ordinate of the lowest point coordinate data, Z2 represents the ordinate of the left end edge point coordinate data, and Z3 represents the ordinate of the right end edge point coordinate data.
7. The method of claim 1, wherein the weld parameter data comprises weld groove width;
acquiring the welding seam parameter data according to the coordinate data of the left end edge point, the lowest point and the right end edge point, wherein the welding seam parameter data comprises the following steps:
processing the coordinate data of the left edge point, the lowest point and the right edge point according to a second strategy to obtain the width of the weld groove, wherein the second strategy is as follows:
T2=Z3-Z2
wherein T2 represents the weld groove width, Z2 represents the ordinate of the left edge point coordinate data, and Z3 represents the ordinate of the right edge point coordinate data.
8. The method of claim 1, wherein the weld parameter data comprises a left groove angle of the weld;
acquiring the welding seam parameter data according to the coordinate data of the left end edge point, the lowest point and the right end edge point, wherein the welding seam parameter data comprises the following steps:
processing the coordinate data of the left edge point, the lowest point and the right edge point according to a third strategy to obtain the left groove angle of the welding seam, wherein the third strategy is as follows:
T3=atan((Z1-Z2)/(X1-X2))
wherein T3 represents the weld left groove angle, Z1 represents the ordinate of the lowest point coordinate data, Z2 represents the ordinate of the left end edge point coordinate data, X1 represents the abscissa of the lowest point coordinate data, and X2 represents the abscissa of the left end edge point coordinate data.
9. The method of claim 1, wherein the weld parameter data comprises a weld right bevel angle;
acquiring the welding seam parameter data according to the coordinate data of the left end edge point, the lowest point and the right end edge point, wherein the welding seam parameter data comprises the following steps:
processing the coordinate data of the left edge point, the lowest point and the right edge point according to a fourth strategy to obtain the right groove angle of the welding seam, wherein the fourth strategy is as follows:
T4=atan((Z1-Z3)/(X1-X3))
wherein T4 represents the weld right groove angle, Z1 represents the ordinate of the lowest point coordinate data, Z3 represents the ordinate of the right end edge point coordinate data, X1 represents the abscissa of the lowest point coordinate data, and X3 represents the abscissa of the right end edge point coordinate data.
10. A weld parameter measuring device, comprising:
the scanning module is used for acquiring welding seam data and acquiring the shape of a welding seam according to the welding seam data;
the marking module is used for acquiring a first marking picture, wherein the first marking picture is obtained by marking the left edge point, the lowest point and the right edge point of the welding seam shape;
the identification module is used for carrying out marking identification processing on the first marked picture according to a preset CNN identification model and acquiring coordinate data corresponding to the left edge point, the lowest point and the right edge point;
and the parameter module is used for acquiring the welding seam parameter data according to the coordinate data of the left edge point, the lowest point and the right edge point.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010446559.2A CN111637847A (en) | 2020-05-25 | 2020-05-25 | Welding seam parameter measuring method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111637847A true CN111637847A (en) | 2020-09-08 |
Family
ID=72326781
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010446559.2A Pending CN111637847A (en) | 2020-05-25 | 2020-05-25 | Welding seam parameter measuring method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111637847A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112489010A (en) * | 2020-11-27 | 2021-03-12 | 桂林电子科技大学 | Method and device for quickly identifying welding line and storage medium |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014092391A (en) * | 2012-11-01 | 2014-05-19 | Yokogawa Bridge Holdings Corp | Welding flaw outer appearance inspection system and welding flaw outer appearance inspection method |
CN104636760A (en) * | 2015-03-11 | 2015-05-20 | 西安科技大学 | Positioning method for welding seam |
CN105844622A (en) * | 2016-03-16 | 2016-08-10 | 南京工业大学 | V-shaped groove weld joint detection method based on laser vision |
CN206056502U (en) * | 2016-08-18 | 2017-03-29 | 广东工业大学 | A kind of line source scans the detection means of weld seam |
CN107798330A (en) * | 2017-11-10 | 2018-03-13 | 上海电力学院 | A kind of weld image characteristics information extraction method |
CN108898571A (en) * | 2018-03-27 | 2018-11-27 | 哈尔滨理工大学 | A kind of V-type weld inspection system based on structure light vision and deep learning |
CN109670503A (en) * | 2018-12-19 | 2019-04-23 | 北京旷视科技有限公司 | Label detection method, apparatus and electronic system |
CN110321903A (en) * | 2019-07-05 | 2019-10-11 | 天津科技大学 | A kind of characteristics of weld seam point extracting method based on Y-net multilayer convolutional neural networks |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112233181B (en) | 6D pose recognition method and device and computer storage medium | |
CN112836734A (en) | Heterogeneous data fusion method and device and storage medium | |
US20230206603A1 (en) | High-precision point cloud completion method based on deep learning and device thereof | |
CN111127422A (en) | Image annotation method, device, system and host | |
CN107329962B (en) | Image retrieval database generation method, and method and device for enhancing reality | |
CN107705293A (en) | A kind of hardware dimension measurement method based on CCD area array cameras vision-based detections | |
CN112084849A (en) | Image recognition method and device | |
CN115131444A (en) | Calibration method based on monocular vision dispensing platform | |
CN110866936A (en) | Video labeling method, tracking method, device, computer equipment and storage medium | |
CN111144349A (en) | Indoor visual relocation method and system | |
CN109934165A (en) | Joint point detection method and device, storage medium and electronic equipment | |
CN113421242A (en) | Deep learning-based welding spot appearance quality detection method and device and terminal | |
CN113378976A (en) | Target detection method based on characteristic vertex combination and readable storage medium | |
CN114022542A (en) | Three-dimensional reconstruction-based 3D database manufacturing method | |
CN111368733B (en) | Three-dimensional hand posture estimation method based on label distribution learning, storage medium and terminal | |
CN114519853A (en) | Three-dimensional target detection method and system based on multi-mode fusion | |
CN108447092B (en) | Method and device for visually positioning marker | |
CN110009625B (en) | Image processing system, method, terminal and medium based on deep learning | |
CN118154603A (en) | Display screen defect detection method and system based on cascading multilayer feature fusion network | |
CN113628170B (en) | Laser line extraction method and system based on deep learning | |
CN111637847A (en) | Welding seam parameter measuring method and device | |
CN116958148B (en) | Method, device, equipment and medium for detecting defects of key parts of power transmission line | |
CN113808273A (en) | Disordered incremental sparse point cloud reconstruction method for ship traveling wave numerical simulation | |
CN111633337B (en) | Reflection eliminating method and device for laser welding seam measurement | |
CN111633358B (en) | Laser-based weld parameter measuring method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20200908 |