CN113588692B - Computer vision-based weld defect nondestructive testing method - Google Patents


Info

Publication number
CN113588692B
CN113588692B (application CN202111140057.8A)
Authority
CN
China
Prior art keywords
edge
outer edge
connected domain
inner edge
welding seam
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111140057.8A
Other languages
Chinese (zh)
Other versions
CN113588692A (en)
Inventor
保柳柳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nantong Gaoya Steel Structure Co ltd
Original Assignee
Nantong Gaoya Steel Structure Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nantong Gaoya Steel Structure Co ltd filed Critical Nantong Gaoya Steel Structure Co ltd
Priority application: CN202111140057.8A
Publication of application: CN113588692A
Application granted; publication of granted patent: CN113588692B
Legal status: Active

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N23/00Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00
    • G01N23/02Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material
    • G01N23/04Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and forming images of the material
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N23/00Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00
    • G01N23/02Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material
    • G01N23/06Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and measuring the absorption
    • G01N23/083Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and measuring the absorption the radiation being X-rays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/12Edge-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/187Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30152Solder

Abstract

The invention relates to a computer-vision-based nondestructive testing method for weld defects, comprising the following steps: performing semantic segmentation on the weld x-ray film to obtain the weld edge, and performing connected-domain analysis on the region inside the weld edge to obtain the connected domains inside the weld; dividing the edge of each connected domain into an outer edge close to the base-material side and an inner edge close to the weld center; obtaining the distinctness, blackness, and straightness of the inner and outer edges; obtaining the sharpness of each edge from its distinctness and blackness; obtaining the unfused rate of the connected domain from the sharpness difference and the straightness difference between the inner and outer edges; and comparing the unfused rate of each connected domain with a preset threshold and marking the connected domain according to the comparison result. With the technical means provided by the invention, lack-of-fusion defects in a weld can be identified accurately, so that easily confused weld-defect types are distinguished more reliably.

Description

Computer vision-based weld defect nondestructive testing method
Technical Field
The invention relates to the field of computer vision, and in particular to a computer-vision-based nondestructive testing method for weld defects.
Background
During welding, various defects arise from factors such as improper operation or unqualified welding consumables. To detect them, the prior art irradiates the weld from above with x-rays and, from the absorption of the x-rays by the weld metal, obtains an x-ray film of the weld; internal weld defects can thus be inspected nondestructively, without damaging the welded part. Where the weld is defective, shadows of different shapes form on the film, and from these features an operator can identify the weld defects.
Defects on weld x-ray films are still often identified manually, which is inefficient. Neural-network approaches that identify defect images from their features require large amounts of weld-defect training data, which must be labeled by hand, a time-consuming and laborious process. Existing methods that detect defect edges with traditional edge detection and classify the defect type from the edge shape confuse certain weld defects with one another, yet different defect types damage the weld to different degrees. A weld-defect detection method that can accurately distinguish easily confused defects of similar shape is therefore needed.
To address these problems, the invention provides a computer-vision-based nondestructive testing method for weld defects. Its main objective is to distinguish two weld defects: lack of fusion and lack of penetration. A lack-of-fusion defect is difficult to repair and more harmful; a lack-of-penetration defect can be repaired by re-welding and is less harmful than lack of fusion. The two defects are similar in shape, and confusing them leads to misjudging the damage to the weld and thus to danger. The invention detects the lack-of-fusion defects of interest from the sharpness and straightness of the defect edges.
Disclosure of Invention
To overcome the shortcomings of the prior art, the invention aims to provide a computer-vision-based nondestructive testing method for weld defects.
To this end, the invention adopts the following technical scheme. The computer-vision-based nondestructive testing method for weld defects comprises the following steps:
performing semantic segmentation on the weld x-ray film to obtain the weld edge, and performing connected-domain analysis on the region inside the weld edge to obtain the connected domains inside the weld;
dividing the edge of each connected domain into an outer edge close to the base-material side and an inner edge close to the weld center;
obtaining the distinctness, blackness, and straightness of the inner and outer edges; obtaining the sharpness of the inner edge from its distinctness and blackness, and the sharpness of the outer edge from its distinctness and blackness; obtaining the unfused rate of the connected domain from the sharpness difference and the straightness difference between the inner and outer edges;
judging the weld defect in the corresponding connected domain from the obtained unfused rate, and marking the connected domain according to the judgment.
Further, the unfused rate is expressed as
R = F(|q_in − q_out|, |f_in − f_out|)  [the exact formula is given as an image in the original]
where R is the unfused rate of the weld connected domain, |q_in − q_out| is the absolute value of the sharpness difference between the inner and outer edges, and |f_in − f_out| is the absolute value of the straightness difference between the inner and outer edges.
Further, the sharpness of the inner and outer edges is obtained as follows:
from the distinctness d_in and blackness h_in of the inner edge, and the distinctness d_out and blackness h_out of the outer edge, the sharpness q_in of the inner edge and the sharpness q_out of the outer edge are obtained respectively as
q_in = G(d_in, h_in),  q_out = G(d_out, h_out)  [the exact expressions are given as images in the original]
where q_in is the sharpness of the inner edge, d_in its distinctness, and h_in its blackness; q_out is the sharpness of the outer edge, d_out its distinctness, and h_out its blackness.
Further, the straightness f_in of the inner edge and the straightness f_out of the outer edge are obtained as follows:
taking the extension direction of the weld centerline as the reference direction, weights are set; for every point on the inner edge and on the outer edge, the coincidence rate s of the extension direction of the small edge formed by the point and its adjacent points, relative to the extension direction of the weld centerline, is computed; the coincidence rates of all points on the inner edge and of all points on the outer edge are then averaged separately to obtain the straightness f_in of the inner edge and the straightness f_out of the outer edge.
Further, the distinctness of the inner and outer edges is obtained as follows:
mark the inner edge as b_1 and the outer edge as a_1; for every point of b_1 and a_1, compute the mean gradient toward the points of the surrounding eight-neighborhood that lie outside the connected domain;
from these gradient means, compute for b_1 and a_1 the average gradient difference k between the gray value of each point under test and its neighbors outside the connected domain under test; summing and averaging the per-point values k over the inner edge gives its total average gradient difference K_in, and doing the same over the outer edge gives K_out; from K_in and K_out, the distinctness d_in of the inner edge and the distinctness d_out of the outer edge are obtained.
Further, the blackness of the inner and outer edges is obtained as follows:
average the gray values of all points on the inner edge and on the outer edge to obtain the mean gray value m_in of the inner edge and m_out of the outer edge; normalize the two mean gray values to obtain the blackness h_in of the inner edge and the blackness h_out of the outer edge.
Further, the inner and outer edges are obtained as follows:
traverse the points on the edge of a connected domain in the x-ray film image of the weld lack-of-fusion defect; from each point, search for edge points of the same connected domain in the direction of the nearest base material and in the direction of the weld center; mark the points found in the two directions separately; the two sets of marked points then form the inner edge of the connected domain, close to the weld center, and the outer edge, close to the base material.
Further, the connected domains inside the weld are obtained as follows:
input the weld x-ray film image into a DNN to obtain the weld region in the image, obtain the weld edge through semantic segmentation, and perform connected-domain analysis inside the weld edge to obtain the connected domains inside the weld.
Further, the connected domain corresponding to each unfused rate is marked by comparing the unfused rate with a preset threshold: when the unfused rate reaches the threshold, the connected domain is marked as a lack-of-fusion defect, otherwise as not lack-of-fusion.
the invention has the beneficial effects that:
(1) Weld defects can be inspected nondestructively without manual identification, with a high degree of automation.
(2) The method accurately identifies the lack-of-fusion defect of a weld, thereby distinguishing lack of penetration from lack of fusion and identifying easily confused weld-defect types more accurately.
Drawings
FIG. 1 is a schematic diagram of the computer-vision-based weld defect nondestructive testing method in one embodiment of the invention;
FIG. 2 is a schematic view of a weld x-ray film with a weld lack-of-fusion defect in one embodiment of the invention;
FIG. 3 is a schematic view of the inside of the weld edge and the connected domains inside it in one embodiment of the invention;
FIG. 4 is a schematic view of the edge segmentation of a weld connected domain in one embodiment of the invention.
Detailed Description
The invention is described in detail below with reference to the figures and examples.
In the description of the present invention, it is to be understood that the terms "center", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience of description and simplicity of description, and do not indicate or imply that the referenced devices or elements must have a particular orientation, be constructed and operated in a particular orientation, and thus, are not to be construed as limiting the present invention.
The terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the invention, "a plurality" means two or more unless otherwise specified.
Example 1
As shown in fig. 1, this embodiment provides a computer-vision-based nondestructive testing method for weld defects, comprising the following steps:
Perform semantic segmentation on the weld x-ray film to obtain the weld edge, and obtain the connected domains inside the weld edge from the weld edge.
A welding defect is a defect formed at the welded joint during welding. Weld defects include porosity, slag inclusions, lack of penetration, lack of fusion, cracks, pits, undercut, flash, and the like. Among these, pores and spot-like slag inclusions are volume defects; strip-like slag inclusions, lack of penetration, lack of fusion, and cracks are planar (area-type) defects. Pits, undercut, flash, and surface cracks belong to the surface defects, while internal defects such as buried cracks are buried defects.
As shown in fig. 2, a schematic x-ray film of a weld lack-of-fusion defect is given. Lack of fusion is an area-type defect formed during welding by causes such as unreasonable groove design or machining, or a weld bead that is not cleaned of oil stains or rust.
The weld in the x-ray film is identified with a DNN, which performs semantic segmentation on the acquired weld x-ray film. The DNN identifies the weld region in the image by semantic segmentation, and the training process is as follows:
The dataset is the weld x-ray film image dataset of the invention.
The pixels to be segmented fall into two classes, and the labeling of the training set is as follows: in the single-channel semantic label, pixels belonging to the background class are labeled 0 and pixels belonging to the weld class are labeled 1.
The loss function used is the cross-entropy loss function (for the classification task).
The DNN yields the connected domain of the weld in the image; the edge of the weld connected domain is extracted on the original image, and connected-domain analysis is performed again inside that edge to obtain the connected domains inside the weld.
As shown in fig. 3, a schematic of the inside of the weld edge and the connected domains inside it is given. According to the analysis of the weld edge and the connected domains, the edge of each connected domain is divided so that the side close to the weld center becomes the inner edge and the side close to the base material becomes the outer edge.
The edge of a connected domain under test is divided into two classes, the edge on the base-material side and the edge on the weld-center side, as follows:
Traverse the points on the edge of connected domain 1 and, from each point, search for edge points of the same connected domain in the direction of the nearest base material and in the direction of the weld centerline. If an edge point of the same connected domain is found in the base-material direction, the point is marked as a class-1 point; if one is found in the direction of the weld centerline, it is marked as a class-2 point. Fig. 4 shows the resulting edge segmentation of a weld connected domain.
The inner and outer edges of the connected domain under test are thus obtained: the edge on the weld-center side and the edge on the base-material side.
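The direction-based split can be sketched under a simplifying assumption: a horizontal weld whose centerline is a known image row, so that "toward the base material" and "toward the weld center" become vertical steps. `split_edges` is a hypothetical helper, not the patent's code.

```python
import numpy as np

def split_edges(comp, centerline_row):
    """Split the boundary of one connected domain into an inner edge
    (facing the weld centerline) and an outer edge (facing the base
    material), assuming the centerline is image row `centerline_row`.
    """
    inner, outer = set(), set()
    height = comp.shape[0]
    for y, x in zip(*np.nonzero(comp)):
        step = 1 if y < centerline_row else -1    # direction toward the centerline
        ny = y + step
        if not (0 <= ny < height) or not comp[ny, x]:
            inner.add((y, x))   # stepping toward the weld center leaves the domain
        ny = y - step
        if not (0 <= ny < height) or not comp[ny, x]:
            outer.add((y, x))   # stepping toward the base material leaves the domain
    return inner, outer

# a 2-row-thick defect lying above the centerline (row 5)
comp = np.zeros((4, 3), dtype=bool)
comp[1:3, :] = True
inner, outer = split_edges(comp, centerline_row=5)
```

For this toy domain the bottom row (closer to the centerline) comes out as the inner edge and the top row as the outer edge.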
Analysis of actual x-ray films of weld lack-of-fusion defects, together with the formation mechanism of the defect, shows that one side edge of a lack-of-fusion defect has a straighter profile, a darker color, and a more distinct edge, while the opposite side edge is uneven, lighter in color, and less distinct.
The reason is that, with the x-rays irradiating vertically from above, the two sides of the defect lie at different depths below the upper surface of the weld, so the x-rays are attenuated differently and the color depth of the two edges on the film differs markedly. The darker of the two edges therefore contrasts clearly with its surroundings, while the lighter edge contrasts less.
Because of unreasonable groove design, incomplete bead cleaning, and similar causes, molten solder can flow down along the groove before solidifying. The flows start from the same points, but within a single flow they solidify at different end points, which is why one side is straight while the other looks uneven.
The above summarizes the formation mechanism of lack-of-fusion defects and their appearance on an x-ray film. In this embodiment, whether the edges of a defect region differ is judged from the change of edge gray values, the unfused rate of the region is computed from the edge difference, and whether the region is a weld lack-of-fusion defect is then judged.
Process the inner and outer edges to obtain their distinctness and blackness.
Taking connected domain 1 in fig. 3 as an example, mark the edge on the weld-center side as b_1 and the edge on the base-material side as a_1. Then b_1 and a_1 can be understood as the sets of pixel points on the weld-center-side and base-material-side edges (the subscript 1 indicates that these are edges of connected domain 1).
For every point of b_1 and a_1, compute the mean gradient toward the points of the surrounding eight-neighborhood that lie outside the connected domain, i.e., the average gradient difference k between the gray value of the point under test and its neighbors outside the connected domain under test [the formula is given as an image in the original].
Summing and averaging the per-point average gradient differences k over an edge gives the total average gradient difference of that edge, K_in for b_1 and K_out for a_1.
From the total average gradient differences K_in and K_out, the normalized distinctness of b_1 and a_1 is computed as d_in and d_out respectively [the normalization formulas are given as images in the original].
Calculating the blackness of the inner edge and the outer edge according to the gray values of the points on the inner edge and the outer edge, wherein the specific calculation mode is as follows:
averaging the gray values of the inner edge and the outer edge to obtain the average value of the gray values of the inner edge and the outer edge
Figure 973545DEST_PATH_IMAGE034
. Calculate the blackness (normalized blackness) of the inner and outer edges:
Figure 256759DEST_PATH_IMAGE036
Figure 938276DEST_PATH_IMAGE038
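The two per-edge statistics can be sketched together. The [0, 1] scalings below, dividing the mean gradient difference by 255 and taking blackness as 1 − mean gray / 255 so that darker edges score higher, are assumptions standing in for the patent's image-borne normalization formulas.

```python
import numpy as np

def edge_stats(gray, comp, edge_pts):
    """Distinctness and blackness of one edge of a defect connected domain.

    Distinctness: mean absolute gray difference between each edge pixel
    and its 8-neighbours lying outside the domain, scaled to [0, 1].
    Blackness: 1 - mean gray / 255. Both scalings are assumptions.
    """
    H, W = gray.shape
    per_point = []
    for y, x in edge_pts:
        diffs = []
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                # only neighbours inside the image but outside the domain
                if (dy or dx) and 0 <= ny < H and 0 <= nx < W and not comp[ny, nx]:
                    diffs.append(abs(int(gray[y, x]) - int(gray[ny, nx])))
        if diffs:
            per_point.append(sum(diffs) / len(diffs))
    distinctness = (sum(per_point) / len(per_point)) / 255.0 if per_point else 0.0
    blackness = 1.0 - float(np.mean([gray[y, x] for y, x in edge_pts])) / 255.0
    return distinctness, blackness

# one dark defect pixel on a bright background
gray = np.full((3, 3), 200, dtype=np.uint8)
gray[1, 1] = 50
comp = np.zeros((3, 3), dtype=bool)
comp[1, 1] = True
d, h = edge_stats(gray, comp, [(1, 1)])
```

A single dark pixel surrounded by bright background gives a large gradient contrast and a high blackness, matching the "darker edge is more distinct" observation above.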
Acquire the sharpness of the inner edge from its distinctness and blackness, and the sharpness of the outer edge from its distinctness and blackness; then obtain the unfused rate of the connected domain from the sharpness difference and the straightness difference between the inner and outer edges.
The sharpness q_in of the inner edge and q_out of the outer edge are computed as follows. Owing to the characteristics of the two sides, the clearer an edge is, the closer it lies to the base material, and the blurrier, the closer to the weld centerline. Sharpness is therefore defined so that the larger the blackness and the higher the distinctness, the clearer the edge:
q_in = G(d_in, h_in),  q_out = G(d_out, h_out)  [the exact formulas are given as images in the original]
Compute the straightness from the neighborhood distribution of points on the inner and outer edges and their adjacent edge points of the same class. The straightness of an edge measures how straight it is along the extension direction of the weld centerline: the straightest case is an edge that is a straight line parallel to the weld centerline, for which the straightness is 1.
The straightness of the inner and outer edges is obtained as follows, again taking connected domain 1 as an example. For every point of b_1 and a_1, the straightness is computed from the position of the point and its adjacent edge points within the eight-neighborhood. With the extension direction of the weld centerline as the reference direction, the weight is set to 0.5 for that direction, 0.25 for the edge-adjacent blocks, and 0 for the corner-adjacent blocks.
For each point on the edge in fig. 4, the coincidence rate s of the extension direction of the small edge formed by the point and its adjacent points, relative to the extension direction of the weld centerline, is computed [the formula is given as an image in the original].
For b_1 and a_1, summing and averaging the coincidence rates of all points gives the straightness f_in of the inner edge and f_out of the outer edge.
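One concrete reading of the 0.5 / 0.25 / 0 weights: a neighbouring edge pixel lying exactly along the (assumed horizontal) centerline direction contributes 0.5 to a point's coincidence rate s, a diagonal neighbour 0.25, and anything else 0, capped at 1. This is an interpretation for illustration, not the patent's exact formula.

```python
def straightness(edge_pts):
    """Mean coincidence rate s of an edge with the weld-centerline
    direction (assumed horizontal). A straight edge parallel to the
    centerline scores close to 1; a jagged edge scores lower.
    """
    pts = set(edge_pts)
    scores = []
    for y, x in pts:
        s = 0.0
        for dy, dx in ((0, 1), (0, -1)):        # neighbours along the centerline
            if (y + dy, x + dx) in pts:
                s += 0.5
        for dy, dx in ((1, 1), (1, -1), (-1, 1), (-1, -1)):  # diagonal neighbours
            if (y + dy, x + dx) in pts:
                s += 0.25
        scores.append(min(s, 1.0))
    return sum(scores) / len(scores) if scores else 0.0

straight = [(0, x) for x in range(5)]        # flat edge parallel to the centerline
jagged = [(0, 0), (1, 1), (0, 2), (1, 3)]    # zigzag edge
```

On these two toy edges the flat one scores 0.8 (only the endpoints lose weight) and the zigzag scores 0.375, reproducing the straight-versus-uneven contrast the method relies on.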
Obtain the unfused rate of the connected domain from the sharpness difference and the straightness difference between the inner and outer edges. Because a lack-of-fusion defect shows a marked difference between its two edges, the larger the difference between the two edges of a defect connected domain, the more likely the domain is a lack-of-fusion defect:
R_1 = F(|q_in − q_out|, |f_in − f_out|)  [the exact formulas are given as images in the original]
This yields the unfused rate of connected domain 1; repeating the above steps for the other connected domains yields their unfused rates.
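Since the patent's combination formulas are carried by images that did not survive extraction, the sketch below assumes a product form for sharpness and a simple average of the two absolute differences for the unfused rate; both choices are illustrative, chosen only to respect the stated monotonicities.

```python
def sharpness(distinctness, blackness):
    """Sharpness of an edge: larger when the edge is both darker and more
    distinct. The product form is an assumption."""
    return distinctness * blackness

def unfused_rate(q_in, q_out, f_in, f_out):
    """Unfused rate of a connected domain from the absolute sharpness
    difference and absolute straightness difference between its inner and
    outer edges. Averaging the two differences is likewise an assumed
    combination; all inputs are taken to lie in [0, 1]."""
    return 0.5 * (abs(q_in - q_out) + abs(f_in - f_out))

# lack-of-fusion-like domain: one sharp straight edge, one blurry uneven edge
r_asym = unfused_rate(sharpness(0.9, 0.8), sharpness(0.2, 0.3), 0.95, 0.4)
# symmetric domain (e.g. a pore or lack of penetration): near-identical edges
r_sym = unfused_rate(sharpness(0.5, 0.5), sharpness(0.5, 0.5), 0.7, 0.7)
```

The asymmetric domain scores much higher than the symmetric one, which is exactly the signal used to separate lack of fusion from lack of penetration.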
As shown in fig. 1, mark each connected domain according to its unfused rate, and judge the defect type of the weld from the mark.
Each connected domain on the weld image is marked according to its obtained unfused rate, yielding the marked lack-of-fusion defects of the weld and providing guidance for the operator's judgment.
Taking the connected domain 1 in this embodiment as an example, the connected domain is marked according to its unfused rate, and P is a preset threshold.
a) when the unfused rate of connected domain 1 is greater than or equal to P, connected domain 1 is marked as an unfused defect;
b) when the unfused rate of connected domain 1 is less than P, connected domain 1 is marked as not an unfused defect.
the unfused rate of all obtained n connected domains: (
Figure DEST_PATH_IMAGE051
) The marking is performed based on the above-mentioned rules. And the operator can judge whether the welding work is qualified or not according to the marking result.
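The marking rule above amounts to a simple threshold test per connected domain. A minimal sketch, in which the helper name `mark_domains` and the threshold value 0.5 are illustrative assumptions:

```python
# Minimal sketch of the marking rule: a connected domain is labeled an
# unfused defect when its rate reaches the preset threshold P, otherwise not.
# The value 0.5 is an arbitrary stand-in, not taken from the patent.

P = 0.5  # preset threshold (illustrative)

def mark_domains(rates, threshold=P):
    """Return one label per connected domain from its unfused rate."""
    return ["unfused defect" if r >= threshold else "not unfused"
            for r in rates]

rates = [0.82, 0.17, 0.50, 0.31]   # rates of n = 4 connected domains
print(mark_domains(rates))
```

The operator then reviews only the domains labeled as defects, which is the guidance role the embodiment describes.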
The above embodiments are merely illustrative of the present invention and should not be construed as limiting its scope; all designs identical or similar to the present invention fall within the protection scope of the present invention.

Claims (9)

1. A computer-vision-based weld defect nondestructive testing method, characterized by comprising the following steps:
performing semantic segmentation on the weld X-ray film to obtain the weld edge, and performing connected domain analysis on the area inside the weld edge to obtain the connected domains inside the weld;
dividing the edge of each connected domain to obtain the outer edge, close to the base material side, and the inner edge, close to the weld center;
obtaining the identification degree, blackness and flatness of the inner edge and of the outer edge respectively; obtaining the sharpness of the inner edge from the identification degree and blackness of the inner edge, and the sharpness of the outer edge from the identification degree and blackness of the outer edge; and obtaining the unfused rate of the connected domain from the sharpness difference between the inner and outer edges and the flatness difference between them;
judging the weld defect in the connected domain corresponding to each obtained non-fusion rate, and marking the connected domain according to the judgment result.
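The connected-domain analysis step of claim 1 can be sketched on a toy mask. The DNN/semantic-segmentation stage and the patent's feature formulas are out of scope here, so the hypothetical helper `connected_domains` stands in for that stage with a plain 4-connected flood fill on a hand-made binary mask.

```python
# Hedged sketch of claim 1's connected-domain analysis on a toy binary mask
# (1 = inside the weld edge). Segmentation is replaced by a hand-made mask;
# only the structure of the step is illustrated.
from collections import deque

def connected_domains(mask):
    """Label 4-connected foreground regions; returns a list of point sets."""
    h, w = len(mask), len(mask[0])
    seen, domains = set(), []
    for sy in range(h):
        for sx in range(w):
            if mask[sy][sx] and (sy, sx) not in seen:
                comp, queue = set(), deque([(sy, sx)])
                seen.add((sy, sx))
                while queue:
                    y, x = queue.popleft()
                    comp.add((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w and mask[ny][nx]
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            queue.append((ny, nx))
                domains.append(comp)
    return domains

# Toy "inside of the weld edge": two candidate defect regions.
mask = [
    [0, 1, 1, 0, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [0, 0, 0, 0, 1, 0],
]
print(len(connected_domains(mask)))  # 2
```

Each returned point set is one candidate defect region whose inner/outer edges are then analyzed per the later claims.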
2. The computer-vision-based weld defect nondestructive testing method, characterized in that the unfused rate is given by an expression (rendered as an image in the source) of: the absolute value of the sharpness difference between the inner edge and the outer edge, and the absolute value of the flatness difference between the inner edge and the outer edge; the result is the non-fusion rate of the weld connected domain.
3. The computer-vision-based weld defect nondestructive testing method, characterized in that the sharpness of the inner edge and of the outer edge are obtained as follows: the sharpness of the inner edge is computed from the identification degree and the blackness of the inner edge, and the sharpness of the outer edge is computed from the identification degree and the blackness of the outer edge; the two expressions themselves are rendered as images in the source.
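Claim 3 fixes only the inputs of the sharpness computation (identification degree and blackness); the expressions themselves are images in the source. The combination below, a plain average of the two normalized quantities, is purely an illustrative stand-in for the unrecoverable formula, and the name `sharpness` is a hypothetical helper.

```python
# Hedged stand-in for claim 3's sharpness: the real expressions are images,
# so this simple average only illustrates "sharpness from identification
# degree and blackness". Both inputs are assumed normalized to [0, 1].

def sharpness(identification, blackness):
    """Combine an edge's identification degree and blackness into sharpness."""
    return (identification + blackness) / 2.0

q_inner = sharpness(0.3, 0.4)   # blurry, light inner edge
q_outer = sharpness(0.9, 0.8)   # crisp, dark outer edge
print(q_inner, q_outer)
```

Whatever the true expression is, the method only requires that a sharper, darker edge produce a larger value, so that the inner/outer difference is informative.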
4. The method of claim 1, wherein the flatness of the inner edge and the flatness of the outer edge are respectively obtained by the following steps: setting weights with the extension direction of the weld centerline as the reference direction; for each point on the inner edge and on the outer edge, calculating the coincidence rate s of the extension direction of the small edge formed by the point and its adjacent point relative to the extension direction of the weld centerline; and averaging the coincidence rates of all points on the inner edge and of all points on the outer edge to obtain, respectively, the flatness of the inner edge and the flatness of the outer edge.
5. The computer-vision-based weld defect nondestructive testing method, characterized in that the identification degrees of the inner edge and the outer edge are acquired as follows: for each point of the inner edge and of the outer edge, calculating the mean gradient, over the eight directions of its eight-neighborhood, to the neighboring points lying outside the connected domain, which gives the average gradient difference between the gray value of the point under test and its neighbors outside the connected domain; adding and averaging the average gradient differences of all points on the inner edge to obtain the total average gradient difference of the inner edge, and likewise for the outer edge; and obtaining the identification degree of the inner edge and the identification degree of the outer edge from these two total average gradient differences.
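The per-point gradient averaging of claim 5 can be sketched directly. How the two totals become the final identification degrees is an image in the source, so this sketch (with the hypothetical helper name `identification_degree`) uses the total average gradient difference itself as the degree.

```python
# Hedged sketch of claim 5: for every edge point, average the absolute
# gray-level difference to its 8-neighbors that lie OUTSIDE the connected
# domain, then average those means over the whole edge. The final mapping to
# an identification degree is unrecoverable, so the total is used directly.

OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
           (0, 1), (1, -1), (1, 0), (1, 1)]

def identification_degree(gray, edge_points, domain):
    """Mean, over the edge, of the mean gradient to outside neighbors."""
    diffs = []
    for (y, x) in edge_points:
        outside = [abs(gray[y][x] - gray[ny][nx])
                   for dy, dx in OFFSETS
                   for ny, nx in [(y + dy, x + dx)]
                   if 0 <= ny < len(gray) and 0 <= nx < len(gray[0])
                   and (ny, nx) not in domain]
        if outside:
            diffs.append(sum(outside) / len(outside))
    return sum(diffs) / len(diffs) if diffs else 0.0

# A two-pixel domain against a uniform background: every outside neighbor
# differs by 80 gray levels, so the degree is 80.
gray = [[10, 10, 10, 10],
        [10, 90, 90, 10],
        [10, 10, 10, 10]]
domain = {(1, 1), (1, 2)}
print(identification_degree(gray, domain, domain))  # 80.0
```

A blurry edge, whose gray values ramp gradually into the background, would yield a much smaller value, which is exactly the "recognizability" the claim measures.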
6. The computer-vision-based weld defect nondestructive testing method, characterized in that the blackness of the inner edge and of the outer edge is obtained as follows: averaging the gray values of all points on the inner edge and on the outer edge to obtain the gray mean of each edge, and normalizing the two gray means to obtain the blackness of the inner edge and the blackness of the outer edge respectively.
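Claim 6 is a mean-and-normalize step. The exact normalization is not given in the source, so the sketch below assumes 8-bit gray with inversion (darker edge, higher blackness); both the helper name `blackness` and that normalization are assumptions.

```python
# Sketch of claim 6's blackness: the mean gray value of an edge, normalized
# so darker (lower-gray) edges score higher. Min-max over 8-bit gray with
# inversion is an assumption, not the patent's stated normalization.

def blackness(gray_values):
    """1.0 for a fully black edge, 0.0 for a fully white one (8-bit gray)."""
    mean = sum(gray_values) / len(gray_values)
    return 1.0 - mean / 255.0

inner = [30, 40, 35, 45]    # dark inner edge
outer = [200, 210, 205]     # bright outer edge
print(round(blackness(inner), 3), round(blackness(outer), 3))
```

On an X-ray film a lack-of-fusion gap absorbs less radiation and exposes darker, so higher blackness on one edge is a useful defect cue.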
7. The computer-vision-based weld defect nondestructive testing method, characterized in that the inner edge and the outer edge are acquired as follows:
traversing the points on the edge of a connected domain in the X-ray film image of a weld non-fusion defect; from each point, searching for edge points of the same connected domain in the direction of the nearest base material and in the direction of the weld center; marking the points found in each of the two directions; and forming, from the two sets of marked points, the inner edge of the connected domain near the weld center and the outer edge of the connected domain near the base material.
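The split of claim 7 can be sketched for a simple geometry. This assumes a horizontal weld whose centerline lies below the domain and the base material above it, so per image column the topmost domain pixel faces the base material (outer edge) and the bottommost faces the weld center (inner edge); the per-column extreme used here is a simplification of the patent's per-point search, and `split_edges` is a hypothetical helper name.

```python
# Hedged sketch of claim 7's inner/outer edge split for one connected domain,
# assuming base material ABOVE the domain and the weld centerline BELOW it.

def split_edges(domain):
    """domain: set of (row, col). Returns (outer_edge, inner_edge) point sets."""
    by_col = {}
    for (y, x) in domain:
        by_col.setdefault(x, []).append(y)
    outer = {(min(ys), x) for x, ys in by_col.items()}  # toward base material
    inner = {(max(ys), x) for x, ys in by_col.items()}  # toward weld center
    return outer, inner

blob = {(2, 4), (2, 5), (3, 4), (3, 5), (3, 6), (4, 5)}
outer, inner = split_edges(blob)
print(sorted(outer))
print(sorted(inner))
```

For a real weld image the search direction would be taken per point toward the nearest base material, as the claim states, rather than fixed per column.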
8. The computer-vision-based weld defect nondestructive testing method, characterized in that the connected domains inside the weld are acquired as follows:
inputting the weld X-ray film image into a DNN to obtain the weld region in the image, obtaining the weld edge through semantic segmentation, and performing connected domain analysis on the inner side of the weld edge to obtain the connected domains inside the weld.
9. The computer-vision-based weld defect nondestructive testing method, characterized in that the connected domain corresponding to each unfused rate is marked as follows:
when the unfused rate is greater than or equal to the unfused rate threshold, the corresponding connected domain is marked as an unfused defect; when the unfused rate is less than the unfused rate threshold, the corresponding connected domain is marked as not an unfused defect.
CN202111140057.8A 2021-09-28 2021-09-28 Computer vision-based weld defect nondestructive testing method Active CN113588692B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111140057.8A CN113588692B (en) 2021-09-28 2021-09-28 Computer vision-based weld defect nondestructive testing method

Publications (2)

Publication Number Publication Date
CN113588692A CN113588692A (en) 2021-11-02
CN113588692B true CN113588692B (en) 2021-12-10

Family

ID=78242246

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111140057.8A Active CN113588692B (en) 2021-09-28 2021-09-28 Computer vision-based weld defect nondestructive testing method

Country Status (1)

Country Link
CN (1) CN113588692B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114414599B (en) * 2022-03-29 2022-06-03 武汉丰丽德智能设备有限公司 Machine vision-based nondestructive detection method and system for welding defects of air conditioner anechoic chamber
CN114792316B (en) * 2022-06-22 2022-09-02 山东鲁岳桥机械股份有限公司 Method for detecting spot welding defects of bottom plate of disc brake shaft
CN115049649B (en) * 2022-08-12 2022-11-11 山东振鹏建筑钢品科技有限公司 Reinforcing steel bar polishing and rust removing control method based on corrosion degree
CN115229355B (en) * 2022-09-22 2022-12-13 江苏双赢锻压机床有限公司 Laser welding method for high-precision stamping forging
CN115984272B (en) * 2023-03-20 2023-05-23 山东杨嘉汽车制造有限公司 Semitrailer axle defect identification method based on computer vision

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05261557A (en) * 1992-03-18 1993-10-12 Daiwa Can Co Ltd Device for deciding whether can body is welded adequately or not
CN101135652A (en) * 2007-10-15 2008-03-05 清华大学 Weld joint recognition method based on texture partition
CN101556598A (en) * 2009-05-08 2009-10-14 中国矿业大学 Radiographic testing weld image management system and auxiliary film viewing method
CN102175700A (en) * 2011-01-20 2011-09-07 山东大学 Method for detecting welding seam segmentation and defects of digital X-ray images
CN106442533A (en) * 2016-12-14 2017-02-22 哈尔滨理工大学 Weld information extracting system based on industrial CCD

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007024789B3 (en) * 2007-05-26 2008-10-23 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Method for detecting defects in a weld during a laser welding process

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on Weld Defect Recognition Technology Based on Fuzzy Sets and Neural Networks; Tong Tong; China Masters' Theses Full-text Database, Information Science and Technology; 2015-03-15; full text *

Also Published As

Publication number Publication date
CN113588692A (en) 2021-11-02

Similar Documents

Publication Publication Date Title
CN113588692B (en) Computer vision-based weld defect nondestructive testing method
Shafeek et al. Automatic inspection of gas pipeline welding defects using an expert vision system
CN109859177B (en) Deep learning-based industrial ray image evaluation method and device
CN114240944B (en) Welding defect detection method based on point cloud information
CN115797358B (en) Metal shell automatic welding defect detection method based on machine vision
JP6697302B2 (en) Flaw detection device and defect detection method using flaw detection device
CN109239081B (en) Weld quality parameter detection method based on structured light and visual imaging
US20220415020A1 (en) System and method for detection of anomalies in welded structures
CN106053593B (en) Flaw detection device and flaw detection method using flaw detection device
CN115803619A (en) Information processing device, determination method, and information processing program
KR20130118379A (en) Method and device for inspecting an object for the detection of surface damage
CN113496483A (en) Weld seam air hole defect detection method based on image processing
CN113196040A (en) Surface defect detection method, surface defect detection device, steel product manufacturing method, steel product quality management method, steel product manufacturing facility, surface defect determination model generation method, and surface defect determination model
CN115719332A (en) Welding quality detection method
Yan et al. Surface defect detection of aluminum alloy welds with 3D depth image and 2D gray image
CN116429768A (en) Sealing nail welding quality detection method, system, equipment and storage medium
US20230274407A1 (en) Systems and methods for analyzing weld quality
CN115035092A (en) Image-based bottle detection method, device, equipment and storage medium
CN115601359A (en) Welding seam detection method and device
CN116309277A (en) Steel detection method and system based on deep learning
CN108732148B (en) Online detection device and method for fluorescent magnetic particle inspection
JP6007639B2 (en) Wrinkle detection method and wrinkle detection device
JP3440569B2 (en) Magnetic particle flaw detection method and apparatus
Soares et al. Computer Vision System for Weld Bead Analysis.
Brzeskot et al. Development of the automatic method of detection and grouping of external welding imperfections

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant