CN114187256A - Method for detecting defects of welding seam X-ray photograph - Google Patents

Method for detecting defects of welding seam X-ray photograph

Info

Publication number
CN114187256A
Authority
CN
China
Prior art keywords
image
welding seam
small
target
weld
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111498895.2A
Other languages
Chinese (zh)
Inventor
王国栋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Core Spectrum Vision Technology Co ltd
Original Assignee
Nanjing Core Spectrum Vision Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Core Spectrum Vision Technology Co ltd filed Critical Nanjing Core Spectrum Vision Technology Co ltd
Publication of CN114187256A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 23/00 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00
    • G01N 23/02 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material
    • G01N 23/04 Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and forming images of the material
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G06F 18/253 Fusion techniques of extracted features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10116 X-ray image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20084 Artificial neural networks [ANN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20212 Image combination
    • G06T 2207/20221 Image fusion; Image merging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30152 Solder

Abstract

The invention provides a method for detecting defects in weld X-ray photographs, which comprises the following steps. S1: acquire an X-ray image. S2: preprocess the image by automatic filtering and image enhancement: pass the original image through a filter and an image enhancer in sequence to remove noise and strengthen the contrast between the weld and its defects. S3: detect the weld: identify the weld with an object-detection method so that non-weld regions no longer interfere with image inspection. S4: treat weld defects as small targets and augment small-target samples and imbalanced data: copy and paste the small targets in the small-target samples multiple times to increase the diversity of the areas they cover and of their positions. S5: raise the resolution of the small targets with a perceptual generative adversarial network, converting low-resolution small targets into high-resolution large targets. S6: detect defects in the weld region. The method improves detection accuracy and reduces inconsistency among detection results.

Description

Method for detecting defects of welding seam X-ray photograph
Technical Field
The invention relates to the technical field of computer vision, and in particular to a method for detecting defects in weld X-ray photographs.
Background
Welding, also known as fusion joining, is a manufacturing process that joins metals or other thermoplastic materials by heat, high temperature, or high pressure. Defects in the weld such as lack of fusion, incomplete penetration, cracks, tungsten inclusions, slag inclusions, pseudo-defects, and porosity directly determine welding quality, so non-destructive testing is needed to find them without damaging the welded part, and X-ray inspection of welds occupies an important position in non-destructive testing. Common practice is to produce images by film or DR imaging and have inspectors evaluate the defects on a film viewer or display screen. The result of such manual evaluation depends on the inspector's expertise, and different inspectors differ in how they understand and apply the evaluation criteria. In addition, long, high-volume inspection work causes eye fatigue and raises the probability of missed detections and misjudgments.
Current approaches to weld-defect detection mainly follow two technical routes. The first uses traditional image-processing techniques such as erosion, dilation, edge detection, and custom convolution kernels to extract features of regions of interest (ROIs) and then classifies the extracted ROIs to achieve defect detection. The second detects the weld and the defects within it directly with deep-learning detectors such as Faster R-CNN, SSD, or YOLO.
Traditional image-based detection can extract the features of each defect effectively only if the convolution kernels are designed skillfully, so it is difficult to implement and the algorithm generalizes poorly. Deep-learning defect detection extracts the features of large targets well and generalizes strongly. In the specific field of weld defects, however, two problems remain. First, X-ray photographs suffer from low contrast, heavy noise, various interference, and high gray levels, so filtering and enhancement must be performed manually, which is time-consuming and labor-intensive, and an optimal preprocessing method is hard to find. Second, common object-detection methods struggle to detect small defects within the weld.
Disclosure of Invention
In order to overcome the shortcomings of the prior art, the invention provides a method for detecting defects in weld X-ray photographs that improves detection accuracy and reduces inconsistency among detection results.
To this end, the method for detecting defects in weld X-ray photographs comprises the following steps:
S1: acquire an X-ray image;
S2: preprocess the image by automatic filtering and image enhancement: pass the original image through a filter and an image enhancer in sequence to remove noise and strengthen the contrast between the weld and its defects;
S3: detect the weld: identify the weld with an object-detection method so that non-weld regions no longer interfere with image inspection;
S4: treat weld defects as small targets and augment small-target samples and imbalanced data: copy and paste the small targets in the small-target samples multiple times to increase the diversity of the areas they cover and of their positions;
S5: raise the resolution of the small targets with a perceptual generative adversarial network, converting low-resolution small targets into high-resolution large targets;
S6: detect defects in the weld region.
Further, in step S1, the weld region is irradiated with X-rays and imaged on an imaging plate; after the film is removed from the imaging plate, the X-ray photograph is developed in a darkroom and then converted into a digital image with a CR scanner to obtain the image data to be inspected.
Further, in step S2, the filter and image enhancer are selected as follows: the original image is fed into each candidate filter in turn, the structural similarity index (SSIM) between the original and the filtered image is computed, a threshold value threshold is set, the filters with SSIM ≥ threshold are retained, and their SSIM values are recorded;
the original image is then fed into each candidate image enhancer in turn, the SSIM between the original and the enhanced image is computed, the same threshold is applied, the enhancers with SSIM ≥ threshold are retained, and their SSIM values are recorded;
the retained filters and enhancers are then combined pairwise, the SSIM of each combination is computed and recorded, and the filter-enhancer pair whose SSIM is closest to 1 is selected as the preprocessing method for the current picture.
Further, in step S2, the structural similarity index SSIM of the image is calculated by the following formula:

$$\mathrm{SSIM}(x,y)=\frac{(2\mu_x\mu_y+C_1)(2\sigma_{xy}+C_2)}{(\mu_x^2+\mu_y^2+C_1)(\sigma_x^2+\sigma_y^2+C_2)}$$

where $x$ denotes the original image acquired in step S1, $y$ denotes the filtered or enhanced picture, $\mu_x$ denotes the mean gray value of the original image, $\mu_y$ denotes the mean gray value of the enhanced picture, $\sigma_x^2$ denotes the gray-level variance of the original image, $\sigma_y^2$ denotes the gray-level variance of the enhanced picture, $\sigma_{xy}$ denotes the covariance between the original image $x$ and the enhanced image $y$, and $C_1$, $C_2$ are constants.
Further, in step S3, an image-annotation tool is used to label the weld region on each picture and generate a data set in VOC format; the data set is split into a training set and a validation set; the training pictures are preprocessed by the method of step S2 and fed into a Faster R-CNN network, which is trained to output a weld detection model; the validation samples are preprocessed in the same way, the weld model is loaded for inference, and the recall and mAP of the model on the validation set are computed as evaluation indices of model performance; the evaluated weld detection model is then used to identify the weld on the picture to be inspected.
Further, in step S4, the three defect types crack, lack of fusion, and incomplete penetration are oversampled five times by small-target augmentation, raising their ratio to the other categories to 1/2.
Further, in step S5, the perceptual generative adversarial network comprises a generator and a discriminator, and its overall learning objective is defined as:

$$\min_{G}\max_{D}\;\mathbb{E}_{F_l}\big[\log D(F_l)\big]+\mathbb{E}_{F_s}\big[\log\big(1-D(F_s+G(F_s\mid f))\big)\big]$$

where $G$ denotes the generator, $D$ denotes the discriminator, $F_l$ denotes the convolutional features of large objects, $F_s$ denotes the convolutional features of small objects, $G(F_s\mid f)$ denotes the super-resolution small-object features generated from the convolutional feature $F_s$ given the low-level feature $f$ of the small object, and $F_s+G(F_s\mid f)$ is the result of fusing the small-object convolutional features with the features produced by the generator.
Further, let the generator parameters be $\theta_g$; the optimization objective of $\theta_g$ is expressed as:

$$\hat{\theta}_g=\arg\min_{\theta_g} L_{dis}$$

where $L_{dis}$ consists of the discriminator's adversarial loss $L_{dis\_a}$ and the perceptual loss $L_{dis\_p}$:

$$L_{dis}=L_{dis\_a}+L_{dis\_p}$$

$$L_{dis\_a}=-\log D_{\theta_a}\big(F_s+G_{\theta_g}(F_s\mid f)\big)$$

where $F_s+G_{\theta_g}(F_s\mid f)$ denotes the super-resolution features generated from the small-target convolutional features and $D_{\theta_a}$ denotes the discriminator.
Further, let the parameters of the discriminator's judgment (adversarial) branch be $\theta_a$; the optimization objective of $\theta_a$ is expressed as:

$$\hat{\theta}_a=\arg\max_{\theta_a}\;\mathbb{E}_{F_l}\big[\log D_{\theta_a}(F_l)\big]+\mathbb{E}_{F_s}\big[\log\big(1-D_{\theta_a}(F_s+G(F_s\mid f))\big)\big]$$

which trains the branch to distinguish the generated super-resolution features from the features of actual large targets; expressed as the cross entropy of the corresponding distributions, the loss function is:

$$L_{dis\_a}=-\sum_i\Big[d_i\log D_{\theta_a}(F_i)+(1-d_i)\log\big(1-D_{\theta_a}(F_i)\big)\Big]$$

where $d_i$ is 1 when $F_i$ is a real large-target feature and 0 when it is a generated one.
further, let the parameter of the predicted branch of the discriminator be θpThen thetapIs expressed as the following equation:
Figure BDA0003401983010000041
in the formula, Ldis_pThe loss of the detection target is represented, and comprises classification loss and regression loss:
Ldis_p=Lcls(p,g)+Lloc(rg,r*)
in the formula, Lcls(p, g) represents the classification loss for the target, LlocRepresenting the regression loss to the target bounding box.
The method for detecting defects in weld X-ray photographs of the present invention improves detection accuracy and reduces inconsistency among detection results.
Drawings
The present invention will be further described and illustrated with reference to the following drawings.
FIG. 1 is a flow chart of a method for detecting defects in a weld X-ray photograph in accordance with a preferred embodiment of the present invention.
Detailed Description
The technical solution of the present invention will be more clearly and completely explained by the description of the preferred embodiments of the present invention with reference to the accompanying drawings.
As shown in FIG. 1, a method for detecting defects in a weld X-ray photograph according to a preferred embodiment of the present invention includes the following steps.
S1: Acquire an X-ray image. The weld region is irradiated with X-rays and imaged on an imaging plate; after the film is removed from the imaging plate, the X-ray photograph is developed in a darkroom and then converted into a digital image with a CR scanner to obtain the picture data to be inspected.
S2: Preprocess the image by automatic filtering and image enhancement. The original image is passed through a filter and an image enhancer in sequence to remove noise and strengthen the contrast between the weld and its defects.
Specifically, the structural similarity index defines structural information, from the viewpoint of image composition, as an attribute that is independent of brightness and contrast and reflects the structure of objects in the scene, and it models distortion as a combination of three factors: brightness, contrast, and structure. The mean of the gray levels serves as the estimate of brightness, the standard deviation of the gray levels serves as the estimate of contrast, and the covariance between the input and output images serves as the measure of structural similarity.
The structural similarity index SSIM of the image is calculated by the formula:

$$\mathrm{SSIM}(x,y)=\frac{(2\mu_x\mu_y+C_1)(2\sigma_{xy}+C_2)}{(\mu_x^2+\mu_y^2+C_1)(\sigma_x^2+\sigma_y^2+C_2)}$$

where $x$ denotes the original image acquired in step S1, $y$ denotes the filtered or enhanced picture, $\mu_x$ denotes the mean gray value of the original image, $\mu_y$ denotes the mean gray value of the enhanced picture, $\sigma_x^2$ denotes the gray-level variance of the original image, $\sigma_y^2$ denotes the gray-level variance of the enhanced picture, $\sigma_{xy}$ denotes the covariance between the original image $x$ and the enhanced image $y$, and $C_1$, $C_2$ are constants.
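For reference, the following is a minimal global (non-windowed) implementation of the SSIM formula above, assuming 8-bit images; the constants C1 and C2 use the common defaults K1 = 0.01 and K2 = 0.03 with a dynamic range of 255, which the patent does not fix.

```python
# Global SSIM following the formula above (not the sliding-window variant).
import numpy as np

def ssim(x: np.ndarray, y: np.ndarray,
         c1: float = (0.01 * 255) ** 2, c2: float = (0.03 * 255) ** 2) -> float:
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    mu_x, mu_y = x.mean(), y.mean()              # brightness estimates
    var_x, var_y = x.var(), y.var()              # contrast estimates
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()    # structure estimate
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / \
           ((mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))
```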
The filter and image enhancer are selected as follows: the original image is fed into each candidate filter in turn, the structural similarity index (SSIM) between the original and the filtered image is computed, a threshold value threshold is set, the filters with SSIM ≥ threshold are retained, and their SSIM values are recorded;
the original image is then fed into each candidate image enhancer in turn, the SSIM between the original and the enhanced image is computed, the same threshold is applied, the enhancers with SSIM ≥ threshold are retained, and their SSIM values are recorded;
the retained filters and enhancers are then combined pairwise, the SSIM of each combination is computed and recorded, and the filter-enhancer pair whose SSIM is closest to 1 is selected as the preprocessing method for the current picture.
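This selection procedure could be sketched as follows, reusing the ssim() helper above; the particular filters, enhancers, and threshold value (0.7) are illustrative assumptions, since the patent names neither the candidate operators nor the threshold.

```python
# Sketch of automatic filter/enhancer selection by SSIM screening and pairwise combination.
import cv2
import numpy as np

FILTERS = {
    "median":    lambda img: cv2.medianBlur(img, 5),
    "gaussian":  lambda img: cv2.GaussianBlur(img, (5, 5), 0),
    "bilateral": lambda img: cv2.bilateralFilter(img, 9, 75, 75),
}
ENHANCERS = {
    "clahe":    lambda img: cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8)).apply(img),
    "equalize": cv2.equalizeHist,
    "gamma":    lambda img: cv2.LUT(img, (((np.arange(256) / 255.0) ** 0.8) * 255).astype("uint8")),
}

def select_preprocessing(original, threshold=0.7):
    # Keep only operators whose SSIM against the original meets the threshold ...
    filters = {n: f for n, f in FILTERS.items() if ssim(original, f(original)) >= threshold}
    enhancers = {n: e for n, e in ENHANCERS.items() if ssim(original, e(original)) >= threshold}
    # ... then pick the filter+enhancer pair whose combined SSIM is closest to 1.
    best, best_gap = None, float("inf")
    for fname, f in filters.items():
        for ename, e in enhancers.items():
            gap = abs(1.0 - ssim(original, e(f(original))))
            if gap < best_gap:
                best, best_gap = (fname, ename), gap
    return best
```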
S3: Detect the weld. The weld is identified with an object-detection method so that non-weld regions no longer interfere with image inspection.
the non-welding seam area in the picture often has the content of suspected defect, avoids the interference of non-welding seam area, reduces the false detection rate, converts the welding seam recognition problem into the target detection problem. Marking a weld joint area on a picture by using an image marking tool, generating a data set in a VOC format, dividing the data set into a training set and a verification set, preprocessing the picture in the training set by using the method in the step S2, inputting the processed picture into a Faster-RCNN network for training and outputting a weld joint detection model, preprocessing a verification sample by using the same method, loading the weld joint model and carrying out reasoning, calculating the recall rate and mAP of the model on the verification set, using the recall rate and mAP as evaluation indexes of model performance, and using the evaluated weld joint detection model to carry out weld joint identification on the picture to be detected.
The weld detection model built in this embodiment reaches a recall of over 95% on the test set and over 93% on the validation set.
S4: Treat weld defects as small targets and augment small-target samples and imbalanced data. The small targets in the small-target samples are copied and pasted multiple times to increase the diversity of the areas they cover and of their positions.
the ratio of the sizes of defects such as tungsten inclusion, bubbles and the like in the weld defects to the size of an original image is less than 0.1, so the weld defect detection belongs to the problem of small target detection. The target detection model may potentially be more focused on large and medium targets due to the smaller number of samples for small targets. At the same time, there is a lack of diversity in the location of small targets, as they cover a smaller area. Therefore, aiming at the problems, the problem that the number of small target samples contained in a data set is small is solved by sampling a small target image, three defects of crack, non-fusion and non-penetration are forced to 5 times by a small target enhancement mode, the proportion of the defects and other types is increased to 1/2, and finally the recall rate of the crack, non-fusion and non-penetration defects can be improved by more than 10%.
S5: Raise the resolution of the small targets with a perceptual generative adversarial network, converting low-resolution small targets into high-resolution large targets.
the perception countermeasure generation network comprises a generator and a discriminator, the generator converts the characteristics of the low-resolution small target into the characteristics of the high-resolution large object, the discriminator and the generator distinguish the characteristics in a competitive mode, and the difference between the small target and the large target is reduced in the continuous iteration process, so that the accuracy of small target detection is improved.
The generator is a deep residual feature-generation model that introduces fine-grained low-level features to convert the originally poor features into richer transformed features. The discriminator, on the one hand, distinguishes the high-resolution features generated for small objects from the features of large objects and, on the other hand, uses the perceptual loss to improve the detection rate.
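The patent does not give a layer-level architecture; the following PyTorch sketch shows one plausible layout of a residual generator and a two-branch discriminator of the kind described. The channel counts, head shapes, number of defect classes, and the assumption that the small-target and low-level feature maps are already aligned to the same size are illustrative choices.

```python
# Minimal sketch of perceptual-GAN components: residual feature generator + two-branch discriminator.
import torch.nn as nn

class ResidualGenerator(nn.Module):
    """Predicts a residual that lifts small-target features toward large-target features."""
    def __init__(self, channels: int = 256):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, f_s, f_low):            # f_s: small-target feature, f_low: low-level feature
        residual = self.body(f_s + f_low)     # condition on fine-grained low-level cues
        return f_s + residual                 # F_s + G(F_s | f)

class TwoBranchDiscriminator(nn.Module):
    """Adversarial branch scores real vs. generated; perception branch predicts class and box."""
    def __init__(self, channels: int = 256, num_classes: int = 7):
        super().__init__()
        def head(out_dim):
            return nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(channels, out_dim))
        self.adversarial = nn.Sequential(head(1), nn.Sigmoid())
        self.cls_head = head(num_classes)
        self.box_head = head(4)

    def forward(self, feat):
        return self.adversarial(feat), self.cls_head(feat), self.box_head(feat)
```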
The overall learning objective of the perceptual generative adversarial network is defined as:

$$\min_{G}\max_{D}\;\mathbb{E}_{F_l}\big[\log D(F_l)\big]+\mathbb{E}_{F_s}\big[\log\big(1-D(F_s+G(F_s\mid f))\big)\big]$$

where $G$ denotes the generator, $D$ denotes the discriminator, $F_l$ denotes the convolutional features of large objects, $F_s$ denotes the convolutional features of small objects, $G(F_s\mid f)$ denotes the super-resolution small-object features generated from the convolutional feature $F_s$ given the low-level feature $f$ of the small object, and $F_s+G(F_s\mid f)$ is the result of fusing the small-object convolutional features with the features produced by the generator.
Let the generator parameters be $\theta_g$; the optimization objective of $\theta_g$ is expressed as:

$$\hat{\theta}_g=\arg\min_{\theta_g} L_{dis}$$

where $L_{dis}$ consists of the discriminator's adversarial loss $L_{dis\_a}$ and the perceptual loss $L_{dis\_p}$:

$$L_{dis}=L_{dis\_a}+L_{dis\_p}$$

$$L_{dis\_a}=-\log D_{\theta_a}\big(F_s+G_{\theta_g}(F_s\mid f)\big)$$

where $F_s+G_{\theta_g}(F_s\mid f)$ denotes the super-resolution features generated from the small-target convolutional features and $D_{\theta_a}$ denotes the discriminator.
Let the parameters of the discriminator's judgment (adversarial) branch be $\theta_a$; the optimization objective of $\theta_a$ is expressed as:

$$\hat{\theta}_a=\arg\max_{\theta_a}\;\mathbb{E}_{F_l}\big[\log D_{\theta_a}(F_l)\big]+\mathbb{E}_{F_s}\big[\log\big(1-D_{\theta_a}(F_s+G(F_s\mid f))\big)\big]$$

which trains the branch to distinguish the generated super-resolution features from the features of actual large targets; expressed as the cross entropy of the corresponding distributions, the loss function is:

$$L_{dis\_a}=-\sum_i\Big[d_i\log D_{\theta_a}(F_i)+(1-d_i)\log\big(1-D_{\theta_a}(F_i)\big)\Big]$$

where $d_i$ is 1 when $F_i$ is a real large-target feature and 0 when it is a generated one.
let the parameter of the predicted branch of the arbiter be θpThen thetapIs expressed as the following equation:
Figure BDA0003401983010000073
in the formula, Ldis_pThe loss of the detection target is represented, and comprises classification loss and regression loss:
Ldis_p=Lcls(p,g)+Lloc(rg,r*)
in the formula, Lcls(p, g) represents the classification loss for the target, LlocRepresenting the regression loss to the target bounding box.
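Putting the loss terms above together, the sketch below computes the generator and discriminator losses for one batch of features; binary cross entropy realizes the adversarial cross-entropy term, and cross entropy plus smooth L1 stand in for L_cls and L_loc. In practice the generator and the two discriminator branches are updated alternately, with the generated features detached for the discriminator update; that bookkeeping is omitted here.

```python
# Sketch of the perceptual-GAN losses for one step (alternating updates omitted for brevity).
import torch
import torch.nn.functional as F

def perceptual_gan_losses(disc, f_large, f_super, cls_gt, box_gt):
    real_score, _, _ = disc(f_large)                  # adversarial branch on real large-target features
    fake_score, cls_logits, box_pred = disc(f_super)  # both branches on super-resolved features
    # L_dis_a for the adversarial branch: separate real from generated features.
    l_dis_a = F.binary_cross_entropy(real_score, torch.ones_like(real_score)) + \
              F.binary_cross_entropy(fake_score, torch.zeros_like(fake_score))
    # L_dis_p for the perception branch: detection loss L_cls + L_loc on generated features.
    l_dis_p = F.cross_entropy(cls_logits, cls_gt) + F.smooth_l1_loss(box_pred, box_gt)
    # Generator objective: fool the adversarial branch while keeping the detection loss low.
    l_gen = F.binary_cross_entropy(fake_score, torch.ones_like(fake_score)) + l_dis_p
    return l_gen, l_dis_a, l_dis_p
```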
S6: Detect defects in the weld region.
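Finally, a hedged end-to-end inference sketch strings the steps together; preprocess(), weld_detector, and defect_detector are placeholders for the helpers and models sketched earlier, not interfaces defined by the patent, and the small-target augmentation and feature super-resolution of steps S4-S5 act at training time.

```python
# End-to-end inference sketch: preprocess, localize the weld, detect defects in each weld crop.
import torch

@torch.no_grad()
def detect_weld_defects(image, weld_detector, defect_detector, device="cuda"):
    processed = preprocess(image)                     # S2: auto-selected filter + enhancer (hypothetical helper)
    tensor = torch.from_numpy(processed).float().div(255)
    tensor = tensor.unsqueeze(0).repeat(3, 1, 1).to(device)   # grayscale -> 3-channel CHW
    weld_detector.eval()
    defect_detector.eval()
    weld_boxes = weld_detector([tensor])[0]["boxes"]  # S3: weld localization
    results = []
    for x1, y1, x2, y2 in weld_boxes.round().int().tolist():
        crop = tensor[:, y1:y2, x1:x2]                # restrict detection to the weld region
        results.append(defect_detector([crop])[0])    # S6: defect detection inside the weld
    return results
```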
The method for detecting defects in weld X-ray photographs of the present invention improves detection accuracy and reduces inconsistency among detection results.
The above detailed description merely describes preferred embodiments of the present invention and does not limit the scope of the invention. Various changes, substitutions, and alterations can be made by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims and their equivalents. The scope of the invention is defined by the claims.

Claims (10)

1. A method for detecting defects in weld X-ray photographs, characterized by comprising the following steps:
S1: acquiring an X-ray image;
S2: preprocessing the image by automatic filtering and image enhancement: the original image is passed through a filter and an image enhancer in sequence to remove noise and strengthen the contrast between the weld and its defects;
S3: detecting the weld: the weld is identified with an object-detection method so that non-weld regions no longer interfere with image inspection;
S4: treating weld defects as small targets and augmenting small-target samples and imbalanced data: the small targets in the small-target samples are copied and pasted multiple times to increase the diversity of the areas they cover and of their positions;
S5: raising the resolution of the small targets with a perceptual generative adversarial network, converting low-resolution small targets into high-resolution large targets;
S6: detecting defects in the weld region.
2. The method for detecting defects in weld X-ray photographs according to claim 1, characterized in that in step S1 the weld region is irradiated with X-rays and imaged on an imaging plate; after the film is removed from the imaging plate, the X-ray photograph is developed in a darkroom and then converted into a digital image with a CR scanner to obtain the image data to be inspected.
3. The method for detecting defects in weld X-ray photographs according to claim 1, characterized in that in step S2 the filter and image enhancer are selected as follows: the original image is fed into each candidate filter in turn, the structural similarity index (SSIM) between the original and the filtered image is computed, a threshold value threshold is set, the filters with SSIM ≥ threshold are retained, and their SSIM values are recorded;
the original image is then fed into each candidate image enhancer in turn, the SSIM between the original and the enhanced image is computed, the same threshold is applied, the enhancers with SSIM ≥ threshold are retained, and their SSIM values are recorded;
the retained filters and enhancers are then combined pairwise, the SSIM of each combination is computed and recorded, and the filter-enhancer pair whose SSIM is closest to 1 is selected as the preprocessing method for the current picture.
4. The method for detecting defects in weld X-ray photographs according to claim 3, characterized in that in step S2 the structural similarity index SSIM of the image is calculated by the following formula:

$$\mathrm{SSIM}(x,y)=\frac{(2\mu_x\mu_y+C_1)(2\sigma_{xy}+C_2)}{(\mu_x^2+\mu_y^2+C_1)(\sigma_x^2+\sigma_y^2+C_2)}$$

where $x$ denotes the original image acquired in step S1, $y$ denotes the filtered or enhanced picture, $\mu_x$ denotes the mean gray value of the original image, $\mu_y$ denotes the mean gray value of the enhanced picture, $\sigma_x^2$ denotes the gray-level variance of the original image, $\sigma_y^2$ denotes the gray-level variance of the enhanced picture, $\sigma_{xy}$ denotes the covariance between the original image $x$ and the enhanced image $y$, and $C_1$, $C_2$ are constants.
5. The method for detecting defects in weld X-ray photographs according to claim 1, characterized in that in step S3 an image-annotation tool is used to label the weld region on each picture and generate a data set in VOC format; the data set is split into a training set and a validation set; the training pictures are preprocessed by the method of step S2 and fed into a Faster R-CNN network, which is trained to output a weld detection model; the validation samples are preprocessed in the same way, the weld model is loaded for inference, and the recall and mAP of the model on the validation set are computed as evaluation indices of model performance; the evaluated weld detection model is then used to identify the weld on the picture to be inspected.
6. The method for detecting defects in weld X-ray photographs according to claim 1, characterized in that in step S4 the three defect types crack, lack of fusion, and incomplete penetration are oversampled five times by small-target augmentation, raising their ratio to the other categories to 1/2.
7. The method for detecting defects in weld X-ray photographs according to claim 1, characterized in that in step S5 the perceptual generative adversarial network comprises a generator and a discriminator, and its overall learning objective is defined as:

$$\min_{G}\max_{D}\;\mathbb{E}_{F_l}\big[\log D(F_l)\big]+\mathbb{E}_{F_s}\big[\log\big(1-D(F_s+G(F_s\mid f))\big)\big]$$

where $G$ denotes the generator, $D$ denotes the discriminator, $F_l$ denotes the convolutional features of large objects, $F_s$ denotes the convolutional features of small objects, $G(F_s\mid f)$ denotes the super-resolution small-object features generated from the convolutional feature $F_s$ given the low-level feature $f$ of the small object, and $F_s+G(F_s\mid f)$ is the result of fusing the small-object convolutional features with the features produced by the generator.
8. The method for detecting defects in weld X-ray photographs according to claim 7, characterized in that, letting the generator parameters be $\theta_g$, the optimization objective of $\theta_g$ is expressed as:

$$\hat{\theta}_g=\arg\min_{\theta_g} L_{dis}$$

where $L_{dis}$ consists of the discriminator's adversarial loss $L_{dis\_a}$ and the perceptual loss $L_{dis\_p}$:

$$L_{dis}=L_{dis\_a}+L_{dis\_p}$$

$$L_{dis\_a}=-\log D_{\theta_a}\big(F_s+G_{\theta_g}(F_s\mid f)\big)$$

where $F_s+G_{\theta_g}(F_s\mid f)$ denotes the super-resolution features generated from the small-target convolutional features and $D_{\theta_a}$ denotes the discriminator.
9. The method for detecting defects in weld X-ray photographs according to claim 7, characterized in that, letting the parameters of the discriminator's judgment (adversarial) branch be $\theta_a$, the optimization objective of $\theta_a$ is expressed as:

$$\hat{\theta}_a=\arg\max_{\theta_a}\;\mathbb{E}_{F_l}\big[\log D_{\theta_a}(F_l)\big]+\mathbb{E}_{F_s}\big[\log\big(1-D_{\theta_a}(F_s+G(F_s\mid f))\big)\big]$$

which trains the branch to distinguish the generated super-resolution features from the features of actual large targets; expressed as the cross entropy of the corresponding distributions, the loss function is:

$$L_{dis\_a}=-\sum_i\Big[d_i\log D_{\theta_a}(F_i)+(1-d_i)\log\big(1-D_{\theta_a}(F_i)\big)\Big]$$

where $d_i$ is 1 when $F_i$ is a real large-target feature and 0 when it is a generated one.
10. The method for detecting defects in weld X-ray photographs according to claim 7, characterized in that, letting the parameters of the discriminator's prediction branch be $\theta_p$, the optimization objective of $\theta_p$ is expressed as:

$$\hat{\theta}_p=\arg\min_{\theta_p} L_{dis\_p}$$

where $L_{dis\_p}$ denotes the detection loss, comprising a classification loss and a regression loss:

$$L_{dis\_p}=L_{cls}(p,g)+L_{loc}(r_g,r^{*})$$

where $L_{cls}(p,g)$ denotes the classification loss of the target and $L_{loc}(r_g,r^{*})$ denotes the regression loss of the target bounding box.
CN202111498895.2A 2021-11-24 2021-12-09 Method for detecting defects of welding seam X-ray photograph Pending CN114187256A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111400650 2021-11-24
CN2021114006501 2021-11-24

Publications (1)

Publication Number Publication Date
CN114187256A true CN114187256A (en) 2022-03-15

Family

ID=80542915

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111498895.2A Pending CN114187256A (en) 2021-11-24 2021-12-09 Method for detecting defects of welding seam X-ray photograph

Country Status (1)

Country Link
CN (1) CN114187256A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115266774A (en) * 2022-07-29 2022-11-01 中国特种设备检测研究院 Weld ray detection and evaluation method based on artificial intelligence

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104574418A (en) * 2015-01-27 2015-04-29 西安工业大学 Pressure vessel weld defect identification method and device based on neural network
CN113298190A (en) * 2021-07-05 2021-08-24 四川大学 Weld image recognition and classification algorithm based on large-size unbalanced samples
CN113674247A (en) * 2021-08-23 2021-11-19 河北工业大学 X-ray weld defect detection method based on convolutional neural network

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104574418A (en) * 2015-01-27 2015-04-29 西安工业大学 Pressure vessel weld defect identification method and device based on neural network
CN113298190A (en) * 2021-07-05 2021-08-24 四川大学 Weld image recognition and classification algorithm based on large-size unbalanced samples
CN113674247A (en) * 2021-08-23 2021-11-19 河北工业大学 X-ray weld defect detection method based on convolutional neural network

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI, J et al.: "Perceptual Generative Adversarial Networks for Small Object Detection" *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115266774A (en) * 2022-07-29 2022-11-01 中国特种设备检测研究院 Weld ray detection and evaluation method based on artificial intelligence
CN115266774B (en) * 2022-07-29 2024-02-13 中国特种设备检测研究院 Artificial intelligence-based weld joint ray detection and evaluation method

Similar Documents

Publication Publication Date Title
CN113362326B (en) Method and device for detecting defects of welding spots of battery
Wang et al. Automatic identification of different types of welding defects in radiographic images
Shafeek et al. Assessment of welding defects for gas pipeline radiographs using computer vision
CN110047073B (en) X-ray weld image defect grading method and system
CN110097547B (en) Automatic detection method for welding seam negative film counterfeiting based on deep learning
US5182775A (en) Method of processing radiographic image data for detecting a welding defect
Gayer et al. Automatic recognition of welding defects in real-time radiography
CN112150410A (en) Automatic detection method and system for weld defects
JP2968442B2 (en) Evaluation system for welding defects
Nacereddine et al. Weld defect detection in industrial radiography based digital image processing
CN113469177A (en) Drainage pipeline defect detection method and system based on deep learning
CN109859177A (en) Industrial x-ray image assessment method and device based on deep learning
CN110659675A (en) Welding seam defect detection method based on AdaBoost algorithm
CN115880302B (en) Method for detecting welding quality of instrument panel based on image analysis
Aoki et al. Application of artificial neural network to discrimination of defect type in automatic radiographic testing of welds
CN116309409A (en) Weld defect detection method, system and storage medium
CN115719332A (en) Welding quality detection method
CN111489310B (en) Searching method for small-diameter pipe welding joint radiographic inspection image weld joint area
CN114187256A (en) Method for detecting defects of welding seam X-ray photograph
CN116381053A (en) Ultrasonic detection method and system for welding metal materials
JP4981433B2 (en) Inspection device, inspection method, inspection program, and inspection system
Saravanan et al. Segmentation of defects from radiography images by the histogram concavity threshold method
Hu et al. Crack detection and evaluation method for self-piercing riveting button images based on BP neural network
Gao et al. Real-time X-ray radiography for defect detection in submerged arc welding and segmentation using sparse signal representation
CN113393440A (en) Method for automatically enhancing and identifying weld defects based on magneto-optical imaging

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20220315)