CN115082464A - Method and system for identifying welding seam data in welding process of dust remover - Google Patents
- Publication number
- CN115082464A (application CN202211002484.4A)
- Authority
- CN
- China
- Prior art keywords
- gray
- area
- region
- value
- image
- Prior art date
- Legal status
- Granted
Classifications
- G06T7/0004 — Industrial image inspection (under G06T7/00 Image analysis; G06T7/0002 Inspection of images, e.g. flaw detection)
- G06T7/13 — Edge detection (under G06T7/10 Segmentation; edge detection)
- G06T7/136 — Segmentation; edge detection involving thresholding
- G06T7/187 — Segmentation; edge detection involving region growing, region merging, or connected component labelling
- G06T2207/10048 — Infrared image (under G06T2207/10 Image acquisition modality)
- G06T2207/30108 — Industrial image inspection; G06T2207/30152 — Solder (under G06T2207/30 Subject of image)
- Y02P90/30 — Computing systems specially adapted for manufacturing (under Y02P Climate change mitigation technologies in the production or processing of goods)
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Quality & Reliability (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a method and a system for identifying weld seam data in the welding process of a dust remover, and relates to the technical field of data processing and identification. The method mainly comprises the following steps: processing an infrared image of the weld surface to obtain a grayscale image; processing the grayscale image to obtain the gray difference degree of each pixel point; obtaining a segmentation threshold from the gray difference degrees; using the segmentation threshold to obtain abnormal regions and normal regions; performing edge detection on the normal regions to obtain edge regions, and screening out the closed regions among them; constructing an objective function from the gray entropy of each closed region, the gray entropy of each abnormal region, and the area of each region, and calculating its value; and increasing the segmentation threshold by one step and repeating the threshold segmentation and objective-function calculation, iterating until the value of the objective function converges. The pixel points of the grayscale image whose gray values are greater than the segmentation threshold at convergence form the defect region.
Description
Technical Field
The application relates to the technical field of data processing and identification, in particular to a method and a system for identifying welding seam data in a welding process of a dust remover.
Background
Welding is an important process in the manufacture of a dust remover, and the quality of the welding directly affects the air permeability and the outlet concentration. After welding is finished, an infrared camera is therefore used to perform nondestructive detection on the weld seam: the infrared image is analyzed to judge whether defects exist, and the defect positions are segmented out so as to facilitate subsequent repair welding.
For the detection of defects in the weld seam of a dust remover, the prior art obtains the defective part of the weld by global segmentation with a preset threshold value.
However, since non-defective regions of the weld in the infrared image also show large gray-scale variation, it is difficult to locate the defective region accurately by global threshold segmentation with a fixed preset threshold. Moreover, a fixed preset threshold is not adjusted to the actual situation of the defects in the weld, so the detection is not targeted; and a corresponding threshold must be preset in advance for each type of weld or environment, which makes the detection of weld defects cumbersome and not universal.
Disclosure of Invention
In view of the above technical problems, the invention provides a method and a system for identifying weld seam data in the welding process of a dust remover, which determine the defect area in the weld without a manually preset gray threshold for global threshold segmentation, so that the detection result is more accurate and the detection process is more universal.
In a first aspect, an embodiment of the present disclosure provides a method for identifying weld data in a welding process of a dust remover, including:
S1: obtaining an infrared image of the surface of the weld seam to be detected, preprocessing the infrared image to obtain a weld region image, and graying the weld region image to obtain a grayscale image.
S2: and taking the direction of the maximum principal component after PCA is carried out on the gray level image as the extending direction of the welding line, and arranging the pixel values of the pixel points in the gray level image along the extending direction to respectively obtain each gray level sequence.
S3: and respectively obtaining the gray level difference degree of each pixel point in each gray level sequence according to the gray level entropy and the gray level mean value of the gray level sequence.
S4: setting a segmentation threshold according to the maximum gray difference degree and the minimum gray difference degree.
S5: and forming abnormal regions by the pixels with the gray values larger than the segmentation threshold value in the gray image, and forming normal regions by the pixels with the gray values not larger than the segmentation threshold value in the gray image.
S6: performing edge detection on the normal regions to obtain edge regions, screening out the closed regions among the edge regions, constructing an objective function according to the gray entropy of each closed region, the gray entropy of each abnormal region, and the area of each region, and calculating the value of the objective function.
S7: increasing the value of the segmentation threshold by one step, executing S5-S6, and iterating until the value of the objective function converges; taking the segmentation threshold at convergence as the convergence threshold, and forming the defect region from the pixel points of the grayscale image whose gray values are greater than the convergence threshold.
In a possible embodiment, the objective function is constructed from the gray entropy, the gray difference with adjacent pixel points, the gray mean, and the area influence degree of each closed region and each abnormal region, wherein ḡ_j denotes the gray mean of the j-th closed region, G_max the maximum gray value in the grayscale image, Δg_j the gray difference between the j-th closed region and its adjacent pixel points, H_j the gray entropy of the j-th closed region, ḡ_k, Δg_k and H_k the corresponding quantities of the k-th abnormal region, λ_j and λ_k the influence degrees of the areas of the j-th closed region and of the k-th abnormal region, n_1 the number of closed regions, and n_2 the number of abnormal regions.
In one possible embodiment, the method for obtaining the influence degree of the area of a closed region or an abnormal region includes:
when the area of the closed area or the abnormal area is larger than a preset area threshold value, the influence degree is a preset first numerical value, otherwise, the influence degree is a preset second numerical value, wherein the preset first numerical value is larger than the preset second numerical value.
In one possible embodiment, the calculating of the gray difference between the abnormal region and the adjacent pixel point thereof includes:
and taking 5 pixels adjacent to each edge pixel point of the abnormal area as adjacent pixel points, and subtracting the gray average value of the adjacent pixel points from the gray average value of the pixel points in the abnormal area to obtain the gray difference between the abnormal area and the adjacent pixel points.
In one possible embodiment, the calculating of the gray difference between the closed region and the adjacent pixel point thereof includes:
and taking 5 pixels adjacent to each edge pixel point of the closed area as adjacent pixel points, and subtracting the gray average value of the adjacent pixel points from the gray average value of the pixel points in the closed area to obtain the gray difference between the closed area and the adjacent pixel points.
In one possible embodiment, screening out the closed regions in the edge region comprises:
and drawing straight lines passing through the geometric center of the edge region in different directions at equal angles by taking a preset interval angle as an interval, respectively judging the number of intersection points of each straight line and the edge region, counting the proportion of the straight lines of which the number of the intersection points is more than 2 in the edge region, and taking the edge region as a closed region when the proportion is more than a preset first threshold value.
And screening out the closed regions in all the edge regions by using a closed region judgment method.
In a possible embodiment, obtaining the gray difference degree of each pixel point in each gray sequence according to the gray entropy and the gray mean of the gray sequence comprises:
computing the gray difference degree as F_i = |g_i − ḡ| / H, wherein F_i denotes the gray difference degree of the i-th pixel point in the gray sequence, ḡ the gray mean of the gray sequence, g_i the gray value of the i-th pixel point, and H the gray entropy of the gray sequence.
In one possible embodiment, setting the segmentation threshold according to the maximum gray difference degree and the minimum gray difference degree includes:
carrying out a weighted average of the maximum gray difference degree and the minimum gray difference degree, and taking the weighted-average result as the segmentation threshold, wherein the weight corresponding to the maximum gray difference degree in the weighted average is a preset constant.
In one possible embodiment, preprocessing the infrared image of the weld surface to obtain an image of the weld region includes:
setting the pixel value of a pixel point of the infrared image with the temperature not greater than a preset temperature threshold value to be 0, and obtaining a welding seam area image, wherein the preset temperature threshold value is obtained by counting historical data of the temperature of an area outside a welding seam.
In a second aspect, an embodiment of the present invention provides an image-processing-based system for detecting weld defects of a dust remover, comprising a memory and a processor, wherein the processor executes a computer program stored in the memory to implement the above method for identifying weld seam data in the welding process of a dust remover.
Compared with the prior art, the method and system for identifying weld seam data in the welding process of a dust remover provided by the invention have the beneficial effect that the defect area in the weld is determined without a manually preset gray threshold for global threshold segmentation, so that the detection result of the defects in the weld is more accurate and the detection process is universal.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a schematic flow chart of a method for identifying weld joint data in a welding process of a dust remover, provided by an embodiment of the invention.
FIG. 2 is a diagram illustrating an adjacent pixel according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature; in the description of the present embodiment, "a plurality" means two or more unless otherwise specified.
The embodiment of the invention provides a method for identifying welding seam data in a welding process of a dust remover, which comprises the following steps of:
and S1, acquiring an infrared image of the surface of the weld to be detected, preprocessing the infrared image to acquire a weld area image, and graying the weld area image to acquire a grayscale image.
And step S2, taking the direction of the maximum principal component after PCA is carried out on the gray level image as the extending direction of the welding line, and arranging the pixel values of the pixel points in the gray level image along the extending direction to respectively obtain each gray level sequence.
And step S3, respectively obtaining the gray level difference degree of each pixel point in each gray level sequence according to the gray level entropy and the gray level mean value of the gray level sequence.
In step S4, a segmentation threshold is set based on the maximum gray difference degree and the minimum gray difference degree.
Step S5, forming abnormal areas by the pixel points with the gray scale value larger than the segmentation threshold value in the gray scale image, and forming normal areas by the pixel points with the gray scale value not larger than the segmentation threshold value in the gray scale image.
And step S6, performing edge detection on the normal regions to obtain edge regions, screening out the closed regions among the edge regions, constructing an objective function according to the gray entropy of each closed region, the gray entropy of each abnormal region, and the area of each region, and calculating the value of the objective function.
And S7, increasing the value of the segmentation threshold by one step and executing S5 to S6, iterating until the value of the objective function converges; the segmentation threshold at convergence is taken as the convergence threshold, and the pixel points of the grayscale image whose gray values are greater than the convergence threshold form the defect region.
The embodiment of the invention aims at the following situation: an infrared image of the weld of the dust remover is collected, and the defect positions in the weld are obtained by processing the infrared image, so as to facilitate subsequent repair of the defects.
Further, step S1, an infrared image of the weld surface to be detected is obtained and is preprocessed to obtain a weld region image, and the weld region image is grayed to obtain a grayscale image. The method specifically comprises the following steps:
first, in order to realize the defect area positioning, the embodiment of the invention collects the infrared image of the weld surface. It should be noted that in this embodiment, the infrared image of the surface of the weld joint may be acquired by an infrared camera.
Secondly, the infrared image of the weld surface is preprocessed to obtain the weld region image. Since the color values of the weld region in the infrared image differ greatly from those of the non-weld region, the embodiment of the invention segments the weld region by color, specifically: the pixel values of pixel points whose temperature is not greater than a preset temperature threshold are set to 0, yielding the weld region image, wherein the preset temperature threshold is obtained from statistics of historical temperature data of regions outside the weld.
Finally, the obtained weld region image is grayed to obtain a grayscale image: the maximum of the pixel values of each pixel point of the weld region image over the three RGB channels is taken as the gray value of that pixel point in the grayscale image. In this way the influence of color on the subsequent processing is avoided.
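The two preprocessing steps above (temperature masking and max-channel graying) can be sketched as follows; function and parameter names are illustrative, not from the source.

```python
import numpy as np

def weld_gray_image(ir_image, temperature, temp_threshold):
    """Keep only weld pixels and gray the image as the text describes:
    pixels whose temperature does not exceed the threshold are set to 0,
    and the gray value is the maximum of the R, G, B channels."""
    region = ir_image.copy()
    region[temperature <= temp_threshold] = 0  # suppress non-weld pixels
    return region.max(axis=2)                  # max over the three channels
```

The temperature threshold itself would come from statistics of historical non-weld temperatures, as the text states.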
Further, in step S2, the maximum principal component direction obtained by PCA of the grayscale image is taken as the extending direction of the weld, and the pixel values of the pixels in the grayscale image along the extending direction are arranged to obtain each grayscale sequence. The method specifically comprises the following steps:
analyzing based on the welding seam gray level image, and calculating the gray level difference degree of each area pixel:
first, the maximum Principal component direction after PCA of a grayscale image is taken as the extending direction of the bead, and PCA (Principal components analysis) is one of important dimension reduction methods. The method has wide application in the fields of data compression and redundancy elimination, data noise elimination and the like. It uses orthogonal transformation to transform a series of possible linearly related variables into a set of linearly uncorrelated new variables, also called principal components, to characterize the data in smaller dimensions using the new variables.
Geometrically, PCA projects the original data onto a new coordinate system: the first principal component is the first coordinate axis and represents the interval of variation of a new variable obtained by transforming the original variables; the second principal component is the second coordinate axis and represents the variation interval of a second such new variable. Sample differences in the raw data are thus explained in terms of the new variables.
It should be noted that there are many possible projections; to preserve as much of the original data's information as possible, the maximum-variance (equivalently, minimum-reconstruction-loss) criterion is generally used, so that the first principal component has the maximum variance. In this embodiment, the first (maximum) principal component direction is defined as the extending direction of the weld bead.
Secondly, the pixel values of the pixel points in the grayscale image are arranged along the extending direction to obtain the gray sequences: straight lines are drawn along the extending direction of the weld through the pixel points of the grayscale image, and the gray values of the pixel points on each straight line are arranged to obtain each gray sequence.
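Under the assumption that PCA is applied to the coordinates of the weld (nonzero) pixels — the source does not fix the exact PCA input — the extending direction can be sketched as:

```python
import numpy as np

def weld_direction(gray):
    """Largest-variance direction of the weld pixels via PCA: eigenvector of
    the coordinate covariance matrix with the largest eigenvalue."""
    ys, xs = np.nonzero(gray)
    pts = np.stack([xs, ys], axis=1).astype(float)
    pts -= pts.mean(axis=0)                      # center the coordinates
    cov = pts.T @ pts / len(pts)
    eigvals, eigvecs = np.linalg.eigh(cov)       # ascending eigenvalues
    return eigvecs[:, np.argmax(eigvals)]        # unit vector along the weld
```

Gray sequences would then be read along lines parallel to this vector.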
Further, step S3, obtaining the gray level difference of each pixel in each gray level sequence according to the gray level entropy and the gray level mean of the gray level sequence. The method specifically comprises the following steps:
The gray difference degree is computed as F_i = |g_i − ḡ| / H, wherein F_i denotes the gray difference degree of the i-th pixel point in a gray sequence, ḡ the gray mean of that gray sequence, g_i the gray value of the i-th pixel point, and H the gray entropy of the gray sequence. In this way the gray difference degree of every pixel point in every gray sequence is obtained. Using the gray entropy of the gray sequence as the denominator suppresses the influence of boundary areas where the gray level changes: although pixels at a gray-change boundary show a large gray difference, they do not belong to a defect, and the division reduces their gray difference degree.
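The computation above can be sketched in Python. Since the formula image is not reproduced in the source, the ratio of the absolute deviation from the sequence mean to the sequence's gray entropy is an assumption based on the surrounding description:

```python
import numpy as np

def gray_entropy(seq):
    """Shannon entropy of the gray-level distribution of a sequence."""
    _, counts = np.unique(np.asarray(seq), return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def gray_difference_degree(seq):
    """Gray difference degree of each pixel of one gray sequence: absolute
    deviation from the sequence mean, divided by the sequence's entropy."""
    seq = np.asarray(seq, dtype=float)
    return np.abs(seq - seq.mean()) / max(gray_entropy(seq), 1e-9)
```

A pixel far from its sequence's mean receives a large degree, as the text requires.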
It should be noted that, when the difference between the gray value of a pixel and the gray value of the gray sequence in which the pixel is located is large in the extending direction of the weld, it indicates that the probability that the pixel belongs to a defect is large, and on the other hand, the probability that the same gray value belongs to a defect in different regions is also different.
Further, in step S4, a division threshold is set based on the maximum gradation difference degree and the minimum gradation difference degree. The method specifically comprises the following steps:
A weighted average of the maximum gray difference degree and the minimum gray difference degree is carried out, and the weighted-average result is taken as the segmentation threshold, wherein the weight corresponding to the maximum gray difference degree in the weighted average is a preset constant, and the weight corresponding to the minimum gray difference degree is one minus that constant.
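This initialization can be sketched as follows; since the weight for the maximum gray difference degree is not reproduced in the source, `w_max` is a hypothetical parameter:

```python
def initial_threshold(diff_degrees, w_max=0.5):
    """Weighted average of the maximum and minimum gray difference degree;
    the weight w_max assigned to the maximum is an assumed value."""
    return w_max * max(diff_degrees) + (1 - w_max) * min(diff_degrees)
```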
Further, in step S5, the pixels in the grayscale image with the grayscale value greater than the segmentation threshold are formed into an abnormal region, and the pixels in the grayscale image with the grayscale value not greater than the segmentation threshold are formed into a normal region. The method specifically comprises the following steps:
specifically, the pixels with the gray values larger than the segmentation threshold in the gray image form an abnormal region, and the pixels with the gray values not larger than the segmentation threshold in the gray image form a normal region.
Further, step S6: perform edge detection on the normal region to obtain edge regions, screen out the closed regions among them, construct an objective function according to the gray entropy of each closed region, the gray entropy of each abnormal region, and the area of each region, and calculate the value of the objective function. The method specifically comprises the following steps:
firstly, carrying out edge detection on a normal region to obtain an edge region, and screening out a closed region in the edge region.
It should be noted that edge detection is a basic step of image processing and a fundamental research direction within it. It identifies pixel points with obvious color or brightness changes in a digital image; such significant changes usually indicate important changes in image properties, including discontinuities in depth, in surface orientation, and in brightness.
There are many commonly used edge-detection operators: first-order operators include the Roberts, Prewitt, Sobel, and Canny operators; the Laplacian operator is second-order. Edge detection is realized from the image gradient: the operators are applied to the image by convolution, and the resulting gradient is used to detect edges.
As an example, in the embodiment of the present invention, the Sobel operator is used to perform edge detection on the normal region of the grayscale image to obtain the edge regions.
Specifically, the process of screening out the closed region in the edge region includes: and drawing straight lines passing through the geometric center of the edge region in different directions at equal angles by taking a preset interval angle as an interval, respectively judging the number of intersection points of each straight line and the edge region, counting the proportion of the straight lines of which the number of the intersection points is more than 2 in the edge region, and taking the edge region as a closed region when the proportion is more than a preset first threshold value. And screening out the closed regions in all the edge regions by using a judging method of the closed regions.
As an example, in the embodiment of the present invention, the preset interval angle is 1 °, and the preset first threshold value is 0.9.
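The ray test above can be sketched with the example values (1° interval, first threshold 0.9). On a pixel grid a closed contour yields at least two runs of edge pixels per line through the center, so the check below counts runs and requires at least two; function names are illustrative.

```python
import numpy as np

def is_closed(edge_mask, interval_deg=1.0, ratio_threshold=0.9):
    """Ray test for whether an edge region is closed: draw lines through the
    geometric center at equal angular intervals, count the separate runs of
    edge pixels each line crosses, and require a high fraction of lines with
    at least two crossings."""
    ys, xs = np.nonzero(edge_mask)
    cy, cx = ys.mean(), xs.mean()
    h, w = edge_mask.shape
    reach = int(np.hypot(h, w))
    n_lines = int(round(180 / interval_deg))
    hits = 0
    for k in range(n_lines):
        theta = np.deg2rad(k * interval_deg)
        t = np.arange(-reach, reach)  # sample points along the line
        px = np.clip(np.round(cx + t * np.cos(theta)).astype(int), 0, w - 1)
        py = np.clip(np.round(cy + t * np.sin(theta)).astype(int), 0, h - 1)
        on_edge = edge_mask[py, px].astype(int)
        runs = np.count_nonzero(np.diff(on_edge) == 1) + on_edge[0]
        if runs >= 2:
            hits += 1
    return hits / n_lines > ratio_threshold
```

A closed contour (e.g. a rectangular ring) passes the test; an open segment fails it.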
Secondly, an objective function is constructed according to the gray entropy of each closed region, the gray entropy of each abnormal region, and the area of each region, and the value of the objective function is calculated. The method specifically comprises the following steps:
the objective function is:
wherein the content of the first and second substances,in order to be the objective function, the target function,represents the mean value of the gray levels of the jth occlusion region,representing the maximum gray value in the gray-scale image,representing the gray difference between the jth closed area and the adjacent pixel point,representing the j-th occlusion regionThe entropy of the gray scale is such that,is shown asThe mean value of the gray levels of the individual abnormal regions,is shown asThe gray scale difference between the abnormal region and the adjacent pixel point,is shown asThe gray level entropy of the individual abnormal regions,indicating the degree of influence of the area of the jth occlusion region,is shown asThe degree of influence of the area of the individual anomaly regions,is the number of the closed areas and,the number of abnormal regions.
The calculation of the gray difference between the closed area or the abnormal area and the adjacent pixel point comprises the following steps: fig. 2 shows a schematic diagram of adjacent pixels in the embodiment of the present invention, and as shown in fig. 2, 5 pixels adjacent to each edge pixel of the closed region or the abnormal region are used as adjacent pixels, and the gray level average of the adjacent pixels is subtracted from the gray level average of the pixels in the closed region or the abnormal region, so as to obtain the gray level difference between the closed region or the abnormal region and the adjacent pixels.
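The region-versus-neighborhood gray difference can be sketched as follows. The patent takes 5 specific neighbors per edge pixel as shown in Fig. 2; since that figure is not reproduced here, this sketch approximates the adjacent set with all 8-connected pixels just outside the region.

```python
import numpy as np

def _shift(mask, dy, dx):
    """Shift a boolean mask by (dy, dx) without wrap-around."""
    out = np.zeros_like(mask)
    h, w = mask.shape
    out[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)] = \
        mask[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
    return out

def region_gray_difference(gray, region_mask):
    """Mean gray of the region minus mean gray of its adjacent pixels
    (pixels 8-connected to the region but outside it)."""
    adj = np.zeros_like(region_mask)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy or dx:
                adj |= _shift(region_mask, dy, dx)
    adj &= ~region_mask
    return float(gray[region_mask].mean() - gray[adj].mean())
```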
It should be noted that, when the area of the closed region or the abnormal region is greater than the preset area threshold, the influence degree is a preset first value, otherwise, the influence degree is a preset second value, where the preset first value is greater than the preset second value.
As an example, the preset area threshold value is 4 in the embodiment of the present invention.
As an example, in the embodiment of the present invention, the preset first value is 4, and the preset second value is 0.5.
It should be noted that, according to the characteristics of weld defects, the gray value of a defect region is smaller, the variation of gray values inside the defect region is smaller, and the contrast between the defect region and its adjacent region is larger; in addition, to avoid noise interference, a defect region should be larger than a certain area. These three characteristics, together with the area constraint, are therefore used to represent the defect conformity, and the defect conformity of a non-defect region is smaller.
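The per-region quantities that feed the objective function (gray mean, gray entropy, and the area influence degree with the embodiment's values: threshold 4, first value 4, second value 0.5) can be computed as in this sketch. The exact combination of these features into the objective function follows the patent's formula, which is not reproduced here, so only the features are shown; function names are hypothetical.

```python
import numpy as np

def gray_entropy(values):
    """Shannon entropy (bits) of the gray-level histogram of a pixel set."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def region_features(gray, region_mask, area_threshold=4, v1=4.0, v2=0.5):
    """Defect cues for one closed/abnormal region: gray mean, internal
    gray entropy, and the area influence degree (v1 if the area exceeds
    the threshold, v2 otherwise)."""
    vals = gray[region_mask]
    influence = v1 if vals.size > area_threshold else v2
    return float(vals.mean()), gray_entropy(vals), influence
```

A uniform region has entropy 0 (small internal gray variation, a defect cue), while a region whose pixels all differ has entropy log2 of its area.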
Further, in step S7, the value of the segmentation threshold is increased by one step, S5 to S6 are executed, and the iteration continues until the value of the objective function converges; the segmentation threshold at convergence is taken as the convergence threshold, and the pixel points in the gray-scale image whose gray values are greater than the convergence threshold form the defect region. Specifically:
as an example, the step size in the embodiment of the present invention is 10.
The gray-scale image is segmented with the obtained convergence threshold; that is, the pixel points in the gray-scale image whose gray values are greater than the convergence threshold form the defect region. In this way, the regions of the weld in which defects exist are obtained.
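The iteration of steps S5 to S7 can be sketched as a simple loop. Here `objective` stands for the whole evaluate-at-threshold pipeline (segment at the threshold, screen closed regions in the normal region, score closed and abnormal regions); treating convergence as "successive objective values within a tolerance" is an assumption, and all names are hypothetical.

```python
import numpy as np

def find_convergence_threshold(objective, t0, step=10, t_max=255, tol=1e-6):
    """Raise the segmentation threshold by `step` (10 in the embodiment)
    and re-evaluate the objective until its value stops changing."""
    prev = objective(t0)
    t = t0
    while t + step <= t_max:
        t += step
        cur = objective(t)
        if abs(cur - prev) < tol:
            return t                 # convergence threshold
        prev = cur
    return t

def defect_region(gray, threshold):
    """Final segmentation: pixels brighter than the convergence threshold."""
    return gray > threshold
```

`find_convergence_threshold(evaluate, t0)` would be called with `t0` set from the maximum and minimum gray difference degrees of step S4, and the returned threshold then feeds `defect_region`.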
Based on the same inventive concept as the method, this embodiment further provides an image-processing-based weld defect detection system for a dust remover. The system comprises a memory and a processor, and the processor executes a computer program stored in the memory to detect defects in the weld of the dust remover as described in the embodiment of the method for identifying weld data in the welding process of the dust remover.
The detection of defects in the weld of the dust remover has already been described in the embodiment of the method for identifying weld data in the welding process of the dust remover, so the details are not repeated here.
In summary, the method and system for identifying weld data in the welding process of a dust remover provided by the embodiments of the present invention determine the defect region in the weld without an artificially preset gray threshold for global threshold segmentation, so the detection result for defects in the weld is more accurate and the detection process is more universal.
Words such as "including," "comprising," and "having" in this disclosure are open-ended terms that mean "including but not limited to" and are used interchangeably therewith. The word "or" as used herein means, and is used interchangeably with, the word "and/or," unless the context clearly dictates otherwise. The phrase "such as" is used herein to mean, and is used interchangeably with, the phrase "such as but not limited to."
It should also be noted that in the method and system of the present invention, various components or steps may be decomposed and/or re-combined. These decompositions and/or recombinations are to be considered equivalents of the present disclosure.
The above-mentioned embodiments are merely examples for clearly illustrating the present invention and do not limit its scope. It will be apparent to those skilled in the art that other variations and modifications may be made on the basis of the foregoing description, and it is neither necessary nor possible to exhaustively enumerate all embodiments here. All designs identical or similar to the present invention fall within the protection scope of the present invention.
Claims (10)
1. A method for identifying welding seam data in a welding process of a dust remover is characterized by comprising the following steps:
s1: acquiring an infrared image of the surface of a weld to be detected, preprocessing the infrared image to acquire a weld area image, and graying the weld area image to acquire a grayscale image;
s2: taking the direction of the maximum principal component after PCA is carried out on the gray level image as the extending direction of a welding line, and arranging the pixel values of the pixel points in the gray level image along the extending direction to respectively obtain each gray level sequence;
s3: respectively obtaining the gray level difference degree of each pixel point in each gray level sequence according to the gray level entropy and the gray level mean value of the gray level sequence;
s4: setting a segmentation threshold according to the maximum gray difference and the minimum gray difference;
s5: forming abnormal regions by pixel points with the gray values larger than the segmentation threshold value in the gray image, and forming normal regions by pixel points with the gray values not larger than the segmentation threshold value in the gray image;
s6: performing edge detection on the normal region to obtain an edge region, screening out closed regions in the edge region, constructing a target function according to the gray entropy of each closed region, the gray entropy of each abnormal region and the area, and calculating the value of the target function;
s7: and increasing the value of the segmentation threshold by one step, executing S5-S6, iterating until the value of the target function is converged, taking the segmentation threshold when the target function is converged as a convergence threshold, and forming a defect region by using pixel points of which the gray values are greater than the convergence threshold in the gray image.
2. The method for identifying the weld joint data in the welding process of the dust remover according to claim 1, wherein an objective function is constructed according to the gray entropy of each closed region, the gray entropy of each abnormal region and the area, and the method comprises the following steps:
wherein F is the objective function; μ_j denotes the gray mean of the j-th closed region; G_max denotes the maximum gray value in the gray-scale image; Δg_j denotes the gray difference between the j-th closed region and its adjacent pixel points; H_j denotes the gray entropy of the j-th closed region; μ_k denotes the gray mean of the k-th abnormal region; Δg_k denotes the gray difference between the k-th abnormal region and its adjacent pixel points; H_k denotes the gray entropy of the k-th abnormal region; s_j denotes the influence degree of the area of the j-th closed region; s_k denotes the influence degree of the area of the k-th abnormal region; m is the number of closed regions; and n is the number of abnormal regions.
3. The method for identifying the weld joint data in the welding process of the dust remover according to claim 2, wherein the obtaining process of the influence degree of the area of the closed region or the abnormal region comprises the following steps:
when the area of the closed area or the abnormal area is larger than a preset area threshold value, the influence degree is a preset first numerical value, otherwise, the influence degree is a preset second numerical value, wherein the preset first numerical value is larger than the preset second numerical value.
4. The method for identifying the welding seam data in the welding process of the dust remover according to claim 2, wherein the calculation process of the gray difference between the abnormal area and the adjacent pixel point comprises the following steps:
and taking 5 pixels adjacent to each edge pixel point of the abnormal area as adjacent pixel points, and subtracting the gray average value of the adjacent pixel points from the gray average value of the pixel points in the abnormal area to obtain the gray difference between the abnormal area and the adjacent pixel points.
5. The method for identifying the welding seam data in the welding process of the dust remover according to claim 2, wherein the calculation process of the gray difference between the closed area and the adjacent pixel point comprises the following steps:
and taking 5 pixels adjacent to each edge pixel point of the closed area as adjacent pixel points, and subtracting the gray average value of the adjacent pixel points from the gray average value of the pixel points in the closed area to obtain the gray difference between the closed area and the adjacent pixel points.
6. The method for identifying the weld data in the welding process of the dust remover according to claim 1, wherein screening out the closed region in the edge region comprises the following steps:
drawing straight lines passing through the geometric center of the edge area in different directions at equal angles by taking a preset interval angle as an interval, respectively judging the number of intersection points of each straight line and the edge area, counting the proportion of the straight lines of which the number of the intersection points is more than 2 in the edge area, and taking the edge area as a closed area when the proportion is more than a preset first threshold value;
and screening out the closed regions in all the edge regions by using a closed region judgment method.
7. The method for identifying the welding seam data in the welding process of the dust remover according to claim 1, wherein the step of respectively obtaining the gray level difference degree of each pixel point in each gray level sequence according to the gray level entropy and the gray level mean value of the gray level sequence comprises the following steps:
wherein D_i denotes the gray difference degree of the i-th pixel point in the gray sequence; μ denotes the gray mean of the gray sequence; g_i denotes the gray value of the i-th pixel point in the gray sequence; and H denotes the gray entropy of the gray sequence.
8. The method for identifying the weld data in the welding process of the dust remover according to claim 1, wherein a segmentation threshold is set according to the maximum gray difference degree and the minimum gray difference degree, comprising the following steps:
9. The method for identifying the welding seam data in the welding process of the dust remover according to claim 1, wherein the step of preprocessing the infrared image of the welding seam surface to obtain the welding seam area image comprises the following steps:
setting the pixel value of a pixel point of which the temperature is not more than a preset temperature threshold value in the infrared image to be 0, and obtaining an image of a welding seam region, wherein the preset temperature threshold value is obtained by counting historical data of the temperature of a region outside the welding seam.
10. A welding seam data identification system in a welding process of a dust remover comprises: a memory and a processor, wherein the processor executes a computer program stored in the memory to implement the method for identifying weld data in a welding process of a dust remover according to any of claims 1-9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211002484.4A CN115082464B (en) | 2022-08-22 | 2022-08-22 | Method and system for identifying weld data in welding process of dust remover |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115082464A true CN115082464A (en) | 2022-09-20 |
CN115082464B CN115082464B (en) | 2023-12-12 |
Family
ID=83243952
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211002484.4A Active CN115082464B (en) | 2022-08-22 | 2022-08-22 | Method and system for identifying weld data in welding process of dust remover |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115082464B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103942777A (en) * | 2014-03-13 | 2014-07-23 | 华南理工大学 | Mobile phone glass cover plate defect detecting method based on principal component analysis |
CN114862862A (en) * | 2022-07-11 | 2022-08-05 | 江苏大田阀门制造有限公司 | Pump body cold shut defect identification method and system based on image processing |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115249301A (en) * | 2022-09-22 | 2022-10-28 | 精技精密部件(南通)有限公司 | Method for extracting grinding wrinkles on surface of workpiece |
CN115272316A (en) * | 2022-09-27 | 2022-11-01 | 山东华太新能源电池有限公司 | Intelligent detection method for welding quality of battery cover based on computer vision |
CN115345894A (en) * | 2022-10-17 | 2022-11-15 | 南通通力油泵有限公司 | Welding seam ray detection image segmentation method |
CN115359047A (en) * | 2022-10-19 | 2022-11-18 | 元能微电子科技南通有限公司 | Abnormal defect detection method for intelligent welding of PCB (printed circuit board) |
CN115375588A (en) * | 2022-10-25 | 2022-11-22 | 山东旗胜电气股份有限公司 | Power grid transformer fault identification method based on infrared imaging |
CN115375588B (en) * | 2022-10-25 | 2023-02-07 | 山东旗胜电气股份有限公司 | Power grid transformer fault identification method based on infrared imaging |
CN115457031A (en) * | 2022-10-27 | 2022-12-09 | 江苏集宿智能装备有限公司 | Method for identifying internal defects of integrated box based on X-ray |
CN116038112A (en) * | 2022-12-06 | 2023-05-02 | 西南石油大学 | Laser tracking large-scale curved plate fillet welding system and method |
CN115880302A (en) * | 2023-03-08 | 2023-03-31 | 杭州智源电子有限公司 | Instrument panel welding quality detection method based on image analysis |
CN116128877A (en) * | 2023-04-12 | 2023-05-16 | 山东鸿安食品科技有限公司 | Intelligent exhaust steam recovery monitoring system based on temperature detection |
CN116128877B (en) * | 2023-04-12 | 2023-06-30 | 山东鸿安食品科技有限公司 | Intelligent exhaust steam recovery monitoring system based on temperature detection |
CN116188498A (en) * | 2023-04-28 | 2023-05-30 | 江西科技学院 | Axle welding area detection method and system based on computer vision |
CN116385439B (en) * | 2023-06-05 | 2023-08-15 | 山东兰通机电有限公司 | Motor rubber shock pad quality detection method based on image processing |
CN116385439A (en) * | 2023-06-05 | 2023-07-04 | 山东兰通机电有限公司 | Motor rubber shock pad quality detection method based on image processing |
CN116385476B (en) * | 2023-06-05 | 2023-08-18 | 青岛星跃铁塔有限公司 | Iron tower quality analysis method based on visual detection |
CN116385476A (en) * | 2023-06-05 | 2023-07-04 | 青岛星跃铁塔有限公司 | Iron tower quality analysis method based on visual detection |
CN116664569B (en) * | 2023-07-31 | 2023-10-10 | 山东正华建筑科技有限公司 | Weld flash defect detection method |
CN116664569A (en) * | 2023-07-31 | 2023-08-29 | 山东正华建筑科技有限公司 | Weld flash defect detection method |
CN116805317B (en) * | 2023-08-28 | 2023-11-14 | 苏州科尔珀恩机械科技有限公司 | Rotary furnace inner wall defect detection method based on artificial intelligence |
CN116805317A (en) * | 2023-08-28 | 2023-09-26 | 苏州科尔珀恩机械科技有限公司 | Rotary furnace inner wall defect detection method based on artificial intelligence |
CN117408958A (en) * | 2023-10-16 | 2024-01-16 | 日照鼎立钢构股份有限公司 | Method and system for monitoring production quality of steel structural member |
CN117408958B (en) * | 2023-10-16 | 2024-03-26 | 日照鼎立钢构股份有限公司 | Method and system for monitoring production quality of steel structural member |
CN117455870A (en) * | 2023-10-30 | 2024-01-26 | 太康精密(中山)有限公司 | Connecting wire and connector quality visual detection method |
CN117455870B (en) * | 2023-10-30 | 2024-04-16 | 太康精密(中山)有限公司 | Connecting wire and connector quality visual detection method |
CN117197588A (en) * | 2023-11-02 | 2023-12-08 | 南通宝田包装科技有限公司 | Plastic package control early warning method based on temperature identification |
CN117197588B (en) * | 2023-11-02 | 2024-03-05 | 南通宝田包装科技有限公司 | Plastic package control early warning method based on temperature identification |
CN117351013A (en) * | 2023-12-05 | 2024-01-05 | 天津风霖物联网科技有限公司 | Intelligent detection system and method for building damage |
CN117351013B (en) * | 2023-12-05 | 2024-02-09 | 天津风霖物联网科技有限公司 | Intelligent detection system and method for building damage |
Also Published As
Publication number | Publication date |
---|---|
CN115082464B (en) | 2023-12-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN115082464B (en) | Method and system for identifying weld data in welding process of dust remover | |
WO2022042579A1 (en) | Lcd screen defect detection method and apparatus | |
CN111047568B (en) | Method and system for detecting and identifying steam leakage defect | |
EP1430446B1 (en) | Image processing method for appearance inspection | |
CN114943739B (en) | Aluminum pipe quality detection method | |
KR20020077420A (en) | Method for automatically detecting casting defects in a test piece | |
CN115690108A (en) | Aluminum alloy rod production quality evaluation method based on image processing | |
CN111667470B (en) | Industrial pipeline flaw detection inner wall detection method based on digital image | |
CN117351019B (en) | Welding defect detection method | |
CN112381791A (en) | Bolt looseness detection method based on 3D point cloud | |
CN113340909B (en) | Glue line defect detection method based on machine vision | |
CN113689415A (en) | Steel pipe wall thickness online detection method based on machine vision | |
CN115471486A (en) | Switch interface integrity detection method | |
CN116612112B (en) | Visual inspection method for surface defects of bucket | |
CN116385433B (en) | Plastic pipeline welding quality assessment method | |
CN116258838B (en) | Intelligent visual guiding method for duct piece mold clamping system | |
CN114998346B (en) | Waterproof cloth quality data processing and identifying method | |
CN114529543B (en) | Installation detection method and device for peripheral screw gasket of aero-engine | |
CN111192261A (en) | Method for identifying lens defect types | |
CN114638822B (en) | Method and system for detecting surface quality of automobile cover plate by using optical means | |
CN115761257A (en) | Method and device for detecting angle of part based on computer vision technology | |
CN115049641A (en) | Electric data processing method and system for anomaly detection of mechanical parts | |
CN115797314A (en) | Part surface defect detection method, system, equipment and storage medium | |
CN114638847A (en) | Insulator hardware trimming method and system based on image processing | |
CN109949245B (en) | Cross laser detection positioning method and device, storage medium and computer equipment |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
20231107 | TA01 | Transfer of patent application right | Applicant after: Guangzhou Yutong Environmental Protection Technology Co.,Ltd. (Room 1804, Building 1, No. 96 Lixin 12th Road, Xintang Town, Zengcheng District, Guangzhou City, Guangdong Province, 511300); Applicant before: Nantong feilida Hydraulic Technology Co.,Ltd. (No. 142, tongqi Road, Haimen Industrial Park, Haimen District, Nantong City, Jiangsu Province, 226100)
| GR01 | Patent grant |