CN115082464B - Method and system for identifying weld data in welding process of dust remover - Google Patents


Info

Publication number
CN115082464B
CN115082464B (application CN202211002484.4A)
Authority
CN
China
Prior art keywords
gray
region
value
area
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211002484.4A
Other languages
Chinese (zh)
Other versions
CN115082464A (en)
Inventor
刘双粉
Current Assignee
Guangzhou Yutong Environmental Protection Technology Co ltd
Original Assignee
Guangzhou Yutong Environmental Protection Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Yutong Environmental Protection Technology Co ltd filed Critical Guangzhou Yutong Environmental Protection Technology Co ltd
Priority to CN202211002484.4A priority Critical patent/CN115082464B/en
Publication of CN115082464A publication Critical patent/CN115082464A/en
Application granted granted Critical
Publication of CN115082464B publication Critical patent/CN115082464B/en


Classifications

    • G — Physics
    • G06 — Computing; calculating or counting
    • G06T — Image data processing or generation, in general
    • G06T 7/00 — Image analysis
    • G06T 7/0002 — Inspection of images, e.g. flaw detection
    • G06T 7/0004 — Industrial image inspection
    • G06T 7/13 — Edge detection
    • G06T 7/136 — Segmentation; edge detection involving thresholding
    • G06T 7/187 — Segmentation; edge detection involving region growing, region merging or connected component labelling
    • G06T 2207/10048 — Image acquisition modality: infrared image
    • G06T 2207/30108 — Subject of image: industrial image inspection
    • G06T 2207/30152 — Subject of image: solder
    • Y02P 90/30 — Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a method and a system for identifying weld data in the welding process of a dust remover, relating to the technical field of data processing and recognition. The method mainly comprises the following steps: processing the infrared image of the weld surface to obtain a gray image; processing the gray image to obtain the gray difference degree of each pixel; obtaining a segmentation threshold from the gray difference degrees; obtaining an abnormal region and a normal region with the segmentation threshold; carrying out edge detection on the normal region to obtain edge regions and screening out the closed regions among them; constructing an objective function from the gray entropy of each closed region and the gray entropy and area of each abnormal region, and calculating its value; then increasing the segmentation threshold by one step, repeating the threshold segmentation and objective-function calculation, and iterating until the value of the objective function converges. The pixels of the gray image whose gray values are greater than the segmentation threshold at convergence form the defect region.

Description

Method and system for identifying weld data in welding process of dust remover
Technical Field
The application relates to the technical field of data processing and recognition, in particular to a method and a system for recognizing weld data in a welding process of a dust remover.
Background
In the manufacture of dust removers, welding is an important process: welding quality directly affects air permeability and outlet concentration. Therefore, after welding is finished, the weld is inspected nondestructively with an infrared camera; by analyzing the infrared image it is judged whether defects exist, and the defect positions are segmented out to facilitate subsequent repair welding.
For the detection of defects in dust remover welds, the prior art performs global segmentation with a preset threshold to obtain the defective parts of the weld.
However, because non-defective weld areas in the infrared image also show large gray-level variation, it is difficult to accurately locate the defective area by global threshold segmentation with a fixed preset threshold. Moreover, the fixed preset threshold of traditional threshold segmentation is not adjusted in time to the actual condition of the defects in the weld, so the detection process lacks specificity; and presetting a corresponding threshold in advance for welds of different types or environments makes the detection of weld defects more cumbersome and not universal.
Disclosure of Invention
In view of these technical problems, the application provides a method and a system for identifying weld data in the welding process of a dust remover that determine the defect area in the weld without manually presetting a gray threshold for global threshold segmentation, so that the detection result of weld defects is more accurate and the detection process is more universal.
In a first aspect, an embodiment of the present application provides a method for identifying weld data in a welding process of a dust remover, including:
S1: obtaining an infrared image of the surface of the welding seam to be detected, preprocessing the infrared image to obtain a welding seam area image, and graying the welding seam area image to obtain a gray image.
S2: and taking the direction of the maximum principal component after PCA of the gray level image as the extending direction of the welding seam, and arranging the pixel values of the pixel points along the extending direction in the gray level image to obtain each gray level sequence respectively.
S3: and respectively obtaining the gray level difference degree of each pixel point in each gray level sequence according to the gray level entropy and the gray level average value of the gray level sequence.
S4: the segmentation threshold is set according to the maximum gray level difference and the minimum gray level difference.
S5: and forming an abnormal region by using pixels with gray values larger than the segmentation threshold in the gray image, and forming a normal region by using pixels with gray values not larger than the segmentation threshold in the gray image.
S6: and carrying out edge detection on the normal region to obtain an edge region, screening out closed regions in the edge region, constructing an objective function according to the gray entropy of each closed region, the gray entropy and the area of each abnormal region, and calculating the value of the objective function.
S7: and (3) increasing the value of the segmentation threshold by one step length, executing S5 to S6, iterating until the value of the objective function is converged, taking the segmentation threshold when the objective function is converged as a convergence threshold, and forming a defect area by using pixel points with gray values larger than the convergence threshold in the gray image.
In one possible embodiment, constructing the objective function according to the gray entropy of each of the closed regions, the gray entropy of each of the abnormal regions, and the area includes:
The objective function is

$$F=\sum_{j=1}^{m}\frac{(G_{\max}-\mu_j)\,\Delta_j\,w_j}{H_j}+\sum_{k=1}^{n}\frac{(G_{\max}-\mu_k)\,\Delta_k\,w_k}{H_k}$$

wherein $\mu_j$ represents the gray mean of the $j$-th closed region, $G_{\max}$ represents the maximum gray value in the gray image, $\Delta_j$ represents the gray difference between the $j$-th closed region and its adjacent pixels, $H_j$ represents the gray entropy of the $j$-th closed region, $\mu_k$, $\Delta_k$ and $H_k$ represent the gray mean, the gray difference with adjacent pixels and the gray entropy of the $k$-th abnormal region, $w_j$ and $w_k$ indicate the influence degrees of the areas of the $j$-th closed region and the $k$-th abnormal region, $m$ is the number of closed regions, and $n$ is the number of abnormal regions.
In one possible embodiment, the method for obtaining the degree of influence of the area of the closed area or the abnormal area includes:
when the area of the closed area or the abnormal area is larger than a preset area threshold, the influence degree is a preset first value, otherwise, the influence degree is a preset second value, wherein the preset first value is larger than the preset second value.
In one possible embodiment, the calculation of the gray scale difference of the abnormal region and the adjacent pixel point thereof includes:
and taking 5 pixels adjacent to each edge pixel point of the abnormal region as adjacent pixel points, and subtracting the gray average value of the adjacent pixel points from the gray average value of the pixel points in the abnormal region to obtain the gray difference between the abnormal region and the adjacent pixel points.
In one possible embodiment, the calculation of the gray scale difference between the closed region and its neighboring pixel points includes:
and 5 pixels adjacent to each edge pixel point of the closed region are used as adjacent pixel points, and the gray average value of the adjacent pixel points is subtracted from the gray average value of the pixel points in the closed region to obtain the gray difference between the closed region and the adjacent pixel points.
In one possible embodiment, screening out the closed region in the edge region includes:
and drawing straight lines passing through geometric centers of the edge areas in different directions at equal angles by taking a preset interval angle as an interval, respectively judging the number of intersection points of each straight line and the edge areas, counting the duty ratio of the straight lines with the number of intersection points being more than 2 in the edge areas, and taking the edge areas as closed areas when the duty ratio is more than a preset first threshold value.
And screening out the closed areas in all the edge areas by using a closed area judging method.
In a possible embodiment, according to the gray entropy and the gray average value of the gray sequence, the gray difference of each pixel point in each gray sequence is obtained respectively, including:
$$d_i=\frac{\lvert g_i-\bar g\rvert}{H}$$

wherein $d_i$ represents the gray difference degree of the $i$-th pixel in the gray sequence, $\bar g$ represents the gray mean of the gray sequence, $g_i$ represents the gray value of the $i$-th pixel, and $H$ represents the gray entropy of the gray sequence.
In one possible embodiment, setting the segmentation threshold according to the maximum gray difference degree and the minimum gray difference degree includes:
carrying out a weighted average of the maximum gray difference degree $d_{\max}$ and the minimum gray difference degree $d_{\min}$, and taking the weighted-average result as the segmentation threshold, $T = w\,d_{\max} + (1-w)\,d_{\min}$, where $w$ is the preset weight corresponding to the maximum gray difference degree and $1-w$ the weight corresponding to the minimum gray difference degree.
In one possible embodiment, preprocessing an infrared image of a weld surface to obtain a weld area image includes:
and setting the pixel value of a pixel point with the temperature not greater than a preset temperature threshold in the infrared image to be 0, and obtaining a welding seam area image, wherein the preset temperature threshold is obtained by counting historical data of the temperature of an area outside the welding seam.
In a second aspect, an embodiment of the present application provides a dust remover weld defect detection system based on image processing, including a memory and a processor, wherein the processor executes a computer program stored in the memory to implement the above method for identifying weld data in the welding process of a dust remover.
The application provides a method and a system for identifying weld data in a welding process of a dust remover, and compared with the prior art, the embodiment of the application has the beneficial effects that: the defect area in the welding line is determined by global threshold segmentation without manually presetting a gray threshold in advance, so that the detection result of the defect in the welding line is more accurate, and the detection process is more universal.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, it being obvious that the drawings in the description below are only some embodiments of the application, and that other drawings can be obtained according to these drawings without inventive faculty for a person skilled in the art.
Fig. 1 is a schematic flow chart of a method for identifying weld data in a welding process of a dust remover according to an embodiment of the application.
Fig. 2 is a schematic diagram of adjacent pixels in an embodiment of the application.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first", "a second" may include one or more such features, either explicitly or implicitly; in the description of the present embodiment, unless otherwise specified, the meaning of "plurality" is two or more.
The embodiment of the application provides a method for identifying weld data in a welding process of a dust remover, which is shown in fig. 1 and comprises the following steps:
s1, obtaining an infrared image of the surface of the welding seam to be detected, preprocessing the infrared image to obtain a welding seam area image, and graying the welding seam area image to obtain a gray image.
And S2, taking the direction of the maximum principal component after PCA of the gray level image as the extending direction of the welding seam, and arranging the pixel values of the pixel points along the extending direction in the gray level image to obtain each gray level sequence.
And S3, respectively obtaining the gray scale difference degree of each pixel point in each gray scale sequence according to the gray scale entropy and the gray scale average value of the gray scale sequence.
And S4, setting a segmentation threshold according to the maximum gray level difference and the minimum gray level difference.
And S5, forming the pixel points with the gray values larger than the segmentation threshold value in the gray image into an abnormal region, and forming the pixel points with the gray values not larger than the segmentation threshold value in the gray image into a normal region.
And S6, carrying out edge detection on the normal region to obtain an edge region, screening out a closed region in the edge region, constructing an objective function according to the gray entropy of each closed region, the gray entropy and the area of each abnormal region, and calculating the value of the objective function.
And step S7, increasing the value of the segmentation threshold by one step, executing steps S5 to S6, iterating until the value of the objective function is converged, taking the segmentation threshold when the objective function is converged as a convergence threshold, and forming a defect area by using pixel points with gray values larger than the convergence threshold in the gray image.
The embodiment of the application aims at the following situations: and acquiring an infrared image of the welding line of the dust remover, and processing the infrared image to obtain the defect position in the welding line, so that the defect existing in the welding line can be repaired conveniently.
Further, in step S1, an infrared image of the surface of the weld to be detected is obtained, a weld area image is obtained by preprocessing, and a gray image is obtained by graying the weld area image. The method specifically comprises the following steps:
firstly, in order to realize defect area positioning, the embodiment of the application acquires an infrared image of the surface of a welding line. It should be noted that, in this embodiment, the infrared image of the surface of the weld may be acquired by an infrared camera.
Secondly, preprocessing an infrared image of the surface of the welding seam to obtain a welding seam area image, wherein the color value of the welding seam area in the infrared image of the welding seam is greatly different from the color value of a non-welding seam area, so that the embodiment of the application utilizes the color to divide the welding seam area image, and the method specifically comprises the following steps: and setting the pixel value of a pixel point with the temperature not greater than a preset temperature threshold in the infrared image to be 0, and obtaining a welding seam area image, wherein the preset temperature threshold is obtained by counting historical data of the temperature of an area outside the welding seam.
Finally, graying the obtained weld joint area image to obtain a gray image, which comprises the following steps: and taking the maximum value of pixel values of the pixel points in the RGB three channels in the weld joint area image as the gray value of the pixel points in the gray image. In this way, the influence of the color on the subsequent processing process can be avoided.
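As a rough illustration of this preprocessing (the function name, the array layout, and the toy threshold value are assumptions for the sketch, not fixed by the patent), the temperature masking of S1 and the max-of-RGB graying can be combined as:

```python
import numpy as np

def weld_gray_image(ir_rgb, temperature, temp_thresh):
    """S1 preprocessing sketch: zero out pixels whose temperature is not
    greater than the threshold (non-weld area), then gray the result by
    taking the per-pixel maximum of the R, G, B channels."""
    masked = ir_rgb.copy()
    masked[temperature <= temp_thresh] = 0    # background outside the weld
    return masked.max(axis=2)                 # max over the colour channels

# toy 2x2 image: two hot weld pixels, two cool background pixels
rgb = np.array([[[10, 200, 30], [5, 5, 5]],
                [[90, 40, 60], [0, 0, 0]]], dtype=np.uint8)
temp = np.array([[300.0, 20.0],
                 [280.0, 15.0]])
gray = weld_gray_image(rgb, temp, temp_thresh=100.0)
```

Masking before graying keeps the later gray sequences restricted to the weld area itself.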
Further, in step S2, the direction of the largest principal component after PCA is performed on the gray-scale image is taken as the extending direction of the weld, and the pixel values of the pixels along the extending direction in the gray-scale image are arranged, so as to obtain each gray-scale sequence. The method specifically comprises the following steps:
analyzing based on the weld gray level image, and calculating gray level difference degree of pixels in each region:
first, the direction of the maximum principal component after PCA of the gray-scale image is taken as the extending direction of the weld bead, and PCA (Principal components analysis, principal component analysis) is one of the important dimension reduction methods. The method has wide application in the fields of data compression redundancy elimination, data noise elimination and the like. Which uses orthogonal transformation to transform a series of potentially linearly related variables into a set of new linearly uncorrelated variables, also known as principal components, which are used to characterize the data in smaller dimensions.
In space, PCA is understood to mean that the original data is projected to a new coordinate system, the first principal component is a first coordinate axis, and its meaning represents a change interval of a new variable obtained by transforming a plurality of variables in the original data; the second component is a second coordinate axis, and represents a change interval of a second new variable obtained by a certain transformation of a plurality of variables in the original data. Thus we turn interpreting the differences in the samples using the raw data into interpreting the differences in the samples using the new variables.
It should be noted that, in order to preserve as much of the original data's interpretability as possible under this projection, a maximum-variance or minimum-loss criterion is generally used, so that the first principal component has the maximum variance. In this embodiment, the first principal component direction, i.e. the maximum principal component direction, is used as the extending direction of the weld.
Next, the pixel values of the pixels in the extending direction in the gray scale image are arranged to obtain each gray scale sequence. Straight lines can be drawn along the extending direction of the welding line through each pixel point in the gray level image, gray level values of the pixel points on each straight line are arranged, and each gray level sequence is obtained respectively.
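A minimal sketch of extracting the weld direction by PCA, hand-rolled with a NumPy eigendecomposition (the helper name and the assumption that weld pixels are exactly the nonzero pixels are illustrative; for a weld at an arbitrary angle the gray sequences would be read off along this direction, e.g. after rotating the image):

```python
import numpy as np

def weld_direction(gray):
    """First principal component (largest-variance direction) of the
    (row, col) coordinates of the weld pixels, here taken to be the
    nonzero pixels of the gray image."""
    ys, xs = np.nonzero(gray)
    pts = np.stack([ys, xs], axis=1).astype(float)
    pts -= pts.mean(axis=0)                # center the coordinates
    cov = pts.T @ pts / len(pts)           # 2x2 covariance matrix
    vals, vecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    return vecs[:, -1]                     # eigenvector of the largest one

# a horizontal weld stripe: the extension direction should be the column axis
g = np.zeros((5, 20))
g[2, :] = 100.0
d = weld_direction(g)
```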
Further, step S3, respectively obtaining the gray level difference of each pixel point in each gray level sequence according to the gray level entropy and the gray level average value of the gray level sequence. The method specifically comprises the following steps:
$$d_i=\frac{\lvert g_i-\bar g\rvert}{H}$$

wherein $d_i$ represents the gray difference degree of the $i$-th pixel in the gray sequence, $\bar g$ represents the gray mean of the gray sequence, $g_i$ represents the gray value of the $i$-th pixel, and $H$ represents the gray entropy of the gray sequence. In this way, the gray difference degree of each pixel in each gray sequence is obtained. Using the gray entropy of the sequence as the denominator reduces the influence of boundary areas where the gray level changes: although pixels in such edge areas show a large gray difference, they are not defective, so the division lowers their gray difference degree.
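Under one reading of this gray difference degree (absolute deviation from the sequence mean, divided by the sequence's gray entropy), it could be computed as follows; the base-2 histogram entropy and the guard against zero entropy are my assumptions:

```python
import numpy as np

def gray_entropy(seq):
    """Shannon entropy (base 2) of the gray-level histogram of one sequence."""
    _, counts = np.unique(seq, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def gray_difference_degree(seq):
    """|g_i - mean| / entropy for every pixel of the sequence; the entropy
    in the denominator damps sequences with busy gray-level changes."""
    h = gray_entropy(seq)
    return np.abs(seq - seq.mean()) / max(h, 1e-12)  # guard flat sequences

seq = np.array([0.0, 0.0, 255.0, 255.0])
d = gray_difference_degree(seq)   # entropy is 1 bit, mean is 127.5
```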
It should be noted that, when the gray value of the pixel point has a large gray difference with respect to the gray sequence in which the pixel point is located in the extending direction of the weld seam, the probability that the pixel belongs to a defect is large, and on the other hand, the probability that the same gray value belongs to a defect in different regions is also different.
Further, in step S4, a segmentation threshold is set according to the maximum gray level difference and the minimum gray level difference. The method specifically comprises the following steps:
A weighted average of the maximum gray difference degree $d_{\max}$ and the minimum gray difference degree $d_{\min}$ is carried out, and the weighted-average result is taken as the segmentation threshold, $T = w\,d_{\max} + (1-w)\,d_{\min}$; the weight corresponding to the maximum gray difference degree in the weighted average is $w$, i.e. the weight corresponding to the minimum gray difference degree is $1-w$.
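The threshold initialization can be sketched as below; since the concrete weight value is not legible in this copy of the patent, it is left as a parameter (the names `segmentation_threshold` and `w_max` are assumptions):

```python
def segmentation_threshold(diff_degrees, w_max):
    """Weighted average of the largest and smallest gray difference degree;
    w_max is the preset weight of the maximum, the minimum gets 1 - w_max.
    The concrete weight used by the patent is not legible in this copy."""
    d_max, d_min = max(diff_degrees), min(diff_degrees)
    return w_max * d_max + (1.0 - w_max) * d_min

t = segmentation_threshold([1.0, 3.0, 9.0], w_max=0.5)
```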
Further, in step S5, pixels with gray values greater than the segmentation threshold in the gray image form an abnormal region, and pixels with gray values not greater than the segmentation threshold form a normal region.
Further, step S6, edge detection is carried out on the normal area to obtain an edge area, a closed area in the edge area is screened out, an objective function is constructed according to the gray entropy of each closed area and the gray entropy and the area of each abnormal area, and the value of the objective function is calculated. The method specifically comprises the following steps:
firstly, edge detection is carried out on a normal area to obtain an edge area, and a closed area in the edge area is screened out.
It should be noted that edge detection is an essential step and research direction in image processing. Its main principle is to identify pixels in the digital image with obvious changes in color or brightness; such obvious changes often indicate a significant change in an image attribute, including discontinuities in depth, orientation, and brightness.
There are many edge detection operators in common use today: first-order operators include the Roberts, Prewitt, Sobel and Canny operators; second-order operators include the Laplacian operator. Edge detection of an image is realized from the gradient of the image, and obtaining the gradient amounts to convolving the image with the various operators.
As an example, in the embodiment of the present application, the Sobel operator is used to perform edge detection on the normal area in the gray image to obtain the edge area.
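A self-contained Sobel gradient-magnitude sketch; the patent does not fix border handling or the magnitude formula, so zero padding and the Euclidean magnitude are assumptions here (in practice the edge region would then be the pixels where this magnitude exceeds some threshold):

```python
import numpy as np

def sobel_magnitude(img):
    """Gradient magnitude with the 3x3 Sobel kernels; borders are
    zero-padded."""
    kx = np.array([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]])
    ky = kx.T
    h, w = img.shape
    pad = np.pad(img.astype(float), 1)
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(3):                     # accumulate the 3x3 correlation
        for j in range(3):
            patch = pad[i:i + h, j:j + w]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    return np.hypot(gx, gy)

img = np.zeros((4, 4))
img[:, 2:] = 10.0                # a vertical step edge between columns 1 and 2
mag = sobel_magnitude(img)
```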
Specifically, the process of screening out the closed regions in the edge regions includes: drawing straight lines through the geometric center of each edge region in different directions at equal angular intervals, with a preset interval angle as the spacing; judging the number of intersection points of each straight line with the edge region; counting the proportion of straight lines whose number of intersection points is more than 2; and, when this proportion is greater than a preset first threshold, taking the edge region as a closed region. The closed regions among all edge regions are screened out with this judging method.
As an example, in the embodiment of the present application, the preset interval angle is 1 °, and the preset first threshold is 0.9.
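The radial-line closedness test might look as follows. The ray-marching discretization is an assumption, and the intersection count is implemented as "at least two" (one hit per opposite ray), which is what a boundary that surrounds its centroid guarantees; the function name is illustrative:

```python
import numpy as np

def is_closed_region(edge_pixels, interval_deg=1.0, ratio_thresh=0.9, max_r=200):
    """Cast two opposite rays from the centroid of the edge pixels at equal
    angular intervals; a closed edge surrounds its centroid, so (almost)
    every line through the centroid meets the edge on both sides."""
    edge_pixels = np.asarray(edge_pixels)
    pts = {(int(y), int(x)) for y, x in edge_pixels}
    cy, cx = edge_pixels.mean(axis=0)
    angles = np.arange(0.0, 180.0, interval_deg)
    hits = 0
    for a in angles:
        th = np.radians(a)
        n = 0
        for sign in (1.0, -1.0):           # the two opposite rays of one line
            for r in range(1, max_r):      # march outward until a hit
                p = (int(round(cy + sign * r * np.sin(th))),
                     int(round(cx + sign * r * np.cos(th))))
                if p in pts:
                    n += 1
                    break
        if n >= 2:
            hits += 1
    return hits / len(angles) > ratio_thresh

# a closed square ring versus an open straight edge
ring = [(y, x) for y in range(11) for x in range(11)
        if y in (0, 10) or x in (0, 10)]
line = [(0, x) for x in range(11)]
```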
And secondly, constructing an objective function according to the gray entropy of each closed region, the gray entropy of each abnormal region and the area, and calculating the value of the objective function. The method specifically comprises the following steps:
the objective function is:
wherein,for the purpose of +.>Represents the gray mean value of the j-th occlusion region,>representing the maximum gray value in the gray image, for example>Representing the gray scale difference between the j-th closed region and the adjacent pixel point thereof,/for the pixel region>Gray entropy representing the j-th closed region, < ->Indicate->Gray mean value of individual anomaly areas, +.>Indicate->Gray scale difference between the abnormal region and the adjacent pixel point thereof,>indicate->Gray entropy of the individual anomaly areas, +.>Indicating the extent of influence of the area of the jth occlusion region,/->Indicate->Degree of influence of the area of the individual abnormality regions, +.>For the number of closed areas>Is the number of abnormal regions.
The calculation of the gray scale difference between the occlusion region or the abnormal region and the adjacent pixel point thereof comprises the following steps: fig. 2 is a schematic diagram of adjacent pixel points in the embodiment of the present application, as shown in fig. 2, 5 pixels adjacent to each edge pixel point of the closed area or the abnormal area are taken as adjacent pixel points, and the gray average value of the adjacent pixel points is subtracted from the gray average value of the pixel points in the closed area or the abnormal area, so as to obtain the gray difference between the closed area or the abnormal area and the adjacent pixel points.
It should be noted that, when the area of the closed area or the abnormal area is greater than the preset area threshold, the influence degree is a preset first value, otherwise, the influence degree is a preset second value, wherein the preset first value is greater than the preset second value.
As an example, in the embodiment of the present application, the preset area threshold is 4.
As an example, in the embodiment of the present application, the preset first value is 4, and the preset second value is 0.5.
According to the characteristics of weld defects, the gray value of a defective region is small, the gray difference inside the defective region is small, and the contrast between the defective region and its adjacent area is large; in addition, to avoid noise interference, a defective region should be larger than a certain area. These three characteristics are therefore combined to represent the defect conformity degree of each region, which is small for non-defective regions; $n$ denotes the number of abnormal regions.
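Putting the three cues together, a per-region conformity term consistent with this description could look as follows. The exact combination in the patent's formula image is not recoverable from this copy, so this is a reconstruction with illustrative names:

```python
def area_influence(area, area_thresh=4, first=4.0, second=0.5):
    """Area gate with the example values from the description: the larger
    preset value when the region's area exceeds the area threshold."""
    return first if area > area_thresh else second

def region_conformity(mean_gray, g_max, delta, entropy, influence):
    """One reconstruction of the per-region term of the objective function:
    dark interior (g_max - mean), strong contrast with the neighbours
    (delta), low internal variation (1 / entropy), gated by the area
    influence."""
    return (g_max - mean_gray) * delta * influence / max(entropy, 1e-12)

c = region_conformity(100.0, 255.0, 10.0, 2.0, area_influence(5))
```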
Further, step S7, increasing the value of the segmentation threshold by one step, executing steps S5 to S6, iterating until the value of the objective function converges, taking the segmentation threshold when the objective function converges as a convergence threshold, and forming a defect region by using pixel points with gray values larger than the convergence threshold in the gray image. The method specifically comprises the following steps:
as an example, the step size is 10 in the embodiment of the present application.
Dividing the gray image by using the obtained convergence threshold, namely forming a defect area by pixel points with gray values larger than the convergence threshold in the gray image. In this way, a defective region in the weld is obtained.
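The iteration of steps S5 to S7 can be sketched as follows; `objective(gray, t)` stands in for the patent's objective built from the closed and abnormal regions at threshold `t`, and the function names and the convergence tolerance are assumptions:

```python
import numpy as np

def find_convergence_threshold(gray, objective, start_threshold,
                               step=10, tol=1e-6, max_iter=100):
    # Raise the segmentation threshold by one step at a time (the
    # embodiment uses step 10) and stop once the objective value no
    # longer changes between iterations.
    t = start_threshold
    prev = objective(gray, t)
    for _ in range(max_iter):
        t += step
        cur = objective(gray, t)
        if abs(cur - prev) <= tol:
            break
        prev = cur
    return t

def defect_mask(gray, convergence_threshold):
    # Pixels brighter than the convergence threshold form the defect region.
    return np.asarray(gray) > convergence_threshold
```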
Based on the same inventive concept as the method, this embodiment also provides an image-processing-based dust remover weld defect detection system, comprising a memory and a processor, wherein the processor executes a computer program stored in the memory to detect defects in the dust remover weld as described in the embodiment of the method for identifying weld data in the welding process of a dust remover.
Because the method for detecting defects in the weld of a dust remover has already been described in the method embodiment above, it is not repeated here.
In summary, the method and system for identifying weld data in the welding process of a dust remover provided by the embodiments of the present application determine the defect region in the weld without a manually preset gray threshold for global threshold segmentation, so the detection result for defects in the weld is more accurate and the detection process is more universal.
In this disclosure, terms such as "comprising," "including," and "having" are open-ended terms that mean "including, but not limited to," and are used interchangeably therewith. The term "or" as used herein refers to, and is used interchangeably with, the term "and/or," unless the context clearly indicates otherwise. The term "such as" as used herein refers to, and is used interchangeably with, the phrase "such as, but not limited to."
It should also be noted that in the methods and systems of the present application, components or steps may be decomposed and/or recombined; such decomposition and/or recombination should be regarded as equivalent solutions of the present disclosure.
The above embodiments are given for clarity of illustration only and are not to be construed as limiting the scope of the application. Other variations or modifications will be apparent to persons skilled in the art from the foregoing description, and an exhaustive enumeration of all embodiments is neither necessary nor possible. All solutions that are the same as or similar to the present application fall within its protection scope.

Claims (5)

1. A method for identifying weld data in a welding process of a dust remover, characterized by comprising the following steps:
s1: obtaining an infrared image of the surface of a welding seam to be detected, preprocessing the infrared image to obtain a welding seam area image, and graying the welding seam area image to obtain a gray image;
s2: taking the direction of the maximum principal component after PCA of the gray level image as the extending direction of a welding line, and arranging the pixel values of the pixel points along the extending direction in the gray level image to respectively obtain each gray level sequence;
s3: respectively obtaining the gray level difference degree of each pixel point in each gray level sequence according to the gray level entropy and the gray level average value of the gray level sequence;
s4: setting a segmentation threshold according to the maximum gray level difference and the minimum gray level difference;
s5: forming an abnormal region by using pixels with gray values larger than a segmentation threshold in the gray image, and forming a normal region by using pixels with gray values not larger than the segmentation threshold in the gray image;
s6: performing edge detection on the normal region to obtain an edge region, screening out closed regions in the edge region, constructing an objective function according to the gray entropy of each closed region, the gray entropy and the area of each abnormal region, and calculating the value of the objective function;
s7: increasing the value of the segmentation threshold by one step length, executing S5 to S6, iterating until the value of the objective function is converged, taking the segmentation threshold when the objective function is converged as a convergence threshold, and forming a defect area by using pixel points with gray values larger than the convergence threshold in the gray image;
constructing the objective function according to the gray entropy of each closed region and the gray entropy and area of each abnormal region, wherein the quantities entering the objective function are: the gray mean value of the j-th closed region; the maximum gray value in the gray image; the gray difference between the j-th closed region and its adjacent pixel points; the gray entropy of the j-th closed region; the gray mean value of the i-th abnormal region; the gray difference between the i-th abnormal region and its adjacent pixel points; the gray entropy of the i-th abnormal region; the influence degree of the area of the j-th closed region; the influence degree of the area of the i-th abnormal region; the number of closed regions; and the number of abnormal regions;
the process for obtaining the influence degree of the area of the closed area or the abnormal area includes:
when the area of the closed area or the abnormal area is larger than a preset area threshold, the influence degree is a preset first value, otherwise, the influence degree is a preset second value, wherein the preset first value is larger than the preset second value;
the calculation process of the gray scale difference between the abnormal region and the adjacent pixel point comprises the following steps:
taking 5 pixels adjacent to each edge pixel of the abnormal region as adjacent pixels, and subtracting the gray average value of the adjacent pixels from the gray average value of the pixels in the abnormal region to obtain the gray difference between the abnormal region and the adjacent pixels;
the calculation process of gray scale difference between the closed region and the adjacent pixel point comprises the following steps:
taking 5 pixels adjacent to each edge pixel point of the closed region as adjacent pixel points, and subtracting the gray average value of the adjacent pixel points from the gray average value of the pixel points in the closed region to obtain the gray difference between the closed region and the adjacent pixel points;
screening out a closed region in the edge region, including:
drawing, at equal angles with a preset interval angle as the spacing, straight lines in different directions through the geometric center of the edge region; determining the number of intersection points of each straight line with the edge region; counting the proportion of straight lines whose number of intersection points is greater than 2; and taking the edge region as a closed region when the proportion is greater than a preset first threshold;
and screening out the closed areas in all the edge areas by using a closed area judging method.
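The closed-region test above can be sketched on a raster as follows. The angle step, ratio threshold, line-sampling scheme, and the use of "at least 2" crossings (a simple closed contour is crossed exactly twice by a line through its centroid) are all illustrative assumptions:

```python
import numpy as np

def is_closed_region(edge_mask, angle_step_deg=10, ratio_threshold=0.8):
    # Cast straight lines through the centroid of the edge pixels at
    # equal angular spacing, count how many lines cross the edge set at
    # two or more points, and compare that proportion to a threshold.
    ys, xs = np.nonzero(edge_mask)
    cy, cx = ys.mean(), xs.mean()
    h, w = edge_mask.shape
    n_lines = n_multi = 0
    for deg in range(0, 180, angle_step_deg):
        theta = np.deg2rad(deg)
        dy, dx = np.sin(theta), np.cos(theta)
        # sample the line densely across the whole image
        ts = np.arange(-max(h, w), max(h, w), 0.5)
        py = np.round(cy + ts * dy).astype(int)
        px = np.round(cx + ts * dx).astype(int)
        ok = (py >= 0) & (py < h) & (px >= 0) & (px < w)
        hits = edge_mask[py[ok], px[ok]]
        # count runs of consecutive edge pixels as single intersections
        crossings = np.count_nonzero(np.diff(hits.astype(int)) == 1) + int(hits[0])
        n_lines += 1
        if crossings >= 2:
            n_multi += 1
    return n_multi / n_lines > ratio_threshold
```

For example, a circular edge contour passes the test (every line through the centroid crosses it twice), while an open straight segment does not.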
2. The method for identifying weld data in a welding process of a dust remover according to claim 1, wherein obtaining the gray difference degree of each pixel point in each gray sequence according to the gray entropy and the gray mean value of the gray sequence comprises:
wherein the quantities entering the calculation are: the gray difference degree of the i-th pixel point in the gray sequence; the gray mean value of the gray sequence; the gray value of the i-th pixel point in the gray sequence; and the gray entropy of the gray sequence.
3. The method for identifying weld data in a welding process of a dust remover according to claim 1, wherein setting the segmentation threshold according to the maximum gray level difference and the minimum gray level difference comprises:
carrying out a weighted average of the maximum gray level difference and the minimum gray level difference, and taking the weighted average result as the segmentation threshold, wherein the weight corresponding to the maximum gray level difference in the weighted average is a preset weight.
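A minimal sketch of this weighted average; the specific weight value is not given in this text, so `weight` below is a placeholder:

```python
def initial_segmentation_threshold(max_difference, min_difference, weight=0.5):
    # Weighted average of the maximum and minimum gray level differences;
    # `weight` applies to the maximum, (1 - weight) to the minimum.
    return weight * max_difference + (1 - weight) * min_difference
```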
4. The method for recognizing weld data in a welding process of a dust remover according to claim 1, wherein preprocessing an infrared image of a weld surface to obtain a weld region image comprises:
and setting the pixel value of a pixel point with the temperature not greater than a preset temperature threshold in the infrared image to be 0, and obtaining a welding seam area image, wherein the preset temperature threshold is obtained by counting historical data of the temperature of an area outside the welding seam.
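The preprocessing step can be sketched as follows (the function name is illustrative; the temperature threshold is assumed to come from statistics of historical non-weld temperatures, as the claim states):

```python
import numpy as np

def weld_region_image(infrared, temperature, temperature_threshold):
    # Zero out pixels whose temperature does not exceed the threshold,
    # keeping only the hotter weld region of the infrared image.
    out = np.asarray(infrared, dtype=float).copy()
    out[np.asarray(temperature) <= temperature_threshold] = 0
    return out
```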
5. An identification system for weld data in a dust remover welding process, comprising: a memory and a processor, wherein the processor executes a computer program stored in the memory to implement the method for identifying weld data in a dust extractor welding process according to any one of claims 1-4.
CN202211002484.4A 2022-08-22 2022-08-22 Method and system for identifying weld data in welding process of dust remover Active CN115082464B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211002484.4A CN115082464B (en) 2022-08-22 2022-08-22 Method and system for identifying weld data in welding process of dust remover

Publications (2)

Publication Number Publication Date
CN115082464A CN115082464A (en) 2022-09-20
CN115082464B true CN115082464B (en) 2023-12-12

Family

ID=83243952

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211002484.4A Active CN115082464B (en) 2022-08-22 2022-08-22 Method and system for identifying weld data in welding process of dust remover

Country Status (1)

Country Link
CN (1) CN115082464B (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115249301B (en) * 2022-09-22 2022-12-20 精技精密部件(南通)有限公司 Method for extracting grinding wrinkles on surface of workpiece
CN115272316B (en) * 2022-09-27 2023-01-03 山东华太新能源电池有限公司 Intelligent detection method for welding quality of battery cover based on computer vision
CN115345894B (en) * 2022-10-17 2022-12-27 南通通力油泵有限公司 Welding seam ray detection image segmentation method
CN115359047B (en) * 2022-10-19 2024-07-19 元能微电子科技南通有限公司 Abnormal defect detection method for intelligent welding of PCB
CN115375588B (en) * 2022-10-25 2023-02-07 山东旗胜电气股份有限公司 Power grid transformer fault identification method based on infrared imaging
CN115457031A (en) * 2022-10-27 2022-12-09 江苏集宿智能装备有限公司 Method for identifying internal defects of integrated box based on X-ray
CN116038112A (en) * 2022-12-06 2023-05-02 西南石油大学 Laser tracking large-scale curved plate fillet welding system and method
CN115880302B (en) * 2023-03-08 2023-05-23 杭州智源电子有限公司 Method for detecting welding quality of instrument panel based on image analysis
CN116128877B (en) * 2023-04-12 2023-06-30 山东鸿安食品科技有限公司 Intelligent exhaust steam recovery monitoring system based on temperature detection
CN116188498A (en) * 2023-04-28 2023-05-30 江西科技学院 Axle welding area detection method and system based on computer vision
CN116385476B (en) * 2023-06-05 2023-08-18 青岛星跃铁塔有限公司 Iron tower quality analysis method based on visual detection
CN116385439B (en) * 2023-06-05 2023-08-15 山东兰通机电有限公司 Motor rubber shock pad quality detection method based on image processing
CN116664569B (en) * 2023-07-31 2023-10-10 山东正华建筑科技有限公司 Weld flash defect detection method
CN116805317B (en) * 2023-08-28 2023-11-14 苏州科尔珀恩机械科技有限公司 Rotary furnace inner wall defect detection method based on artificial intelligence
CN117408958B (en) * 2023-10-16 2024-03-26 日照鼎立钢构股份有限公司 Method and system for monitoring production quality of steel structural member
CN117455870B (en) * 2023-10-30 2024-04-16 太康精密(中山)有限公司 Connecting wire and connector quality visual detection method
CN117197588B (en) * 2023-11-02 2024-03-05 南通宝田包装科技有限公司 Plastic package control early warning method based on temperature identification
CN117351013B (en) * 2023-12-05 2024-02-09 天津风霖物联网科技有限公司 Intelligent detection system and method for building damage

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103942777A (en) * 2014-03-13 2014-07-23 华南理工大学 Mobile phone glass cover plate defect detecting method based on principal component analysis
CN114862862B (en) * 2022-07-11 2022-09-16 江苏大田阀门制造有限公司 Pump body cold shut defect identification method and system based on image processing

Similar Documents

Publication Publication Date Title
CN115082464B (en) Method and system for identifying weld data in welding process of dust remover
WO2022042579A1 (en) Lcd screen defect detection method and apparatus
US10803573B2 (en) Method for automated detection of defects in cast wheel products
CN111047568B (en) Method and system for detecting and identifying steam leakage defect
CN105067638B (en) Tire fetal membrane face character defect inspection method based on machine vision
KR20020077420A (en) Method for automatically detecting casting defects in a test piece
CN114943739B (en) Aluminum pipe quality detection method
CN111667470B (en) Industrial pipeline flaw detection inner wall detection method based on digital image
CN111062940B (en) Screw positioning and identifying method based on machine vision
CN113340909B (en) Glue line defect detection method based on machine vision
CN113689415A (en) Steel pipe wall thickness online detection method based on machine vision
CN115471486A (en) Switch interface integrity detection method
CN116128873A (en) Bearing retainer detection method, device and medium based on image recognition
CN116883446B (en) Real-time monitoring system for grinding degree of vehicle-mounted camera lens
CN116258838B (en) Intelligent visual guiding method for duct piece mold clamping system
CN116385433B (en) Plastic pipeline welding quality assessment method
CN110322508B (en) Auxiliary positioning method based on computer vision
CN116596987A (en) Workpiece three-dimensional size high-precision measurement method based on binocular vision
CN114529543B (en) Installation detection method and device for peripheral screw gasket of aero-engine
CN116823708A (en) PC component side mold identification and positioning research based on machine vision
CN113592953B (en) Binocular non-cooperative target pose measurement method based on feature point set
CN115880481A (en) Curve positioning algorithm and system based on edge profile
CN115049641A (en) Electric data processing method and system for anomaly detection of mechanical parts
CN114638847A (en) Insulator hardware trimming method and system based on image processing
CN117474916B (en) Image detection method, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20231107

Address after: Room 1804, Building 1, No. 96 Lixin 12th Road, Xintang Town, Zengcheng District, Guangzhou City, Guangdong Province, 511300

Applicant after: Guangzhou Yutong Environmental Protection Technology Co.,Ltd.

Address before: No. 142, tongqi Road, Haimen Industrial Park, Haimen District, Nantong City, Jiangsu Province, 226100

Applicant before: Nantong feilida Hydraulic Technology Co.,Ltd.

GR01 Patent grant