CN116612112B - Visual inspection method for surface defects of bucket - Google Patents

Visual inspection method for surface defects of bucket

Info

Publication number
CN116612112B
Authority
CN
China
Prior art keywords
pixel
degree
edge
image
detected
Prior art date
Legal status
Active
Application number
CN202310868040.7A
Other languages
Chinese (zh)
Other versions
CN116612112A (en)
Inventor
岳友清
董春华
刘通
Current Assignee
Xinmai Shandong Industry And Trade Co ltd
Original Assignee
Xinmai Shandong Industry And Trade Co ltd
Priority date
Filing date
Publication date
Application filed by Xinmai Shandong Industry And Trade Co ltd filed Critical Xinmai Shandong Industry And Trade Co ltd
Priority to CN202310868040.7A priority Critical patent/CN116612112B/en
Publication of CN116612112A publication Critical patent/CN116612112A/en
Application granted granted Critical
Publication of CN116612112B publication Critical patent/CN116612112B/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/40Analysis of texture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20172Image enhancement details
    • G06T2207/20192Edge enhancement; Edge preservation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)

Abstract

The embodiment of the application discloses a visual inspection method for surface defects of a water bucket, and relates to the technical field of image processing. The method comprises the following steps: collecting a bucket image and carrying out graying processing on the bucket image to obtain a grayed image; analyzing the gray-level dispersion degree around each pixel to be detected in the grayed image and quantifying the probability that the pixel belongs to an edge, so as to obtain the suitability of the initial template size for the pixel to be detected; quantifying the size and coefficient adjustment of the Gaussian filter according to the suitability of the initial template size, thereby completing adaptive Gaussian filtering of the image; and performing Canny edge detection on the bucket image to complete edge extraction. The embodiment of the application can effectively smooth the image, suppress noise, and protect the edges of tiny targets.

Description

Visual inspection method for surface defects of bucket
Technical Field
The application relates to the technical field of image processing, in particular to a visual inspection method for surface defects of a water bucket.
Background
In the industrial production of water-dispenser buckets, production parameters must be strictly controlled to prevent defective products from reaching the market and causing losses. During production, the influence of equipment and operator handling means that bucket quality cannot always be guaranteed, giving rise to defects such as cracks and bends on the bucket surface.
Bucket defects such as cracks and holes appear in the image as edge features, so edge detection with the Canny algorithm followed by comparison with the edges of a standard bucket can be used for defect detection. However, the Canny algorithm is sensitive to noise and requires Gaussian filtering for smoothing. When the image is smoothed with a Gaussian filter, the window size and the Gaussian filter coefficients must be set manually and remain fixed, which makes it difficult to suppress noise and protect the edges of fine targets at the same time.
Disclosure of Invention
The embodiment of the application provides a visual inspection method for surface defects of a water bucket, which can effectively realize smooth processing and noise suppression of images and protect fine target edges.
The embodiment of the application provides a visual inspection method for surface defects of a water bucket, which comprises the following steps:
collecting a bucket image, and carrying out graying treatment on the bucket image to obtain a graying image;
analyzing the gray level discrete degree around the pixel to be detected of the gray level image, and quantifying the probability of the membership edge of the pixel to obtain the size fit degree of the initial template of the pixel to be detected;
according to the size fitting degree of the initial template, the size and coefficient adjustment degree of the Gaussian filter are quantized, and self-adaptive Gaussian filtering of the image is completed;
and carrying out Canny edge detection on the bucket image to finish edge extraction.
In some embodiments of the present application, the step of collecting a bucket image and performing graying processing on the bucket image to obtain a grayed image includes:
and acquiring a corresponding bucket image through a camera arranged on a bucket production line of the water dispenser, and carrying out graying treatment on the bucket image according to texture features existing on the bucket to obtain a graying image.
In some embodiments of the present application, the step of analyzing the gray level discrete degree around the pixel to be measured of the gray level image, quantifying the probability of the pixel membership edge, and obtaining the suitability of the size of the initial template of the pixel to be measured includes:
constructing an initial window by taking a pixel to be detected as a center, and analyzing gray distribution in an initial template to obtain gray discrete degree around the pixel to be detected;
and judging the degree of the membership edge of the pixel according to the degree of the gray level dispersion around the pixel to be detected, and obtaining the suitability of the template size.
In some embodiments of the present application, the step of constructing an initial window with the pixel to be measured as a center, analyzing the gray distribution in the initial template, and obtaining the gray discrete degree around the pixel to be measured includes:
set up with the pixel to be measured as the centerA size initial gaussian filter template;
analyzing the gray image of the bucket collected on the production line, and comparing the gray valueDefining the pixels of the pixel to be detected as illumination pixels, calculating the illumination pixel duty ratio in the initial window corresponding to each pixel to be detected, and eliminating the influence of the illumination pixels according to the illumination pixel duty ratio;
the gray level distribution degree in an initial window divided by the pixels to be detected is analyzed, the gray level variance of the pixels in the initial window is calculated, and the gray level discrete degree of the window corresponding to each pixel is obtained through analysis by combining the similar pixel aggregation degree in the initial window.
In some embodiments of the present application, the calculation formula for the illumination pixel proportion in the initial window corresponding to each pixel to be detected is:

P = n / N

wherein P is the illumination pixel proportion in the initial window corresponding to each pixel to be detected, n is the number of illumination pixels in the initial window, and N is the total number of pixels contained in the initial window.
In some embodiments of the present application, the gray-level dispersion degree of each initial window is calculated from the pixel gray variance in the initial window together with the number of similar-gray regions in that window: pixels whose gray difference is smaller than 10 are regarded as similar pixels, the similar pixels in each window are merged to obtain corresponding regions, and the number of regions is recorded.
In some embodiments of the present application, the step of determining the degree of the pixel membership edge according to the degree of gray level dispersion around the pixel to be detected to obtain the suitability of the template size includes:
judging whether an edge exists in the initial window according to the gray level discrete degree;
if an edge exists, calculating the gradient values that may appear at the edge in the initial window to obtain a gradient value set A;
Calculating the gradient of the to-be-measured point;
comparing the calculated gradient of the point to be measured with the gradient value set A, and judging the membership edge degree of the point to be measured;
and obtaining the suitability of the size of the initial template of the pixel to be detected according to the membership edge degree of the point to be detected.
In some embodiments of the present application, the determining whether an edge exists in the initial window according to the gray level discrete degree includes indicating that the edge exists in the initial window when the gray level discrete degree exists; when the gray level discrete degree does not exist, the fact that no edge exists in the initial window is indicated.
In some embodiments of the present application, the step of comparing the calculated gradient of the point to be measured with the gradient value set a to determine the degree of membership edge of the point to be measured includes:
if the gradient of the point to be detected falls within the gradient value set A, the point to be detected belongs to an edge; if the point to be detected is not at an edge position, quantifying its membership edge degree by combining the distance between the point and the edge pixels corresponding to the set A, thereby obtaining the membership edge degree of the point to be detected.
In some embodiments of the present application, the step of obtaining the suitability of the initial template size of the pixel to be detected according to the membership edge degree of the point to be detected includes:
if the membership edge degree of the to-be-measured point is smaller than the first preset value or larger than the second preset value, the pixel to be measured is not suitable for the initial template size;
if the membership edge degree of the to-be-measured point is larger than or equal to the first preset value and smaller than or equal to the second preset value, the pixel to be measured is indicated to be appropriate in initial template size.
Therefore, the embodiment of the application mainly judges the interval to which the pixel to be detected belongs by analyzing the gray distribution around it, so that the filter is adjusted according to the different pixel memberships. The filter size is reduced when the pixel lies on an edge, and, combined with the standard deviation calculated for the initial window, small edges can be protected when filtering at edges; otherwise, the filter size is increased and noise is suppressed. In this way the algorithm takes both noise suppression and fine-edge protection into account when smoothing the image.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of a visual inspection method for surface defects of a water bucket according to an embodiment of the present application;
fig. 2 is a schematic diagram of steps of a visual inspection method for surface defects of a water tub according to an embodiment of the present application;
fig. 3 is a schematic diagram of a graying image of a bucket according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to fall within the scope of the application.
In the industrial production of water-dispenser buckets, production parameters must be strictly controlled to prevent defective products from reaching the market and causing losses. During production, the influence of equipment and operator handling means that bucket quality cannot always be guaranteed, giving rise to defects such as cracks and bends on the bucket surface.
Bucket defects such as cracks and holes appear in the image as edge features, so edge detection with the Canny algorithm followed by comparison with the edges of a standard bucket can be used for defect detection. However, the Canny algorithm is sensitive to noise and requires Gaussian filtering for smoothing. When the image is smoothed with a Gaussian filter, the window size and the Gaussian filter coefficients must be set manually and remain fixed, which makes it difficult to suppress noise and protect the edges of fine targets at the same time.
The application first establishes an initial window of a given size centered on the pixel to be detected, analyzes the gray-level dispersion degree within this template, quantifies the probability that the pixel to be detected belongs to an edge from the gray distribution of the window, obtains the suitability of the initial window template size, and adjusts the size and Gaussian coefficients of each filter accordingly, so that noise suppression and fine-edge protection can both be taken into account when the bucket image is smoothed.
Referring to fig. 1, fig. 1 is a flow chart of a visual inspection method for surface defects of a water bucket according to an embodiment of the present application.
As shown in fig. 1 and fig. 2, an embodiment of the present application provides a visual inspection method for surface defects of a water tub, the method including:
s1, acquiring a bucket image, and carrying out graying treatment on the bucket image to obtain a graying image;
s2, analyzing gray level discrete degree around a pixel to be detected of the gray level image, and quantifying pixel membership edge probability to obtain the size fit degree of an initial template of the pixel to be detected;
s3, quantifying the size and coefficient adjustment degree of the Gaussian filter according to the fit degree of the initial template size, and completing the self-adaptive Gaussian filtering of the image;
s4, carrying out Canny edge detection on the bucket image to finish edge extraction.
The overall logic of the application is as follows:
corresponding bucket images are collected through a production line setting camera;
constructing a window centered on the pixel to be detected, analyzing the gray-level dispersion degree in the window to obtain the probability that the pixel to be detected belongs to an edge, and obtaining the suitability of the template size;
adjusting the initial template size and the corresponding Gaussian filter coefficient according to the template size fit degree to finish the self-adaptive Gaussian filtering of the image;
and carrying out subsequent Canny edge detection to finish edge extraction.
In detail, the filter is adjusted according to the pixel membership difference by analyzing the gray distribution around the pixel to be measured and judging the pixel membership interval to be measured. The size of the filter is reduced when the filter is positioned at the edge, and the small edge can be protected when the filter treatment is performed at the edge by combining the calculated standard deviation size of the initial window; otherwise, the size of the filter is increased, and noise is restrained; when the algorithm smoothes the image, noise suppression and tiny edge protection can be considered.
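For orientation, the following is a minimal Python/OpenCV sketch of the conventional fixed-parameter pipeline that the application improves upon; the file path, the 5×5 kernel and the Canny thresholds are illustrative assumptions rather than values from the disclosure. The adaptive method described below replaces the fixed GaussianBlur step with per-pixel filtering.

```python
import cv2

def baseline_bucket_edges(image_path: str):
    """Conventional fixed-parameter pipeline: grayscale -> fixed Gaussian -> Canny."""
    bgr = cv2.imread(image_path)                    # S1: collect the bucket image
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)    # S1: graying processing
    smoothed = cv2.GaussianBlur(gray, (5, 5), 0)    # fixed 5x5 window, sigma chosen by OpenCV
    edges = cv2.Canny(smoothed, 50, 150)            # S4: Canny with example thresholds
    return edges
```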
Referring to fig. 2 and 3, for step S1, the steps of collecting a bucket image, and performing graying processing on the bucket image to obtain a grayed image include:
and acquiring a corresponding bucket image through a camera arranged on a bucket production line of the water dispenser, and carrying out graying treatment on the bucket image according to texture features existing on the bucket to obtain a graying image.
For step S2, the step of analyzing the gray level discrete degree around the pixel to be detected of the gray level image, quantifying the probability of the pixel membership edge, and obtaining the suitability of the initial template size of the pixel to be detected includes:
s21, constructing an initial window by taking a pixel to be detected as a center, and analyzing gray distribution in an initial template to obtain gray discrete degree around the pixel to be detected;
s22, judging the degree of the membership edge of the pixel according to the degree of the gray level dispersion around the pixel to be detected, and obtaining the suitability of the template size.
The main logic of the steps is as follows: when the image is subjected to Gaussian filtering, gray level distribution of the bucket image is different in different areas. A fixed size filter is difficult to process effectively on the whole image. The larger the size of the filter is selected in the region with uniform gray level distribution or gradual change, the noise can be effectively removed, and the smaller the size of the filter is selected in the region with severe gray level change, so that the tiny edge can be protected. According to the gray level change characteristics of the pixel membership area to be detected, the size of the filter with a proper scale is selected, so that the Gaussian filter processing effect is optimal.
Specifically, for step S21, the step of constructing an initial window with the pixel to be measured as a center, analyzing the gray distribution in the initial template, and obtaining the gray dispersion degree around the pixel to be measured includes:
set up with the pixel to be measured as the centerA size initial gaussian filter template;
analyzing the gray image of the bucket collected on the production line, and comparing the gray valueDefining the pixels of the pixel to be detected as illumination pixels, calculating the illumination pixel duty ratio in the initial window corresponding to each pixel to be detected, and eliminating the influence of the illumination pixels according to the illumination pixel duty ratio;
the gray level distribution degree in an initial window divided by the pixels to be detected is analyzed, the gray level variance of the pixels in the initial window is calculated, and the gray level discrete degree of the window corresponding to each pixel is obtained through analysis by combining the similar pixel aggregation degree in the initial window.
For the step S21 after refinement, the specific logic is that the size of the gaussian filter is affected by the standard deviation of the gray values of the pixels in the filter template, and the gray distribution around the pixel to be measured is analyzed first and then the subsequent analysis is performed.
Further, the calculation formula for the illumination pixel proportion in the initial window corresponding to each pixel to be detected is:

P = n / N

wherein P is the illumination pixel proportion in the initial window corresponding to each pixel to be detected, n is the number of illumination pixels in the initial window, and N is the total number of pixels contained in the initial window.
The gray-level dispersion degree of each initial window is calculated from the pixel gray variance in the initial window together with the number of similar-gray regions in that window: pixels whose gray difference is smaller than 10 are regarded as similar pixels, the similar pixels in each window are merged to obtain corresponding regions, and the number of regions is recorded.
The following is a brief example:
general Gaussian filter size selection、/>、/>、/>And so on, here we choose an initial size +.>Window, i.e. set up +.about the point to be measured as the center>The initial filter template is sized.
The size filter template is selected for the following reasons: because the gray scale distribution situation around the pixel to be detected is analyzed, the selected window size is too small to accurately obtain the gray scale distribution situation of the pixel around the pixel to be detected; if the size of the filter template is too large, the contribution of data far away from the pixel to be measured to the situation reflecting the surrounding gray distribution is not high, so that a proper filter template with an initial size is selected, and the gray distribution in the window is analyzed.
The illumination can interfere the gray information of the image, and the gray value is larger and evenly distributed in the gray chart. When gray scale distribution in a window is analyzed, the illumination influence is large, so that the calculated variance of gray scale data in the window can deviate, and the judgment of the gray scale discrete degree around the pixel to be detected is influenced.
Analyzing the grayed bucket image collected on the production line, the gray value of the bucket surface is generally small, so pixels whose gray value exceeds a preset threshold are defined as illumination pixels. For the formula:
P = n / N, for each pixel to be detected P is the illumination pixel proportion in its initial window, the numerator n is the number of illumination pixels in the initial window, and the denominator N is the total number of pixels contained in the initial window.
Formula logic: the method can obtain whether the illumination influence exists around each pixel to be detected or not through the method, and eliminates the influence of the illumination pixels when analyzing the gray level discrete degree around the pixels.
The gray-level dispersion degree in the initial window centered on the pixel to be detected is analyzed: the gray variance of the pixels in the initial window is generally calculated as a preliminary judgment, and then, combined with the aggregation degree of similar pixels in the window, the gray-level dispersion degree of the window corresponding to each pixel is analyzed comprehensively.
Pixels whose gray difference is smaller than 10 are regarded as similar pixels; the similar pixels in each window are merged to obtain corresponding regions, and the number of regions is recorded.
The dispersion degree of each initial window is then obtained from the gray variance together with the number of these regions.
Formula logic: describing the gray-level dispersion in a window by the variance alone is too limited; the aggregation degree of similar pixels should also be analyzed and judged together with the variance. The variance calculated when an "edge" feature exists in the window is very large, and the variance calculated for a "noise" distribution is also very large, but the gray-distribution characteristics of the two cases differ, and so does the appropriate filter size. The aggregation degree of similar pixels is therefore calculated: when similar-gray regions can be merged in the window, there are areas of different gray levels with edges between them; when no such regions form, the data is discrete, the dispersion degree of the window is large, and a correspondingly large Gaussian filter size is used to suppress noise.
Thus, the corresponding gray level discrete degree around the image pixel is obtained.
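The following sketch illustrates the two quantities the dispersion degree is built from, the in-window gray variance and the number of similar-gray regions (Python with NumPy). The value-clustering rule used for region merging is one plausible reading of the description, and the combination of the two quantities into a single score is not reproduced because the original formula is not available here.

```python
import numpy as np

def window_dispersion(win: np.ndarray, sim_thresh: int = 10):
    """Gray variance and number of similar-gray regions for one window.

    Region merging here groups sorted gray values whose consecutive differences are
    below sim_thresh (10 in the application) -- one plausible reading of the
    'similar pixels are merged into regions' rule."""
    values = np.sort(win.astype(np.float32).ravel())
    variance = float(values.var())                 # pixel gray variance in the window
    n_regions = 1
    for a, b in zip(values[:-1], values[1:]):
        if b - a >= sim_thresh:                    # a gap of 10 or more starts a new region
            n_regions += 1
    return variance, n_regions

# Example: a window containing two flat gray areas (an edge) vs. a noisy window.
edge_win = np.full((5, 5), 20)
edge_win[:, 3:] = 90
noisy_win = np.random.default_rng(0).integers(0, 255, (5, 5))
print(window_dispersion(edge_win))    # large variance, 2 regions -> edge-like
print(window_dispersion(noisy_win))   # large variance, many regions -> noise-like
```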
For step S22, the step of determining the degree of the pixel membership edge according to the degree of gray level dispersion around the pixel to be detected to obtain the suitability of the template size includes:
judging whether an edge exists in the initial window according to the gray level discrete degree;
if an edge exists, calculating the gradient values that may appear at the edge in the initial window to obtain a gradient value set A;
Calculating the gradient of the to-be-measured point;
comparing the calculated gradient of the point to be measured with the gradient value set A, and judging the membership edge degree of the point to be measured;
and obtaining the suitability of the size of the initial template of the pixel to be detected according to the membership edge degree of the point to be detected.
For the refinement step of step S22, the specific logic is: if the pixel belongs to the edge, the filter should be selected to be small in size, so that the small edge is protected, and the edge distortion caused by smoothing is prevented. If the noise belongs to the noise or gray level distribution non-discrete area, a large-size filter is selected, so that the noise is effectively suppressed.
Judging whether an edge exists in the initial window according to the gray level discrete degree in the step, wherein the step comprises the step of indicating that the edge exists in the initial window when the gray level discrete degree exists; when the gray level discrete degree does not exist, the fact that no edge exists in the initial window is indicated.
Further, for comparing the calculated gradient of the point to be measured with the gradient value set a, the step of judging the degree of membership edge of the point to be measured includes:
if the gradient of the point to be detected falls within the gradient value set A, the point to be detected belongs to an edge; if the point to be detected is not at an edge position, quantifying its membership edge degree by combining the distance between the point and the edge pixels corresponding to the set A, thereby obtaining the membership edge degree of the point to be detected.
Further, for the step of obtaining the suitability of the initial template size of the pixel to be detected according to the membership edge degree of the point to be detected, the step includes:
if the membership edge degree of the to-be-measured point is smaller than the first preset value or larger than the second preset value, the pixel to be measured is not suitable for the initial template size;
if the membership edge degree of the to-be-measured point is larger than or equal to the first preset value and smaller than or equal to the second preset value, the pixel to be measured is indicated to be appropriate in initial template size.
The refinement step of step S22 is illustrated below:
the degree of dispersion of each window can be obtained according to step S21,if the value indicates that the edge exists in the window, calculating the gradient value possibly appearing at the edge in the window to obtain a gradient value set +.>
Calculating the gradient of the point to be detected: the traditional Canny algorithm calculates the pixel gradient by finite differences in a 2×2 neighborhood, whereas this application adopts a 3×3 neighborhood and additionally combines the gray changes of the pixels in the diagonal directions, so that the gradient value of the pixel to be detected is as accurate as possible.
In the formulas, G_i is the gradient amplitude of the i-th point to be detected; g_0 is the gray difference of the two pixels adjacent to the point in the horizontal direction; g_45 is the gray difference of the two pixels adjacent to the point on the 45-degree diagonal of the window centered on the point; g_90 is the gray difference of the two pixels adjacent to the point in the vertical direction; and g_135 is the gray difference of the two pixels adjacent to the point on the 135-degree diagonal of the window centered on the point.
θ_i is the gradient angle of the i-th pixel to be detected, and G_x, G_y denote the orthogonal decomposition of these differences in the horizontal and vertical directions, obtained by projecting the diagonal differences onto the horizontal and vertical axes and combining them with g_0 and g_90 respectively. The gradient of the i-th point to be detected, comprising the amplitude G_i = sqrt(G_x^2 + G_y^2) and the angle θ_i = arctan(G_y / G_x), is thus obtained.
The gradient of the pixel to be detected can be obtained through the above formula.
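A small sketch of this gradient computation in a 3×3 neighborhood follows (Python with NumPy). The horizontal, vertical, 45° and 135° differences match the description above, while the exact way the diagonal differences are folded into G_x and G_y is a common choice and an assumption.

```python
import numpy as np

def gradient_3x3(win: np.ndarray):
    """Gradient magnitude and angle of the centre pixel of a 3x3 window,
    using horizontal, vertical and both diagonal gray differences.
    The folding of the diagonals into gx/gy is an assumed, common choice."""
    w = win.astype(np.float32)
    g0   = w[1, 2] - w[1, 0]          # horizontal difference of the two neighbours
    g90  = w[2, 1] - w[0, 1]          # vertical difference
    g45  = w[0, 2] - w[2, 0]          # 45-degree diagonal difference
    g135 = w[2, 2] - w[0, 0]          # 135-degree diagonal difference
    gx = g0 + (g45 + g135) / 2.0      # orthogonal decomposition, horizontal part
    gy = g90 + (g135 - g45) / 2.0     # orthogonal decomposition, vertical part
    magnitude = float(np.hypot(gx, gy))
    angle = float(np.degrees(np.arctan2(gy, gx)))
    return magnitude, angle

# Example: a vertical step edge -> large magnitude, angle near 0 degrees.
step = np.array([[20, 20, 90],
                 [20, 20, 90],
                 [20, 20, 90]])
print(gradient_3x3(step))
```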
In step S21, gray level differences smaller than 10 are defined as gray level similarity, different areas are obtained by clustering, and the gray level differences between the different areas are calculated and are approximately regarded as gradients at the junctions (i.e. edges) of the different areas. And whether the to-be-detected point belongs to the edge or not is analyzed, and comparison can be carried out so as to judge the degree of the membership edge of the to-be-detected point.
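The sketch below builds such a gradient value set A for one window by clustering gray values into similar regions (difference below 10) and taking the differences of the region means (Python with NumPy); the clustering-by-value approach is one plausible reading of the description.

```python
import numpy as np
from itertools import combinations

def edge_gradient_set(win: np.ndarray, sim_thresh: int = 10):
    """Approximate gradient values of the edges inside a window (set A):
    cluster gray values into similar-gray regions (difference < sim_thresh, i.e. 10),
    then take the pairwise differences of the region mean gray values."""
    values = np.sort(win.astype(np.float32).ravel())
    regions, current = [], [values[0]]
    for v in values[1:]:
        if v - current[-1] >= sim_thresh:   # a gap of 10 or more starts a new region
            regions.append(current)
            current = [v]
        else:
            current.append(v)
    regions.append(current)
    means = [float(np.mean(r)) for r in regions]
    # set A: gray differences between region means ~ gradients at the junctions (edges)
    return {abs(a - b) for a, b in combinations(means, 2)}

win = np.array([[20, 22, 90], [21, 23, 92], [19, 88, 91]])
print(edge_gradient_set(win))   # one junction between the ~21 and ~90 regions
```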
If the gradient of the point to be detected falls within the gradient value set A, the point belongs to an edge. If it is not at an edge position, the membership edge degree of the point to be detected is quantified by combining the distance between the point and the edge pixels corresponding to the calculated set A with the 8-neighborhood gray mean value.
Specifically, the shortest distance d between the point to be detected and the edges corresponding to the set A calculated in its window is computed, giving a distance value for each point, and the gray difference Δg between the gray mean of the 3×3 window centered on the point to be detected and the gray value of the point itself is calculated. The membership edge degree of the point to be detected is then
Q = Δg / d
wherein Q denotes the membership edge degree of the point to be detected, d is the shortest distance between the point to be detected and the edges derived from the set A of the corresponding window, and Δg is the gray difference between the gray mean of the 3×3 window centered on the point to be detected and the point itself.
Formula logic: the minimum distance between the point to be detected and the edge acts as an inverse weight and is multiplied by the difference between the gray mean of the 3×3 window around the point and the gray value of the point, to represent the membership edge degree of the point. The distance between the point and the edge is inversely proportional to its edge degree: the smaller d is, the higher the edge degree. The larger the difference between the 8-neighborhood gray mean and the gray value of the point, the more the point belongs to an edge, while a difference close to 0 means the membership edge degree is very low, so the edge degree is directly proportional to this difference. In general, when Δg is greater than 10 (i.e. the gray difference exceeds 10), the 8-neighborhood gray distribution of the point is quite discrete, and when d amounts to only a few pixels the point is close to the edge, so Q becomes large, indicating that the point belongs to the edge while the surrounding gray distribution still contains a uniform part; in this case the initial template size can effectively protect edges and suppress noise when the image is smoothed.
If an edge exists, a small-size filter should be selected; if the initial template chosen by the application is larger than needed, or the distance from the point to be detected to the edge is very large or no edge exists at all (so that the membership edge degree is very small), the initial template is unsuitable and the template size (i.e. the filter size) needs to be adjusted. If the calculated membership edge degree indicates that the gray distribution around the point to be detected may be a noise distribution, the initial size can be appropriately enlarged. The size judgment follows the two preset values described above: a membership edge degree below the first preset value or above the second preset value means the initial template size is unsuitable, while a value between the two means the initial size is appropriate.
thus, the suitability of the initial template size of the pixel to be detected is obtained.
For step S3, the step of quantizing the size and coefficient adjustment degree of the gaussian filter according to the suitability of the initial template size to complete adaptive gaussian filtering of the image specifically includes:
and (2) obtaining the suitability of the image pixel point corresponding to the initial template according to the step (S2), and adjusting the size according to different gray level distribution conditions to finish the self-adaptive Gaussian filtering.
The specific logic is as follows: where obvious edges exist, the size of the initial template is reduced; where a gray-gradient region exists, the original template size may be left unchanged; where the gray level is evenly distributed or noise dominates, the size of the initial template is increased.
The following specifically exemplifies step S3:
when weight is allocated to each position in Gaussian filter, the standard deviation of the gray scale of the pixels in the corresponding image of the filter is relied on。/>The smaller the pixel weight is, the larger the pixel weight changes outwards from the central position, and the central pixel gray scale is highlighted; />The larger the pixel weight is, the less obvious the pixel weight changes outwards from the central position, the filtering effect is similar to that of average filtering, and if edge pixels exist, the edge can be distorted, and the edge is blurred. And c, when judging the edge degree of the point to be tested in the step b, the variance value calculated in the window is generally larger, namely the standard deviation is larger.
And then, through judging the pixels when Gaussian filtering is carried outBelonging to the section and +.>The size of the filter is adjusted.
When the membership edge degree falls in the noise interval, σ is relatively large and noise exists in the initial template of the point to be detected; the initial size or an enlarged size is used, the Gaussian filtering effect approaches mean filtering, and noise is suppressed.
When the membership edge degree lies between the two preset values, σ takes an intermediate value so that tiny edges stand out, and the filter suppresses noise in the parts where the gray distribution is uniform.
When the membership edge degree falls in the edge interval, σ takes a small value, highlighting the weight of the center point, and a smaller filter size is selected, which protects the edge from blurring.
Thus, the adjustment of the filter size and the adaptive Gaussian filtering are completed.
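A minimal sketch of this per-pixel adjustment follows (Python with OpenCV and NumPy); the concrete kernel sizes 3×3/5×5/7×7, the sigma values and the interval thresholds are illustrative assumptions, and the edge-degree map is assumed to have been computed as in the previous steps.

```python
import cv2
import numpy as np

def adaptive_gaussian(gray: np.ndarray, edge_degree: np.ndarray,
                      t1: float = 2.0, t2: float = 20.0) -> np.ndarray:
    """Per-pixel Gaussian smoothing whose kernel size and sigma follow the
    three intervals described above (sizes and sigmas are assumed values)."""
    configs = {
        "noise": (7, 2.0),   # low edge degree: enlarge the template, behave like mean filtering
        "keep":  (5, 1.0),   # middle interval: keep the initial template size
        "edge":  (3, 0.5),   # high edge degree: shrink the template, protect fine edges
    }
    kernels = {k: cv2.getGaussianKernel(size, sigma) @ cv2.getGaussianKernel(size, sigma).T
               for k, (size, sigma) in configs.items()}
    h, w = gray.shape
    out = np.empty((h, w), dtype=np.float32)
    padded = np.pad(gray.astype(np.float32), 3, mode="edge")   # pad by the largest radius
    for i in range(h):
        for j in range(w):
            q = edge_degree[i, j]
            key = "noise" if q < t1 else ("edge" if q > t2 else "keep")
            size, _ = configs[key]
            r = size // 2
            win = padded[i + 3 - r:i + 4 + r, j + 3 - r:j + 4 + r]
            out[i, j] = float((win * kernels[key]).sum())
    return out.astype(gray.dtype)
```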
For step S4, Canny edge detection is carried out on the bucket image to complete edge extraction. The logic of step S4 is as follows: as preprocessing for the Canny algorithm, Gaussian filtering should reduce the influence of noise without blurring the fine edges of the image, so that the completeness of gradient extraction can be guaranteed in the subsequent gradient calculation. Specifically, after the above steps complete the adaptive Gaussian filtering of the bucket image, gradient calculation in the Canny algorithm is performed, non-maximum suppression is applied, and double thresholds are set to extract the edges of the bucket image, which are then compared with the standard edges to complete the analysis of the bucket surface defects.
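The following sketch shows this final step (Python with OpenCV); the Canny thresholds and the simple pixel-wise comparison against a defect-free reference edge map are illustrative assumptions rather than values from the disclosure.

```python
import cv2
import numpy as np

def inspect_bucket(smoothed: np.ndarray, standard_edges: np.ndarray) -> float:
    """Run Canny on the adaptively smoothed image (gradient computation,
    non-maximum suppression and double thresholding happen inside cv2.Canny)
    and compare the result with the edge map of a defect-free reference bucket."""
    edges = cv2.Canny(smoothed.astype(np.uint8), 50, 150)
    # edge pixels present on the product but absent from the reference
    # indicate candidate defects such as cracks or holes
    extra = np.logical_and(edges > 0, standard_edges == 0)
    n_edges = max(int((edges > 0).sum()), 1)
    return float(extra.sum()) / n_edges
```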
This completes the entire procedure of the visual inspection method for bucket surface defects provided by the application. By using adaptive Gaussian filtering, the algorithm adjusts the parameters of the Gaussian filter according to the membership of each pixel during image processing, so that noise can be suppressed while the tiny edges of the image are not distorted. In detail, the gray distribution around the pixel to be detected is analyzed and the interval to which the pixel belongs is judged, so that the filter is adjusted according to the different pixel memberships. The filter size is reduced when the pixel lies on an edge, and, combined with the standard deviation calculated for the initial window, small edges can be protected when filtering at edges; otherwise, the filter size is increased and noise is suppressed. In this way the algorithm takes both noise suppression and fine-edge protection into account when smoothing the image.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, which may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, an embodiment of the present application provides a computer-readable storage medium in which a plurality of computer programs are stored, the computer programs being capable of being loaded by a processor to perform the steps in any of the visual inspection methods for surface defects of a water tub provided by the embodiment of the present application. For example, the computer program may perform the steps of:
s1, acquiring a bucket image, and carrying out graying treatment on the bucket image to obtain a graying image;
s2, analyzing gray level discrete degree around a pixel to be detected of the gray level image, and quantifying pixel membership edge probability to obtain the size fit degree of an initial template of the pixel to be detected;
s3, quantifying the size and coefficient adjustment degree of the Gaussian filter according to the fit degree of the initial template size, and completing the self-adaptive Gaussian filtering of the image;
s4, carrying out Canny edge detection on the bucket image to finish edge extraction.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Wherein the storage medium may include: a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or the like.
The steps in any one of the visual inspection methods for surface defects of the water bucket provided by the embodiments of the present application can be executed due to the computer program stored in the storage medium, so that the beneficial effects of any one of the visual inspection methods for surface defects of the water bucket provided by the embodiments of the present application can be achieved, which are detailed in the previous embodiments and are not described herein.
The foregoing has described in detail a visual inspection method for surface defects of a water tub, which is provided by the embodiments of the present application, and specific examples are applied herein to illustrate the principles and embodiments of the present application, and the above description of the examples is only for helping to understand the method and core ideas of the present application; meanwhile, as those skilled in the art will have variations in the specific embodiments and application scope in light of the ideas of the present application, the present description should not be construed as limiting the present application.

Claims (5)

1. A visual inspection method for surface defects of a water bucket, the method comprising:
collecting a bucket image, and carrying out graying treatment on the bucket image to obtain a graying image;
analyzing the gray level discrete degree around the pixel to be detected of the gray level image, and quantifying the probability of the membership edge of the pixel to obtain the size fit degree of the initial template of the pixel to be detected;
according to the size fitting degree of the initial template, the size and coefficient adjustment degree of the Gaussian filter are quantized, and self-adaptive Gaussian filtering of the image is completed;
carrying out Canny edge detection on the bucket image to finish edge extraction;
the step of analyzing the gray level discrete degree around the pixel to be detected of the gray level image, quantifying the probability of the pixel membership edge, and obtaining the suitability of the initial template size of the pixel to be detected comprises the following steps:
constructing an initial window by taking a pixel to be detected as a center, and analyzing gray distribution in an initial template to obtain gray discrete degree around the pixel to be detected;
judging the degree of the membership edge of the pixel according to the degree of the gray level dispersion around the pixel to be detected, and obtaining the suitability of the template size;
the step of constructing an initial window by taking the pixel to be detected as a center and analyzing the gray distribution in the initial template to obtain the gray discrete degree around the pixel to be detected comprises the following steps:
set up with the pixel to be measured as the centerA size initial gaussian filter template;
analyzing the gray image of the bucket collected on the production line, and comparing the gray valueDefining the pixels of the pixel to be detected as illumination pixels, calculating the illumination pixel duty ratio in the initial window corresponding to each pixel to be detected, and eliminating the influence of the illumination pixels according to the illumination pixel duty ratio;
analyzing the gray level distribution degree in an initial window divided by the pixels to be detected, calculating the gray level variance of the pixels in the initial window, and analyzing the gray level discrete degree of the window corresponding to each pixel by combining the similar pixel aggregation degree in the initial window;
the calculation formula for the illumination pixel proportion in the initial window corresponding to each pixel to be detected is:

P = n / N

wherein P is the illumination pixel proportion in the initial window corresponding to each pixel to be detected, n is the number of illumination pixels in the initial window, and N is the total number of pixels contained in the initial window;
the gray-level dispersion degree of each initial window is calculated from the pixel gray variance in the initial window together with the number of similar-gray regions in that window: pixels whose gray difference is smaller than 10 are regarded as similar pixels, the similar pixels in each window are merged to obtain corresponding regions, and the number of regions is recorded;
The step of judging the pixel membership edge degree according to the peripheral gray level discrete degree of the pixel to be detected to obtain the suitability of the template size comprises the following steps:
judging whether an edge exists in the initial window according to the gray level discrete degree;
if an edge exists, calculating the gradient values that may appear at the edge in the initial window to obtain a gradient value set A;
Calculating the gradient of the to-be-measured point;
comparing the calculated gradient of the point to be measured with the gradient value set A, and judging the membership edge degree of the point to be measured;
and obtaining the suitability of the size of the initial template of the pixel to be detected according to the membership edge degree of the point to be detected.
2. The visual inspection method of water tub surface defects as set forth in claim 1, wherein said collecting water tub images and graying said water tub images to obtain grayed images comprises:
and acquiring a corresponding bucket image through a camera arranged on a bucket production line of the water dispenser, and carrying out graying treatment on the bucket image according to texture features existing on the bucket to obtain a graying image.
3. The visual inspection method of water tub surface defects according to claim 1, wherein said determining whether an edge exists in the initial window according to the gray level dispersion degree comprises indicating that an edge exists in the initial window when the gray level dispersion degree exists; when the gray level discrete degree does not exist, the fact that no edge exists in the initial window is indicated.
4. The visual inspection method for surface defects of water bucket according to claim 1, wherein the step of comparing the calculated gradient of the point to be inspected with the gradient value set a to judge the degree of membership edges of the point to be inspected comprises the steps of:
if the gradient of the point to be detected falls within the gradient value set A, the point to be detected belongs to an edge; if the point to be detected is not at an edge position, quantifying its membership edge degree by combining the distance between the point and the edge pixels corresponding to the set A, thereby obtaining the membership edge degree of the point to be detected.
5. The visual inspection method of water bucket surface defects according to claim 1, wherein the step of obtaining the suitability of the initial template size of the pixel to be inspected according to the membership edge degree of the point to be inspected comprises the following steps:
if the membership edge degree of the to-be-measured point is smaller than the first preset value or larger than the second preset value, the pixel to be measured is not suitable for the initial template size;
if the membership edge degree of the to-be-measured point is larger than or equal to the first preset value and smaller than or equal to the second preset value, the pixel to be measured is indicated to be appropriate in initial template size.
CN202310868040.7A 2023-07-17 2023-07-17 Visual inspection method for surface defects of bucket Active CN116612112B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310868040.7A CN116612112B (en) 2023-07-17 2023-07-17 Visual inspection method for surface defects of bucket

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310868040.7A CN116612112B (en) 2023-07-17 2023-07-17 Visual inspection method for surface defects of bucket

Publications (2)

Publication Number Publication Date
CN116612112A CN116612112A (en) 2023-08-18
CN116612112B (en) 2023-09-22

Family

ID=87685667

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310868040.7A Active CN116612112B (en) 2023-07-17 2023-07-17 Visual inspection method for surface defects of bucket

Country Status (1)

Country Link
CN (1) CN116612112B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117094912B (en) * 2023-10-16 2024-01-16 南洋电气集团有限公司 Welding image enhancement method and system for low-voltage power distribution cabinet
CN117990713A (en) * 2024-02-18 2024-05-07 广州华研制药设备有限公司 Visual inspection equipment for surface defects of drinking water barrel

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101170641A (en) * 2007-12-05 2008-04-30 北京航空航天大学 A method for image edge detection based on threshold sectioning
WO2009113231A1 (en) * 2008-03-14 2009-09-17 株式会社ソニー・コンピュータエンタテインメント Image processing device and image processing method
CN102044071A (en) * 2010-12-28 2011-05-04 上海大学 Single-pixel margin detection method based on FPGA
CN102521836A (en) * 2011-12-15 2012-06-27 江苏大学 Edge detection method based on gray-scale image of specific class
CN102637317A (en) * 2012-04-26 2012-08-15 东南大学 Coin size measuring method based on vision
CN105719298A (en) * 2016-01-22 2016-06-29 北京航空航天大学 Edge detection technology based line diffusion function extracting method
CN107657606A (en) * 2017-09-18 2018-02-02 深圳市华星光电半导体显示技术有限公司 The luminance defects detection method and device of a kind of display device
CN109829876A (en) * 2018-05-30 2019-05-31 东南大学 Carrier bar on-line detection device of defects and method based on machine vision
CN211293129U (en) * 2019-09-04 2020-08-18 国网江苏省电力有限公司电力科学研究院 Partial discharge detection device with combined action of alternating current and impulse voltage
CN114782475A (en) * 2022-06-16 2022-07-22 南通金石包装印刷有限公司 Corrugated carton line pressing defect detection optimization method based on artificial intelligence system
CN116309510A (en) * 2023-03-29 2023-06-23 清华大学 Numerical control machining surface defect positioning method and device
CN116309570A (en) * 2023-05-18 2023-06-23 山东亮马新材料科技有限公司 Titanium alloy bar quality detection method and system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160253574A1 (en) * 2013-11-28 2016-09-01 Pavel S. Smirnov Technologies for determining local differentiating color for image feature detectors
CN112950508B (en) * 2021-03-12 2022-02-11 中国矿业大学(北京) Drainage pipeline video data restoration method based on computer vision

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101170641A (en) * 2007-12-05 2008-04-30 北京航空航天大学 A method for image edge detection based on threshold sectioning
WO2009113231A1 (en) * 2008-03-14 2009-09-17 株式会社ソニー・コンピュータエンタテインメント Image processing device and image processing method
CN102044071A (en) * 2010-12-28 2011-05-04 上海大学 Single-pixel margin detection method based on FPGA
CN102521836A (en) * 2011-12-15 2012-06-27 江苏大学 Edge detection method based on gray-scale image of specific class
CN102637317A (en) * 2012-04-26 2012-08-15 东南大学 Coin size measuring method based on vision
CN105719298A (en) * 2016-01-22 2016-06-29 北京航空航天大学 Edge detection technology based line diffusion function extracting method
CN107657606A (en) * 2017-09-18 2018-02-02 深圳市华星光电半导体显示技术有限公司 The luminance defects detection method and device of a kind of display device
CN109829876A (en) * 2018-05-30 2019-05-31 东南大学 Carrier bar on-line detection device of defects and method based on machine vision
CN211293129U (en) * 2019-09-04 2020-08-18 国网江苏省电力有限公司电力科学研究院 Partial discharge detection device with combined action of alternating current and impulse voltage
CN114782475A (en) * 2022-06-16 2022-07-22 南通金石包装印刷有限公司 Corrugated carton line pressing defect detection optimization method based on artificial intelligence system
CN116309510A (en) * 2023-03-29 2023-06-23 清华大学 Numerical control machining surface defect positioning method and device
CN116309570A (en) * 2023-05-18 2023-06-23 山东亮马新材料科技有限公司 Titanium alloy bar quality detection method and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on defect detection method for welding images based on morphology; Ma Yun; Wang He; Zhang Xiaoguang; Hu Xiaoqin; Zhang Tao; Computer Measurement & Control (Issue 05); full text *

Also Published As

Publication number Publication date
CN116612112A (en) 2023-08-18

Similar Documents

Publication Publication Date Title
CN116612112B (en) Visual inspection method for surface defects of bucket
CN109410230B (en) Improved Canny image edge detection method capable of resisting noise
CN116843688B (en) Visual detection method for quality of textile
CN102156996B (en) Image edge detection method
CN108827181B (en) Vision-based plate surface detection method
CN116168026A (en) Water quality detection method and system based on computer vision
CN109816645B (en) Automatic detection method for steel coil loosening
CN115619793B (en) Power adapter appearance quality detection method based on computer vision
CN110189290A (en) Metal surface fine defects detection method and device based on deep learning
CN116740061B (en) Visual detection method for production quality of explosive beads
CN116228768B (en) Method for detecting scratches on surface of electronic component
JP5705711B2 (en) Crack detection method
CN116993742B (en) Nickel alloy rolling defect detection method based on machine vision
CN115272256A (en) Sub-pixel level sensing optical fiber path Gaussian extraction method and system
CN117764983A (en) Visual detection method for binocular identification of intelligent manufacturing production line
CN116883412A (en) Graphene far infrared electric heating equipment fault detection method
JP2008267943A (en) Crack detecting method
CN117764989B (en) Visual-aided display screen defect detection method
CN117011291B (en) Watch shell quality visual detection method
CN115984246B (en) Machine vision-based defect rapid detection method and device, equipment and storage medium
Nagase et al. Automatic calculation and visualization of nuclear density in whole slide images of hepatic histological sections
CN111415365A (en) Image detection method and device
CN109544481B (en) Aviation image dodging method and system based on machine learning
CN109949245B (en) Cross laser detection positioning method and device, storage medium and computer equipment
CN116805314B (en) Building engineering quality assessment method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant