CN116091455A - Steel mesh surface defect judging method based on machine vision - Google Patents
Steel mesh surface defect judging method based on machine vision
- Publication number
- CN116091455A CN116091455A CN202310045703.5A CN202310045703A CN116091455A CN 116091455 A CN116091455 A CN 116091455A CN 202310045703 A CN202310045703 A CN 202310045703A CN 116091455 A CN116091455 A CN 116091455A
- Authority
- CN
- China
- Prior art keywords
- gray
- cluster
- uniformity
- steel mesh
- frequency
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
- 229910000831 Steel Inorganic materials 0.000 title claims abstract description 72
- 239000010959 steel Substances 0.000 title claims abstract description 72
- 238000000034 method Methods 0.000 title claims abstract description 46
- 230000007547 defect Effects 0.000 title claims abstract description 38
- 239000011159 matrix material Substances 0.000 claims abstract description 29
- 238000013139 quantization Methods 0.000 claims abstract description 19
- 238000013528 artificial neural network Methods 0.000 claims abstract description 7
- 238000011077 uniformity evaluation Methods 0.000 claims description 14
- 238000004422 calculation algorithm Methods 0.000 claims description 13
- 239000002245 particle Substances 0.000 claims description 11
- 239000006185 dispersion Substances 0.000 claims description 9
- 238000005457 optimization Methods 0.000 claims description 3
- 238000009826 distribution Methods 0.000 abstract description 9
- 238000004364 calculation method Methods 0.000 abstract description 8
- 238000001514 detection method Methods 0.000 abstract description 8
- 230000008569 process Effects 0.000 abstract description 7
- 238000013473 artificial intelligence Methods 0.000 abstract description 2
- 238000013527 convolutional neural network Methods 0.000 description 7
- UQSXHKLRYXJYBZ-UHFFFAOYSA-N Iron oxide Chemical compound [Fe]=O UQSXHKLRYXJYBZ-UHFFFAOYSA-N 0.000 description 6
- 230000000694 effects Effects 0.000 description 3
- 238000009776 industrial production Methods 0.000 description 2
- NDLPOXTZKUMGOV-UHFFFAOYSA-N oxo(oxoferriooxy)iron hydrate Chemical compound O.O=[Fe]O[Fe]=O NDLPOXTZKUMGOV-UHFFFAOYSA-N 0.000 description 2
- 238000012549 training Methods 0.000 description 2
- 241001270131 Agaricus moelleri Species 0.000 description 1
- 208000032544 Cicatrix Diseases 0.000 description 1
- 230000006978 adaptation Effects 0.000 description 1
- 230000003044 adaptive effect Effects 0.000 description 1
- 238000004458 analytical method Methods 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 238000006243 chemical reaction Methods 0.000 description 1
- 238000010276 construction Methods 0.000 description 1
- 238000001914 filtration Methods 0.000 description 1
- 239000012535 impurity Substances 0.000 description 1
- JEIPFZHSYJVQDO-UHFFFAOYSA-N iron(III) oxide Inorganic materials O=[Fe]O[Fe]=O JEIPFZHSYJVQDO-UHFFFAOYSA-N 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 239000000203 mixture Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000011176 pooling Methods 0.000 description 1
- 238000012545 processing Methods 0.000 description 1
- 230000000750 progressive effect Effects 0.000 description 1
- 231100000241 scar Toxicity 0.000 description 1
- 230000037387 scars Effects 0.000 description 1
- 230000011218 segmentation Effects 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/40—Analysis of texture
- G06T7/41—Analysis of texture based on statistical description of texture
- G06T7/44—Analysis of texture based on statistical description of texture using image operators, e.g. filters, edge density metrics or local histograms
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/762—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
- G06V10/763—Non-hierarchical techniques, e.g. based on statistics of modelling distributions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30136—Metal
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- Probability & Statistics with Applications (AREA)
- Health & Medical Sciences (AREA)
- Quality & Reliability (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Databases & Information Systems (AREA)
- Evolutionary Computation (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
Abstract
The invention relates to the technical field of artificial intelligence, in particular to a steel mesh surface defect judging method based on machine vision. Acquiring a surface image of a steel mesh to obtain a corresponding steel mesh gray image, adaptively dividing the gray value in the steel mesh gray image into N optimal gray intervals according to the gray values and distribution positions of pixel points, carrying out gray level quantization on the gray values based on the optimal gray intervals to obtain a quantized gray image, acquiring a gray co-occurrence matrix of the quantized gray image to obtain texture entropy of each pixel point, forming an entropy matrix corresponding to the surface image, and inputting the surface image and the corresponding entropy matrix into a neural network to confirm the type of the steel mesh defect. By carrying out self-adaptive gray scale interval division on the gray scale value in the image and carrying out gray scale quantization based on the divided gray scale interval, the gray scale quantization process is more flexible, so that the calculation amount is simplified and reduced on the premise of reducing the loss of texture information, and the defect detection accuracy is improved.
Description
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a steel mesh surface defect judging method based on machine vision.
Background
In industrial production, owing to the production environment and process, a steel mesh inevitably develops defects such as cracks, inclusions, pressed-in iron oxide, pits, scratches, scars, holes, rust spots and bubbles, which greatly affect the appearance and even the performance of the product. At present, steel mesh surface defects are usually detected by comparing the acquired image against a standard template image, which places high demands on the standard image and otherwise yields inaccurate detection; alternatively, defects are detected by threshold segmentation, where the threshold is generally chosen empirically, but since the characteristics of the steel mesh images collected each time differ, this method easily produces missed and false detections, and it is difficult to guarantee that the detection effect meets the requirements of industrial production.
Disclosure of Invention
In order to solve the technical problems, the invention aims to provide a steel mesh surface defect judging method based on machine vision, which adopts the following technical scheme:
acquiring a surface image of the steel mesh to obtain a corresponding steel mesh gray image;
obtaining a gray histogram of the steel mesh gray image to obtain a maximum gray value and a minimum gray value, setting a clustering radius of a DBSCAN clustering algorithm according to the maximum gray value and the minimum gray value, and dividing the gray values in the steel mesh gray image into A clusters based on the clustering radius, wherein A is a positive integer greater than 0; dividing the gray values in the steel mesh gray image into N gray intervals based on the clusters, wherein N is a positive integer greater than 0 and N ≥ A; constructing an objective function according to the number of gray intervals and the number of pixel points corresponding to the gray intervals contained in each cluster after division, and acquiring the N optimal gray intervals corresponding to the objective function when the objective function reaches the global optimum;
carrying out gray level quantization on gray values based on N optimal gray intervals of the steel mesh gray image to obtain a quantized gray image; acquiring a gray level co-occurrence matrix of the quantized gray level image to obtain texture entropy of each pixel point, and forming an entropy matrix corresponding to the surface image; inputting the surface image and the corresponding entropy matrix into a neural network to confirm the type of steel mesh defects.
Further, the method for constructing the objective function according to the number of gray intervals and the number of pixels corresponding to the gray intervals contained in each divided cluster includes:
B demarcation points required for the re-division are obtained according to the number of gray intervals and the number of clusters, and the B demarcation points are used to re-divide the initial gray intervals corresponding to the A clusters;
calculating intra-cluster frequency uniformity and inter-cluster frequency uniformity of the A clusters according to the frequency of each gray interval based on the N re-divided gray intervals, wherein the frequency refers to the proportion of the number of pixel points contained in a gray interval to the total number of pixel points in the steel mesh gray image; calculating the intra-cluster spatial dispersity uniformity and inter-cluster spatial dispersity uniformity of the A clusters according to the position coordinates of the pixel points; calculating intra-cluster gray span uniformity and inter-cluster gray span uniformity of the A clusters according to the gray span of each gray interval, wherein the gray span refers to the difference between the maximum gray level and the minimum gray level corresponding to the gray interval;
linearly combining the intra-cluster frequency uniformity, the intra-cluster spatial dispersity uniformity and the intra-cluster gray scale span uniformity into an intra-cluster uniformity evaluation value; and linearly combining the inter-cluster frequency uniformity, the inter-cluster space dispersity uniformity and the inter-cluster gray scale span uniformity into inter-cluster uniformity evaluation values, and constructing the objective function of N gray scale intervals by combining the intra-cluster uniformity evaluation values and the inter-cluster uniformity evaluation values.
Further, the method for calculating intra-cluster frequency uniformity and inter-cluster frequency uniformity of the a clusters according to the frequency of each gray scale interval includes:
calculating the average frequency of each cluster according to the frequency of each gray interval, obtaining the sum of the frequency differences of each cluster according to the frequency difference between the frequency of each gray interval and the average frequency of the cluster to which the frequency difference belongs, and obtaining the intra-cluster frequency uniformity degree by combining the sum of the frequency differences of A clusters;
calculating a first average value of the average frequency among A clusters according to the average frequency of each cluster, and calculating the inter-cluster frequency uniformity according to the difference between the average frequency of each cluster and the first average value.
Further, the method for calculating the intra-cluster spatial dispersity uniformity and the inter-cluster spatial dispersity uniformity of the A clusters according to the position coordinates of the pixel points comprises the following steps:
taking the average value of the position coordinates of all the pixel points in each gray scale interval as the center position coordinate of the corresponding gray scale interval, calculating the Euclidean distance between the position coordinate of each pixel point in the gray scale interval and the center position coordinate, obtaining the distance average value of each gray scale interval, and taking the distance average value as the space dispersity of the corresponding gray scale interval;
calculating the average spatial dispersity of each cluster according to the spatial dispersity of each gray interval in the cluster, obtaining the sum of the spatial dispersity differences of each cluster according to the difference between the spatial dispersity of each gray interval and the corresponding average spatial dispersity, and calculating the intra-cluster spatial dispersity uniformity according to the sums of the spatial dispersity differences of the A clusters;
calculating a second average value of the average spatial dispersion degree among A clusters according to the average spatial dispersion degree of each cluster, and calculating the inter-cluster spatial dispersion degree uniformity according to the difference between the average spatial dispersion degree of each cluster and the second average value.
Further, the method for calculating the intra-cluster gray span uniformity and the inter-cluster gray span uniformity of the A clusters according to the gray span of each gray interval comprises the following steps:
calculating the average gray span of each cluster according to the gray span of each gray interval, obtaining the sum of the gray span differences of each cluster according to the difference between the gray span of each gray interval and the average gray span of the cluster to which it belongs, and calculating the intra-cluster gray span uniformity by combining the sums of the gray span differences of the A clusters;
and calculating a third average value of the average gray scale span among A clusters according to the average gray scale span of each cluster, and calculating the inter-cluster gray scale span uniformity according to the difference between the average gray scale span of each cluster and the third average value.
Further, a particle swarm optimization algorithm is adopted to obtain a global optimal solution of the objective function.
Further, the method for setting the clustering radius of the DBSCAN clustering algorithm according to the maximum gray value and the minimum gray value comprises the following steps:
and calculating a gray level difference value between the maximum gray level value and the minimum gray level value, setting the clustering radius according to the gray level difference value, and enabling the gray level difference value and the clustering radius to form a positive correlation.
The embodiment of the invention has at least the following beneficial effects: by carrying out self-adaptive gray scale interval division on the gray scale value in the image and carrying out gray scale quantization based on the divided gray scale interval, the gray scale quantization process is more flexible, so that the calculation amount is simplified and reduced on the premise of reducing the loss of texture information, and the defect detection accuracy is improved.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions and advantages of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are only some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of steps of a method for determining a defect on a steel mesh surface based on machine vision according to an embodiment of the present invention.
Detailed Description
In order to further describe the technical means and effects adopted by the invention to achieve the preset aim, the following is a specific implementation, structure, characteristics and effects of the machine vision-based steel mesh surface defect judging method according to the invention, which are described in detail below with reference to the accompanying drawings and the preferred embodiment. In the following description, different "one embodiment" or "another embodiment" means that the embodiments are not necessarily the same. Furthermore, the particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following specifically describes a specific scheme of the steel mesh surface defect judging method based on machine vision.
Referring to fig. 1, a flowchart of a method for determining a defect on a steel mesh surface based on machine vision according to an embodiment of the invention is shown, the method includes the following steps:
and S001, acquiring a surface image of the steel mesh to obtain a corresponding steel mesh gray image.
Specifically, an industrial camera is used to capture images of the steel mesh surface to obtain the surface image; the surface image is then converted to gray scale to obtain the corresponding steel mesh gray image, and, to prevent noise from affecting it, the steel mesh gray image is subjected to an image filtering pretreatment.
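As an illustration of this step, a minimal preprocessing sketch in Python is given below; the file name, the use of OpenCV and the choice of a 5×5 median filter are assumptions for illustration and are not specified in the original disclosure.

```python
import cv2

# Surface image captured by the industrial camera ("steel_mesh.png" is a
# placeholder file name).
surface = cv2.imread("steel_mesh.png")

# Gray-level conversion of the surface image into the steel mesh gray image.
gray = cv2.cvtColor(surface, cv2.COLOR_BGR2GRAY)

# Image filtering pretreatment to suppress noise; a 5x5 median filter is one
# possible choice, since the disclosure does not name a specific filter.
gray_filtered = cv2.medianBlur(gray, 5)
```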
Step S002, obtaining a gray histogram of the steel mesh gray image, obtaining a maximum gray value and a minimum gray value, setting a clustering radius of a DBSCAN clustering algorithm according to the maximum gray value and the minimum gray value, and dividing the gray values in the steel mesh gray image into A clusters based on the clustering radius, wherein A is a positive integer greater than 0; dividing the gray values in the steel mesh gray image into N gray intervals based on the clusters, wherein N is a positive integer greater than 0 and N ≥ A; constructing an objective function according to the number of gray intervals and the number of pixel points corresponding to the gray intervals contained in each cluster after division, and obtaining the N optimal gray intervals corresponding to the objective function when the objective function reaches the global optimum.
Specifically, different defects of the steel mesh have different characteristics, and the gray co-occurrence matrix describes the spatial distribution relation among the pixels in the image. The dimension of the gray co-occurrence matrix is related to the number of gray levels: for an image with L gray levels, the size of the gray co-occurrence matrix is L × L. In order to simplify the operation without affecting the image quality as far as possible, the number of gray levels can be reduced; at the same time, an adaptive gray level division algorithm is adopted to divide the gray intervals so that the samples of each gray interval are reasonably distributed and the loss of texture detail caused by the gray level division is reduced. Therefore, the embodiment of the invention quantizes the gray values in the steel mesh gray image into N levels, where N is a positive integer greater than 0.
Considering that the usual gray quantization divides the gray range uniformly, i.e. gray values 0-31 are quantized to 0, 32-63 to 1, and so on up to 224-255, which are quantized to 7, whereas the gray values of an actual image are not evenly distributed over the levels 0-255. If most of the pixel values of the image lie in one narrow interval, those pixels are all mapped to the same quantized gray value after quantization, and only a few pixels correspond to the remaining seven gray values, so the gray co-occurrence matrix generated after quantization cannot represent the texture of the image well: most of the texture information collapses onto a single gray level and is hidden. Therefore the optimal quantization intervals must be set according to the pixel distribution of the image, so that the texture characteristics of the image are better represented. The procedure specifically comprises:
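To make the shortcoming of uniform quantization concrete, the following sketch (an illustration only; the synthetic image and the 8-level setting are assumptions) shows how a narrowly distributed image collapses onto a single quantized level:

```python
import numpy as np

def uniform_quantize(gray_img: np.ndarray, levels: int = 8) -> np.ndarray:
    """Uniform quantization: 0-31 -> 0, 32-63 -> 1, ..., 224-255 -> 7."""
    width = 256 // levels
    return gray_img // width

# A synthetic image whose gray values are concentrated around 120: after
# uniform quantization almost every pixel falls into the same level, so a
# gray co-occurrence matrix built on it would hide the texture information.
img = np.random.normal(120, 6, size=(64, 64)).clip(0, 255).astype(np.uint8)
counts = np.bincount(uniform_quantize(img).ravel(), minlength=8)
print(counts)  # nearly all pixels land in one of the eight bins
```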
(1) Obtain the gray histogram of the steel mesh gray image, and from it obtain the maximum gray value g_max and the minimum gray value g_min.
(2) Cluster the gray values in the gray histogram with the DBSCAN clustering algorithm. DBSCAN requires two parameters: the neighborhood radius Eps and the minimum-number threshold MinPts. In the embodiment of the invention, assuming that the steel mesh gray image contains R distinct gray values, the neighborhood radius Eps and the threshold MinPts are set according to empirical values, and the gray values in the gray histogram are then clustered with the configured DBSCAN algorithm to obtain A clusters, where A is a positive integer greater than 0; that is, the gray values in the steel mesh gray image are divided into A initial gray intervals.
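A possible realization of this clustering step is sketched below. It clusters the distinct gray values present in the histogram with scikit-learn's DBSCAN; the eps factor, the min_samples value and the decision to ignore the gray-value frequencies are assumptions, since the disclosure only states that the two parameters are set empirically (and, per the claims, that the radius grows with g_max - g_min).

```python
import numpy as np
from sklearn.cluster import DBSCAN

def initial_gray_clusters(gray_img, eps, min_samples):
    """Cluster the distinct gray values of the image (1-D DBSCAN over the
    histogram support) into the A initial gray intervals."""
    values = np.unique(gray_img)                      # the R gray values present
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(
        values.reshape(-1, 1))
    clusters = [values[labels == k] for k in sorted(set(labels)) if k != -1]
    # Each cluster spans [min, max] of its member gray values, giving an
    # initial gray interval.
    return [(int(c.min()), int(c.max())) for c in clusters]

# Example: eps set in positive correlation with the gray difference
# g_max - g_min; the factor 0.05 and min_samples=3 are placeholders.
# g_max, g_min = int(gray_filtered.max()), int(gray_filtered.min())
# intervals = initial_gray_clusters(gray_filtered,
#                                   eps=0.05 * (g_max - g_min), min_samples=3)
```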
(3) Because the minimum-number threshold MinPts is set, the number of clusters A is not greater than the quantization gray level N, i.e. N ≥ A. Therefore the A initial gray intervals obtained by the division must be re-divided so that the number of gray intervals becomes N. The re-division method is as follows: the number of demarcation points required is B = N - A, and the optimal B demarcation points are searched within the A initial gray intervals, so that the gray values in the steel mesh gray image are divided into N optimal gray intervals.
Specifically, in order to improve the utilization rate of the gray levels and make the gray levels represent the gray information more uniformly, the embodiment of the invention constructs an objective function according to the pixel point distribution of each gray interval after re-division, and obtains the N optimal gray intervals from the B demarcation points corresponding to the objective function when it reaches the global optimum.
The construction method of the objective function comprises the following steps:
The B demarcation points are used to re-divide the initial gray intervals corresponding to the A clusters. Based on the N re-divided gray intervals, first calculate for each gray interval its frequency p_ij, i.e. the proportion of the number of pixel points contained in the j-th gray interval of the i-th cluster to the total number of pixel points in the steel mesh gray image. Then calculate the average frequency of each cluster from the frequencies of its gray intervals, i.e. the average frequency of the i-th cluster is p_i = (1/n_i) Σ_{j=1..n_i} p_ij, where n_i is the number of gray intervals into which the i-th cluster is divided and p_ij denotes the frequency of the j-th gray interval of the i-th cluster. The frequency difference between each gray interval and the average frequency of the cluster to which it belongs gives the sum of the frequency differences of each cluster, and the intra-cluster frequency uniformity is obtained by combining the sums of the frequency differences of the A clusters.
When the clustering algorithm is performed, the A clusters have already been separated according to the density distribution of the gray values, so the gray distribution inside each cluster is relatively concentrated. When a selected cluster is re-divided, for example when one cluster is split into two gray intervals, the numbers of pixel points in the two resulting gray intervals should be uniform, which is the intra-cluster frequency uniformity; at the same time, the number of pixel points in a divided gray interval should be as balanced as possible with that of the adjacent clusters (or of the gray intervals into which the adjacent clusters are divided), because the gray values of adjacent clusters are similar and the uniformity of the distribution of similar gray values can be improved during quantization. Therefore the first average value of the average frequencies of the A clusters is calculated from the average frequency of each cluster, i.e. P = (1/A) Σ_{i=1..A} p_i, and the inter-cluster frequency uniformity among the A clusters is calculated from the difference between the average frequency of each cluster and this first average value.
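A sketch of these two frequency terms is given below. The per-cluster average p_i and the first average P follow the definitions above; the use of absolute deviations to measure the "sum of frequency differences" is an assumption, since the patent does not spell out the exact penalty form.

```python
import numpy as np

def frequency_uniformity(freqs_per_cluster):
    """freqs_per_cluster: one list per cluster, holding the frequencies p_ij of
    the gray intervals currently assigned to that cluster.

    Returns (intra, inter): the intra-cluster term sums the deviations of each
    interval frequency from its cluster average p_i, and the inter-cluster term
    sums the deviations of the cluster averages from the first average P."""
    p_bar = [float(np.mean(p)) for p in freqs_per_cluster]   # average frequency per cluster
    intra = sum(np.sum(np.abs(np.asarray(p) - m))
                for p, m in zip(freqs_per_cluster, p_bar))
    P = float(np.mean(p_bar))                                # first average value
    inter = float(np.sum(np.abs(np.asarray(p_bar) - P)))
    return intra, inter
```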
Considering that the spatial positions of the pixel points in each re-divided gray interval should be dispersed, so that the texture characteristics are better reflected after the interval division, the position coordinates of all pixel points in each gray interval of a cluster are obtained. The average of the position coordinates of the pixel points in a gray interval is taken as the centre position coordinate of that interval; the Euclidean distance between the position coordinate of each pixel point in the interval and the centre position coordinate is calculated, the average of these distances is obtained, and this distance average is taken as the spatial dispersity s_ij of the corresponding gray interval. The average spatial dispersity of each cluster is then calculated from the spatial dispersities of its gray intervals, i.e. s_i = (1/n_i) Σ_{j=1..n_i} s_ij, where s_i denotes the average spatial dispersity of the i-th cluster and s_ij the spatial dispersity of the j-th gray interval of the i-th cluster. The sum of the spatial dispersity differences of each cluster is obtained from the difference between the spatial dispersity of each gray interval and the corresponding average spatial dispersity, and the intra-cluster spatial dispersity uniformity is calculated by combining the sums of the spatial dispersity differences of the A clusters.
Similarly, the inter-cluster spatial dispersity uniformity among the A clusters is considered, where the second average value S = (1/A) Σ_{i=1..A} s_i is the average of the average spatial dispersities of the A clusters, and the inter-cluster spatial dispersity uniformity is calculated from the difference between the average spatial dispersity of each cluster and this second average value.
Consider the gray span d of the newly divided gray intervals after re-division: the gray span is the length of a gray interval after the gray division (its value is the difference between the maximum gray level and the minimum gray level in the interval), and the spans should be as uniform as possible. The idea of uniform gray division is adopted here: the more uniform the division, the more uniform the color of the gray image, the gentler the color change and the smaller the contrast. Therefore the gray span of each gray interval is obtained and the average gray span of each cluster is calculated, i.e. d_i = (1/n_i) Σ_{j=1..n_i} d_ij, where d_i is the average gray span of the i-th cluster and d_ij is the gray span of the j-th gray interval of the i-th cluster. The sum of the gray span differences of each cluster is then obtained from the difference between the gray span of each gray interval and the average gray span of the cluster to which it belongs, and the intra-cluster gray span uniformity is calculated by combining the sums of the gray span differences of the A clusters.
Similarly, the inter-cluster gray span uniformity among the A clusters is considered, where the third average value D = (1/A) Σ_{i=1..A} d_i is the average of the average gray spans of the A clusters, and the inter-cluster gray span uniformity is calculated from the difference between the average gray span of each cluster and this third average value.
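The spatial dispersity and gray span terms can be sketched in the same spirit; as before, the absolute-deviation form of the "sum of differences" is an assumption, and only the quantities explicitly defined above (interval centre, mean Euclidean distance, span as maximum minus minimum gray level) are taken from the disclosure.

```python
import numpy as np

def spatial_dispersity(coords):
    """coords: (K, 2) array of the positions of the pixels in one gray interval.
    The interval centre is the mean coordinate; the dispersity is the mean
    Euclidean distance of the pixels to that centre."""
    centre = coords.mean(axis=0)
    return float(np.linalg.norm(coords - centre, axis=1).mean())

def gray_span(interval):
    """Gray span of an interval given as (min_gray, max_gray)."""
    return interval[1] - interval[0]

def cluster_uniformity(values_per_cluster):
    """Generic intra/inter-cluster uniformity, reused for the dispersity values
    s_ij and the spans d_ij, mirroring the frequency term."""
    means = [float(np.mean(v)) for v in values_per_cluster]
    intra = sum(np.sum(np.abs(np.asarray(v) - m))
                for v, m in zip(values_per_cluster, means))
    inter = float(np.sum(np.abs(np.asarray(means) - np.mean(means))))
    return intra, inter
```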
Further, the intra-cluster frequency uniformity, intra-cluster spatial dispersity uniformity and intra-cluster gray span uniformity of the A clusters are linearly combined into the intra-cluster uniformity evaluation value F1; the inter-cluster frequency uniformity, inter-cluster spatial dispersity uniformity and inter-cluster gray span uniformity are linearly combined into the inter-cluster uniformity evaluation value F2, where α, β and γ denote the corresponding adjustment parameters of the linear combinations.
Preferably, the embodiment of the invention empirically sets the adjustment parameters α, β and γ to 0.02, 0.001 and 0.1, respectively.
Based on the intra-cluster uniformity evaluation value F1 and the inter-cluster uniformity evaluation value F2, the objective function over the N re-divided gray intervals is constructed as

F = w1·F1 + w2·F2

where the argument of F is the set of B demarcation points selected within the A clusters, which divide the whole gray range [g_min, g_max] of the steel mesh gray image into the N gray intervals; w1 is the weight of the intra-cluster uniformity evaluation value and w2 is the weight of the inter-cluster uniformity evaluation value. In the embodiment of the invention the weights are set empirically, with w2 = 0.3.
Based on the objective function F, the particle swarm optimization algorithm (PSO) is adopted. Particles having the two properties of velocity and position are designed and randomly initialized; each particle is evaluated and it is judged whether the objective function F has reached the global optimum. If the condition is not met, the current velocity and position of each particle are updated, the fitness value of each particle is evaluated, the historical best position of each particle is updated, and the global best position of the particle swarm is updated; each particle is then evaluated again and it is judged whether the objective function F has reached the global optimum, and this is repeated until the global optimum is reached and the optimal solution, namely the optimal positions of the B demarcation points, is obtained. Combining the A clusters with the optimal positions of the B demarcation points, the whole gray interval of the steel mesh gray image is adaptively divided into N optimal gray intervals.
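A compact PSO sketch for the demarcation-point search follows. The swarm size, iteration count, inertia and acceleration coefficients are placeholders, and maximisation of F is assumed (the disclosure only states that the global optimum of the objective function is sought).

```python
import numpy as np

def pso_demarcation_points(objective, B, g_min, g_max,
                           n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5):
    """Search for the B demarcation points in [g_min, g_max] that maximise the
    objective F; `objective` maps a sorted vector of B points to a score."""
    pos = np.random.uniform(g_min, g_max, size=(n_particles, B))   # positions
    vel = np.zeros_like(pos)                                        # velocities
    pbest = pos.copy()
    pbest_val = np.array([objective(np.sort(p)) for p in pos])
    gbest = pbest[pbest_val.argmax()].copy()

    for _ in range(n_iter):
        r1, r2 = np.random.rand(*pos.shape), np.random.rand(*pos.shape)
        # Update velocity toward the personal and global best positions.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, g_min, g_max)
        vals = np.array([objective(np.sort(p)) for p in pos])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmax()].copy()

    return np.sort(gbest)   # optimal positions of the B demarcation points
```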
Step S003, carrying out gray level quantization on gray values based on N optimal gray intervals of the steel mesh gray image to obtain a quantized gray image; acquiring a gray level co-occurrence matrix of the quantized gray level image to obtain texture entropy of each pixel point, and forming an entropy matrix corresponding to the surface image; the surface image and the corresponding entropy matrix are input into a neural network to confirm the type of steel mesh defect.
Specifically, the N optimal gray intervals adaptively divided in the steel mesh gray image are obtained according to step S002, and the gray values falling in each optimal gray interval are quantized to the corresponding level, so that a quantized gray image is obtained. After the adaptive gray quantization, each point (x, y) on the quantized gray image and a point (x+Δx, y+Δy) offset from it are selected, and the gray values g1 and g2 at these two points form the point pair (g1, g2). For N gray levels there are N×N possible combinations of point pairs. For each pixel on the quantized gray image, the occurrences of every point pair within its local neighborhood are counted, forming an N×N gray co-occurrence matrix. Four characteristic directions 0°, 45°, 90° and 135° are considered: when the two pixels are aligned along the 0° direction, Δx = 1 and Δy = 0 are taken; at 45°, Δx = 1 and Δy = 1; at 90°, Δx = 0 and Δy = 1; at 135°, Δx = -1 and Δy = 1. The step size is taken as 1, and the average of the gray co-occurrence matrices obtained in the four characteristic directions is taken as the final gray co-occurrence matrix of the corresponding pixel point.
Finally, the entropy ENT is selected as the feature metric of the gray co-occurrence matrix. ENT reflects the degree of non-uniformity and complexity of the texture: the more dispersed the elements of the co-occurrence matrix, the greater the entropy value. Among the surface defects of the steel mesh, cracks have the highest non-uniformity and the largest entropy value; pressed-in iron oxide and pits have moderate non-uniformity and moderate entropy values; inclusions are distributed relatively simply and have lower entropy values. Therefore the entropy of the gray co-occurrence matrix can distinguish these four defect characteristics well. The texture entropy is calculated from the final gray co-occurrence matrix of each pixel point, forming the entropy matrix corresponding to the surface image (steel mesh gray image). The texture entropy is calculated as

ENT = -Σ_i Σ_j P(i, j)·ln P(i, j)

where P(i, j) is the normalized element of the gray co-occurrence matrix at the gray-level pair (i, j).
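A direct (unoptimised) sketch of the per-pixel co-occurrence matrix and texture entropy is given below; the 7×7 window, the natural logarithm and the specific direction offsets follow common GLCM practice and are assumptions where the disclosure leaves them unspecified.

```python
import numpy as np

def glcm_entropy(quantized, N, window=7):
    """quantized: integer image with values in [0, N).
    Returns the entropy matrix: for each pixel, the four directional gray
    co-occurrence matrices (0°, 45°, 90°, 135°, step 1) are accumulated over a
    local window, averaged, normalised and reduced to ENT = -sum p*ln(p)."""
    offsets = [(0, 1), (-1, 1), (-1, 0), (-1, -1)]   # standard GLCM directions
    h, w = quantized.shape
    half = window // 2
    entropy = np.zeros((h, w))
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = quantized[y - half:y + half + 1, x - half:x + half + 1]
            glcm = np.zeros((N, N))
            for dy, dx in offsets:
                a = patch[max(0, -dy):window - max(0, dy),
                          max(0, -dx):window - max(0, dx)]
                b = patch[max(0, dy):window - max(0, -dy),
                          max(0, dx):window - max(0, -dx)]
                np.add.at(glcm, (a.ravel(), b.ravel()), 1.0)
            p = glcm / glcm.sum()                     # averaged and normalised
            nz = p[p > 0]
            entropy[y, x] = float(-(nz * np.log(nz)).sum())
    return entropy
```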
Further, the embodiment of the invention uses a neural network to accurately classify the surface defects of the steel mesh. The adopted convolutional neural network (CNN) is a feedforward neural network involving convolution calculations, comprising convolutional layers, pooling layers, fully connected layers and the like; its input is picture information with the three dimensions of width, height and depth. The training process of the convolutional neural network is as follows:
(1) The input of the convolutional neural network is a surface image of the steel network and a corresponding entropy matrix; the probability of the defect type is output.
(2) The embodiment of the invention considers the steel mesh defect types of cracks, inclusions, pressed-in iron oxide and pits, so a normal surface image is labeled 0, cracks are labeled 1, inclusions are labeled 2, pressed-in iron oxide is labeled 3 and pits are labeled 4; further labels are then added for cases in which multiple defects appear simultaneously, for example an image in which cracks and inclusions appear together is labeled 5.
(3) And estimating the classification accuracy of the convolutional neural network on the training set by adopting a cross entropy loss function, and measuring the difference between the model output and the real output.
(4) And normalizing the output value by adopting a Softmax classifier to obtain the probability of the defect type.
Further, inputting the surface image of the steel mesh to be detected and the corresponding entropy matrix into a trained convolutional neural network to detect the defect type of the image.
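For completeness, a minimal classification sketch is shown below; the two-channel input layout (gray image stacked with its entropy matrix), the layer sizes and the five-class label set are assumptions, not the exact network of the embodiment, which also adds labels for co-occurring defects.

```python
import torch
import torch.nn as nn

class DefectClassifier(nn.Module):
    """Small CNN over a 2-channel input: channel 0 is the (gray) surface image,
    channel 1 is the corresponding entropy matrix."""
    def __init__(self, num_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = DefectClassifier()
criterion = nn.CrossEntropyLoss()                 # training loss of step (3)
x = torch.randn(4, 2, 128, 128)                   # batch of (image, entropy matrix)
labels = torch.randint(0, 5, (4,))                # 0 normal, 1 crack, 2 inclusion, ...
loss = criterion(model(x), labels)
probs = torch.softmax(model(x), dim=1)            # Softmax probabilities of step (4)
```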
In summary, the embodiment of the invention provides a steel mesh surface defect judging method based on machine vision, which acquires a surface image of a steel mesh to obtain a corresponding steel mesh gray image, adaptively divides the gray values in the steel mesh gray image into N optimal gray intervals according to the gray values and distribution positions of the pixel points, performs gray level quantization based on the optimal gray intervals to obtain a quantized gray image, acquires the gray co-occurrence matrix of the quantized gray image to obtain the texture entropy of each pixel point, forms the entropy matrix corresponding to the surface image, and inputs the surface image and the corresponding entropy matrix into a neural network to confirm the type of steel mesh defect. By adaptively dividing the gray values in the image into gray intervals and performing gray quantization based on the divided intervals, the gray quantization process becomes more flexible, so that the calculation amount is simplified and reduced while the loss of texture information is reduced, and the accuracy of defect detection is improved.
It should be noted that: the sequence of the embodiments of the present invention is only for description, and does not represent the advantages and disadvantages of the embodiments. And the foregoing description has been directed to specific embodiments of this specification. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments.
The foregoing description of the preferred embodiments of the invention is not intended to limit the invention to the precise form disclosed, and any such modifications, equivalents, and alternatives falling within the spirit and scope of the invention are intended to be included within the scope of the invention.
Claims (7)
1. The steel mesh surface defect judging method based on machine vision is characterized by comprising the following steps of:
acquiring a surface image of the steel mesh to obtain a corresponding steel mesh gray image;
obtaining a gray histogram of the steel mesh gray image to obtain a maximum gray value and a minimum gray value, setting a clustering radius of a DBSCAN clustering algorithm according to the maximum gray value and the minimum gray value, and dividing the gray values in the steel mesh gray image into A clusters based on the clustering radius, wherein A is a positive integer greater than 0; dividing the gray values in the steel mesh gray image into N gray intervals based on the clusters, wherein N is a positive integer greater than 0 and N ≥ A; constructing an objective function according to the number of gray intervals and the number of pixel points corresponding to the gray intervals contained in each cluster after division, and acquiring the N optimal gray intervals corresponding to the objective function when the objective function reaches the global optimum;
carrying out gray level quantization on gray values based on N optimal gray intervals of the steel mesh gray image to obtain a quantized gray image; acquiring a gray level co-occurrence matrix of the quantized gray level image to obtain texture entropy of each pixel point, and forming an entropy matrix corresponding to the surface image; inputting the surface image and the corresponding entropy matrix into a neural network to confirm the type of steel mesh defects.
2. The method for determining the surface defect of the steel mesh based on the machine vision according to claim 1, wherein the method for constructing the objective function according to the number of gray intervals and the number of pixels corresponding to the gray intervals contained in each cluster after the division comprises the following steps:
B demarcation points required for the re-division are obtained according to the number of gray intervals and the number of clusters, and the B demarcation points are used to re-divide the initial gray intervals corresponding to the A clusters;
calculating intra-cluster frequency uniformity and inter-cluster frequency uniformity of the A clusters according to the frequency of each gray interval based on the N re-divided gray intervals, wherein the frequency refers to the proportion of the number of pixel points contained in a gray interval to the total number of pixel points in the steel mesh gray image; calculating the intra-cluster spatial dispersity uniformity and inter-cluster spatial dispersity uniformity of the A clusters according to the position coordinates of the pixel points; calculating intra-cluster gray span uniformity and inter-cluster gray span uniformity of the A clusters according to the gray span of each gray interval, wherein the gray span refers to the difference between the maximum gray level and the minimum gray level corresponding to the gray interval;
linearly combining the intra-cluster frequency uniformity, the intra-cluster spatial dispersity uniformity and the intra-cluster gray scale span uniformity into an intra-cluster uniformity evaluation value; and linearly combining the inter-cluster frequency uniformity, the inter-cluster space dispersity uniformity and the inter-cluster gray scale span uniformity into inter-cluster uniformity evaluation values, and constructing the objective function of N gray scale intervals by combining the intra-cluster uniformity evaluation values and the inter-cluster uniformity evaluation values.
3. The method for determining the surface defect of the steel mesh based on the machine vision according to claim 2, wherein the method for calculating the intra-cluster frequency uniformity and the inter-cluster frequency uniformity of the a clusters according to the frequency of each gray scale interval comprises the following steps:
calculating the average frequency of each cluster according to the frequency of each gray interval, obtaining the sum of the frequency differences of each cluster according to the frequency difference between the frequency of each gray interval and the average frequency of the cluster to which the frequency difference belongs, and obtaining the intra-cluster frequency uniformity degree by combining the sum of the frequency differences of A clusters;
calculating a first average value of the average frequency among A clusters according to the average frequency of each cluster, and calculating the inter-cluster frequency uniformity according to the difference between the average frequency of each cluster and the first average value.
4. The method for determining the surface defect of the steel mesh based on machine vision according to claim 2, wherein the method for calculating the intra-cluster spatial dispersity uniformity and the inter-cluster spatial dispersity uniformity of the a clusters according to the position coordinates of the pixel points comprises the following steps:
taking the average value of the position coordinates of all the pixel points in each gray scale interval as the center position coordinate of the corresponding gray scale interval, calculating the Euclidean distance between the position coordinate of each pixel point in the gray scale interval and the center position coordinate, obtaining the distance average value of each gray scale interval, and taking the distance average value as the space dispersity of the corresponding gray scale interval;
calculating the average spatial dispersity of each cluster according to the spatial dispersity of each gray interval in the cluster, obtaining the sum of the spatial dispersity differences of each cluster according to the difference between the spatial dispersity of each gray interval and the corresponding average spatial dispersity, and calculating the intra-cluster spatial dispersity uniformity according to the sums of the spatial dispersity differences of the A clusters;
calculating a second average value of the average spatial dispersion degree among A clusters according to the average spatial dispersion degree of each cluster, and calculating the inter-cluster spatial dispersion degree uniformity according to the difference between the average spatial dispersion degree of each cluster and the second average value.
5. The method for determining the surface defect of the steel mesh based on the machine vision according to claim 2, wherein the method for calculating the intra-cluster gray span uniformity and the inter-cluster gray span uniformity of the a clusters according to the gray span of each gray interval comprises the following steps:
calculating the average gray span of each cluster according to the gray span of each gray interval, obtaining the sum of the gray span differences of each cluster according to the difference between the gray span of each gray interval and the average gray span of the cluster to which it belongs, and calculating the intra-cluster gray span uniformity by combining the sums of the gray span differences of the A clusters;
and calculating a third average value of the average gray scale span among A clusters according to the average gray scale span of each cluster, and calculating the inter-cluster gray scale span uniformity according to the difference between the average gray scale span of each cluster and the third average value.
6. The method for judging the surface defects of the steel mesh based on machine vision according to claim 1, wherein a particle swarm optimization algorithm is adopted to obtain a globally optimal solution of the objective function.
7. The method for determining the surface defects of the steel mesh based on machine vision according to claim 1, wherein the method for setting the cluster radius of the DBSCAN clustering algorithm according to the maximum gray value and the minimum gray value comprises the following steps:
and calculating a gray level difference value between the maximum gray level value and the minimum gray level value, setting the clustering radius according to the gray level difference value, and enabling the gray level difference value and the clustering radius to form a positive correlation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310045703.5A CN116091455A (en) | 2023-01-30 | 2023-01-30 | Steel mesh surface defect judging method based on machine vision |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310045703.5A CN116091455A (en) | 2023-01-30 | 2023-01-30 | Steel mesh surface defect judging method based on machine vision |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116091455A true CN116091455A (en) | 2023-05-09 |
Family
ID=86198750
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310045703.5A Withdrawn CN116091455A (en) | 2023-01-30 | 2023-01-30 | Steel mesh surface defect judging method based on machine vision |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116091455A (en) |
-
2023
- 2023-01-30 CN CN202310045703.5A patent/CN116091455A/en not_active Withdrawn
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116433657B (en) * | 2023-06-08 | 2023-08-25 | 金乡县明耀玻璃有限公司 | Toughened glass scratch area image enhancement method based on computer vision |
CN116433657A (en) * | 2023-06-08 | 2023-07-14 | 金乡县明耀玻璃有限公司 | Toughened glass scratch area image enhancement method based on computer vision |
CN116523913B (en) * | 2023-07-03 | 2023-11-03 | 惠州市金思维科技有限公司 | Intelligent detection method for quality of screw rod |
CN116523913A (en) * | 2023-07-03 | 2023-08-01 | 惠州市金思维科技有限公司 | Intelligent detection method for quality of screw rod |
CN116596936A (en) * | 2023-07-18 | 2023-08-15 | 深圳市魔方卫星科技有限公司 | Solar wing sailboard quality detection method based on image data |
CN116596936B (en) * | 2023-07-18 | 2023-09-12 | 深圳市魔方卫星科技有限公司 | Solar wing sailboard quality detection method based on image data |
CN116934750A (en) * | 2023-09-15 | 2023-10-24 | 山东庆葆堂生物科技有限公司 | Vinegar egg liquid production quality assessment method |
CN116934750B (en) * | 2023-09-15 | 2023-12-01 | 山东庆葆堂生物科技有限公司 | Vinegar egg liquid production quality assessment method |
CN116993718A (en) * | 2023-09-25 | 2023-11-03 | 深圳市东陆科技有限公司 | TFT array substrate defect detection method based on machine vision |
CN116993718B (en) * | 2023-09-25 | 2023-12-22 | 深圳市东陆科技有限公司 | TFT array substrate defect detection method based on machine vision |
CN117173158A (en) * | 2023-10-25 | 2023-12-05 | 深圳市德海威实业有限公司 | Intelligent detection method and system for quality of precise connector |
CN117173158B (en) * | 2023-10-25 | 2024-01-30 | 深圳市德海威实业有限公司 | Intelligent detection method and system for quality of precise connector |
CN118365644A (en) * | 2024-06-19 | 2024-07-19 | 西安联瑞科技实业有限责任公司 | Steel plate sand blasting uneven detection method based on image processing |
CN118365644B (en) * | 2024-06-19 | 2024-09-10 | 西安联瑞科技实业有限责任公司 | Steel plate sand blasting uneven detection method based on image processing |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN116091455A (en) | Steel mesh surface defect judging method based on machine vision | |
CN115082467B (en) | Building material welding surface defect detection method based on computer vision | |
CN115829883B (en) | Surface image denoising method for special-shaped metal structural member | |
CN110211126B (en) | Image segmentation method based on intuitive fuzzy C-means clustering | |
CN109583474B (en) | Training sample generation method for industrial big data processing | |
CN110443778B (en) | Method for detecting irregular defects of industrial products | |
CN116664559B (en) | Machine vision-based memory bank damage rapid detection method | |
CN114092389A (en) | Glass panel surface defect detection method based on small sample learning | |
CN116030052B (en) | Etching quality detection method for lamination process of computer display panel | |
CN117197140B (en) | Irregular metal buckle forming detection method based on machine vision | |
CN110276764A (en) | K-Means underwater picture background segment innovatory algorithm based on the estimation of K value | |
CN118115498B (en) | Method and system for rapidly detecting glossiness of titanium rod | |
CN118096796B (en) | Visual inspection method for appearance of radial forging titanium rod based on machine learning | |
CN118096579B (en) | 3D printing lattice structure defect detection method | |
CN117522864B (en) | European pine plate surface flaw detection method based on machine vision | |
CN117635507B (en) | Plastic particle online visual detection method and system | |
CN117314940B (en) | Laser cutting part contour rapid segmentation method based on artificial intelligence | |
CN112183469B (en) | Method for identifying congestion degree of public transportation and self-adaptive adjustment | |
CN110766662B (en) | Forging surface crack detection method based on multi-scale and multi-layer feature learning | |
CN112784922A (en) | Extraction and classification method of intelligent cloud medical images | |
CN113160214B (en) | Novel method for measuring similarity of local neighborhood pixels of image | |
CN113160166B (en) | Medical image data mining working method through convolutional neural network model | |
CN106981201A (en) | vehicle identification method under complex environment | |
CN117409275B (en) | Multi-angle radar image processing method | |
CN116740056B (en) | Defect detection method for coating layer of whole-core high-pattern conveyer belt |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WW01 | Invention patent application withdrawn after publication |
Application publication date: 20230509 |