CN116091455A - Steel mesh surface defect judging method based on machine vision - Google Patents


Info

Publication number
CN116091455A
CN116091455A (application CN202310045703.5A)
Authority
CN
China
Prior art keywords
gray
cluster
uniformity
steel mesh
frequency
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202310045703.5A
Other languages
Chinese (zh)
Inventor
谢海居
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nantong Haiju Steel Structure Co ltd
Original Assignee
Nantong Haiju Steel Structure Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nantong Haiju Steel Structure Co ltd filed Critical Nantong Haiju Steel Structure Co ltd
Priority to CN202310045703.5A
Publication of CN116091455A
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/40 Analysis of texture
    • G06T7/41 Analysis of texture based on statistical description of texture
    • G06T7/44 Analysis of texture based on statistical description of texture using image operators, e.g. filters, edge density metrics or local histograms
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/762 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • G06V10/763 Non-hierarchical techniques, e.g. based on statistics of modelling distributions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30136 Metal
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Probability & Statistics with Applications (AREA)
  • Health & Medical Sciences (AREA)
  • Quality & Reliability (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of artificial intelligence, in particular to a steel mesh surface defect judging method based on machine vision. A surface image of the steel mesh is acquired and converted into a corresponding steel mesh gray image; the gray values in the gray image are adaptively divided into N optimal gray intervals according to the gray values and distribution positions of the pixel points; gray quantization is performed on the basis of the optimal gray intervals to obtain a quantized gray image; the gray co-occurrence matrix of the quantized gray image yields the texture entropy of each pixel point, and the entropies form an entropy matrix corresponding to the surface image; finally, the surface image and the corresponding entropy matrix are input into a neural network to identify the type of steel mesh defect. By adaptively dividing the gray values of the image into intervals and quantizing on the basis of the divided intervals, the gray quantization process becomes more flexible, so that the amount of calculation is reduced while little texture information is lost, and the accuracy of defect detection is improved.

Description

Steel mesh surface defect judging method based on machine vision
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a steel mesh surface defect judging method based on machine vision.
Background
In industrial production, the production environment and process inevitably introduce more than ten kinds of defects into a steel mesh, such as cracks, inclusions, pressed-in iron oxide scale, pits, scratches, scars, holes, rust spots and bubbles, which greatly affect the appearance and even the performance of the product. Current methods for detecting steel mesh surface defects typically use a standard image as a template and detect defects by comparative analysis; this places high demands on the standard image, and detection is inaccurate otherwise. Threshold segmentation is also used to detect defects in the steel mesh image, with the threshold usually chosen from experience; but because the characteristics of the steel mesh images collected each time differ, this approach is prone to missed and false detections, and it is difficult to guarantee that the detection meets the requirements of industrial production.
Disclosure of Invention
In order to solve the technical problems, the invention aims to provide a steel mesh surface defect judging method based on machine vision, which adopts the following technical scheme:
acquiring a surface image of the steel mesh to obtain a corresponding steel mesh gray image;
obtaining a gray histogram of the steel mesh gray image to obtain a maximum gray value and a minimum gray value, setting the clustering radius of a DBSCAN clustering algorithm according to the maximum and minimum gray values, and dividing the gray values in the steel mesh gray image into A clusters based on the clustering radius, wherein A is a positive integer greater than 0; dividing the gray values in the steel mesh gray image into N gray intervals based on the clusters, wherein N is a positive integer greater than 0 and N ≥ A; constructing an objective function according to the number of gray intervals and the number of pixel points corresponding to the gray intervals contained in each divided cluster, and obtaining the N optimal gray intervals corresponding to the objective function when it reaches the global optimum;
carrying out gray level quantization on gray values based on N optimal gray intervals of the steel mesh gray image to obtain a quantized gray image; acquiring a gray level co-occurrence matrix of the quantized gray level image to obtain texture entropy of each pixel point, and forming an entropy matrix corresponding to the surface image; inputting the surface image and the corresponding entropy matrix into a neural network to confirm the type of steel mesh defects.
Further, the method for constructing the objective function according to the number of gray intervals and the number of pixels corresponding to the gray intervals contained in each divided cluster includes:
B demarcation points required for the re-division are obtained from the number of gray intervals and the number of clusters, and the initial gray intervals corresponding to the A clusters are re-divided into gray intervals using the B demarcation points;
calculating the intra-cluster frequency uniformity and inter-cluster frequency uniformity of the A clusters from the frequency of each gray interval, based on the N re-divided gray intervals, where the frequency of a gray interval is the fraction of all pixel points in the steel mesh gray image that fall in that interval; calculating the intra-cluster spatial dispersity uniformity and inter-cluster spatial dispersity uniformity of the A clusters from the position coordinates of the pixel points; calculating the intra-cluster gray span uniformity and inter-cluster gray span uniformity of the A clusters from the gray span of each gray interval, where the gray span is the difference between the maximum and minimum gray levels of the interval;
linearly combining the intra-cluster frequency uniformity, the intra-cluster spatial dispersity uniformity and the intra-cluster gray scale span uniformity into an intra-cluster uniformity evaluation value; and linearly combining the inter-cluster frequency uniformity, the inter-cluster space dispersity uniformity and the inter-cluster gray scale span uniformity into inter-cluster uniformity evaluation values, and constructing the objective function of N gray scale intervals by combining the intra-cluster uniformity evaluation values and the inter-cluster uniformity evaluation values.
Further, the method for calculating intra-cluster frequency uniformity and inter-cluster frequency uniformity of the a clusters according to the frequency of each gray scale interval includes:
calculating the average frequency of each cluster from the frequencies of its gray intervals, obtaining the sum of frequency differences of each cluster from the difference between the frequency of each gray interval and the average frequency of the cluster to which the interval belongs, and obtaining the intra-cluster frequency uniformity by combining the sums of frequency differences of the A clusters;
calculating a first average value of the average frequency among A clusters according to the average frequency of each cluster, and calculating the inter-cluster frequency uniformity according to the difference between the average frequency of each cluster and the first average value.
Further, the method for calculating the intra-cluster spatial dispersity uniformity and the inter-cluster spatial dispersity uniformity of the A clusters according to the position coordinates of the pixel points comprises the following steps:
taking the average value of the position coordinates of all the pixel points in each gray scale interval as the center position coordinate of the corresponding gray scale interval, calculating the Euclidean distance between the position coordinate of each pixel point in the gray scale interval and the center position coordinate, obtaining the distance average value of each gray scale interval, and taking the distance average value as the space dispersity of the corresponding gray scale interval;
calculating the average spatial dispersity of each cluster from the spatial dispersities of its gray intervals, obtaining the sum of spatial dispersity differences of each cluster from the spatial dispersity of each gray interval and the corresponding average spatial dispersity, and calculating the intra-cluster spatial dispersity uniformity from the sums of spatial dispersity differences of the A clusters;
calculating a second average value of the average spatial dispersion degree among A clusters according to the average spatial dispersion degree of each cluster, and calculating the inter-cluster spatial dispersion degree uniformity according to the difference between the average spatial dispersion degree of each cluster and the second average value.
Further, the method for calculating the intra-cluster gray span uniformity and the inter-cluster gray span uniformity of the A clusters according to the gray span of each gray interval comprises the following steps:
calculating the average gray span of each cluster from the gray spans of its gray intervals, obtaining the sum of gray span differences of each cluster from the difference between the gray span of each gray interval and the average gray span of the cluster to which the interval belongs, and calculating the intra-cluster gray span uniformity by combining the sums of gray span differences of the A clusters;
and calculating a third average value of the average gray scale span among A clusters according to the average gray scale span of each cluster, and calculating the inter-cluster gray scale span uniformity according to the difference between the average gray scale span of each cluster and the third average value.
Further, a particle swarm optimization algorithm is adopted to obtain a global optimal solution of the objective function.
Further, the method for setting the clustering radius of the DBSCAN clustering algorithm according to the maximum gray value and the minimum gray value comprises the following steps:
and calculating the gray difference value between the maximum gray value and the minimum gray value, and setting the clustering radius according to the gray difference value, so that the clustering radius is positively correlated with the gray difference value.
The embodiment of the invention has at least the following beneficial effects: by adaptively dividing the gray values of the image into intervals and quantizing on the basis of the divided intervals, the gray quantization process becomes more flexible, so that the amount of calculation is reduced while little texture information is lost, and the accuracy of defect detection is improved.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions and advantages of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are only some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of steps of a method for determining a defect on a steel mesh surface based on machine vision according to an embodiment of the present invention.
Detailed Description
In order to further explain the technical means adopted by the invention and their effects, the specific implementation, structure, features and effects of the machine-vision-based steel mesh surface defect judging method according to the invention are described in detail below with reference to the accompanying drawings and the preferred embodiments. In the following description, different instances of "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following specifically describes a specific scheme of the steel mesh surface defect judging method based on machine vision.
Referring to fig. 1, a flowchart of a method for determining a defect on a steel mesh surface based on machine vision according to an embodiment of the invention is shown, the method includes the following steps:
and S001, acquiring a surface image of the steel mesh to obtain a corresponding steel mesh gray image.
Specifically, an industrial camera is used to capture images of the steel mesh surface, yielding the surface image; the surface image is then converted into the corresponding steel mesh gray image, and, to suppress noise in the steel mesh gray image, the gray image is pre-processed with an image filter.
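As a concrete illustration of this step, the following is a minimal NumPy sketch of the grayscale conversion and a simple smoothing filter. The BT.601 luminance weights and the 3×3 mean filter are illustrative assumptions, since the patent does not specify which conversion or filter is used.

```python
import numpy as np

def to_gray(rgb):
    """Convert an H x W x 3 RGB image to grayscale (ITU-R BT.601 weights)."""
    return rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114

def mean_filter3(img):
    """3x3 mean filter (edge-replicated borders) to suppress sensor noise."""
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    h, w = img.shape
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
    return out / 9.0
```

In practice a median filter is often preferred on metal surfaces because it preserves edges of defects better; the mean filter is shown only for brevity.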
Step S002, obtaining a gray histogram of the steel mesh gray image and from it a maximum gray value and a minimum gray value, setting a clustering radius of a DBSCAN clustering algorithm according to the maximum gray value and the minimum gray value, and dividing the gray values in the steel mesh gray image into A clusters based on the clustering radius, wherein A is a positive integer greater than 0; dividing the gray values in the steel mesh gray image into N gray intervals based on the clusters, wherein N is a positive integer greater than 0 and N ≥ A; constructing an objective function according to the number of gray intervals and the number of pixel points corresponding to the gray intervals contained in each divided cluster, and obtaining the N optimal gray intervals corresponding to the objective function when it reaches the global optimum.
Specifically, different steel mesh defects have different characteristics, and the gray co-occurrence matrix describes the spatial distribution relation among the pixels in an image. The dimension of the gray co-occurrence matrix is determined by the number of gray levels: for an image with $L$ gray levels, the gray co-occurrence matrix has size $L \times L$. To simplify the computation while affecting the image quality as little as possible, the number of gray levels can be reduced; at the same time, an adaptive gray-interval division algorithm is adopted so that the samples of each gray interval are reasonably distributed and the loss of texture detail caused by the division is reduced. The embodiment of the invention therefore quantizes the gray levels in the steel mesh gray image to N levels, where N is a positive integer greater than 0.
The usual gray quantization divides the gray range uniformly: with eight levels, a gray value in 0–31 is quantized to 0, a value in 32–63 to 1, and so on, up to 224–255, which is quantized to 7. However, the gray values of an actual image are not uniformly distributed over the levels 0–255. If most pixel values of the image fall in one interval, those pixels are all mapped to the same quantized gray value, and only a few pixels correspond to the remaining seven values; the gray co-occurrence matrix generated after such quantization cannot represent the texture of the image well, because most of the texture information collapses into a single gray level and is hidden. The quantization intervals therefore have to be set optimally according to the pixel distribution of the image, so that its texture features are represented better. Specifically:
(1) Obtain the gray histogram of the steel mesh gray image, and from it the maximum gray value $g_{\max}$ and the minimum gray value $g_{\min}$.
(2) Cluster the gray values in the gray histogram with the DBSCAN clustering algorithm. DBSCAN requires two parameters: a neighborhood radius $Eps$ and a minimum-point threshold $MinPts$. Assuming the steel mesh gray image contains R distinct gray values, the neighborhood radius $Eps$ and the threshold $MinPts$ are set from empirical values (with $Eps$ positively correlated with the gray difference $g_{\max}-g_{\min}$), and the configured DBSCAN algorithm then clusters the gray values in the histogram into A clusters, where A is a positive integer greater than 0; that is, the gray values in the steel mesh gray image are divided into A initial gray intervals.
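Because the clustering operates on one-dimensional gray values, it can be sketched compactly. The following is a simplified, DBSCAN-style grouping assumed for illustration: sorted gray values whose gaps stay within the radius join one cluster, and clusters whose pixel count falls below the threshold are discarded as noise. The parameter values are placeholders, since the patent gives only empirical settings.

```python
import numpy as np

def cluster_gray_values(gray_img, eps, min_pixels):
    """Group the gray values present in the image into clusters: consecutive
    sorted gray values closer than `eps` join the same cluster; clusters whose
    total pixel count is below `min_pixels` are treated as noise.
    Returns a list of (low, high) initial gray intervals."""
    values, counts = np.unique(gray_img, return_counts=True)
    groups, current = [], [0]
    for k in range(1, len(values)):
        if values[k] - values[k - 1] <= eps:
            current.append(k)
        else:
            groups.append(current)
            current = [k]
    groups.append(current)
    intervals = []
    for idx in groups:
        if counts[idx].sum() >= min_pixels:  # MinPts-style density check
            intervals.append((int(values[idx[0]]), int(values[idx[-1]])))
    return intervals
```

Each returned interval corresponds to one initial gray interval, i.e. one of the A clusters described in the text.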
(3) Because of the threshold $MinPts$, the number of clusters A is not greater than the quantization level N, i.e. $N \ge A$. The A initial gray intervals therefore have to be re-divided so that N gray intervals result. The re-division works as follows: the number of demarcation points needed is $B = N - A$, and the optimal B demarcation points are searched for within the A initial gray intervals, so that the gray values in the steel mesh gray image are divided into N optimal gray intervals.
Specifically, in order to improve the utilization rate of gray levels and increase the uniformity of gray level representation gray level information, according to the embodiment of the invention, an objective function is constructed according to the pixel point distribution of each gray level interval after repartitioning, and N optimal gray level intervals are obtained by obtaining B demarcation points corresponding to when the objective function reaches global optimal.
The construction method of the objective function comprises the following steps:
the method comprises the steps of carrying out gray scale interval re-division on initial gray scale intervals corresponding to A clustering clusters by using B demarcation points, and firstly calculating the frequency of the number of pixel points contained in each gray scale interval accounting for the number of all pixel points in a steel mesh gray scale image based on N gray scale intervals subjected to re-division
Figure SMS_20
Then the average frequency of each cluster is calculated from the frequency of each gray interval, i.e./th>
Figure SMS_21
The average frequency of the individual clusters is +.>
Figure SMS_23
Wherein->
Figure SMS_19
Is->
Figure SMS_22
The number of gray intervals divided by the clusters, < >>
Figure SMS_24
Indicate->
Figure SMS_25
The +.>
Figure SMS_18
The frequency of the individual gray intervals; calculating the level of each gray scale interval and the cluster to which the gray scale interval belongsAnd the frequency difference between the average frequencies is used for obtaining the sum of the frequency differences of each cluster, and the frequency uniformity in the clusters is calculated by combining the sum of the frequency differences of the A clusters.
As one example, intra-cluster frequency uniformity
Figure SMS_26
The calculation formula of (2) is as follows: />
Figure SMS_27
When the clustering algorithm is carried out, A clusters are already separated according to the density distribution condition of gray scales, the gray scale distribution in the clusters is relatively concentrated, and when the selected clusters are re-divided, for example, one cluster is selected to be divided into two gray scale intervals, so that the number of pixels in the two divided gray scale intervals is uniform, namely, the frequency uniformity in the clusters is uniform
Figure SMS_28
The number of the pixel points of the divided gray scale interval and the adjacent clusters (or the gray scale interval divided by the adjacent clusters) is balanced as much as possible, the gray scale values of the adjacent clusters are similar, and the uniformity of the distribution of the similar gray scale values can be improved in the quantization process, so that the first average value of the average frequency among A clusters is calculated according to the average frequency of each cluster, namely->
Figure SMS_29
Calculating inter-cluster frequency uniformity +.A between A clusters based on the difference between the average frequency of each cluster and the first average>
Figure SMS_30
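A small sketch of the frequency-uniformity computation. The exact formulas are not recoverable from the text, so sums of absolute deviations from the cluster means are assumed; the function name and the input layout (per cluster, the pixel count of each of its gray intervals) are ours.

```python
import numpy as np

def frequency_uniformity(cluster_interval_counts, total_pixels):
    """cluster_interval_counts: list over clusters; each entry lists the pixel
    count of every gray interval in that cluster. Returns (U_intra, U_inter)
    as sums of absolute deviations (smaller = more uniform)."""
    freqs = [np.asarray(c, float) / total_pixels for c in cluster_interval_counts]
    means = np.array([f.mean() for f in freqs])       # average frequency per cluster
    u_intra = sum(np.abs(f - m).sum() for f, m in zip(freqs, means))
    u_inter = np.abs(means - means.mean()).sum()      # deviation from the first average
    return u_intra, u_inter
```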
After the re-division, the spatial positions of the pixel points within each gray interval should be dispersed, so that the divided intervals reflect the texture characteristics well. Therefore the position coordinates of all pixel points in each gray interval of a cluster are obtained, the mean of these position coordinates is taken as the center position coordinate of the interval, the Euclidean distance between each pixel point's position and the center position is calculated, and the mean distance of each interval is taken as the spatial dispersity $s_{i,j}$ of the $j$-th gray interval of the $i$-th cluster. From the spatial dispersities of the gray intervals in a cluster, the average spatial dispersity of the cluster is computed as

$$\bar{s}_i = \frac{1}{n_i}\sum_{j=1}^{n_i} s_{i,j},$$

where $\bar{s}_i$ is the average spatial dispersity of the $i$-th cluster and $s_{i,j}$ the spatial dispersity of its $j$-th gray interval. The sum of the spatial dispersity differences of each cluster is obtained from the difference between each interval's dispersity and the cluster's average dispersity, and the sums of the A clusters are combined into the intra-cluster spatial dispersity uniformity.

As an example, the intra-cluster spatial dispersity uniformity can be calculated as

$$U_3 = \sum_{i=1}^{A}\sum_{j=1}^{n_i}\left|s_{i,j}-\bar{s}_i\right|.$$

Similarly, the inter-cluster spatial dispersity uniformity among the A clusters is

$$U_4 = \sum_{i=1}^{A}\left|\bar{s}_i-\bar{s}\right|,$$

where $\bar{s}$ denotes the second average, i.e. the mean of the average spatial dispersities of the A clusters.
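The per-interval spatial dispersity described here can be sketched directly: the centroid and the mean Euclidean distance follow the verbal description, while the function name is an assumption of ours. The uniformity terms then follow the same deviation pattern as the frequency case.

```python
import numpy as np

def interval_dispersity(coords):
    """coords: (n, 2) array of pixel positions in one gray interval.
    The dispersity is the mean Euclidean distance to the interval's centroid."""
    coords = np.asarray(coords, float)
    center = coords.mean(axis=0)                      # center position coordinate
    return np.linalg.norm(coords - center, axis=1).mean()
```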
The gray spans $d_{i,j}$ of the newly divided gray intervals should also be as uniform as possible. The gray span is the length of a gray interval after division, i.e. the difference between the maximum and the minimum gray level in the interval. Here the idea of uniform gray division is adopted: the more uniform the division, the more uniform the colors of the gray image, the gentler the color changes and the smaller the contrast. The gray span of each interval is therefore obtained, and the average gray span of each cluster is computed as

$$\bar{d}_i = \frac{1}{n_i}\sum_{j=1}^{n_i} d_{i,j},$$

where $\bar{d}_i$ is the average gray span of the $i$-th cluster and $d_{i,j}$ the gray span of the $j$-th gray interval of the $i$-th cluster. The sum of gray span differences of each cluster is obtained from the difference between each interval's span and the average span of the cluster to which it belongs, and the sums of the A clusters are combined into the intra-cluster gray span uniformity.

As one example, the intra-cluster gray span uniformity can be calculated as

$$U_5 = \sum_{i=1}^{A}\sum_{j=1}^{n_i}\left|d_{i,j}-\bar{d}_i\right|.$$

Similarly, the inter-cluster gray span uniformity among the A clusters is

$$U_6 = \sum_{i=1}^{A}\left|\bar{d}_i-\bar{d}\right|,$$

where $\bar{d}$ denotes the third average, i.e. the mean of the average gray spans of the A clusters.
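A sketch of the gray-span uniformity terms, again assuming absolute-deviation sums, since the text gives only the verbal description; the input layout (per cluster, the gray spans of its intervals) is our assumption.

```python
def span_uniformity(cluster_spans):
    """cluster_spans: per cluster, the list of gray spans (max - min gray level)
    of its intervals. Returns (U_intra, U_inter) as absolute-deviation sums."""
    means = [sum(s) / len(s) for s in cluster_spans]  # average gray span per cluster
    u_intra = sum(abs(v - m) for s, m in zip(cluster_spans, means) for v in s)
    grand = sum(means) / len(means)                   # third average
    u_inter = sum(abs(m - grand) for m in means)
    return u_intra, u_inter
```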
Further, the intra-cluster frequency uniformity $U_1$, the intra-cluster spatial dispersity uniformity $U_3$ and the intra-cluster gray span uniformity $U_5$ of the A clusters are linearly combined into the intra-cluster uniformity evaluation value

$$E_{\mathrm{in}} = a\,U_1 + b\,U_3 + c\,U_5,$$

and the inter-cluster frequency uniformity $U_2$, the inter-cluster spatial dispersity uniformity $U_4$ and the inter-cluster gray span uniformity $U_6$ are linearly combined into the inter-cluster uniformity evaluation value

$$E_{\mathrm{out}} = a\,U_2 + b\,U_4 + c\,U_6,$$

where $a$, $b$ and $c$ are the corresponding adjustment parameters.

Preferably, the embodiment of the invention sets $a$, $b$ and $c$ empirically to 0.02, 0.001 and 0.1, respectively.
Based on the intra-cluster uniformity evaluation value $E_{\mathrm{in}}$ and the inter-cluster uniformity evaluation value $E_{\mathrm{out}}$, the objective function of the N re-divided gray intervals is constructed as

$$F(x_1,\dots,x_B) = w_1\,E_{\mathrm{in}} + w_2\,E_{\mathrm{out}},$$

where $x_1,\dots,x_B$ are the B demarcation points selected from the A clusters within the whole gray range $[g_{\min}, g_{\max}]$ of the steel mesh gray image and used to obtain the N gray intervals; $w_1$ is the weight of the intra-cluster uniformity evaluation value and $w_2$ the weight of the inter-cluster uniformity evaluation value. The embodiment of the invention empirically sets $w_1 = 0.7$ and $w_2 = 0.3$.
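Combining the six uniformity terms with the empirical weights quoted in the text, the objective can be sketched as follows. The sign convention (minimizing the weighted sum of deviation terms) and the function name are assumptions.

```python
def objective(u1, u2, u3, u4, u5, u6,
              a=0.02, b=0.001, c=0.1, w1=0.7, w2=0.3):
    """Weighted objective built from the six uniformity terms: the intra-cluster
    evaluation combines frequency (u1), dispersity (u3) and span (u5) terms,
    the inter-cluster evaluation the corresponding u2, u4, u6. Minimizing this
    value is assumed to yield the most uniform division."""
    e_in = a * u1 + b * u3 + c * u5
    e_out = a * u2 + b * u4 + c * u6
    return w1 * e_in + w2 * e_out
```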
The objective function $F$ is optimized with the particle swarm optimization (PSO) algorithm. Particles with the two attributes of velocity and position are designed and randomly initialized; each particle is evaluated, and it is judged whether the objective function $F$ has reached the global optimum. If the condition is not met, the current velocity and position of each particle are updated, the fitness value of each particle is evaluated, the historical best position of each particle and the global best position of the swarm are updated, and each particle is evaluated again; this is repeated until the global optimum is reached. The optimal solution $(x_1^{*},\dots,x_B^{*})$, i.e. the optimal positions of the B demarcation points, is thus obtained, and by combining the A clusters with these optimal demarcation points, the whole gray range of the steel mesh gray image is adaptively divided into N optimal gray intervals.
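A generic, minimal particle swarm optimizer illustrating the loop described here. The inertia and acceleration constants are standard textbook values, not taken from the patent, and the quadratic test objective stands in for the uniformity objective.

```python
import numpy as np

def pso_minimize(f, dim, lo, hi, n_particles=30, iters=200, seed=0):
    """Minimal PSO: particles carry position and velocity, track personal and
    global bests, and are pulled toward both on every iteration."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                 # keep demarcation points in range
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# e.g. searching B = 2 demarcation points in the gray range [0, 255]
best, val = pso_minimize(lambda p: ((p - 128) ** 2).sum(), dim=2, lo=0, hi=255)
```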
Step S003, carrying out gray level quantization on gray values based on N optimal gray intervals of the steel mesh gray image to obtain a quantized gray image; acquiring a gray level co-occurrence matrix of the quantized gray level image to obtain texture entropy of each pixel point, and forming an entropy matrix corresponding to the surface image; the surface image and the corresponding entropy matrix are input into a neural network to confirm the type of steel mesh defect.
Specifically, according to step S002, the N optimal gray scale intervals adaptively divided in the steel mesh gray image are obtained, and the gray values falling within each optimal gray scale interval are quantized to a single gray level, so as to obtain a quantized gray image. After the adaptive gray scale quantization, for every point (x, y) on the quantized gray image and the point (x + a, y + b) offset from it, the gray values g1 and g2 corresponding to these two points form a point pair (g1, g2). With N gray levels, there are N × N combinations of such point pairs. For each pixel on the gray image, within its neighborhood window, the number of occurrences of each point pair (g1, g2) is counted to form an N × N gray level co-occurrence matrix. Four feature directions, 0°, 45°, 90° and 135°, are considered: when the angle between the line joining the two pixels and the x axis is 0°, take a = d, b = 0; when it is 45°, take a = d, b = d; when it is 90°, take a = 0, b = d; when it is 135°, take a = -d, b = d. The step size d is taken as 1, and the average of the gray level co-occurrence matrices obtained in the four feature directions is taken as the final gray level co-occurrence matrix of the corresponding pixel point.
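The four-direction co-occurrence counting can be sketched as follows for a single quantized neighborhood window. This is an illustrative implementation: the window extraction and the per-direction normalization convention are assumptions not fixed by the patent.

```python
import numpy as np

def glcm_window(q, n_levels, d=1):
    """Average gray level co-occurrence matrix of window `q` over the
    four feature directions 0°, 45°, 90°, 135° with step size d."""
    offsets = [(d, 0), (d, d), (0, d), (-d, d)]  # (a, b) per direction
    h, w = q.shape
    acc = np.zeros((n_levels, n_levels), dtype=float)
    for a, b in offsets:
        m = np.zeros((n_levels, n_levels), dtype=float)
        for y in range(h):
            for x in range(w):
                x2, y2 = x + a, y + b
                if 0 <= x2 < w and 0 <= y2 < h:
                    m[q[y, x], q[y2, x2]] += 1   # count point pair (g1, g2)
        if m.sum() > 0:
            m /= m.sum()                          # joint frequencies p(i, j)
        acc += m
    return acc / len(offsets)
```

Running this over a sliding window around every pixel yields the per-pixel final co-occurrence matrix described above.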
Finally, the entropy ENT is selected as the feature metric of the gray level co-occurrence matrix. ENT reflects the non-uniformity and complexity of the texture: the more dispersed the elements of the co-occurrence matrix, the greater the entropy value. Among the defects on the steel mesh surface, cracks have the highest non-uniformity and therefore the largest entropy value; pressed-in iron oxide and pits have moderate non-uniformity and moderate entropy values; inclusions have a relatively simple distribution and a lower entropy value. The entropy of the gray level co-occurrence matrix therefore distinguishes the four defect characteristics well. The texture entropy of each pixel point is calculated from its final gray level co-occurrence matrix, forming the entropy matrix corresponding to the surface image (steel mesh gray image). The calculation formula of the texture entropy is as follows:
ENT = -Σᵢ Σⱼ p(i, j) · ln p(i, j)
wherein p(i, j) represents the element in the i-th row and j-th column of the gray level co-occurrence matrix.
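The texture entropy formula above, with the usual convention 0 · ln 0 = 0, can be computed as:

```python
import numpy as np

def texture_entropy(p):
    """ENT = -sum_i sum_j p(i, j) * ln p(i, j), skipping zero entries."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]                      # 0 * ln 0 is treated as 0
    return float(-(nz * np.log(nz)).sum())
```

A uniformly dispersed co-occurrence matrix gives the maximum entropy (ln of the number of cells), while a matrix concentrated in one cell gives entropy 0, matching the crack-versus-inclusion ordering described above.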
Further, the embodiment of the invention uses a neural network to accurately classify the surface defects of the steel mesh. The adopted convolutional neural network (CNN) is a feedforward neural network containing convolution computation, comprising convolutional layers, pooling layers, fully connected layers and the like; its input is picture information with three dimensions of width, height and depth. The training process of the convolutional neural network is as follows:
(1) The input of the convolutional neural network is the surface image of the steel mesh and the corresponding entropy matrix; the output is the probability of each defect type.
(2) The embodiment of the invention considers the defect types of the steel mesh to be cracks, inclusions, pressed-in iron oxide and pits. Accordingly, a normal surface image is labeled 0, cracks are labeled 1, inclusions are labeled 2, pressed-in iron oxide is labeled 3 and pits are labeled 4; labels are then extended for cases in which multiple defects appear simultaneously. For example, an image in which cracks and inclusions appear together is labeled 5.
(3) A cross entropy loss function is adopted to estimate the classification accuracy of the convolutional neural network on the training set, measuring the difference between the model output and the true labels.
(4) A Softmax classifier is adopted to normalize the output values, yielding the probability of each defect type.
Further, the surface image of the steel mesh to be detected and the corresponding entropy matrix are input into the trained convolutional neural network to detect the defect type of the image.
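A minimal sketch of the data preparation around the network follows. The 2-channel stacking, the division by 255 and the softmax form are assumptions for illustration; the patent does not fix these details or the network architecture.

```python
import numpy as np

# Label scheme of this embodiment: normal=0, crack=1, inclusion=2,
# pressed-in iron oxide=3, pit=4, with extended labels for co-occurring
# defects (e.g. crack + inclusion = 5).

def make_input(gray_img, entropy_matrix):
    """Stack the surface image and its entropy matrix into a 2-channel
    (channels-first) array, the joint input described for the CNN."""
    return np.stack([gray_img.astype(float) / 255.0,
                     entropy_matrix.astype(float)], axis=0)

def softmax(logits):
    """Normalize the network's output values into defect-type probabilities."""
    e = np.exp(logits - np.max(logits))   # subtract max for numerical stability
    return e / e.sum()
```

The resulting array would be fed to any standard CNN; the softmax output row sums to 1, giving the per-type probabilities of step (4).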
In summary, the embodiment of the invention provides a machine-vision-based method for judging steel mesh surface defects. A surface image of the steel mesh is acquired to obtain a corresponding steel mesh gray image; according to the gray values and distribution positions of the pixel points, the gray values in the steel mesh gray image are adaptively divided into N optimal gray scale intervals; gray level quantization is performed based on these optimal gray scale intervals to obtain a quantized gray image; a gray level co-occurrence matrix is obtained for the quantized gray image to compute the texture entropy of each pixel point, forming the entropy matrix corresponding to the surface image; finally, the surface image and the corresponding entropy matrix are input into a neural network to confirm the type of steel mesh defect. By adaptively dividing the gray values of the image into gray scale intervals and quantizing on that basis, the gray quantization process becomes more flexible, so that the amount of computation is simplified and reduced while the loss of texture information is kept small, improving the accuracy of defect detection.
It should be noted that: the sequence of the embodiments of the present invention is only for description, and does not represent the advantages and disadvantages of the embodiments. And the foregoing description has been directed to specific embodiments of this specification. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments.
The foregoing description of the preferred embodiments of the invention is not intended to limit the invention to the precise form disclosed; any modifications, equivalent substitutions and improvements falling within the spirit and scope of the invention are intended to be included within the scope of the invention.

Claims (7)

1. The steel mesh surface defect judging method based on machine vision is characterized by comprising the following steps of:
acquiring a surface image of the steel mesh to obtain a corresponding steel mesh gray image;
obtaining a gray histogram of the steel mesh gray image to obtain a maximum gray value and a minimum gray value, setting a clustering radius of a DBSCAN clustering algorithm according to the maximum gray value and the minimum gray value, and dividing the gray values in the steel mesh gray image into A cluster clusters based on the clustering radius, wherein A is a positive integer greater than 0; dividing the gray values in the steel mesh gray image into N gray intervals based on the cluster clusters, wherein N is a positive integer greater than 0 and N is greater than A; constructing an objective function according to the number of gray intervals and the number of pixel points corresponding to the gray intervals contained in each divided cluster, and acquiring the N optimal gray intervals corresponding to the objective function when the objective function reaches the global optimum;
carrying out gray level quantization on gray values based on N optimal gray intervals of the steel mesh gray image to obtain a quantized gray image; acquiring a gray level co-occurrence matrix of the quantized gray level image to obtain texture entropy of each pixel point, and forming an entropy matrix corresponding to the surface image; inputting the surface image and the corresponding entropy matrix into a neural network to confirm the type of steel mesh defects.
2. The method for determining the surface defect of the steel mesh based on the machine vision according to claim 1, wherein the method for constructing the objective function according to the number of gray intervals and the number of pixels corresponding to the gray intervals contained in each cluster after the division comprises the following steps:
b demarcation points required for re-dividing are obtained according to the number of the gray spaces and the number of the clusters, and the B demarcation points are utilized to re-divide the gray intervals of the initial gray intervals corresponding to the A clusters;
calculating intra-cluster frequency uniformity and inter-cluster frequency uniformity of A cluster clusters according to the frequency of each gray scale interval based on the N gray scale intervals after the repartition, wherein the frequency refers to the frequency that the number of pixel points contained in the gray scale interval accounts for the number of all pixel points in the steel mesh gray scale image; calculating the intra-cluster space dispersity uniformity and inter-cluster space dispersity uniformity of the A clusters according to the position coordinates of the pixel points; calculating intra-cluster gray span uniformity and inter-cluster gray span uniformity of A clusters according to the gray span of each gray interval, wherein the gray span refers to the difference value between the maximum gray level and the minimum gray level corresponding to the gray interval;
linearly combining the intra-cluster frequency uniformity, the intra-cluster spatial dispersity uniformity and the intra-cluster gray scale span uniformity into an intra-cluster uniformity evaluation value; and linearly combining the inter-cluster frequency uniformity, the inter-cluster space dispersity uniformity and the inter-cluster gray scale span uniformity into inter-cluster uniformity evaluation values, and constructing the objective function of N gray scale intervals by combining the intra-cluster uniformity evaluation values and the inter-cluster uniformity evaluation values.
3. The method for determining the surface defect of the steel mesh based on the machine vision according to claim 2, wherein the method for calculating the intra-cluster frequency uniformity and the inter-cluster frequency uniformity of the a clusters according to the frequency of each gray scale interval comprises the following steps:
calculating the average frequency of each cluster according to the frequency of each gray interval, obtaining the sum of the frequency differences of each cluster according to the frequency difference between the frequency of each gray interval and the average frequency of the cluster to which the frequency difference belongs, and obtaining the intra-cluster frequency uniformity degree by combining the sum of the frequency differences of A clusters;
calculating a first average value of the average frequency among A clusters according to the average frequency of each cluster, and calculating the inter-cluster frequency uniformity according to the difference between the average frequency of each cluster and the first average value.
4. The method for determining the surface defect of the steel mesh based on machine vision according to claim 2, wherein the method for calculating the intra-cluster spatial dispersity uniformity and the inter-cluster spatial dispersity uniformity of the a clusters according to the position coordinates of the pixel points comprises the following steps:
taking the average value of the position coordinates of all the pixel points in each gray scale interval as the center position coordinate of the corresponding gray scale interval, calculating the Euclidean distance between the position coordinate of each pixel point in the gray scale interval and the center position coordinate, obtaining the distance average value of each gray scale interval, and taking the distance average value as the space dispersity of the corresponding gray scale interval;
calculating the average spatial dispersity of each cluster according to the spatial dispersity of each gray scale interval in the clusters, obtaining the sum of the spatial dispersity differences of each cluster according to the spatial dispersity of each gray scale space and the corresponding average spatial dispersity, and calculating the intra-cluster spatial dispersity uniformity according to the sum of the spatial dispersity differences of A clusters;
calculating a second average value of the average spatial dispersion degree among A clusters according to the average spatial dispersion degree of each cluster, and calculating the inter-cluster spatial dispersion degree uniformity according to the difference between the average spatial dispersion degree of each cluster and the second average value.
5. The method for determining the surface defect of the steel mesh based on the machine vision according to claim 2, wherein the method for calculating the intra-cluster gray span uniformity and the inter-cluster gray span uniformity of the a clusters according to the gray span of each gray interval comprises the following steps:
calculating the average gray span of each cluster according to the gray span of each gray interval, obtaining the sum of gray span differences of each cluster according to the gray span difference between the gray span of each gray space and the average gray span of the cluster to which the gray space belongs, and calculating the intra-cluster gray span uniformity by combining the sum of the gray span differences of A clusters;
and calculating a third average value of the average gray scale span among A clusters according to the average gray scale span of each cluster, and calculating the inter-cluster gray scale span uniformity according to the difference between the average gray scale span of each cluster and the third average value.
6. The method for judging the surface defects of the steel mesh based on machine vision according to claim 1, wherein a particle swarm optimization algorithm is adopted to obtain a globally optimal solution of the objective function.
7. The method for determining the surface defects of the steel mesh based on machine vision according to claim 1, wherein the method for setting the cluster radius of the DBSCAN clustering algorithm according to the maximum gray value and the minimum gray value comprises the following steps:
and calculating a gray level difference value between the maximum gray level value and the minimum gray level value, setting the clustering radius according to the gray level difference value, and enabling the gray level difference value and the clustering radius to form a positive correlation.
CN202310045703.5A 2023-01-30 2023-01-30 Steel mesh surface defect judging method based on machine vision Withdrawn CN116091455A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310045703.5A CN116091455A (en) 2023-01-30 2023-01-30 Steel mesh surface defect judging method based on machine vision

Publications (1)

Publication Number Publication Date
CN116091455A true CN116091455A (en) 2023-05-09

Family

ID=86198750


Country Status (1)

Country Link
CN (1) CN116091455A (en)


Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116433657B (en) * 2023-06-08 2023-08-25 金乡县明耀玻璃有限公司 Toughened glass scratch area image enhancement method based on computer vision
CN116433657A (en) * 2023-06-08 2023-07-14 金乡县明耀玻璃有限公司 Toughened glass scratch area image enhancement method based on computer vision
CN116523913B (en) * 2023-07-03 2023-11-03 惠州市金思维科技有限公司 Intelligent detection method for quality of screw rod
CN116523913A (en) * 2023-07-03 2023-08-01 惠州市金思维科技有限公司 Intelligent detection method for quality of screw rod
CN116596936A (en) * 2023-07-18 2023-08-15 深圳市魔方卫星科技有限公司 Solar wing sailboard quality detection method based on image data
CN116596936B (en) * 2023-07-18 2023-09-12 深圳市魔方卫星科技有限公司 Solar wing sailboard quality detection method based on image data
CN116934750A (en) * 2023-09-15 2023-10-24 山东庆葆堂生物科技有限公司 Vinegar egg liquid production quality assessment method
CN116934750B (en) * 2023-09-15 2023-12-01 山东庆葆堂生物科技有限公司 Vinegar egg liquid production quality assessment method
CN116993718A (en) * 2023-09-25 2023-11-03 深圳市东陆科技有限公司 TFT array substrate defect detection method based on machine vision
CN116993718B (en) * 2023-09-25 2023-12-22 深圳市东陆科技有限公司 TFT array substrate defect detection method based on machine vision
CN117173158A (en) * 2023-10-25 2023-12-05 深圳市德海威实业有限公司 Intelligent detection method and system for quality of precise connector
CN117173158B (en) * 2023-10-25 2024-01-30 深圳市德海威实业有限公司 Intelligent detection method and system for quality of precise connector
CN118365644A (en) * 2024-06-19 2024-07-19 西安联瑞科技实业有限责任公司 Steel plate sand blasting uneven detection method based on image processing
CN118365644B (en) * 2024-06-19 2024-09-10 西安联瑞科技实业有限责任公司 Steel plate sand blasting uneven detection method based on image processing

Similar Documents

Publication Publication Date Title
CN116091455A (en) Steel mesh surface defect judging method based on machine vision
CN115082467B (en) Building material welding surface defect detection method based on computer vision
CN115829883B (en) Surface image denoising method for special-shaped metal structural member
CN110211126B (en) Image segmentation method based on intuitive fuzzy C-means clustering
CN109583474B (en) Training sample generation method for industrial big data processing
CN110443778B (en) Method for detecting irregular defects of industrial products
CN116664559B (en) Machine vision-based memory bank damage rapid detection method
CN114092389A (en) Glass panel surface defect detection method based on small sample learning
CN116030052B (en) Etching quality detection method for lamination process of computer display panel
CN117197140B (en) Irregular metal buckle forming detection method based on machine vision
CN110276764A (en) K-Means underwater picture background segment innovatory algorithm based on the estimation of K value
CN118115498B (en) Method and system for rapidly detecting glossiness of titanium rod
CN118096796B (en) Visual inspection method for appearance of radial forging titanium rod based on machine learning
CN118096579B (en) 3D printing lattice structure defect detection method
CN117522864B (en) European pine plate surface flaw detection method based on machine vision
CN117635507B (en) Plastic particle online visual detection method and system
CN117314940B (en) Laser cutting part contour rapid segmentation method based on artificial intelligence
CN112183469B (en) Method for identifying congestion degree of public transportation and self-adaptive adjustment
CN110766662B (en) Forging surface crack detection method based on multi-scale and multi-layer feature learning
CN112784922A (en) Extraction and classification method of intelligent cloud medical images
CN113160214B (en) Novel method for measuring similarity of local neighborhood pixels of image
CN113160166B (en) Medical image data mining working method through convolutional neural network model
CN106981201A (en) vehicle identification method under complex environment
CN117409275B (en) Multi-angle radar image processing method
CN116740056B (en) Defect detection method for coating layer of whole-core high-pattern conveyer belt

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20230509