CN110781913B - Zipper cloth belt defect detection method - Google Patents
- Publication number: CN110781913B
- Application number: CN201910859700.9A
- Authority: CN (China)
- Prior art keywords: image, matrix, gray level, value, gray
- Legal status: Active (an assumption, not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06F18/2411 — Classification techniques based on the proximity to a decision surface, e.g. support vector machines
- G06F18/214 — Generating training patterns; bootstrap methods, e.g. bagging or boosting
- G01N21/8851 — Scan or image signal processing specially adapted for detecting defects
- G01N2021/8854 — Grading and classifying of flaws
- G01N2021/888 — Marking defects
- G06T7/0004 — Industrial image inspection
- G06T7/13 — Edge detection
- G06T2207/20081 — Training; learning
- G06T2207/30108 — Industrial image inspection
- G06V10/28 — Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
- G06V10/50 — Extraction of image or video features by using histograms, e.g. histogram of oriented gradients [HoG]
- Y02P90/30 — Computing systems specially adapted for manufacturing
Abstract
The invention relates to a method for detecting defects of a zipper cloth belt, comprising the following steps: S1: acquiring a training sample image and performing histogram equalization processing on it to obtain an equalized image; S2: calculating a feature matrix of the gray level co-occurrence matrix from the equalized image; S3: calculating a feature matrix of the local binary pattern from the equalized image; S4: combining the feature matrix of the gray level co-occurrence matrix with the feature matrix of the local binary pattern to obtain a combined feature matrix, and normalizing it to obtain a normalized combined feature matrix; S5: training a support vector machine model on the normalized combined feature matrix to obtain a classifier model; S6: detecting the zipper sample image to be detected with the classifier model and marking the detection result. The method combines the GLCM and LBP feature extraction operators, improving image compatibility and detection precision.
Description
Technical Field
The invention belongs to the technical field of industrial product defect detection, and particularly relates to a zipper cloth belt defect detection method.
Background
In recent years, with the vigorous development of the clothing and textile industries in China, China has become the largest zipper-producing country in the world, and zipper production and manufacturing technology has advanced greatly. However, during zipper production, due to imperfect production processes and the lack of corresponding detection equipment, problems such as notches, damage, dirty tape and glue leakage often occur in the zipper cloth belt.
Zipper defect detection plays an important role in the industrial production of zippers: its accuracy and efficiency directly affect the quality and production efficiency of the zippers. At present, domestic zipper detection technology is still at a semi-automatic, manual-inspection stage. For the quality inspection of finished zippers, defective products are generally sorted out by visual inspection after sampling, with workers on the production line identifying and judging the zippers by eye and by experience. This traditional manual detection method suffers from high labor intensity, low detection efficiency, unreliable detection standards, and high rates of erroneous judgment and omission, making it difficult to guarantee the quality of zipper products. A zipper cloth belt defect detection method with high accuracy, high efficiency and high credibility is therefore of great significance for quality detection on an actual industrial production line.
Disclosure of Invention
To solve the problems in the prior art, the invention provides a zipper cloth belt defect detection method. The technical problems to be solved by the invention are addressed by the following technical scheme:
the invention provides a method for detecting defects of a zipper cloth belt, which comprises the following steps:
s1: acquiring a training sample image, and performing histogram equalization processing on the training sample image to obtain an equalized image;
s2: according to the equalized image, calculating to obtain a feature matrix of the gray level co-occurrence matrix;
s3: calculating to obtain a characteristic matrix of a local binary mode according to the equalized image;
s4: combining the feature matrix of the gray level co-occurrence matrix with the feature matrix of the local binary pattern to obtain a combined feature matrix, and carrying out normalization processing on the combined feature matrix to obtain a normalized combined feature matrix;
s5: training a support vector machine model according to the normalized combined feature matrix to obtain a classifier model;
s6: and detecting the zipper sample image to be detected according to the classifier model, and marking the detection result.
In one embodiment of the present invention, the S1 includes:
performing histogram equalization processing on the training sample image according to the histogram equalization formula

$$s_k = T(r_k) = \sum_{j=0}^{k} \frac{n_j}{n}, \qquad k = 0, 1, \ldots, L-1,$$

where $s_k$ represents the gray level of a pixel in said equalized image, $T(r_k)$ represents the gray-level transformation function, $r_k$ represents the $k$-th gray level in the training sample image, $n_j$ represents the number of pixels with gray level $r_j$ in the training sample image, and $n$ represents the total number of pixels in the training sample image.
In one embodiment of the present invention, the S2 includes:
s21: selecting parameters of a gray level co-occurrence matrix, wherein the parameters comprise a step d and a scanning direction, the step d represents the distance between two pixel points in the equalized image, and the scanning direction comprises 0 degree, 45 degrees, 90 degrees and 135 degrees;
s22: carrying out gray level compression on the equalized image to obtain a gray level compressed image;
s23: according to the parameters of the gray level co-occurrence matrix and the gray-level-compressed image, calculating the gray level co-occurrence matrix, in which the value in row i, column j represents the frequency with which a pair of pixel points with gray values i and j, separated by step d, occurs in the gray-level-compressed image along the different scanning directions;
s24: and calculating to obtain the characteristic matrix of the gray level co-occurrence matrix according to the gray level co-occurrence matrix.
In one embodiment of the present invention, the S24 includes:
s241: respectively calculating a contrast characteristic matrix, a correlation characteristic matrix and a homogeneity characteristic matrix according to the gray level co-occurrence matrix,
the contrast is calculated as

$$\mathrm{Con} = \sum_{i}\sum_{j} (i-j)^2 \, P(i,j),$$

the correlation is calculated as

$$\mathrm{Corr} = \sum_{i}\sum_{j} \frac{(i-\mu_i)(j-\mu_j)\, P(i,j)}{\sigma_i \sigma_j},$$

the homogeneity is calculated as

$$\mathrm{Hom} = \sum_{i}\sum_{j} \frac{P(i,j)}{1+(i-j)^2},$$

where i and j denote gray values, P(i, j) denotes the frequency with which the pair of pixel points with gray values i and j occurs, μ_i and μ_j denote the means of the row and column corresponding to gray values i and j, and σ_i and σ_j denote the corresponding standard deviations.
S242: and combining the contrast characteristic matrix, the correlation characteristic matrix and the homogeneity characteristic matrix to obtain the characteristic matrix of the gray level co-occurrence matrix.
In one embodiment of the present invention, the S3 includes:
s31: dividing the equalized image into a plurality of N×N sub-image blocks and calculating the LBP feature value of each pixel in each sub-image block according to

$$\mathrm{LBP}(x_c, y_c) = \sum_{p=0}^{P-1} 2^{p}\, S(i_p - i_c), \qquad S(x)=\begin{cases}1, & x \ge 0\\ 0, & x < 0,\end{cases}$$

where (x_c, y_c) denotes the center pixel of the neighborhood, i_c the gray value of the center pixel, i_p the gray values of the surrounding pixels, and P the number of sampling points in the neighborhood;
s32: obtaining an LBP characteristic value histogram of the sub-image block according to the LBP characteristic value of the sub-image block;
s33: normalizing the LBP characteristic value histogram of the sub-image block to obtain a normalized histogram of the sub-image block;
s34: and connecting the normalized histograms of all the sub-image blocks to obtain the feature matrix of the local binary pattern.
In one embodiment of the present invention, the S5 includes:
s51: selecting 80% of normalized combined feature matrixes of the training sample images and corresponding sample labels as training data, and training the support vector machine model;
s52: and selecting the remaining 20% of normalized combined feature matrix of the training sample image and the corresponding sample label as verification data to perform cross verification on the trained support vector machine model to obtain the classifier model.
In one embodiment of the present invention, the S6 includes:
s61: acquiring and marking edge lines of the upper side and the lower side of a zipper cloth belt in a zipper sample image to be detected, and equally dividing the edge lines into a plurality of sections;
s62: defining a sliding window, enabling the sliding window to slide along one side edge line of the zipper cloth belt according to a step length L, and obtaining a sliding window partial block image;
s63: processing the sliding window partial block image to obtain a combined feature matrix thereof;
s64: inputting the combined feature matrix of the sliding window partial block image into the classifier model for detection, and marking the defect position by using a rectangular frame according to the detection result;
s65: after one side of the zipper cloth belt in the zipper sample image to be detected is detected, the zipper sample image to be detected is turned over along the horizontal central axis, and the steps S62-S64 are repeated to detect the other side of the zipper cloth belt in the zipper sample image to be detected.
Compared with the prior art, the invention has the beneficial effects that:
1. After acquiring the training sample image, the zipper cloth belt defect detection method first performs histogram equalization, so that the gray values of the training sample image are uniformly distributed; the method can therefore cope with zippers of various colors, which improves the classification accuracy.
2. The zipper cloth belt defect detection method combines the GLCM (gray level co-occurrence matrix) and LBP (local binary pattern) feature extraction operators, so it can correctly classify fine gray-value changes in the texture, improving image compatibility and detection precision.
3. During training of the support vector machine model, the model's performance is verified by cross-validation, which effectively reduces overfitting and improves the generalization capability of the support vector machine model.
The foregoing is only an overview of the technical solution of the present invention. In order that the technical means of the invention may be understood more clearly and implemented in accordance with the description, preferred embodiments of the invention are described in detail below with reference to the accompanying drawings.
Drawings
FIG. 1 is a flowchart of a method for detecting defects of a zipper cloth tape according to an embodiment of the present invention;
FIGS. 2-5 are images of a sample zipper to be inspected for different defect types provided by embodiments of the present invention;
fig. 6 to fig. 9 are test results of images of slide fastener samples to be tested for different defect types according to an embodiment of the present invention.
Detailed Description
In order to further explain the technical means and effects adopted by the invention to achieve the preset aim, the following describes in detail a zipper cloth belt defect detection method according to the invention with reference to the attached drawings and the specific embodiments.
The foregoing and other features, aspects, and advantages of the present invention will become more apparent from the following detailed description of the preferred embodiments when taken in conjunction with the accompanying drawings. The technical means and effects adopted by the present invention to achieve the intended purpose can be more deeply and specifically understood through the description of the specific embodiments, however, the attached drawings are provided for reference and description only, and are not intended to limit the technical scheme of the present invention.
Zipper cloth belt defect detection based on machine learning depends on feature extraction and on the choice of classifier. The cloth belt portion of the zipper mainly exhibits a texture pattern, so LBP (local binary pattern) and GLCM (gray level co-occurrence matrix) are used as texture feature extraction operators: LBP has gray-scale invariance and rotation invariance, while GLCM captures changes in spatial texture features well. Combining LBP and GLCM effectively improves detection precision while reducing the false detection rate.
Referring to fig. 1, fig. 1 is a flowchart of a method for detecting defects of a fastener tape according to an embodiment of the present invention, as shown in the drawings, the method for detecting defects of a fastener tape according to the embodiment includes:
s1: acquiring a training sample image, and performing histogram equalization processing on the training sample image to obtain an equalized image;
specifically, the training sample image is subjected to histogram equalization processing according to the histogram equalization formula

$$s_k = T(r_k) = \sum_{j=0}^{k} \frac{n_j}{n}, \qquad k = 0, 1, \ldots, L-1,$$

where $s_k$ represents the gray level of a pixel in said equalized image, $T(r_k)$ represents the gray-level transformation function, $r_k$ represents the $k$-th gray level in the training sample image, $n_j$ represents the number of pixels with gray level $r_j$ in the training sample image, and $n$ represents the total number of pixels in the training sample image.
Because zipper cloth belts can be of different colors during production, histogram equalization is performed before extracting features from the training sample image, so that the gray values of the training sample image are uniformly distributed. The classifier is thus compatible with zippers of various colors, and the method can cope with zippers of various colors, which improves the classification accuracy.
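The equalization step above can be sketched as a minimal NumPy implementation of the cumulative-distribution mapping (function name is illustrative, not from the patent; OpenCV's `cv2.equalizeHist` provides an equivalent operation):

```python
import numpy as np

def equalize(img: np.ndarray) -> np.ndarray:
    """Histogram equalization: map gray level r_k to round(255 * CDF(r_k))."""
    hist = np.bincount(img.ravel(), minlength=256)  # n_j for each gray level
    cdf = np.cumsum(hist) / img.size                # sum_{j<=k} n_j / n
    lut = np.round(255 * cdf).astype(np.uint8)      # scale the CDF back to 0-255
    return lut[img]
```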
S2: according to the equalized image, calculating to obtain a feature matrix of the gray level co-occurrence matrix;
specifically, the method comprises the following steps:
s21: selecting parameters of a gray level co-occurrence matrix, wherein the parameters comprise a step d and a scanning direction, the step d represents the distance between two pixel points in the equalized image, and the scanning direction comprises 0 degree, 45 degrees, 90 degrees and 135 degrees;
for example, in the gray level co-occurrence matrix, the value of the ith row and jth column represents the frequency of occurrence of the combination of the pixel point (x, y) with the pixel value i and the pixel point (x+a, y+b) with the pixel value j, wherein the distance difference (a, b) between the two pixel points is defined as follows:
a=1, b=0 means that the two pixel points are scanned horizontally, namely, 0 degrees;
a=1, b=1 means that the pixel point with the pixel value j is located at the lower right of the pixel point with the pixel value i, i.e. 45 ° scanning;
a=0, b=1 indicates that the two pixel points are scanned vertically, namely 90 degrees;
a= -1, b= -1 indicates that the pixel with pixel value j is located above and to the left of the pixel with pixel value i, i.e. 135 ° scan.
In this embodiment, stride d is selected to be 5 and 10.
S22: carrying out gray level compression on the equalized image to obtain a gray level compressed image;
An image has 256 gray levels (0–255), but all 256 levels are not needed when computing the gray level co-occurrence matrix, and using them all is computationally expensive; gray-level compression is therefore usually performed to simplify feature extraction. In this embodiment, the gray levels of the equalized image are compressed to 8 levels. If the gray values of the training sample image were simply divided by 32, image definition would be reduced; the histogram equalization performed first increases the dynamic range of the gray values, so compressing the equalized image preserves the overall contrast of the image.
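A sketch of the 8-level compression (the function name is illustrative; mapping each 32-value bin of the 0–255 range to one level, as the paragraph above describes):

```python
import numpy as np

def compress_gray_levels(img: np.ndarray, levels: int = 8) -> np.ndarray:
    """Quantize a 0-255 image down to `levels` gray levels (256/levels values per bin)."""
    return (img.astype(np.uint16) * levels // 256).astype(np.uint8)
```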
S23: calculating the gray level co-occurrence matrix from the selected parameters and the gray-level-compressed image, where the value in row i, column j represents the frequency with which a pair of pixel points with gray values i and j, separated by step d, occurs in the gray-level-compressed image along each scanning direction (0°, 45°, 90° and 135°);
s24: and calculating to obtain the characteristic matrix of the gray level co-occurrence matrix according to the gray level co-occurrence matrix.
Since the gray level co-occurrence matrix has a large dimension, it is not used directly to distinguish texture features. The gray level co-occurrence matrix has 14 standard attributes describing the relationships between gray values; in this embodiment, contrast, correlation and homogeneity, which have strong characterization capability, are preferably used as the feature values. Contrast reflects the definition of the image and the depth of the texture grooves: the deeper the grooves, the more pixel pairs with large gray-value differences there are in the image and the clearer the effect; conversely, the shallower the grooves, the fewer such pixel pairs and the more blurred the effect. Correlation reflects the consistency of the image texture. It describes the degree of similarity between the elements of rows or columns of the gray level co-occurrence matrix and reflects how far gray values extend along a given direction; when the matrix element values are uniform, the correlation value is large, whereas when the matrix element values differ greatly, the correlation value is small. Homogeneity, also called the inverse difference moment, measures the local gray-level uniformity of the image: if the local gray levels are uniform, the homogeneity value is large; if the local gray-level differences are large, the homogeneity value is small.
Specifically, the method comprises the following steps:
s241: respectively calculating a contrast characteristic matrix, a correlation characteristic matrix and a homogeneity characteristic matrix according to the gray level co-occurrence matrix,
the contrast is calculated as

$$\mathrm{Con} = \sum_{i}\sum_{j} (i-j)^2 \, P(i,j),$$

the correlation is calculated as

$$\mathrm{Corr} = \sum_{i}\sum_{j} \frac{(i-\mu_i)(j-\mu_j)\, P(i,j)}{\sigma_i \sigma_j},$$

the homogeneity is calculated as

$$\mathrm{Hom} = \sum_{i}\sum_{j} \frac{P(i,j)}{1+(i-j)^2},$$

where i and j denote gray values, P(i, j) denotes the frequency with which the pair of pixel points with gray values i and j occurs, μ_i and μ_j denote the means of the row and column corresponding to gray values i and j, and σ_i and σ_j denote the corresponding standard deviations.
S242: and combining the contrast characteristic matrix, the correlation characteristic matrix and the homogeneity characteristic matrix to obtain the characteristic matrix of the gray level co-occurrence matrix.
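Steps S23–S242 can be sketched in plain NumPy as follows (function names are illustrative; `skimage.feature.graycomatrix`/`graycoprops` offer an optimized equivalent, and the homogeneity denominator 1+(i-j)² is one common convention):

```python
import numpy as np

def glcm(img: np.ndarray, dx: int, dy: int, levels: int = 8) -> np.ndarray:
    """Normalized co-occurrence matrix for pixel pairs at offset (dx, dy)."""
    m = np.zeros((levels, levels), dtype=np.float64)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                m[img[y, x], img[y2, x2]] += 1  # count the (i, j) pair
    s = m.sum()
    return m / s if s else m

def glcm_features(P: np.ndarray):
    """Contrast, correlation and homogeneity of a normalized GLCM."""
    i, j = np.indices(P.shape)
    contrast = np.sum((i - j) ** 2 * P)
    mu_i, mu_j = np.sum(i * P), np.sum(j * P)
    sigma_i = np.sqrt(np.sum((i - mu_i) ** 2 * P))
    sigma_j = np.sqrt(np.sum((j - mu_j) ** 2 * P))
    corr = np.sum((i - mu_i) * (j - mu_j) * P) / (sigma_i * sigma_j + 1e-12)
    homogeneity = np.sum(P / (1.0 + (i - j) ** 2))
    return contrast, corr, homogeneity
```

In the patent's setup this would be evaluated for steps d of 5 and 10 in the four scanning directions, and the resulting feature values combined into the GLCM feature matrix.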
S3: calculating to obtain a characteristic matrix of a local binary mode according to the equalized image;
specifically, the method comprises the following steps:
s31: dividing the equalized image into a plurality of N×N sub-image blocks and calculating the LBP feature value of each pixel in each sub-image block according to

$$\mathrm{LBP}(x_c, y_c) = \sum_{p=0}^{P-1} 2^{p}\, S(i_p - i_c),$$

where (x_c, y_c) denotes the center pixel of the neighborhood, i_c the gray value of the center pixel, i_p the gray values of the surrounding pixels, and P the number of sampling points; S(i_p − i_c) takes 1 when i_p − i_c ≥ 0 and 0 when i_p − i_c < 0;
s32: obtaining an LBP characteristic value histogram of the sub-image block according to the LBP characteristic value of the sub-image block;
s33: normalizing the LBP characteristic value histogram of the sub-image block to obtain a normalized histogram of the sub-image block;
s34: and connecting the normalized histograms of all the sub-image blocks to obtain the feature matrix of the local binary pattern.
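Steps S31–S33 can be sketched as a basic 3×3, 8-neighbour LBP (function names are illustrative; `skimage.feature.local_binary_pattern` provides an optimized equivalent):

```python
import numpy as np

# 8 neighbours of the centre pixel, enumerated clockwise from the top-left
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]

def lbp_image(img: np.ndarray) -> np.ndarray:
    """Per-pixel LBP codes: bit p is set when neighbour p satisfies i_p - i_c >= 0."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2), dtype=np.uint8)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = img[y, x]
            code = 0
            for p, (dy, dx) in enumerate(OFFSETS):
                if img[y + dy, x + dx] >= c:  # S(i_p - i_c) = 1
                    code |= 1 << p            # weighted by 2^p
            out[y - 1, x - 1] = code
    return out

def lbp_histogram(block: np.ndarray) -> np.ndarray:
    """Normalized 256-bin histogram of a sub-image block's LBP codes (steps S32-S33)."""
    hist = np.bincount(lbp_image(block).ravel(), minlength=256).astype(np.float64)
    return hist / hist.sum()
```

Concatenating the normalized histograms of all sub-image blocks then yields the LBP feature matrix of step S34.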
S4: combining the feature matrix of the gray level co-occurrence matrix with the feature matrix of the local binary pattern to obtain a combined feature matrix, and carrying out normalization processing on the combined feature matrix to obtain a normalized combined feature matrix;
in this embodiment, the feature matrix of the gray level co-occurrence matrix and the feature matrix of the local binary pattern are flattened and concatenated into a one-dimensional combined feature matrix, and the sklearn.preprocessing.scale function is used to normalize the combined feature matrix so that it has zero mean and unit variance, yielding the normalized combined feature matrix. Normalizing the feature matrix in this way prevents features with excessively large variance from dominating the objective function, which would stop it from learning the other features correctly.
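The normalization the patent performs with `sklearn.preprocessing.scale` amounts to per-feature zero-mean, unit-variance scaling, which can be written directly in NumPy (function name illustrative):

```python
import numpy as np

def scale(features: np.ndarray) -> np.ndarray:
    """Zero-mean, unit-variance scaling per feature column,
    the same operation sklearn.preprocessing.scale applies by default."""
    features = np.asarray(features, dtype=np.float64)
    return (features - features.mean(axis=0)) / features.std(axis=0)
```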
S5: training a support vector machine model according to the normalized combined feature matrix to obtain a classifier model;
specifically, the method comprises the following steps:
s51: selecting 80% of normalized combined feature matrixes of the training sample images and corresponding sample labels as training data, and training the support vector machine model;
s52: and selecting the remaining 20% of normalized combined feature matrix of the training sample image and the corresponding sample label as verification data to perform cross verification on the trained support vector machine model to obtain the classifier model.
In this embodiment, the training sample images are labeled: the sample label of a normal image is 1 and the sample label of a defect image is 0. The normalized combined feature matrix of each training sample image is computed according to steps S1–S4. 80% of the normalized combined feature matrices of the training sample images and their corresponding sample labels are selected as training data, and the remaining 20% as verification data. The RBF kernel function of the support vector machine is selected, the initial value of the penalty parameter C is set to 1, and the initial value of the kernel coefficient γ (which controls the influence of a single training sample) is set to 0.05; the support vector machine is then trained iteratively on the training data, updating the values of C and γ. During training, the performance of the support vector machine model is cross-validated on the verification data, which effectively reduces overfitting and improves the generalization capability of the support vector machine model.
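The training setup above (80/20 split, RBF kernel, C = 1, γ = 0.05, labels 1 = normal and 0 = defect) can be sketched with scikit-learn; the feature matrices here are synthetic stand-ins, since the patent's actual data is not available:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic stand-ins for normalized combined feature matrices (assumed data):
normal = rng.normal(loc=0.0, scale=0.5, size=(100, 16))  # label 1 = normal
defect = rng.normal(loc=3.0, scale=0.5, size=(100, 16))  # label 0 = defect
X = np.vstack([normal, defect])
y = np.array([1] * 100 + [0] * 100)

# 80% training / 20% verification split, as in the patent
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

# RBF kernel with the initial parameter values stated in the embodiment
clf = SVC(kernel="rbf", C=1.0, gamma=0.05)
clf.fit(X_tr, y_tr)
val_acc = clf.score(X_val, y_val)  # held-out accuracy on the verification data
```

In practice C and γ would then be tuned iteratively (e.g. via grid search with cross-validation) rather than kept at their initial values.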
S6: and detecting the zipper sample image to be detected according to the classifier model, and marking the detection result.
Specifically, the method comprises the following steps:
s61: acquiring and marking edge lines of the upper side and the lower side of a zipper cloth belt in a zipper sample image to be detected, and equally dividing the edge lines into a plurality of sections;
s62: defining a sliding window, enabling the sliding window to slide along one side edge line of the zipper cloth belt according to a step length L, and obtaining a sliding window partial block image;
in this embodiment, the size of the sliding window is 100×125, and since the edges of the zipper cloth tape are not flush, the zipper cloth tape is divided into a plurality of small sections according to the size of the sliding window, so that the measurement error in the detection process can be reduced, and the sliding step length L of the sliding window is 100. And calculating the proportion of the zipper cloth belt image in the sliding window when the sliding window is moved every time, so that the sliding window can contain the complete zipper cloth belt image.
S63: processing the sliding window partial block image to obtain a combined feature matrix thereof;
in this embodiment, histogram equalization processing is performed on the sliding window partial block image, a GLCM feature matrix and an LBP feature matrix are obtained through calculation, and a combination feature matrix is obtained through combination of the GLCM feature matrix and the LBP feature matrix.
S64: inputting the combined feature matrix of the sliding window partial block image into the classifier model for detection, and marking the defect position by using a rectangular frame according to the detection result;
s65: after one side of the zipper cloth belt in the zipper sample image to be detected is detected, the zipper sample image to be detected is turned over along the horizontal central axis, and the steps S62-S64 are repeated to detect the other side of the zipper cloth belt in the zipper sample image to be detected.
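The sliding-window scan of steps S62–S63, with the embodiment's 100×125 window and step length L = 100, can be sketched as follows (function names and the horizontal-edge assumption are illustrative):

```python
import numpy as np

def window_origins(edge_len: int, win_w: int = 100, step: int = 100):
    """Top-left x-coordinates of sliding windows along one edge of the tape."""
    return list(range(0, edge_len - win_w + 1, step))

def extract_windows(img: np.ndarray, y0: int,
                    win_h: int = 125, win_w: int = 100, step: int = 100):
    """Crop sliding-window block images along a horizontal edge line at row y0.
    Each crop would then be equalized, featurized and fed to the classifier."""
    return [img[y0:y0 + win_h, x:x + win_w]
            for x in window_origins(img.shape[1], win_w, step)]
```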
The zipper cloth belt defect detection method of the embodiment combines GLCM and LBP characteristic extraction operators, can accurately classify certain fine gray value changes in textures, and improves the compatibility of images and the detection precision. Before extracting the feature matrix, histogram equalization processing is performed, so that the gray values of the images are uniformly distributed, the detection method of the embodiment can cope with zippers with various colors, and the classification accuracy is improved. In addition, in the training process of the support vector machine model, the performance of the model is verified by adopting a cross verification method, so that the occurrence of the over-fitting phenomenon is effectively reduced, and the generalization capability of the support vector machine model is improved.
Referring to figs. 2-5, these are zipper sample images to be detected, of different defect types, provided by the embodiment of the invention: fig. 2 shows a notch in the zipper tape, fig. 3 a damaged zipper tape, fig. 4 a stained zipper tape, and fig. 5 a leakage defect of the zipper tape. The images in figs. 2-5 were detected with the zipper cloth belt defect detection method of this embodiment; the detection results are shown in figs. 6-9, where fig. 6 is the detection result for fig. 2, fig. 7 for fig. 3, fig. 8 for fig. 4, and fig. 9 for fig. 5.
The foregoing is a further detailed description of the invention in connection with the preferred embodiments, and it is not intended that the invention be limited to the specific embodiments described. It will be apparent to those skilled in the art that several simple deductions or substitutions may be made without departing from the spirit of the invention, and these should be considered to be within the scope of the invention.
Claims (6)
1. The method for detecting the defects of the zipper cloth belt is characterized by comprising the following steps:
S1: acquiring a training sample image, and carrying out histogram equalization processing on the training sample image to obtain an equalized image;
S2: calculating a feature matrix of the gray level co-occurrence matrix according to the equalized image;
S3: calculating a feature matrix of a local binary pattern according to the equalized image;
S4: combining the feature matrix of the gray level co-occurrence matrix with the feature matrix of the local binary pattern to obtain a combined feature matrix, and normalizing the combined feature matrix to obtain a normalized combined feature matrix;
S5: training a support vector machine model according to the normalized combined feature matrix to obtain a classifier model;
S6: detecting a zipper sample image to be detected according to the classifier model, and marking the detection result; the step S6 comprises the following steps:
S61: acquiring and marking edge lines of the upper and lower sides of the zipper cloth belt in the zipper sample image to be detected, and equally dividing the edge lines into a plurality of sections;
S62: defining a sliding window, and sliding the sliding window along one edge line of the zipper cloth belt with a step length L to obtain a sliding-window block image;
S63: processing the sliding-window block image to obtain its combined feature matrix;
S64: inputting the combined feature matrix of the sliding-window block image into the classifier model for detection, and marking the defect position with a rectangular frame according to the detection result;
S65: after one side of the zipper cloth belt in the zipper sample image to be detected has been detected, flipping the image along its horizontal central axis and repeating steps S62-S64 to detect the other side of the zipper cloth belt.
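Step S4 above (combining the two feature matrices and normalizing the result) can be sketched as follows. The claim does not specify the normalization scheme, so min-max scaling to [0, 1] and the `combined_feature` helper are assumptions for illustration only.

```python
import numpy as np

def combined_feature(glcm_feats, lbp_feats):
    """S4: concatenate the GLCM and LBP feature matrices into one vector and
    normalize it (min-max scaling is an assumed choice of normalization)."""
    f = np.concatenate([np.ravel(glcm_feats), np.ravel(lbp_feats)]).astype(float)
    lo, hi = f.min(), f.max()
    return (f - lo) / (hi - lo) if hi > lo else np.zeros_like(f)

feat = combined_feature(np.array([1.0, 3.0]), np.array([2.0, 5.0]))
```

The resulting vector is what S5 feeds to the support vector machine.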
2. The method according to claim 1, wherein S1 comprises:
performing histogram equalization processing on the training sample image according to a histogram equalization calculation formula to obtain an equalized image, the histogram equalization calculation formula being

S_k = T(r_k) = Σ_{j=0}^{k} n_j / n,

wherein S_k represents the gray level of a pixel in the equalized image, T(r_k) represents the gray level transformation function, r_k represents the k-th gray level in the training sample image, k = 0, 1, 2, …, n_j represents the number of pixels with gray level r_j in the training sample image, and n represents the total number of pixels in the training sample image.
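The formula above maps each gray level through the normalized cumulative histogram into [0, 1]; a practical 8-bit implementation additionally rescales S_k to [0, 255]. A minimal NumPy sketch under that assumption:

```python
import numpy as np

def equalize(img, levels=256):
    """S_k = T(r_k) = sum_{j<=k} n_j / n, rescaled to the full 8-bit gray range."""
    hist = np.bincount(img.ravel(), minlength=levels)    # n_j: count of gray level r_j
    cdf = np.cumsum(hist) / img.size                     # T(r_k): cumulative proportion
    lut = np.round(cdf * (levels - 1)).astype(np.uint8)  # new gray level s_k for each r_k
    return lut[img]

img = np.array([[52, 55, 61], [59, 79, 61], [85, 170, 255]], dtype=np.uint8)
eq = equalize(img)
```

After the mapping, the gray levels of the training sample spread over the full range, which is what makes the later feature extraction color-independent.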
3. The method according to claim 1, wherein S2 comprises:
S21: selecting parameters of the gray level co-occurrence matrix, the parameters comprising a step d and a scanning direction, where the step d represents the distance between two pixel points in the equalized image and the scanning direction comprises 0 degrees, 45 degrees, 90 degrees and 135 degrees;
S22: carrying out gray level compression on the equalized image to obtain a gray-level-compressed image;
S23: calculating the gray level co-occurrence matrix according to its parameters and the gray-level-compressed image, wherein the value in the i-th row and j-th column of the matrix represents, for a given scanning direction, the frequency with which a pixel point with gray value i and a pixel point with gray value j occur at a step of d apart in the gray-level-compressed image;
S24: calculating the feature matrix of the gray level co-occurrence matrix according to the gray level co-occurrence matrix.
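Steps S21-S23 can be sketched directly in NumPy. The tiny 4-level image below stands in for a gray-level-compressed image (S22), and the direction-to-offset mapping is one conventional choice, not prescribed by the claim.

```python
import numpy as np

def glcm(img, d, direction, levels):
    """Gray level co-occurrence matrix: entry (i, j) counts pixel pairs whose
    gray values are i and j at a step of d apart in the scanning direction."""
    offsets = {'0': (0, d), '45': (-d, d), '90': (-d, 0), '135': (-d, -d)}
    dr, dc = offsets[direction]
    P = np.zeros((levels, levels), dtype=np.int64)
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                P[img[r, c], img[r2, c2]] += 1
    return P

# a tiny image standing in for a 4-level gray-level-compressed image (S22)
img = np.array([[0, 0, 1], [1, 2, 2], [2, 2, 3]])
P = glcm(img, d=1, direction='0', levels=4)
```

Matrices for the four scanning directions are usually computed separately and their features combined.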
4. A method according to claim 3, wherein S24 comprises:
S241: respectively calculating a contrast feature matrix, a correlation feature matrix and a homogeneity feature matrix according to the gray level co-occurrence matrix, wherein

the calculation formula of the contrast is Contrast = Σ_i Σ_j (i - j)^2 P(i, j),

the calculation formula of the correlation is Correlation = Σ_i Σ_j (i - μ_i)(j - μ_j) P(i, j) / (σ_i σ_j),

the calculation formula of the homogeneity is Homogeneity = Σ_i Σ_j P(i, j) / (1 + (i - j)^2),

where i and j denote the gray values of the two pixel points of a pair, P(i, j) denotes the frequency of occurrence of the combination of a pixel point with gray value i and a pixel point with gray value j, μ_i and μ_j denote the means of the marginal distributions of P(i, j) over i and j respectively, and σ_i and σ_j denote the corresponding standard deviations.
S242: combining the contrast feature matrix, the correlation feature matrix and the homogeneity feature matrix to obtain the feature matrix of the gray level co-occurrence matrix.
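Given a gray level co-occurrence matrix, the three statistics of S241 can be computed as below; the matrix is first normalized so that P(i, j) is a joint frequency, and μ and σ are taken over its marginal distributions.

```python
import numpy as np

def glcm_features(P):
    """Contrast, correlation and homogeneity of a co-occurrence matrix P,
    computed after normalizing P into joint frequencies."""
    P = P / P.sum()
    i, j = np.indices(P.shape)
    mu_i, mu_j = (i * P).sum(), (j * P).sum()          # marginal means
    sigma_i = np.sqrt((((i - mu_i) ** 2) * P).sum())   # marginal standard deviations
    sigma_j = np.sqrt((((j - mu_j) ** 2) * P).sum())
    return {
        'contrast': (((i - j) ** 2) * P).sum(),
        'correlation': ((i - mu_i) * (j - mu_j) * P).sum() / (sigma_i * sigma_j),
        'homogeneity': (P / (1.0 + (i - j) ** 2)).sum(),
    }

# a perfectly diagonal matrix: only pairs of identical gray values occur
f = glcm_features(np.eye(4))
```

A diagonal matrix gives zero contrast, maximal homogeneity and correlation 1, matching the intuition that identical pixel pairs carry no texture variation.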
5. The method according to claim 1, wherein S3 comprises:
S31: dividing the equalized image into a plurality of N×N sub-image blocks and calculating the LBP feature value of each pixel in each sub-image block, the LBP feature value being calculated as

LBP(x_c, y_c) = Σ_{p=0}^{P-1} s(i_p - i_c) · 2^p, with s(x) = 1 if x ≥ 0 and s(x) = 0 otherwise,

wherein (x_c, y_c) represents the center pixel point of the neighborhood, i_c represents the gray value of the center pixel, i_p represents the gray value of the p-th surrounding pixel point, and P represents the number of surrounding pixel points;
S32: obtaining an LBP feature value histogram of the sub-image block according to the LBP feature values of the sub-image block;
S33: normalizing the LBP feature value histogram of the sub-image block to obtain a normalized histogram of the sub-image block;
S34: connecting the normalized histograms of all the sub-image blocks to obtain the feature matrix of the local binary pattern.
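Steps S31-S34 can be sketched as follows. The basic 8-neighbour LBP with radius 1 is assumed, and the `lbp_image`/`lbp_feature` helpers are illustrative names, not part of the claim.

```python
import numpy as np

def lbp_image(block):
    """S31: 8-neighbour LBP value for every interior pixel of a block."""
    h, w = block.shape
    centre = block[1:h - 1, 1:w - 1]
    out = np.zeros(centre.shape, dtype=np.int64)
    # clockwise neighbour offsets; bit p corresponds to the p-th neighbour
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    for p, (dr, dc) in enumerate(offsets):
        neighbour = block[1 + dr:h - 1 + dr, 1 + dc:w - 1 + dc]
        out += (neighbour >= centre).astype(np.int64) * (2 ** p)  # s(i_p - i_c) * 2^p
    return out

def lbp_feature(img, n):
    """S31-S34: per n x n block, build and normalize the LBP histogram (S32, S33),
    then connect all block histograms into one feature vector (S34)."""
    feats = []
    for r in range(0, img.shape[0] - n + 1, n):
        for c in range(0, img.shape[1] - n + 1, n):
            hist = np.bincount(lbp_image(img[r:r + n, c:c + n]).ravel(), minlength=256)
            feats.append(hist / hist.sum())
    return np.concatenate(feats)
```

On a flat block every neighbour satisfies i_p ≥ i_c, so all eight bits are set and the LBP value is 255.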
6. The method according to claim 1, wherein S5 comprises:
S51: selecting 80% of the normalized combined feature matrices of the training sample images and the corresponding sample labels as training data, and training the support vector machine model;
S52: selecting the remaining 20% of the normalized combined feature matrices of the training sample images and the corresponding sample labels as verification data, and cross-validating the trained support vector machine model to obtain the classifier model.
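Steps S51-S52 can be sketched with scikit-learn. The features and labels below are random stand-ins for the normalized combined feature matrices and defect labels, and the RBF kernel and 5-fold cross validation are assumptions the claim leaves open.

```python
import numpy as np
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))  # stand-in normalized combined feature vectors
y = (X[:, 0] > 0).astype(int)   # stand-in defect / no-defect labels

# S51: 80% of the samples train the support vector machine model
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2, random_state=0)
clf = SVC(kernel='rbf').fit(X_tr, y_tr)

# S52: the remaining 20% serve as verification data; cross validation on the
# training split checks for overfitting before accepting the classifier model
scores = cross_val_score(SVC(kernel='rbf'), X_tr, y_tr, cv=5)
val_acc = clf.score(X_val, y_val)
```

Holding out a validation split and cross-validating the training data is what gives the claimed reduction in overfitting and improved generalization.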
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910859700.9A CN110781913B (en) | 2019-09-11 | 2019-09-11 | Zipper cloth belt defect detection method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110781913A CN110781913A (en) | 2020-02-11 |
CN110781913B true CN110781913B (en) | 2023-05-30 |
Family
ID=69383480
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910859700.9A Active CN110781913B (en) | 2019-09-11 | 2019-09-11 | Zipper cloth belt defect detection method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110781913B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7463186B2 (en) | 2020-05-26 | 2024-04-08 | キヤノン株式会社 | Information processing device, information processing method, and program |
CN111783885A (en) * | 2020-07-01 | 2020-10-16 | 中国电子科技集团公司第三十八研究所 | Millimeter wave image quality classification model construction method based on local enhancement |
CN114612385B (en) * | 2022-01-31 | 2023-06-13 | 南通市通州区锦都拉链有限公司 | Zipper selvedge defect identification method based on template matching |
CN115100201B (en) * | 2022-08-25 | 2022-11-11 | 淄博齐华制衣有限公司 | Blending defect detection method of flame-retardant fiber material |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150016668A1 (en) * | 2013-07-12 | 2015-01-15 | Ut-Battelle, Llc | Settlement mapping systems |
CN106097327B (en) * | 2016-06-06 | 2018-11-02 | 宁波大学 | In conjunction with the objective evaluation method for quality of stereo images of manifold feature and binocular characteristic |
CN108960088A (en) * | 2018-06-20 | 2018-12-07 | 天津大学 | The detection of facial living body characteristics, the recognition methods of specific environment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||