CN109035195B - Fabric defect detection method - Google Patents

Fabric defect detection method

Info

Publication number: CN109035195B
Authority: CN (China)
Prior art keywords: value, fabric, image, calculating, picture
Application number: CN201810433675.3A
Other languages: Chinese (zh)
Other versions: CN109035195A
Inventors: 胡峰, 徐启永, 王传桐, 吴雨川
Current Assignee: Wuhan Textile University
Original Assignee: Wuhan Textile University
Application filed by Wuhan Textile University (priority/filing date: 2018-05-08)
Publication of CN109035195A: 2018-12-18
Application granted; publication of CN109035195B: 2021-11-30
Legal status: Expired - Fee Related

Classifications

    • G06T7/0004 Industrial image inspection
    • G01N21/8851 Scan or image signal processing specially adapted for investigating the presence of flaws or contamination, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G06T7/187 Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T7/90 Determination of colour characteristics
    • G06T2207/20024 Filtering details (indexing scheme for image analysis or image enhancement; special algorithmic details)
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30124 Fabrics; Textile; Paper

Abstract

A fabric defect detection method comprising the following steps: first, calibrating the pixel size; second, filtering the image; third, calculating the variance ratio; fourth, drawing the h-MBSF curve; fifth, obtaining a straight-line fitting equation; sixth, calculating the optimal filtering parameter h; seventh, performing saliency and normalization processing; eighth, calculating the segmentation threshold; ninth, performing defect segmentation; tenth, calculating the defect area; eleventh, eliminating false defects. The design can detect low-contrast pictures, has strong universality, can effectively eliminate false defects, and further improves the accuracy of defect detection.

Description

Fabric defect detection method
Technical Field
The invention relates to a fabric defect detection method, which is particularly suitable for improving the defect recognition rate and the universality of the frequency-tuned saliency algorithm.
Background
The accuracy of manual fabric defect inspection is only 60-70 percent, which hardly meets the requirements of industrial production. To improve the precision of fabric defect detection, researchers have proposed many detection methods, which can be divided into three main categories: statistical methods, spectral methods and model-based methods.
Statistical defect detection methods are simple and easy to implement, but their results are easily affected by the fabric texture, the defect shape and other factors, and small defect regions may be missed. Spectral methods can detect defect regions well when a suitable filter bank is chosen, but they require similar defect samples for learning when the filter parameters are selected, which is unfavorable for industrial inspection. Model-based methods build a model from the texture of similar defect-free fabric and then use statistical hypothesis testing to judge whether a test image fits the model, thereby detecting fabric defect regions; however, such methods are complex, computationally expensive, difficult to train online and give poor detection results. In recent years, saliency methods have been applied to fabric defect detection and have achieved some research results.
The frequency-tuned saliency algorithm (FT method for short) proposed by Achanta et al. first preprocesses the image with Gaussian low-pass filtering; it then converts the preprocessed image to the Lab color space and, from a frequency-domain point of view, computes the Euclidean distance between each pixel of every color channel and the mean of the whole image; finally, the sum of the Euclidean distances of the three channels is taken as the saliency value of the pixel. The method is simple to compute, the generated saliency map has the same resolution as the original image, and the integrity of the target region is effectively preserved, which benefits the subsequent target segmentation precision. However, when the FT method is applied to fabric defect segmentation it has two drawbacks: defects whose brightness differs little from the fabric background are hard to identify, and the Gaussian filter has weak smoothing and noise-reduction ability and easily blurs defect edges.
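For reference only, the frequency-tuned saliency computation described above can be sketched in Python roughly as follows (an illustrative sketch, not part of the patent; OpenCV and NumPy are assumed, and the function name is illustrative):

    import cv2
    import numpy as np

    def ft_saliency(bgr_image):
        """Frequency-tuned saliency: Gaussian low-pass filtering, Lab conversion, then the
        Euclidean distance of each pixel's Lab value to the mean Lab value of the image."""
        blurred = cv2.GaussianBlur(bgr_image, (5, 5), 0)           # Gaussian pre-filtering
        lab = cv2.cvtColor(blurred, cv2.COLOR_BGR2LAB).astype(np.float64)
        mean_lab = lab.reshape(-1, 3).mean(axis=0)                 # image-wide mean of L, a, b
        return np.sqrt(((lab - mean_lab) ** 2).sum(axis=2))        # per-pixel saliency value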
The improved frequency-tuned saliency algorithm replaces the Gaussian filter of the FT method with an optimal non-local mean (NLM) filter and then segments the fabric image with the Otsu method. This effectively alleviates the problems of the FT algorithm and can segment fabric defect regions quickly and accurately. However, when the filtering parameter h of the NLM filter is optimized using the average maximum between-class variance, defect samples are needed for learning and a certain contrast between the defect and the background is required; and when the Otsu method is used for segmentation, the defect target and the background must be bimodal in the gray-level histogram, so fabric images whose gray-level histogram is unimodal are hard to detect, which is unfavorable for industrial inspection.
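For illustration, the Otsu segmentation step that the improved algorithm relies on could be performed with OpenCV as sketched below; the snippet and its names are assumptions, not part of the patented method:

    import cv2
    import numpy as np

    def otsu_segment(saliency_u8):
        """Otsu's method picks the threshold maximizing the between-class variance of the
        gray-level histogram; it presupposes a roughly bimodal histogram (defect vs. background)."""
        otsu_t, binary = cv2.threshold(saliency_u8, 0, 255,
                                       cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        return otsu_t, binary // 255          # threshold value and 0/1 defect mask

    # for a unimodal histogram (low-contrast defects) the returned threshold is unreliable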
Disclosure of Invention
The invention aims to solve the problems of low defect detection accuracy and narrow application range in the prior art, and provides a fabric defect detection method with high detection accuracy and a wide application range.
In order to achieve the above purpose, the technical solution of the invention is as follows:
a fabric defect detection method comprising the steps of:
the first step: calibrating the image pixel size, and calculating the size and the area of a single pixel on each picture;
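For illustration only, a minimal sketch of this calibration under the assumption that the physical field of view of the camera is known; the numeric values and variable names below are examples, not values taken from the patent:

    # Pixel-size calibration: physical field of view divided by image resolution.
    fov_width_mm, fov_height_mm = 100.0, 75.0     # assumed field of view of the camera
    img_width_px, img_height_px = 1280, 960       # assumed image resolution

    pixel_w_mm = fov_width_mm / img_width_px      # width of one pixel in mm
    pixel_h_mm = fov_height_mm / img_height_px    # height of one pixel in mm
    pixel_area_mm2 = pixel_w_mm * pixel_h_mm      # area of one pixel, compared later against M = 0.5 mm x 0.5 mm
    print(f"single-pixel area: {pixel_area_mm2:.6f} mm^2")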
the second step: filtering the image, namely taking m normal fabric pictures of the type to be detected, the pictures being in RGB color space; let the image be f = {f(i) | i ∈ I}, where i is a pixel point and I is the search window;
let the filtering parameter h take the values 1, 2, 3, …, 50 in turn, and apply non-local mean filtering to each normal fabric picture using equations (1) and (2):
f_N(i) = Σ_{j∈I} ω(i,j) · f(j)   (1)

ω(i,j) = exp(−‖f(N(i)) − f(N(j))‖²_{2,α} / h²) / Σ_{k∈I} exp(−‖f(N(i)) − f(N(k))‖²_{2,α} / h²)   (2)

in the above formulas: f_N(i) is the gray value of pixel i after non-local mean filtering, f(j) is the gray value of pixel j in the search window, ω(i,j) is the weight assigned to the pixel pair (i, j) in the weighted average, ‖f(N(i)) − f(N(j))‖²_{2,α} is the squared weighted Euclidean distance between the rectangular similarity windows N(i) and N(j) centred on pixels i and j, and α is the standard deviation of the Gaussian kernel, with α = 1;
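For illustration only, equations (1) and (2) can be implemented directly (and without any optimisation) as in the following sketch; the patch and search window half-sizes are assumptions, since the patent does not state them, and in practice a library routine such as OpenCV's fastNlMeansDenoising would normally be used instead:

    import numpy as np

    def nlm_filter(channel, h, patch=3, search=10, alpha=1.0):
        """Direct (very slow) non-local mean filtering of one image channel per equations (1)-(2).
        patch: half-size of the similarity window N(i); search: half-size of the search window I."""
        img = channel.astype(np.float64)
        padded = np.pad(img, patch, mode="reflect")
        rows, cols = img.shape
        # Gaussian weighting of the similarity window (standard deviation alpha)
        ax = np.arange(-patch, patch + 1)
        gx, gy = np.meshgrid(ax, ax)
        kernel = np.exp(-(gx**2 + gy**2) / (2 * alpha**2))
        kernel /= kernel.sum()
        out = np.zeros_like(img)
        for r in range(rows):
            for c in range(cols):
                p_ref = padded[r:r + 2*patch + 1, c:c + 2*patch + 1]
                weights, values = [], []
                for rr in range(max(0, r - search), min(rows, r + search + 1)):
                    for cc in range(max(0, c - search), min(cols, c + search + 1)):
                        p_cmp = padded[rr:rr + 2*patch + 1, cc:cc + 2*patch + 1]
                        d2 = np.sum(kernel * (p_ref - p_cmp) ** 2)   # weighted squared Euclidean distance
                        weights.append(np.exp(-d2 / h**2))           # unnormalised weight, equation (2)
                        values.append(img[rr, cc])
                w = np.array(weights)
                out[r, c] = np.dot(w / w.sum(), np.array(values))    # normalised weighted average, equation (1)
        return out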
the third step: calculating the variance ratio, namely calculating for each normal fabric picture, according to equation (3), the gray-value variance σ_in before filtering and the gray-value variance σ_out after filtering:

σ = (1/n) · Σ_{i=1}^{n} (x_i − μ)²   (3)

in formula (3): σ is the gray-value variance of the normal fabric picture, x_i is the gray value of the i-th pixel in the picture, μ is the mean gray value of the picture, and n is the number of gray values in the picture;
the ratio of the variance after filtering to the variance before filtering is then calculated according to equation (4):

BSF = σ_out / σ_in   (4)

each picture has R, G and B color channels, so each picture yields three variance ratios, one per channel, denoted BSF_R, BSF_G and BSF_B;
The fourth step: drawing an h-MBSF curve, and calculating the variance ratio BSF of the three color channels according to the third stepR,BSFGAnd BSFBAnd counting the corresponding average variance ratio under different filtering parameter h values according to the formula (5):
Figure GDA0003242582730000032
drawing a curve graph of the average variance ratio and the filtering parameter h, namely an h-MBSF curve graph, wherein m is the number of normal fabric pictures;
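For illustration, steps three and four might be sketched as below; OpenCV's built-in non-local means routine is used purely for speed (its weighting differs in detail from equations (1)-(2)), and the direction of the variance ratio (filtered over unfiltered) is inferred from the descending h-MBSF curve described later, so both are assumptions rather than the patent's exact procedure:

    import cv2
    import numpy as np

    def mbsf_curve(normal_pictures, h_values=range(1, 51)):
        """Average variance ratio MBSF for each filtering parameter h (equations (3)-(5)),
        computed over the R, G and B channels of all m normal fabric pictures."""
        curve = []
        for h in h_values:
            ratios = []
            for img in normal_pictures:                                 # img: BGR uint8 array
                filtered = cv2.fastNlMeansDenoisingColored(img, None, float(h), float(h), 7, 21)
                for ch in range(3):                                     # per-channel ratio, equation (4)
                    var_in = img[:, :, ch].astype(np.float64).var()     # variance before filtering
                    var_out = filtered[:, :, ch].astype(np.float64).var()  # variance after filtering
                    ratios.append(var_out / var_in)                     # assumed ratio direction
            curve.append(float(np.mean(ratios)))                        # equation (5): mean over 3*m ratios
        return list(h_values), curve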
the fifth step: obtaining a straight-line fitting equation, namely fitting the straight descending segment of the h-MBSF curve by the least-squares method to obtain the line-fitting equation:
A×h+B×MBSF+C=0 (6)
wherein A, B and C are constants;
the sixth step: calculating the optimal filtering parameter h, namely computing the perpendicular distance d from each point of the transition segment of the h-MBSF curve to the straight line fitted in the fifth step:

d = |A×h + B×MBSF + C| / √(A² + B²)   (7)

among the points whose d lies in the range 0.2-0.8 and which satisfy A×h + B×MBSF + C > 0, the filtering parameter h of the point with the smallest d is selected as the optimal value;
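For illustration, steps five and six might look like the following sketch; the index ranges marking the straight descending segment and the transition segment must be read off the h-MBSF curve and are given here only as assumed examples, and the sign convention of A, B, C is likewise an assumption:

    import numpy as np

    def optimal_h(h_vals, mbsf_vals, straight_idx, transition_idx):
        """Fit the straight descending segment of the h-MBSF curve with a line
        A*h + B*MBSF + C = 0 (equation (6)), then pick, within the transition segment,
        the point whose perpendicular distance d (equation (7)) is smallest, lies in
        [0.2, 0.8] and satisfies A*h + B*MBSF + C > 0."""
        h = np.asarray(h_vals, dtype=float)
        y = np.asarray(mbsf_vals, dtype=float)
        k, b = np.polyfit(h[straight_idx], y[straight_idx], 1)   # least-squares line MBSF = k*h + b
        # sign convention chosen so that points lying above the fitted line give a positive value,
        # which is one possible reading of the ">0" condition (an assumption)
        A, B, C = -k, 1.0, -b
        best_h, best_d = None, np.inf
        for i in transition_idx:
            signed = A * h[i] + B * y[i] + C
            d = abs(signed) / np.hypot(A, B)                     # perpendicular point-to-line distance
            if 0.2 <= d <= 0.8 and signed > 0 and d < best_d:
                best_h, best_d = h[i], d
        return best_h, (A, B, C)

    # assumed example: straight descending segment at h = 3..10, transition segment at h = 11..20
    # h_opt, line = optimal_h(hs, mbsf, slice(2, 10), range(10, 20))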
the seventh step: performing saliency and normalization processing, namely converting the filtered normal fabric picture from the second step to the Lab color space and calculating the saliency value S as follows:

S = ‖I_u − I_NLM‖   (8)

in formula (8): I_u is the arithmetic mean of the fabric image pixels in the Lab color space, I_NLM is the non-local-mean-filtered image, and ‖·‖ denotes the Euclidean distance;
the saliency value S is normalized as follows:

G = round(255 × (S − S_min) / (S_max − S_min))   (9)

that is, the saliency value S is normalized to the range 0-255 and rounded, giving the normalized saliency value G, where S_min and S_max are the minimum and maximum saliency values;
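For illustration, the seventh step might be sketched as follows, reading the normalisation of equation (9) as a min-max scaling to 0-255, which is one plausible interpretation of the text:

    import cv2
    import numpy as np

    def saliency_map(filtered_bgr):
        """Equations (8)-(9): saliency as the Lab-space Euclidean distance of each pixel
        to the image mean, then min-max normalisation to integers in 0..255."""
        lab = cv2.cvtColor(filtered_bgr, cv2.COLOR_BGR2LAB).astype(np.float64)
        mean_lab = lab.reshape(-1, 3).mean(axis=0)                 # I_u, the arithmetic mean
        s = np.sqrt(((lab - mean_lab) ** 2).sum(axis=2))           # S = ||I_u - I_NLM||
        g = np.rint(255.0 * (s - s.min()) / (s.max() - s.min() + 1e-12))
        return g.astype(np.uint8)                                  # normalised saliency G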
the eighth step: calculating the segmentation threshold, namely arranging the saliency values in ascending order and, since the saliency values of the fabric saliency map obey a normal distribution, taking the saliency value at which the cumulative probability is not less than 95% as the segmentation threshold T_q, so that each fabric picture yields its own threshold; finally, the median of the T_q values is taken as the segmentation threshold T of the defect image:

T = median{T_1, T_2, …, T_m}   (10)
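For illustration, the eighth step might be sketched as below, taking the 95th percentile of each normal picture's saliency values as T_q; this reading of "probability not less than 95%" is an assumption:

    import numpy as np

    def segmentation_threshold(saliency_maps):
        """T_q = 95th percentile of each normal picture's normalised saliency values;
        T = median of the T_q values (equation (10))."""
        t_q = [np.percentile(g.ravel(), 95) for g in saliency_maps]
        return float(np.median(t_q))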
the ninth step: performing defect segmentation, namely filtering the picture to be detected according to equations (1) and (2) using the optimal filtering parameter h obtained in the sixth step, applying the saliency and normalization processing of the seventh step to the filtered picture, and then segmenting the defects according to equation (11) using the threshold T obtained in the eighth step:

B(i) = 1 if G(i) ≥ T, and B(i) = 0 otherwise   (11)

where G(i) in equation (11) is the normalized saliency value of the i-th pixel obtained in the seventh step and B is the resulting binary image;
the tenth step: calculating the defect area, namely counting the number P of pixels in each eight-connected region of pixel value 1 in the binary image and multiplying P by the single-pixel area obtained in the first step to obtain the area of the defect region;
the eleventh step: eliminating false defects, namely removing false defects using the area threshold M and equation (12) to obtain the final segmented image F:

F(i) = B(i) if the area of the eight-connected region containing pixel i is larger than M, and F(i) = 0 otherwise   (12)

the threshold M is 0.5 mm × 0.5 mm; if all gray values in the segmented image are 0, the fabric image is considered to have no defects; otherwise the image is considered to contain defects, and the defect segmentation result is obtained: regions with gray value 1 in the binary image are defect regions and regions with gray value 0 are normal fabric; defect detection is then complete.
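For illustration, steps nine to eleven can be combined as in the following sketch, which uses SciPy's 8-connected component labelling; the default minimum area corresponds to M = 0.5 mm × 0.5 mm = 0.25 mm², and the commented usage line is an assumption:

    import numpy as np
    from scipy import ndimage

    def segment_defects(g, threshold_t, pixel_area_mm2, min_area_mm2=0.25):
        """Equations (11)-(12): threshold the normalised saliency map, label 8-connected
        regions, and zero out regions whose physical area does not exceed M."""
        binary = (g >= threshold_t).astype(np.uint8)                # equation (11)
        structure = np.ones((3, 3), dtype=int)                      # 8-connectivity
        labels, count = ndimage.label(binary, structure=structure)
        final = np.zeros_like(binary)
        for region in range(1, count + 1):
            p = int((labels == region).sum())                       # P pixels in this region
            if p * pixel_area_mm2 > min_area_mm2:                   # keep only true defects, equation (12)
                final[labels == region] = 1
        return final                                                # all zeros means no defect

    # defect_free = not segment_defects(G, T, pixel_area_mm2).any()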
Compared with the prior art, the invention has the beneficial effects that:
1. In the fabric defect detection method, the variance ratio of normal fabric before and after filtering is used as the optimization criterion for the filtering parameter h, so the optimal filtering parameter h can be obtained accurately and defects can be detected well even in low-contrast pictures, which improves the accuracy of the detection method. The method can therefore handle low-contrast pictures and has strong universality and high detection accuracy.
2. In the fabric defect detection method, the filtering parameter h is optimized and the image to be detected is filtered with it, giving a fabric image that segments well; the defects are then segmented, which effectively improves segmentation accuracy. The method therefore has high segmentation accuracy and high defect detection reliability.
3. In the fabric defect detection method, the area of each defect region is obtained by counting the number of connected pixels in the eight-connected regions with pixel value 1 in the binary image, so false defects are eliminated and the accuracy of defect detection is further improved. The invention can therefore effectively eliminate false defects and further improve the accuracy of defect detection.
Drawings
Fig. 1 is a graph of the variance ratios of the R, G and B color channels versus the filtering parameter h in the third step of the present invention.
Fig. 2 is a graph of the h-MBSF curve in the fourth step of the present invention.
Fig. 3 compares the defect detection results of the method of the present invention with those of the improved frequency-tuned saliency algorithm.
Fig. 4 is a schematic view of the manner in which pictures of the fabric of the present invention are taken.
Detailed Description
The present invention will be described in further detail with reference to the following description and embodiments in conjunction with the accompanying drawings.
Referring to fig. 1 to 2, the fabric defect detection method comprises the first to eleventh steps set forth above under Disclosure of Invention; they are not repeated here.
The principle of the invention is illustrated as follows:
referring to fig. 4, to improve the contrast of the spot area with the background area. In the fabric image acquisition process, the light source and the camera are respectively arranged on two sides of the fabric for image acquisition. The contrast of the defect area is improved by utilizing the difference of the light transmittance of the defect and the background area.
Referring to fig. 1, as the filtering parameter h increases the variance ratio BSF varies in almost the same way in the R, G and B color channels, and the curve can be divided into four stages: a slow descending segment, a straight descending segment, a transition segment and a slowly-varying segment. After h exceeds a certain critical value, the smoothing effect of the non-local mean filter on the image becomes more and more limited and the distribution range of the image gray values tends to stabilize.
Referring to fig. 3, the improved frequency-tuned saliency algorithm can accurately identify defects such as thick warps, slubs, hanging warps, knots, broken wefts, thread ends, oil stains and holes, but its ability to identify low-contrast defects such as side band wefts, doffing wefts and thin wefts is insufficient, and it produces a large number of false defect regions. The variance of a fabric image reflects the distribution range of its gray values: the smaller the variance, the smaller the gray-value differences between pixels. Using the variance ratio of normal fabric before and after filtering as the optimization criterion for the filtering parameter h therefore allows the optimal h to be obtained accurately and effectively solves the poor segmentation of fabrics with side band wefts, doffing wefts and thin wefts by the improved frequency-tuned saliency algorithm.
When filtering the fabric image it is found that the filtering parameter h controls the degree of filtering by influencing the magnitude of the weight ω(i,j). If h is too small, the noise is not filtered out thoroughly and false detections easily occur; if h is too large, the image is over-smoothed, which harms defect segmentation precision and easily causes missed detections.
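This behaviour can be illustrated with a toy computation (the squared patch distances below are made-up numbers): a small h concentrates almost all weight on the most similar patch, while a large h makes the weights nearly uniform and therefore smooths more strongly.

    import numpy as np

    # assumed squared patch distances between a reference patch and three candidate patches
    d2 = np.array([10.0, 50.0, 200.0])

    for h in (2, 5, 15):
        w = np.exp(-d2 / h**2)
        w /= w.sum()                       # normalisation of the weights from equation (2)
        print(f"h={h:2d}  weights={np.round(w, 3)}")
    # small h: nearly all weight on the closest patch; large h: weights become almost uniform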
Example 1:
A fabric defect detection method is carried out according to the first to eleventh steps set forth above under Disclosure of Invention.

Claims (1)

1. A method of detecting fabric defects, characterized by: the defect detection method comprises the following steps:
the first step: calibrating the image pixel size, and calculating the size and the area of a single pixel on each picture;
the second step: filtering the image, namely taking m normal fabric pictures of the type to be detected, the pictures being in RGB color space; let the image be f = {f(i) | i ∈ I}, where i is a pixel point and I is the search window;
let the filtering parameter h take the values 1, 2, 3, …, 50 in turn, and apply non-local mean filtering to each normal fabric picture using equations (1) and (2):

f_N(i) = Σ_{j∈I} ω(i,j) · f(j)   (1)

ω(i,j) = exp(−‖f(N(i)) − f(N(j))‖²_{2,α} / h²) / Σ_{k∈I} exp(−‖f(N(i)) − f(N(k))‖²_{2,α} / h²)   (2)

in the above formulas: f_N(i) is the gray value of pixel i after non-local mean filtering, f(j) is the gray value of pixel j in the search window, ω(i,j) is the weight assigned to the pixel pair (i, j) in the weighted average, ‖f(N(i)) − f(N(j))‖²_{2,α} is the squared weighted Euclidean distance between the rectangular similarity windows N(i) and N(j) centred on pixels i and j, and α is the standard deviation of the Gaussian kernel, with α = 1;
the third step: calculating the variance ratio, namely calculating for each normal fabric picture, according to equation (3), the gray-value variance σ_in before filtering and the gray-value variance σ_out after filtering:

σ = (1/n) · Σ_{i=1}^{n} (x_i − μ)²   (3)

in formula (3): σ is the gray-value variance of the normal fabric picture, x_i is the gray value of the i-th pixel in the picture, μ is the mean gray value of the picture, and n is the number of gray values in the picture;
the ratio of the variance after filtering to the variance before filtering is calculated according to equation (4):

BSF = σ_out / σ_in   (4)

each picture has R, G and B color channels, so each picture yields three variance ratios, one per channel, denoted BSF_R, BSF_G and BSF_B;
the fourth step: drawing the h-MBSF curve, namely computing, from the per-channel variance ratios BSF_R, BSF_G and BSF_B obtained in the third step, the average variance ratio for each value of the filtering parameter h according to equation (5):

MBSF = (1/(3m)) · Σ_{k=1}^{m} (BSF_R(k) + BSF_G(k) + BSF_B(k))   (5)

where m is the number of normal fabric pictures; a graph of the average variance ratio against the filtering parameter h, namely the h-MBSF curve, is then drawn;
the fifth step: obtaining a straight-line fitting equation, namely fitting the straight descending segment of the h-MBSF curve by the least-squares method to obtain the line-fitting equation:

A×h + B×MBSF + C = 0   (6)

wherein A, B and C are constants;
the sixth step: calculating the optimal filtering parameter h, namely computing the perpendicular distance d from each point of the transition segment of the h-MBSF curve to the straight line fitted in the fifth step:

d = |A×h + B×MBSF + C| / √(A² + B²)   (7)

among the points whose d lies in the range 0.2-0.8 and which satisfy A×h + B×MBSF + C > 0, the filtering parameter h of the point with the smallest d is selected as the optimal value;
the seventh step: performing saliency and normalization processing, namely converting the filtered normal fabric picture from the second step to the Lab color space and calculating the saliency value S as follows:

S = ‖I_u − I_NLM‖   (8)

in formula (8): I_u is the arithmetic mean of the fabric image pixels in the Lab color space, I_NLM is the non-local-mean-filtered image, and ‖·‖ denotes the Euclidean distance;
the saliency value S is normalized as follows:

G = round(255 × (S − S_min) / (S_max − S_min))   (9)

that is, the saliency value S is normalized to the range 0-255 and rounded, giving the normalized saliency value G, where S_min and S_max are the minimum and maximum saliency values;
the eighth step: calculating the segmentation threshold, namely arranging the saliency values in ascending order and, since the saliency values of the fabric saliency map obey a normal distribution, taking the saliency value at which the cumulative probability is not less than 95% as the segmentation threshold T_q, so that each fabric picture yields its own threshold; finally, the median of the T_q values is taken as the segmentation threshold T of the defect image:

T = median{T_1, T_2, …, T_m}   (10)

the ninth step: performing defect segmentation, namely filtering the picture to be detected according to equations (1) and (2) using the optimal filtering parameter h obtained in the sixth step, applying the saliency and normalization processing of the seventh step to the filtered picture, and then segmenting the defects according to equation (11) using the threshold T obtained in the eighth step:

B(i) = 1 if G(i) ≥ T, and B(i) = 0 otherwise   (11)

where G(i) in equation (11) is the normalized saliency value of the i-th pixel obtained in the seventh step and B is the resulting binary image;
the tenth step: calculating the defect area, namely counting the number P of pixels in each eight-connected region of pixel value 1 in the binary image and multiplying P by the single-pixel area obtained in the first step to obtain the area of the defect region;
the eleventh step: eliminating false defects, namely removing false defects using the area threshold M and equation (12) to obtain the final segmented image F:

F(i) = B(i) if the area of the eight-connected region containing pixel i is larger than M, and F(i) = 0 otherwise   (12)

the threshold M is 0.5 mm × 0.5 mm; if all gray values in the segmented image are 0, the fabric image is considered to have no defects; otherwise the image is considered to contain defects, and the defect segmentation result is obtained: regions with gray value 1 in the binary image are defect regions and regions with gray value 0 are normal fabric; defect detection is then complete.
CN201810433675.3A 2018-05-08 2018-05-08 Fabric defect detection method Expired - Fee Related CN109035195B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810433675.3A CN109035195B (en) 2018-05-08 2018-05-08 Fabric defect detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810433675.3A CN109035195B (en) 2018-05-08 2018-05-08 Fabric defect detection method

Publications (2)

Publication Number Publication Date
CN109035195A CN109035195A (en) 2018-12-18
CN109035195B true CN109035195B (en) 2021-11-30

Family

ID=64611468

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810433675.3A Expired - Fee Related CN109035195B (en) 2018-05-08 2018-05-08 Fabric defect detection method

Country Status (1)

Country Link
CN (1) CN109035195B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110021023A (en) * 2019-03-05 2019-07-16 西安工程大学 A kind of electronics cloth defect segmentation method
CN109961437B (en) * 2019-04-04 2021-06-25 江南大学 Method for detecting significant fabric defects based on machine teaching mode
CN110473190B (en) * 2019-08-09 2022-03-04 江南大学 Adaptive fabric defect detection method based on scale
CN111080574A (en) * 2019-11-19 2020-04-28 天津工业大学 Fabric defect detection method based on information entropy and visual attention mechanism
CN110991082B (en) * 2019-12-19 2023-11-28 信利(仁寿)高端显示科技有限公司 Mura quantification method based on excimer laser annealing
CN111861996B (en) * 2020-06-23 2023-11-03 西安工程大学 Printed fabric defect detection method
CN113724241B (en) * 2021-09-09 2022-08-02 常州市宏发纵横新材料科技股份有限公司 Broken filament detection method and device for carbon fiber warp-knitted fabric and storage medium
CN113838038B (en) * 2021-09-28 2022-08-02 常州市宏发纵横新材料科技股份有限公司 Carbon fiber cloth cover defect detection method and device, electronic equipment and storage medium
CN115115615B (en) * 2022-07-26 2022-12-13 南通好心情家用纺织品有限公司 Textile fabric quality evaluation method and system based on image recognition
CN115082460B (en) * 2022-08-18 2022-11-11 聊城市恒丰电子有限公司 Weaving production line quality monitoring method and system
CN115311294B (en) * 2022-10-12 2023-03-24 启东金耀億华玻纤材料有限公司 Glass bottle body flaw identification and detection method based on image processing


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010012381A1 (en) * 1997-09-26 2001-08-09 Hamed Sari-Sarraf Vision-based, on-loom fabric inspection system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102706881A (en) * 2012-03-19 2012-10-03 天津工业大学 Cloth defect detecting method based on machine vision
CN105261003A (en) * 2015-09-10 2016-01-20 西安工程大学 Defect point detection method on basis of self structure of fabric
CN105678767A (en) * 2016-01-07 2016-06-15 无锡信捷电气股份有限公司 SoC software and hardware collaborative design-based cloth surface blemish detection method
CN106872487A (en) * 2017-04-21 2017-06-20 佛山市南海区广工大数控装备协同创新研究院 The surface flaw detecting method and device of a kind of view-based access control model
CN107870172A (en) * 2017-07-06 2018-04-03 黎明职业大学 A kind of Fabric Defects Inspection detection method based on image procossing

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
Fault detection of fabrics using image processing methods; Kazım Yıldız et al.; Pamukkale University Journal of Engineering Sciences; 2017-12-31; vol. 23, no. 7; pp. 841-844 *
Frequency-tuned salient region detection; Radhakrishna Achanta et al.; 2009 IEEE Conference on Computer Vision and Pattern Recognition; 2009-08-18; pp. 1597-1604 *
A non-local means filtering algorithm based on optimized parameters; Zhang Quan et al.; Computer Applications and Software; 2012-03-31; vol. 29, no. 3; pp. 78-81, 138 *
Fabric defect detection based on global and local saliency; Yao Minghai et al.; Journal of Zhejiang University of Technology; 2017-02-28; vol. 45, no. 1; pp. 19-22 *
Fabric defect detection based on bilateral filtering and region growing; Ma Teng et al.; Journal of Beijing Information Science and Technology University; 2017-02-28; vol. 32, no. 1; pp. 82-85, 91 *
Research on a fabric defect detection algorithm based on an improved adaptive threshold; Liu Zhoufeng; Microcomputer & Its Applications; 2013-12-31; vol. 32, no. 10; pp. 38-40, 44 *
Application of an improved frequency-tuned saliency algorithm in defect identification; Wang Chuantong et al.; Journal of Textile Research; 2018-03-31; vol. 39, no. 3; pp. 154-160 *
A new fabric defect detection algorithm based on image salient region features; Zhao Bo et al.; Journal of Computer Applications; 2012-06-30; vol. 32, no. 6; pp. 1574-1577 *

Also Published As

Publication number Publication date
CN109035195A (en) 2018-12-18

Similar Documents

Publication Publication Date Title
CN109035195B (en) Fabric defect detection method
CN106780486B (en) Steel plate surface defect image extraction method
CN111415363B (en) Image edge identification method
CN102305798B (en) Method for detecting and classifying glass defects based on machine vision
US20200237286A1 (en) Method and device for analyzing water content of skin by means of skin image
CN105374015A (en) Binary method for low-quality document image based on local contract and estimation of stroke width
CN111814686A (en) Vision-based power transmission line identification and foreign matter invasion online detection method
CN109389566B (en) Method for detecting bad state of fastening nut of subway height adjusting valve based on boundary characteristics
CN109472788B (en) Method for detecting flaw on surface of airplane rivet
CN115294140A (en) Hardware part defect detection method and system
CN114494210A (en) Plastic film production defect detection method and system based on image processing
CN111340824A (en) Image feature segmentation method based on data mining
CN115294116B (en) Method, device and system for evaluating dyeing quality of textile material based on artificial intelligence
CN115311267B (en) Method for detecting abnormity of check fabric
CN111738931B (en) Shadow removal algorithm for aerial image of photovoltaic array unmanned aerial vehicle
CN116524196B (en) Intelligent power transmission line detection system based on image recognition technology
CN115311265B (en) Loom intelligence control system based on weaving quality
CN115115637A (en) Cloth defect detection method based on image pyramid thought
CN115018785A (en) Hoisting steel wire rope tension detection method based on visual vibration frequency identification
CN112435235A (en) Seed cotton impurity content detection method based on image analysis
CN111768455A (en) Image-based wood region and dominant color extraction method
CN116883408B (en) Integrating instrument shell defect detection method based on artificial intelligence
CN102313740B (en) Solar panel crack detection method
CN107977608B (en) Method for extracting road area of highway video image
CN116703894B (en) Lithium battery diaphragm quality detection system

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant
CF01: Termination of patent right due to non-payment of annual fee

Granted publication date: 2021-11-30