CN114419004A - Fabric flaw detection method and device, computer equipment and readable storage medium

Info

Publication number: CN114419004A
Application number: CN202210072590.3A
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 韦帅, 朱浩, 莫兆忠
Applicant/Assignee: Foshan Jiyan Zhilian Technology Co ltd
Legal status: Pending

Classifications

    • G06T 7/0004 Image analysis; inspection of images, e.g. flaw detection; industrial image inspection
    • G06N 3/02 Computing arrangements based on biological models; neural networks
    • G06N 3/08 Neural networks; learning methods
    • G06T 7/45 Analysis of texture based on statistical description of texture using co-occurrence matrix computation
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30124 Fabrics; Textile; Paper


Abstract

The method comprises: obtaining a fabric image to be detected acquired by a camera; preprocessing the fabric image to be detected to obtain a preprocessed fabric image; performing grid division on the preprocessed fabric image according to its texture period to obtain gridded fabric images; extracting the grid texture features of the gridded fabric images using a redundant gray level co-occurrence matrix; classifying the grid texture features through a pre-trained fabric defect detection neural network and determining from the classification result whether they are defect features; and, if a grid texture feature is determined to be a defect feature, marking the defect position in the fabric image to be detected at the location corresponding to that grid texture feature. The technical scheme can improve the detection accuracy and detection efficiency of fabric flaws.

Description

Fabric flaw detection method and device, computer equipment and readable storage medium
Technical Field
The present application relates to the field of textiles, and in particular, to a method and an apparatus for detecting fabric defects, a computer device, and a readable storage medium.
Background
At present, fabric surface quality inspection in most domestic textile enterprises is still performed by quality inspectors using traditional manual methods. Manual fabric flaw detection requires professional skill training and considerable inspector experience; moreover, because subjective human factors are involved, the objectivity and consistency of the detection results are difficult to guarantee. In addition, inspectors find it hard to maintain a good working state over the long hours of the inspection process, so the reliability of the detection results is low. Traditional manual fabric defect detection therefore affects both product quality and production efficiency in textile production. In recent years, computer vision technology has developed rapidly, and using machine vision instead of manual labor to detect surface defects has become a focal problem in industrial production; machine-vision-based detection of fabric surface defects is of practical significance for promoting the intelligent and efficient development of enterprises.
Disclosure of Invention
In view of the above, it is desirable to provide a fabric defect detection method, device, computer device and readable storage medium capable of improving fabric defect detection accuracy and detection efficiency.
In a first aspect, the present application provides a method for detecting a fabric defect, the method comprising:
acquiring a to-be-detected fabric image acquired by camera equipment, and preprocessing the to-be-detected fabric image to obtain a preprocessed fabric image;
performing grid division on the preprocessed fabric image according to the texture period of the preprocessed fabric image to obtain a plurality of grid fabric images, and extracting grid texture characteristics of each grid fabric image by using a redundant gray level co-occurrence matrix;
classifying the grid texture features through a pre-trained fabric flaw detection neural network, and determining whether the grid texture features are flaw features or not according to a classification result;
and if the grid texture features are determined to be the defect features, marking the defect positions in the grid texture features corresponding to the fabric image to be detected.
In one embodiment, the method for detecting fabric defects further comprises:
acquiring a fabric sample image which is marked in advance; the fabric sample images include a defective fabric sample image and a non-defective fabric sample image;
and extracting the fabric texture characteristics of the fabric sample image, inputting the fabric texture characteristics of the sample into a classifier for training, and constructing to obtain the fabric flaw detection neural network.
In one embodiment, the step of preprocessing the fabric image to be detected to obtain a preprocessed fabric image comprises:
adjusting the gray value weight corresponding to the neighborhood pixel points according to the proximity degree of each pixel point in the to-be-detected fabric image and the neighborhood pixel points around the pixel point in the geometric space or the similarity degree in the gray space;
carrying out weighted average on the gray value of each neighborhood pixel according to the gray value weight to obtain a target gray value of the pixel;
and obtaining a preprocessed fabric image according to the target gray value of each pixel point in the fabric image to be detected.
In one embodiment, the step of obtaining a plurality of gridding fabric images by gridding the preprocessed fabric images according to the texture period of the preprocessed fabric images comprises:
determining homogenization parameters of the preprocessed fabric image according to gray value difference values between pixel pairs in the preprocessed fabric image;
measuring the texture period of the preprocessed fabric image according to the homogenization parameters, and determining the mesh size for meshing the preprocessed fabric image;
and carrying out meshing on the preprocessed fabric image based on the size of the meshes to obtain a plurality of meshed fabric images.
In one embodiment, the step of determining the homogenization parameters of the pre-processed fabric image according to the gray value difference between each pixel pair in the pre-processed fabric image comprises:
determining the relative position relation of each pixel pair in the preprocessed fabric image, and determining the gray value difference value of the pixel pair based on the relative position relation;
and counting the probability distribution of the difference values of different gray values, calculating the statistical probability of the difference value of the gray value being a preset difference value, and determining the homogenization parameters of the preprocessed fabric image according to the probability distribution result.
In one embodiment, the step of extracting the grid texture features of each grid fabric image by using the redundant gray level co-occurrence matrix comprises:
carrying out multi-scale decomposition on each gridding fabric image through a pre-established generalized Gaussian filter to obtain a plurality of redundant images with different scales corresponding to the gridding fabric images;
generating a gray level co-occurrence matrix corresponding to each redundant image to obtain a redundant gray level co-occurrence matrix of the gridded fabric image;
and solving corresponding characteristic quantity according to the redundant gray level co-occurrence matrix to obtain the grid texture characteristics of the grid fabric image.
In one embodiment, the step of obtaining the grid texture features of the grid fabric image by performing corresponding feature quantity calculation according to the redundant gray level co-occurrence matrix includes:
calculating characteristic parameters of the redundant images based on the redundant gray level co-occurrence matrix, wherein the characteristic parameters comprise an angular second moment, contrast, correlation, an inverse difference moment and entropy;
and extracting the characteristics of the gridding fabric image according to the characteristic parameters to obtain the grid texture characteristics corresponding to the gridding fabric image.
In a second aspect, the present application also provides a device for detecting fabric defects, the device comprising:
the fabric image acquisition module is used for acquiring a fabric image to be detected acquired by the camera equipment and preprocessing the fabric image to be detected to obtain a preprocessed fabric image;
the texture feature extraction module is used for carrying out grid division on the preprocessed fabric image according to the texture period of the preprocessed fabric image to obtain a plurality of grid fabric images, and extracting the grid texture features of each grid fabric image by utilizing the redundant gray level co-occurrence matrix;
the flaw characteristic determination module is used for classifying the grid texture characteristics through a pre-trained fabric flaw detection neural network and determining whether the grid texture characteristics are flaw characteristics or not according to a classification result;
and the flaw position marking module is used for marking the flaw position in the grid texture characteristic corresponding to the fabric image to be detected if the grid texture characteristic is determined to be the flaw characteristic.
In a third aspect, the present application further provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the fabric defect detection method of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium on which a computer program is stored, which computer program, when being executed by a processor, carries out the steps of the method for detecting a fabric defect mentioned in the first aspect.
According to the fabric flaw detection method and device, the computer equipment, and the readable storage medium, the image of the fabric to be detected is acquired by the camera and preprocessed to obtain a preprocessed fabric image; the preprocessed fabric image is grid-divided according to its texture period to obtain gridded fabric images; the grid texture features of the gridded fabric images are extracted using a redundant gray level co-occurrence matrix; the grid texture features are classified by a pre-trained fabric flaw detection neural network, and whether they are flaw features is determined from the classification result; if a grid texture feature is determined to be a flaw feature, the flaw position corresponding to that feature is marked in the fabric image to be detected. The detection accuracy and detection efficiency of fabric flaw detection are thereby improved.
Drawings
FIG. 1 is a schematic diagram illustrating an application scenario of a method for detecting fabric defects according to an embodiment;
FIG. 2 is a flow diagram of a method for detecting fabric defects in an embodiment;
FIG. 3 is a flowchart illustrating a method for constructing a fabric defect detection neural network according to an embodiment;
FIG. 4 is a schematic diagram of the decomposition of a gridded fabric image in one embodiment;
FIG. 5 is a reference diagram of an image decomposition example corresponding to FIG. 4;
FIG. 6 is a schematic diagram illustrating a position relationship between two pixels in a pixel pair according to an embodiment;
FIG. 7 is a schematic view of an exemplary fabric defect detecting device;
FIG. 8 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The technical scheme of the application is that the accurate detection of the fabric defects is realized based on computer vision. In order to facilitate understanding of the application scheme, an application environment to which the embodiments of the present application are applicable is first described.
Fig. 1 is a schematic view of an application scenario of the fabric flaw detection method in an embodiment. As shown in fig. 1, a camera device 101 and a terminal 102 are in wired communication connection. The camera device 101 is arranged at a fixed position; when the fabric to be detected moves into a preset position range, the camera device 101 continuously shoots the fabric to obtain an image of the fabric to be detected, and transmits the captured image to the terminal 102. The terminal 102 acquires the fabric image to be detected and detects flaws in it: first, after preprocessing the image, it performs grid division according to the texture period of the fabric to obtain a plurality of gridded fabric images; it then extracts the grid texture features of each gridded fabric image using a redundant gray level co-occurrence matrix and classifies them through a pre-trained fabric flaw detection neural network to determine whether they are flaw features; if a grid texture feature is determined to be a flaw feature, the flaw position corresponding to that feature is marked in the fabric image to be detected. The terminal 102 may be, but is not limited to, various personal computers, notebook computers, industrial personal computers, and the like.
Optionally, the application environment may further include a display device 103, such as a display screen, which may be carried by the terminal 102 or a separate external device communicatively connected to the terminal 102. The terminal 102 sends the detection result of the fabric defect to the display device 103 so as to display the detection result to the user through the display device.
In an embodiment, the application environment may further include a marking mechanism 104, and the terminal 102 is connected to the marking mechanism 104, and when the terminal 102 detects a defect position corresponding to the defect characteristic, the marking mechanism 104 is controlled to mark the defect position corresponding to the fabric to be detected. In the technical scheme, the camera device 101 collects the image of the to-be-detected fabric corresponding to the to-be-detected fabric and transmits the image to the terminal 102 for fabric flaw detection operation, and after the fabric flaw position is determined, the marking mechanism is controlled to mark the fabric flaw position corresponding to the to-be-detected fabric, so that the marking instantaneity and the flaw detection efficiency of fabric flaw detection are improved.
The following describes the technical solutions of the present application and how to solve the above technical problems with specific embodiments. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 2 is a flowchart of a method for detecting fabric defects according to an embodiment, which may be performed at a terminal.
Specifically, as shown in fig. 2, the method for detecting the fabric defect may include the following steps:
s210, acquiring a to-be-detected fabric image acquired by the camera equipment, and preprocessing the to-be-detected fabric image to obtain a preprocessed fabric image.
The image pickup apparatus is an apparatus having an image pickup function, and a camera is explained below as an example. The camera is installed on a fixed position, and when the fabric to be detected is detected to be located in a preset position range, the camera is triggered and controlled through the encoder to shoot the fabric to be detected, so that an image of the fabric to be detected is obtained. Optionally, the fabric to be detected is photographed by the linear array camera, the fabric to be detected moves at a constant speed, and when the fabric to be detected moves to a preset position range, the fabric to be detected is continuously scanned line by one or more linear array cameras, so that the whole surface of the fabric to be detected is uniformly detected. Images with the precision of micron level are accurately acquired through the linear array camera, so that the detection precision of the flaws is improved.
Optionally, the fabric image to be detected may be a gray fabric image, and when the collected fabric image to be detected is a color fabric image, a graying process may be adopted to convert the color fabric image into the gray fabric image. Optionally, the resolution of the image to be detected is 8192 × 3000, and the high-resolution image of the fabric to be detected can improve the detection precision of the fabric flaws.
In this embodiment, the image of the fabric to be detected acquired by the camera is transmitted to the computer device by using the image acquisition card, so that the fabric to be detected is subjected to fabric defect detection processing by the computer device. Optionally, the computer device acquires a fabric image to be detected acquired through the camera, and performs preprocessing such as tilt correction, smooth denoising, contrast enhancement and the like on the fabric image to be detected, so as to eliminate noise of the fabric image to be detected and keep texture characteristics, thereby improving the detection accuracy of fabric flaws.
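As a minimal illustration of the graying and contrast-enhancement steps just described, assuming OpenCV; the file path and the use of histogram equalization are illustrative, since the embodiment does not prescribe a specific enhancement method:

```python
import cv2

# Read the fabric image transmitted by the image acquisition card
# ("fabric.png" is an illustrative path).
img = cv2.imread("fabric.png")

# Graying: convert a color fabric image into a gray fabric image.
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Contrast enhancement; histogram equalization is one common choice,
# not a method fixed by this embodiment.
enhanced = cv2.equalizeHist(gray)
```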
S220, grid division is carried out on the preprocessed fabric image according to the texture period of the preprocessed fabric image to obtain a plurality of grid fabric images, and grid texture features of the grid fabric images are extracted by utilizing the redundant gray level co-occurrence matrix.
Fabric pattern and texture are important parameters reflecting fabric surface information. Most fabric textures and patterns are periodic, presenting simple or complex variations that follow the periodicity of the yarn interweaving rule and the pattern distribution. Faults occurring during weaving, printing, and dyeing directly affect the surface quality of the fabric, so the technical scheme detects fabric flaws according to the texture period in the preprocessed fabric image.
In one embodiment, a preprocessed fabric image is divided into a plurality of grid blocks by performing grid division on the preprocessed fabric image, so that a plurality of grid fabric images are obtained. Optionally, according to the texture periods of the preprocessed fabric image in the horizontal and vertical directions, the preprocessed fabric image is subjected to mesh division according to the texture periods, so that a plurality of meshed fabric images are obtained.
Since texture is formed by the repeated appearance of gray-scale distributions at spatial positions, a certain gray-scale relationship, i.e., a spatial correlation characteristic of gray levels, exists between two pixels separated from each other in image space. The Gray-Level Co-occurrence Matrix (GLCM) is a method for describing texture features by studying these spatial correlation properties of gray levels.
in one embodiment, the grid texture features corresponding to each of the grid fabric images may be extracted through a plurality of gray level co-occurrence matrices. Optionally, a gray level co-occurrence matrix corresponding to each gridded fabric image is calculated to obtain a redundant gray level co-occurrence matrix, and the grid texture features of the gridded fabric image are extracted by using the redundant gray level co-occurrence matrix.
And S230, classifying the grid texture features through a pre-trained fabric flaw detection neural network, and determining whether the grid texture features are flaw features according to the classification result.
In order to design a neural network model structure suitable for detecting fabric defects, the neural network model needs to be trained to determine the number of hidden layers, the number of nodes and transfer functions of an input layer, an output layer and the hidden layers of the suitable neural network model.
Fig. 3 is a flowchart of a method for constructing a fabric defect detecting neural network in an embodiment, as shown in fig. 3, in an embodiment, the fabric defect detecting neural network may be obtained by the following steps:
and S110, acquiring a fabric sample image which is marked in advance.
The fabric sample images include a defective fabric sample image and a non-defective fabric sample image.
A plurality of fabric images are randomly selected as fabric sample images; a fabric sample image with a defect is labeled '1', and a fabric sample image without a defect is labeled '0'.
And S120, extracting the fabric texture features of the fabric sample image, inputting the fabric texture features of the sample into a classifier for training, and constructing to obtain the fabric flaw detection neural network.
In one embodiment, the fabric texture features of each fabric sample image can be extracted in the gray level co-occurrence matrix manner to obtain the sample fabric texture features. Optionally, each fabric sample image is decomposed at multiple scales by a pre-established generalized Gaussian filter to obtain a plurality of sample redundant images of different scales; a gray level co-occurrence matrix is generated for each sample redundant image to obtain the redundant gray level co-occurrence matrix of the fabric sample image; and the corresponding characteristic quantities are calculated from this redundant gray level co-occurrence matrix to obtain the sample grid texture features of the fabric sample image.
And inputting the texture characteristics of the sample fabric into a classifier for training and learning, and continuously optimizing various model parameters of the classifier to obtain a fabric flaw detection neural network.
In this embodiment, based on the grid texture features extracted by the redundant gray level co-occurrence matrix method, the fabric defect detection neural network adopts a BP neural network structure with 30 input-layer nodes and 2 hidden layers; recognition accuracy and recognition speed for fabric defects are optimal when each hidden layer has 128 nodes. An ELU function is selected as the activation function of the hidden layers, giving the network a faster convergence rate and a better image classification effect, and a softmax function is added to the output layer so that the output values sum to 1 and each output value can be regarded as the probability that the grid texture feature belongs to the defect feature class or the normal feature class. Meanwhile, a batch normalization operation is added before the activation function of each layer, and a cross-entropy loss function is adopted during training, improving the robustness of the fabric defect detection neural network.
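The classifier described above can be sketched as follows. This is a minimal sketch in PyTorch, a framework choice of ours rather than of the embodiment, assuming a 30-dimensional grid texture feature vector and a two-class (defect/normal) output; nn.CrossEntropyLoss applies log-softmax internally, which matches the softmax-plus-cross-entropy training described above.

```python
import torch
import torch.nn as nn

class FabricDefectNet(nn.Module):
    def __init__(self, in_features=30, hidden=128, classes=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.BatchNorm1d(hidden),  # normalization before the activation
            nn.ELU(),                # ELU activation of the hidden layer
            nn.Linear(hidden, hidden),
            nn.BatchNorm1d(hidden),
            nn.ELU(),
            nn.Linear(hidden, classes),
        )

    def forward(self, x):
        return self.body(x)  # raw logits; softmax is applied in the loss

model = FabricDefectNet()
# CrossEntropyLoss combines log-softmax with cross-entropy, matching the
# softmax output plus cross-entropy training described above.
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 30-dim features.
features = torch.randn(8, 30)
labels = torch.randint(0, 2, (8,))  # 1 = defect feature, 0 = normal feature
loss = criterion(model(features), labels)
loss.backward()
```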
And S240, if the grid texture features are determined to be the defect features, marking the defect positions in the grid texture features corresponding to the fabric image to be detected.
The grid texture features of a normal image, i.e., a non-defective image, are clearly distinguished from those of a defective image. Grid texture features refer to features such as the coarseness and density of the texture within each grid after the image has been grid-divided. In general, a grid texture feature may be represented by characteristic parameters such as the angular second moment (Asm), contrast (Con), correlation (Cor), inverse difference moment (Idm), and entropy (Ent). The Asm value describes the uniformity of the gray distribution and the coarseness of the texture in an image; the Cor value describes the correlation of matrix elements in the row or column direction; the Con value represents the sharpness of an image through the depth of the texture grooves; the Idm value represents the homogeneity of the image texture, measuring the uniformity of its local changes; and the Ent value represents the complexity of the image texture.
The grid texture features input into the fabric flaw detection neural network are classified into categories such as normal features and flaw features according to the output values for their characteristic parameters. Because the grid texture features of a normal image and those of a flaw image differ markedly, whether an input grid texture feature is a flaw feature is determined from these output values. If a grid texture feature is determined to be a flaw feature, the flaw position is marked at the location in the fabric image to be detected corresponding to that grid texture feature.
The fabric defect detection method provided by the embodiment includes the steps of obtaining a to-be-detected fabric image acquired through a camera, preprocessing the to-be-detected fabric image to obtain a preprocessed fabric image, performing grid division on the preprocessed fabric image according to a texture period of the preprocessed fabric image to obtain a gridded fabric image, extracting grid texture features of the gridded fabric image by using a redundant gray level co-occurrence matrix, classifying the grid texture features through a pre-trained fabric defect detection neural network, and determining whether the grid texture features are defect features according to a classification result; if the grid texture features are determined to be the flaw features, marking flaw positions in the grid texture features corresponding to the fabric image to be detected, and therefore detecting accuracy and detecting efficiency of the fabric flaws are improved.
In order to more clearly illustrate the technical solution of the present application, the following further describes an implementation manner of a plurality of steps of the fabric defect detection method.
In an embodiment, the preprocessing the fabric image to be detected in step S210 to obtain a preprocessed fabric image may include the following steps:
s2101, adjusting the gray value weight of the neighborhood pixel points according to the proximity degree of each pixel point in the fabric image to be detected and the neighborhood pixel points around the pixel points in the geometric space or the similarity degree in the gray space.
In an embodiment, the fabric image to be detected may be preprocessed by a filter, optionally, a non-linear filter, such as a bilateral filter and a median filter, may be used for preprocessing, and a linear filter, such as a gaussian filter, a mean filter, a block filter, and the like, may also be used for preprocessing. Optionally, in this embodiment, the bilateral filter is used to perform smoothing processing on the fabric image to be detected, so that the edge information of the image can be maintained, and thus the detection accuracy is improved.
Different gray value weights are set for different pixels, so that the noise reduction effect is achieved for each pixel point of the fabric image to be detected. And a plurality of neighborhood pixel points exist around a certain pixel point, and corresponding gray value weight values are set for the corresponding neighborhood pixel points according to the proximity degree of the pixel point and the neighborhood pixel points in the geometric space and the similarity degree in the gray space.
In this embodiment, the parameter c(ξ, x) represents the proximity of a pixel point x and a neighborhood pixel ξ in geometric space, and s(f(ξ), f(x)) represents their similarity in gray space. Optionally, in one embodiment, in a local area of the fabric image to be detected where the gray values are close, the gray value differences near the pixel point x are small, so the function s(f(ξ), f(x)) acting on the gray space has almost no influence on the filtering result and the filtering action of the parameter c(ξ, x) dominates. In this case the gray value weights of the neighborhood pixels are adjusted: the weight of a neighborhood pixel farther from x is increased and the weight of one closer to x is decreased, so that the image to be detected is smoothed in geometric space and noise is reduced.
Optionally, in another embodiment, in a local area of the fabric image to be detected with larger gray value differences, for example an edge area where the pixel point x lies on the darker side of the edge, the function s(f(ξ), f(x)) acting on the gray space participates in the filtering and the weights of the neighborhood pixels of x are adjusted: the gray value weight of a neighborhood pixel on the same side as x (in the dark portion) is increased, and that of a neighborhood pixel on the other side (in the bright portion) is decreased. Taking the bilateral filter as an example, the gray value of the pixel point x is then the weighted average of the gray values of the neighborhood pixels on the same side (i.e., the dark portion) within the filtering template, and the pixels of the bright portion hardly contribute, and vice versa.
S2102, carrying out weighted average on the gray values of all neighborhood pixels according to the gray value weight to obtain the target gray value of the pixel.
For a given pixel point, the proximity in geometric space and the similarity in gray space of its neighborhood pixels are used to weight and average their gray values, yielding the target gray value, i.e., the filtering result for that pixel. The mathematical model of the target gray value can be expressed as:

$$h(x) = k^{-1}(x)\iint f(\xi)\, c(\xi, x)\, s(f(\xi), f(x))\, d\xi$$

with the normalization coefficient:

$$k(x) = \iint c(\xi, x)\, s(f(\xi), f(x))\, d\xi$$

where c(ξ, x) represents the proximity in geometric space between a pixel point x (i.e., the center of the filter) and a neighborhood pixel point ξ, and s(f(ξ), f(x)) represents their similarity in gray space.
In this embodiment, the gray value information is added on the basis of the spatial filtering through the bilateral filtering, and the gray value weight of the neighborhood pixel point corresponding to each pixel point is adjusted through the gray value information to achieve the filtering effect of keeping the edge information of the image, thereby improving the detection accuracy of the fabric flaws.
S2103, obtaining a preprocessed fabric image according to the target gray value of each pixel point in the fabric image to be detected.
Each pixel point in the fabric image to be detected is filtered by the filter, and the calculated target gray value is used as the gray value of that pixel point after filtering. This realizes the smoothing and denoising preprocessing of the fabric image to be detected and yields the preprocessed fabric image, reducing detection noise without affecting the texture characteristics of the fabric.
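A minimal sketch of this smoothing step, assuming OpenCV; the neighborhood diameter and the two sigma values are illustrative, as the embodiment does not fix concrete filter parameters:

```python
import cv2

gray = cv2.imread("fabric.png", cv2.IMREAD_GRAYSCALE)  # illustrative path

# d is the neighborhood diameter; sigmaSpace weights proximity in the
# geometric space (the role of c(xi, x)), and sigmaColor weights similarity
# in the gray space (the role of s(f(xi), f(x))).
preprocessed = cv2.bilateralFilter(gray, d=9, sigmaColor=50, sigmaSpace=50)
```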
In the related art, smoothing and noise-reduction filtering blurs the edge information of the image, which affects the detection accuracy of fabric flaws. In this embodiment, the gray value weight of each neighborhood pixel is adjusted according to the proximity in geometric space, or the similarity in gray space, between each pixel of the fabric image to be detected and its neighborhood pixels, and the gray values of the neighborhood pixels are weighted and averaged. The fabric image to be detected is thereby preprocessed in a way that smooths and denoises it while preserving the edge information of the image, improving the detection accuracy of fabric flaws.
In an embodiment, the step S220 of meshing the pre-processed fabric image according to the texture period of the pre-processed fabric image to obtain a plurality of meshed fabric images may include the following steps:
s2201, determining homogenization parameters of the preprocessed fabric image according to the gray value difference value between each pixel pair in the preprocessed fabric image.
The texture properties of the fabric are the same or similar during the same texture period, which in one embodiment is determined by finding the smallest repeat unit of the fabric pattern. Optionally, the minimum cyclic unit of the fabric pattern is determined by preprocessing the gray value difference between each pixel pair of the fabric image, and the similarity degree of the texture of the fabric pattern is measured by setting the homogenization parameter.
Optionally, in an embodiment, determining the homogenization parameter of the preprocessed fabric image according to the gray value difference between each pixel pair may be implemented in the following manner, and specifically may include the following steps:
s301, determining the relative position relation of each pixel pair in the preprocessed fabric image, and determining the gray value difference value of the pixel pair based on the relative position relation.
In this embodiment, the relative position relationship refers to the relative position of any two pixel points (i.e., a pixel pair) in the horizontal or vertical direction, and can be expressed by a displacement vector $v = (v_1, v_2)$ describing the relative position of a pixel pair in the preprocessed fabric image; the relative angle and direction of the pixel pair can be determined from the displacement vector.
Let the preprocessed fabric image $I$ have size $L_y \times L_x$ and gray level $K$. For any pixel point $p = (x, y) \in I$, where $I = \{0, 1, \ldots, L_y - 1\} \times \{0, 1, \ldots, L_x - 1\}$, the gray value at $p$ is denoted $I(p) \in \{0, 1, \ldots, K - 1\}$. A gray value difference parameter $a$ of the pixel pair is defined as $a = I(p) - I(p + v)$, the gray value difference of the pixel pair determined by the relative position.
S302, counting the probability distribution of the gray value difference value, and determining the homogenization parameters of the preprocessed fabric image according to the statistical result of the probability distribution.
In this embodiment, the distribution of gray value differences in the preprocessed fabric image is determined by counting the probability distribution of the gray value differences of the pixel pairs. Optionally, for each determined displacement vector $v$, a histogram of the gray value differences $a$ is counted and, after normalization, yields the corresponding frequency distribution parameter $P_v(a)$.
From the frequency distribution parameter $P_v(a)$, the homogenization parameter of the pixel pairs whose relative position relationship is the vector $v$ is obtained as the statistical probability that the gray value difference equals the preset difference of zero:

$$G(I, v) = P_v(0)$$

where $v$ denotes the displacement vector of a pixel pair, $a$ denotes the gray value difference of the pixel pair, and $P_v(a)$ denotes the frequency distribution parameter of the pixel pair.
S2202, measuring the texture period of the preprocessed fabric image according to the homogenization parameters, and determining the mesh size for mesh division of the preprocessed fabric image.
From the above formula for the homogenization parameter, the homogenization value $G(I, v)$ reaches its maximum value of 1 when the gray value differences $a$ of the pixel pairs are all 0. Within the same texture period the homogenization value $G(I, v)$ reaches a maximum, i.e., the texture period of the preprocessed fabric image can be determined by finding the maximum of $G(I, v)$. The homogenization value is also maximal when the vector $v$ is an integer multiple of the texture period, so the minimum displacement vector must be selected among all displacement vectors that satisfy the requirement:

$$v^{*} = \arg\min_{v}\ \{\, \lVert v \rVert : G(I, v) = \max_{u} G(I, u) \,\}$$

The texture period is calculated from this minimum displacement vector. Optionally, different displacement vectors $v$ are defined, and a minimum displacement vector is determined separately in the horizontal direction and in the vertical direction, yielding the texture periods in both directions. The grid size for dividing the preprocessed fabric image is then determined based on the texture period.
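A minimal sketch of this search for a horizontal displacement, under the reading above that the homogenization parameter is $G(I, v) = P_v(0)$; NumPy is assumed, and the maximum shift to test is an illustrative parameter:

```python
import numpy as np

def homogeneity(img, dx):
    # Gray value differences a = I(p) - I(p + v) for the horizontal
    # displacement v = (dx, 0).
    a = img[:, :-dx].astype(int) - img[:, dx:].astype(int)
    hist = np.bincount(np.abs(a).ravel(), minlength=256)
    p = hist / hist.sum()  # frequency distribution P_v(a)
    return p[0]            # G(I, v) = P_v(0), maximal (= 1) when all a == 0

def horizontal_period(img, max_shift=64):
    g = [homogeneity(img, dx) for dx in range(1, max_shift + 1)]
    # argmax returns the first, i.e. smallest, displacement among those
    # reaching the maximal homogenization value.
    return int(np.argmax(g)) + 1
```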
S2203, carrying out grid division on the preprocessed fabric image based on the grid size to obtain a plurality of grid fabric images.
For the same preprocessed fabric image, a plurality of meshes with different sizes can be determined according to different texture periods. And according to the determined grid size, grid division is carried out on the preprocessed fabric image to obtain a plurality of grid fabric images.
In this embodiment, the gray value difference of a pixel pair is determined according to the relative position relationship of any pixel pair in the preprocessed fabric image, the probability distribution of the gray value difference is counted, and the homogenization parameter of the pixel pair is determined according to the statistical result of the probability distribution to measure the fabric texture period, so that the measurement efficiency and the measurement accuracy of the texture period are improved, and the mesh division is accurately performed on the preprocessed fabric.
In an embodiment, the extracting the grid texture feature of the gridded fabric image by using the redundant gray level co-occurrence matrix in step S220 may include the following steps:
s2204, carrying out multi-scale decomposition on each gridding fabric image through a pre-established generalized Gaussian filter to obtain a plurality of redundant images with different scales corresponding to the gridding fabric images.
Fabric flaws vary in size, so in order to detect flaws of different sizes accurately, the technical scheme decomposes the gridded fabric image at multiple scales to obtain a plurality of redundant images of different scales. For example, if the scale of the gridded fabric image is 256 × 256, in this embodiment the pixels of the gridded fabric image are sampled at intervals of a preset number of pixel points, obtaining a plurality of redundant images of different scales such as 128 × 128 and 64 × 64.
In one embodiment, the gridded fabric images are decomposed at multiple scales by a pre-established generalized Gaussian filter to obtain a plurality of redundant images of different scales corresponding to each gridded fabric image. Optionally, the generalized Gaussian filter is established in advance. In one embodiment, a generalized Gaussian filter is established by taking a Gaussian kernel as the filtering kernel of the generalized Gaussian filter, with the expression:

$$p(x, y) = \frac{\beta}{2\alpha\,\Gamma(1/\beta)}\ \exp\!\left(-\left(\frac{\sqrt{x^{2} + y^{2}}}{\alpha}\right)^{\beta}\right)$$

$$\alpha = \sigma\,\sqrt{\frac{\Gamma(1/\beta)}{\Gamma(3/\beta)}}$$

where (x, y) denotes the coordinates of a point relative to the template center (0, 0), p(x, y) denotes the gray value weight at the pixel (x, y) in the generalized Gaussian filter template, and Γ(·) denotes the gamma function. The parameters σ and β jointly control the shape of the generalized Gaussian filter model, and σ has the same adjusting effect on the model as in the standard Gaussian model. Different generalized Gaussian filter models are obtained by selecting different values of β; these models serve as low-pass filtering kernel functions for image filtering during image decomposition, so that the gridded fabric image is decomposed at multiple scales into a plurality of redundant images of different scales.
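A sketch of building such a kernel, assuming NumPy and SciPy; the template size is illustrative, and the α scaling from σ and β follows the generalized Gaussian expression above:

```python
import numpy as np
from scipy.special import gamma

def generalized_gaussian_kernel(size=7, sigma=1.5, beta=2.0):
    # alpha couples sigma and beta as in the expression above.
    alpha = sigma * np.sqrt(gamma(1.0 / beta) / gamma(3.0 / beta))
    r = size // 2
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    k = np.exp(-(np.sqrt(x**2 + y**2) / alpha) ** beta)
    return k / k.sum()  # normalized so the kernel acts as a low-pass filter
```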
Referring to fig. 4 and 5, fig. 4 is a schematic diagram of the decomposition of a gridded fabric image in an embodiment, and fig. 5 is a reference diagram of the corresponding image decomposition example. In one embodiment, the process of multi-scale decomposition of a gridded fabric image using a generalized Gaussian low-pass filter with Gaussian properties may be as follows. First, the original image of the first-level decomposition is low-pass filtered by the generalized Gaussian low-pass filter to obtain the first-level low-pass image L1, and the residual between the original image and the low-pass image is taken as the high-frequency part H1. Then, the second-level input image is filtered with the generalized Gaussian low-pass filter after parameter adjustment to obtain the low-pass image L2, and the residual between L1 and L2 is taken as the high-frequency part H2 of the second-level decomposition. Finally, the third level obtains the low-pass image L3 and the high-frequency part H3 in the same manner. Through these operations, the generation of the redundant images is completed.
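A sketch of this three-level decomposition, assuming OpenCV and the generalized_gaussian_kernel helper from the previous sketch; the per-level β values are illustrative:

```python
import cv2
import numpy as np

def redundant_decompose(img, betas=(2.0, 1.5, 1.0)):
    # Each level keeps a low-pass image L_i and the residual high-frequency
    # part H_i; the next level filters the previous low-pass image.
    levels = []
    current = img.astype(np.float32)
    for beta in betas:  # one (illustrative) beta per decomposition level
        k = generalized_gaussian_kernel(beta=beta).astype(np.float32)
        low = cv2.filter2D(current, -1, k)   # low-pass image L_i
        high = current - low                 # high-frequency part H_i
        levels.append((low, high))
        current = low
    return levels  # [(L1, H1), (L2, H2), (L3, H3)]
```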
S2205, generating a gray level co-occurrence matrix corresponding to each redundant image to obtain a redundant gray level co-occurrence matrix of the gridding fabric image.
The gray level co-occurrence matrix counts the specific gray value combinations of pixel pairs having a specific relative position relationship in the image, and then describes the texture features of the image through related feature quantities. The size of the gray level co-occurrence matrix is determined by the number of gray levels in the image: an eight-bit gray image of size $L_y \times L_x$ has 256 levels, so the corresponding gray level co-occurrence matrix has size 256 × 256. As shown in fig. 6, a schematic diagram of the position relationship between the two pixels of a pixel pair in an embodiment, one pixel is first selected as the reference point; the position relationship between another pixel and the reference point is (θ, d), where θ denotes the direction of the vector formed by the other pixel and the reference point, chosen from θ ∈ {0°, 45°, 90°, 135°}, and d denotes the number of pixels between the two points.
If the gray values of the two points are represented by an ordered pair of real numbers $(m, n)$, $m, n = 0, 1, 2, \ldots, 255$, let the reference point have coordinates $(x_0, y_0)$ and the other pixel have coordinates $(x_1, y_1)$. The statistical result $N_{\theta,d}(m, n)$ can then be expressed as follows.
$$N_{0°,d}(m,n) = \#\{((x_0,y_0),(x_1,y_1)) \in (L_y \times L_x) \times (L_y \times L_x) : y_0 - y_1 = 0,\ |x_0 - x_1| = d,\ I(x_0,y_0) = m,\ I(x_1,y_1) = n\}$$

$$N_{45°,d}(m,n) = \#\{((x_0,y_0),(x_1,y_1)) \in (L_y \times L_x) \times (L_y \times L_x) : (y_0 - y_1 = d,\ x_0 - x_1 = -d) \text{ or } (y_0 - y_1 = -d,\ x_0 - x_1 = d),\ I(x_0,y_0) = m,\ I(x_1,y_1) = n\}$$

$$N_{90°,d}(m,n) = \#\{((x_0,y_0),(x_1,y_1)) \in (L_y \times L_x) \times (L_y \times L_x) : |y_0 - y_1| = d,\ x_0 - x_1 = 0,\ I(x_0,y_0) = m,\ I(x_1,y_1) = n\}$$

$$N_{135°,d}(m,n) = \#\{((x_0,y_0),(x_1,y_1)) \in (L_y \times L_x) \times (L_y \times L_x) : (y_0 - y_1 = -d,\ x_0 - x_1 = -d) \text{ or } (y_0 - y_1 = d,\ x_0 - x_1 = d),\ I(x_0,y_0) = m,\ I(x_1,y_1) = n\}$$
Through the count denoted by the symbol #{·}, the number of times the gray value combination (m, n) of a pixel pair with direction θ and distance d appears in the whole image is $N_{\theta,d}(m,n)$, and the total number of pixel pairs with direction θ and distance d is $N_{\theta,d} = \sum_{m}\sum_{n} N_{\theta,d}(m,n)$. Then $P_{\theta,d}(m,n)$ represents the probability that the combination occurs at direction θ and distance d:

$$P_{\theta,d}(m,n) = \frac{N_{\theta,d}(m,n)}{N_{\theta,d}}$$

The gray level co-occurrence matrix $P_{\theta,d}$ can then be expressed as:

$$P_{\theta,d} = \big[\, P_{\theta,d}(m,n) \,\big]_{K \times K}$$
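A minimal sketch of these statistics using scikit-image (the graycomatrix function in recent versions); the test image is illustrative, and in the method it would be one gridded or redundant fabric image:

```python
import numpy as np
from skimage.feature import graycomatrix

# An illustrative 8-bit image standing in for one gridded fabric image.
grid_img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)

glcm = graycomatrix(
    grid_img,
    distances=[1],                            # pixel spacing d
    angles=[0, np.pi/4, np.pi/2, 3*np.pi/4],  # theta = 0, 45, 90, 135 degrees
    levels=256,
    normed=True,                              # counts divided by N_{theta,d}
)
# glcm.shape == (256, 256, 1, 4): one K x K matrix per (d, theta) pair
```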
s2206, calculating corresponding characteristic quantity according to the redundant gray level co-occurrence matrix to obtain grid texture characteristics of the grid fabric image.
In this embodiment, the obtaining of the feature quantity is to perform corresponding formula calculation based on the redundant gray level co-occurrence matrix to obtain different feature parameters, and the feature parameters are used for representing different attributes of the grid texture features.
In an embodiment, the obtaining of the corresponding feature quantity according to the redundant gray level co-occurrence matrix in step S2206 to obtain the grid texture feature of the grid fabric image may include the following steps:
s401, calculating characteristic parameters of the redundant images based on the redundant gray level co-occurrence matrix, wherein the characteristic parameters comprise an angular second moment, contrast, correlation, an inverse difference moment and entropy.
The characteristic parameters are calculated from the gray level co-occurrence matrix as follows.

Angular second moment (ASM):

$$\mathrm{Asm} = \sum_{m}\sum_{n} P(m,n)^{2}$$

Contrast:

$$\mathrm{Con} = \sum_{m}\sum_{n} (m-n)^{2}\, P(m,n)$$

Correlation:

$$\mathrm{Cor} = \frac{\sum_{m}\sum_{n} (m-\mu_x)(n-\mu_y)\, P(m,n)}{\sigma_x\, \sigma_y}$$

where $\mu_x$, $\mu_y$, $\sigma_x$, $\sigma_y$ are calculated as:

$$\mu_x = \sum_{m} m \sum_{n} P(m,n), \qquad \mu_y = \sum_{n} n \sum_{m} P(m,n)$$

$$\sigma_x^{2} = \sum_{m} (m-\mu_x)^{2} \sum_{n} P(m,n), \qquad \sigma_y^{2} = \sum_{n} (n-\mu_y)^{2} \sum_{m} P(m,n)$$

Inverse difference moment (IDM):

$$\mathrm{Idm} = \sum_{m}\sum_{n} \frac{P(m,n)}{1+(m-n)^{2}}$$

Entropy:

$$\mathrm{Ent} = -\sum_{m}\sum_{n} P(m,n)\, \log P(m,n)$$
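A sketch computing these five parameters from one normalized K × K co-occurrence matrix, assuming NumPy; it follows the formulas above directly:

```python
import numpy as np

def glcm_features(P):
    # P is one normalized K x K co-occurrence matrix (entries sum to 1).
    K = P.shape[0]
    m, n = np.indices((K, K))
    asm = np.sum(P ** 2)                              # angular second moment
    con = np.sum((m - n) ** 2 * P)                    # contrast
    mu_x, mu_y = np.sum(m * P), np.sum(n * P)
    sig_x = np.sqrt(np.sum((m - mu_x) ** 2 * P))
    sig_y = np.sqrt(np.sum((n - mu_y) ** 2 * P))
    cor = np.sum((m - mu_x) * (n - mu_y) * P) / (sig_x * sig_y)  # correlation
    idm = np.sum(P / (1 + (m - n) ** 2))              # inverse difference moment
    nz = P[P > 0]
    ent = -np.sum(nz * np.log(nz))                    # entropy
    return {"Asm": asm, "Con": con, "Cor": cor, "Idm": idm, "Ent": ent}
```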
s402, extracting the characteristics of the gridding fabric image according to the characteristic parameters to obtain grid texture characteristics corresponding to the gridding fabric image.
Synthesizing the characteristic parameters corresponding to the multiple redundant images: for the first-level low-pass redundant image obtained by the decomposition, the corresponding grid texture feature can be expressed as

$$F_{L1} = \{\mathrm{Asm}_{L1}, \mathrm{Con}_{L1}, \mathrm{Cor}_{L1}, \mathrm{Idm}_{L1}, \mathrm{Ent}_{L1}\}$$

where $\mathrm{Asm}_{L1}$ denotes the angular second moment of the first-level low-pass redundant image, $\mathrm{Con}_{L1}$ its contrast value, $\mathrm{Cor}_{L1}$ its correlation value, $\mathrm{Idm}_{L1}$ its inverse difference moment, and $\mathrm{Ent}_{L1}$ its entropy value.

For the first-level high-pass redundant image obtained by the decomposition, the corresponding grid texture feature can be expressed as

$$F_{H1} = \{\mathrm{Asm}_{H1}, \mathrm{Con}_{H1}, \mathrm{Cor}_{H1}, \mathrm{Idm}_{H1}, \mathrm{Ent}_{H1}\}$$

where the subscript H1 denotes the corresponding parameter of the first-level high-pass redundant image.
By analogy, the feature values corresponding to each redundant image are obtained. Because the gridded fabric image is decomposed into a plurality of different redundant images, the grid texture features corresponding to the gridded fabric image can be assembled from the feature values of the different redundant images, as sketched below.
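A sketch of this assembly, reusing glcm_features from the previous sketch; normalized_glcm is a hypothetical helper standing in for the co-occurrence computation sketched earlier. With three levels of (L_i, H_i) pairs this yields 30 values, which would match the 30 input-layer nodes of the classifier described above:

```python
def grid_texture_feature(levels):
    # levels: [(L1, H1), (L2, H2), (L3, H3)] from the decomposition sketch.
    # normalized_glcm is a hypothetical helper returning one normalized
    # K x K co-occurrence matrix for an image, as sketched earlier.
    feats = []
    for low, high in levels:
        for img in (low, high):
            f = glcm_features(normalized_glcm(img))
            feats.extend([f["Asm"], f["Con"], f["Cor"], f["Idm"], f["Ent"]])
    return feats  # 3 levels x 2 images x 5 parameters = 30 values
```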
Whether a grid texture image is a defective image or a normal image (i.e., a non-defective image) can be determined from its characteristic parameters. For example, Asm describes the uniformity of the gray distribution and the coarseness of the texture in the image: if the element values of the gray level co-occurrence matrix fluctuate little, the Asm value is small, and vice versa; a larger Asm value indicates coarser texture, a smaller value finer texture. Cor describes the correlation of matrix elements in the row or column direction: if the image has texture in a certain direction, the Cor value of the matrix in that direction is relatively large. When the texture grooves are deep, the Con value is larger and the image is clearer; conversely, when the grooves are shallow, the Con value is smaller and the image is blurred. A larger Ent value means the element values in the matrix are more dispersed: if the image has no texture, the Ent value is small; if the texture in the image is complex, the Ent value is large.
On this basis, there are clear differences between the feature values of defective and normal images. The Asm value of a defective image is smaller than that of a normal image. The Con value of a defective image is higher overall than that of a normal image and fluctuates more obviously. The Cor value of a defective image shows higher correlation than that of a normal image, clearly distinguishing the two. The Idm value of a normal image is higher overall with small fluctuation, a characteristic associated with the contrast. In the Ent value, a flaw destroys the original regularity of the fabric image and makes it more disordered, so the entropy value of a defective image is larger.
The above examples are merely used to assist in explaining the technical solutions of the present disclosure, and the drawings and specific flows related thereto do not constitute a limitation on the usage scenarios of the technical solutions of the present disclosure.
The following describes in detail a related embodiment of the fabric defect detecting apparatus.
Fig. 7 is a schematic structural diagram of a fabric defect detecting device in an embodiment, which can be implemented at a terminal.
As shown in fig. 7, the apparatus 200 for detecting fabric defects may include: a fabric image acquisition module 210, a texture feature extraction module 220, a flaw feature determination module 230 and a flaw position marking module 240;
the fabric image acquiring module 210 is configured to acquire a fabric image to be detected acquired by a camera device, and preprocess the fabric image to be detected to obtain a preprocessed fabric image;
the texture feature extraction module 220 is configured to perform mesh division on the preprocessed fabric image according to a texture period of the preprocessed fabric image to obtain a plurality of meshed fabric images, and extract mesh texture features of each meshed fabric image by using the redundant gray level co-occurrence matrix;
a defect characteristic determination module 230, configured to classify the mesh texture characteristics through a pre-trained fabric defect detection neural network, and determine whether the mesh texture characteristics are defect characteristics according to a classification result;
and a defect position marking module 240, configured to mark a defect position in the grid texture feature corresponding to the fabric image to be detected if it is determined that the grid texture feature is the defect feature.
In the fabric flaw detection device provided by the present application, the fabric image acquisition module 210 acquires a fabric image to be detected and preprocesses it; the texture feature extraction module 220 meshes the preprocessed fabric image according to its texture period and extracts grid texture features using a redundant gray level co-occurrence matrix; the flaw feature determination module 230 classifies the grid texture features through a pre-trained fabric flaw detection neural network to determine whether they are flaw features; and the flaw position marking module 240 marks, in the fabric image to be detected, the flaw positions corresponding to the grid texture features. The detection accuracy and detection efficiency for fabric flaws are thereby improved.
In one embodiment, the fabric flaw detection apparatus further comprises a neural network construction module, which includes a sample image acquisition unit and a neural network training unit. The sample image acquisition unit is used to acquire fabric sample images that are labeled in advance; the fabric sample images include defective fabric sample images and non-defective fabric sample images. The neural network training unit is used to extract the sample fabric texture features of the fabric sample images, input the sample fabric texture features into a classifier for training, and thereby construct the fabric flaw detection neural network.
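The disclosure does not fix a particular classifier, only that labeled sample texture features are fed to one for training. The sketch below uses scikit-learn's MLPClassifier purely as a stand-in for the fabric flaw detection neural network; the .npy file names and network shape are hypothetical.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# X: one row of grid texture features per labeled sample grid;
# y: 1 = defective grid, 0 = normal grid. File names are hypothetical.
X = np.load("sample_grid_features.npy")
y = np.load("sample_grid_labels.npy")

clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0)
clf.fit(X, y)                                   # train the classifier on labeled features
print("training accuracy:", clf.score(X, y))
```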
In one embodiment, the fabric image acquisition module 210 comprises a weight value adjusting unit, a gray value calculation unit, and a preprocessed fabric image obtaining unit. The weight value adjusting unit is used to adjust, for each pixel point in the fabric image to be detected, the gray value weight corresponding to each surrounding neighborhood pixel point according to its proximity to the pixel point in geometric space or its similarity in gray space. The gray value calculation unit is used to take a weighted average of the gray values of the neighborhood pixel points according to the gray value weights to obtain a target gray value for the pixel point. The preprocessed fabric image obtaining unit is used to obtain the preprocessed fabric image from the target gray values of all pixel points in the fabric image to be detected.
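This weighting scheme, combining geometric proximity with gray-value similarity, matches a classic bilateral filter. The sketch below realizes the preprocessing under that reading using OpenCV; the input path and parameter values are illustrative assumptions, not the patent's own settings.

```python
import cv2

# Hypothetical input path; the image is read in grayscale as in the method above.
img = cv2.imread("fabric_to_detect.png", cv2.IMREAD_GRAYSCALE)

# d: diameter of the pixel neighborhood; sigmaSpace weights geometric
# proximity; sigmaColor weights gray-value similarity.
preprocessed = cv2.bilateralFilter(img, d=9, sigmaColor=25, sigmaSpace=9)
cv2.imwrite("fabric_preprocessed.png", preprocessed)
```

A bilateral filter has the useful property for fabric images of suppressing sensor noise while preserving the yarn edges that the later texture statistics depend on.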
In one embodiment, the texture feature extraction module 220 comprises a homogenization parameter determining unit, a mesh size determining unit, and a gridded fabric image obtaining unit. The homogenization parameter determining unit is used to determine the homogenization parameters of the preprocessed fabric image according to the gray value differences between the pixel pairs in the preprocessed fabric image. The mesh size determining unit is used to measure the texture period of the preprocessed fabric image according to the homogenization parameters and to determine the mesh size for meshing the preprocessed fabric image. The gridded fabric image obtaining unit is used to mesh the preprocessed fabric image based on the mesh size to obtain a plurality of gridded fabric images.
In one embodiment, the homogenization parameter determination unit comprises a gray value difference determining subunit and a homogenization parameter determining subunit. The gray value difference determining subunit is used to determine the relative positional relationship of each pixel pair in the preprocessed fabric image and to determine the gray value difference of each pixel pair based on that relationship. The homogenization parameter determining subunit is used to count the probability distribution of the different gray value differences, calculate the statistical probability that a gray value difference equals a preset difference value, and determine the homogenization parameter of the preprocessed fabric image from the resulting probability distribution.
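One plausible reading of this statistic, sketched below with a preset difference value of zero, is to measure, for each candidate displacement, how often a pixel pair at that displacement has a (near-)zero gray difference; the displacement at which this probability first peaks is taken as the texture period. The function name, the horizontal-only scan, and the thresholds are assumptions for illustration.

```python
import numpy as np

def texture_period(img: np.ndarray, max_shift: int = 64, tol: int = 2) -> int:
    """Estimate the horizontal texture period of a grayscale fabric image."""
    p = []
    for d in range(1, max_shift + 1):
        # gray value difference of every pixel pair at horizontal displacement d
        diff = np.abs(img[:, d:].astype(int) - img[:, :-d].astype(int))
        # statistical probability that the difference is (near) the preset value 0
        p.append(np.mean(diff <= tol))
    p = np.array(p)
    # the first local maximum of this probability, past the initial decay caused
    # by trivial neighbor similarity, is taken as the texture period
    for k in range(1, len(p) - 1):
        if p[k] > p[k - 1] and p[k] >= p[k + 1]:
            return k + 1          # p[k] corresponds to displacement k + 1
    return int(np.argmax(p)) + 1  # fallback: global maximum
```

The mesh size for gridding the preprocessed fabric image can then be taken as this period or an integer multiple of it, so that each grid cell covers whole texture repeats.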
In one embodiment, the texture feature extraction module 220 comprises a redundant image generation unit, a redundant gray level co-occurrence matrix obtaining unit, and a grid texture feature obtaining unit. The redundant image generation unit is used to perform multi-scale decomposition on each gridded fabric image through a pre-established generalized Gaussian filter, to obtain a plurality of redundant images of different scales corresponding to the gridded fabric image. The redundant gray level co-occurrence matrix obtaining unit is used to generate a gray level co-occurrence matrix corresponding to each redundant image, to obtain the redundant gray level co-occurrence matrix of the gridded fabric image. The grid texture feature obtaining unit is used to compute the corresponding feature quantities from the redundant gray level co-occurrence matrix, to obtain the grid texture features of the gridded fabric image.
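The sketch below illustrates this pipeline: the grid image is smoothed at several scales and one GLCM is built per scale, so the stacked matrices describe the texture redundantly across scales. An ordinary Gaussian filter stands in for the pre-established generalized Gaussian filter, whose shape parameters are not given here; the sigma values and the number of gray levels are likewise assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.feature import graycomatrix

def redundant_glcm(grid_img: np.ndarray, sigmas=(1.0, 2.0, 4.0), levels=32):
    """Stack one GLCM per smoothing scale into a redundant description."""
    matrices = []
    for sigma in sigmas:
        smoothed = gaussian_filter(grid_img.astype(float), sigma=sigma)
        # requantize to `levels` gray levels so each GLCM stays small and dense
        edges = np.linspace(smoothed.min(), smoothed.max(), levels)
        q = np.clip(np.digitize(smoothed, edges) - 1, 0, levels - 1).astype(np.uint8)
        glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                            levels=levels, symmetric=True, normed=True)
        matrices.append(glcm[:, :, 0, :])  # shape (levels, levels, n_angles)
    return np.stack(matrices)              # redundant stack: one GLCM set per scale
```

Feeding each matrix in the returned stack through a feature function such as the glcm_features sketch above yields the grid texture feature vector for one gridded fabric image.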
In one embodiment, the mesh texture feature obtaining unit includes a characteristic parameter calculating subunit and a grid texture feature obtaining subunit. The characteristic parameter calculating subunit is used to calculate the characteristic parameters of the redundant images based on the redundant gray level co-occurrence matrix, the characteristic parameters comprising the angular second moment, contrast, correlation, inverse difference moment, and entropy. The grid texture feature obtaining subunit is used to extract features of the gridded fabric image according to these characteristic parameters to obtain the grid texture features corresponding to the gridded fabric image.
The fabric defect detecting device of the present embodiment can perform the fabric defect detecting method of the present application in the foregoing embodiments, and the implementation principles thereof are similar, and will not be described herein again.
The modules in the fabric flaw detection device can be realized wholly or partially by software, by hardware, or by a combination thereof. The modules can be embedded in hardware form in, or independent of, a processor of the computer device, or stored in software form in a memory of the computer device, so that the processor can invoke them and execute the operations corresponding to each module.
In one embodiment, a computer device is provided, which may be a terminal device whose internal structure is shown in fig. 8. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The network interface of the computer device is used for communicating with external devices through a network connection. The computer program is executed by the processor to implement a fabric flaw detection method.
Those skilled in the art will appreciate that the architecture shown in fig. 8 is merely a block diagram of part of the structure associated with the solution of the present disclosure and does not limit the computer devices to which this solution applies; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the following steps when executing the computer program:
acquiring a to-be-detected fabric image acquired by camera equipment, and preprocessing the to-be-detected fabric image to obtain a preprocessed fabric image;
performing grid division on the preprocessed fabric image according to the texture period of the preprocessed fabric image to obtain a plurality of grid fabric images, and extracting grid texture characteristics of each grid fabric image by using a redundant gray level co-occurrence matrix;
classifying the grid texture features through a pre-trained fabric flaw detection neural network, and determining whether the grid texture features are flaw features or not according to a classification result;
and if the grid texture features are determined to be the defect features, marking the defect positions in the grid texture features corresponding to the fabric image to be detected.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring a fabric sample image which is marked in advance; the fabric sample images include a defective fabric sample image and a non-defective fabric sample image; and extracting the fabric texture characteristics of the fabric sample image, inputting the fabric texture characteristics of the sample into a classifier for training, and constructing to obtain the fabric flaw detection neural network.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
adjusting the gray value weights corresponding to the neighborhood pixel points according to the proximity in geometric space, or the similarity in gray space, between each pixel point in the fabric image to be detected and the neighborhood pixel points around it; performing a weighted average of the gray values of the neighborhood pixel points according to the gray value weights to obtain a target gray value for the pixel point; and obtaining a preprocessed fabric image according to the target gray values of the pixel points in the fabric image to be detected.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
determining homogenization parameters of the preprocessed fabric image according to gray value difference values between pixel pairs in the preprocessed fabric image; measuring the texture period of the preprocessed fabric image according to the homogenization parameters, and determining the mesh size for meshing the preprocessed fabric image; and carrying out meshing on the preprocessed fabric image based on the size of the meshes to obtain a plurality of meshed fabric images.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
determining the relative position relation of each pixel pair in the preprocessed fabric image, and determining the gray value difference value of the pixel pair based on the relative position relation; and counting the probability distribution of the gray value difference value, and determining the homogenization parameters of the preprocessed fabric image according to the statistical result of the probability distribution.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
carrying out multi-scale decomposition on each gridding fabric image through a pre-established generalized Gaussian filter to obtain a plurality of redundant images with different scales corresponding to the gridding fabric images; generating a gray level co-occurrence matrix corresponding to each redundant image to obtain a redundant gray level co-occurrence matrix of the gridded fabric image; and solving corresponding characteristic quantity according to the redundant gray level co-occurrence matrix to obtain the grid texture characteristics of the grid fabric image.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
calculating characteristic parameters of the redundant images based on the redundant gray level co-occurrence matrix, wherein the characteristic parameters comprise an angular second moment, contrast, correlation, an inverse difference moment and entropy; and extracting the characteristics of the gridding fabric image according to the characteristic parameters to obtain the grid texture characteristics corresponding to the gridding fabric image.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
acquiring a to-be-detected fabric image acquired by camera equipment, and preprocessing the to-be-detected fabric image to obtain a preprocessed fabric image;
performing grid division on the preprocessed fabric image according to the texture period of the preprocessed fabric image to obtain a plurality of grid fabric images, and extracting grid texture characteristics of each grid fabric image by using a redundant gray level co-occurrence matrix;
classifying the grid texture features through a pre-trained fabric flaw detection neural network, and determining whether the grid texture features are flaw features or not according to a classification result;
and if the grid texture features are determined to be the defect features, marking the defect positions in the grid texture features corresponding to the fabric image to be detected.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring a fabric sample image which is marked in advance; the fabric sample images include a defective fabric sample image and a non-defective fabric sample image; and extracting the fabric texture characteristics of the fabric sample image, inputting the fabric texture characteristics of the sample into a classifier for training, and constructing to obtain the fabric flaw detection neural network.
In one embodiment, the computer program when executed by the processor further performs the steps of:
adjusting the gray value weights corresponding to the neighborhood pixel points according to the proximity in geometric space, or the similarity in gray space, between each pixel point in the fabric image to be detected and the neighborhood pixel points around it; performing a weighted average of the gray values of the neighborhood pixel points according to the gray value weights to obtain a target gray value for the pixel point; and obtaining a preprocessed fabric image according to the target gray values of the pixel points in the fabric image to be detected.
In one embodiment, the computer program when executed by the processor further performs the steps of:
determining homogenization parameters of the preprocessed fabric image according to gray value difference values between pixel pairs in the preprocessed fabric image; measuring the texture period of the preprocessed fabric image according to the homogenization parameters, and determining the mesh size for meshing the preprocessed fabric image; and carrying out meshing on the preprocessed fabric image based on the size of the meshes to obtain a plurality of meshed fabric images.
In one embodiment, the computer program when executed by the processor further performs the steps of:
determining the relative position relation of each pixel pair in the preprocessed fabric image, and determining the gray value difference value of the pixel pair based on the relative position relation; and counting the probability distribution of the gray value difference value, and determining the homogenization parameters of the preprocessed fabric image according to the statistical result of the probability distribution.
In one embodiment, the computer program when executed by the processor further performs the steps of:
carrying out multi-scale decomposition on each gridding fabric image through a pre-established generalized Gaussian filter to obtain a plurality of redundant images with different scales corresponding to the gridding fabric images; generating a gray level co-occurrence matrix corresponding to each redundant image to obtain a redundant gray level co-occurrence matrix of the gridded fabric image; and solving corresponding characteristic quantity according to the redundant gray level co-occurrence matrix to obtain the grid texture characteristics of the grid fabric image.
In one embodiment, the computer program when executed by the processor further performs the steps of:
calculating characteristic parameters of the redundant images based on the redundant gray level co-occurrence matrix, wherein the characteristic parameters comprise an angular second moment, contrast, correlation, an inverse difference moment and entropy; and extracting the characteristics of the gridding fabric image according to the characteristic parameters to obtain the grid texture characteristics corresponding to the gridding fabric image.
It will be understood by those skilled in the art that all or part of the processes of the methods in the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus DRAM (RDRAM), and direct Rambus DRAM (DRDRAM).
It should be understood that, although the steps in the flowcharts above are shown in an order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated otherwise herein, the execution of these steps is not strictly limited to the order shown, and they may be performed in other orders. Moreover, at least some of the steps in the figures above may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different times; their execution order is not necessarily sequential, and they may be performed in turns or alternately with other steps, or with at least some of the sub-steps or stages of other steps.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their descriptions are specific and detailed, but they should not therefore be construed as limiting the scope of the invention patent. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A method of detecting fabric defects, the method comprising:
acquiring a to-be-detected fabric image acquired by camera equipment, and preprocessing the to-be-detected fabric image to obtain a preprocessed fabric image;
performing grid division on the preprocessed fabric image according to the texture period of the preprocessed fabric image to obtain a plurality of grid fabric images, and extracting grid texture features of each grid fabric image by using a redundant gray level co-occurrence matrix;
classifying the grid texture features through a pre-trained fabric flaw detection neural network, and determining whether the grid texture features are flaw features or not according to a classification result;
and if the grid texture features are determined to be defect features, marking defect positions in the grid texture features corresponding to the fabric image to be detected.
2. The method of claim 1, further comprising:
acquiring a fabric sample image which is marked in advance; the fabric sample images include a defective fabric sample image and a non-defective fabric sample image;
and extracting the fabric texture characteristics of the fabric sample image, inputting the fabric texture characteristics of the sample into a classifier for training, and constructing to obtain the fabric flaw detection neural network.
3. The method according to claim 1, wherein the step of preprocessing the image of the fabric to be detected to obtain a preprocessed fabric image comprises:
adjusting gray value weights corresponding to neighborhood pixel points according to the proximity in geometric space, or the similarity in gray space, between each pixel point in the fabric image to be detected and the neighborhood pixel points around it;
carrying out weighted average on the gray value of each neighborhood pixel point according to the gray value weight to obtain a target gray value of the pixel point;
and obtaining a preprocessed fabric image according to the target gray value of each pixel point in the fabric image to be detected.
4. The method according to claim 1, wherein the step of meshing the pre-processed fabric image according to the texture cycle of the pre-processed fabric image to obtain a plurality of meshed fabric images comprises:
determining homogenization parameters of the preprocessed fabric image according to gray value difference values between pixel pairs in the preprocessed fabric image;
measuring the texture period of the preprocessed fabric image according to the homogenization parameters, and determining the mesh size for mesh division of the preprocessed fabric image;
and carrying out meshing division on the preprocessed fabric image based on the mesh size to obtain a plurality of meshed fabric images.
5. The method according to claim 4, wherein the step of determining a homogeneity parameter for the pre-processed web image based on gray value differences between respective pairs of pixels in the pre-processed web image comprises:
determining the relative position relation of each pixel pair in the preprocessed fabric image, and determining the gray value difference value of the pixel pair based on the relative position relation;
and counting the probability distribution of the gray value difference value, and determining the homogenization parameters of the preprocessed fabric image according to the statistical result of the probability distribution.
6. The method according to claim 1, wherein the step of extracting the grid texture feature of each of the gridded fabric images by using the redundant gray level co-occurrence matrix comprises:
carrying out multi-scale decomposition on each gridding fabric image through a pre-established generalized Gaussian filter to obtain a plurality of redundant images with different scales corresponding to the gridding fabric image;
generating a gray level co-occurrence matrix corresponding to each redundant image to obtain a redundant gray level co-occurrence matrix of the gridded fabric image;
and computing corresponding feature quantities from the redundant gray level co-occurrence matrix to obtain the grid texture features of the gridded fabric image.
7. The method according to claim 6, wherein the step of obtaining the grid texture feature of the grid fabric image by performing corresponding feature quantity calculation according to the redundant gray level co-occurrence matrix comprises:
calculating characteristic parameters of the redundant images based on the redundant gray level co-occurrence matrix, wherein the characteristic parameters comprise an angular second moment, contrast, correlation, an inverse difference moment and entropy;
and extracting the characteristics of the gridding fabric image according to the characteristic parameters to obtain the grid texture characteristics corresponding to the gridding fabric image.
8. An apparatus for detecting fabric defects, said apparatus comprising:
the fabric image acquisition module is used for acquiring a fabric image to be detected acquired by camera equipment and preprocessing the fabric image to be detected to obtain a preprocessed fabric image;
the texture feature extraction module is used for carrying out grid division on the preprocessed fabric image according to the texture period of the preprocessed fabric image to obtain a plurality of grid fabric images, and extracting the grid texture features of each grid fabric image by using a redundant gray level co-occurrence matrix;
the flaw characteristic determination module is used for classifying the grid texture characteristics through a pre-trained fabric flaw detection neural network and determining whether the grid texture characteristics are flaw characteristics or not according to a classification result;
and the flaw position marking module is used for marking a flaw position in the grid texture feature corresponding to the fabric image to be detected if the grid texture feature is determined to be the flaw feature.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the method of any of claims 1 to 7 are implemented when the computer program is executed by the processor.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.