WO2021253632A1 - Cloth defect detection method based on adversarial neural network, and terminal and storage medium - Google Patents


Info

Publication number
WO2021253632A1
Authority
WO
WIPO (PCT)
Prior art keywords
cloth
neural network
model
image sample
generator
Prior art date
Application number
PCT/CN2020/111475
Other languages
French (fr)
Chinese (zh)
Inventor
周凯
吴小飞
庞凤江
武艳萍
彭其栋
张帆
Original Assignee
深圳新视智科技术有限公司
Priority date
Filing date
Publication date
Application filed by 深圳新视智科技术有限公司 filed Critical 深圳新视智科技术有限公司
Publication of WO2021253632A1 publication Critical patent/WO2021253632A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30124Fabrics; Textile; Paper

Definitions

  • the present disclosure generally relates to the technical field of cloth defect detection, and more specifically to a cloth defect detection method, terminal, and storage medium based on an adversarial neural network.
  • Surface quality not only affects a product's appearance but may also impair its functional characteristics, causing significant losses to the enterprise. Detecting surface defects is therefore essential, and designing a surface defect detection method that is both real-time and effective is particularly important.
  • The traditional method of cloth inspection is manual: under certain light-source conditions, workers directly observe the cloth under test and judge from experience whether it is defective.
  • This method is easily affected by subjective human factors, causing false or missed detections; its efficiency is low, and it cannot meet the needs of modern large-scale production.
  • In addition, the results and data of manual inspection cannot be saved automatically, which makes subsequent data queries inconvenient and provides little guidance for future cloth production.
  • The core of a cloth defect detection system is its defect detection algorithm.
  • Common defect detection algorithms can be roughly divided into the following categories: statistical, spectral, model-based, learning-based, and structural.
  • Statistical algorithms detect defects by characterizing the spatial distribution of image gray values, mainly using co-occurrence matrices, mathematical morphology, and the like.
  • In fabric defect detection based on mathematical morphology analysis, a pre-trained Gabor wavelet network is used to extract important features, achieving a defect detection rate above 90%.
  • Spectral algorithms express image characteristics in different domains to detect defects in the image, including Fourier transform, wavelet transform, and Gabor transform methods. The Fourier transform has also been applied directly to defect detection:
  • histogram equalization is first used to enhance the contrast of the fabric image, then the central frequency spectrum is computed by a two-point FFT, yielding good defect detection results.
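The two-step spectral procedure just described (equalize the histogram, then inspect the centered frequency spectrum) can be sketched in NumPy as follows. This is an illustrative reconstruction, not code from the patent; the function names are hypothetical.

```python
import numpy as np

def equalize_histogram(img: np.ndarray) -> np.ndarray:
    """Histogram-equalize an 8-bit grayscale image to enhance contrast."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum().astype(float)
    cdf = (cdf - cdf.min()) * 255.0 / (cdf.max() - cdf.min())  # map CDF onto [0, 255]
    return cdf[img].astype(np.uint8)

def centered_spectrum(img: np.ndarray) -> np.ndarray:
    """Log-magnitude FFT spectrum with the zero frequency shifted to the center."""
    f = np.fft.fftshift(np.fft.fft2(img.astype(float)))
    return np.log1p(np.abs(f))

# A periodic stripe texture: its energy concentrates at a few spectral peaks,
# whereas a local defect would spread energy away from those peaks.
texture = np.tile(np.array([[0, 255]], dtype=np.uint8), (32, 16))
spectrum = centered_spectrum(equalize_histogram(texture))
```

In practice a defect is flagged by comparing the spectrum (or its reconstruction back into the spatial domain) against that of defect-free fabric.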
  • Structural algorithms are based on the texture structure of the inspected object, with a corresponding algorithm designed for each texture type.
  • In recent years, saliency analysis algorithms have also been commonly used in cloth inspection.
  • One context-based local texture saliency analysis algorithm uses local binary patterns to extract texture features and applies them to texture image detection; its good results demonstrate the algorithm's effectiveness.
  • Learning methods mainly comprise neural networks and deep learning, e.g., BP artificial neural networks and defect detection methods based on convolutional neural networks.
  • Such methods can detect and locate defects in images with textured and special structures, with performance far better than traditional detection methods.
  • The above detection methods have achieved good results in specific applications, but they still have shortcomings.
  • Although detection based on the Gabor transform provides a joint spatial-frequency representation, computing the filter templates is complicated and extremely expensive.
  • Detection based on mathematical morphology can measure the size and shape of a defect and represent it well, but lacks sufficient visual-effect support.
  • The detection pipeline of the above methods is roughly image preprocessing, target extraction, feature selection, and pattern classification.
  • Existing deep learning detection algorithms mainly include Faster R-CNN, SSD, YOLOv3, etc. These algorithms work reasonably well in general scenes but poorly in the cloth industry, mainly because there are too many types of cloth with defects of varying shapes, and for complex, ever-changing backgrounds such as plaid patterns and prints there is still no good solution. Deep learning also has data requirements: collecting samples of hundreds of fabric defect types is time-consuming and laborious. Finally, fabric finishing processes such as heat setting, mercerizing, and liquid-ammonia treatment run very fast, while deep learning is computationally expensive, making on-line cloth inspection very challenging.
  • the present disclosure provides a cloth defect detection method, a terminal and a storage medium based on an adversarial neural network.
  • The present disclosure provides a cloth defect detection method based on an adversarial neural network, which includes: Step 1: train a GAN model on positive (defect-free) samples of the cloth, and output the GAN model when training is complete; Step 2: send a detected image sample of the cloth to the GAN model, which reconstructs the detected image sample to generate a good reconstructed image sample; and Step 3: compare the detected image sample with the reconstructed image sample by structural similarity, and judge from the degree of difference whether the detected image sample is defective.
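As a minimal sketch of this three-step flow, the following Python outline replaces the trained GAN and the SSIM comparison with hypothetical stand-ins (none of these function names come from the patent); it only illustrates the shape of the decision pipeline.

```python
import numpy as np

def reconstruct(detected: np.ndarray) -> np.ndarray:
    """Stand-in for the trained GAN (step 2): maps a detected image sample to a
    'good' defect-free reconstruction. A real implementation would run the
    trained DCGAN generator; here we simply return a flat clean texture."""
    return np.full_like(detected, int(detected.mean()))

def difference_degree(a: np.ndarray, b: np.ndarray) -> float:
    """Stand-in for the SSIM-based comparison (step 3): 0 means identical,
    larger means more different."""
    return float(np.mean(np.abs(a.astype(float) - b.astype(float))) / 255.0)

def is_defective(detected: np.ndarray, threshold: float = 0.05) -> bool:
    good = reconstruct(detected)                          # step 2: reconstruction
    return difference_degree(detected, good) > threshold  # step 3: decision

clean = np.full((8, 8), 128, dtype=np.uint8)
stained = clean.copy()
stained[2:5, 2:5] = 0  # simulate a dark stain on the cloth
```

The key property exploited is that a model trained only on positive samples reconstructs defective regions as defect-free, so the difference map localizes the defect.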
  • offline training is used when training the GAN model.
  • the GAN model is a DCGAN model.
  • Both the generator and the discriminator in the DCGAN model use batch normalization; strided convolutions replace spatial pooling in the discriminator, and the deconvolution layers in the generator use fractionally-strided convolutions; fully connected layers are removed from the deeper architecture of the DCGAN model; the Tanh activation function is used in the output layer of the generator, the ReLU activation function in the other generator layers, and the leaky ReLU activation function in the discriminator.
  • A hidden (latent) variable input may be added to the generator input of the DCGAN model.
  • The discriminator then not only participates in the real/fake judgment but also computes mutual information with the hidden variable.
  • The DCGAN model updates the generator and the discriminator according to this mutual information.
  • the characteristics of structural similarity contrast include brightness difference, contrast difference, and structural difference.
  • The brightness difference is estimated as L(x,y) = (2u_x u_y + C_1) / (u_x^2 + u_y^2 + C_1).
  • The contrast difference is estimated as C(x,y) = (2σ_x σ_y + C_2) / (σ_x^2 + σ_y^2 + C_2).
  • The structural difference is estimated as S(x,y) = (σ_xy + C_3) / (σ_x σ_y + C_3).
  • Here u_x and u_y denote the means of image patches x and y, σ_x and σ_y their standard deviations, σ_xy their covariance, C_1 = (K_1 × L)^2, C_2 = (K_2 × L)^2, and C_3 = 0.5 C_2, with K_1 = 0.01, K_2 = 0.03, and L = 255 the dynamic range of pixel values. The combined measure is M(x,y) = L(x,y) × C(x,y) × S(x,y).
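Structural similarity combines the three terms multiplicatively. Under the standard definitions (patch means u_x, u_y; standard deviations σ_x, σ_y; covariance σ_xy; stabilizing constants C_1 = (K_1·L)², C_2 = (K_2·L)², C_3 = C_2/2), a single-window sketch is as follows; note that the "whole image as one window" simplification is an assumption for illustration, since production SSIM slides a local window.

```python
import numpy as np

def ssim_global(x: np.ndarray, y: np.ndarray,
                K1: float = 0.01, K2: float = 0.03, L: float = 255.0) -> float:
    """Single-window structural similarity M(x,y) = L(x,y) * C(x,y) * S(x,y)."""
    x = x.astype(float)
    y = y.astype(float)
    ux, uy = x.mean(), y.mean()            # patch means
    sx, sy = x.std(), y.std()              # patch standard deviations
    sxy = ((x - ux) * (y - uy)).mean()     # covariance
    C1, C2 = (K1 * L) ** 2, (K2 * L) ** 2
    C3 = 0.5 * C2                          # C3 = C2 / 2, as in the text
    lum = (2 * ux * uy + C1) / (ux ** 2 + uy ** 2 + C1)   # brightness term
    con = (2 * sx * sy + C2) / (sx ** 2 + sy ** 2 + C2)   # contrast term
    stru = (sxy + C3) / (sx * sy + C3)                    # structure term
    return lum * con * stru
```

An identical pair scores 1; a detected sample whose reconstruction differs (a defect) scores lower, which is the degree-of-difference signal used in step 3.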
  • the present disclosure also provides a terminal.
  • The terminal includes a memory, a processor, and a computer program stored in the memory and runnable on the processor.
  • When the processor executes the computer program, it implements the above detection method.
  • the present disclosure also provides a storage medium, the storage medium including computer instructions, when the computer instructions run on a computer, the computer executes the above detection method.
  • Fig. 1 is a flowchart of a detection method in an embodiment of the present disclosure.
  • Fig. 2 is a flowchart of the training phase of the detection method in an embodiment of the present disclosure.
  • FIG. 3 is a schematic diagram of the operation of a generative adversarial network in an embodiment of the present disclosure.
  • Fig. 4 is a structure diagram of a DCGAN model in an embodiment of the present disclosure.
  • Fig. 5 is a structural diagram of a model with hidden variables added in an embodiment of the present disclosure.
  • Figure 6 is a flow chart of the testing phase of the detection method in an embodiment of the present disclosure.
  • Fig. 7 is a picture of an inspection image sample of cloth in an embodiment of the present disclosure.
  • FIG. 8 is a picture of a good reconstructed image sample after reconstruction of a GAN model in an embodiment of the present disclosure.
  • the present disclosure provides a cloth defect detection method based on an adversarial neural network.
  • The detection method includes a training phase and a testing phase, which specifically comprise the following steps:
  • Step 1 Train the GAN model according to the positive samples of the cloth, and output the GAN model after the training is completed.
  • Figure 2 is a flow chart of the training phase.
  • the GAN model is trained after collecting positive samples of cloth and preprocessing the positive samples. After the training is completed, the GAN model is output.
  • the training of the GAN model adopts offline training.
  • Training the model on collected positive samples avoids the problems that cloth defects are highly varied and hard to collect, and better matches actual production conditions.
  • the GAN (Generative Adversarial Networks) model is inspired by the two-player zero-sum game in game theory.
  • The two players in a GAN are the generator (generative model, G) and the discriminator (discriminative model, D).
  • The generator produces a pseudo sample from its non-standard input (noise data); the pseudo sample and a real sample of the collected cloth are both sent to the discriminator, which must distinguish the fake sample from the real one.
  • The GAN model is updated according to the judgment result: the generator must output samples as close to real ones as possible to fool the discriminator, while the discriminator must distinguish as well as possible whether its input is a real sample or a fake image produced by the generator.
  • the final solution of the game is the Nash equilibrium.
  • The ultimate goal is a discriminator whose discriminative ability is strong enough that, even so, it is hard to tell the generator's pseudo samples from real ones when they are input; that is, the discriminator's ability is maximized while the probability of its judging the generator's output as fake is minimized. The discriminator therefore maximizes the following objective, classifying real samples as positive examples and generated samples as negative examples as far as possible: max_D E_x[log D(x)] + E_z[log(1 - D(G(z)))].
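The discriminator's objective just described can be evaluated numerically. This toy sketch (hypothetical discriminator outputs on a handful of samples, not the patent's network) computes E[log D(x)] + E[log(1 - D(G(z)))], which a sharp discriminator drives toward 0 and an unsure one leaves strongly negative:

```python
import numpy as np

def discriminator_value(d_real: np.ndarray, d_fake: np.ndarray) -> float:
    """Value of V(D, G) = E[log D(x)] + E[log(1 - D(G(z)))] for fixed samples.
    The discriminator wants this large: D(x) -> 1 on real cloth samples and
    D(G(z)) -> 0 on generated pseudo samples."""
    eps = 1e-12  # avoid log(0)
    return float(np.mean(np.log(d_real + eps)) + np.mean(np.log(1.0 - d_fake + eps)))

# A discriminator that separates real from fake well scores higher than an
# unsure one that outputs 0.5 everywhere (the Nash-equilibrium situation).
sharp = discriminator_value(np.array([0.90, 0.95]), np.array([0.05, 0.10]))
unsure = discriminator_value(np.array([0.50, 0.50]), np.array([0.50, 0.50]))
```

At the Nash equilibrium the discriminator outputs 0.5 on every sample, giving the value 2·log 0.5, which is exactly the "hard to judge true or false" state the text describes.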
  • the GAN model adopts the DCGAN (Deep Convolution Generative Adversarial Networks) model, which can improve the effect through the powerful feature extraction capabilities of the convolutional layer.
  • Figure 4 is a diagram of the DCGAN model structure.
  • the model structure of DCGAN has the following characteristics:
  • The Tanh activation function is used in the output layer of the generator, the ReLU activation function in the other layers, and the leaky ReLU activation function in the discriminator.
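For reference, the three activation functions named above can be written as follows; the leak slope alpha = 0.2 is a common DCGAN choice and is an assumption here, not a value specified by the patent.

```python
import numpy as np

def relu(x):
    """Used in the generator's hidden layers."""
    return np.maximum(x, 0.0)

def leaky_relu(x, alpha=0.2):
    """Used in the discriminator; lets a small gradient pass for x < 0."""
    return np.where(x > 0, x, alpha * x)

def tanh(x):
    """Used in the generator's output layer; squashes outputs into (-1, 1)."""
    return np.tanh(x)
```

Tanh on the output layer pairs naturally with images normalized to [-1, 1], while leaky ReLU keeps the discriminator's gradients alive on negative pre-activations.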
  • To reconstruct data better, the present disclosure adds a latent variable C_vector input to the DCGAN model.
  • The hidden variables and noise data are fed to the generator, which produces pseudo samples; the pseudo samples and real samples are then sent to the discriminator.
  • The discriminator not only participates in the real/fake judgment but also computes mutual information with the hidden variables, and the generator and discriminator are updated according to this mutual information, so that more hidden-variable information is retained in the generator's output images.
  • a trained GAN model can be obtained.
  • the testing phase shown in Figure 6 can be started, and the testing phase includes step 2 and step 3.
  • Step 2: send the detected image sample of the cloth to the GAN model, which reconstructs it to generate a good reconstructed image sample.
  • Figure 7 shows a detected image sample of cloth bearing a stain. Figure 7 is sent to the GAN model, which reconstructs it to produce the reconstructed sample in Figure 8; the cloth in Figure 8 no longer shows the stain present in the original Figure 7.
  • Step 3: perform a structural similarity comparison between the detected image sample and the reconstructed image sample, and judge from the degree of difference whether the detected image sample is defective.
  • Structural similarity is an index measuring how similar two images are, from three perspectives: brightness, contrast, and structure. The brightness difference is estimated as L(x,y) = (2u_x u_y + C_1) / (u_x^2 + u_y^2 + C_1),
  • the contrast difference as C(x,y) = (2σ_x σ_y + C_2) / (σ_x^2 + σ_y^2 + C_2),
  • and the structural difference as S(x,y) = (σ_xy + C_3) / (σ_x σ_y + C_3).
  • The structural similarity comparison combines the brightness, contrast, and structure differences as M(x,y) = L(x,y) × C(x,y) × S(x,y).
  • Figure 7 and Figure 8 are compared. Since Figure 8 is a good image generated by the GAN model, if the degree of difference between Figure 7 and Figure 8 reaches a predetermined value, Figure 7 can be judged defective; if it does not, the sample in Figure 7 can be judged a good product.
  • Positive samples of cloth are used to train the GAN model; during testing, a detected image sample of the cloth (e.g., Figure 7) is sent to the GAN model for data reconstruction. Where defects exist, the generative capability of the GAN network produces a repaired reconstructed image sample (e.g., Figure 8); SSIM similarity feature extraction then computes the degree of difference between the detected image sample and the reconstructed image sample, and the detected image sample is judged according to that degree of difference.
  • The method can detect defects such as color difference, holes, lint, wrong warp, wrong weft, wrinkles, and stains on the cloth, and is suitable for various fabric types such as cotton, linen, non-woven fabric, printed fabric, denim, and woven fabric.
  • The GAN model and the structural-similarity judgment described in the embodiments of the present disclosure merely illustrate one implementation of the detection method; they are not the only possible one.
  • The GAN model may be replaced by various improved GAN models, and the structural-similarity judgment may be replaced by improved structural-similarity methods, all of which fall within the protection scope of this scheme.
  • the terminal includes a memory, a processor, and a computer program that is stored on the memory and can run on the processor.
  • When the processor executes the computer program, it implements the foregoing detection method. The terminal may include hardware such as a cloth transport mechanism, an industrial camera, a screen, and a printing device.
  • When performing cloth inspection, the cloth is mounted on the cloth transport mechanism, which moves and stretches it.
  • The industrial camera collects images of the cloth, and the collected images are inspected using the above detection method.
  • The detection results and data can be recorded and sent to the cloud for storage, and can also be displayed on the screen or printed by the printing device. This facilitates data statistics and collation, and helps improve the process and the fabric yield based on subsequent data.
  • the present disclosure provides a storage medium.
  • the storage medium includes computer instructions. When the computer instructions run on a computer, the computer executes the above detection method.
  • The present disclosure trains a GAN model on positive samples of cloth, then uses the GAN model to reconstruct a detected image sample of the cloth into a reconstructed image sample.
  • The samples are compared by structural similarity to judge whether defects exist. The method can detect defects such as color difference, holes, lint, wrong warp, wrong weft, wrinkles, and stains, and is suitable for various fabric types such as cotton, linen, non-woven fabric, printed fabric, denim, and woven fabric. It solves well the problems that defective samples are hard to collect in actual production and that fabric backgrounds are intricate, and its detection speed and accuracy meet the requirements of actual production testing.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Computing Systems (AREA)
  • Evolutionary Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Quality & Reliability (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Image Analysis (AREA)
  • Treatment Of Fiber Materials (AREA)
  • Image Processing (AREA)

Abstract

Disclosed are a cloth defect detection method based on an adversarial neural network, and a terminal and a storage medium. The detection method comprises the following steps: step 1: training a GAN model according to a positive sample of cloth, and outputting the GAN model after the training is completed; step 2: sending a detection image sample of the cloth to the GAN model, and the GAN model reconstructing the detection image sample, so as to generate a good reconstructed image sample; and step 3: performing structural similarity comparison between the detection image sample and the reconstructed image sample, and determining, according to a difference degree obtained by means of comparison, whether the detection image sample is defective.

Description

Cloth defect detection method based on adversarial neural network, terminal and storage medium
References to related applications
This disclosure claims the full benefit of the invention patent application filed with the State Intellectual Property Office of the People's Republic of China on June 19, 2020, with application number 202010566989.8 and entitled "Cloth defect detection method, terminal and storage medium based on adversarial neural network", the entire contents of which are incorporated herein by reference.
Field
The present disclosure generally relates to the technical field of cloth defect detection, and more specifically to a cloth defect detection method, terminal, and storage medium based on an adversarial neural network.
Background
With rapid economic development, China's manufacturing industry is also developing quickly, producing goods on a large scale every day to meet market demand. The growth in the number and variety of products has raised quality expectations, and products with attractive appearances are more likely to win consumers' favor. A product's surface quality affects its commercial value: appearance flaws directly depreciate it, and surface quality strongly influences the product's direct use or further processing. During production, owing to equipment and process factors, various kinds of defects often appear on product surfaces, such as holes, stains, missing warp, and broken weft in fabrics. Surface quality not only affects a product's appearance but may also impair its functional characteristics, causing significant losses to the enterprise. Detecting surface defects is therefore essential, and designing a real-time, effective surface defect detection method is particularly important.
The traditional method of cloth inspection is manual: under certain light-source conditions, workers directly observe the cloth under test and judge from experience whether it is defective. This method is easily affected by subjective human factors, causing false or missed detections; its efficiency is low, and it cannot meet the needs of modern large-scale production. In addition, the results and data of manual inspection cannot be saved automatically, which makes subsequent data queries inconvenient and provides little guidance for future cloth production.
The core of a cloth defect detection system is its defect detection algorithm. Common defect detection algorithms fall roughly into the following categories: statistical, spectral, model-based, learning-based, and structural. Statistical algorithms detect defects by characterizing the spatial distribution of image gray values, mainly using co-occurrence matrices, mathematical morphology, and the like. In fabric defect detection based on mathematical morphology analysis, a pre-trained Gabor wavelet network extracts important features, achieving a defect detection rate above 90%. Spectral algorithms express image characteristics in different domains to detect defects, including Fourier transform, wavelet transform, and Gabor transform methods. The Fourier transform has also been applied directly: histogram equalization first enhances the contrast of the fabric image, then the central frequency spectrum is computed by a two-point FFT, yielding good defect detection results. Structural algorithms are based on the texture structure of the inspected object, with a corresponding algorithm designed for each texture type. In recent years, saliency analysis algorithms have also been commonly used in cloth inspection; one context-based local texture saliency analysis algorithm uses local binary patterns to extract texture features and applies them to texture image detection, and its good results demonstrate its effectiveness.
Learning methods mainly comprise neural networks and deep learning, e.g., BP artificial neural networks and defect detection methods based on convolutional neural networks. Such methods can detect and locate defects in images with textured and special structures, with performance far better than traditional methods. The above detection methods have achieved good results in specific applications but still have shortcomings. For example, although detection based on the Gabor transform provides a joint spatial-frequency representation, computing the filter templates is complicated and extremely expensive. Detection based on mathematical morphology can measure the size and shape of a defect and represent it well, but lacks sufficient visual-effect support. The detection pipeline of these methods is roughly image preprocessing, target extraction, feature selection, and pattern classification; each stage affects the model's recognition rate, and especially when the data are large and complex, feature selection becomes harder, which constrains the algorithm's generalization ability to some extent.
In addition, a corresponding detection algorithm must be designed for each detection requirement, which not only demands substantial prior knowledge but must also consider the efficiency and effectiveness of the method, imposing considerable limitations. Deep learning can learn directly from two-dimensional images, reducing image preprocessing; it requires no manual feature extraction and automatically learns more suitable features layer by layer, greatly reducing the influence of human factors.
Existing deep learning detection algorithms mainly include Faster R-CNN, SSD, YOLOv3, etc. These algorithms work reasonably well in general scenes but poorly in the cloth industry, mainly because there are too many types of cloth with defects of varying shapes, and for complex, ever-changing backgrounds such as plaid patterns and prints there is still no good solution. Deep learning also has data requirements: collecting samples of hundreds of fabric defect types is time-consuming and laborious. Finally, fabric finishing processes such as heat setting, mercerizing, and liquid-ammonia treatment run very fast, while deep learning is computationally expensive, making on-line cloth inspection very challenging.
Overview
The present disclosure provides a cloth defect detection method, a terminal, and a storage medium based on an adversarial neural network.
In one aspect, the present disclosure provides a cloth defect detection method based on an adversarial neural network, which includes: Step 1: train a GAN model on positive samples of the cloth, and output the GAN model when training is complete; Step 2: send a detected image sample of the cloth to the GAN model, which reconstructs the detected image sample to generate a good reconstructed image sample; and Step 3: compare the detected image sample with the reconstructed image sample by structural similarity, and judge from the degree of difference whether the detected image sample is defective.
In some embodiments, the GAN model is trained offline.
In some embodiments, the GAN model is a DCGAN model.
In some embodiments, both the generator and the discriminator in the DCGAN model use batch normalization; strided convolutions replace spatial pooling in the discriminator, and the deconvolution layers in the generator use fractionally-strided convolutions; fully connected layers are removed from the deeper architecture of the DCGAN model; the Tanh activation function is used in the output layer of the generator, the ReLU activation function in the other generator layers, and the leaky ReLU activation function in the discriminator.
In some embodiments, a hidden variable input is added to the generator input of the DCGAN model; the discriminator not only participates in the real/fake judgment but also computes mutual information with the hidden variable.
In some embodiments, the DCGAN model updates the generator and the discriminator according to the mutual information.
在某些实施方案中,结构相似性对比的特征包括亮度差异、对比 度差异、结构差异。In some embodiments, the characteristics of structural similarity contrast include brightness difference, contrast difference, and structural difference.
In some embodiments, the luminance difference is estimated as:

L(x,y) = \frac{2u_x u_y + C_1}{u_x^2 + u_y^2 + C_1}

the contrast difference is estimated as:

C(x,y) = \frac{2\sigma_x \sigma_y + C_2}{\sigma_x^2 + \sigma_y^2 + C_2}

and the structural difference is estimated as:

S(x,y) = \frac{\sigma_{xy} + C_3}{\sigma_x \sigma_y + C_3}

In some embodiments, u_x and u_y denote the means of image block x and image block y, \sigma_x and \sigma_y denote their standard deviations, \sigma_x^2 and \sigma_y^2 denote their variances, and \sigma_{xy} denotes their covariance; C_1 = (K_1 \times L)^2, C_2 = (K_2 \times L)^2, C_3 = 0.5C_2, with K_1 = 0.01, K_2 = 0.03, and L = 255 the dynamic range of the pixel values. The structural similarity comparison formula is:

M(x,y) = L(x,y) \times C(x,y) \times S(x,y)
In another aspect, the present disclosure also provides a terminal including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the above detection method when executing the computer program.
In yet another aspect, the present disclosure also provides a storage medium including computer instructions which, when run on a computer, cause the computer to execute the above detection method.
Brief Description of the Drawings
Fig. 1 is a flowchart of the detection method in an embodiment of the present disclosure.
Fig. 2 is a flowchart of the training phase of the detection method in an embodiment of the present disclosure.
Fig. 3 is a schematic diagram of the operation of a generative adversarial network in an embodiment of the present disclosure.
Fig. 4 is a structure diagram of the DCGAN model in an embodiment of the present disclosure.
Fig. 5 is a structure diagram of the model with the latent variable added in an embodiment of the present disclosure.
Fig. 6 is a flowchart of the testing phase of the detection method in an embodiment of the present disclosure.
Fig. 7 is a picture of a test image sample of cloth in an embodiment of the present disclosure.
Fig. 8 is a picture of a defect-free reconstructed image sample produced by the GAN model in an embodiment of the present disclosure.
Detailed Description
The present disclosure is described in detail below with reference to the drawings and specific embodiments.
Referring to Fig. 1, in one aspect, the present disclosure provides a cloth defect detection method based on an adversarial neural network.
In some embodiments, the detection method includes a training phase and a testing phase, which specifically comprise the following steps:
Step 1: train a GAN model on positive samples of the cloth and output the GAN model once training is complete. Referring to Fig. 2, a flowchart of the training phase: positive samples of the cloth are collected and preprocessed, the GAN model is trained on them, and the trained GAN model is output when training finishes. The GAN model is trained offline. Because the method trains the model on positive samples only, it sidesteps the problem that cloth defects come in many varieties and are hard to collect, which better matches actual production conditions.
Referring to Fig. 3, the GAN (Generative Adversarial Network) model is inspired by the two-player zero-sum game of game theory; the two players in a GAN are a generator (G) and a discriminator (D). The generator produces pseudo samples from non-standard input images (noise data); the generator's pseudo samples and the collected real samples of cloth are both fed to the discriminator, which must tell the pseudo samples from the real ones, and the GAN model is updated according to the judgment results. The generator therefore tries to output samples as close to real as possible to fool the discriminator, while the discriminator tries to distinguish whether an incoming sample is real or a fake image produced by the generator. The final solution of the game is a Nash equilibrium: ideally, the discriminator's judgment is strong enough, yet when the generator's pseudo samples are fed into it, the discriminator can hardly tell whether they are real or fake; that is, the discriminator's discriminative ability is maximized while the probability of judging the generator's output to be a fake image is minimized. The discriminator therefore maximizes the following loss function, classifying real samples as positive examples and generated samples as negative examples whenever possible:

\max_D \; \mathbb{E}_{x \sim p_r}[\log D(x)] + \mathbb{E}_{x \sim p_g}[\log(1 - D(x))]        (Formula 1)

where p_r denotes the real sample distribution and p_g the distribution of samples produced by the generator. For the generator there are two loss functions, namely:

\min_G \; \mathbb{E}_{x \sim p_g}[\log(1 - D(x))]        (Formula 2)

\min_G \; \mathbb{E}_{x \sim p_g}[-\log D(x)]        (Formula 3)

Combining Formula 1 and Formula 2, the entire training process can be expressed as:

\min_G \max_D \; V(D,G) = \mathbb{E}_{x \sim p_r}[\log D(x)] + \mathbb{E}_{x \sim p_g}[\log(1 - D(x))]        (Formula 4)
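Formulas 1 to 4 can be illustrated numerically. The sketch below is a framework-free Python rendering of the Monte-Carlo estimates of those losses over a batch of discriminator outputs; it is an illustration only, not the patent's implementation, and the probability values are made-up examples:

```python
import math

def discriminator_objective(d_real, d_fake):
    """Formula 1: mean log D(x) over real samples plus mean log(1 - D(x))
    over generated samples; the discriminator maximizes this (max is 0)."""
    return (sum(math.log(p) for p in d_real) / len(d_real)
            + sum(math.log(1 - p) for p in d_fake) / len(d_fake))

def generator_loss_saturating(d_fake):
    """Formula 2: mean log(1 - D(x)) over generated samples (minimized)."""
    return sum(math.log(1 - p) for p in d_fake) / len(d_fake)

def generator_loss_nonsaturating(d_fake):
    """Formula 3: mean -log D(x) over generated samples (minimized);
    gives stronger gradients while the generator is still weak."""
    return sum(-math.log(p) for p in d_fake) / len(d_fake)

# D outputs near 1 on real samples and near 0 on fakes, i.e. D is winning:
d_real, d_fake = [0.9, 0.8], [0.1, 0.2]
print(discriminator_objective(d_real, d_fake))   # about -0.33, near its maximum of 0
print(generator_loss_nonsaturating(d_fake))      # about 1.96, G is heavily penalized
```

At the Nash equilibrium described above, D outputs 0.5 everywhere and neither side can improve its objective unilaterally.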
In some embodiments, the GAN model adopts the DCGAN (Deep Convolutional Generative Adversarial Network) model, whose convolutional layers' strong feature-extraction capability improves results. Referring to Fig. 4, a structure diagram of the DCGAN model, the DCGAN architecture differs from the plain GAN model in the following respects:
1. Batch normalization is used in both the generator and the discriminator, which solves the problem of poor initialization, ensures that gradients propagate to every layer, and prevents the generator from collapsing all samples onto the same point;
2. Strided convolutions replace spatial pooling in the discriminator model, while the deconvolution layers in the generator model use fractionally-strided convolutions; and
3. For deeper architectures, the fully connected layers are removed, which increases model stability. The Tanh activation function is used in the generator's output layer, the ReLU activation function in its other layers, and the leaky ReLU activation function in the discriminator.
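The shape arithmetic behind feature 2 above can be sketched with the standard convolution size formulas. This is a hedged illustration: the kernel size 4, stride 2, padding 1, and the 64-pixel input are typical DCGAN-style values assumed for the example, not dimensions taken from this disclosure:

```python
def strided_conv_size(n, k, s, p):
    """Output side length of a strided convolution on an n x n input with
    kernel k, stride s, padding p; downsamples when s > 1, replacing a
    pooling layer in the discriminator."""
    return (n + 2 * p - k) // s + 1

def fractional_strided_conv_size(n, k, s, p):
    """Output side length of a transposed ('fractionally strided')
    convolution, the inverse mapping, used to upsample in the generator."""
    return (n - 1) * s - 2 * p + k

# A DCGAN-style discriminator halves the feature map at each layer:
sizes = [64]
for _ in range(4):
    sizes.append(strided_conv_size(sizes[-1], k=4, s=2, p=1))
print(sizes)  # [64, 32, 16, 8, 4]

# The generator mirrors this, growing 4 -> 64 with transposed convolutions:
up = [4]
for _ in range(4):
    up.append(fractional_strided_conv_size(up[-1], k=4, s=2, p=1))
print(up)     # [4, 8, 16, 32, 64]
```

Because the learned strides replace fixed pooling, the network itself decides how spatial information is compressed and re-expanded.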
Referring to Fig. 5, on top of the DCGAN model the present disclosure adds a latent-variable input C_vector in order to reconstruct the data better. The latent variable and the noise data are sent to the generator, which produces pseudo samples; the pseudo samples and the real samples are sent to the discriminator, which both performs the real-versus-fake judgment and computes mutual information with the latent variable. The generator and the discriminator are updated according to this mutual information, so that the generator's output images retain more of the latent variable's information. In this scheme, the network is trained on the collected positive samples according to the above model; once the trained GAN model is obtained, the testing phase shown in Fig. 6 can begin, comprising Step 2 and Step 3.
Step 2: send a test image sample of the cloth to the GAN model, which reconstructs the test image sample to generate a defect-free reconstructed image sample. Referring to Figs. 7 and 8: Fig. 7 is a picture of a test image sample of cloth that carries a stain; Fig. 7 is sent to the GAN model, which reconstructs it into the sample picture of Fig. 8, in which the stain from Fig. 7 is absent.
Step 3: compare the test image sample against the reconstructed image sample for structural similarity, and judge whether the test image sample is defective according to the degree of difference. The structural similarity index (SSIM) is a metric of the similarity of two images, measured from three angles: luminance, contrast, and structure. The luminance difference is estimated as:

L(x,y) = \frac{2u_x u_y + C_1}{u_x^2 + u_y^2 + C_1}        (Formula 5)

the contrast difference is estimated as:

C(x,y) = \frac{2\sigma_x \sigma_y + C_2}{\sigma_x^2 + \sigma_y^2 + C_2}        (Formula 6)

and the structural difference is estimated as:

S(x,y) = \frac{\sigma_{xy} + C_3}{\sigma_x \sigma_y + C_3}        (Formula 7)

In Formulas 5 to 7, u_x and u_y denote the means of image block x and image block y, \sigma_x and \sigma_y denote their standard deviations, \sigma_x^2 and \sigma_y^2 denote their variances, and \sigma_{xy} denotes their covariance. To avoid a zero denominator, three constants C_1, C_2, and C_3 are introduced, where C_1 = (K_1 \times L)^2, C_2 = (K_2 \times L)^2, and C_3 = 0.5C_2; typically K_1 = 0.01 and K_2 = 0.03, and L = 255 is the dynamic range of the pixel values.

The structural similarity comparison combines the luminance, contrast, and structural differences, and is calculated as:

M(x,y) = L(x,y) \times C(x,y) \times S(x,y)        (Formula 8)
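Formulas 5 to 8 can be evaluated directly. The following is a minimal plain-Python sketch for two equally sized grayscale blocks given as flat lists of pixel values in 0-255; a production implementation would run it over sliding windows, but the constants follow the text (K1 = 0.01, K2 = 0.03, L = 255, C3 = 0.5 * C2):

```python
import math

def ssim_block(x, y, K1=0.01, K2=0.03, L=255):
    """Structural similarity M(x, y) = L * C * S of two pixel blocks,
    per Formulas 5-8."""
    n = len(x)
    ux, uy = sum(x) / n, sum(y) / n
    vx = sum((a - ux) ** 2 for a in x) / n                  # variance of x
    vy = sum((b - uy) ** 2 for b in y) / n                  # variance of y
    cov = sum((a - ux) * (b - uy) for a, b in zip(x, y)) / n
    sx, sy = math.sqrt(vx), math.sqrt(vy)
    C1, C2 = (K1 * L) ** 2, (K2 * L) ** 2
    C3 = 0.5 * C2
    lum = (2 * ux * uy + C1) / (ux ** 2 + uy ** 2 + C1)     # Formula 5
    con = (2 * sx * sy + C2) / (sx ** 2 + sy ** 2 + C2)     # Formula 6
    struct = (cov + C3) / (sx * sy + C3)                    # Formula 7
    return lum * con * struct                               # Formula 8

block = [10, 20, 30, 40]
print(ssim_block(block, block))              # approx. 1.0: identical blocks
print(ssim_block(block, [40, 30, 20, 10]))   # well below 1: structure inverted
```

The constants C1-C3 keep the score defined even for perfectly flat blocks, where all variances are zero.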
Figs. 7 and 8 are selected for comparison. Since Fig. 8 is a defect-free image generated by the GAN model, if the difference between Fig. 7 and Fig. 8 reaches a predetermined value, Fig. 7 can be judged to contain a defect; otherwise, the sample in Fig. 7 can be judged to be a good product.
In some embodiments, positive samples of cloth are used to train the GAN model. At test time, a test image sample of the cloth (e.g., Fig. 7) is sent to the GAN model for data reconstruction; wherever a defect exists, the generative capability of the GAN network produces a defect-free reconstructed image sample (e.g., Fig. 8). SSIM similarity features are then extracted to calculate the degree of difference between the test image sample and the reconstructed image sample, and whether the test image sample is defective is judged from that degree of difference, thereby achieving defect detection. The method can detect defects such as color differences, holes, lint, wrong warp, wrong weft, creases, and stains, and is applicable to various types of cloth including cotton, linen, non-woven fabric, printed cloth, denim, and woven fabrics.
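The decision step above can be sketched end to end: tile the test image and its GAN reconstruction, score each aligned pair of tiles, and flag tiles whose similarity falls below a threshold. The tile size, the 0.7 threshold, and the toy mean-absolute-difference score below are illustrative assumptions only; in practice the SSIM of Formula 8 would serve as the score function:

```python
def detect_defects(test_img, recon_img, block, score_fn, threshold=0.7):
    """Split both images (2-D lists of equal size) into block x block tiles,
    score each aligned pair with score_fn, and return the top-left
    coordinates of tiles whose similarity falls below threshold."""
    h, w = len(test_img), len(test_img[0])
    flagged = []
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            tx = [test_img[r][c] for r in range(i, i + block)
                                 for c in range(j, j + block)]
            ty = [recon_img[r][c] for r in range(i, i + block)
                                  for c in range(j, j + block)]
            if score_fn(tx, ty) < threshold:
                flagged.append((i, j))
    return flagged

def toy_score(x, y, L=255):
    """Toy stand-in for SSIM: mean absolute difference mapped to [0, 1]."""
    return 1.0 - sum(abs(a - b) for a, b in zip(x, y)) / (len(x) * L)

clean = [[120] * 4 for _ in range(4)]        # GAN reconstruction (uniform cloth)
stained = [row[:] for row in clean]
for r in (2, 3):
    for c in (2, 3):
        stained[r][c] = 0                    # a dark "stain" in one corner tile
print(detect_defects(stained, clean, block=2, score_fn=toy_score))  # [(2, 2)]
```

A defect-free test image yields an empty list, matching the "good product" branch of the decision.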
It can be understood that the GAN model and the structural-similarity judgment method described in the embodiments of the present disclosure merely illustrate embodiments of the detection method and are not the only implementation: the GAN model may be replaced by improved variants of GAN-related models, and the structural-similarity judgment may be replaced by improved structural-similarity methods, all of which fall within the protection scope of this scheme.
In another aspect, the present disclosure provides a terminal including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the above detection method when executing the computer program. In some embodiments, the terminal may include hardware such as a cloth conveying mechanism, an industrial camera, a screen, and a printing device. In some embodiments, during cloth inspection the cloth is mounted on the cloth conveying mechanism, whose motion conveys and stretches the cloth; the industrial camera captures images of the cloth, and the captured images are inspected with the above detection method. The inspection results and data can be recorded and sent to cloud storage, displayed on the screen, or printed out by the printing device, which facilitates data collection and collation and helps subsequent, data-driven process improvements that raise cloth yield.
In yet another aspect, the present disclosure provides a storage medium including computer instructions which, when run on a computer, cause the computer to execute the above detection method.
In some embodiments, the present disclosure trains a GAN model on positive samples of cloth, uses the GAN model to reconstruct a test image sample of the cloth into a reconstructed image sample, and judges whether a defect exists by comparing the test image sample with the reconstructed image sample for structural similarity. The method can detect defects such as color differences, holes, lint, wrong warp, wrong weft, creases, and stains; it is applicable to various types of cloth including cotton, linen, non-woven fabric, printed cloth, denim, and woven fabrics; it largely solves the practical production problems that defective samples are hard to collect and cloth backgrounds are intricate; and its detection speed and accuracy can meet the requirements of actual production inspection.
The above are only preferred embodiments of the present disclosure and are not intended to limit it; any modification, equivalent replacement, or improvement made within the spirit and principles of the present disclosure shall fall within its protection scope.

Claims (10)

  1. A cloth defect detection method based on an adversarial neural network, comprising the following steps:
    Step 1: training a GAN model on positive samples of the cloth, and outputting the GAN model once training is complete;
    Step 2: sending a test image sample of the cloth to the GAN model, which reconstructs the test image sample to generate a defect-free reconstructed image sample; and
    Step 3: comparing the test image sample against the reconstructed image sample for structural similarity, and judging whether the test image sample is defective according to the degree of difference.
  2. The cloth defect detection method based on an adversarial neural network according to claim 1, wherein the GAN model is trained offline.
  3. The cloth defect detection method based on an adversarial neural network according to claim 1 or 2, wherein the GAN model is a DCGAN model.
  4. The cloth defect detection method based on an adversarial neural network according to claim 3, wherein in the DCGAN model both the generator and the discriminator use batch normalization; strided convolutions replace spatial pooling in the discriminator, and the deconvolution layers in the generator use fractionally-strided convolutions; fully connected layers are removed in the deeper parts of the DCGAN architecture; and the Tanh activation function is used in the generator's output layer, the ReLU activation function in the generator's other layers, and the leaky ReLU activation function in the discriminator.
  5. The cloth defect detection method based on an adversarial neural network according to claim 3 or 4, wherein in the DCGAN model a latent-variable input is added to the generator input, and the discriminator both performs the real-versus-fake judgment and computes mutual information with the latent variable.
  6. The cloth defect detection method based on an adversarial neural network according to any one of claims 3 to 5, wherein the DCGAN model updates the generator and the discriminator according to the mutual information.
  7. The cloth defect detection method based on an adversarial neural network according to any one of claims 1 to 6, wherein the features compared for structural similarity include a luminance difference, a contrast difference, and a structural difference.
  8. The cloth defect detection method based on an adversarial neural network according to claim 7, wherein the luminance difference is estimated as:

    L(x,y) = \frac{2u_x u_y + C_1}{u_x^2 + u_y^2 + C_1}

    the contrast difference is estimated as:

    C(x,y) = \frac{2\sigma_x \sigma_y + C_2}{\sigma_x^2 + \sigma_y^2 + C_2}

    and the structural difference is estimated as:

    S(x,y) = \frac{\sigma_{xy} + C_3}{\sigma_x \sigma_y + C_3}

    where u_x and u_y denote the means of image block x and image block y, \sigma_x and \sigma_y denote their standard deviations, \sigma_x^2 and \sigma_y^2 denote their variances, and \sigma_{xy} denotes their covariance; C_1 = (K_1 \times L)^2, C_2 = (K_2 \times L)^2, C_3 = 0.5C_2, with K_1 = 0.01, K_2 = 0.03, and L = 255 the dynamic range of the pixel values; and the structural similarity comparison formula is:

    M(x,y) = L(x,y) \times C(x,y) \times S(x,y).
  9. A terminal comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any one of claims 1 to 8 when executing the computer program.
  10. A storage medium comprising computer instructions which, when run on a computer, cause the computer to execute the method of any one of claims 1 to 8.
PCT/CN2020/111475 2020-06-19 2020-08-26 Cloth defect detection method based on adversarial neural network, and terminal and storage medium WO2021253632A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010566989.8 2020-06-19
CN202010566989.8A CN111724372A (en) 2020-06-19 2020-06-19 Method, terminal and storage medium for detecting cloth defects based on antagonistic neural network

Publications (1)

Publication Number Publication Date
WO2021253632A1 true WO2021253632A1 (en) 2021-12-23

Family

ID=72567830

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/111475 WO2021253632A1 (en) 2020-06-19 2020-08-26 Cloth defect detection method based on adversarial neural network, and terminal and storage medium

Country Status (2)

Country Link
CN (1) CN111724372A (en)
WO (1) WO2021253632A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114549507A (en) * 2022-03-01 2022-05-27 浙江理工大学 Method for detecting fabric defects by improving Scaled-YOLOv4
CN114858782A (en) * 2022-07-05 2022-08-05 中国民航大学 Milk powder doping non-directional detection method based on Raman hyperspectral countermeasure discrimination model

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112419241A (en) * 2020-11-04 2021-02-26 联想(北京)有限公司 Object identification method and device based on artificial intelligence and readable storage medium
CN112381794B (en) * 2020-11-16 2022-05-31 哈尔滨理工大学 Printing defect detection method based on deep convolution generation network
CN113516615B (en) * 2020-11-24 2024-03-01 阿里巴巴集团控股有限公司 Sample generation method, system, equipment and storage medium
CN113361583A (en) * 2021-06-01 2021-09-07 珠海大横琴科技发展有限公司 Countermeasure sample detection method and device
CN113570549A (en) * 2021-06-30 2021-10-29 青岛海尔科技有限公司 Defect detection method and device for reflective surface
CN113554080A (en) * 2021-07-15 2021-10-26 长沙长泰机器人有限公司 Non-woven fabric defect detection and classification method and system based on machine vision
CN115294555B (en) * 2022-09-27 2022-12-20 江苏景瑞农业科技发展有限公司 Plant disease intelligent diagnosis method and system based on neural network

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108961217A (en) * 2018-06-08 2018-12-07 南京大学 A kind of detection method of surface flaw based on positive example training
US20190378263A1 (en) * 2018-06-08 2019-12-12 Industrial Technology Research Institute Industrial image inspection method and system and computer readable recording medium
CN110796637A (en) * 2019-09-29 2020-02-14 郑州金惠计算机系统工程有限公司 Training and testing method and device of image defect detection model and storage medium
CN111144477A (en) * 2019-12-25 2020-05-12 浙江工业大学之江学院 Method and system for generating training sample of steel surface defects and electronic equipment
CN111223093A (en) * 2020-03-04 2020-06-02 武汉精立电子技术有限公司 AOI defect detection method


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ALEC RADFORD, LUKE METZ, SOUMITH CHINTALA: "Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks", 19 November 2015 (2015-11-19), XP055399452, Retrieved from the Internet <URL:https://arxiv.org/pdf/1511.06434.pdf> *
DOSSELMANN RICHARD, XUE DONG YANG: "A Formal Assessment of the Structural Similarity Index", 30 September 2008 (2008-09-30), XP055881204, Retrieved from the Internet <URL:https://www.cs.uregina.ca/Research/Techreports/2008-02.pdf> *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114549507A (en) * 2022-03-01 2022-05-27 浙江理工大学 Method for detecting fabric defects by improving Scaled-YOLOv4
CN114549507B (en) * 2022-03-01 2024-05-24 浙江理工大学 Improved Scaled-YOLOv fabric flaw detection method
CN114858782A (en) * 2022-07-05 2022-08-05 中国民航大学 Milk powder doping non-directional detection method based on Raman hyperspectral countermeasure discrimination model

Also Published As

Publication number Publication date
CN111724372A (en) 2020-09-29


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20941085; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20941085; Country of ref document: EP; Kind code of ref document: A1)