WO2021253632A1 - Cloth defect detection method based on adversarial neural network, and terminal and storage medium - Google Patents

Cloth defect detection method based on adversarial neural network, and terminal and storage medium

Info

Publication number
WO2021253632A1
WO2021253632A1 PCT/CN2020/111475 CN2020111475W
Authority
WO
WIPO (PCT)
Prior art keywords
cloth
neural network
model
image sample
generator
Prior art date
Application number
PCT/CN2020/111475
Other languages
English (en)
Chinese (zh)
Inventor
周凯
吴小飞
庞凤江
武艳萍
彭其栋
张帆
Original Assignee
深圳新视智科技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳新视智科技术有限公司 filed Critical 深圳新视智科技术有限公司
Publication of WO2021253632A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30124Fabrics; Textile; Paper

Definitions

  • the present disclosure generally relates to the technical field of cloth defect detection, and more specifically to a cloth defect detection method, terminal, and storage medium based on an adversarial neural network.
  • the surface quality not only affects the appearance of a product but may also affect its functional characteristics, causing significant losses to the enterprise. It is therefore necessary to detect the surface defects of the product, and it is particularly important to design a surface defect detection method that is both real-time and effective.
  • the traditional approach to cloth inspection is manual: under certain light-source conditions, workers directly observe the cloth to be tested and judge whether it is defective based on experience and intuition.
  • This method is easily affected by subjective human factors, leading to false detections or missed detections; its detection efficiency is very low, and it cannot meet the needs of modern large-scale production.
  • in addition, the results and data of manual inspection cannot be saved automatically, which is inconvenient for subsequent data queries and makes it difficult to guide future cloth production.
  • the core of a cloth defect detection system lies in its defect detection algorithm.
  • current common defect detection algorithms can be roughly divided into the following categories: statistical, spectral, model-based, learning-based, and structural.
  • Statistical algorithms realize defect detection by characterizing the spatial distribution of image gray values, and mainly include co-occurrence matrices, mathematical morphology, and similar techniques.
  • for example, a pre-trained Gabor wavelet network can be used to extract important features, achieving a defect detection rate higher than 90%.
  • Spectral algorithms express image characteristics in different domains to detect defects in the image, and include Fourier-transform, wavelet-transform, and Gabor-transform algorithms. Fourier-transform ideas have also been applied directly to defect detection.
  • in one such approach, histogram equalization is used to enhance the contrast of the fabric image, the central frequency spectrum is then calculated by FFT, and good defect detection results are obtained.
  • the structural algorithm is based on the texture structure of the inspected object, with a corresponding algorithm designed for each type of texture structure.
  • saliency analysis algorithms are also commonly used in cloth inspection.
  • a context-based local texture saliency analysis algorithm uses local binary patterns to extract texture features and applies them to texture image detection, and its good results demonstrate the algorithm's effectiveness.
  • Learning methods mainly comprise neural networks and deep learning, including BP artificial neural networks and defect detection methods based on convolutional neural networks.
  • These methods can detect and locate defects in images with textured or special structures, and their performance is far better than that of traditional detection methods.
  • the above detection methods have achieved good results in specific applications, but they still have shortcomings.
  • although the Gabor-transform-based detection method provides a joint spatial and frequency representation, the computation of the filter templates is complicated and the computational load is enormous.
  • the detection method based on mathematical morphology can capture the size and shape of a defect and display the defect well, but it does not have enough visual-effect support.
  • the detection process of the above methods roughly consists of image preprocessing, target extraction, feature selection, and pattern classification.
  • the existing deep learning detection algorithms mainly include Faster R-CNN, SSD, and YOLOv3. These algorithms work well in general scenes but are not effective in the cloth industry, mainly because there are too many types of cloth with widely varying defect shapes, and there is so far no good solution for the complex, ever-changing backgrounds of plaid patterns and prints. Deep learning also places requirements on the data samples: collecting samples of the hundreds of kinds of fabric defects is time-consuming and laborious. Moreover, in the final stages of fabric manufacturing, processes such as shaping, mercerizing, and liquid-ammonia finishing run very fast, while deep learning has high computational complexity, which makes deployment very challenging.
  • the present disclosure provides a cloth defect detection method, a terminal, and a storage medium based on an adversarial neural network.
  • the present disclosure provides a cloth defect detection method based on an adversarial neural network, which includes: Step 1: train a GAN model on positive samples of the cloth, and output the GAN model after training is completed; Step 2: send an inspection image sample of the cloth to the GAN model, which reconstructs the inspection image sample to generate a defect-free reconstructed image sample; and Step 3: compare the inspection image sample with the reconstructed image sample by structural similarity, and determine whether the inspection image sample is defective according to the degree of difference found in the comparison.
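Purely as an illustration of how the three steps fit together (this is not the claimed implementation), the flow can be sketched in a few lines of Python; the `reconstruct` callable standing in for the trained GAN, the use of scikit-image's SSIM, and the 0.9 threshold are assumptions introduced for this sketch only.

```python
# Minimal sketch of Steps 2-3; Step 1 (offline GAN training) is assumed to have
# produced the `reconstruct` callable.  Threshold and SSIM settings are assumptions.
import numpy as np
from skimage.metrics import structural_similarity as ssim

def is_defective(test_image: np.ndarray, reconstruct, threshold: float = 0.9) -> bool:
    reconstructed = reconstruct(test_image)                 # Step 2: GAN reconstruction
    score = ssim(test_image, reconstructed,
                 data_range=float(test_image.max() - test_image.min()))
    return score < threshold                                # Step 3: low similarity -> defect
```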
  • offline training is used when training the GAN model.
  • the GAN model is a DCGAN model.
  • both the generator and the discriminator in the DCGAN model use batch normalization; strided convolutions are used in the discriminator to replace spatial pooling, and the deconvolution layers in the generator use fractionally-strided convolutions; the fully connected layers in the deeper parts of the DCGAN architecture are removed; the Tanh activation function is used in the output layer of the generator, the ReLU activation function is used in the other layers of the generator, and the leaky ReLU activation function is used in the discriminator.
  • a hidden variable input is added to the generator input in the DCGAN model.
  • the discriminator not only needs to participate in the true and false judgment, but also needs to seek mutual information with the hidden variable.
  • the DCGAN model updates the generator and discriminator based on mutual information.
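The mutual-information objective is not written out in the text above; the InfoGAN formulation commonly used for this purpose (stated here as background, not as the exact objective of the disclosure) subtracts a variational lower bound on the mutual information between the hidden code c and the generated image from the GAN value function:

```latex
\min_{G,Q}\;\max_{D}\; V_{\mathrm{Info}}(D,G,Q) \;=\; V(D,G) \;-\; \lambda\, L_I(G,Q),
\qquad
L_I(G,Q) \;=\; \mathbb{E}_{c \sim p(c),\; x \sim G(z,c)}\big[\log Q(c \mid x)\big] + H(c)
\;\le\; I\big(c;\, G(z,c)\big)
```

where Q is an auxiliary network that predicts the hidden code from the generated image and λ weights the mutual-information term.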
  • the characteristics of structural similarity contrast include brightness difference, contrast difference, and structural difference.
  • the brightness difference is estimated as L(x,y), the contrast difference is estimated as C(x,y), and the structural difference is estimated as S(x,y).
  • the combined structural similarity is M(x,y) = L(x,y) · C(x,y) · S(x,y).
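For reference, the three measures above follow the form of the standard SSIM definitions from the literature; the constants below are the customary stabilizers and are not taken from the published formulas, which are not reproduced here.

```latex
L(x,y) = \frac{2\mu_x \mu_y + C_1}{\mu_x^2 + \mu_y^2 + C_1}, \qquad
C(x,y) = \frac{2\sigma_x \sigma_y + C_2}{\sigma_x^2 + \sigma_y^2 + C_2}, \qquad
S(x,y) = \frac{\sigma_{xy} + C_3}{\sigma_x \sigma_y + C_3}
```

Here μ_x and μ_y are local means, σ_x and σ_y local standard deviations, σ_xy the local covariance of the two images, and C_1, C_2, C_3 small constants that keep the denominators away from zero.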
  • the present disclosure also provides a terminal.
  • the terminal includes a memory, a processor, and a computer program that is stored on the memory and can run on the processor.
  • the processor implements the above detection method when the computer program is executed.
  • the present disclosure also provides a storage medium, the storage medium including computer instructions which, when run on a computer, cause the computer to execute the above detection method.
  • Fig. 1 is a flowchart of a detection method in an embodiment of the present disclosure.
  • Fig. 2 is a flowchart of the training phase of the detection method in an embodiment of the present disclosure.
  • FIG. 3 is a schematic diagram of the operation of a generative adversarial network in an embodiment of the present disclosure.
  • Fig. 4 is a structure diagram of a DCGAN model in an embodiment of the present disclosure.
  • Fig. 5 is a structural diagram of a model with hidden variables added in an embodiment of the present disclosure.
  • Figure 6 is a flow chart of the testing phase of the detection method in an embodiment of the present disclosure.
  • Fig. 7 is a picture of an inspection image sample of cloth in an embodiment of the present disclosure.
  • FIG. 8 is a picture of a good reconstructed image sample after reconstruction of a GAN model in an embodiment of the present disclosure.
  • the present disclosure provides a cloth defect detection method based on an adversarial neural network.
  • the detection method includes a training phase and a testing phase, which specifically include the following steps:
  • Step 1 Train the GAN model according to the positive samples of the cloth, and output the GAN model after the training is completed.
  • Figure 2 is a flow chart of the training phase.
  • the GAN model is trained after collecting positive samples of cloth and preprocessing the positive samples. After the training is completed, the GAN model is output.
  • the training of the GAN model adopts offline training.
  • the detection method trains the model on collected positive samples, which avoids the problems posed by the wide variety of cloth defects and the difficulty of collecting them, and is better matched to actual production conditions.
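The preprocessing of the positive samples is not fixed by this description; as one plausible sketch (the patch size, grayscale input, and [-1, 1] scaling chosen to match a Tanh generator output are assumptions), the collected samples might be prepared as follows.

```python
# Illustrative preparation of positive (defect-free) cloth samples for training.
# Patch size and the [-1, 1] scaling are assumptions, not requirements of the disclosure.
import numpy as np

def preprocess_positive_sample(image: np.ndarray, patch: int = 64) -> np.ndarray:
    """Crop non-overlapping patches from a grayscale image and scale to [-1, 1]."""
    h, w = image.shape[:2]
    patches = [
        image[i:i + patch, j:j + patch]
        for i in range(0, h - patch + 1, patch)
        for j in range(0, w - patch + 1, patch)
    ]
    return np.stack(patches).astype(np.float32) / 127.5 - 1.0   # uint8 [0, 255] -> [-1, 1]
```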
  • the GAN (Generative Adversarial Networks) model is inspired by the two-player zero-sum game in game theory.
  • the two players of the game in a GAN are the generator (generative model, G) and the discriminator (discriminative model, D).
  • the generator produces a pseudo sample from the input noise data; the pseudo sample generated by the generator and a real sample of the collected cloth are both sent to the discriminator, and the discriminator must distinguish the fake sample from the real sample.
  • the GAN model is updated according to the judgment result. Therefore, the generator must produce outputs as close to real samples as possible in order to fool the discriminator, while the discriminator must distinguish as well as possible whether an input sample is a real sample or a fake image generated by the generator.
  • the final solution of the game is the Nash equilibrium.
  • the ultimate goal is for the discriminator's discriminative ability to become strong enough while the pseudo samples produced by the generator remain difficult for the discriminator to judge as real or fake; that is, the discriminating ability of the discriminator is maximized while the probability that the generator's output image is judged to be fake is minimized. Therefore, the discriminator must maximize the following loss function, classifying the real samples as positive examples and the generated samples as negative examples as far as possible:
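The loss function referred to here corresponds to the standard GAN value function; written in the usual notation (the symbols of the published formula are not reproduced here), it is:

```latex
\min_{G}\;\max_{D}\; V(D,G) \;=\;
\mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\big[\log D(x)\big] \;+\;
\mathbb{E}_{z \sim p_{z}(z)}\big[\log\big(1 - D(G(z))\big)\big]
```

The discriminator D maximizes this value by assigning real samples to the positive class and generated samples to the negative class, while the generator G minimizes it.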
  • the GAN model adopts the DCGAN (Deep Convolutional Generative Adversarial Networks) model, which can improve the detection effect through the powerful feature-extraction capability of its convolutional layers.
  • Figure 4 is a diagram of the DCGAN model structure.
  • the model structure of DCGAN has the following characteristics:
  • both the generator and the discriminator use batch normalization; strided convolutions replace spatial pooling in the discriminator, and the deconvolution layers in the generator use fractionally-strided convolutions; the fully connected layers in the deeper parts of the architecture are removed; the Tanh activation function is used in the output layer of the generator, the ReLU activation function is used in its other layers, and the leaky ReLU activation function is used in the discriminator.
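As an illustration of these design choices only (the layer sizes of the Fig. 4 architecture are not reproduced here, so the latent dimension and channel counts below are assumptions), a 64×64 single-channel generator/discriminator pair in PyTorch could look like the following.

```python
# Illustrative DCGAN pair following the listed guidelines: batch normalization,
# strided / fractionally-strided convolutions instead of pooling, no fully
# connected layers, Tanh output for G, ReLU in G, LeakyReLU in D.
# Layer sizes are assumptions for 64x64 grayscale cloth patches, not Fig. 4.
import torch.nn as nn

def make_generator(z_dim: int = 100) -> nn.Sequential:
    return nn.Sequential(
        nn.ConvTranspose2d(z_dim, 256, 4, 1, 0, bias=False), nn.BatchNorm2d(256), nn.ReLU(True),
        nn.ConvTranspose2d(256, 128, 4, 2, 1, bias=False), nn.BatchNorm2d(128), nn.ReLU(True),
        nn.ConvTranspose2d(128, 64, 4, 2, 1, bias=False), nn.BatchNorm2d(64), nn.ReLU(True),
        nn.ConvTranspose2d(64, 32, 4, 2, 1, bias=False), nn.BatchNorm2d(32), nn.ReLU(True),
        nn.ConvTranspose2d(32, 1, 4, 2, 1, bias=False), nn.Tanh(),   # 64x64 output in [-1, 1]
    )

def make_discriminator() -> nn.Sequential:
    return nn.Sequential(
        nn.Conv2d(1, 32, 4, 2, 1, bias=False), nn.LeakyReLU(0.2, inplace=True),
        nn.Conv2d(32, 64, 4, 2, 1, bias=False), nn.BatchNorm2d(64), nn.LeakyReLU(0.2, inplace=True),
        nn.Conv2d(64, 128, 4, 2, 1, bias=False), nn.BatchNorm2d(128), nn.LeakyReLU(0.2, inplace=True),
        nn.Conv2d(128, 256, 4, 2, 1, bias=False), nn.BatchNorm2d(256), nn.LeakyReLU(0.2, inplace=True),
        nn.Conv2d(256, 1, 4, 1, 0, bias=False), nn.Sigmoid(),        # real/fake probability
    )
```

The generator upsamples a latent vector to a 64×64 image with fractionally-strided convolutions, and the discriminator downsamples with strided convolutions instead of pooling, matching the guidelines above.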
  • the present disclosure adds a latent-variable input C_vector to the generator input of the DCGAN model in order to better reconstruct the data.
  • the hidden variables and the noise data are sent to the generator, which generates pseudo samples; the pseudo samples and the real samples are then sent to the discriminator.
  • the discriminator not only participates in the true-or-false judgment but also seeks mutual information with the hidden variables, and the generator and the discriminator are updated according to the mutual information, so that more hidden-variable information is retained in the images generated by the generator.
  • a trained GAN model can be obtained.
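One way to realize the training just described, with the mutual-information term entering both updates, is an InfoGAN-style auxiliary network Q that predicts the latent code back from the generated image. In the sketch below, G is assumed to accept the concatenation of the noise z and the code c, the code is treated as continuous with an L2 surrogate for the log-likelihood, and these choices (like the tensor shapes) are assumptions rather than details fixed by the description above.

```python
# Illustrative per-step losses for a GAN with an InfoGAN-style mutual-information
# term.  The MI term is added to both the generator update and the
# discriminator/Q update, as described above; shapes and the L2 surrogate are
# assumptions for this sketch.
import torch
import torch.nn.functional as F

def gan_mi_losses(G, D, Q, real, z, c):
    """Return (d_loss, g_loss, mi_loss) for one training step."""
    zc = torch.cat([z, c], dim=1).view(z.size(0), -1, 1, 1)   # noise + latent code
    fake = G(zc)

    d_real = D(real)
    ones, zeros = torch.ones_like(d_real), torch.zeros_like(d_real)

    # Discriminator: real samples -> 1, generated pseudo samples -> 0
    d_loss = F.binary_cross_entropy(d_real, ones) + \
             F.binary_cross_entropy(D(fake.detach()), zeros)

    # Generator: try to make the discriminator output 1 for pseudo samples
    g_loss = F.binary_cross_entropy(D(fake), ones)

    # Mutual-information surrogate: Q should recover the code c from the image
    mi_loss = F.mse_loss(Q(fake), c)
    return d_loss, g_loss, mi_loss
```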
  • the testing phase shown in Figure 6 can be started, and the testing phase includes step 2 and step 3.
  • Step 2 Send the inspection image sample of the cloth to the GAN model, and the GAN model reconstructs the inspection image sample to generate a defect-free reconstructed image sample.
  • Figure 7 shows an inspection image sample of the cloth; the cloth in Figure 7 is stained. Figure 7 is sent to the GAN model, which reconstructs it and generates a reconstructed sample picture (Figure 8); the cloth in Figure 8 no longer shows the stains present in the original Figure 7.
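How the trained GAN maps an inspection image such as Figure 7 to a defect-free reconstruction such as Figure 8 is not spelled out in this text; one common realization, offered here only as an assumption, is an AnoGAN-style latent search that optimizes a latent code until the generator's output matches the input as closely as possible.

```python
# Illustrative latent-space search ("reconstruction") with a trained generator G.
# This AnoGAN-style procedure is one possible realization, not asserted to be the
# mechanism of the disclosure; step count and learning rate are assumptions.
import torch

def reconstruct(G, test_image, z_dim: int = 100, steps: int = 500, lr: float = 1e-2):
    """test_image: tensor shaped like G's output, e.g. (1, 1, 64, 64) in [-1, 1]."""
    G.eval()
    z = torch.randn(1, z_dim, 1, 1, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.mean((G(z) - test_image) ** 2)   # pixel-wise match to the input
        loss.backward()
        opt.step()
    with torch.no_grad():
        return G(z)                                    # defect-free reconstruction
```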
  • Step 3 Perform a structural similarity comparison between the inspection image sample and the reconstructed image sample, and determine whether the inspection image sample is defective according to the degree of difference found in the comparison.
  • Structural similarity (SSIM) is an index that measures the similarity of two images from three different perspectives: brightness, contrast, and structure. Among them, the brightness difference is estimated as L(x,y), the contrast difference is estimated as C(x,y), and the structural difference is estimated as S(x,y).
  • the structural similarity comparison combines the brightness difference, the contrast difference, and the structural difference, and the calculation formula is M(x,y) = L(x,y) · C(x,y) · S(x,y).
  • Figure 7 and Figure 8 are selected for comparison. Since Figure 8 is a defect-free image generated by the GAN model, if the difference between Figure 7 and Figure 8 reaches a predetermined value, it can be judged that there is a defect in Figure 7; if it does not, the sample in Figure 7 can be judged to be a good product.
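A minimal version of this comparison, using scikit-image's SSIM implementation, is sketched below; the 0.9 threshold is an assumption standing in for the predetermined value mentioned above and would be tuned on production data.

```python
# Illustrative SSIM comparison between the inspection image (e.g. Figure 7) and
# its reconstruction (e.g. Figure 8).  The threshold is an assumption to be tuned.
import numpy as np
from skimage.metrics import structural_similarity

def judge(test_image: np.ndarray, reconstructed: np.ndarray, threshold: float = 0.9):
    score, diff_map = structural_similarity(
        test_image, reconstructed, full=True,
        data_range=float(test_image.max() - test_image.min()))
    defective = score < threshold        # large difference -> defect present
    return defective, diff_map           # diff_map can also localize the defect
```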
  • a positive sample of cloth is used to train the GAN model, and during testing an inspection image sample of the cloth (for example, Figure 7) is sent to the GAN model for data reconstruction; where defects are present, the generative capability of the GAN network produces a reconstructed image sample (such as Figure 8). SSIM similarity features are then extracted to calculate the degree of difference between the inspection image sample and the reconstructed image sample, and whether the inspection image sample is defective is judged and output according to that degree of difference.
  • it can detect defects such as color differences, holes, flying lint, warp errors, weft errors, wrinkles, and stains on the cloth, and it is suitable for various types of fabric such as cotton, linen, non-woven fabric, printed cloth, denim, and woven fabric.
  • the GAN model and the structural-similarity determination method described in the embodiments of the present disclosure are only used to illustrate the detection method and are not the only possible implementation.
  • the GAN model can also be replaced by various improved GAN models, and the structural-similarity judgment can be replaced by improved methods related to structural similarity, all of which fall within the protection scope of this scheme.
  • the terminal includes a memory, a processor, and a computer program that is stored on the memory and can run on the processor.
  • the processor implements the foregoing detection method when the computer program is executed.
  • the terminal may include hardware such as a cloth conveying mechanism, an industrial camera, a screen, and a printing device.
  • when performing cloth inspection, the cloth is mounted on the cloth conveying mechanism, and the cloth conveying mechanism moves and stretches the cloth.
  • the industrial camera collects images of the cloth, and the collected images are inspected using the above detection method.
  • the detection results and data can be recorded and sent to the cloud for storage, and can also be displayed on the screen or printed out by the printing device. This facilitates data statistics and collation, and the accumulated data help to improve the process and increase the fabric yield rate.
  • the present disclosure provides a storage medium.
  • the storage medium includes computer instructions. When the computer instructions run on a computer, the computer executes the above detection method.
  • the present disclosure trains a GAN model on positive samples of cloth, and then uses the GAN model to reconstruct the inspection image sample of the cloth to generate a reconstructed image sample.
  • the two samples are compared by structural similarity to judge whether defects are present. The method can detect defects such as color differences, holes, flying lint, warp errors, weft errors, wrinkles, and stains, and it is suitable for various types of fabric such as cotton, linen, non-woven fabric, printed fabric, denim, and woven fabric. It can well solve the problems of the difficulty of collecting defective samples and of intricate fabric backgrounds in the actual production process, and its detection speed and detection accuracy can meet the requirements of actual production testing.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Computing Systems (AREA)
  • Evolutionary Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Quality & Reliability (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Image Analysis (AREA)
  • Treatment Of Fiber Materials (AREA)
  • Image Processing (AREA)

Abstract

A cloth defect detection method based on an adversarial neural network, and a terminal and a storage medium are disclosed. The detection method comprises the following steps: step 1: training a GAN model on a positive sample of cloth, and outputting the GAN model once training is completed; step 2: sending an inspection image sample of the cloth to the GAN model, and reconstructing, by the GAN model, the inspection image sample so as to generate a defect-free reconstructed image sample; and step 3: performing a structural similarity comparison between the inspection image sample and the reconstructed image sample, and determining, according to a degree of difference obtained by the comparison, whether the inspection image sample is defective.
PCT/CN2020/111475 2020-06-19 2020-08-26 Cloth defect detection method based on adversarial neural network, and terminal and storage medium WO2021253632A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010566989.8 2020-06-19
CN202010566989.8A CN111724372A (zh) 2020-06-19 2020-06-19 基于对抗神经网络的布匹缺陷检测方法、终端和存储介质

Publications (1)

Publication Number Publication Date
WO2021253632A1 true WO2021253632A1 (fr) 2021-12-23

Family

ID=72567830

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/111475 WO2021253632A1 (fr) 2020-06-19 2020-08-26 Cloth defect detection method based on adversarial neural network, and terminal and storage medium

Country Status (2)

Country Link
CN (1) CN111724372A (fr)
WO (1) WO2021253632A1 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112419241A (zh) * 2020-11-04 2021-02-26 联想(北京)有限公司 一种基于人工智能的物体鉴别方法、装置和可读存储介质
CN112381794B (zh) * 2020-11-16 2022-05-31 哈尔滨理工大学 一种基于深度卷积生成网络的印刷缺陷检测方法
CN113516615B (zh) * 2020-11-24 2024-03-01 阿里巴巴集团控股有限公司 一种样本生成方法、系统、设备及存储介质
CN113361583A (zh) * 2021-06-01 2021-09-07 珠海大横琴科技发展有限公司 一种对抗样本检测方法和装置
CN113570549A (zh) * 2021-06-30 2021-10-29 青岛海尔科技有限公司 反光表面的缺陷检测方法及装置
CN113554080A (zh) * 2021-07-15 2021-10-26 长沙长泰机器人有限公司 一种基于机器视觉的无纺布瑕疵检测分类方法及系统
CN113674263A (zh) * 2021-08-27 2021-11-19 浙江捷瑞电力科技有限公司 一种基于生成式对抗网络的小样本缺陷检测方法
CN115294555B (zh) * 2022-09-27 2022-12-20 江苏景瑞农业科技发展有限公司 基于神经网络的植物病害智能诊断方法及系统

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108961217A (zh) * 2018-06-08 2018-12-07 南京大学 一种基于正例训练的表面缺陷检测方法
US20190378263A1 (en) * 2018-06-08 2019-12-12 Industrial Technology Research Institute Industrial image inspection method and system and computer readable recording medium
CN110796637A (zh) * 2019-09-29 2020-02-14 郑州金惠计算机系统工程有限公司 图像缺陷检测模型的训练、测试方法、装置及存储介质
CN111144477A (zh) * 2019-12-25 2020-05-12 浙江工业大学之江学院 一种钢材表面缺陷的训练样本生成方法、系统及电子设备
CN111223093A (zh) * 2020-03-04 2020-06-02 武汉精立电子技术有限公司 一种aoi缺陷检测方法

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ALEC RADFORD, LUKE METZ, SOUMITH CHINTALA: "Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks", 19 November 2015 (2015-11-19), XP055399452, Retrieved from the Internet <URL:https://arxiv.org/pdf/1511.06434.pdf> *
DOSSELMANN RICHARD, XUE DONG YANG: "A Formal Assessment of the Structural Similarity Index", 30 September 2008 (2008-09-30), XP055881204, Retrieved from the Internet <URL:https://www.cs.uregina.ca/Research/Techreports/2008-02.pdf> *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114549507A (zh) * 2022-03-01 2022-05-27 浙江理工大学 改进Scaled-YOLOv4的织物瑕疵检测方法
CN114723660A (zh) * 2022-03-01 2022-07-08 华中科技大学 一种布匹缺陷检测方法及系统
CN114549507B (zh) * 2022-03-01 2024-05-24 浙江理工大学 改进Scaled-YOLOv4的织物瑕疵检测方法
CN114858782A (zh) * 2022-07-05 2022-08-05 中国民航大学 基于拉曼高光谱对抗判别模型的奶粉掺杂非定向检测方法
CN117437202A (zh) * 2023-11-01 2024-01-23 深圳市宇创显示科技有限公司 基于图像的偏光片缺陷检测方法、装置、设备及存储介质
CN118482817A (zh) * 2024-07-12 2024-08-13 北京和利时电机技术有限公司 一种基于数据分析的数字化纺织设备管理方法及系统

Also Published As

Publication number Publication date
CN111724372A (zh) 2020-09-29

Similar Documents

Publication Publication Date Title
WO2021253632A1 (fr) Cloth defect detection method based on adversarial neural network, and terminal and storage medium
CN110866907A (zh) 基于注意力机制的全卷积网络织物疵点检测方法
CN110245593A (zh) 一种基于图像相似度的手势图像关键帧提取方法
CN104199823B (zh) 一种基于视觉数据驱动的织物疵点动态检测方法
CN109191430A (zh) 一种基于Laws纹理与单分类SVM结合的素色布匹缺陷检测方法
Lin et al. Defect enhancement generative adversarial network for enlarging data set of microcrack defect
Shi et al. Loss functions for pose guided person image generation
Hu et al. FIN-GAN: Face illumination normalization via retinex-based self-supervised learning and conditional generative adversarial network
CN110660048B (zh) 一种基于形状特征的皮革表面缺陷检测方法
CN110598646B (zh) 一种基于深度特征的无约束重复动作计数方法
Guan et al. Defect detection and classification for plain woven fabric based on deep learning
CN116619780A (zh) 酚醛复合材料的智能化生产方法及其系统
Xu et al. CP3: Unifying point cloud completion by pretrain-prompt-predict paradigm
CN112465810A (zh) 一种纺织品疵点的检测分类方法
CN113112482A (zh) 一种基于注意力机制网络的pcb缺陷检测方法
Yu et al. Research on CNN Algorithm for Monochromatic Fabric Defect Detection
Mayerich et al. Fast cell detection in high-throughput imagery using GPU-accelerated machine learning
KR102178238B1 (ko) 회전 커널을 이용한 머신러닝 기반 결함 분류 장치 및 방법
Perng et al. Automatic surface inspection for directional textures using nonnegative matrix factorization
Liu et al. Fabric defect detection algorithm based on convolution neural network and low-rank representation
CN116645323A (zh) 一种基于数据增强与标准化流的图像缺陷检测方法
CN113780084B (zh) 基于生成式对抗网络的人脸数据扩增方法、电子设备和存储介质
Hsu et al. Deepfake algorithm using multiple noise modalities with two-branch prediction network
Wu et al. RetinaNet-based visual inspection of flexible materials
Soares et al. Timor leste tais motif recognition using wavelet and backpropagation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20941085

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20941085

Country of ref document: EP

Kind code of ref document: A1