CN112036541B - Fabric defect detection method based on genetic algorithm optimization neural network - Google Patents

Fabric defect detection method based on genetic algorithm optimization neural network

Info

Publication number
CN112036541B
Authority
CN
China
Prior art keywords
fabric
genetic algorithm
defects
neural network
defect
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011112620.6A
Other languages
Chinese (zh)
Other versions
CN112036541A (en)
Inventor
余灵婕
陈梦琦
支超
祝双武
孙润军
郜仲元
王帅
柯真霞
周尤勇
朱梦秋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Polytechnic University
Original Assignee
Xian Polytechnic University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Polytechnic University filed Critical Xian Polytechnic University
Priority to CN202011112620.6A priority Critical patent/CN112036541B/en
Publication of CN112036541A publication Critical patent/CN112036541A/en
Application granted granted Critical
Publication of CN112036541B publication Critical patent/CN112036541B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/004: Artificial life, i.e. computing arrangements simulating life
    • G06N3/006: Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84: Systems specially adapted for particular applications
    • G01N21/88: Investigating the presence of flaws or contamination
    • G01N21/8851: Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/04: Architecture, e.g. interconnection topology
    • G06N3/045: Combinations of networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods
    • G06N3/084: Backpropagation, e.g. using gradient descent
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods
    • G06N3/086: Learning methods using evolutionary algorithms, e.g. genetic algorithms or genetic programming
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/0002: Inspection of images, e.g. flaw detection
    • G06T7/0004: Industrial image inspection
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84: Systems specially adapted for particular applications
    • G01N21/88: Investigating the presence of flaws or contamination
    • G01N21/8851: Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887: Scan or image signal processing specially adapted therefor, based on image processing techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20081: Training; Learning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20084: Artificial neural networks [ANN]
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30: Computing systems specially adapted for manufacturing

Abstract

The invention discloses a fabric defect detection method based on a genetic algorithm optimized neural network, and belongs to the field of data processing. The method comprises the following steps: initializing Gabor filter parameters; collecting fabric defect images; annotating the images to obtain defect types and bounding boxes containing defects, and establishing a PascalVOC data set; sending the PascalVOC data set into a Faster-RCNN network for training, and calculating the mAP; taking the mAP as the fitness function of a genetic algorithm, performing mutation, crossover and selection to obtain offspring Gabor parameters until the number of iterations reaches a set maximum, and outputting the optimal genotype, namely the optimal filter parameters of the fabric defect image; and calling the corresponding Faster-RCNN model to output the positions, types and accuracy of the fabric defects. The invention separates defects from the background well, and the fabric detection model has high accuracy and good universality in defect detection.

Description

Fabric defect detection method based on genetic algorithm optimization neural network
Technical Field
The invention belongs to the field of data processing, and particularly relates to a fabric defect detection method based on a genetic algorithm optimized neural network.
Background
Fabric defect detection still relies largely on manual inspection. The work is highly repetitive, labor-intensive and time-consuming, the result depends heavily on the proficiency of the inspector, and accuracy, consistency and efficiency cannot be guaranteed. Because fabric textures (including single-weave, composite-weave, knitted, twill and jacquard fabrics) are complex and fabric colors vary widely, the contrast between defects and the background texture is low, which makes it difficult to separate defects from the background and to achieve a good recognition result. Fabric defect detection methods can be divided into four categories: statistical methods, structural methods, model-based methods and spectral methods. Statistical methods typically use first- and second-order statistics to represent texture features; however, detecting subtle defects from gray-value statistics alone is very challenging. In statistical methods, the gray values of an image are described by various representations, such as autocorrelation functions, co-occurrence matrices, mathematical morphology and fractal dimension. Structural methods treat the texture as a set of texture elements and can segment defects effectively when the pattern is regular. Spectral methods are among the most widely used; they detect fabric defects from the spectral characteristics of the image and mainly include the Fourier transform, the wavelet transform and the Gabor transform. The Gabor transform approximates the frequency and orientation representation of the human visual system, is commonly used for texture filtering, and is good at extracting local spatial and frequency-domain information of a target. However, a Gabor filter has many parameters, and choosing appropriate parameters is an important factor in whether the filtering succeeds. Existing fitness functions can hardly meet the requirement of suppressing the fabric background while highlighting the defects, i.e. the defects' requirement on the Gabor parameter combination, so the accuracy and efficiency of fabric defect detection remain low.
Disclosure of Invention
The invention aims to overcome the drawbacks that existing fitness functions can hardly meet the requirement of eliminating the fabric background while highlighting the defects, that is, the defects' requirement on the Gabor parameter combination, and that the accuracy and efficiency of fabric defect detection are low, and therefore provides a fabric defect detection method based on a genetic algorithm optimized neural network.
In order to achieve the purpose, the invention is realized by adopting the following technical scheme:
a fabric defect detection method based on genetic algorithm optimization neural network comprises the following steps:
s1: establishing an initial population based on a genetic algorithm to obtain initial Gabor filtering parameters;
s2: collecting fabric defect images, and cutting and segmenting the fabric defect images;
s3: obtaining defect categories and frames containing defects by using LabelImg marks, and establishing a PascalVOC data set;
s4: sending the PascalVOC data set into a Faster-RCNN network training model for training, and calculating mAP;
s5: taking the mAP as a fitness function of a genetic algorithm, performing mutation, crossover and selection to obtain a progeny Gabor parameter until the iteration number reaches a set maximum value, and outputting an optimal genotype, namely an optimal Gabor filter parameter of a fabric defect image;
the corresponding Faster-RCNN model is then invoked to output the location, type and accuracy of the fabric defects.
Further, in step S1, the specific operation of establishing the initial population based on the genetic algorithm is:
setting the population size, the number of iterations and the fitness function of the population evolution.
Further, in step S2, the collected fabric defect images are:
JPEG images of rubbed-hole defects, pricked edge-hole defects or stain defects.
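For reference, filtering a collected defect image with a Gabor kernel can be sketched with OpenCV as follows. The file name and the concrete parameter values here are illustrative assumptions; in the method, the parameters are the genes selected by the genetic algorithm.

```python
import cv2

img = cv2.imread("fabric_defect.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical file name

# getGaborKernel(ksize, sigma, theta, lambd, gamma, psi): these are the genes being optimized
kernel = cv2.getGaborKernel(ksize=(31, 31), sigma=4.0, theta=0.0,
                            lambd=10.0, gamma=0.5, psi=0.0)
filtered = cv2.filter2D(img, -1, kernel)  # -1 keeps the source image depth
cv2.imwrite("fabric_defect_filtered.jpg", filtered)
```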
Further, the specific process of establishing the PascalVOC dataset in step S3 is as follows: the JPEG pictures containing defects are segmented and cropped, part of the pictures are taken as the training set, the bounding boxes are annotated with the software LabelImg using the label categories Hole, edgehole and Stain respectively, and xml files are generated;
the pictures and xml files are organized into the VOC2007 dataset format, and the txt split files test, train, trainval and val are generated.
Further, the process of step S4 is as follows:
s4-1: inputting pictures with any size, entering a backbone network resnet50 for convolution, and outputting a Feature Map;
s4-2: the Feature Map generates a plurality of anchors through an RPN module, cuts the anchors, judges whether the anchors belong to a foreground or a background through softmax, and corrects anchors by frame regression to obtain proposals;
s4-3: the Rol Pooling layer obtains proposal feature maps with fixed size by utilizing proposals generated by the RPN module and the feature map obtained before;
s4-4: classifying the suggested frame feature images by Classification, and classifying specific categories by using the full connection layer and softmax; meanwhile, the L1 Loss is used for completing frame regression operation to obtain the accurate position of the object, a Loss function is calculated, and parameters of the whole network are updated to obtain a training model.
Further, the backbone network resnet50 has four groups of blocks, with 3, 4, 6 and 3 blocks in the four groups respectively; each block contains three convolution layers, and the network begins with a single convolution layer, giving (3+4+6+3)×3+1=49 convolution layers in total plus 1 fully connected layer;
conv1:7×7×64, output size 112×112;
conv2_x:3 blocks, each block having 1×1×64, 3×3×64, 1×1×256, and an output size of 56×56;
conv3_x:4 blocks, each block having 1×1×128, 3×3×128, 1×1×512, and an output size of 28×28;
conv4_x:6 blocks, each block having 1×1×256, 3×3×256, 1×1×1024, and output size of 14×14;
conv5_x:3 blocks, each block comprising 1×1×512, 3×3×512, 1×1×2048, output size 7×7.
Further, the training loss of the training model includes a classification loss and a regression loss, and the total loss function is as follows:
$$L(\{p_i\},\{t_i\}) = \frac{1}{N_{cls}} \sum_i L_{cls}(p_i, p_i^*) + \lambda \frac{1}{N_{reg}} \sum_i p_i^* L_{reg}(t_i, t_i^*)$$

wherein: $i$ is an integer denoting the index of each sample; $p_i$ denotes the probability that the $i$-th anchor is predicted to be the target, and $p_i^*$ denotes the corresponding probability for the $i$-th calibration (ground-truth) box; $t_i = \{t_x, t_y, t_w, t_h\}$ is the vector of four parameterized coordinates of the predicted box, and $t_i^*$ is the vector of four parameterized coordinates of the calibration box; $N_{cls}$ is the normalization size of the cls term; $\lambda$ is a balancing weight; $N_{reg}$ normalizes the reg term and equals the number of anchor locations; $L_{cls}(p_i, p_i^*)$ is the classification loss, defined as $-\log[p_i^* p_i + (1-p_i^*)(1-p_i)]$, where $p_i$ is the probability of being predicted as a certain class and $p_i^*$ is the label of the annotated ground-truth data, with $p_i^* = 1$ if the current sample is a positive sample and $p_i^* = 0$ if it is a negative sample; $L_{reg}(t_i, t_i^*)$ is the bounding-box regression loss, defined as $\mathrm{smooth}_{L1}(t_i - t_i^*)$; the $\mathrm{smooth}_{L1}$ function is defined as:

$$\mathrm{smooth}_{L1}(x) = \begin{cases} 0.5x^2, & |x| < 1 \\ |x| - 0.5, & \text{otherwise} \end{cases}$$
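The classification and regression terms above can also be written compactly in code. The sketch below uses PyTorch tensors and follows the usual Faster R-CNN convention of setting $\lambda$ around 10 and restricting the regression loss to positive anchors; these conventions are assumptions, not values stated in the patent.

```python
import torch

def smooth_l1(x: torch.Tensor) -> torch.Tensor:
    # 0.5*x^2 if |x| < 1, otherwise |x| - 0.5
    absx = x.abs()
    return torch.where(absx < 1, 0.5 * x ** 2, absx - 0.5)

def total_loss(p, p_star, t, t_star, lam=10.0):
    # p: predicted objectness probabilities, shape (N,)
    # p_star: ground-truth labels in {0, 1}, shape (N,)
    # t, t_star: parameterized box coordinates of predictions / ground truth, shape (N, 4)
    n_cls = p.numel()
    n_reg = p.numel()                      # number of anchor locations (assumed equal here)
    eps = 1e-7
    l_cls = -(p_star * (p + eps).log() + (1 - p_star) * (1 - p + eps).log())
    l_reg = smooth_l1(t - t_star).sum(dim=1)
    return l_cls.sum() / n_cls + lam * (p_star * l_reg).sum() / n_reg
```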
compared with the prior art, the invention has the following beneficial effects:
the invention relates to a fabric defect detection method based on a genetic algorithm optimized neural network, which comprises the steps of collecting fabric defect sample images, establishing a data set, and establishing a fabric defect detection model based on PascalVOC; cutting an image before training a model, and performing direct coordinate prediction, loss value calculation and back propagation when training the model to obtain a network weight pre-parameter of a textile defect detection model; performing multiple network weight calculation and updating by using the training set to obtain optimal network weight parameters and obtain a trained fabric defect detection model; and then optimizing a Gabor kernel by taking mAP of an evaluation model index as an objective function based on a multi-population genetic algorithm (MPGA) training method, and finally obtaining the optimal filtering parameters of the fabric defect image and the defect detection result of the fabric image. The optimized Gabor parameters can well separate defects from backgrounds, and the fabric detection model has high accuracy and good universality in defect detection.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 (a) is an oil-stain defect image of example 1;
FIG. 2 (b) shows the oil-stain defect image of example 1 after filtering;
FIG. 2 (c) shows the detected position and type of the oil stain in example 1;
FIG. 3 (a) is an oil-stain defect image of example 2;
FIG. 3 (b) shows the oil-stain defect image of example 2 after filtering;
FIG. 3 (c) shows the detected position and type of the oil stain in example 2;
FIG. 4 (a) is a rubbed-hole defect image of example 3;
FIG. 4 (b) shows the rubbed-hole defect image of example 3 after filtering;
FIG. 4 (c) shows the detected position and type of the rubbed hole in example 3;
FIG. 5 (a) is a rubbed-hole defect image of example 4;
FIG. 5 (b) shows the rubbed-hole defect image of example 4 after filtering;
FIG. 5 (c) shows the detected position and type of the rubbed hole in example 4;
FIG. 6 (a) is a pricked-edge-hole defect image of example 5;
FIG. 6 (b) shows the pricked-edge-hole defect image of example 5 after filtering;
FIG. 6 (c) shows the detected position and type of the pricked edge hole in example 5;
FIG. 7 (a) is a pricked-edge-hole defect image of example 6;
FIG. 7 (b) shows the pricked-edge-hole defect image of example 6 after filtering;
FIG. 7 (c) shows the detected position and type of the pricked edge hole in example 6.
Detailed Description
In order that those skilled in the art may better understand the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
A population-based genetic algorithm provides a general framework for searching for globally optimal parameters and can solve the problem of finding the best parameter combination; the choice of the fitness function during evolution is the key to training Gabor parameters with a genetic algorithm.
The invention is described in further detail below with reference to the attached drawing figures:
referring to fig. 1, fig. 1 is a flow chart of the present invention; a fabric defect detection method based on genetic algorithm optimization neural network comprises collecting fabric defect sample image, establishing data set, and establishing fabric defect detection model based on PascalVOC; cutting an image before training a model, and performing direct coordinate prediction, loss value calculation and back propagation when training the model to obtain a network weight pre-parameter of a textile defect detection model; performing multiple network weight calculation and updating by using the training set to obtain optimal network weight parameters, and obtaining a trained fabric defect detection model; and then optimizing a Gabor kernel by taking mAP of an evaluation model index as an objective function based on a multi-population genetic algorithm (MPGA) training method, and finally obtaining the optimal filtering parameters of the fabric defect image and the defect detection result of the fabric image. The optimized Gabor parameters can well separate defects from backgrounds, and the fabric detection model has high accuracy and good universality in defect detection.
Example 1
FIG. 2 (a) is a collected oil-stain defect image with a large area and unclear edges; FIG. 2 (b) is the image after filtering with the optimal filter parameters found for this image by the genetic algorithm combined with the neural network; FIG. 2 (c) shows the predicted position and type of the oil stain output by the target detection.
Example 2
FIG. 3 (a) is a collected oil-stain defect image with a small area located near the fabric edge; FIG. 3 (b) is the image after filtering with the optimal filter parameters found for this image by the genetic algorithm combined with the neural network; FIG. 3 (c) shows the predicted position and type of the oil stain output by the target detection.
Example 3
FIG. 4 (a) is a collected rubbed-hole defect image; the cloth surface has no creases and the broken hole is relatively obvious. FIG. 4 (b) is the image after filtering with the optimal filter parameters found for this image by the genetic algorithm combined with the neural network; FIG. 4 (c) shows the predicted position and type of the rubbed hole output by the target detection.
Example 4
FIG. 5 (a) is a collected rubbed-hole defect image with an obvious crease across the defect area, so the broken hole is less obvious. FIG. 5 (b) is the image after filtering with the optimal filter parameters found for this image by the genetic algorithm combined with the neural network; FIG. 5 (c) shows the predicted position and type of the rubbed hole output by the target detection; compared with example 3, the detection and localization accuracy in example 4 is higher.
Example 5
FIG. 6 (a) is a collected pricked-edge-hole defect image; the cloth surface has no obvious crease but is uneven, and the pricked edge hole is relatively obvious. FIG. 6 (b) is the image after filtering with the optimal filter parameters found for this image by the genetic algorithm combined with the neural network; FIG. 6 (c) shows the predicted position and type of the pricked edge hole output by the target detection.
Example 6
FIG. 7 (a) is a collected pricked-edge-hole defect image with obvious creases on the cloth surface, so the pricked edge hole is less obvious. FIG. 7 (b) is the image after filtering with the optimal filter parameters found for this image by the genetic algorithm combined with the neural network; FIG. 7 (c) shows the predicted position and type of the pricked edge hole output by the target detection; compared with example 5, the detection and localization accuracy in example 6 is higher.
The above merely illustrates the technical idea of the present invention and does not limit its protection scope; any modification made on the basis of the technical solution according to the technical idea of the present invention falls within the protection scope of the claims of the present invention.

Claims (6)

1. A fabric defect detection method based on genetic algorithm optimization neural network is characterized by comprising the following steps:
s1: establishing an initial population based on a genetic algorithm to obtain initial Gabor filtering parameters;
s2: collecting fabric defect images, and cutting and segmenting the fabric defect images;
s3: obtaining defect categories and frames containing defects by using LabelImg marks, and establishing a PascalVOC data set;
s4: sending the PascalVOC data set into a Faster-RCNN network training model for training, and calculating mAP;
s5: taking the mAP as a fitness function of a genetic algorithm, performing mutation, crossover and selection to obtain a progeny Gabor parameter until the iteration number reaches a set maximum value, and outputting an optimal genotype, namely an optimal Gabor filter parameter of a fabric defect image;
then, the corresponding Faster-RCNN model is called to output the position, type and accuracy of the fabric defects;
the specific process of establishing the PascalVOC dataset in step S3 is to segment and crop the JPEG pictures containing defects, take part of the pictures as the training set, annotate the bounding boxes with the software LabelImg using the label categories Hole, edgehole and Stain respectively, and generate xml files;
the pictures and xml files are organized into the VOC2007 dataset format, and the txt split files test, train, trainval and val are generated.
2. A method of detecting defects in fabrics based on a genetic algorithm optimized neural network as claimed in claim 1, wherein in step S1 the specific operation of establishing the initial population based on the genetic algorithm is:
setting the population size, the number of iterations and the fitness function of the population evolution.
3. The method for detecting defects in a fabric based on a genetic algorithm optimized neural network according to claim 1, wherein in step S2 the collected fabric defect images are:
JPEG images of rubbed-hole defects, pricked edge-hole defects or stain defects.
4. The method for detecting defects in a fabric based on a genetic algorithm optimized neural network according to claim 1, wherein the process of step S4 is as follows:
s4-1: inputting pictures with any size, entering a backbone network resnet50 for convolution, and outputting a Feature Map;
s4-2: the Feature Map generates a plurality of anchors through an RPN module, cuts the anchors, judges whether the anchors belong to a foreground or a background through softmax, and corrects anchors by frame regression to obtain proposals;
s4-3: the Rol Pooling layer obtains proposal feature maps with fixed size by utilizing proposals generated by the RPN module and the feature map obtained before;
s4-4: classifying the suggested frame feature images by Classification, and classifying specific categories by using the full connection layer and softmax; meanwhile, the L1 Loss is used for completing frame regression operation to obtain the accurate position of the object, a Loss function is calculated, and parameters of the whole network are updated to obtain a training model.
5. A method of detecting defects in a fabric based on a neural network optimized by a genetic algorithm as claimed in claim 4, wherein the backbone network resnet50 has four groups of blocks, each group having 3, 4, 6, 3 blocks, three convolutional layers in each block, and a single convolutional layer in the beginning of the network, so that the total is (3+4+6+3) ×3+1=49 convolutional layers, 1 fully-connected layer;
conv1:7×7×64, output size 112×112;
conv2_x:3 blocks, each block having 1×1×64, 3×3×64, 1×1×256, and an output size of 56×56;
conv3_x:4 blocks, each block having 1×1×128, 3×3×128, 1×1×512, and an output size of 28×28;
conv4_x:6 blocks, each block having 1×1×256, 3×3×256, 1×1×1024, and output size of 14×14;
conv5_x:3 blocks, each block comprising 1×1×512, 3×3×512, 1×1×2048, output size 7×7.
6. A method of detecting defects in a fabric based on a genetic algorithm optimized neural network as claimed in claim 4, wherein the training loss of the training model includes a classification loss and a regression loss, and the total loss function is as follows:
$$L(\{p_i\},\{t_i\}) = \frac{1}{N_{cls}} \sum_i L_{cls}(p_i, p_i^*) + \lambda \frac{1}{N_{reg}} \sum_i p_i^* L_{reg}(t_i, t_i^*)$$

wherein: $i$ is an integer denoting the index of each sample; $p_i$ denotes the probability that the $i$-th anchor is predicted to be the target, and $p_i^*$ denotes the corresponding probability for the $i$-th calibration (ground-truth) box; $t_i = \{t_x, t_y, t_w, t_h\}$ is the vector of four parameterized coordinates of the predicted box, and $t_i^*$ is the vector of four parameterized coordinates of the calibration box; $N_{cls}$ is the normalization size of the cls term; $\lambda$ is a balancing weight; $N_{reg}$ normalizes the reg term and equals the number of anchor locations; $L_{cls}(p_i, p_i^*)$ is the classification loss, defined as $-\log[p_i^* p_i + (1-p_i^*)(1-p_i)]$, where $p_i$ is the probability of being predicted as a certain class and $p_i^*$ is the label of the annotated ground-truth data, with $p_i^* = 1$ if the current sample is a positive sample and $p_i^* = 0$ if it is a negative sample; $L_{reg}(t_i, t_i^*)$ is the bounding-box regression loss, defined as $\mathrm{smooth}_{L1}(t_i - t_i^*)$; the $\mathrm{smooth}_{L1}$ function is defined as:

$$\mathrm{smooth}_{L1}(x) = \begin{cases} 0.5x^2, & |x| < 1 \\ |x| - 0.5, & \text{otherwise} \end{cases}$$
CN202011112620.6A 2020-10-16 2020-10-16 Fabric defect detection method based on genetic algorithm optimization neural network Active CN112036541B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011112620.6A CN112036541B (en) 2020-10-16 2020-10-16 Fabric defect detection method based on genetic algorithm optimization neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011112620.6A CN112036541B (en) 2020-10-16 2020-10-16 Fabric defect detection method based on genetic algorithm optimization neural network

Publications (2)

Publication Number Publication Date
CN112036541A CN112036541A (en) 2020-12-04
CN112036541B true CN112036541B (en) 2023-11-17

Family

ID=73573675

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011112620.6A Active CN112036541B (en) 2020-10-16 2020-10-16 Fabric defect detection method based on genetic algorithm optimization neural network

Country Status (1)

Country Link
CN (1) CN112036541B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116562358B (en) * 2023-03-16 2024-01-09 中国人民解放军战略支援部队航天工程大学士官学校 Construction method of image processing Gabor kernel convolutional neural network

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060122147A (en) * 2005-05-25 2006-11-30 삼성전자주식회사 Gabor filter and filtering method thereof, and image processing method adopting the same
CN103955922A (en) * 2014-04-17 2014-07-30 西安工程大学 Method for detecting flaws of printed fabric based on Gabor filter
CN105205828A (en) * 2015-10-20 2015-12-30 江南大学 Warp knitted fabric flaw detection method based on optimal Gabor filter
CN106845556A (en) * 2017-02-09 2017-06-13 东华大学 A kind of fabric defect detection method based on convolutional neural networks
CN108520114A (en) * 2018-03-21 2018-09-11 华中科技大学 A kind of textile cloth defect detection model and its training method and application
CN109613006A (en) * 2018-12-22 2019-04-12 中原工学院 A kind of fabric defect detection method based on end-to-end neural network
CN110930387A (en) * 2019-11-21 2020-03-27 中原工学院 Fabric defect detection method based on depth separable convolutional neural network

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6753965B2 (en) * 2001-01-09 2004-06-22 The University Of Hong Kong Defect detection system for quality assurance using automated visual inspection

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060122147A (en) * 2005-05-25 2006-11-30 삼성전자주식회사 Gabor filter and filtering method thereof, and image processing method adopting the same
CN103955922A (en) * 2014-04-17 2014-07-30 西安工程大学 Method for detecting flaws of printed fabric based on Gabor filter
CN105205828A (en) * 2015-10-20 2015-12-30 江南大学 Warp knitted fabric flaw detection method based on optimal Gabor filter
CN106845556A (en) * 2017-02-09 2017-06-13 东华大学 A kind of fabric defect detection method based on convolutional neural networks
CN108520114A (en) * 2018-03-21 2018-09-11 华中科技大学 A kind of textile cloth defect detection model and its training method and application
CN109613006A (en) * 2018-12-22 2019-04-12 中原工学院 A kind of fabric defect detection method based on end-to-end neural network
CN110930387A (en) * 2019-11-21 2020-03-27 中原工学院 Fabric defect detection method based on depth separable convolutional neural network

Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
Fabric Defect Detection Based on Faster RCNN; Bing Wei et al.; Artificial Intelligence on Fashion and Textiles Conference; 45-51 *
Research on defect detection and classification algorithms for yarn-dyed fabrics based on convolutional neural networks; 董阿梅; China Masters' Theses Full-text Database (Information Science and Technology), No. 02; I138-2139 *
Fabric defect detection method based on texture periodicity analysis; 祝双武 et al.; Computer Engineering and Applications, No. 21; 163-166 *
Design of real Gabor filters and their application to fabric defect detection; 陈泽虹; China Masters' Theses Full-text Database (Engineering Science and Technology I), No. 04; B024-8 *
Yarn-dyed fabric defect recognition using GAN and Faster R-CNN; 李明 et al.; Journal of Xi'an Polytechnic University, No. 06; 663-669 *
Warp-knitted fabric defect detection using optimal Gabor filters; 尉苗苗 et al.; Journal of Textile Research, No. 11; 48-54 *
Woven fabric defect detection using a genetic algorithm to optimize Gabor filters; 周文明 et al.; Journal of Donghua University (Natural Science Edition), No. 04; 535-541 *

Also Published As

Publication number Publication date
CN112036541A (en) 2020-12-04

Similar Documents

Publication Publication Date Title
CN110852316B (en) Image tampering detection and positioning method adopting convolution network with dense structure
CN112966691B (en) Multi-scale text detection method and device based on semantic segmentation and electronic equipment
CN113392775B (en) Sugarcane seedling automatic identification and counting method based on deep neural network
CN108288271A (en) Image detecting system and method based on three-dimensional residual error network
CN108038846A (en) Transmission line equipment image defect detection method and system based on multilayer convolutional neural networks
CN109154978A (en) System and method for detecting plant disease
CN109978918A (en) A kind of trajectory track method, apparatus and storage medium
CN110427990A (en) A kind of art pattern classification method based on convolutional neural networks
CN109871875B (en) Building change detection method based on deep learning
CN110647802A (en) Remote sensing image ship target detection method based on deep learning
CN109615604B (en) Part appearance flaw detection method based on image reconstruction convolutional neural network
CN108710893A (en) A kind of digital image cameras source model sorting technique of feature based fusion
CN107944403A (en) Pedestrian's attribute detection method and device in a kind of image
CN102855478A (en) Method and device for positioning text areas in image
CN112036541B (en) Fabric defect detection method based on genetic algorithm optimization neural network
CN114882306A (en) Topographic map scale identification method and device, storage medium and electronic equipment
Xue et al. Automatic identification of butterfly species based on gray-level co-occurrence matrix features of image block
CN116468690A (en) Subtype analysis system of invasive non-mucous lung adenocarcinoma based on deep learning
CN116416523A (en) Machine learning-based rice growth stage identification system and method
CN115063679A (en) Pavement quality assessment method based on deep learning
CN114782948A (en) Global interpretation method and system for cervical liquid-based cytology smear
CN107992863A (en) Multiresolution grain worm species visual identity method
CN114445691A (en) Model training method and device, electronic equipment and storage medium
CN111461060A (en) Traffic sign identification method based on deep learning and extreme learning machine
Semwal et al. Copy move image forgery detection using machine learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant