CN112541915A - Efficient cloth defect detection method, system and equipment for high-resolution images - Google Patents

Efficient cloth defect detection method, system and equipment for high-resolution images

Info

Publication number
CN112541915A
Authority
CN
China
Prior art keywords
image
defect detection
model
cloth
network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110099327.9A
Other languages
Chinese (zh)
Inventor
何泳澔
苏虎
姜锐
向世明
潘春洪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Automation of Chinese Academy of Science
Shanghai Maritime University
Original Assignee
Institute of Automation of Chinese Academy of Science
Shanghai Maritime University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Automation of Chinese Academy of Science, Shanghai Maritime University filed Critical Institute of Automation of Chinese Academy of Science
Priority to CN202110099327.9A priority Critical patent/CN112541915A/en
Publication of CN112541915A publication Critical patent/CN112541915A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30124Fabrics; Textile; Paper

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The invention belongs to the technical field of image recognition and specifically relates to an efficient cloth defect detection method, system and device for high-resolution images, aiming to solve the problem that real-time, accurate cloth defect detection cannot be achieved on high-resolution cloth surface images. The method comprises: constructing a cloth defect detection model and computing its parameters in combination with the image resolution; preprocessing the input images and dividing them into Batches of a set size; computing the loss value of each Batch in the forward pass of the model and updating the model weight parameters in the backward pass; training iteratively until a set condition is reached to obtain a trained cloth defect detection model; and performing online defect detection on high-resolution cloth surface images with the trained model. The invention jointly adjusts the image resolution, network width and network depth according to the field configuration and the requirements on detection speed and precision, improving detection accuracy and efficiency with fewer parameters.

Description

Efficient cloth defect detection method, system and equipment for high-resolution images
Technical Field
The invention belongs to the technical field of image recognition, and particularly relates to a high-efficiency cloth defect detection method, system and device for a high-resolution image.
Background
Surface defects greatly affect the quality of cloth and lower the product price. Surface defect detection is therefore an important link in the cloth production process and key to guaranteeing product quality. Manual inspection of cloth defects is time-consuming, easily affected by subjective factors, and difficult to keep accurate and reliable. In addition, because some types of defects have very weak features, a high-precision camera must be used to acquire high-resolution images of the cloth surface so that weak defects are fully represented. Moreover, the acquired images cannot simply be downscaled before detection, otherwise small-area defects are easily lost, causing missed or false detections.
Existing image detection methods cannot effectively meet the precision and speed requirements of industrial applications on high-resolution cloth surface images. A cloth defect detection method is therefore urgently needed that guarantees detection precision and accuracy while improving detection efficiency, so as to meet the requirement of real-time, accurate cloth defect detection in industrial applications.
Disclosure of Invention
In order to solve the above problem in the prior art, namely that real-time and accurate cloth defect detection cannot be realized on high-resolution cloth surface images, the invention provides an efficient cloth defect detection method for high-resolution images, which comprises the following steps:
Step S10, constructing a cloth defect detection model based on the deep neural network model EDD, calculating the image resolution based on the hyper-parameter of the cloth defect detection model, and acquiring a training sample set according to the image resolution; the cloth defect detection model comprises a feature extraction network, a feature fusion network and a detection/classification network;
Step S20, calculating the width and depth of the feature fusion network and the depth of the detection/classification network based on the hyper-parameter, and defining the multi-task loss function and optimizer for training the cloth defect detection model;
Step S30, dividing the training sample set into Batches of a set size;
Step S40, selecting any Batch, computing the model's forward-propagation output and its loss value against the labels based on the multi-task loss function, and updating the model weight parameters by back-propagation through the optimizer based on the loss value;
Step S50, repeating step S40 until the loss value falls below a set threshold or a set number of training iterations is reached, obtaining a trained cloth defect detection model;
Step S60, detecting defects in online high-resolution cloth surface images through the trained cloth defect detection model.
In some preferred embodiments, the training sample set is a preprocessed image set obtained by sequentially performing image scaling, image normalization, image flipping, image grouping and image zero padding on each image in the high-resolution cloth surface image set;
the image scaling is to scale the input image to a set size according to an equal aspect ratio;
the image normalization is to normalize the gray value of the zoomed image pixel into an image with a mean value of 0 and a variance of 1;
the image turning is to turn the normalized image up and down and left and right respectively with the probability of 0.5;
the image grouping is to divide the overturned images into two groups according to the aspect ratio, wherein the image with the aspect ratio larger than 1 is taken as one group, the image with the aspect ratio smaller than 1 is taken as the other group, and the images divided into the same Batch belong to the same group;
and the image zero filling is to adjust the images divided into the same Batch into a uniform size through zero filling operation.
In some preferred embodiments, in step S10 the image resolution is calculated from the hyper-parameter of the cloth defect detection model; the calculation formula and the symbol denoting the image resolution are given only as images in the original text.
In some preferred embodiments, the width and depth of the feature fusion network are calculated from the hyper-parameter; the two formulas are given only as images in the original text, with symbols denoting the width and depth of the feature fusion network and the natural logarithm.
In some preferred embodiments, the depth of the detection/classification network is likewise calculated from the hyper-parameter; the formula is given only as an image in the original text, with a symbol denoting the depth of the detection/classification network.
In some preferred embodiments, the multi-task loss function comprises a classification loss using the Focal Loss function and a regression loss using a smooth (Smooth L1) function; the aspect-ratio set of the Anchors placed on the feature layers is a fixed parameter whose values are given only as an image in the original text.
In some preferred embodiments, the feature fusion network performs feature fusion as follows:
Step B10, extracting multi-layer image features of the input image through the convolution and pooling layers of the feature extraction network;
Step B20, after scaling, fusing each higher-level feature with the next lower-level feature in turn; for the bottom-level features P3, P4 and P5, the original feature layers are additionally combined with the higher-level features during fusion. The fusion formulas for layers 3 to 7 are given only as images in the original text; in them, layers 7 and 6 each contribute a first fusion input, layers 5, 4 and 3 contribute further fusion inputs, the inputs are combined with normalized feature weights, and the results are the feature fusion outputs of layers 3, 4, 5, 6 and 7.
In another aspect of the invention, an efficient cloth defect detection system for high-resolution images is provided, the system comprising the following modules:
a model construction and hyper-parameter calculation module, configured to construct a cloth defect detection model based on the deep neural network model EDD and to calculate the hyper-parameter of the cloth defect detection model based on the image resolution of the acquired training image set samples; the cloth defect detection model comprises a feature extraction network, a feature fusion network and a detection/classification network;
a model structure and training parameter definition module, configured to calculate the width and depth of the feature fusion network and the depth of the detection/classification network based on the hyper-parameter, and to define the multi-task loss function and optimizer for training the cloth defect detection model;
a sample dividing module, configured to divide the training sample set into Batches of a set size;
a model training and parameter updating module, configured to select any Batch, compute the model's forward-propagation output and its loss value against the labels based on the multi-task loss function, and update the model weight parameters through back-propagation based on the loss value;
an iteration module, configured to iteratively train the model and update its parameters until the loss value falls below a set threshold or a set number of training iterations is reached, obtaining a trained cloth defect detection model;
a defect detection module, configured to detect defects in online high-resolution cloth surface images through the trained cloth defect detection model.
In a third aspect of the present invention, an electronic device is provided, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the processor for execution by the processor to implement the above-described method for efficient cloth defect detection for high resolution images.
In a fourth aspect of the present invention, a computer-readable storage medium is provided, which stores computer instructions for execution by the computer to implement the above-mentioned efficient cloth defect detection method for high resolution images.
The invention has the beneficial effects that:
(1) The efficient cloth defect detection method for high-resolution images disclosed by the invention jointly optimizes the input image resolution and the network width and depth from a single hyper-parameter, enabling high-precision defect detection on high-resolution cloth surface images under different computing-resource constraints.
(2) The method hierarchically fuses the multi-layer image features of the input image through the feature fusion network, which effectively improves cloth defect detection precision and enables real-time detection while maintaining accuracy.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is a schematic flow diagram of a method of the present invention for high-resolution image efficient cloth defect detection;
FIG. 2 is a schematic diagram of a network structure of the high-efficiency cloth defect detection method for high-resolution images according to the present invention;
FIG. 3 is a schematic diagram of basic operation modules in a feature extraction network of the efficient cloth defect detection method for high-resolution images according to the present invention;
FIG. 4 is a schematic diagram of a feature fusion strategy of the efficient cloth defect detection method for high-resolution images according to the present invention;
FIG. 5 is an exemplary diagram of cloth defect detection results for one embodiment of the present invention of a method for high efficiency cloth defect detection of high resolution images;
FIG. 6 is a schematic diagram of the detection accuracy and speed of an embodiment of the method for detecting defects in cloth with high efficiency for high resolution images according to the present invention;
FIG. 7 is a block diagram of a computer system of a server for implementing embodiments of the method, system, and apparatus of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
The invention discloses a high-efficiency cloth defect detection method for a high-resolution image, which comprises the following steps of:
Step S10, constructing a cloth defect detection model based on the deep neural network model EDD, calculating the image resolution based on the hyper-parameter of the cloth defect detection model, and acquiring a training sample set according to the image resolution; the cloth defect detection model comprises a feature extraction network, a feature fusion network and a detection/classification network;
Step S20, calculating the width and depth of the feature fusion network and the depth of the detection/classification network based on the hyper-parameter, and defining the multi-task loss function and optimizer for training the cloth defect detection model;
Step S30, dividing the training sample set into Batches of a set size;
Step S40, selecting any Batch, computing the model's forward-propagation output and its loss value against the labels based on the multi-task loss function, and updating the model weight parameters by back-propagation through the optimizer based on the loss value;
Step S50, repeating step S40 until the loss value falls below a set threshold or a set number of training iterations is reached, obtaining a trained cloth defect detection model;
Step S60, detecting defects in online high-resolution cloth surface images through the trained cloth defect detection model.
In order to more clearly describe the method for detecting cloth defects with high efficiency for high resolution images, the following describes the steps in the embodiment of the present invention in detail with reference to fig. 1.
The high-efficiency cloth defect detection method for high-resolution images of the first embodiment of the invention comprises the steps S10-S60, wherein the steps are described in detail as follows:
Step S10, constructing a cloth defect detection model based on the deep neural network model EDD, calculating the image resolution based on the hyper-parameter of the cloth defect detection model, and acquiring a training sample set according to the image resolution; the cloth defect detection model comprises a feature extraction network, a feature fusion network and a detection/classification network.
The hyper-parameter of the cloth defect detection model is determined according to the hardware configuration and real-time requirements of the industrial site, and the image resolution of the input images (i.e., the training samples) is calculated from it, as shown in formula (1); the formula and the symbol denoting the image resolution are given only as images in the original text.
The training sample set is a preprocessed image set obtained by sequentially applying image scaling, image normalization, image flipping, image grouping and image zero-padding operations to each image in the high-resolution cloth surface image set:
image scaling: the input image is scaled to a set size while preserving its aspect ratio;
image normalization: the pixel gray values of the scaled image are normalized to zero mean and unit variance;
image flipping: the normalized image is flipped vertically and horizontally, each with probability 0.5;
image grouping: the flipped images are divided into two groups by aspect ratio, images with aspect ratio greater than 1 forming one group and images with aspect ratio less than 1 the other; images assigned to the same Batch belong to the same group;
image zero padding: images assigned to the same Batch are adjusted to a uniform size by a zero-padding operation, as sketched below.
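A minimal sketch of this preprocessing pipeline, assuming OpenCV and NumPy, is given below. The target size and the function and variable names are illustrative choices; the fixed-aspect-ratio scaling, normalization, 0.5-probability flips, aspect-ratio grouping and zero-padding follow the description above.

```python
import random
import cv2
import numpy as np

def preprocess(img, target_long_side=1024):
    """Scale preserving aspect ratio, normalize to zero mean / unit variance, flip randomly."""
    img = img.astype(np.float32)
    h, w = img.shape[:2]
    scale = target_long_side / max(h, w)          # target_long_side is an assumed value
    img = cv2.resize(img, (int(round(w * scale)), int(round(h * scale))))
    img = (img - img.mean()) / (img.std() + 1e-6)  # zero mean, unit variance
    if random.random() < 0.5:                      # vertical flip with probability 0.5
        img = img[::-1, :].copy()
    if random.random() < 0.5:                      # horizontal flip with probability 0.5
        img = img[:, ::-1].copy()
    group = "wide" if img.shape[1] / img.shape[0] > 1 else "tall"  # aspect-ratio group
    return img, group

def pad_batch(images):
    """Zero-pad same-group images to a common size so they form one Batch."""
    max_h = max(im.shape[0] for im in images)
    max_w = max(im.shape[1] for im in images)
    batch = np.zeros((len(images), max_h, max_w), dtype=np.float32)
    for i, im in enumerate(images):
        batch[i, :im.shape[0], :im.shape[1]] = im
    return batch

if __name__ == "__main__":
    imgs, groups = zip(*(preprocess(np.random.rand(2048, 1536)) for _ in range(4)))
    print(pad_batch(list(imgs)).shape)
```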
Step S20, calculating the width and depth of the feature fusion network and the depth of the detection/classification network based on the hyper-parameter, and defining the multi-task loss function and optimizer for training the cloth defect detection model.
Table 1 relates the model structure to the hyper-parameter; the table itself is given only as an image in the original text.
In Table 1, Conv denotes convolution, Stages 1-7 denote the different stages of the backbone network, and Output denotes the output layer. [MBConv1, k3×3] denotes a module of the type given in square brackets, whose specific structure is shown in Fig. 3; the multiplication sign and the number following a module denote how many identical modules are connected in series. "3×3, stride 2" denotes convolution with a 3×3 kernel and a stride of 2, Conv1×1 denotes a 1×1 convolution, Pooling denotes pooling, and FC denotes a fully connected layer.
Fig. 2 shows the network structure of the efficient cloth defect detection method for high-resolution images of the present invention. The Backbone network serves as the feature extraction network, and Input denotes the input image; in P1/2, P2/4, P3/8, P4/16, P5/32, P6/64 and P7/128, the number after P denotes the feature-layer index and the number after the slash denotes the factor by which the resolution of that feature map is reduced relative to the input image. Feature fusion network denotes the feature fusion network, Repeated Block denotes the depth (number of repeated blocks) of the feature fusion network, Head denotes the shared detection head network, Classification/Detection network denotes the detection/classification network, and Conv denotes convolution.
A backbone network serving as the feature extraction network is built according to the hyper-parameter and Table 1, and extracts multi-scale features from the input image. Its basic operation modules are shown in Fig. 3: in H×W×F, H and W denote the feature-map height and width and F denotes the number of channels; the addition symbol denotes element-wise addition, Conv1×1 denotes a 1×1 convolution, and DwiseConv3×3 and DwiseConv5×5 denote depthwise separable convolutions with 3×3 and 5×5 kernels respectively. A sketch of such a module is given below.
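Table 1 and Fig. 3 indicate MBConv-style basic modules (1×1 convolution, depthwise 3×3 or 5×5 convolution, element-wise addition). The PyTorch sketch below is a minimal inverted-residual block under that assumption; the expansion ratio, normalization and activation choices are illustrative and may differ from the patent's module.

```python
import torch
import torch.nn as nn

class MBConv(nn.Module):
    """MBConv-style block: 1x1 expand -> depthwise kxk -> 1x1 project -> residual add."""
    def __init__(self, channels, expand_ratio=1, kernel_size=3):
        super().__init__()
        mid = channels * expand_ratio
        pad = kernel_size // 2
        self.expand = nn.Sequential(
            nn.Conv2d(channels, mid, 1, bias=False),
            nn.BatchNorm2d(mid), nn.SiLU(),
        )
        self.depthwise = nn.Sequential(
            nn.Conv2d(mid, mid, kernel_size, padding=pad, groups=mid, bias=False),
            nn.BatchNorm2d(mid), nn.SiLU(),
        )
        self.project = nn.Sequential(
            nn.Conv2d(mid, channels, 1, bias=False),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x):
        # element-wise addition of the input and the projected depthwise output
        return x + self.project(self.depthwise(self.expand(x)))

# example: a [MBConv1, k3x3]-style block on an H x W x F feature map
block = MBConv(channels=32, expand_ratio=1, kernel_size=3)
print(block(torch.randn(1, 32, 64, 64)).shape)  # torch.Size([1, 32, 64, 64])
```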
The width and depth of the feature fusion network are calculated as shown in formula (2) and formula (3) respectively; both are functions of the hyper-parameter and are given only as images in the original text, with symbols denoting the width and depth of the feature fusion network and the natural logarithm.
Fig. 4 shows the feature fusion strategy of the efficient cloth defect detection method for high-resolution images of the present invention. The feature fusion method specifically comprises:
Step B10, extracting multi-layer image features of the input image through the convolution and pooling layers of the feature extraction network;
Step B20, after scaling, fusing each higher-level feature with the next lower-level feature in turn; for the bottom-level features P3, P4 and P5, the results are further fused with the higher-level features, and the original feature layer is directly connected into the corresponding layer during this fusion. A learnable weight coefficient is set for each feature layer taking part in a fusion and is adjusted during network training. The fusion formulas for layers 3 to 7 are given as formulas (4) to (8), which appear only as images in the original text; in them, layers 7 and 6 each contribute a first fusion input, layers 5, 4 and 3 contribute further fusion inputs, the inputs are combined with normalized feature weights, and the results are the feature fusion outputs of layers 3, 4, 5, 6 and 7. A sketch of one such weighted fusion node is given below.
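Formulas (4)-(8) appear only as images, but the description matches a BiFPN-style weighted fusion: each fusion node combines its inputs with learnable, normalized weights, and the original feature layers are connected back in on the lower levels. The PyTorch sketch below shows one such weighted fusion node under a common "fast normalized fusion" assumption (w_i divided by the sum of weights plus a small epsilon); the patent's exact weighting and node wiring may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightedFusion(nn.Module):
    """Fuse several same-shaped feature maps with learnable, normalized weights."""
    def __init__(self, num_inputs, channels):
        super().__init__()
        self.weights = nn.Parameter(torch.ones(num_inputs))  # one learnable weight per input
        self.conv = nn.Conv2d(channels, channels, 3, padding=1)

    def forward(self, inputs):
        w = F.relu(self.weights)
        w = w / (w.sum() + 1e-4)                              # normalized fusion weights
        fused = sum(wi * x for wi, x in zip(w, inputs))       # weighted element-wise sum
        return self.conv(fused)

# illustrative example: fuse a layer-5 feature with the upsampled layer-6 output
c = 64
p5, p6_out = torch.randn(1, c, 32, 32), torch.randn(1, c, 16, 16)
node = WeightedFusion(num_inputs=2, channels=c)
p5_fused = node([p5, F.interpolate(p6_out, scale_factor=2, mode="nearest")])
print(p5_fused.shape)  # torch.Size([1, 64, 32, 32])
```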
The depth of the detection/classification network is calculated as shown in formula (9), which is likewise a function of the hyper-parameter and is given only as an image in the original text, with a symbol denoting the depth of the detection/classification network. An illustrative compound-scaling sketch covering formulas (1)-(3) and (9) is given below.
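The exact expressions of formulas (1)-(3) and (9) are available only as images in this text. For illustration, the sketch below uses EfficientDet-style compound scaling, in which a single coefficient phi jointly sets the input resolution, the fusion-network width and depth, and the detection/classification head depth; the constants and the symbol phi are taken from EfficientDet and are assumptions here, not the patent's published values.

```python
import math

def compound_scaling(phi: int):
    """EfficientDet-style scaling from one coefficient phi (illustrative constants)."""
    resolution = 512 + 128 * phi              # input image resolution
    fusion_width = int(64 * (1.35 ** phi))    # width of the feature fusion network
    fusion_depth = 3 + phi                    # depth of the feature fusion network
    head_depth = 3 + math.floor(phi / 3)      # depth of the detection/classification network
    return resolution, fusion_width, fusion_depth, head_depth

for phi in range(4):
    print(phi, compound_scaling(phi))
```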
The feature fusion network is established according to formulas (2) and (3) and the detection/classification network according to formula (9), completing construction of the cloth defect detection model; a multi-task loss function and an optimizer for model training are then defined.
the model training multitask loss function is shown as equation (10):
Figure DEST_PATH_IMAGE082
wherein the content of the first and second substances,
Figure 270227DEST_PATH_IMAGE083
representing the multitasking loss of the model training,
Figure DEST_PATH_IMAGE084
is shown as
Figure 237046DEST_PATH_IMAGE084
The number of the anchors is one,
Figure 477534DEST_PATH_IMAGE085
and
Figure DEST_PATH_IMAGE086
are respectively the first
Figure 428173DEST_PATH_IMAGE084
Predicted class and actual class labels of the individual anchors, when
Figure 294366DEST_PATH_IMAGE084
When the individual anchor is the background,
Figure 381271DEST_PATH_IMAGE086
is a non-volatile organic compound (I) with a value of 0,
Figure 58240DEST_PATH_IMAGE087
and
Figure DEST_PATH_IMAGE088
are respectively the first
Figure 230595DEST_PATH_IMAGE084
The predicted regression value and the expected regression value of each anchor. In the present invention, in the case of the present invention,
Figure 385633DEST_PATH_IMAGE089
with the Focal local function, the parameter value is set to
Figure 592624DEST_PATH_IMAGE090
Figure 174915DEST_PATH_IMAGE091
A smooth function is employed.
Figure 834566DEST_PATH_IMAGE092
Is represented by
Figure 42562DEST_PATH_IMAGE089
Loss and
Figure 104059DEST_PATH_IMAGE091
parameters of weights in the multitask loss of the model.
Figure 388410DEST_PATH_IMAGE093
And
Figure 535358DEST_PATH_IMAGE094
classification loss and regression loss, respectively.
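Formula (10) is given only as an image, but the definitions above describe a standard anchor-based detection loss: a Focal Loss over the anchor classifications plus a regression loss on the non-background anchors, weighted by a balancing coefficient. The PyTorch sketch below follows that reading with a single defect class for brevity; the alpha, gamma and lambda values are illustrative, not the patent's settings.

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    """Binary focal loss over anchor class predictions (targets in {0, 1})."""
    p = torch.sigmoid(logits)
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p_t = p * targets + (1 - p) * (1 - targets)
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
    return (alpha_t * (1 - p_t) ** gamma * ce).sum()

def multitask_loss(cls_logits, cls_targets, reg_preds, reg_targets, lam=1.0):
    """L = L_cls + lam * L_reg, regression computed on foreground anchors only."""
    fg = cls_targets > 0                                 # non-background anchors
    num_fg = fg.sum().clamp(min=1).float()
    l_cls = focal_loss(cls_logits, cls_targets.float()) / num_fg
    l_reg = F.smooth_l1_loss(reg_preds[fg], reg_targets[fg], reduction="sum") / num_fg
    return l_cls + lam * l_reg

# toy example: 8 anchors, 4 regression targets each
cls_logits = torch.randn(8)
cls_targets = torch.tensor([0, 1, 0, 0, 1, 0, 0, 1])
reg_preds, reg_targets = torch.randn(8, 4), torch.randn(8, 4)
print(multitask_loss(cls_logits, cls_targets, reg_preds, reg_targets))
```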
In step S30, the training sample set is divided into Batch with a set size.
And step S40, selecting any Batch, calculating the output value of forward propagation of the model and the loss value of the label based on the multitask loss function, and reversely propagating and updating the model weight parameter through the optimizer based on the loss value.
The network weights are initialized from a pre-trained model, with random initialization used for the newly added convolution layers; the learning rate is set to 0.005 and the size of each Batch is set to 16. Network parameters are adjusted with an SGD gradient optimizer.
And step S50, repeating the step S40 until the loss value is lower than the set threshold value or reaches the set training times, and obtaining the trained cloth defect detection model.
In one embodiment of the invention, the number of training epochs is set to 100; a schematic training loop with these settings is sketched below.
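The sketch below matches the stated settings (SGD, learning rate 0.005, Batch size 16, 100 epochs, early stop when the loss falls below a set threshold). The tiny model, random data, momentum value and cross-entropy criterion are placeholders standing in for the EDD network, the preprocessed Batches and the multi-task loss described above.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder model and data; in practice these are the cloth defect detection model
# and the preprocessed high-resolution cloth surface Batches described above.
model = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.AdaptiveAvgPool2d(1),
                      nn.Flatten(), nn.Linear(8, 2))
dataset = TensorDataset(torch.randn(64, 1, 64, 64), torch.randint(0, 2, (64,)))
loader = DataLoader(dataset, batch_size=16, shuffle=True)          # Batch size 16

optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)  # momentum assumed
criterion = nn.CrossEntropyLoss()   # stands in for the multi-task loss above

for epoch in range(100):            # 100 training epochs
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)  # forward pass and loss value
        loss.backward()                          # back-propagation
        optimizer.step()                         # SGD weight update
    if loss.item() < 0.01:                       # stop once loss is below a set threshold
        break
```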
And step S60, detecting the defects of the online cloth surface image with high resolution through the trained cloth defect detection model.
Fig. 5 and Fig. 6 show, for one embodiment of the efficient cloth defect detection method for high-resolution images, an example of cloth defect detection results and the corresponding detection accuracy and speed. Fig. 5 shows that the method can effectively detect defects with large scale variation. Fig. 6 shows that the method can be applied to high-resolution industrial images and maintains high detection accuracy with only a small number of parameters and little computation, achieving a good balance between accuracy and speed, meeting the application requirements of industrial scenarios, and offering high practical value.
A second embodiment of the present invention is an efficient cloth defect detection system for high-resolution images, the system comprising the following modules:
a model construction and hyper-parameter calculation module, configured to construct a cloth defect detection model based on the deep neural network model EDD and to calculate the hyper-parameter of the cloth defect detection model based on the image resolution of the acquired training image set samples; the cloth defect detection model comprises a feature extraction network, a feature fusion network and a detection/classification network;
a model structure and training parameter definition module, configured to calculate the width and depth of the feature fusion network and the depth of the detection/classification network based on the hyper-parameter, and to define the multi-task loss function and optimizer for training the cloth defect detection model;
a sample dividing module, configured to divide the training sample set into Batches of a set size;
a model training and parameter updating module, configured to select any Batch, compute the model's forward-propagation output and its loss value against the labels based on the multi-task loss function, and update the model weight parameters through back-propagation based on the loss value;
an iteration module, configured to iteratively train the model and update its parameters until the loss value falls below a set threshold or a set number of training iterations is reached, obtaining a trained cloth defect detection model;
a defect detection module, configured to detect defects in online high-resolution cloth surface images through the trained cloth defect detection model.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process and related description of the system described above may refer to the corresponding process in the foregoing method embodiments, and will not be described herein again.
It should be noted that, the efficient cloth defect detection system for high-resolution images provided by the above embodiment is only exemplified by the division of the above functional modules, and in practical applications, the above functions may be allocated to different functional modules according to needs, that is, the modules or steps in the embodiment of the present invention are further decomposed or combined, for example, the modules in the above embodiment may be combined into one module, or may be further split into multiple sub-modules, so as to complete all or part of the above described functions. The names of the modules and steps involved in the embodiments of the present invention are only for distinguishing the modules or steps, and are not to be construed as unduly limiting the present invention.
An electronic apparatus according to a third embodiment of the present invention includes:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the processor for execution by the processor to implement the above-described method for efficient cloth defect detection for high resolution images.
A computer-readable storage medium of a fourth embodiment of the present invention stores computer instructions for execution by the computer to implement the above-described method for efficient cloth defect detection for high resolution images.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes and related descriptions of the storage device and the processing device described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Reference is now made to FIG. 7, which illustrates a block diagram of a computer system of a server for implementing embodiments of the method, system, and apparatus of the present application. The server shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 7, the computer system includes a Central Processing Unit (CPU)701, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 702 or a program loaded from a storage section 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data necessary for system operation are also stored. The CPU 701, the ROM 702, and the RAM 703 are connected to each other via a bus 704. An Input/Output (I/O) interface 705 is also connected to the bus 704.
The following components are connected to the I/O interface 705: an input portion 706 including a keyboard, a mouse, and the like; an output section 707 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage section 708 including a hard disk and the like; and a communication section 709 including a Network interface card such as a LAN (Local Area Network) card, a modem, or the like. The communication section 709 performs communication processing via a network such as the internet. A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 710 as necessary, so that a computer program read out therefrom is mounted into the storage section 708 as necessary.
In particular, according to embodiments of the application, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 709, and/or installed from the removable medium 711. The computer program, when executed by a Central Processing Unit (CPU)701, performs the above-described functions defined in the method of the present application. It should be noted that the computer readable medium mentioned above in the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C + + or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The terms "first," "second," and the like are used for distinguishing between similar elements and not necessarily for describing or implying a particular order or sequence.
The terms "comprises," "comprising," or any other similar term are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
So far, the technical solutions of the present invention have been described in connection with the preferred embodiments shown in the drawings, but it is easily understood by those skilled in the art that the scope of the present invention is obviously not limited to these specific embodiments. Equivalent changes or substitutions of related technical features can be made by those skilled in the art without departing from the principle of the invention, and the technical scheme after the changes or substitutions can fall into the protection scope of the invention.

Claims (10)

1. A method for efficient cloth defect detection for high resolution images, the method comprising the steps of:
step S10, constructing a cloth defect detection model based on the deep neural network model EDD, calculating the image resolution based on the hyper-parameter of the cloth defect detection model, and acquiring a training sample set according to the image resolution; the cloth defect detection model comprises a feature extraction network, a feature fusion network and a detection/classification network;
step S20, calculating the width and depth of the feature fusion network and the depth of the detection/classification network based on the hyper-parameter, and defining the multi-task loss function and optimizer for training the cloth defect detection model;
step S30, dividing the training sample set into Batch with set size;
step S40, selecting any Batch, calculating the output value of forward propagation of the model and the loss value of the label based on the multitask loss function, and reversely propagating and updating the model weight parameter through the optimizer based on the loss value;
step S50, repeating step S40 until the loss value is lower than a set threshold value or reaches a set training frequency, and obtaining a trained cloth defect detection model;
and step S60, detecting the defects of the online cloth surface image with high resolution through the trained cloth defect detection model.
2. The method as claimed in claim 1, wherein the training sample set is a pre-processing image set obtained by sequentially performing image scaling, image normalization, image flipping, image grouping and image zero-filling operations on each image in the high-resolution cloth surface image set;
the image scaling is to scale the input image to a set size according to an equal aspect ratio;
the image normalization is to normalize the gray value of the zoomed image pixel into an image with a mean value of 0 and a variance of 1;
the image flipping flips the normalized image vertically and horizontally, each with probability 0.5;
the image grouping divides the flipped images into two groups by aspect ratio, images with aspect ratio greater than 1 forming one group and images with aspect ratio less than 1 the other; images assigned to the same Batch belong to the same group;
and the image zero filling is to adjust the images divided into the same Batch into a uniform size through zero filling operation.
3. The efficient cloth defect detection method for high-resolution images according to claim 1, characterized in that in step S10 the image resolution is calculated from the hyper-parameter of the cloth defect detection model; the calculation formula and the symbol denoting the image resolution are given only as images in the original text.
4. The efficient cloth defect detection method for high-resolution images according to claim 1, characterized in that the width and depth of the feature fusion network are calculated from the hyper-parameter; the formulas are given only as images in the original text, with symbols denoting the width and depth of the feature fusion network and the natural logarithm.
5. The efficient cloth defect detection method for high-resolution images according to claim 1, characterized in that the depth of the detection/classification network is calculated from the hyper-parameter; the formula is given only as an image in the original text, with a symbol denoting the depth of the detection/classification network.
6. The efficient cloth defect detection method for high-resolution images according to claim 1, characterized in that the multi-task loss function comprises a classification loss using the Focal Loss function and a regression loss using a smooth (Smooth L1) function, and the aspect-ratio set of the Anchors placed on the feature layers is a fixed parameter whose values are given only as an image in the original text.
7. The efficient cloth defect detection method for high-resolution images according to claim 1, characterized in that the feature fusion network performs feature fusion as follows:
step B10, extracting multi-layer image features of the input image through the convolution and pooling layers of the feature extraction network;
step B20, after scaling, fusing each higher-level feature with the next lower-level feature in turn; for the bottom-level features P3, P4 and P5, the original feature layers are additionally combined with the higher-level features during fusion; the fusion formulas for layers 3 to 7 are given only as images in the original text, in which the fusion inputs of layers 3 to 7 are combined with normalized feature weights to produce the feature fusion outputs of layers 3, 4, 5, 6 and 7.
8. An efficient cloth defect detection system for high resolution images, the system comprising the following modules:
a model construction and hyper-parameter calculation module configured to construct a cloth defect detection model based on the deep neural network model EDD and to calculate the hyper-parameter of the cloth defect detection model based on the image resolution of the acquired training image set samples; the cloth defect detection model comprises a feature extraction network, a feature fusion network and a detection/classification network;
a model structure and training parameter definition module configured to calculate the width and depth of the feature fusion network and the depth of the detection/classification network based on the hyper-parameter, and to define the multi-task loss function and optimizer for training the cloth defect detection model;
a sample dividing module configured to divide the training sample set into Batch of a set size;
the model training and parameter updating module is configured to select any Batch, calculate an output value of forward propagation of the model and a loss value of the label based on the multitask loss function, and update a model weight parameter through backward propagation based on the loss value;
the iteration circulating module is configured to iteratively train the model and update the parameters until the loss value is lower than a set threshold value or reaches a set training frequency, and a trained cloth defect detection model is obtained;
and the defect detection module is configured to detect the defects of the online high-resolution cloth surface images through the trained cloth defect detection model.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the processor for performing the method of any one of claims 1-7 for high efficiency cloth defect detection of high resolution images.
10. A computer-readable storage medium storing computer instructions for execution by the computer to implement the method for high efficiency cloth defect detection of high resolution images of any of claims 1-7.
CN202110099327.9A 2021-01-25 2021-01-25 Efficient cloth defect detection method, system and equipment for high-resolution images Pending CN112541915A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110099327.9A CN112541915A (en) 2021-01-25 2021-01-25 Efficient cloth defect detection method, system and equipment for high-resolution images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110099327.9A CN112541915A (en) 2021-01-25 2021-01-25 Efficient cloth defect detection method, system and equipment for high-resolution images

Publications (1)

Publication Number Publication Date
CN112541915A true CN112541915A (en) 2021-03-23

Family

ID=75017218

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110099327.9A Pending CN112541915A (en) 2021-01-25 2021-01-25 Efficient cloth defect detection method, system and equipment for high-resolution images

Country Status (1)

Country Link
CN (1) CN112541915A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115206078A (en) * 2022-09-15 2022-10-18 法施达(天津)智能科技有限公司 Railway anchoring detection and early warning method, system and equipment based on cloud data analysis

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111507972A (en) * 2020-04-20 2020-08-07 南京航空航天大学 Tunnel surface defect detection method combining convolutional neural network and support vector machine

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111507972A (en) * 2020-04-20 2020-08-07 南京航空航天大学 Tunnel surface defect detection method combining convolutional neural network and support vector machine

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhou, T., et al.: "EDDs: A series of Efficient Defect Detectors for fabric quality inspection", Measurement *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115206078A (en) * 2022-09-15 2022-10-18 法施达(天津)智能科技有限公司 Railway anchoring detection and early warning method, system and equipment based on cloud data analysis
CN115206078B (en) * 2022-09-15 2022-12-16 法施达(天津)智能科技有限公司 Railway anchoring detection and early warning method, system and equipment based on cloud data analysis

Similar Documents

Publication Publication Date Title
CN110188765B (en) Image semantic segmentation model generation method, device, equipment and storage medium
CN112560876B (en) Single-stage small sample target detection method for decoupling measurement
US11341626B2 (en) Method and apparatus for outputting information
WO2020000390A1 (en) Systems and methods for depth estimation via affinity learned with convolutional spatial propagation networks
CN109086811B (en) Multi-label image classification method and device and electronic equipment
CN110956126B (en) Small target detection method combined with super-resolution reconstruction
CN105917354A (en) Spatial pyramid pooling networks for image processing
US11164306B2 (en) Visualization of inspection results
CN113139543B (en) Training method of target object detection model, target object detection method and equipment
CN111626176A (en) Ground object target detection method and system of remote sensing image
CN113256592B (en) Training method, system and device of image feature extraction model
CN110910445B (en) Object size detection method, device, detection equipment and storage medium
CN111275660A (en) Defect detection method and device for flat panel display
EP4318313A1 (en) Data processing method, training method for neural network model, and apparatus
US20210056353A1 (en) Joint representation learning from images and text
CN114331985A (en) Electronic component scratch defect detection method and device and computer equipment
CN113780270A (en) Target detection method and device
CN115482221A (en) End-to-end weak supervision semantic segmentation labeling method for pathological image
CN115587964A (en) Entropy screening-based pseudo label cross consistency change detection method
CN114511733A (en) Fine-grained image identification method and device based on weak supervised learning and readable medium
CN112465050B (en) Image template selection method, device, equipment and storage medium
CN112541915A (en) Efficient cloth defect detection method, system and equipment for high-resolution images
CN113780287A (en) Optimal selection method and system for multi-depth learning model
CN116543295A (en) Lightweight underwater target detection method and system based on degradation image enhancement
CN116309612A (en) Semiconductor silicon wafer detection method, device and medium based on frequency decoupling supervision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210323)