CN117333383A - Surface defect detection method, device and equipment - Google Patents

Surface defect detection method, device and equipment

Info

Publication number
CN117333383A
Authority
CN
China
Prior art keywords
reflection
image
target workpiece
network
workpiece
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311154548.7A
Other languages
Chinese (zh)
Other versions
CN117333383B (en)
Inventor
高红超
杜洪威
闫笑颜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong OPT Machine Vision Co Ltd
Original Assignee
Guangdong OPT Machine Vision Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong OPT Machine Vision Co Ltd filed Critical Guangdong OPT Machine Vision Co Ltd
Priority to CN202311154548.7A priority Critical patent/CN117333383B/en
Publication of CN117333383A publication Critical patent/CN117333383A/en
Application granted granted Critical
Publication of CN117333383B publication Critical patent/CN117333383B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/0464Convolutional networks [CNN, ConvNet]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/0475Generative networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/048Activation functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/094Adversarial learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30164Workpiece; Machine component
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the application provides a surface defect detection method, a surface defect detection device and surface defect detection equipment. In the embodiment of the application, a target workpiece image to be subjected to surface defect detection can be acquired; the target workpiece image is input into a reflection estimation model to output the reflection type of the target workpiece image, wherein the reflection estimation model is trained based on a plurality of workpiece images and corresponding reflection type labels, and the reflection type comprises a non-reflection type and a reflection type; if the reflection type of the target workpiece image is the reflection type, a repairing operation of erasing the reflection area is performed on the target workpiece image through a reflection erasing network, the reflection erasing network being trained based on original reflection images of a plurality of reflective workpieces under a plurality of different imaging effects and original non-reflection images of the plurality of reflective workpieces; and the target workpiece image after the repairing operation is input into a defect detection network to output the surface defect detection result of the target workpiece image.

Description

Surface defect detection method, device and equipment
Technical Field
The application relates to the technical field of deep learning, in particular to a surface defect detection method, a surface defect detection device and surface defect detection equipment.
Background
Surface defect detection of workpieces is an important link in ensuring product quality in industrial production. Traditional manual inspection is easily influenced by subjective factors, suffers from high rates of missed detection and false detection, and has low detection efficiency. Some nondestructive detection methods, such as ultrasonic scanning detection and infrared detection, have also been adopted, but their detection cost is high, so they are mostly limited to spot inspection of a small number of high-precision parts. With the development of emerging technologies such as deep learning, techniques for detecting surface defects of a workpiece using deep neural networks, typified by convolutional neural networks, are being widely studied, and such detection methods are gradually being applied to various actual industrial scenes.
At present, when a deep neural network is used to detect surface defects of a workpiece, a clear workpiece picture is often an important precondition for efficient detection. However, because the surfaces of some workpieces are generally smooth and highly reflective, the captured pictures suffer from severe reflection: the defect surface of the workpiece is unclear, and picture details in the reflective regions cannot be obtained. This affects defect feature extraction of the detected object during defect detection, causes erroneous detection results that cannot meet industrial requirements, reduces the success rate of surface defect detection, and brings great difficulty to workpiece defect detection.
Disclosure of Invention
Aspects of the present application provide a method, an apparatus, and a device for detecting a surface defect, which are used for solving the problem of low success rate of defect detection on a workpiece with a reflective characteristic.
The embodiment of the application provides a surface defect detection method, which comprises the following steps: acquiring a target workpiece image to be subjected to surface defect detection; inputting the target workpiece image into a reflection estimation model to output the reflection type of the target workpiece image, wherein the reflection estimation model is trained based on a plurality of workpiece images and corresponding reflection type labels, the reflection type comprises a no-reflection type and a reflection type, and the reflection type comprises a first reflection degree and a second reflection degree; if the reflection type of the target workpiece image is the reflection type, performing a repairing operation of erasing the reflection area on the target workpiece image through a reflection erasing network, the reflection erasing network being trained based on original reflection images of a plurality of reflective workpieces under a plurality of different imaging effects and original non-reflection images of the plurality of reflective workpieces; and inputting the target workpiece image after the repairing operation into a defect detection network to output the surface defect detection result of the target workpiece image.
The embodiment of the application also provides a surface defect detection device, which comprises: an image acquisition module, used for acquiring a target workpiece image to be subjected to surface defect detection; a reflection estimation module, used for inputting the target workpiece image into a reflection estimation model to output the reflection type of the target workpiece image, wherein the reflection estimation model is trained based on a plurality of workpiece images and corresponding reflection type labels, the reflection type comprises a non-reflection type and a reflection type, and the reflection type comprises a first reflection degree and a second reflection degree; a reflection repair module, used for performing a repairing operation of erasing the reflection area on the target workpiece image through a reflection erasing network if the reflection type of the target workpiece image is the reflection type, the reflection erasing network being trained based on original reflection images of a plurality of reflective workpieces under a plurality of different imaging effects and original non-reflection images of the plurality of reflective workpieces; and a defect detection module, used for inputting the target workpiece image after the repairing operation into a defect detection network to output the surface defect detection result of the target workpiece image.
The embodiment of the application also provides electronic equipment, which comprises: a memory and a processor;
the memory is used for storing a computer program;
the processor, coupled to the memory, is configured to execute the computer program for: acquiring a target workpiece image to be subjected to surface defect detection; inputting the target workpiece image into a reflection estimation model to output a reflection type of the target workpiece image, wherein the reflection estimation model is trained based on a plurality of workpiece images and corresponding reflection type labels, the reflection type comprises a no-reflection type and a reflection type, and the reflection type comprises a first reflection degree and a second reflection degree; if the reflection type of the target workpiece image is the reflection type, repairing operation of erasing the reflection area is carried out on the target workpiece image through a reflection erasing network; the reflection erasing network is obtained by training original reflection images based on a plurality of reflection workpieces under a plurality of different imaging effects and original non-reflection images of the plurality of reflection workpieces; and inputting the target workpiece image after the repairing operation into a defect detection network to output and obtain a surface defect detection result of the target workpiece image.
The present embodiments also provide a computer-readable storage medium storing a computer program, which when executed by a processor causes the processor to implement the steps in the surface defect detection method provided by the embodiments of the present application.
The surface defect detection method provided by the embodiment of the application can acquire a target workpiece image to be subjected to surface defect detection, and predict the reflection type of the target workpiece image through a reflection estimation model, wherein the reflection estimation model is trained based on a plurality of workpiece images and corresponding reflection type labels, and the reflection type comprises a non-reflection type and a reflection type. When the reflection type of the target workpiece image is determined to be the reflection type, a repairing operation of erasing the reflection area is performed on the target workpiece image through a reflection erasing network, the reflection erasing network being trained based on original reflection images of a plurality of reflective workpieces under a plurality of different imaging effects and original non-reflection images of the plurality of reflective workpieces. Finally, the target workpiece image after the repairing operation is input into the defect detection network, so that the surface defect detection result of the target workpiece image can be output.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
fig. 1 is a schematic flow chart of a surface defect detection method according to an exemplary embodiment of the present application;
fig. 2 is a schematic structural diagram of a reflection estimation model in a surface defect detection method according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a light reflection erasing network according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a first generator and a second generator according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a first discriminator and a second discriminator according to the embodiments of the present application;
FIG. 6 is a detailed flowchart of a surface defect detection method according to an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of a surface defect detection apparatus according to an exemplary embodiment of the present application;
fig. 8 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
To make the purposes, technical solutions and advantages of the present application clearer, the technical solutions of the present application will be clearly and completely described below with reference to specific embodiments of the present application and the corresponding drawings. It is apparent that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art without creative effort based on the present disclosure fall within the scope of the present disclosure.
To solve the problem that the success rate of defect detection on workpieces with reflective characteristics is low, some embodiments of the application provide a surface defect detection method. The method comprises: obtaining a target workpiece image to be subjected to surface defect detection; and predicting the reflection type of the target workpiece image through a reflection estimation model, wherein the reflection estimation model is trained based on a plurality of workpiece images and corresponding reflection type labels, and the reflection type comprises a non-reflection type and a reflection type. When the reflection type of the target workpiece image is determined to be the reflection type, a repairing operation of erasing the reflection area is performed on the target workpiece image through a reflection erasing network, the reflection erasing network being trained based on original reflection images of a plurality of reflective workpieces under a plurality of different imaging effects and original non-reflection images of the plurality of reflective workpieces. Finally, the target workpiece image after the repairing operation is input into a defect detection network, so that the surface defect detection result of the target workpiece image can be output.
The following describes in detail the technical solutions provided by the embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a flow chart of a surface defect detection method according to an exemplary embodiment of the present application. As shown in fig. 1, the method includes:
step 110, a target workpiece image to be subjected to surface defect detection is acquired.
Step 120, inputting the target workpiece image into a reflection estimation model to output the reflection type of the target workpiece image, wherein the reflection estimation model is trained based on a plurality of workpiece images and corresponding reflection type labels, and the reflection type comprises a non-reflection type and a reflection type.
In some exemplary embodiments, the training process of the reflection estimation model includes:
acquiring a sample image data set of a workpiece to be detected in an industrial scene, wherein the sample image data set comprises a plurality of sample images with reflective surfaces, a plurality of sample images with non-reflective surfaces and corresponding reflective degree labels;
randomly selecting a plurality of images from a sample image dataset according to a preset training sample number;
performing a preset image enhancement operation on the plurality of images, wherein the preset image enhancement operation comprises at least one of turning, blurring, scaling and random clipping;
Training to obtain a reflection estimation model based on the images after the image enhancement operation and a preset cross entropy loss function, wherein the reflection estimation model is constructed based on ResNet 18.
The reflection type may include a weak reflection type and a strong reflection type, wherein the reflection degree of the weak reflection type is the first reflection degree and the reflection degree of the strong reflection type is the second reflection degree. Specifically, the training process of the reflection estimation model may include: S1, sample images in a sample image dataset of the workpiece to be detected in an industrial scene are divided into a no-reflection type, a weak reflection type and a strong reflection type, that is, the corresponding reflection degree labels may comprise the no-reflection type, the weak reflection type and the strong reflection type. S2, a batch of images (comprising a plurality of images) is randomly selected from the sample image dataset of the workpiece to be detected in the industrial scene, and one or more preset image enhancement operations, such as flipping, blurring, scaling and random cropping, are performed on each image in the batch. S3, each image after the image enhancement operation is input into the reflection estimation model, the features of each image are extracted through the reflection estimation model, a reflection category prediction operation is performed based on the features of each image to predict the reflection degree of each image, and the multi-class cross entropy loss function value of the reflection estimation model is calculated based on the prediction result and the reflection degree label of each image. S4, step S3 is repeated until the multi-class cross entropy loss function value of the reflection estimation model no longer decreases, and the trained reflection estimation model is obtained.
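As an illustration of the preset image enhancement operation in step S2, a minimal sketch is given below. It assumes a PyTorch/torchvision implementation, which the application does not prescribe; the transform names map onto flipping, blurring, scaling and random cropping, and all parameter values are illustrative only.

```python
# Hypothetical augmentation pipeline for step S2 (flip, blur, scale, random crop).
# torchvision is an assumed choice; the application names only the operations.
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),               # flipping
    transforms.GaussianBlur(kernel_size=3),               # blurring
    transforms.RandomResizedCrop(224, scale=(0.8, 1.0)),  # scaling + random cropping
    transforms.ToTensor(),
])
```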
Optionally, in S3, the reflection category prediction operation performed on each image specifically predicts a reflection degree estimated value logits for each image, and the reflection degree estimated value logits of each image is then converted into a multi-label representation p through a sigmoid activation function, where p = sigmoid(logits) and sigmoid(logits) = 1/(1 + e^(-logits)). The multi-class cross entropy loss of the reflection estimation model is calculated, for each sample image i, as:

Loss_i = -Σ_{c=1..M} y_ic · log(p_ic), averaged over the sample images i in the batch,

wherein M is the number of reflection degree labels, namely the number of categories; y_ic is a sign function (0 or 1) that takes the value 1 if the true class of sample image i (i.e., its true reflection degree label) equals the specified reflection degree label c, and 0 otherwise; and p_ic is the predicted probability that sample image i belongs to the specified reflection degree label c. Fig. 2 is a schematic structural diagram of the reflection estimation model provided in an embodiment of the present application. In fig. 2, the reflection estimation model includes a ResNet18 for extracting features of the input image and two fully connected (FC) layers. The fully connected layer is a common layer type in neural networks; it multiplies the input features by the connection weights of each neuron and adds a bias to obtain an output result. The first FC layer shown in fig. 2 performs further feature extraction on the features of the input image extracted by ResNet18, the second FC layer performs reflection type prediction based on the features extracted by the first FC layer to obtain the reflection degree estimated values logits of the input image, and the reflection degree estimated values logits of each image are converted into the multi-label representation p through the sigmoid activation function, thereby obtaining the reflection type of the input image.
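To make the Fig. 2 structure and the loss above concrete, the following sketch assumes a PyTorch implementation: a ResNet18 backbone, two fully connected layers, a sigmoid over the reflection degree estimates (logits), and the multi-class cross entropy described above. The hidden width, the class count M = 3 (no/weak/strong reflection, following S1) and every other detail not stated in the application are assumptions.

```python
# Hypothetical PyTorch sketch of the Fig. 2 reflection estimation model and the
# multi-class cross entropy above; hidden width and class count are assumptions.
import torch
import torch.nn as nn
from torchvision.models import resnet18

class ReflectionEstimator(nn.Module):
    def __init__(self, num_classes: int = 3, hidden: int = 256):
        super().__init__()
        backbone = resnet18(weights=None)
        backbone.fc = nn.Identity()                # expose the 512-d feature vector
        self.backbone = backbone
        self.fc1 = nn.Linear(512, hidden)          # first FC layer in Fig. 2
        self.fc2 = nn.Linear(hidden, num_classes)  # second FC layer -> logits

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = self.backbone(x)
        return self.fc2(torch.relu(self.fc1(features)))   # reflection degree logits

def reflection_loss(logits: torch.Tensor, y_onehot: torch.Tensor) -> torch.Tensor:
    # p_ic = sigmoid(logits_ic); loss_i = -sum_c y_ic * log(p_ic), averaged over i
    p = torch.sigmoid(logits)
    return -(y_onehot * torch.log(p.clamp_min(1e-8))).sum(dim=1).mean()
```

In a training iteration along the lines of S3, reflection_loss(model(images), labels) would be back-propagated, and the iteration repeated until the loss value no longer decreases.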
Step 130, if the reflection type of the target workpiece image is the reflection type, a repairing operation of erasing the reflection area is performed on the target workpiece image through a reflection erasing network; the reflection erasing network is trained based on original reflection images of the plurality of reflective workpieces under different imaging effects and original non-reflection images of the plurality of reflective workpieces.
The construction process of the reflective erasing network comprises the following steps:
acquiring original reflective images of a plurality of reflective workpieces under a plurality of different imaging effects and original non-reflective images of the plurality of reflective workpieces;
establishing a mapping relation between an original reflection image and an original non-reflection image of each reflection workpiece in the plurality of reflection workpieces to obtain a reflection erasing network training set;
based on a training set of the light reflection erasing network, training to obtain the light reflection erasing network, wherein the light reflection erasing network comprises a first generator, a second generator, a first discriminator and a second discriminator, the first generator is connected with the first discriminator, the second generator is connected with the second discriminator, an original light reflection image of each light reflection workpiece in the plurality of light reflection workpieces is used as an input of the first generator, and an original non-light reflection image of each light reflection workpiece in the plurality of light reflection workpieces is used as an input of the second generator;
The network module of the first generator has the same network structure as the network module of the second generator and comprises an encoder, an auxiliary classifier and a decoder; the network module of the first discriminator has the same network structure as the network module of the second discriminator and comprises an encoder and an auxiliary classifier. The first generator is used for generating a non-reflective image of the input reflective workpiece based on the original reflective image of the input reflective workpiece, and the first discriminator is used for determining the similarity between the non-reflective image of the input reflective workpiece generated by the first generator and the original non-reflective image of the input reflective workpiece; the second generator is used for generating a reflective image of the input reflective workpiece based on the original non-reflective image of the input reflective workpiece, and the second discriminator is used for determining the similarity between the reflective image of the input reflective workpiece generated by the second generator and the original reflective image of the input reflective workpiece.
Fig. 3 is a schematic structural diagram of the reflection erasing network according to an embodiment of the present application. The reflection erasing network is composed of a first generator G_s→t, a second generator G_t→s, a first discriminator D_t and a second discriminator D_s. The first generator G_s→t generates a non-reflective image of a reflective workpiece based on the original reflective image X of the reflective workpiece; the first discriminator D_t is used for determining the similarity between the non-reflective image generated by the first generator G_s→t and the original non-reflective image Y of the reflective workpiece. The second generator G_t→s generates a reflective image of the reflective workpiece based on the original non-reflective image Y of the reflective workpiece; the second discriminator D_s is used for determining the similarity between the reflective image generated by the second generator G_t→s and the original reflective image X of the reflective workpiece.
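The wiring of the four sub-networks in Fig. 3 can be sketched as below, assuming a PyTorch framing. The class name ReflectionErasingNetwork, the constructor arguments and the paired forward signature are illustrative assumptions; the generator and discriminator internals are sketched further below alongside Figs. 4 and 5.

```python
# Structural sketch of Fig. 3; generator/discriminator internals follow the
# later sketches, and class names and signatures are assumptions.
import torch.nn as nn

class ReflectionErasingNetwork(nn.Module):
    def __init__(self, G_s2t, G_t2s, D_t, D_s):
        super().__init__()
        self.G_s2t = G_s2t   # first generator: reflective image X -> non-reflective image
        self.G_t2s = G_t2s   # second generator: non-reflective image Y -> reflective image
        self.D_t = D_t       # first discriminator: generated non-reflective vs. original Y
        self.D_s = D_s       # second discriminator: generated reflective vs. original X

    def forward(self, x_reflective, y_clean):
        fake_clean = self.G_s2t(x_reflective)             # reflection erased
        fake_reflective = self.G_t2s(y_clean)             # reflection synthesised
        sim_t = self.D_t(fake_clean, y_clean)             # similarity judged by D_t
        sim_s = self.D_s(fake_reflective, x_reflective)   # similarity judged by D_s
        return fake_clean, fake_reflective, sim_t, sim_s
```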
Fig. 4 is a schematic structural diagram of the first generator and the second generator according to an embodiment of the present application. The first generator G_s→t and the second generator G_t→s share the structure shown in fig. 4. Both generators consist of three parts: an encoder, an auxiliary classifier and a decoder. The encoder performs a downsampling operation on the input image and performs a feature encoding operation through an image feature extraction module to obtain an encoded feature map, whose channel data run from the first channel to the n-th channel. The auxiliary classifier generates a high-level semantic vector based on the encoded feature map to obtain an attention feature map, whose channel data likewise run from the first channel to the n-th channel. The decoder decodes the attention feature map to obtain an image to be output and upsamples the image to be output to obtain the output image; the decoder consists of a fully connected layer and an adaptive image feature extraction network. The size of the output image is consistent with the size of the input image, the size of the image to be output is consistent with the size of the input image after the downsampling operation, and the size of the image to be output is smaller than the sizes of the output image and the input image, which reduces the amount of computation the generator network performs on the input image. The auxiliary classifier is used for determining an importance measure for each channel in the encoded feature map, which can be characterized by weights w_1 to w_n, where w_1 is the weight of the first channel in the encoded feature map and w_n is the weight of the n-th channel. The attention feature map shown in fig. 4 is obtained by weighting the data of each channel in the encoded feature map with the weight of that channel.
The auxiliary classifier is used to generate a channel importance measure that focuses on the more important regions and ignores secondary regions. Specifically, the auxiliary classifier uses global average pooling and global max pooling to learn the weight w_k of the k-th channel of the feature map in the source domain; the weight w_k determines the importance of the feature data of that channel, thereby realizing an attention mechanism over the feature map.
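A minimal sketch of this channel-attention behaviour is given below, assuming a PyTorch implementation in which the weights w_1..w_n are learned from global average pooling and global max pooling of the encoded feature map; the fusion layer and the sigmoid squashing are illustrative assumptions.

```python
# Hypothetical channel attention of the auxiliary classifier: per-channel weights
# from global average + max pooling, applied to the encoded feature map.
import torch
import torch.nn as nn

class AuxiliaryClassifier(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.fuse = nn.Linear(2 * channels, channels)   # assumed fusion layer

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        # feat: (B, C, H, W) encoded feature map
        avg = feat.mean(dim=(2, 3))                     # global average pooling
        mx = feat.amax(dim=(2, 3))                      # global max pooling
        w = torch.sigmoid(self.fuse(torch.cat([avg, mx], dim=1)))   # weights w_k
        return feat * w.unsqueeze(-1).unsqueeze(-1)     # attention feature map
```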
Fig. 5 is a schematic structural diagram of the first discriminator and the second discriminator according to an embodiment of the present application. Both discriminators consist of an encoder and an auxiliary classifier. The encoder performs a downsampling operation on the input images (for the first discriminator, the original non-reflective image of the target workpiece and the non-reflective image of the target workpiece generated by the first generator) and performs a feature encoding operation through the image feature extraction network to obtain an encoded feature map, whose channel data run from the first channel to the n-th channel. The auxiliary classifier generates a high-level semantic vector based on the encoded feature map to obtain an attention feature map. Finally, regression fitting is performed on the attention feature map through a regression module (logit) to obtain the similarity between the input images (that is, between the original non-reflective image of the target workpiece and the generated non-reflective image of the target workpiece).
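Under the same assumptions, the discriminator branch can be sketched as follows. How the two input images are fused before encoding is not specified in the application, so the channel-wise concatenation, the pooling and the single-logit regression head below are illustrative choices only; AuxiliaryClassifier refers to the channel-attention sketch above.

```python
# Hypothetical discriminator branch: encode the (generated, original) pair,
# apply the channel attention above, and regress a single similarity logit.
import torch
import torch.nn as nn

class SimilarityDiscriminator(nn.Module):
    def __init__(self, encoder: nn.Module, channels: int):
        super().__init__()
        self.encoder = encoder                   # assumed to accept the concatenated pair
        self.attend = AuxiliaryClassifier(channels)
        self.regress = nn.Linear(channels, 1)    # regression module producing the logit

    def forward(self, generated: torch.Tensor, original: torch.Tensor) -> torch.Tensor:
        feat = self.encoder(torch.cat([generated, original], dim=1))
        att = self.attend(feat)
        pooled = att.mean(dim=(2, 3))            # (B, C)
        return self.regress(pooled)              # similarity estimate (logit)
```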
In some exemplary embodiments, training to obtain a retroreflective erasure network based on a retroreflective erasure network training set includes:
randomly selecting an original reflection image and an original non-reflection image of a target workpiece from the reflection removal data set, wherein the target workpiece is any one of a plurality of reflection workpieces;
taking the original reflection image of the target workpiece as an input of a first generator to generate a non-reflection image of the target workpiece based on the original reflection image of the target workpiece by the first generator, and taking the original non-reflection image of the target workpiece as an input of a second generator to generate a reflection image of the target workpiece based on the original non-reflection image of the target workpiece by the second generator;
taking an original light reflecting image of the target workpiece and a light reflecting image of the target workpiece as inputs of a first discriminator to determine a first similarity between the original light reflecting image of the target workpiece and the light reflecting image of the target workpiece through the first discriminator, and taking an original light non-reflecting image of the target workpiece and a light non-reflecting image of the target workpiece as inputs of a second discriminator to determine a second similarity between the original light non-reflecting image of the target workpiece and the light non-reflecting image of the target workpiece through the second discriminator;
Optimizing network parameters of network modules of the first generator and the second generator based on the first similarity and the second similarity;
and repeating the steps of randomly selecting an original reflection image and an original non-reflection image of a target workpiece from the reflection removal data set according to a preset training round, and optimizing network parameters of network modules of the first generator and the second generator to obtain a reflection erasing network through training.
The generator (the first generator or the second generator) learns its corresponding transformation function by minimizing a loss, and the loss of the first or second generator is calculated by measuring the difference between the data generated by that generator and its target data. Taking the first generator as an example, the data generated by the first generator is the non-reflective image of the target workpiece and its target data is the original non-reflective image of the target workpiece, so the difference between the generated non-reflective image and the original non-reflective image of the target workpiece can be used as the loss of the first generator. Taking the second generator as an example, the data generated by the second generator is the reflective image of the target workpiece and its target data is the original reflective image of the target workpiece, so the difference between the generated reflective image and the original reflective image of the target workpiece can be used as the loss of the second generator. The greater the difference between the data generated by a generator and its target data, the higher the penalty that generator receives. The losses of the discriminators (the first discriminator and the second discriminator) are used to train the discriminators, so that the first discriminator becomes good at distinguishing real data (i.e., the original non-reflective image of the target workpiece) from synthetic data (i.e., the non-reflective image of the target workpiece generated by the first generator), and the second discriminator becomes good at distinguishing real data (i.e., the original reflective image of the target workpiece) from synthetic data (i.e., the reflective image of the target workpiece generated by the second generator). Clearly, the output of the first generator can be used as the input of the first discriminator to train the first discriminator, and the result of the first discriminator is used to optimize the first generator; likewise, the output of the second generator is used as the input of the second discriminator to train the second discriminator, and the result of the second discriminator is used to optimize the second generator. That is, the first generator and the first discriminator improve each other, and the second generator and the second discriminator improve each other.
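One training iteration along these lines might look as follows. The L1 term for the difference from the target image, the binary adversarial terms, the separate optimizers and the equal loss weights are all illustrative assumptions; the application only states that generator losses measure the difference from the target data and that discriminator losses teach the discriminators to separate real data from synthetic data.

```python
# Hypothetical single training iteration for the reflection erasing network.
# G_s2t, G_t2s, D_t, D_s follow the sketches above; opt_G / opt_D are assumed
# optimizers over the generator and discriminator parameters respectively.
import torch
import torch.nn.functional as F

def train_step(G_s2t, G_t2s, D_t, D_s, opt_G, opt_D, x_reflective, y_clean):
    # Generators: stay close to the paired target image and try to make the
    # discriminators score the synthetic images as "real".
    fake_clean = G_s2t(x_reflective)
    fake_reflective = G_t2s(y_clean)
    score_t = D_t(fake_clean, y_clean)
    score_s = D_s(fake_reflective, x_reflective)
    g_loss = (F.l1_loss(fake_clean, y_clean)
              + F.l1_loss(fake_reflective, x_reflective)
              + F.binary_cross_entropy_with_logits(score_t, torch.ones_like(score_t))
              + F.binary_cross_entropy_with_logits(score_s, torch.ones_like(score_s)))
    opt_G.zero_grad()
    g_loss.backward()
    opt_G.step()

    # Discriminators: score (original, original) pairs high and
    # (generated, original) pairs low.
    real_t = D_t(y_clean, y_clean)
    real_s = D_s(x_reflective, x_reflective)
    fake_t = D_t(fake_clean.detach(), y_clean)
    fake_s = D_s(fake_reflective.detach(), x_reflective)
    d_loss = (F.binary_cross_entropy_with_logits(real_t, torch.ones_like(real_t))
              + F.binary_cross_entropy_with_logits(real_s, torch.ones_like(real_s))
              + F.binary_cross_entropy_with_logits(fake_t, torch.zeros_like(fake_t))
              + F.binary_cross_entropy_with_logits(fake_s, torch.zeros_like(fake_s)))
    opt_D.zero_grad()
    d_loss.backward()
    opt_D.step()
    return g_loss.item(), d_loss.item()
```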
In some exemplary embodiments, if the type of retroreflection of the target workpiece image is a type of retroreflection, the method further comprises:
if the reflection type of the target workpiece image is the reflection type, determining a reflection degree estimated value of the target workpiece image through a reflection estimation model;
and determining the reflection degree corresponding to the reflection degree estimated value of the target workpiece image according to the preset mapping relation between the reflection degree estimated value and the reflection degree, wherein the reflection degree comprises a first reflection degree and a second reflection degree, and the first reflection degree is smaller than the second reflection degree.
In some exemplary embodiments, for images with a higher degree of reflection, the detection result may be unreliable even when the defect detection result is defect-free. To improve the accuracy of defect detection for such images, an image whose defect detection result is defect-free and whose degree of reflection is high may be rechecked. Specifically, if the surface defect detection result of the target workpiece image is defect-free and the reflection degree of the target workpiece image is the second reflection degree, the target workpiece image is determined to be an abnormal image.
Step 140, inputting the target workpiece image after the repairing operation into a defect detection network to output the surface defect detection result of the target workpiece image.
In some exemplary embodiments, to increase the efficiency of defect detection for non-retroreflective images, such images may be input directly into the defect detection network. Specifically, the method further comprises:
if the reflection type of the target workpiece image is the no-reflection type, inputting the target workpiece image into a defect detection network to output and obtain a surface defect detection result of the target workpiece image.
As an example, the process of surface defect detection will be described in detail with reference to the detailed flowchart of surface defect detection shown in fig. 6, and the flowchart may include:
s11, inputting an image.
Inputting a plurality of workpiece images (1-n) to be detected into the reflection estimation model.
S12, predicting the reflection degree.
The reflection estimation model estimates the degree of surface reflection of the plurality of workpiece images (1 to n) to be detected. For any workpiece image i among the plurality of workpiece images, the reflection estimation model determines its reflection estimated value R_i.
S13, determining whether R_i is greater than or equal to τ.
Based on the reflection estimated value R_i of workpiece image i and a preset reflection reliability threshold τ, corresponding reflection removal and defect detection operations can be performed on different types of reflective images, and the corresponding defect detection result or abnormal result is output. The reflection reliability threshold τ indicates the reliability of the reflection detection result of the reflection estimation model for workpiece image i: if R_i is higher than or equal to the threshold, the reflection detection result of the reflection estimation model for workpiece image i is reliable; if it is lower than the threshold, the result is unreliable, in which case the defect detection network can be used directly to perform defect detection on workpiece image i.
S14, inputting a defect detection network.
For a workpiece image i whose reflection estimated value satisfies R_i < τ, defect detection is performed through the defect detection network; S15 is executed to determine whether a detection result exists; if a detection result exists, it is output; if not, S21 is executed to output an abnormal result. Since the reflection estimated value R_i of workpiece image i is less than τ, if the defect detection network cannot obtain a detection result for workpiece image i, this indicates that workpiece image i is abnormal and needs to be checked manually; at this point an abnormal result can be output to indicate that workpiece image i should be checked manually.
S16, calculating argmax(R_i).
For workpiece images i whose reflection estimated value satisfies R_i ≥ τ, argmax(R_i) is used to select, from these workpiece images, the workpiece image i with the largest reflection estimated value, and the corresponding reflection removal and/or defect detection operation is performed on it. Specifically, whether the reflection type of workpiece image i is no reflection, weak reflection or strong reflection may be determined based on the preset mapping relationship between the reflection degree estimated value and the reflection degree.
S17, erasing the reflective area through the reflective erasing network.
For the workpiece image i with weak light reflection, firstly, erasing a light reflection area through a light reflection erasing network, then detecting the workpiece image i after light reflection erasing by using a defect detection network, and outputting a detection result.
For a workpiece image i with strong light reflection, firstly erasing a light reflection area through a light reflection erasing network, then detecting the workpiece image i after light reflection erasing by using a defect detection network, outputting a detection result, and outputting an abnormal result if the defect detection network cannot detect defect information.
S18, inputting the defect detection network.
And inputting the workpiece image i without the reflection type into a defect detection network for subsequent defect detection, and outputting a detection result.
S19, outputting a defect detection result.
S20, determining whether a defect detection result exists and whether workpiece image i is of the strong reflection type.
If the defect detection network cannot detect the defect information and the workpiece image i is of a strong reflection type, outputting an abnormal result.
S21, outputting an abnormal result.
And repeating the step S15 and the subsequent steps until the detection results of the workpiece images are output.
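For reference, the S11–S21 flow can be summarized by the following procedural sketch. The wrappers reflection_model, erase_network and defect_network are hypothetical stand-ins for the three networks described above, their method names and the default threshold value are assumptions, and the treatment of a missing detection result as an abnormal result follows S15, S20 and S21.

```python
# Hypothetical end-to-end routing of one workpiece image through the S11-S21 flow.
# The three model wrappers and their method names are assumptions for illustration.
def inspect(image, reflection_model, erase_network, defect_network, tau=0.5):
    r_i = reflection_model.estimate(image)            # S12: reflection estimate R_i
    if r_i < tau:                                     # S13: estimate not reliable
        result = defect_network.detect(image)         # S14/S15: detect directly
        return result if result is not None else "abnormal"   # S21

    degree = reflection_model.degree(r_i)             # no / weak / strong reflection
    if degree == "none":
        return defect_network.detect(image)           # S18/S19: no reflection
    repaired = erase_network.erase(image)             # S17: erase reflective area
    result = defect_network.detect(repaired)          # S19
    if result is None and degree == "strong":         # S20: strong reflection, no result
        return "abnormal"                             # S21
    return result
```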
The surface defect detection method provided by the embodiment of the application obtains a target workpiece image to be subjected to surface defect detection; and predicting the reflection type of the target workpiece image through a reflection estimation model, wherein the reflection estimation model is trained based on a plurality of workpiece images and corresponding reflection type labels, and the reflection type comprises a non-reflection type and a reflection type. And under the condition that the reflection type of the target workpiece image is determined to be the reflection type, repairing operation of erasing the reflection area is carried out on the target workpiece image through a reflection erasing network; the light reflection erasing network is obtained by training original light reflection images of a plurality of light reflection workpieces under a plurality of different imaging effects and original light reflection-free images of a plurality of light reflection workpieces, and finally, the target workpiece image after the repairing operation is input into the defect detection network, so that the surface defect detection result of the target workpiece image can be output and obtained.
In addition, the method provided by the embodiment can be applied to any application scene of defect detection of the workpiece with reflection.
It should be noted that, the execution subjects of each step of the method provided in the above embodiment may be the same device, or the method may also be executed by different devices. For example, the execution subject of steps 110 to 130 may be device a; for another example, the execution subject of steps 110 to 120 may be device a, and the execution subject of step 130 may be device B; etc.
It should be further noted that the surface defect detection method provided by the embodiments of the present application, and the network construction it involves, are not limited to the defect detection scene. If defect detection is replaced by other target detection, such as the detection of more general objects (for example, vehicles, persons or cats), the inventive concept is also applicable to such scenes.
In addition, in some of the flows described in the above embodiments and the drawings, a plurality of operations appearing in a particular order are included, but it should be clearly understood that the operations may be performed out of order or performed in parallel in the order in which they appear herein, the sequence numbers of the operations such as 110, 120, 510, etc. are merely used to distinguish between the various operations, and the sequence numbers themselves do not represent any order of execution. In addition, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel. It should be noted that, the descriptions of "first" and "second" herein are used to distinguish different messages, devices, modules, etc., and do not represent a sequential order, and the descriptions of "first" and "second" are not limited to different categories.
Fig. 7 is a schematic structural diagram of a surface defect detection apparatus 700 according to an exemplary embodiment of the present application. As shown in fig. 7, the apparatus 700 includes: an image acquisition module 710, a reflection estimation module 720, a reflection repair module 730 and a defect detection module 740, wherein:
an image acquisition module 710 for acquiring a target workpiece image to be subjected to surface defect detection;
the reflection estimation module 720 is configured to input the target workpiece image into a reflection estimation model to output a reflection type of the target workpiece image, where the reflection estimation model is trained based on a plurality of workpiece images and corresponding reflection type labels, and the reflection type includes a non-reflection type and a reflection type;
the light reflection repair module 730 is configured to, if the light reflection type of the target workpiece image is a light reflection type, perform repair operation of erasing the light reflection area on the target workpiece image through a light reflection erasing network; the reflection erasing network is obtained by training original reflection images based on a plurality of reflection workpieces under a plurality of different imaging effects and original non-reflection images of the plurality of reflection workpieces;
the defect detection module 740 is configured to input the target workpiece image after the repair operation into a defect detection network, so as to output a surface defect detection result of the target workpiece image.
Optionally, the apparatus further comprises: the training module of the reflection estimation model is used for:
acquiring a sample image data set of a workpiece to be detected in an industrial scene, wherein the sample image data set comprises a plurality of sample images with reflective surfaces, a plurality of sample images with non-reflective surfaces and corresponding reflective degree labels;
randomly selecting a plurality of images from the sample image dataset according to a preset training sample number;
performing a preset image enhancement operation on the plurality of images, wherein the preset image enhancement operation comprises at least one of turning, blurring, scaling and random cropping;
training to obtain the reflection estimation model based on the images after the image enhancement operation and a preset cross entropy loss function, wherein the reflection estimation model is constructed based on ResNet 18.
Optionally, the device further comprises a construction module of the light reflection erasing network, for:
acquiring original reflective images of a plurality of reflective workpieces under a plurality of different imaging effects and original non-reflective images of the plurality of reflective workpieces;
establishing a mapping relation between an original reflection image and an original non-reflection image of each reflection workpiece in the plurality of reflection workpieces to obtain a reflection erasing network training set;
Based on the training set of the light reflection erasing network, training to obtain a light reflection erasing network, wherein the light reflection erasing network comprises a first generator, a second generator, a first discriminator and a second discriminator, the first generator is connected with the first discriminator, the second generator is connected with the second discriminator, an original light reflection image of each light reflection workpiece in the plurality of light reflection workpieces is used as an input of the first generator, and an original non-light-reflection image of each light reflection workpiece in the plurality of light reflection workpieces is used as an input of the second generator;
the network structure of the network module of the first generator is the same as that of the network module of the second generator, and the network module of the first generator comprises an encoder, an auxiliary classifier and a decoder; the network structure of the network module of the first discriminator is the same as that of the network module of the second discriminator, and the network module of the first discriminator comprises an encoder and an auxiliary classifier; the first generator is used for generating a non-reflective image of the input reflective workpiece based on an original reflective image of the input reflective workpiece, and the first discriminator is used for determining similarity between the non-reflective image of the input reflective workpiece generated by the first generator module and the original non-reflective image of the input reflective workpiece; the second generator is used for generating a light reflecting image of the input light reflecting workpiece based on the original non-light reflecting image of the input light reflecting workpiece, and the second discriminator is used for determining the similarity between the light reflecting image of the input light reflecting workpiece generated by the second generator module and the original light reflecting image of the input light reflecting workpiece.
Optionally, the building module of the light reflection erasing network is specifically configured to, when training to obtain the light reflection erasing network based on the light reflection erasing network training set:
randomly selecting an original reflection image and an original non-reflection image of a target workpiece from the reflection removal data set, wherein the target workpiece is any one of the plurality of reflection workpieces;
taking the original glistening image of the target workpiece as an input of the first generator to generate a glistening image of the target workpiece based on the original glistening image of the target workpiece through the first generator, and taking the original glistening image of the target workpiece as an input of the second generator to generate a glistening image of the target workpiece based on the original glistening image of the target workpiece through the second generator;
taking an original light-reflecting image of the target workpiece and a light-reflecting image of the target workpiece as inputs of the first discriminator to determine a first similarity between the original light-reflecting image of the target workpiece and the light-reflecting image of the target workpiece by the first discriminator, and taking an original light-non-reflecting image of the target workpiece and a light-non-reflecting image of the target workpiece as inputs of the second discriminator to determine a second similarity between the original light-non-reflecting image of the target workpiece and the light-non-reflecting image of the target workpiece by the second discriminator;
Optimizing network parameters of network modules of the first generator and the second generator based on the first similarity and the second similarity;
repeating the steps of randomly selecting an original reflection image and an original non-reflection image of a target workpiece from the reflection removal data set according to a preset training round, and optimizing network parameters of network modules of the first generator and the second generator to obtain the reflection erasing network through training.
Optionally, the encoder is configured to perform a downsampling operation on an input image and perform a feature encoding operation through an image feature extraction network to obtain an encoded feature map; the auxiliary classifier is configured to generate a high-level semantic vector based on the encoded feature map to obtain an attention feature map; and the decoder is configured to perform a decoding operation on the attention feature map to obtain an image to be output and perform an upsampling operation on the image to be output to obtain an output image, where the decoder is composed of a fully connected layer and an adaptive image feature extraction network.
Optionally, if the reflection type of the target workpiece image is a reflection type, the apparatus further includes a reflection degree determining module, configured to:
If the reflection type of the target workpiece image is the reflection type, determining a reflection degree estimated value of the target workpiece image through the reflection estimated model;
and determining the reflection degree corresponding to the reflection degree estimated value of the target workpiece image according to the mapping relation between the preset reflection degree estimated value and the reflection degree, wherein the reflection degree comprises a first reflection degree and a second reflection degree, and the first reflection degree is smaller than the second reflection degree.
Optionally, the apparatus further includes an abnormal image determining module, configured to:
and if the surface defect detection result of the target workpiece image is defect-free and the reflection degree of the target workpiece image is the second reflection degree, determining that the target workpiece image is an abnormal image.
The surface defect detection apparatus 700 can implement the method of the method embodiment of fig. 1 to 6, and specifically, reference may be made to the surface defect detection method of the embodiment of fig. 1 to 6, which is not described herein.
The embodiment of the application further provides an electronic device, including: a processor, a memory and a bus, where the memory stores machine-readable instructions executable by the processor, the processor communicates with the memory through the bus, and the machine-readable instructions, when executed by the processor, perform the method for constructing a defect detection network or the defect detection method described above. Specifically, fig. 8 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application. As shown in fig. 8, the electronic device includes: a memory 81 and a processor 82.
Memory 81 is used to store computer programs and may be configured to store various other data to support operations on the computing device. Examples of such data include instructions for any application or method operating on a computing device, contact data, phonebook data, messages, images, video, and the like.
The processor 82, coupled to the memory 81, is configured to execute the computer program in the memory 81 for: acquiring a target workpiece image to be subjected to surface defect detection; inputting the target workpiece image into a reflection estimation model to output a reflection type of the target workpiece image, wherein the reflection estimation model is trained based on a plurality of workpiece images and corresponding reflection type labels, and the reflection type comprises a non-reflection type and a reflection type; if the reflection type of the target workpiece image is the reflection type, performing, through a reflection erasing network, a repair operation of erasing the reflective region on the target workpiece image, wherein the reflection erasing network is obtained by training based on original reflective images of a plurality of reflective workpieces under a plurality of different imaging effects and original non-reflective images of the plurality of reflective workpieces; and inputting the target workpiece image after the repair operation into a defect detection network to obtain a surface defect detection result of the target workpiece image.
The electronic device provided by the embodiment of the application can acquire a target workpiece image to be subjected to surface defect detection and predict the reflection type of the target workpiece image through a reflection estimation model, wherein the reflection estimation model is trained based on a plurality of workpiece images and corresponding reflection type labels, and the reflection type comprises a non-reflection type and a reflection type. When the reflection type of the target workpiece image is determined to be the reflection type, a repair operation of erasing the reflective region is performed on the target workpiece image through a reflection erasing network, wherein the reflection erasing network is obtained by training based on original reflective images of a plurality of reflective workpieces under a plurality of different imaging effects and original non-reflective images of the plurality of reflective workpieces. Finally, the target workpiece image after the repair operation is input into the defect detection network, so that the surface defect detection result of the target workpiece image can be obtained.
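The end-to-end flow executed by the processor can be summarized by the following hypothetical sketch; the model objects, the class-index convention and the function names are assumptions for illustration only.

```python
# Illustrative inference flow: classify the reflection type, erase the
# reflective region if needed, then run defect detection (assumptions:
# PyTorch tensors with batch size 1; index 1 of the classifier output
# denotes the reflection type; all names are placeholders).
import torch

@torch.no_grad()
def detect_surface_defects(image, reflection_model, erase_generator, defect_net):
    reflection_type = reflection_model(image).argmax(dim=1).item()  # 0: no reflection, 1: reflection
    if reflection_type == 1:
        image = erase_generator(image)   # repair operation: erase the reflective region
    return defect_net(image)             # surface defect detection result
```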
Further, as shown in fig. 8, the electronic device further includes: a communication component 83, a display 84, a power component 85, an audio component 86, and other components. Only some of the components are schematically shown in fig. 8, which does not mean that the electronic device only includes the components shown in fig. 8. In addition, depending on the implementation form of the electronic device, the components within the dashed box in fig. 8 are optional components rather than mandatory components. For example, when the electronic device is implemented as a terminal device such as a smart phone, a tablet computer, or a desktop computer, the components within the dashed box in fig. 8 may be included; when the electronic device is implemented as a server-side device such as a conventional server, a cloud server, a data center, or a server array, the components within the dashed box in fig. 8 may be omitted.
Accordingly, embodiments of the present application also provide a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the steps in the above-described embodiments of the method for constructing a defect detection network.
The communication component of fig. 8 is configured to facilitate wired or wireless communication between the device in which the communication component is located and other devices. The device in which the communication component is located may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component may further include a Near Field Communication (NFC) module, which may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra-Wideband (UWB) technology, Bluetooth (BT) technology, and the like.
The memory of fig. 8 described above may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The display in fig. 8 described above includes a screen, which may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation.
The power supply assembly shown in fig. 8 provides power to various components of the device in which the power supply assembly is located. The power components may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the devices in which the power components are located.
The audio component of fig. 8 described above may be configured to output and/or input audio signals. For example, the audio component includes a Microphone (MIC) configured to receive external audio signals when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, and a speech recognition mode. The received audio signal may be further stored in a memory or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include a volatile memory in a computer-readable medium, a random access memory (RAM) and/or a non-volatile memory, such as a read-only memory (ROM) or a flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape/magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and changes may be made to the present application by those skilled in the art. Any modifications, equivalent substitutions, improvements, etc. which are within the spirit and principles of the present application are intended to be included within the scope of the claims of the present application.

Claims (10)

1. A surface defect detection method, comprising:
acquiring a target workpiece image to be subjected to surface defect detection;
inputting the target workpiece image into a reflection estimation model to output a reflection type of the target workpiece image, wherein the reflection estimation model is trained based on a plurality of workpiece images and corresponding reflection type labels, and the reflection type comprises a non-reflection type and a reflection type;
if the reflection type of the target workpiece image is the reflection type, performing, through a reflection erasing network, a repair operation of erasing a reflective region on the target workpiece image; the reflection erasing network is obtained by training based on original reflective images of a plurality of reflective workpieces under a plurality of different imaging effects and original non-reflective images of the plurality of reflective workpieces;
and inputting the target workpiece image after the repair operation into a defect detection network to obtain a surface defect detection result of the target workpiece image.
2. The method of claim 1, wherein the training process of the glistening estimation model comprises:
acquiring a sample image data set of a workpiece to be detected in an industrial scene, wherein the sample image data set comprises a plurality of sample images with reflective surfaces, a plurality of sample images with non-reflective surfaces and corresponding reflective degree labels;
randomly selecting a plurality of images from the sample image dataset according to a preset training sample number;
performing a preset image enhancement operation on the plurality of images, wherein the preset image enhancement operation comprises at least one of turning, blurring, scaling and random cropping;
training to obtain the reflection estimation model based on the images after the image enhancement operation and a preset cross entropy loss function, wherein the reflection estimation model is constructed based on ResNet18.
3. The method of claim 1, wherein the construction process of the reflection erasing network comprises:
acquiring original reflective images of a plurality of reflective workpieces under a plurality of different imaging effects and original non-reflective images of the plurality of reflective workpieces;
establishing a mapping relation between the original reflective image and the original non-reflective image of each reflective workpiece in the plurality of reflective workpieces, to obtain a reflection erasing network training set;
based on the reflection erasing network training set, training to obtain the reflection erasing network, wherein the reflection erasing network comprises a first generator, a second generator, a first discriminator and a second discriminator, the first generator is connected with the first discriminator, the second generator is connected with the second discriminator, an original reflective image of each reflective workpiece in the plurality of reflective workpieces is used as an input of the first generator, and an original non-reflective image of each reflective workpiece in the plurality of reflective workpieces is used as an input of the second generator;
the network structure of the network module of the first generator is the same as that of the network module of the second generator, and the network module of the first generator comprises an encoder, an auxiliary classifier and a decoder; the network structure of the network module of the first discriminator is the same as that of the network module of the second discriminator, and the network module of the first discriminator comprises an encoder and an auxiliary classifier; the first generator is used for generating a non-reflective image of the input reflective workpiece based on an original reflective image of the input reflective workpiece, and the first discriminator is used for determining a similarity between the non-reflective image of the input reflective workpiece generated by the first generator and the original non-reflective image of the input reflective workpiece; the second generator is used for generating a reflective image of the input reflective workpiece based on the original non-reflective image of the input reflective workpiece, and the second discriminator is used for determining a similarity between the reflective image of the input reflective workpiece generated by the second generator and the original reflective image of the input reflective workpiece.
4. The method of claim 3, wherein training to obtain the reflection erasing network based on the reflection erasing network training set comprises:
randomly selecting an original reflective image and an original non-reflective image of a target workpiece from the reflection removal data set, wherein the target workpiece is any one of the plurality of reflective workpieces;
taking the original reflective image of the target workpiece as an input of the first generator, so as to generate, through the first generator, a non-reflective image of the target workpiece based on the original reflective image of the target workpiece, and taking the original non-reflective image of the target workpiece as an input of the second generator, so as to generate, through the second generator, a reflective image of the target workpiece based on the original non-reflective image of the target workpiece;
taking the non-reflective image of the target workpiece generated by the first generator and the original non-reflective image of the target workpiece as inputs of the first discriminator, so as to determine, through the first discriminator, a first similarity between the generated non-reflective image and the original non-reflective image of the target workpiece, and taking the reflective image of the target workpiece generated by the second generator and the original reflective image of the target workpiece as inputs of the second discriminator, so as to determine, through the second discriminator, a second similarity between the generated reflective image and the original reflective image of the target workpiece;
optimizing network parameters of the network modules of the first generator and the second generator based on the first similarity and the second similarity;
repeating, according to a preset number of training rounds, the steps from randomly selecting an original reflective image and an original non-reflective image of a target workpiece from the reflection removal data set to optimizing the network parameters of the network modules of the first generator and the second generator, so as to obtain the reflection erasing network through training.
5. The method according to claim 3 or 4, wherein the encoder is configured to perform a downsampling operation on the input image and a feature encoding operation through an image feature extraction network, respectively, so as to obtain an encoded feature map; the auxiliary classifier is configured to generate a high-semantic vector based on the encoded feature map, so as to obtain an attention feature map; and the decoder is configured to perform a decoding operation on the attention feature map to obtain an image to be output and perform an upsampling operation on the image to be output to obtain an output image, wherein the decoder is composed of a fully connected layer and an adaptive image feature extraction network.
6. The method of claim 1, wherein if the type of retroreflection of the target workpiece image is a type of retroreflection, the method further comprises:
if the reflection type of the target workpiece image is the reflection type, determining a reflection degree estimated value of the target workpiece image through the reflection estimation model;
and determining the reflection degree corresponding to the reflection degree estimated value of the target workpiece image according to the mapping relation between the preset reflection degree estimated value and the reflection degree, wherein the reflection degree comprises a first reflection degree and a second reflection degree, and the first reflection degree is smaller than the second reflection degree.
7. The method of claim 6, wherein the method further comprises:
and if the surface defect detection result of the target workpiece image is defect-free and the reflection degree of the target workpiece image is the second reflection degree, determining that the target workpiece image is an abnormal image.
8. A surface defect inspection apparatus, comprising:
the image acquisition module is used for acquiring a target workpiece image to be subjected to surface defect detection;
the reflection estimation module is used for inputting the target workpiece image into a reflection estimation model to output a reflection type of the target workpiece image, wherein the reflection estimation model is trained based on a plurality of workpiece images and corresponding reflection type labels, and the reflection type comprises a non-reflection type and a reflection type;
the reflection repair module is used for performing, through a reflection erasing network, a repair operation of erasing a reflective region on the target workpiece image if the reflection type of the target workpiece image is the reflection type; the reflection erasing network is obtained by training based on original reflective images of a plurality of reflective workpieces under a plurality of different imaging effects and original non-reflective images of the plurality of reflective workpieces;
and the defect detection module is used for inputting the target workpiece image after the repair operation into a defect detection network to obtain a surface defect detection result of the target workpiece image.
9. An electronic device, comprising: a memory and a processor;
the memory is used for storing a computer program;
the processor, coupled to the memory, is configured to execute the computer program for:
acquiring a target workpiece image to be subjected to surface defect detection;
inputting the target workpiece image into a reflection estimation model to output a reflection type of the target workpiece image, wherein the reflection estimation model is trained based on a plurality of workpiece images and corresponding reflection type labels, and the reflection type comprises a non-reflection type and a reflection type;
if the reflection type of the target workpiece image is the reflection type, performing, through a reflection erasing network, a repair operation of erasing a reflective region on the target workpiece image; the reflection erasing network is obtained by training based on original reflective images of a plurality of reflective workpieces under a plurality of different imaging effects and original non-reflective images of the plurality of reflective workpieces;
and inputting the target workpiece image after the repair operation into a defect detection network to obtain a surface defect detection result of the target workpiece image.
10. A computer readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, causes the processor to carry out the steps in the surface defect detection method according to any one of claims 1 to 7.
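As an illustrative aid to the training process recited in claim 2, the following minimal sketch assumes PyTorch and torchvision; the augmentation parameters, learning rate and two-class head are placeholders and are not specified by the claim.

```python
# Minimal sketch of training the reflection estimation model of claim 2
# (assumptions: PyTorch / torchvision >= 0.13; hyperparameters are
# illustrative only).
import torch
import torch.nn as nn
from torchvision import models, transforms

# preset image enhancement: flip, blur, scale, random crop
augment = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.GaussianBlur(kernel_size=3),
    transforms.RandomResizedCrop(224, scale=(0.8, 1.0)),  # scaling + random crop
    transforms.ToTensor(),
])  # applied to each sampled image when building a training batch

# reflection estimation model constructed based on ResNet18
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, 2)  # non-reflection / reflection classes
criterion = nn.CrossEntropyLoss()              # preset cross entropy loss
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```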
CN202311154548.7A 2023-09-07 2023-09-07 Surface defect detection method, device and equipment Active CN117333383B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311154548.7A CN117333383B (en) 2023-09-07 2023-09-07 Surface defect detection method, device and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311154548.7A CN117333383B (en) 2023-09-07 2023-09-07 Surface defect detection method, device and equipment

Publications (2)

Publication Number Publication Date
CN117333383A true CN117333383A (en) 2024-01-02
CN117333383B CN117333383B (en) 2024-05-24

Family

ID=89294185

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311154548.7A Active CN117333383B (en) 2023-09-07 2023-09-07 Surface defect detection method, device and equipment

Country Status (1)

Country Link
CN (1) CN117333383B (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112102201A (en) * 2020-09-24 2020-12-18 深圳市赛为智能股份有限公司 Image shadow reflection eliminating method and device, computer equipment and storage medium
WO2022127919A1 (en) * 2020-12-17 2022-06-23 杭州海康威视数字技术股份有限公司 Surface defect detection method, apparatus, system, storage medium, and program product
CN113570549A (en) * 2021-06-30 2021-10-29 青岛海尔科技有限公司 Defect detection method and device for reflective surface
CN115100191A (en) * 2022-08-22 2022-09-23 南通恒强轧辊有限公司 Metal casting defect identification method based on industrial detection
CN115482220A (en) * 2022-09-21 2022-12-16 淮阴工学院 High-reflectivity metal surface defect detection method based on improved fast RCNN
CN115880223A (en) * 2022-11-10 2023-03-31 淮阴工学院 Improved YOLOX-based high-reflectivity metal surface defect detection method
CN116485735A (en) * 2023-04-06 2023-07-25 深兰人工智能应用研究院(山东)有限公司 Defect detection method, defect detection device, electronic equipment and computer readable storage medium
CN116611748A (en) * 2023-07-20 2023-08-18 吴江市高瑞庭园金属制品有限公司 Titanium alloy furniture production quality monitoring system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Yu Lei: "Research on Image Reflection Removal Algorithms Based on Deep Learning", China Master's Theses Full-text Database, no. 2022, 15 January 2022 (2022-01-15), pages 138 - 2306 *
Yan Xiaoyan: "Application of the Mask R-CNN Algorithm in Track Fastener Defect Detection", Mechanical Engineer, no. 2021, 9 April 2021 (2021-04-09), pages 20 - 23 *

Also Published As

Publication number Publication date
CN117333383B (en) 2024-05-24

Similar Documents

Publication Publication Date Title
CN111612751B (en) Lithium battery defect detection method based on Tiny-yolov3 network embedded with grouping attention module
CN110675399A (en) Screen appearance flaw detection method and equipment
CN110210513B (en) Data classification method and device and terminal equipment
CN110796647A (en) Method and device for detecting defects of screen area of electronic device
CN110827249A (en) Electronic equipment backboard appearance flaw detection method and equipment
CN112948937B (en) Intelligent pre-judging method and device for concrete strength
CN110335313B (en) Audio acquisition equipment positioning method and device and speaker identification method and system
CN113469807B (en) Credit risk determination and data processing method, apparatus, medium, and program product
CN113658182B (en) Surface defect region segmentation method and device based on parallel multi-branch feature fusion
US20220084234A1 (en) Method and electronic device for identifying size of measurement target object
CN115797349A (en) Defect detection method, device and equipment
CN109584262A (en) Cloud detection method of optic, device and electronic equipment based on remote sensing image
CN109671051A (en) Picture quality detection model training method and device, electronic equipment and storage medium
WO2021147055A1 (en) Systems and methods for video anomaly detection using multi-scale image frame prediction network
CN117693754A (en) Training masked automatic encoders for image restoration
CN113466839B (en) Side-scan sonar sea bottom line detection method and device
CN111929688B (en) Method and equipment for determining radar echo prediction frame sequence
CN113269307B (en) Neural network training method and target re-identification method
CN117333383B (en) Surface defect detection method, device and equipment
CN117437455A (en) Method, device, equipment and readable medium for determining wafer defect mode
CN115810011A (en) Training method, device and equipment for anomaly detection network and anomaly detection method, device and equipment
CN114550129B (en) Machine learning model processing method and system based on data set
CN115424000A (en) Pointer instrument identification method, system, equipment and storage medium
CN115861160A (en) Method and device for detecting surface defects of power interface of mobile phone and storage medium
CN114841255A (en) Detection model training method, device, equipment, storage medium and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant