CN112700442A - Die-cutting machine workpiece defect detection method and system based on Faster R-CNN - Google Patents


Info

Publication number
CN112700442A
CN112700442A (Application No. CN202110134056.6A)
Authority
CN
China
Prior art keywords
image
target
defect
detected
feature map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110134056.6A
Other languages
Chinese (zh)
Inventor
施恒之
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Yikm Intelligent Technology Co ltd
Original Assignee
Zhejiang Yikm Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Yikm Intelligent Technology Co ltd filed Critical Zhejiang Yikm Intelligent Technology Co ltd
Priority to CN202110134056.6A priority Critical patent/CN112700442A/en
Publication of CN112700442A publication Critical patent/CN112700442A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]

Abstract

The application relates to a method and system for detecting die-cutting machine workpiece defects based on Faster R-CNN. The method comprises the following steps: collecting an image to be detected; inputting the image to be detected into a trained target detection model, in which candidate boxes are generated from a first feature map by the region proposal network (RPN) of the target detection model, and the defect targets of the image to be detected are determined based on the candidate boxes; in the region-of-interest pooling layer (ROI Pooling) of the target detection model, mapping the candidate boxes onto the first feature map to obtain second feature maps with the candidate boxes mapped onto them, pooling the second feature maps to obtain third feature maps of identical size, inputting the third feature maps into the fully connected layers of the target detection model, outputting target classification labels, and determining the defect classification results corresponding to the defect targets of the image to be detected based on the target classification labels. By adding the feature pyramid network (FPN), the invention improves the accuracy of model recognition under multi-scale variation and can detect defect targets of multiple scales on the same product.

Description

Die-cutting machine workpiece defect detection method and system based on Faster R-CNN
Technical Field
The application relates to the technical field of target detection, in particular to a die-cutting machine workpiece defect detection method and system based on Faster R-CNN.
Background
With the rapid development of the economy, the manufacturing industry in China has also developed rapidly, and the center of global manufacturing has gradually shifted to China. As production and manufacturing capacity continue to expand, ever higher requirements are placed on guaranteeing product quality.
The multi-station rotary die-cutting machine is a common machine in industrial production. It performs continuous rotary die-cutting with a hob and is mainly used for die-cutting, creasing and gold-stamping operations, laminating, and automatic waste discharge of adhesive stickers, EVA (ethylene vinyl acetate), double-sided adhesive, electronics, mobile phone rubber mats and the like. Die-cut products are widely used in mobile phones, computers, liquid crystal displays, digital cameras, LCD backlight sources and other fields. For the die-cutting machine, control of the tension and flatness of the fed material is critical: because the machine runs at high speed, continuous changes in tension can overload the equipment or degrade the flatness of the material, producing various defects. If defects such as transverse lines, folds or scratches are present in a patch, the usability of the electronic product is greatly affected, so defect detection of die-cut products is an indispensable step.
At present, most die-cutting machine workpiece production still relies on traditional manual visual inspection, which has many shortcomings: the spatial resolution of the human eye is limited, and long periods of high-intensity work cause fatigue, which easily leads to false detections and missed detections and thus affects the quality of die-cutting machine workpieces to a considerable extent. In addition, manual visual inspection occupies considerable human resources, so production efficiency is low and the production costs of enterprises rise.
In the related art, defects of common industrial targets are detected with traditional machine-learning methods: various features are extracted through assorted preprocessing steps and hand-designed descriptors, and detection of some defects is accomplished on that basis.
At present, no effective solution has been proposed for the problems of manual inspection and defect detection based on traditional machine vision in the related art.
Disclosure of Invention
The embodiments of the application provide a die-cutting machine workpiece defect detection method and system based on Faster R-CNN, which at least solve the problem that manual inspection and defect detection based on traditional machine vision in the related art have difficulty effectively and accurately detecting products with multiple kinds of defects.
In a first aspect, an embodiment of the present application provides a die-cutting machine workpiece defect detection method based on Faster R-CNN, the method comprising: collecting an image to be detected; inputting the image to be detected into a trained target detection model, wherein a first feature map is extracted from the image to be detected by the feature extraction network and feature pyramid network of the target detection model, candidate boxes are generated from the first feature map by the region proposal network (RPN) of the target detection model, and the defect targets of the image to be detected are determined based on the candidate boxes; in the region-of-interest pooling layer (ROI Pooling) of the target detection model, mapping the candidate boxes onto the first feature map to obtain second feature maps with the candidate boxes mapped onto them, pooling the second feature maps to obtain third feature maps of identical size, inputting the third feature maps into the fully connected layers of the target detection model, outputting target classification labels, and determining the defect classification results corresponding to the defect targets of the image to be detected based on the target classification labels.
In a second aspect, an embodiment of the present application provides a die-cutting machine workpiece defect detection system based on Faster R-CNN, including: the acquisition module is used for acquiring an image to be detected; and the detection module is used for inputting the image to be detected into the target detection model and determining a defect target and a defect classification result of the image to be detected.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute the computer program to execute the method for detecting the workpiece defect of the die cutting machine based on the Faster R-CNN according to the first aspect.
Compared with the related art, the die-cutting machine workpiece defect detection method and system based on Faster R-CNN provided by the embodiments of the application address the problems of manual inspection and defect detection based on traditional machine vision. By adding the feature pyramid network (FPN), multi-scale defect targets on a product can be detected, and by combining the multi-scale defect targets with defect classification, multiple kinds of defects on the product can be identified.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below to provide a more thorough understanding of the application.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a flow chart of a workpiece defect detection method for a Faster R-CNN based die cutting machine according to an embodiment of the present application;
FIG. 2 is a framework diagram of the ResNet + FPN + Faster-R-CNN algorithm for detecting an image to be detected according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a photographing device of a workpiece defect detection method of a die-cutting machine based on Faster R-CNN;
FIG. 4 is a schematic diagram of an FPN infrastructure according to an embodiment of the present application;
FIG. 5 is a flow diagram of training a target detection model according to an embodiment of the present application;
FIG. 6 is a block diagram of a die-cutting machine workpiece defect detection system based on Faster R-CNN according to an embodiment of the present application;
fig. 7 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with one or more embodiments of the present specification. Rather, they are merely examples of apparatus and methods consistent with certain aspects of one or more embodiments of the specification, as detailed in the claims which follow.
It should be noted that: in other embodiments, the steps of the corresponding methods are not necessarily performed in the order shown and described herein. In some other embodiments, the method may include more or fewer steps than those described herein. Moreover, a single step described in this specification may be broken down into multiple steps for description in other embodiments; multiple steps described in this specification may be combined into a single step in other embodiments.
Example one
Referring to fig. 1, a method for detecting defects of a die-cutting machine workpiece based on Faster R-CNN according to a first embodiment of the present invention is shown, the method includes an image acquisition step and an image processing step, and specifically, the method includes:
step 101, collecting an image to be detected;
step 102, inputting the image to be detected into a trained target detection model, wherein a first feature map is extracted from the image to be detected by the feature extraction network and feature pyramid network of the target detection model, candidate boxes are generated from the first feature map by the region proposal network (RPN) of the target detection model, and the defect targets of the image to be detected are determined based on the candidate boxes;
step 103, in the region-of-interest pooling layer (ROI Pooling) of the target detection model, mapping the candidate boxes onto the first feature map to obtain second feature maps with the candidate boxes mapped onto them, pooling the second feature maps to obtain third feature maps of identical size, inputting the third feature maps into the fully connected layers of the target detection model, outputting target classification labels, and determining the defect classification results corresponding to the defect targets of the image to be detected based on the target classification labels.
In this embodiment, please refer to fig. 2, which shows the framework used for detecting an image to be detected according to an embodiment of the present invention. By adding the feature pyramid network (FPN), multi-scale defect targets on a product can be detected, and by combining the multi-scale defect targets with defect classification, multiple kinds of defects on a product can be identified. Specifically, a plurality of first feature maps of different scales are obtained after the FPN structure: the feature pyramid is combined with the feature extraction network, the feature maps are up-sampled after the feature extraction network has down-sampled them, feature maps of the same size from the two sampling paths are laterally connected and added, and the result is a set of first feature maps of multiple scales after feature fusion. These multi-scale first feature maps then pass through the region proposal network (RPN) to generate candidate boxes. In fig. 3, an identified defect target is represented by a rectangular box with the defect type marked next to it; the target labeled 03 in fig. 3, for example, is a notch. The multi-scale targets of this scheme refer to defect targets of different sizes on the same product. The shooting device acquires the image to be detected by photographing workpieces conveyed on a track during production of die-cutting machine products; a workpiece may carry several defects of different sizes, such as in-plane foreign matter, scratches, notches, wrinkles, bubbles and dirt. Identifying workpiece defect targets differs from conventional recognition because a single workpiece is small and may carry several different defects, or defects of different kinds, at once, and traditional machine-learning detection methods struggle to detect products with multiple kinds of defects effectively and accurately. By adding the feature pyramid network FPN, the application aggregates local context information at three scales in the feature fusion unit of the FPN: deep features contain more semantic information and have a sufficiently large receptive field, while shallow features contain more detail. This fusion comes closer to fusing global and local features and therefore produces more discriminative features, so the target detection model as a whole can recognize and handle defects of different sizes, improving the detection precision for the various defects on the same workpiece.
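For orientation only, the following sketch shows how such a Faster R-CNN detector with a ResNet50 + FPN backbone could be assembled with the torchvision reference implementation; the seven classes (background plus the six defect types named above) and all other details are illustrative assumptions, not the applicant's actual code.

```python
# Hedged sketch: a ResNet50 + FPN Faster R-CNN from torchvision, with its box
# predictor replaced for six defect classes plus background (an assumption
# based on the defect types listed in this paragraph, not the applicant's code).
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

model = torchvision.models.detection.fasterrcnn_resnet50_fpn()  # randomly initialised
num_classes = 7  # background + foreign matter, scratch, notch, wrinkle, bubble, dirt
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)

model.eval()
image = torch.rand(3, 600, 800)               # stand-in for an image to be detected
with torch.no_grad():
    prediction = model([image])[0]            # dict with 'boxes', 'labels', 'scores'
print(prediction["boxes"].shape, prediction["labels"].shape)
```

In an actual deployment such a model would of course first be trained on the annotated (and GAN-expanded) workpiece images described below before being used for inference.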
In step 101, the image to be detected is acquired in one or more of the following ways: photographing the appearance of the die-cutting machine workpiece at a preset frequency by controlling a shooting device; or manually controlling the shooting device to acquire the image; or triggering the shooting device to acquire the workpiece image according to the position change of the die-cutting machine.
In this step, the shooting device captures images of workpieces conveyed on the track during production of die-cutting machine products, and while the workpieces are being conveyed the image to be detected can be acquired by manual or automatic triggering. Compared with traditional manual visual inspection, this prevents missed detections and over-detections, achieves higher detection efficiency, and consumes far fewer human resources.
In step 101, the shooting device comprises an industrial camera and a machine vision light source; the industrial camera is used together with at least one set of machine vision light sources to photograph the appearance of the die-cutting machine workpiece and acquire the workpiece image and/or the image to be detected.
For example, please refer to fig. 3, which is a schematic diagram of the shooting device. In this step, the shooting device is used to collect a clear image; whether the input image is clear directly affects the output of the target detection model, and pairing a suitable camera and light source magnifies the details of the target defect, reduces interfering features and improves overall detection precision. In fig. 3, an industrial CCD camera is paired with a lens; the distance between the lower end of the lens and the upper surface of the die-cutting machine workpiece to be imaged is 350-450 mm. The sharpness of the image is adjusted by adjusting the aperture and focal length of the lens in combination with several bar-shaped light sources, where the light sources may take different shapes such as rectangular or circular, the distance between the two light sources is 300-400 mm, and the distance between the two light sources and the upper surface of the die-cutting machine workpiece to be imaged is 150-220 mm.
It is worth noting that the distance between the lower end of the lens and the upper surface of the die-cutting machine workpiece to be imaged can be 370 mm, 384 mm, 395 mm, 428 mm or 450 mm, the distance between the two light sources can be 320 mm, 340 mm or 360 mm, and the distance between the two light sources and the upper surface of the workpiece can be 160 mm, 180 mm or 200 mm. In practical applications these distances can be adjusted according to the type, size and other requirements of the die-cutting machine workpiece; the goal is to collect a clear image of the surface of the die-cutting machine workpiece.
In step 102, the feature extraction network may adopt ResNet. ResNet uses residual connections and BN (batch normalization), which simplifies the network parameters and avoids the problem that, as model depth increases, an excess of parameters prevents the training model from converging, making deep models easier to train. ResNet is deeper than VGG and therefore has stronger learning ability.
Illustratively, ResNet50 may be used in this step. In this network family, ResNet101 adds convolution blocks to the fourth stage of ResNet50, and ResNet152 adds convolution blocks to both the third and fourth stages. Compared with ResNet101 and ResNet152, ResNet50 achieves almost the same accuracy as ResNet152 with far less computation, making it better suited to industrial scenes with massive numbers of workpiece images.
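As a concrete illustration of the backbone discussed above, the following sketch (an assumption using torchvision's ResNet50 rather than the applicant's network) exposes the four stage outputs that an FPN would consume; the printed shapes show the spatial size halving and the channel width growing at each stage.

```python
# Illustrative sketch (an assumption, not the applicant's code): exposing the
# conv2-conv5 stage outputs of a torchvision ResNet50 so they can feed an FPN.
import torch
import torch.nn as nn
import torchvision

class ResNet50Stages(nn.Module):
    """Returns the C2..C5 feature maps of ResNet50."""
    def __init__(self):
        super().__init__()
        r = torchvision.models.resnet50()                 # randomly initialised here
        self.stem = nn.Sequential(r.conv1, r.bn1, r.relu, r.maxpool)
        self.layer1, self.layer2 = r.layer1, r.layer2     # -> C2, C3
        self.layer3, self.layer4 = r.layer3, r.layer4     # -> C4, C5

    def forward(self, x):
        x = self.stem(x)
        c2 = self.layer1(x)
        c3 = self.layer2(c2)
        c4 = self.layer3(c3)
        c5 = self.layer4(c4)
        return c2, c3, c4, c5

stages = ResNet50Stages()
outs = stages(torch.randn(1, 3, 512, 512))
print([tuple(o.shape) for o in outs])
# [(1, 256, 128, 128), (1, 512, 64, 64), (1, 1024, 32, 32), (1, 2048, 16, 16)]
```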
In step 102, to address the multi-scale variation of defect targets, a feature pyramid network FPN is added for feature fusion, so that target detection can be performed on multiple targets in the acquired image to be detected and the detection accuracy of the model is improved. For example, referring to fig. 4, which shows the basic FPN structure, the FPN can be regarded as consisting of three parts: a bottom-up path, a top-down path and lateral connections. In the bottom-up path, the ResNet convolutional neural network extracts features in the conventional way, with the convolution stride set to s = 2 at each of the convolutional stages conv1-conv4, so that after each stage of convolution the feature map is scaled to 0.5 times the size of the previous stage, and so on.
In the top-down path, the feature map is up-sampled starting from the highest layer; assuming the rows and columns of the feature map are each repeated twice to complete each up-sampling operation, the feature map after each operation is twice the size of the map at the previous stage. The three feature maps generated by the top-down path on the right side of the figure therefore have the same sizes as the three feature maps generated by the bottom-up path on the left side.
The lateral connection adds the two feature maps of the same size from the two paths, and a 3 x 3 convolution is then applied to the summed feature map. The three resulting feature maps of different scales therefore incorporate multi-scale characteristics, which reduces the loss of targets of different scales during down-sampling and improves accuracy on the multi-scale target detection problem.
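The lateral-connection and top-down fusion just described can be sketched as follows; this minimal FPN module is an illustrative assumption (three input levels with the channel widths of ResNet stages conv3-conv5), not the patent's implementation.

```python
# Minimal FPN sketch (an illustrative assumption, not the patent's code):
# 1x1 lateral convolutions, 2x nearest-neighbour upsampling in the top-down
# path, element-wise addition, then a 3x3 smoothing convolution per level.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleFPN(nn.Module):
    def __init__(self, in_channels=(512, 1024, 2048), out_channels=256):
        super().__init__()
        self.lateral = nn.ModuleList(nn.Conv2d(c, out_channels, 1) for c in in_channels)
        self.smooth = nn.ModuleList(nn.Conv2d(out_channels, out_channels, 3, padding=1)
                                    for _ in in_channels)

    def forward(self, feats):                       # feats = (c3, c4, c5), fine to coarse
        laterals = [lat(f) for lat, f in zip(self.lateral, feats)]
        for i in range(len(laterals) - 2, -1, -1):  # top-down: upsample coarser map, add
            laterals[i] = laterals[i] + F.interpolate(
                laterals[i + 1], scale_factor=2, mode="nearest")
        return [s(l) for s, l in zip(self.smooth, laterals)]   # fused multi-scale maps

fpn = SimpleFPN()
c3 = torch.randn(1, 512, 64, 64)
c4 = torch.randn(1, 1024, 32, 32)
c5 = torch.randn(1, 2048, 16, 16)
print([tuple(p.shape) for p in fpn((c3, c4, c5))])
# [(1, 256, 64, 64), (1, 256, 32, 32), (1, 256, 16, 16)]
```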
In this step, the multi-layer feature maps of different scales are used as the input of the region proposal network RPN.
Referring to fig. 2, which shows the framework of the ResNet + FPN + Faster R-CNN algorithm, the detection method specifically comprises: inputting an image to be detected of arbitrary size; extracting features through the ResNet feature extraction network and, in combination with the FPN network, outputting multi-layer feature maps of different scales; and inputting these feature maps into the RPN network. On this basis, the feature maps first pass through the RPN to generate candidate boxes: softmax judges whether each anchor corresponds to a foreground prediction or to background, and bounding box regression then corrects the anchors, that is, the candidate boxes are further refined to improve their accuracy. In other words, the anchors are used to improve the positioning accuracy of the candidate boxes, and 9 anchors are generally set per position.
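The "9 anchors per position" mentioned above usually means three scales crossed with three aspect ratios; the small sketch below generates such a set for one feature-map location (the scale and ratio values are illustrative assumptions, not values from the patent).

```python
# Hedged sketch of the anchor idea: 3 scales x 3 aspect ratios centred on one
# feature-map location, giving the 9 candidate anchor boxes referenced above.
import itertools
import math

def anchors_at(cx, cy, scales=(64, 128, 256), ratios=(0.5, 1.0, 2.0)):
    """Return 9 (x1, y1, x2, y2) boxes centred at (cx, cy); ratio = width / height."""
    boxes = []
    for s, r in itertools.product(scales, ratios):
        w = s * math.sqrt(r)          # keep the box area close to s * s
        h = s / math.sqrt(r)
        boxes.append((cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2))
    return boxes

for box in anchors_at(100, 100):
    print([round(v, 1) for v in box])   # prints the 9 candidate anchor boxes
```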
Since the fully connected layers can only accept input of a fixed size, the candidate boxes are mapped onto the first feature map through the region-of-interest pooling layer (ROI Pooling). Assuming the fixed output size is 3 x 3, a candidate box is divided into 9 equal parts and max pooling is performed on each part, yielding 9 values; the multi-layer second feature maps of different scales and their corresponding candidate boxes are then max-pooled in the same way, so that after ROI Pooling the feature map mapped from each candidate box has a fixed output size. Finally, two fully connected layers output the classification results and the regression of the candidate boxes, completing detection. It should be noted that in this scheme the first feature map, the second feature map and the third feature map are names defined only to distinguish different feature maps: the first feature maps are the multi-scale feature maps output by the feature pyramid, the second feature maps are the feature maps onto which the candidate boxes are mapped, and the third feature maps are the equally sized feature maps obtained by pooling.
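A minimal, assumption-laden sketch of this ROI max-pooling step is given below: the region of a feature map covered by a candidate box is split into a fixed 3 x 3 grid and each cell is max-pooled, so every candidate box produces an output of the same size for the fully connected layers.

```python
# Illustrative sketch (an assumption, not the applicant's code) of ROI max
# pooling a candidate box on a feature map into a fixed 3x3 output.
import torch

def roi_max_pool(feature_map, box, output_size=3):
    """feature_map: (C, H, W); box: (x1, y1, x2, y2) in feature-map coordinates."""
    x1, y1, x2, y2 = [int(round(v)) for v in box]
    region = feature_map[:, y1:y2, x1:x2]
    c, h, w = region.shape
    pooled = torch.empty(c, output_size, output_size)
    for i in range(output_size):
        for j in range(output_size):
            ys = slice(i * h // output_size,
                       max((i + 1) * h // output_size, i * h // output_size + 1))
            xs = slice(j * w // output_size,
                       max((j + 1) * w // output_size, j * w // output_size + 1))
            pooled[:, i, j] = region[:, ys, xs].amax(dim=(1, 2))
    return pooled

fmap = torch.randn(256, 64, 64)                      # one "second feature map" level
print(roi_max_pool(fmap, (10, 12, 31, 40)).shape)    # torch.Size([256, 3, 3])
```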
In step 102, when training the target detection model, the sample images are acquired as follows: workpiece images containing workpiece defects are collected and input into a GAN model for defect sample expansion, yielding the sample images.
In this step, sample expansion improves the accuracy of the trained target detection model at recognition time. Specifically, traditional machine-learning detection methods extract various hand-designed features to train the model, but such methods are strongly affected by the manual feature design: because of its subjective limitations, manual design cannot fully capture all defect features, especially those of defect types with few samples. The samples are therefore expanded because the data are not rich enough and are unbalanced. The expanded sample data are highly diverse and contain many different defect features, which greatly increases the recognition rate of the trained target detection model.
Referring to FIG. 5, a flow chart for training the target detection model is shown. As shown in fig. 5, before training the model, the invention expands the samples in the following way: the GAN model is used to expand the defect samples, which improves the richness of the originally acquired workpiece images and thus the recognition accuracy of the trained model.
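The patent does not give the GAN architecture; purely as an illustration of what "defect sample expansion with a GAN model" can look like, the sketch below trains a small generator/discriminator pair on defect patches, after which the generator can synthesise additional samples. All layer sizes are assumptions.

```python
# Minimal GAN sketch (an assumption about what "a GAN model for defect sample
# expansion" can look like; all layer sizes are illustrative, not from the
# patent). The generator learns to synthesise 32x32 defect patches that can
# be added to the training set to enrich scarce defect classes.
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self, z_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(z_dim, 128, 4, 1, 0), nn.BatchNorm2d(128), nn.ReLU(True),
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.ReLU(True),
            nn.ConvTranspose2d(64, 32, 4, 2, 1), nn.BatchNorm2d(32), nn.ReLU(True),
            nn.ConvTranspose2d(32, 1, 4, 2, 1), nn.Tanh())        # -> 1 x 32 x 32 patch

    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 4, 2, 1), nn.LeakyReLU(0.2, True),
            nn.Conv2d(32, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.LeakyReLU(0.2, True),
            nn.Conv2d(64, 1, 8, 1, 0))                            # real/fake logit

    def forward(self, x):
        return self.net(x).view(-1)

g, d = Generator(), Discriminator()
opt_g = torch.optim.Adam(g.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(d.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real = torch.rand(8, 1, 32, 32) * 2 - 1      # stand-in for real defect patches in [-1, 1]
z = torch.randn(8, 64, 1, 1)
fake = g(z)

# one discriminator step: real patches labelled 1, generated patches labelled 0
loss_d = bce(d(real), torch.ones(8)) + bce(d(fake.detach()), torch.zeros(8))
opt_d.zero_grad()
loss_d.backward()
opt_d.step()

# one generator step: try to make the discriminator label generated patches as real
loss_g = bce(d(fake), torch.ones(8))
opt_g.zero_grad()
loss_g.backward()
opt_g.step()
```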
Example two
In the first embodiment, although the network parameters are few and the model converges quickly, ResNet is built on large training and test data sets. Collecting defect samples of the same die-cutting machine workpiece takes a long time, and when product types are numerous and updated quickly it is difficult to accumulate defect samples over long periods. Over-reliance on GAN-based expansion of defect samples easily causes the model to overfit, and because the defects of each kind on die-cutting machine workpieces are highly similar, the model generalizes poorly and performs badly when detecting defects in new samples.
In view of the small number of defect samples for die-cutting machine workpieces, the feature extraction network proposed by this embodiment of the application therefore comprises a densely connected network, DenseNet, formed by connecting multiple layers of dense blocks. Specifically, inputting the image to be detected into the target detection model comprises: feature extraction: inputting the image to be detected into the densely connected network DenseNet, wherein the input of each layer of dense blocks is the union of the outputs of all preceding dense-block layers, and obtaining the feature map of the image to be detected by learning from and using all dense-block layers; feature fusion: fusing the feature maps through the feature pyramid network, generating candidate boxes from the fused feature maps through the region proposal network (RPN), and determining the defect targets of the image to be detected.
In this embodiment, the densely connected network DenseNet has the property of feature reuse: each layer in each dense block receives all preceding layers as additional input, which makes it better suited to cases where there are few defect samples of a single kind.
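The dense connectivity described here (each layer taking the union, i.e. channel-wise concatenation, of all preceding outputs) can be sketched as follows; this toy dense block is an illustrative assumption, not DenseNet121 itself.

```python
# Sketch of dense connectivity (an illustrative assumption, not DenseNet121):
# inside a dense block, every layer receives the concatenation of the outputs
# of all preceding layers as its input.
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    def __init__(self, in_channels, growth_rate=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.BatchNorm2d(in_channels), nn.ReLU(inplace=True),
            nn.Conv2d(in_channels, growth_rate, kernel_size=3, padding=1, bias=False))

    def forward(self, x):
        return self.body(x)

class DenseBlock(nn.Module):
    def __init__(self, in_channels, num_layers=4, growth_rate=32):
        super().__init__()
        self.layers = nn.ModuleList(
            DenseLayer(in_channels + i * growth_rate, growth_rate)
            for i in range(num_layers))

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            # "union of the outputs of all previous layers": channel-wise concat
            features.append(layer(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)

block = DenseBlock(in_channels=64)
out = block(torch.randn(1, 64, 56, 56))
print(out.shape)    # torch.Size([1, 192, 56, 56]) = 64 + 4 * 32 channels
```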
To deploy the densely connected network DenseNet in an industrial production environment, a lightweight model is needed; because dense connectivity gives DenseNet many network parameters, structural pruning is performed on it. Specifically, L1 regularization is used for sparse pruning of the convolution kernels and channels of the densely connected network DenseNet, and the pruned DenseNet is used as the feature extraction network. In this step the regularization coefficient may be set to 0.00001 and the pruning ratio to 30%, which reduces the complexity of the model while keeping the accuracy of the feature extraction network roughly unchanged.
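The pruning step can be illustrated in the spirit of channel-level network slimming: during training an L1 penalty with coefficient 0.00001 is applied to the batch-normalization scale factors, and afterwards the 30% of channels with the smallest scale factors are marked for removal. The sketch below is an assumption about one way to realise this, not the applicant's code.

```python
# Hedged sketch of channel-level sparse pruning: L1 penalty on BN scale
# factors during training, then marking the smallest 30% of channels.
import torch
import torch.nn as nn

model = nn.Sequential(                      # stand-in for a DenseNet-style model
    nn.Conv2d(3, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
    nn.Conv2d(64, 128, 3, padding=1), nn.BatchNorm2d(128), nn.ReLU())

l1_coeff = 1e-5                             # regularization coefficient from the text
prune_ratio = 0.3                           # pruning ratio from the text

def bn_l1_penalty(model):
    """Sum of |gamma| over all BN layers; add this to the task loss during training."""
    return sum(m.weight.abs().sum()
               for m in model.modules() if isinstance(m, nn.BatchNorm2d))

# during training:  loss = task_loss + l1_coeff * bn_l1_penalty(model)

def channels_to_prune(model, ratio=prune_ratio):
    """Return per-BN boolean masks marking the smallest-|gamma| channels."""
    gammas = torch.cat([m.weight.detach().abs().flatten()
                        for m in model.modules() if isinstance(m, nn.BatchNorm2d)])
    threshold = torch.quantile(gammas, ratio)   # global 30% threshold
    return {name: (m.weight.detach().abs() <= threshold)
            for name, m in model.named_modules() if isinstance(m, nn.BatchNorm2d)}

masks = channels_to_prune(model)
for name, mask in masks.items():
    print(name, int(mask.sum().item()), "of", mask.numel(), "channels marked")
```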
For example, DenseNet121 may be used as the feature extraction network in this embodiment; its structure is adjusted so that it connects to the feature pyramid network (FPN), making the model suitable for detecting cases in which the size of the defect target varies greatly.
With respect to the first and second embodiments, the invention provides a die-cutting machine workpiece defect detection method based on Faster R-CNN that can detect workpieces conveyed on the track during production of die-cutting machine products using a trained target detection model. By recognizing and classifying the various defects on the workpiece surface, quality problems in electronic products caused by die-cut workpiece defects can be effectively avoided. In this method, adding the feature pyramid network (FPN) improves the accuracy of model recognition under multi-scale variation and allows multi-scale defect targets on the same product to be detected. Sample expansion and feature reuse make the model better suited to recognizing workpieces with many product types and fast iteration, without collecting large numbers of defect samples, and better suited to cases with few defect samples of a single kind. Pruning the densely connected network DenseNet reduces model complexity, so the model deployed as the feature extraction network in an industrial production environment is lighter.
EXAMPLE III
Referring to fig. 6, this embodiment further provides a die-cutting machine workpiece defect detection system based on Faster R-CNN, which is used to implement the above embodiments and preferred implementations; what has already been described is not repeated here. Although the means described in the embodiments below are preferably implemented in software, implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
Specifically, the system comprises:
an acquisition module 201, configured to acquire an image to be detected;
the detection module 202 is configured to input the image to be detected into the target detection model, and determine a defect target and a defect classification result of the image to be detected.
Example four
Referring to fig. 7, this embodiment provides an electronic device comprising a memory 304 and a processor 302; the memory 304 stores a computer program, and the processor 302 is configured to run the computer program to perform the steps of any of the above method embodiments.
Specifically, the processor 302 may include a central processing unit (CPU) or an application-specific integrated circuit (ASIC), or may be configured as one or more integrated circuits implementing the embodiments of the present application.
Memory 304 may include mass storage for data or instructions. By way of example and not limitation, memory 304 may include a hard disk drive (HDD), a floppy disk drive, a solid state drive (SSD), flash memory, an optical disk, a magneto-optical disk, magnetic tape, a Universal Serial Bus (USB) drive, or a combination of two or more of these. Memory 304 may include removable or non-removable (or fixed) media, where appropriate. Memory 304 may be internal or external to the data processing apparatus, where appropriate. In a particular embodiment, memory 304 is non-volatile memory. In particular embodiments, memory 304 includes read-only memory (ROM) and random access memory (RAM). The ROM may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically alterable ROM (EAROM), or flash memory, or a combination of two or more of these, where appropriate. The RAM may be static random-access memory (SRAM) or dynamic random-access memory (DRAM), where the DRAM may be fast page mode DRAM (FPM DRAM), extended data output DRAM (EDO DRAM), synchronous DRAM (SDRAM), and the like.
Memory 304 may be used to store or cache various data files for processing and/or communication purposes, as well as possibly computer program instructions for execution by processor 302.
The processor 302 reads and executes the computer program instructions stored in the memory 304 to realize any one of the above-mentioned embodiments of the workpiece defect detection method of the die cutting machine based on the Faster R-CNN.
Optionally, the electronic apparatus may further include a transmission device 306 and an input/output device 308, where the transmission device 306 is connected to the processor 302, and the input/output device 308 is connected to the processor 302.
Alternatively, in this embodiment, the processor 302 may be configured to execute the following steps by a computer program:
and S101, collecting an image to be detected.
S102, inputting the image to be detected into the target detection model, where the target detection model is obtained by training a neural network model on the acquired sample images together with the defect targets and defect classification results of the sample images; features are extracted from the image to be detected by the feature extraction network and feature pyramid network of the target detection model, candidate boxes are generated from the feature map by the region proposal network (RPN), and the defect targets of the image to be detected are determined based on the candidate boxes.
S103, mapping the candidate boxes onto the feature map to obtain a feature map with the candidate boxes mapped onto it, inputting this feature map into the region-of-interest pooling layer (ROI Pooling) to obtain feature maps of identical size, inputting the equally sized feature maps into the fully connected layers, outputting target classification labels, and determining the defect classification results of the image to be detected based on the target classification labels.
It should be noted that, for specific examples in this embodiment, reference may be made to examples described in the foregoing embodiments and optional implementations, and details of this embodiment are not described herein again.
In addition, in combination with the Faster R-CNN-based die-cutting machine workpiece defect detection method of the above embodiments, an embodiment of the application can be realized by providing a storage medium. The storage medium stores a computer program; when the computer program is executed by a processor, it implements any of the Faster R-CNN-based die-cutting machine workpiece defect detection methods of the above embodiments.
It should be understood by those skilled in the art that various features of the above embodiments can be combined arbitrarily, and for the sake of brevity, all possible combinations of the features in the above embodiments are not described, but should be considered as within the scope of the present disclosure as long as there is no contradiction between the combinations of the features.
The above examples are merely illustrative of several embodiments of the present application, and the description is more specific and detailed, but not to be construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (10)

1. A die-cutting machine workpiece defect detection method based on Faster R-CNN, characterized by comprising the following steps:
collecting an image to be detected;
inputting the image to be detected into a trained target detection model, wherein a first feature map is extracted from the image to be detected by the feature extraction network and feature pyramid network of the target detection model, candidate boxes are generated from the first feature map by the region proposal network (RPN) of the target detection model, and the defect targets of the image to be detected are determined based on the candidate boxes;
in the region-of-interest pooling layer (ROI Pooling) of the target detection model, mapping the candidate boxes onto the first feature map to obtain second feature maps with the candidate boxes mapped onto them, pooling the second feature maps to obtain third feature maps of identical size, inputting the third feature maps into the fully connected layers of the target detection model, outputting target classification labels, and determining the defect classification results corresponding to the defect targets of the image to be detected based on the target classification labels.
2. The die-cutting machine workpiece defect detection method based on Faster R-CNN of claim 1, wherein the feature extraction network is ResNet50.
3. The die-cutting machine workpiece defect detection method based on Faster R-CNN as recited in claim 1, wherein the defect classification results include in-plane foreign matter, scratches, notches, wrinkles, bubbles and dirt.
4. The die-cutting machine workpiece defect detection method based on Faster R-CNN as claimed in claim 1, wherein the target detection model is obtained by training with sample images marked with defect targets and defect classification results as training samples, and the sample images comprise workpiece images input into a GAN model for defect sample expansion.
5. The die-cutting machine workpiece defect detection method based on Faster R-CNN as claimed in claim 4, wherein the feature extraction network comprises a densely connected network DenseNet obtained by connecting multiple layers of dense blocks, and inputting the image to be detected into the target detection model specifically comprises:
feature extraction: inputting the image to be detected into the densely connected network DenseNet, wherein the input of each layer of dense blocks is the union of the outputs of all preceding dense-block layers, and the last dense-block layer outputs the feature map of the image to be detected by learning from and using all preceding dense-block layers;
feature fusion: fusing the feature maps through the feature pyramid network, generating candidate boxes from the fused first feature map through the region proposal network (RPN), and determining the defect targets of the image to be detected.
6. The die-cutting machine workpiece defect detection method based on Faster R-CNN according to claim 5, wherein L1 regularization is adopted for sparse pruning of the convolution kernels and channels of the densely connected network DenseNet, and the densely connected network DenseNet obtained through pruning is taken as the feature extraction network.
7. The die-cutting machine workpiece defect detection method based on Faster R-CNN as claimed in claim 6, wherein the densely connected network DenseNet adopts DenseNet121, and the structure of DenseNet121 is adjusted to connect the densely connected network DenseNet with the feature pyramid network FPN.
8. The die cutting machine workpiece defect detection method based on Faster R-CNN as claimed in claim 1, wherein the image to be detected is acquired by one or more of the following means:
shooting the appearance of the die cutting machine workpiece according to a preset frequency by controlling a shooting device;
or manually controlling the shooting device to acquire the image;
or triggering the shooting device to acquire the workpiece image according to the position change of the die-cutting machine.
9. A die-cutting machine workpiece defect detection system based on Faster R-CNN is characterized by comprising:
the acquisition module is used for acquiring an image to be detected;
the detection module is used for inputting the image to be detected into a trained target detection model, wherein a first feature map is extracted from the image to be detected by the feature extraction network and feature pyramid network of the target detection model, candidate boxes are generated from the first feature map by the region proposal network (RPN) of the target detection model, and the defect targets of the image to be detected are determined based on the candidate boxes;
and in the region-of-interest pooling layer (ROI Pooling) of the target detection model, the candidate boxes are mapped onto the first feature map to obtain second feature maps with the candidate boxes mapped onto them, the second feature maps are pooled to obtain third feature maps of identical size, the third feature maps are input into the fully connected layers of the target detection model, target classification labels are output, and the defect classification results corresponding to the defect targets of the image to be detected are determined based on the target classification labels.
10. An electronic device comprising a memory and a processor, wherein the memory stores a computer program, and the processor is configured to run the computer program to perform the Faster R-CNN-based die-cutting machine workpiece defect detection method of any one of claims 1 to 8.
CN202110134056.6A 2021-02-01 2021-02-01 Die-cutting machine workpiece defect detection method and system based on Faster R-CNN Pending CN112700442A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110134056.6A CN112700442A (en) 2021-02-01 2021-02-01 Die-cutting machine workpiece defect detection method and system based on Faster R-CNN

Publications (1)

Publication Number Publication Date
CN112700442A true CN112700442A (en) 2021-04-23

Family

ID=75516484

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110134056.6A Pending CN112700442A (en) 2021-02-01 2021-02-01 Die-cutting machine workpiece defect detection method and system based on Faster R-CNN

Country Status (1)

Country Link
CN (1) CN112700442A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110826416A (en) * 2019-10-11 2020-02-21 佛山科学技术学院 Bathroom ceramic surface defect detection method and device based on deep learning
CN111951212A (en) * 2020-04-08 2020-11-17 北京交通大学 Method for identifying defects of contact network image of railway
CN111583198A (en) * 2020-04-23 2020-08-25 浙江大学 Insulator picture defect detection method combining FasterR-CNN + ResNet101+ FPN

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHUANG LIU ET AL: "Learning Efficient Convolutional Networks through Network Slimming", 《ARXIV:1708.06519V1》 *
李东洁等: "基于改进Faster RCNN的马克杯缺陷检测方法", 《激光与光电子学进展》 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113343799A (en) * 2021-05-25 2021-09-03 山东师范大学 Method and system for realizing automatic classification of white blood cells based on mixed attention residual error network
CN113724204A (en) * 2021-08-03 2021-11-30 上海卫星装备研究所 Method and system for positioning and identifying defects of aerospace composite material
CN113920432A (en) * 2021-10-12 2022-01-11 广东电网有限责任公司广州供电局 Cutter image intelligent detection method based on GuidedAnchor optimization
CN114462469A (en) * 2021-12-20 2022-05-10 浙江大华技术股份有限公司 Training method of target detection model, target detection method and related device
CN114266771A (en) * 2022-03-02 2022-04-01 深圳市智源空间创新科技有限公司 Pipeline defect detection method and device based on improved extended feature pyramid model
CN114549529A (en) * 2022-04-26 2022-05-27 武汉福旺家包装有限公司 Carton indentation quality detection method and system based on computer vision
CN114549529B (en) * 2022-04-26 2022-08-19 武汉福旺家包装有限公司 Carton indentation quality detection method and system based on computer vision
CN117408967A (en) * 2023-10-24 2024-01-16 欧派家居集团股份有限公司 Board defect detection method and system based on 3D visual recognition
CN117408967B (en) * 2023-10-24 2024-03-19 欧派家居集团股份有限公司 Board defect detection method and system based on 3D visual recognition

Similar Documents

Publication Publication Date Title
CN112700442A (en) Die-cutting machine workpiece defect detection method and system based on Faster R-CNN
US20230419472A1 (en) Defect detection method, device and system
CN109671058B (en) Defect detection method and system for large-resolution image
CN113239930B (en) Glass paper defect identification method, system, device and storage medium
CN111415329B (en) Workpiece surface defect detection method based on deep learning
JP2017049974A (en) Discriminator generator, quality determine method, and program
CN111696077A (en) Wafer defect detection method based on wafer Det network
CN111932511B (en) Electronic component quality detection method and system based on deep learning
CN109544522A (en) A kind of Surface Defects in Steel Plate detection method and system
CN111242899B (en) Image-based flaw detection method and computer-readable storage medium
CN115965816B (en) Glass defect classification and detection method and system based on deep learning
CN114998324A (en) Training method and device for semiconductor wafer defect detection model
CN104200215A (en) Method for identifying dust and pocking marks on surface of big-caliber optical element
CN115471466A (en) Steel surface defect detection method and system based on artificial intelligence
Fadli et al. Steel surface defect detection using deep learning
CN112419261A (en) Visual acquisition method and device with abnormal point removing function
CN117392042A (en) Defect detection method, defect detection apparatus, and storage medium
CN113554054A (en) Deep learning-based semiconductor chip gold wire defect classification method and system
CN111179278B (en) Image detection method, device, equipment and storage medium
CN113191235A (en) Sundry detection method, device, equipment and storage medium
CN115830302B (en) Multi-scale feature extraction fusion power distribution network equipment positioning identification method
CN101236164B (en) Method and system for defect detection
CN115861160A (en) Method and device for detecting surface defects of power interface of mobile phone and storage medium
CN116958086B (en) Metal surface defect detection method and system with enhanced feature fusion capability
CN117218476A (en) Model training method, using method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210423