CN114897863A - Defect detection method, device and equipment

Defect detection method, device and equipment

Info

Publication number
CN114897863A
Authority
CN
China
Prior art keywords
feature
defect
module
matching degree
classes
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210587928.9A
Other languages
Chinese (zh)
Inventor
罗丽兰
程大龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Exploration Intelligence Technology Guangdong Co ltd
Original Assignee
Iflytek South China Artificial Intelligence Research Institute Guangzhou Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Iflytek South China Artificial Intelligence Research Institute Guangzhou Co ltd
Priority to CN202210587928.9A
Publication of CN114897863A
Legal status: Pending

Classifications

    • G06T 7/0002 — Physics; Computing; Image data processing or generation, in general; Image analysis; Inspection of images, e.g. flaw detection
    • G06V 10/464 — Image or video recognition or understanding; Extraction of image or video features; Salient features, e.g. scale-invariant feature transforms [SIFT], using a plurality of salient features, e.g. bag-of-words [BoW] representations
    • G06V 10/751 — Image or video recognition or understanding using pattern recognition or machine learning; Image or video pattern matching; Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • Y02P 90/30 — Climate change mitigation technologies in the production or processing of goods; Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation; Computing systems specially adapted for manufacturing

Abstract

The invention discloses a defect detection method, apparatus and device. The detection method comprises: extracting first feature maps of different scales from an image to be detected; obtaining a matching degree set between a first feature point on the first feature maps and a feature template library; judging whether the matching degree set corresponding to the first feature point meets a requirement; and, if the requirement is met, outputting a detection result indicating a new category of defect corresponding to the first feature point. The invention uses the difference between the feature points in the feature maps of the image to be detected and the defects of known categories to determine whether a new category of defect exists, and locates and classifies new-category defects promptly and accurately.

Description

Defect detection method, device and equipment
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a method, an apparatus, and a device for detecting defects.
Background
At present, machine-vision-based surface defect detection equipment has widely replaced manual visual inspection in industries such as 3C, automobiles, household appliances, machinery manufacturing, semiconductors and electronics, chemicals, pharmaceuticals, aerospace and light industry. Conventional machine-vision surface defect detection methods typically rely on classical image processing algorithms or hand-crafted features and classifiers. The imaging scheme is usually designed around the specific properties of the surface or defect to be inspected; a well-designed imaging scheme helps to obtain uniformly illuminated images that clearly reveal the surface defects of the object. In recent years, many deep-learning-based defect detection methods have also been widely applied in industrial scenarios.
In industrial defect detection scenarios, detecting and classifying defects of unknown categories is an important problem to be solved. Conventional surface defect detection algorithms typically require manually designing feature extraction and decision algorithms for each type of defect. In these algorithms, selecting the important features in each image is a necessary step, and as the number of categories grows, feature extraction becomes increasingly cumbersome. Deciding which features best describe the different object classes depends on the judgment of the algorithm engineer and on long trial and error. In addition, each feature definition involves a large number of parameters, all of which must be tuned by the algorithm engineer. Therefore, although conventional algorithms can exploit the engineer's experience and manual priors to quickly design a new algorithm for a new defect, they suffer from poor accuracy, too many parameters to tune, and a dependence on the engineer's prior knowledge of the defects.
A deep learning model is trained on given data: the neural network discovers the underlying patterns in the image classes and automatically extracts the features that are most descriptive and most salient for the target classes. It is generally accepted that deep neural networks greatly outperform traditional algorithms. However, deep-learning-based defect detection is strongly data-dependent. In actual industrial production, defect distributions exhibit a severe long-tail problem, so the knowledge captured by the training set may be incomplete when the model is trained; defect categories in the tail are often hard to detect correctly because they have too little training data, and categories with too few samples may not even be given separate classes during training. As production continues, ageing of processing equipment or failure of cleaning equipment may also give rise to new defects. The model may therefore encounter new, unknown defect classes in real application scenarios. Because no labelled training is performed for defects of unknown classes, these unknown instances are either learned as background or classified as one of the known classes at test time. Consequently, when a factory cannot collect enough defect samples after a change in the manufacturing process, existing deep-learning-based detection algorithms struggle to accurately locate and classify the new defects.
Disclosure of Invention
In view of the foregoing, the present invention aims to provide a defect detection method, apparatus and device which determine whether a new category of defect exists by using the difference between the feature points in the feature maps of the image to be detected and the defects of known categories, and which locate and classify new-category defects promptly and accurately.
The technical scheme adopted by the invention is as follows:
in a first aspect, the present invention provides a method for detecting defects, including:
extracting first feature maps of different scales of an image to be detected;
obtaining a matching degree set between a first feature point on the first feature map and a feature template library;
judging whether the matching degree set corresponding to the first characteristic point meets the requirement or not;
and if the requirement is met, outputting a detection result of the new type of defects corresponding to the first characteristic point.
In one possible implementation manner, if the requirement is not met, obtaining and outputting a detection result of a known category corresponding to the first feature point according to the first feature map with different scales;
wherein the known classes include a plurality of known defect classes and non-defect classes.
In one possible implementation manner, obtaining a matching degree set between a first feature point on the first feature map and a feature template library specifically includes:
extracting all first feature points on the first feature map;
for each first feature point, matching the features of the first feature point with the feature points in each feature set in the feature template library to obtain the matching degree of the first feature point and each known class, and taking a set formed by the matching degrees of the first feature point and all known classes as the matching degree set;
wherein the known classes include a plurality of known defect classes and non-defect classes.
In one possible implementation manner, if each matching degree in the matching degree set is smaller than a first threshold, it is determined that the matching degree set corresponding to the first feature point meets the requirement.
In one possible implementation manner, determining whether the matching degree set corresponding to the first feature point meets requirements further includes:
if each matching degree in the matching degree set is smaller than the first threshold value, calculating the average value of all matching degrees in the matching degree set;
judging whether the average value is smaller than a second threshold value;
and if so, judging that the matching degree set corresponding to the first feature point meets the requirement.
In one possible implementation, the first feature map and the detection result of the known class are obtained through a detection model;
and, the detection method further comprises:
collecting the images to be detected with the defects of the new type to form a training set of the new defect type;
and further training the detection model by utilizing the training set of the new defect category.
In one possible embodiment, constructing the feature template library includes:
extracting second feature maps of different scales of each image in the training set of the known class;
acquiring a defect-free feature set and a defect feature set corresponding to each known defect type according to the second feature map;
taking a first set formed by the defect feature sets of all known defect classes and the defect-free feature set as the feature template library;
wherein the known classes include a plurality of known defect classes and non-defect classes.
In one possible implementation, the obtaining the defect-free feature set and the defect feature set corresponding to each known defect type according to the second feature map specifically includes:
judging whether a second feature point on the second feature map contains a part of the known defect or not;
if so, adding the second feature point into a defect feature set corresponding to the known defect type to which the known defect belongs;
and if the second characteristic point has no defects, adding the second characteristic point into a defect-free characteristic set.
In one possible implementation, if the second feature point includes a part of a known defect, determining whether the part of the known defect is a center of the known defect;
and if so, adding the second feature point into a defect feature set corresponding to the known defect type to which the known defect belongs.
In one possible embodiment, constructing the feature template library includes:
and performing downsampling on the first set to obtain a second set, and taking the second set as the feature template library.
In a second aspect, the present invention provides a defect detection apparatus, including a first feature map extraction module, a matching degree set obtaining module, a first judgment module and an output module;
the first feature map extraction module is used for extracting first feature maps with different scales of the image to be detected;
the matching degree set obtaining module is used for obtaining a matching degree set between a first feature point on the first feature map and a feature template library;
the first judging module is used for judging whether the matching degree set corresponding to the first characteristic point meets the requirement or not;
and the output module is used for outputting the detection result of the new type of defects corresponding to the first characteristic point if the matching degree set corresponding to the first characteristic point meets the requirement.
In one possible implementation manner, the matching degree set obtaining module comprises a first feature point extracting module and a matching module;
the first feature point extraction module is used for extracting all first feature points on the first feature map;
the matching module is used for matching the features of the first feature points with the feature points in each feature set in the feature template library aiming at each first feature point to obtain the matching degree of the first feature points and each known class, and taking a set formed by the matching degrees of the first feature points and all known classes as the matching degree set, wherein the known classes comprise a plurality of known defect classes and defect-free classes.
In one possible implementation manner, the first determining module includes a second determining module, and the second determining module is configured to determine whether each matching degree in the matching degree set is smaller than a first threshold.
In one possible implementation manner, the first determining module further includes a mean calculating module and a third determining module;
the mean value calculating module is used for calculating the mean value of all the matching degrees in the matching degree set if each matching degree in the matching degree set is smaller than the first threshold value;
the third judging module is used for judging whether the average value is smaller than a second threshold value.
In one possible implementation, the detection device further comprises a training set collection module and a training module;
the training set collection module is used for collecting the images to be detected that contain defects of the new category, so as to form a training set of the new defect category;
and the training module is used for further training a detection model by utilizing the training set of the new defect category.
In one possible implementation manner, the detection apparatus further includes a feature template library construction module, where the feature template library construction module includes a second feature map extraction module, a feature set obtaining module, and a first set forming module;
the second feature map extraction module is used for extracting second feature maps with different scales of each image in the training set of the known class;
the characteristic set obtaining module is used for obtaining a defect-free characteristic set and a defect characteristic set corresponding to each known defect type according to the second characteristic map;
the first set forming module is used for taking a first set formed by the defect feature sets of all known defect classes and the defect-free feature set as the feature template library;
wherein the known classes include a plurality of known defect classes and non-defect classes.
In a third aspect, the present invention provides a defect detection apparatus, comprising:
one or more processors, a memory, and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the defect detection apparatus, cause the defect detection apparatus to perform the above-described method of defect detection.
In a fourth aspect, the present invention provides a computer-readable storage medium having stored thereon a computer program which, when run on a computer, causes the computer to perform the method as described in the first aspect or any possible implementation of the first aspect.
The core idea of the invention is to exploit the powerful feature extraction capability of a convolutional neural network: the features of samples with known defects and of defect-free samples form a feature template library, and in actual application the image to be detected is matched against this library to judge whether its defects belong to known categories, thereby identifying defects of new categories.
Drawings
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described with reference to the accompanying drawings, in which:
FIG. 1 is a flow chart of one embodiment of a method for detecting defects provided by the present invention;
FIG. 2 is a flow chart of a preferred embodiment of constructing a library of feature templates provided by the present invention;
FIG. 3 is an example of a preferred embodiment of constructing a library of feature templates provided by the present invention;
FIG. 4 is an example of a defect detection method provided by the present invention;
FIG. 5 is a structural view of a defect detecting apparatus according to the present invention;
fig. 6 is a schematic structural diagram of an embodiment of a defect detection apparatus provided in the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative only and should not be construed as limiting the invention.
Traditional defect detection algorithms mainly preprocess an image with algorithms such as image enhancement, and then detect defects with algorithms such as texture information extraction, binarization and image morphology. Existing deep-learning-based defect detection algorithms extract image features with a deep convolutional neural network and then locate and classify defects.
As can be seen from the above, when a defect of a new category appears, both the traditional algorithms and the existing deep-learning-based algorithms will forcibly classify it as a normal image or as one of the known defect categories.
For these reasons, the present invention provides a defect detection method, apparatus and device that exploit the strong feature extraction capability of a convolutional neural network: the features of defect samples of known categories and of defect-free samples form a feature template library, and in actual application the image to be detected is matched against this library to judge whether its defects belong to known categories. Defects of new categories are thereby identified, and their features are obtained in the course of detecting defects of known categories.
Based on the foregoing core concept, the present invention provides an embodiment of a defect detection method which, as shown in fig. 1, may include the following steps:
s110: and extracting first feature maps of different scales of the image to be detected.
In one possible embodiment, the first feature maps of different scales may be obtained by a separate convolutional neural network. In another possible embodiment, the first feature maps may be an intermediate result of a trained detection model (which includes a convolutional neural network). For example, the image to be detected first passes through the convolutional neural network of the detection model to obtain the first feature maps; the first feature maps are then fed into the detection head of the detection model, which comprises a regression module and a classification module used to locate and classify defects respectively. The localization information of a defect is its coordinates in the image to be detected, and the classification result indicates a defect of a known category and/or no defect. Finally, the localization and classification results are taken as the output of the detection model.
In one possible embodiment, the first feature maps may be the feature maps output by different convolutional layers of the convolutional neural network. In another possible embodiment, the first feature maps may be the feature maps obtained after the outputs of different convolutional layers of the convolutional neural network are each processed by a feature pyramid.
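The following is a minimal PyTorch sketch of this step under stated assumptions: a small illustrative backbone produces feature maps of three scales (C3-C5), and a simple top-down fusion step produces the pyramid features (P3-P5) used as the first feature maps. The layer structure, channel widths and image size are assumptions made for illustration only, not the concrete network of the detection model.

```python
# Minimal sketch (not the patent's concrete network): a small backbone that returns
# feature maps of three scales (C3-C5) and a simple top-down fusion step that
# produces the pyramid features P3-P5 used as the "first feature maps".
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyBackbone(nn.Module):
    def __init__(self):
        super().__init__()
        self.stage2 = nn.Sequential(nn.Conv2d(3, 64, 3, stride=4, padding=1), nn.ReLU())
        self.stage3 = nn.Sequential(nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU())
        self.stage4 = nn.Sequential(nn.Conv2d(128, 256, 3, stride=2, padding=1), nn.ReLU())
        self.stage5 = nn.Sequential(nn.Conv2d(256, 512, 3, stride=2, padding=1), nn.ReLU())

    def forward(self, x):
        c2 = self.stage2(x)
        c3 = self.stage3(c2)
        c4 = self.stage4(c3)
        c5 = self.stage5(c4)
        return c3, c4, c5                       # feature maps of different scales

class SimplePyramid(nn.Module):
    """Top-down fusion of C3-C5 into P3-P5 (illustrative feature pyramid)."""
    def __init__(self, dims=(128, 256, 512), out_dim=256):
        super().__init__()
        self.lateral = nn.ModuleList(nn.Conv2d(d, out_dim, 1) for d in dims)

    def forward(self, c3, c4, c5):
        p5 = self.lateral[2](c5)
        p4 = self.lateral[1](c4) + F.interpolate(p5, size=c4.shape[-2:], mode="nearest")
        p3 = self.lateral[0](c3) + F.interpolate(p4, size=c3.shape[-2:], mode="nearest")
        return p3, p4, p5

backbone, fpn = TinyBackbone(), SimplePyramid()
image = torch.randn(1, 3, 256, 256)             # image to be detected
p3, p4, p5 = fpn(*backbone(image))              # first feature maps of different scales
```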
S120: and obtaining a matching degree set between the first feature point on the first feature map and the feature template library.
It should be noted that the feature template library is obtained from the trained detection model.
In one possible embodiment, as shown in fig. 2, constructing the feature template library includes:
s210: and extracting a second feature map of each image in the training set of the known class in different scales.
In one possible embodiment, the second feature map is extracted in the same manner as the first feature map.
It will be appreciated that other techniques known in the art may be employed to extract the second feature map.
S220: and acquiring a defect-free feature set and a defect feature set corresponding to each known defect type according to the second feature map.
In one possible embodiment, obtaining the defect-free feature set and the defect feature set corresponding to each known defect class includes:
s2201: and judging whether the second characteristic point on the second characteristic map contains a part of the known defects. If yes, go to S2204; otherwise, S2202 is executed.
Specifically, whether the second feature point contains a part of a known defect may be determined according to a correspondence between the detection result of the image obtained from the detection model and the second feature point. If the detection result of the image comprises the coordinates of the second feature point and the corresponding classification result, the second feature point comprises a part of the known defect.
S2202: and if the second characteristic point does not contain the known defects, indicating that the second characteristic point does not have the defects, adding the second characteristic point into the defect-free characteristic set.
S2204: and adding the second feature point into a defect feature set corresponding to the known defect class to which the known defect belongs.
In a preferred embodiment, if the second feature point includes a part of a known defect, the following steps are further performed:
s2203: it is determined whether a portion of the known defect is a center of the known defect. If yes, go to S2204; otherwise, no operation is done.
In the preferred embodiment, each defect feature set collects the dominant defect features of each known defect category, helping to more accurately identify the category of the defect.
S230: and taking the first set formed by the defect characteristic set and the defect-free characteristic set of all known defect classes as a characteristic template library.
In a preferred embodiment, after obtaining the first set, S240 is further performed: and performing downsampling on the first set to obtain a second set, and taking the second set as a feature template library.
Figure 3 shows an example of a preferred embodiment of building the feature template library. As shown in fig. 3, the training of the detection model is divided into two stages: a model training stage and a feature template library construction stage. In the model training stage, the detection model is trained on the training set, so that the model acquires its feature extraction capability and the ability to classify and locate the N existing defect categories in the training set, and the model parameters are obtained. As shown in fig. 3, C3-C5 are feature maps of different scales in the convolutional neural network, and P3-P5 are the feature maps of different scales obtained after feature pyramid processing; P3-P5 are fed into three detection heads to obtain the output of the model. After the model training stage is finished, a new detection module is added to the detection model, and the feature template library construction stage begins.
In the construction stage of the feature template library, each feature point f of the feature maps P3-P5 is extracted. If the feature point f is the center of a defect of category k, the feature point f is added into the defect feature set C_k, k ∈ [1, …, N], where N is the total number of defect categories. If the defect part contained in the feature point f is not the center of the defect of category k, no processing is performed. If the feature point f does not contain any defect information, i.e., the feature point is defect-free, it is added into the defect-free feature set C_0. The first set [C_0, C_1, …, C_N] formed by the N+1 feature sets is then down-sampled to obtain a second set [M_0, M_1, …, M_N], and the second set is taken as the feature template library.
After the feature template library is constructed, the new detection module is removed and the original detection module is retained.
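A minimal sketch of this construction stage is given below, assuming per-image ground-truth defect boxes, per-scale feature-map strides and random down-sampling, none of which are fixed by the description above; in practice the first set would be accumulated over the entire training set of known classes before being down-sampled into [M_0, …, M_N].

```python
# Illustrative sketch of the template-library construction stage (S210-S240).
# The ground-truth boxes, feature-map strides and the random down-sampling are
# assumptions made for this example; the description only requires that defect-centre
# feature points go into C_k, defect-free points into C_0, and that the first set
# [C_0 .. C_N] is down-sampled into the library [M_0 .. M_N].
import torch

def build_template_library(feature_maps, gt_boxes, gt_labels, strides,
                           num_classes, keep_ratio=0.1):
    """feature_maps: list of (C, H, W) tensors (e.g. P3-P5 of one training image).
    gt_boxes: (G, 4) tensor of defect boxes (x1, y1, x2, y2) in image coordinates.
    gt_labels: (G,) tensor of defect categories in [1 .. N]."""
    first_set = [[] for _ in range(num_classes + 1)]     # C_0 (defect-free) .. C_N

    for fmap, stride in zip(feature_maps, strides):
        _, H, W = fmap.shape
        for y in range(H):
            for x in range(W):
                px, py = (x + 0.5) * stride, (y + 0.5) * stride   # image-plane position
                feat = fmap[:, y, x]
                contains_defect = False
                for box, label in zip(gt_boxes, gt_labels):
                    if box[0] <= px <= box[2] and box[1] <= py <= box[3]:
                        contains_defect = True
                        cx, cy = (box[0] + box[2]) / 2, (box[1] + box[3]) / 2
                        # only feature points at the defect centre enter C_k
                        if abs(px - cx) < stride and abs(py - cy) < stride:
                            first_set[int(label)].append(feat)
                if not contains_defect:
                    first_set[0].append(feat)             # no defect information -> C_0

    # down-sample each C_k into M_k (random subsampling as a stand-in)
    library = []
    for feats in first_set:
        if not feats:
            library.append(torch.empty(0))
            continue
        feats = torch.stack(feats)
        keep = max(1, int(len(feats) * keep_ratio))
        library.append(feats[torch.randperm(len(feats))[:keep]])
    return library                                        # feature template library [M_0 .. M_N]
```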
On the basis, obtaining a matching degree set between the first feature point on the first feature map and the feature template library specifically includes:
s1201: and extracting all first feature points on the first feature map.
S1202: and aiming at each first feature point, matching the features of the first feature point with the feature points in each feature set in the feature template library to obtain the matching degree of the first feature point and each known class (comprising a plurality of known defect classes and defect-free classes), and taking a set formed by the matching degrees of the first feature point and all the known classes as a matching degree set.
Specifically, the defect feature set and the defect-free feature set (hereinafter collectively referred to as a feature set) of each known defect category include a plurality of second feature points, and the maximum matching degree between the first feature point and the second feature point in each feature set is used as the matching degree between the first feature point and the known category.
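A minimal sketch of this matching step follows; cosine similarity over a nearest-neighbour search is assumed as the matching degree, since the description does not fix a particular similarity metric.

```python
# Sketch of S1201-S1202: for one first feature point, take the maximum similarity
# against each feature set M_k as its matching degree with that known class.
# Cosine similarity is an assumption; the matching metric is not fixed by the method.
import torch
import torch.nn.functional as F

def matching_degree_set(feature_point, template_library):
    """feature_point: (C,) tensor; template_library: list of (M_k, C) tensors [M_0 .. M_N].
    Returns a list of N+1 matching degrees [S_0 .. S_N]."""
    degrees = []
    f = F.normalize(feature_point, dim=0)
    for templates in template_library:
        if len(templates) == 0:
            degrees.append(0.0)                      # empty feature set: no match possible
            continue
        sims = F.normalize(templates, dim=1) @ f     # nearest-neighbour search by cosine similarity
        degrees.append(sims.max().item())            # best match with this known class
    return degrees
```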
S130: and judging whether the matching degree set corresponding to the first characteristic point meets the requirement or not. If yes, go to S140; otherwise, S150 is performed.
In a possible implementation manner, if each matching degree in the matching degree set is smaller than the first threshold, which indicates that the matching degree of the first feature point and the defect of each category is low, it is determined that the matching degree set corresponding to the first feature point meets the requirement.
In a preferred embodiment, if each matching degree in the matching degree set is smaller than the first threshold, the average of all matching degrees in the matching degree set is further calculated, and whether the average is smaller than the second threshold is determined. The second threshold is a threshold obtained with 95% confidence on the basis of the first threshold. And if the average value is smaller than the second threshold value, which indicates that the deviation between the average value and the first threshold value is large, determining that the matching degree set corresponding to the first feature point meets the requirement. Otherwise, no operation is done.
S140: and if the requirement is met, the difference between the first characteristic point and the known category is large, and the detection result of the defect of the new category corresponding to the first characteristic point is output.
It should be noted that the detection result of the defect of the new category includes a preliminary positioning result of the first feature point.
In a preferred embodiment, the detection result of the defect of the new category further includes the above-mentioned mean value.
S150: if the matching degree greater than the first threshold exists in the matching degree set, which indicates that the first feature point belongs to a certain class in the feature template library, obtaining and outputting a detection result of a known class corresponding to the first feature point according to the first feature map with different scales, namely obtaining the detection result of the known class through the trained detection model.
It is understood that each image under test may have only a known type of defect or a new type of defect, or may have both a known type of defect and a new type of defect.
Fig. 4 shows an example of the defect detection method of the present invention. As shown in fig. 4, the first feature maps P3-P5 are extracted with the trained detection model, a first feature point f is extracted from the first feature maps P3-P5, and a nearest-neighbour search is carried out between the first feature point f and each feature set of the feature template library [M_0, M_1, …, M_N] to obtain the matching degree set [S_0, S_1, …, S_N]. A first threshold t is set, and each matching degree in the set [S_0, S_1, …, S_N] is compared with the first threshold t.
If each matching degree in the set [S_0, S_1, …, S_N] is smaller than the first threshold t, the mean of the matching degree set [S_0, S_1, …, S_N] is calculated and it is judged whether the mean is smaller than the second threshold; if so, the feature point f contains a defect of a new category, and an anomaly detection result D_S and an anomaly score S are obtained. Specifically, the anomaly detection result D_S comprises the preliminary localization information of the first feature point f and the ratio r of the original image to be detected to the first feature map containing the first feature point f (namely, the width and height of the detection box in that first feature map), and the mean value is taken as the anomaly score S.
If the matching degree set [S_0, S_1, …, S_N] contains a matching degree that is not smaller than the first threshold t, the first feature point f does not contain a defect of a new category.
Finally, the anomaly detection result D_S and the anomaly score S are merged with the output result D of the detection model to obtain the final defect detection result of the image to be detected.
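The decision logic of S130-S150 and fig. 4 can be summarised by the sketch below; the threshold values are illustrative placeholders rather than values prescribed by the method.

```python
# Sketch of the decision in S130-S150 / Fig. 4: if every matching degree is below the
# first threshold t and their mean is below the second threshold, the feature point is
# reported as a new-category defect with the mean as its anomaly score; otherwise the
# point is left to the trained detection model. Threshold values here are illustrative.
def classify_feature_point(degrees, t_first=0.6, t_second=0.4):
    """degrees: matching degree set [S_0 .. S_N] of one first feature point."""
    if all(s < t_first for s in degrees):
        mean = sum(degrees) / len(degrees)
        if mean < t_second:
            return {"new_category": True, "anomaly_score": mean}
    return {"new_category": False}        # handled by the known-class detection head
```

A point flagged in this way would additionally carry its preliminary localization (scaled back to the original image by the ratio r) in the anomaly detection result D_S before being merged with the detection model output D.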
Based on the above scheme, in a preferred embodiment, the detection method of the present invention further includes collecting the images to be detected that contain the new category of defect, to form a training set of the new defect category; when this training set reaches a preset scale, the detection model is further trained with it, so that the new defect category becomes a known defect category of the detection model, thereby updating the defect types the model can detect.
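A sketch of this update step is given below; the optimizer, learning rate and the assumption that the detection model returns a training loss when given images and targets are illustrative placeholders, not part of the method as described.

```python
# Sketch of the model-update step: once enough images containing the new-category
# defect have been collected, the detection model is trained further so that the new
# category becomes a known category. Optimizer, loss and loader are assumptions.
import torch

def update_detection_model(model, new_class_loader, epochs=5, lr=1e-4):
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    model.train()
    for _ in range(epochs):
        for images, targets in new_class_loader:      # training set of the new defect category
            loss = model(images, targets)              # assumed to return a training loss
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model                                       # new category now among the known classes
```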
Corresponding to the above embodiments and preferred schemes, the present invention further provides an embodiment of a defect detection apparatus, as shown in fig. 5, which specifically includes a first feature map extraction module 510, a matching degree set obtaining module 520, a first judgment module 530, and an output module 540.
The first feature map extraction module 510 is configured to extract first feature maps of different scales of the to-be-detected image.
The matching degree set obtaining module 520 is configured to obtain a matching degree set between the first feature point on the first feature map and the feature template library.
The first determining module 530 is configured to determine whether the matching degree set corresponding to the first feature point meets requirements.
The output module 540 is configured to output a detection result of the new category of defects corresponding to the first feature point if the matching degree set corresponding to the first feature point meets the requirement.
In one possible implementation, the matching degree set obtaining module 520 includes a first feature point extracting module 5201 and a matching module 5202.
The first feature point extraction module 5201 is configured to extract all first feature points on the first feature map.
The matching module 5202 is configured to, for each first feature point, match the features of the first feature point with the feature points in each feature set in the feature template library to obtain a matching degree between the first feature point and each known class, and use a set formed by the matching degrees between the first feature point and all known classes as a matching degree set, where the known classes include multiple known defect classes and defect-free classes.
In one possible implementation, the first determining module 530 includes a second determining module 5301, and the second determining module 5301 is configured to determine whether each of the matching degrees in the matching degree set is smaller than a first threshold.
In one possible implementation, the first determining module 530 further includes a mean calculating module 5302 and a third determining module 5303.
The mean value calculating module 5302 is configured to calculate a mean value of all the matching degrees in the matching degree set if each matching degree in the matching degree set is smaller than the first threshold.
The third determining module 5303 is configured to determine whether the average value is smaller than the second threshold.
In one possible implementation, the detection apparatus further includes a training set collection module 550 and a training module 560.
The training set collection module 550 is used for collecting the images to be detected that contain defects of the new category, so as to form a training set of the new defect category.
The training module 560 is configured to further train the inspection model using the training set of new defect classes.
In one possible implementation, the detection apparatus further includes a feature template library construction module 570, and the feature template library construction module 570 includes a second feature map extraction module 5701, a feature set obtaining module 5702, and a first set forming module 5703.
The second feature map extraction module 5701 is used for extracting second feature maps of different scales for each image in the training set of the known classes.
The feature set obtaining module 5702 is used for obtaining a defect-free feature set and a defect feature set corresponding to each known defect type according to the second feature maps.
The first set forming module 5703 is used for taking a first set, formed by the defect feature sets of all known defect classes and the defect-free feature set, as the feature template library.
In one possible implementation, the feature template library construction module 570 further includes a second set formation module 5704, and the second set formation module 5704 is configured to down-sample the first set, obtain a second set, and use the second set as the feature template library.
It should be understood that the division of the components of the defect detection apparatus shown in fig. 5 is merely a logical division; in an actual implementation they may be wholly or partially integrated into one physical entity or physically separated. All of these components may be implemented as software invoked by a processing element, or entirely in hardware, or some components in software invoked by a processing element and others in hardware. For example, a module may be a separately established processing element, or may be integrated into a chip of the electronic device; the other components are implemented similarly. In addition, all or some of these components may be integrated together or implemented independently. In implementation, each step of the above method, or each of the above components, may be completed by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
For example, the above components may be one or more integrated circuits configured to implement the above methods, such as one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs). For another example, these components may be integrated together and implemented in the form of a System-on-a-Chip (SoC).
In view of the foregoing examples and their preferred embodiments, it will be appreciated by those skilled in the art that, in practice, the present invention may be implemented based on a variety of carriers, which are schematically illustrated as follows:
(1) an apparatus for detecting defects, which may comprise:
one or more processors, memory, and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions, which when executed by the apparatus, cause the apparatus to perform the steps/functions of the foregoing embodiments or an equivalent implementation.
Fig. 6 is a schematic structural diagram of an embodiment of the defect detection device of the present invention, wherein the device may be an electronic device or a circuit device built in the electronic device. The electronic device can be a PC, a server, an intelligent terminal (a mobile phone, a tablet, a watch, glasses, etc.), an intelligent television, an intelligent screen, a teller machine, a robot, an intelligent (automobile) vehicle, a vehicle-mounted device, etc. The present embodiment does not limit the specific form of the defect detection apparatus.
As shown in fig. 6 in particular, the apparatus 900 for detecting defects includes a processor 910 and a memory 930. Wherein, the processor 910 and the memory 930 can communicate with each other and transmit control and/or data signals through the internal connection path, the memory 930 is used for storing computer programs, and the processor 910 is used for calling and running the computer programs from the memory 930. The processor 910 and the memory 930 may be combined into a single processing device, or more generally, separate components, and the processor 910 is configured to execute the program code stored in the memory 930 to implement the functions described above. In particular implementations, the memory 930 may be integrated with the processor 910 or may be separate from the processor 910.
In addition, to make the functions of the defect detection apparatus 900 more complete, the apparatus 900 may further include one or more of an input unit 960, a display unit 970, an audio circuit 980, a camera 990, a sensor 901, and the like; the audio circuit may further include a speaker 982, a microphone 984, and the like. The display unit 970 may include a display screen.
Further, the apparatus 900 for detecting defects may further include a power supply 950 for supplying power to various devices or circuits in the apparatus 900.
It should be understood that the defect detection apparatus 900 shown in fig. 6 can implement the processes of the methods provided by the foregoing embodiments. The operations and/or functions of the various components of the apparatus 900 may each be configured to implement the corresponding flow in the above-described method embodiments. Reference is made in detail to the foregoing description of embodiments of the method, apparatus, etc., and a detailed description is omitted here as appropriate to avoid redundancy.
It should be understood that the processor 910 in the defect detection apparatus 900 shown in fig. 6 may be a system on chip (SoC), and the processor 910 may include a Central Processing Unit (CPU) and may further include other types of processors, such as a Graphics Processing Unit (GPU), etc.
In summary, various portions of the processors or processing units within the processor 910 may cooperate to implement the foregoing method flows, and corresponding software programs for the various portions of the processors or processing units may be stored in the memory 930.
(2) A readable storage medium, on which a computer program or the above-mentioned apparatus is stored, which, when executed, causes the computer to perform the steps/functions of the above-mentioned embodiments or equivalent implementations.
In the several embodiments provided by the present invention, any function, if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium. Based on this understanding, the part of the technical solution of the present invention that in essence contributes to the prior art may be embodied in the form of a software product, as described below.
(3) A computer program product (which may include the above-described apparatus) which, when run on a terminal device, causes the terminal device to perform the method for detecting defects of the preceding embodiments or equivalent implementations.
From the above description of the embodiments, it is clear to those skilled in the art that all or part of the steps of the above methods can be implemented by software plus a necessary general hardware platform. Based on this understanding, the above computer program product may include, but is not limited to, an APP; the aforementioned device/terminal may be a computer device (e.g., a mobile phone, a PC terminal, a cloud platform, a server, a server cluster, or a network communication device such as a media gateway). Moreover, the hardware structure of the computer device may further include: at least one processor, at least one communication interface, at least one memory, and at least one communication bus; the processor, the communication interface and the memory communicate with one another through the communication bus. The processor may be a central processing unit (CPU), a DSP, a microcontroller or a digital signal processor, and may further include a GPU, an embedded neural network processor (NPU) and an image signal processor (ISP); it may further include an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention. The processor may run one or more software programs, which may be stored in a storage medium such as the memory. The aforementioned memory/storage medium may include: non-volatile memories such as non-removable magnetic disks, USB flash drives, removable hard disks and optical disks, as well as read-only memories (ROM) and random access memories (RAM).
In the embodiments of the present invention, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, and means that there may be three relationships, for example, a and/or B, and may mean that a exists alone, a and B exist simultaneously, and B exists alone. Wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" and similar expressions refer to any combination of these items, including any combination of singular or plural items. For example, at least one of a, b, and c may represent: a, b, c, a and b, a and c, b and c or a and b and c, wherein a, b and c can be single or multiple.
Those skilled in the art will appreciate that the various modules, elements and method steps described in the embodiments disclosed in this specification can be implemented as electronic hardware or as a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In addition, the embodiments in this specification are described in a progressive manner, and the same or similar parts among the embodiments may be referred to one another. In particular, for the embodiments of devices, apparatuses, etc., since they are substantially similar to the method embodiments, reference may be made to the corresponding descriptions of the method embodiments for their relevant points. The above-described embodiments of devices, apparatuses, etc. are merely illustrative; modules and units described as separate components may or may not be physically separate, and may be located in one place or distributed over multiple places, for example on nodes of a system network. Some or all of the modules and units can be selected according to actual needs to achieve the purpose of the above embodiments. Those of ordinary skill in the art can understand and implement them without inventive effort.
The structure, features and effects of the present invention have been described in detail with reference to the embodiments shown in the drawings, but the above embodiments are merely preferred embodiments of the present invention. It should be understood that the technical features of the above embodiments and their preferred modes can be reasonably combined and configured into various equivalent schemes by those skilled in the art without departing from or changing the design idea and technical effects of the present invention; therefore, the invention is not limited to the embodiments shown in the drawings, and all modifications and equivalent embodiments conceived from the idea of the invention fall within the scope of the invention as long as they do not go beyond the spirit of the description and the drawings.

Claims (17)

1. A method for detecting defects, comprising:
extracting first feature maps of different scales of an image to be detected;
obtaining a matching degree set between a first feature point on the first feature map and a feature template library;
judging whether the matching degree set corresponding to the first characteristic point meets the requirement or not;
and if the requirement is met, outputting a detection result of the new type of defects corresponding to the first characteristic point.
2. The method for detecting the defects according to claim 1, wherein if the requirements are not met, the detection results of the known categories corresponding to the first feature points are obtained according to the first feature maps with different scales and output;
wherein the known classes include a plurality of known defect classes and a defect-free class.
3. The method for detecting defects according to claim 1, wherein obtaining a matching degree set between a first feature point on the first feature map and a feature template library specifically comprises:
extracting all first feature points on the first feature map;
for each first feature point, matching the features of the first feature point with the feature points in each feature set in the feature template library to obtain the matching degree of the first feature point and each known class, and taking a set formed by the matching degrees of the first feature point and all known classes as the matching degree set;
wherein the known classes include a plurality of known defect classes and non-defect classes.
4. The method according to claim 3, wherein if each matching degree in the matching degree set is smaller than a first threshold, it is determined that the matching degree set corresponding to the first feature point satisfies a requirement.
5. The method of claim 4, wherein determining whether the matching degree set corresponding to the first feature point meets a requirement further comprises:
if each matching degree in the matching degree set is smaller than the first threshold value, calculating the average value of all matching degrees in the matching degree set;
judging whether the average value is smaller than a second threshold value;
and if so, judging that the matching degree set corresponding to the first feature point meets the requirement.
6. The method for detecting defects according to claim 2, wherein the first feature map and the detection results of the known classes are obtained through a detection model;
and, the detection method further comprises:
collecting the images to be detected with the defects of the new type to form a training set of the new defect type;
and further training the detection model by utilizing the training set of the new defect category.
7. The method of detecting defects of claim 1, wherein constructing the library of feature templates comprises:
extracting second feature maps of different scales of each image in the training set of the known class;
acquiring a defect-free feature set and a defect feature set corresponding to each known defect type according to the second feature map;
taking a first set formed by the defect feature sets of all known defect classes and the defect-free feature set as the feature template library;
wherein the known classes include a plurality of known defect classes and non-defect classes.
8. The method according to claim 7, wherein the obtaining the defect-free feature set and the defect feature set corresponding to each known defect type according to the second feature map comprises:
judging whether a second feature point on the second feature map contains a part of the known defect or not;
if so, adding the second feature point into a defect feature set corresponding to the known defect type to which the known defect belongs;
and if the second characteristic point has no defects, adding the second characteristic point into a defect-free characteristic set.
9. The method of claim 8, wherein if the second feature point includes a portion of a known defect, determining whether the portion of the known defect is a center of the known defect;
and if so, adding the second feature point into a defect feature set corresponding to the known defect type to which the known defect belongs.
10. The method of detecting defects of claim 7, wherein constructing the library of feature templates comprises:
and performing downsampling on the first set to obtain a second set, and taking the second set as the feature template library.
11. The defect detection device is characterized by comprising a first feature map extraction module, a matching degree set acquisition module, a first judgment module and an output module;
the first feature map extraction module is used for extracting first feature maps with different scales of the image to be detected;
the matching degree set obtaining module is used for obtaining a matching degree set between a first feature point on the first feature map and a feature template library;
the first judging module is used for judging whether the matching degree set corresponding to the first feature point meets the requirement or not;
and the output module is used for outputting the detection result of the new type of defects corresponding to the first characteristic point if the matching degree set corresponding to the first characteristic point meets the requirement.
12. The apparatus for detecting defects according to claim 11, wherein the matching degree set obtaining module comprises a first feature point extracting module and a matching module;
the first feature point extraction module is used for extracting all first feature points on the first feature map;
the matching module is used for matching the features of the first feature points with the feature points in each feature set in the feature template library aiming at each first feature point to obtain the matching degree of the first feature points and each known class, and taking a set formed by the matching degrees of the first feature points and all known classes as the matching degree set, wherein the known classes comprise a plurality of known defect classes and defect-free classes.
13. The apparatus of claim 12, wherein the first determining module comprises a second determining module, and the second determining module is configured to determine whether each matching degree in the set of matching degrees is smaller than a first threshold.
14. The apparatus for detecting defects according to claim 13, wherein the first determining module further comprises a mean calculating module and a third determining module;
the mean value calculating module is used for calculating the mean value of all the matching degrees in the matching degree set if each matching degree in the matching degree set is smaller than the first threshold value;
the third judging module is used for judging whether the average value is smaller than a second threshold value.
15. The apparatus for detecting defects according to claim 11, further comprising a training set collection module and a training module;
the training set collection module is used for collecting the images to be detected that contain defects of the new category, so as to form a training set of the new defect category;
and the training module is used for further training a detection model by utilizing the training set of the new defect category.
16. The apparatus for detecting defects according to claim 15, further comprising a feature template library construction module, wherein the feature template library construction module comprises a second feature map extraction module, a feature set obtaining module and a first set forming module;
the second feature map extraction module is used for extracting second feature maps with different scales of each image in the training set of the known class;
the characteristic set obtaining module is used for obtaining a defect-free characteristic set and a defect characteristic set corresponding to each known defect type according to the second characteristic map;
the first set forming module is used for taking a first set formed by the defect feature sets of all known defect classes and the defect-free feature set as the feature template library;
wherein the known classes include a plurality of known defect classes and non-defect classes.
17. An apparatus for detecting defects, comprising:
one or more processors, a memory, and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the apparatus for detecting defects, cause the apparatus for detecting defects to perform the method for detecting defects according to any one of claims 1 to 10.
CN202210587928.9A 2022-05-26 2022-05-26 Defect detection method, device and equipment Pending CN114897863A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210587928.9A CN114897863A (en) 2022-05-26 2022-05-26 Defect detection method, device and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210587928.9A CN114897863A (en) 2022-05-26 2022-05-26 Defect detection method, device and equipment

Publications (1)

Publication Number Publication Date
CN114897863A true CN114897863A (en) 2022-08-12

Family

ID=82726060

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210587928.9A Pending CN114897863A (en) 2022-05-26 2022-05-26 Defect detection method, device and equipment

Country Status (1)

Country Link
CN (1) CN114897863A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115082722A (en) * 2022-08-22 2022-09-20 四川金信石信息技术有限公司 Equipment defect detection method, system, terminal and medium based on forward sample
CN115082722B (en) * 2022-08-22 2022-11-01 四川金信石信息技术有限公司 Equipment defect detection method, system, terminal and medium based on forward sample

Similar Documents

Publication Publication Date Title
CN110992317B (en) PCB defect detection method based on semantic segmentation
CN113205176B (en) Method, device and equipment for training defect classification detection model and storage medium
CN111325713A (en) Wood defect detection method, system and storage medium based on neural network
CN111754456B (en) Two-dimensional PCB appearance defect real-time automatic detection technology based on deep learning
CN112381788B (en) Part surface defect increment detection method based on double-branch matching network
CN114155181B (en) Automatic optimization of inspection schemes
WO2024021461A1 (en) Defect detection method and apparatus, device, and storage medium
CN112529109A (en) Unsupervised multi-model-based anomaly detection method and system
CN110853091A (en) Method and system for identifying winding defect image of engine fuse
CN111758117A (en) Inspection system, recognition system, and learning data generation device
CN115690670A (en) Intelligent identification method and system for wafer defects
CN114897863A (en) Defect detection method, device and equipment
WO2019176988A1 (en) Inspection system, identification system, and device for evaluating identification apparatus
CN114651172A (en) Using convolution context attributes to find semiconductor defects
Mumbelli et al. An application of Generative Adversarial Networks to improve automatic inspection in automotive manufacturing
CN112884018A (en) Power grid line fault recognition model training method and power grid line inspection method
CN114202544B (en) Complex workpiece defect detection method based on self-encoder
CN115082449A (en) Electronic component defect detection method
CN114155522A (en) Point cloud data quality inspection repairing method and system
CN111126455A (en) Abrasive particle two-stage identification method based on Lightweight CNN and SVM
CN111724352B (en) Patch LED flaw labeling method based on kernel density estimation
CN116993733B (en) Earphone sleeve appearance quality detection method and system
CN113570566B (en) Product appearance defect development cognition detection method and related device
CN113240097B (en) Method and system for expanding and classifying data
CN114581806B (en) Industrial part empty rate calculation method based on trunk edge feature extraction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
Effective date of registration: 20231211
Address after: Unit I116, 14th Floor, Block A, Hantian Science and Technology City, No.17 Shenhai Road, Guicheng Street, Nanhai District, Foshan City, Guangdong Province, 528299
Applicant after: Exploration Intelligence Technology (Guangdong) Co.,Ltd.
Address before: 510130 No. 106, Fengze East Road, Nansha District, Guangzhou City, Guangdong Province (Building No. 1), X1301-G5145
Applicant before: IFLYTEK SOUTH CHINA ARTIFICIAL INTELLIGENCE RESEARCH INSTITUTE (GUANGZHOU) Co.,Ltd.