CN116645365B - Quartz glass detection method, device, equipment and medium based on frequency spectrum - Google Patents
- Publication number
- CN116645365B (application CN202310897267.4A)
- Authority
- CN
- China
- Prior art keywords
- feature map
- convolution
- frequency
- enhancement
- feature
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0464—Convolutional networks [CNN, ConvNet]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
- G06V10/806—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of extracted features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P40/00—Technologies relating to the processing of minerals
- Y02P40/50—Glass production, e.g. reusing waste heat during processing or shaping
- Y02P40/57—Improving the yield, e.g. reduction of reject rates
Abstract
The application relates to the technical field of artificial intelligence, and in particular to a spectrum-based quartz glass detection method, apparatus, device, and medium. Convolution feature extraction is performed on a quartz glass image to be detected to obtain a first convolution feature map; convolution feature extraction is performed on the first convolution feature map, and the result is enhanced twice; the enhanced feature maps are fused. A high-frequency feature map and a low-frequency feature map are extracted from the first convolution feature map, and the degree of association between them is calculated; according to the degree of association, enhancement and fusion are performed on the first convolution feature map to obtain a fused feature map, which is then detected to obtain a detection result. By enhancing the differences between feature maps of different spectrum signals in the quartz glass image according to their degree of association, the accuracy of quartz glass detection is improved.
Description
Technical Field
The application relates to the technical field of artificial intelligence, in particular to a quartz glass detection method, device, equipment and medium based on frequency spectrum.
Background
Quartz glass, known as the "king of glass", is a special glass material composed of a single component, silicon dioxide. It has stable physical and chemical properties, with advantages including high temperature resistance, corrosion resistance, high light transmittance, a low expansion coefficient, good insulation, and good vacuum properties, and is widely applied in high-tech fields such as semiconductors, photovoltaics, optics, optical communication, and aerospace. In the process of manufacturing quartz glass, process defects on the surface and in the interior of the quartz glass must be strictly controlled to ensure its quality. Especially in high-end semiconductor application scenarios, such as photolithography mask quartz glass substrates and crystal-pulling quartz glass crucibles, the purity and performance requirements of the quartz glass are extremely high, and the process defects and impurity content of the quartz glass used must be detected.
In the prior art, quartz glass is generally inspected using visual detection methods. Because quartz glass defects may vary in shape, size, and distribution, and tiny defects are not prominent against the background texture, detection is difficult and detection accuracy is low. How to improve the detection accuracy in the quartz glass defect detection process has therefore become a problem to be solved.
Disclosure of Invention
In view of the above, the embodiments of the present application provide a spectrum-based quartz glass detection method, apparatus, device, and medium, so as to solve the problem of low detection accuracy when detecting quartz glass.
In a first aspect, an embodiment of the present application provides a method for detecting quartz glass based on a frequency spectrum, where the method includes:
acquiring a quartz glass image to be detected, and carrying out convolution feature extraction on the quartz glass image to be detected to obtain a first convolution feature map;
performing convolution feature extraction on the first convolution feature map to obtain a second convolution feature map, performing first downsampling on the second convolution feature map to obtain a first enhancement feature map, performing second downsampling on the first enhancement feature map to obtain a second enhancement feature map, and fusing the first enhancement feature map and the second enhancement feature map to obtain a fused enhancement feature map;
extracting a high-frequency characteristic diagram and a low-frequency characteristic diagram in the first convolution characteristic diagram, and calculating the association degree between the high-frequency characteristic diagram and the low-frequency characteristic diagram to obtain an association degree matrix;
and performing activation processing on the association degree matrix to obtain a weight activation matrix, using the weight activation matrix to enhance the fusion enhancement feature map to obtain an enhancement feature map, fusing the enhancement feature map with the first convolution feature map to obtain a fusion feature map, and detecting the fusion feature map to obtain a detection result.
In a second aspect, an embodiment of the present application provides a spectrum-based quartz glass detection apparatus, including:
the acquisition module is used for acquiring a quartz glass image to be detected, and carrying out convolution feature extraction on the quartz glass image to be detected to obtain a first convolution feature map;
the enhancement module is used for carrying out convolution feature extraction on the first convolution feature map to obtain a second convolution feature map, carrying out channel dimension enhancement on the second convolution feature map to obtain a first enhancement feature map, carrying out space dimension enhancement on the first enhancement feature map to obtain a second enhancement feature map, and fusing the first enhancement feature map and the second enhancement feature map to obtain a fused enhancement feature map;
the extraction module is used for extracting a high-frequency characteristic diagram of the high-frequency signal and a low-frequency characteristic diagram of the low-frequency signal in the first convolution characteristic diagram, and calculating the association degree between the high-frequency characteristic diagram and the low-frequency characteristic diagram to obtain an association degree matrix;
the detection module is used for carrying out activation processing on the association degree matrix to obtain a weight activation matrix, using the weight activation matrix to strengthen the fusion enhancement feature map to obtain an enhancement feature map, fusing the enhancement feature map with the first convolution feature map to obtain a fusion feature map, and detecting the fusion feature map to obtain a detection result.
In a third aspect, an embodiment of the present application provides a terminal device, where the terminal device includes a processor, a memory, and a computer program stored in the memory and executable on the processor, and where the processor implements the quartz glass detection method according to the first aspect when executing the computer program.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the quartz glass detection method according to the first aspect.
Compared with the prior art, the application has the beneficial effects that:
A quartz glass image to be detected is acquired, and convolution feature extraction is performed on it to obtain a first convolution feature map. Convolution feature extraction is performed on the first convolution feature map to obtain a second convolution feature map; first downsampling is performed on the second convolution feature map to obtain a first enhancement feature map; second downsampling is performed on the first enhancement feature map to obtain a second enhancement feature map; and the first enhancement feature map is fused with the second enhancement feature map to obtain a fused enhancement feature map. A high-frequency feature map and a low-frequency feature map are extracted from the first convolution feature map, and the degree of association between them is calculated to obtain an association degree matrix. Activation processing is performed on the association degree matrix to obtain a weight activation matrix, which is used to enhance the fused enhancement feature map; the resulting enhancement feature map is fused with the first convolution feature map to obtain a fusion feature map, which is then detected to obtain a detection result. By enhancing the differences between feature maps of different spectrum signals in the quartz glass image according to their degree of association, the accuracy of quartz glass detection is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments of the present application will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic view of an application environment of a spectrum-based quartz glass detection method according to a first embodiment of the present application;
fig. 2 is a schematic flow chart of a method for detecting quartz glass based on frequency spectrum according to a first embodiment of the present application;
fig. 3 is a schematic flow chart of a method for detecting quartz glass based on frequency spectrum according to a second embodiment of the present application;
FIG. 4 is a graph showing the comparison of the detection effect of a spectrum-based quartz glass detection method and other defect detection models on an original image obtained by a high-definition industrial camera according to the third embodiment of the present application;
fig. 5 is a block diagram of a quartz glass detection device based on spectrum according to a fourth embodiment of the present application.
Fig. 6 is a schematic structural diagram of a terminal device according to a fifth embodiment of the present application.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in the present description and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, the terms "first," "second," "third," and the like in the description of the present specification and in the appended claims, are used for distinguishing between descriptions and not necessarily for indicating or implying a relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the invention. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The embodiment of the invention can acquire and process the related data based on artificial intelligence technology. Artificial Intelligence (AI) is a theory, method, technology, and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use knowledge to obtain optimal results.
Artificial intelligence infrastructure technologies generally include technologies such as sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing technologies, operation/interaction systems, mechatronics, and the like. The artificial intelligence software technology mainly comprises a computer vision technology, a robot technology, a biological recognition technology, a voice processing technology, a natural language processing technology, machine learning/deep learning and other directions.
It should be understood that the sequence numbers of the steps in the following embodiments do not mean the order of execution, and the execution order of the processes should be determined by the functions and the internal logic, and should not be construed as limiting the implementation process of the embodiments of the present invention.
In order to illustrate the technical scheme of the invention, the following description is made by specific examples.
The spectrum-based quartz glass detection method provided by the embodiment of the invention can be applied to an application environment as shown in fig. 1, in which a client communicates with a server. The client includes, but is not limited to, handheld computers, desktop computers, notebook computers, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA), and other terminal devices. The server may be implemented as a stand-alone server or as a server cluster composed of multiple servers.
Referring to fig. 2, a flowchart of a spectrum-based quartz glass detection method according to an embodiment of the present invention is shown, where the spectrum-based quartz glass detection method may be applied to a server in fig. 1, and the server is connected to a corresponding client, and as shown in fig. 2, the spectrum-based quartz glass detection method may include the following steps.
S201: and acquiring a quartz glass image to be detected, and carrying out convolution feature extraction on the quartz glass image to be detected to obtain a first convolution feature map.
In step S201, a quartz glass image to be detected is acquired, which may be an optical feature image, and convolution feature extraction is performed on the quartz glass image to be detected through convolution operation, so as to obtain a first convolution feature map.
In this embodiment, the quartz glass image to be detected is obtained by photographing the quartz glass to be detected with an RGB camera, wherein the size of the quartz glass image to be detected is C×H×W, C being the number of channels, H the height, and W the width of the image. The quartz glass image to be detected is input into a 3×3 convolution layer and subjected to convolution processing to obtain the first convolution feature map.
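As an illustration of the step above, a naive 3×3 convolution over a C×H×W image can be sketched as follows. This is a toy NumPy sketch with random kernels, not the trained convolution layer of the embodiment; the image size, channel counts, and kernel values are assumptions for demonstration only.

```python
import numpy as np

def conv3x3(feature, kernels):
    """Naive 3x3 convolution with zero padding and stride 1.

    feature: (C_in, H, W) array; kernels: (C_out, C_in, 3, 3) array.
    Returns a (C_out, H, W) feature map.
    """
    c_in, h, w = feature.shape
    c_out = kernels.shape[0]
    padded = np.pad(feature, ((0, 0), (1, 1), (1, 1)))
    out = np.zeros((c_out, h, w))
    for o in range(c_out):
        for i in range(h):
            for j in range(w):
                out[o, i, j] = np.sum(padded[:, i:i + 3, j:j + 3] * kernels[o])
    return out

image = np.random.rand(3, 8, 8)              # toy C x H x W quartz-glass image
kernels = np.random.rand(16, 3, 3, 3) * 0.1  # 16 hypothetical 3x3 kernels
feat1 = conv3x3(image, kernels)              # first convolution feature map
print(feat1.shape)
```

In practice this would be a learned layer with trained weights; the sketch only shows the shape transformation from a 3-channel image to a multi-channel first convolution feature map.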
S202: and performing convolution feature extraction on the first convolution feature map to obtain a second convolution feature map, performing first downsampling on the second convolution feature map to obtain a first enhancement feature map, performing second downsampling on the second convolution feature map to obtain a second enhancement feature map, and fusing the first enhancement feature map and the second enhancement feature map to obtain a fused enhancement feature map.
In step S202, the convolution feature extraction is performed on the first convolution feature map again through the convolution operation, so as to obtain a second convolution feature map, enhancement processing is performed on the second convolution feature map through a downsampling mode, and enhancement images obtained through different downsampling are fused, so that a fused enhancement feature map is obtained.
In this embodiment, the first convolution feature map is input to a 3×3 convolution layer, and the first convolution feature map is subjected to convolution processing to obtain a second convolution feature map. And performing first downsampling on the second convolution feature map to obtain a first enhancement feature map, performing second downsampling on the second convolution feature map to obtain a second enhancement feature map, and fusing the first enhancement feature map and the second enhancement feature map to obtain a fused enhancement feature map, wherein the first enhancement feature map is enhancement in a channel dimension, and the second enhancement feature map is enhancement in a space dimension.
Optionally, performing first downsampling on the second convolution feature map to obtain a first enhancement feature map, including:
respectively carrying out global average pooling and global maximum pooling on each channel characteristic in the second convolution characteristic diagram to obtain an average pooled characteristic value and a maximum pooled characteristic value corresponding to each channel, and fusing the average pooled characteristic value and the maximum pooled characteristic value corresponding to each channel to obtain a global fused characteristic vector corresponding to each channel;
feature extraction is carried out on the global fusion feature vector to obtain a feature extraction vector, and activation operation is carried out on the feature extraction vector to obtain a channel weight vector;
and calculating to obtain a first enhancement feature map according to the channel weight vector and the second convolution feature map.
In this embodiment, a global average pooling operation and a global maximum pooling operation are performed on the second convolution feature map of dimensions H×W×C along the channel dimension to obtain an average pooled feature value and a maximum pooled feature value of dimensions 1×1×C. The two pooled values are fused to obtain the global fused feature vector corresponding to each channel, and feature extraction is performed on the global fused feature vector using a multi-layer perceptron (MLP), which consists of an input layer, two hidden layers, and an output layer; the hidden layers apply weight values and bias terms to the input vectors. After feature extraction and channel dimension reduction through the MLP, a feature extraction vector is obtained; the feature extraction vector is activated to obtain a channel weight vector, and the channel weight vector is multiplied by the second convolution feature map to calculate the first enhancement feature map. The calculation formula of the channel weight vector is as follows:
$$W_c = \sigma\big(\mathrm{MLP}\big(\mathrm{AvgPool}(F_2) \oplus \mathrm{MaxPool}(F_2)\big)\big)$$

wherein $W_c$ is the channel weight vector, $\mathrm{MLP}(\cdot)$ is the multi-layer perceptron, $\mathrm{MaxPool}(\cdot)$ is the global maximum pooling operation function, $\mathrm{AvgPool}(\cdot)$ is the global average pooling operation function, $F_2$ is the second convolution feature map, $\oplus$ denotes the fusion of the two pooled descriptors, and $\sigma(\cdot)$ is the activation function.
The calculation formula of the first enhancement feature map is as follows:

$$F_{e1} = W_c \otimes F_2$$

wherein $F_{e1}$ is the first enhancement feature map, $W_c$ is the channel weight vector, $F_2$ is the second convolution feature map, and $\otimes$ denotes channel-wise multiplication.
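A minimal sketch of this channel-dimension enhancement follows. The weight matrices `w1` and `w2` are hypothetical, and the embodiment's MLP has two hidden layers, collapsed here to one for brevity; the sigmoid as the final activation is also an assumption.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_enhance(feat2, w1, w2):
    """Channel-dimension enhancement (first downsampling) sketch.

    feat2: (C, H, W) second convolution feature map.
    w1, w2: hypothetical MLP weights (one hidden layer for brevity).
    """
    avg = feat2.mean(axis=(1, 2))         # global average pooling -> (C,)
    mx = feat2.max(axis=(1, 2))           # global maximum pooling -> (C,)
    fused = avg + mx                      # fuse the two pooled descriptors
    hidden = np.maximum(0.0, w1 @ fused)  # hidden layer with ReLU
    w_c = sigmoid(w2 @ hidden)            # channel weight vector, (C,)
    return w_c[:, None, None] * feat2     # first enhancement feature map

feat2 = np.random.rand(16, 8, 8)          # second convolution feature map
w1 = np.random.rand(4, 16) * 0.1          # C -> C/4 reduction (assumed ratio)
w2 = np.random.rand(16, 4) * 0.1
enh1 = channel_enhance(feat2, w1, w2)
print(enh1.shape)
```

Because the sigmoid weights lie in (0, 1), each channel of the output is a scaled copy of the corresponding input channel, which is the sense in which salient channels are "enhanced" relative to the rest.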
Optionally, performing a second downsampling on the second convolution feature map to obtain a second enhancement feature map, including:
performing convolution operation and activation operation on the first enhancement feature map in the space dimension to obtain a space weight matrix;
and calculating to obtain a second enhancement feature map according to the space weight matrix and the second convolution feature map.
In this embodiment, the first enhancement feature map is subjected to a convolution operation and an activation operation in the spatial dimension, where the convolution operation uses a 1×1 convolution function, and the activation operation applies an activation function followed by the sigmoid activation function. The spatial weight matrix calculation formula is as follows:

$$W_s = \sigma\big(\delta\big(f_{1\times 1}(F_{e1})\big)\big)$$

wherein $W_s$ is the spatial weight matrix, $f_{1\times 1}(\cdot)$ is the 1×1 convolution function, $\delta(\cdot)$ is the activation function, $\sigma(\cdot)$ is the sigmoid activation function, and $F_{e1}$ is the first enhancement feature map.
The calculation formula of the second enhancement feature map is as follows:

$$F_{e2} = W_s \otimes F_2$$

wherein $F_{e2}$ is the second enhancement feature map, $W_s$ is the spatial weight matrix, $F_2$ is the second convolution feature map, and $\otimes$ denotes element-wise multiplication.
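The spatial-dimension enhancement can be sketched in the same NumPy style. A 1×1 convolution over the first enhancement map is a per-pixel weighted sum across channels; a uniform kernel (the channel mean) stands in here for the learned weights, which is an assumption for illustration only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def spatial_enhance(enh1, feat2):
    """Spatial-dimension enhancement (second downsampling) sketch.

    enh1: (C, H, W) first enhancement feature map.
    feat2: (C, H, W) second convolution feature map.
    """
    conv = enh1.mean(axis=0)        # stand-in 1x1 convolution -> (H, W)
    act = np.maximum(0.0, conv)     # activation function
    w_s = sigmoid(act)              # spatial weight matrix W_s
    return w_s[None, :, :] * feat2  # second enhancement feature map

enh1 = np.random.rand(16, 8, 8)     # first enhancement feature map
feat2 = np.random.rand(16, 8, 8)    # second convolution feature map
enh2 = spatial_enhance(enh1, feat2)
print(enh2.shape)
```

The spatial weight matrix has one value per pixel, broadcast across all channels, so spatially salient locations are emphasized uniformly in every channel.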
S203: and extracting a high-frequency characteristic diagram and a low-frequency characteristic diagram in the first convolution characteristic diagram, and calculating the association degree between the high-frequency characteristic diagram and the low-frequency characteristic diagram to obtain an association degree matrix.
In step S203, a high-frequency feature map and a low-frequency feature map in the first convolution feature map are obtained by performing frequency decomposition on the first convolution feature map, where the high-frequency feature is a defect feature, the low-frequency feature is a background feature, the high-frequency feature map is a feature map including the defect feature, and the low-frequency feature map is a feature map including the background feature.
In this embodiment, the first convolution feature map is subjected to frequency decomposition to obtain a high-frequency feature map and a low-frequency feature map, and the association degree between the high-frequency feature map and the low-frequency feature map is calculated to obtain an association degree matrix.
Optionally, extracting the high-frequency feature map and the low-frequency feature map in the first convolution feature map includes:
filtering the first convolution feature map by using a preset high-pass filter to obtain a high-frequency feature map of the first convolution feature map;
and filtering the first convolution feature map by using a preset low-pass filter to obtain a low-frequency feature map of the first convolution feature map.
In this embodiment, the first convolution feature map is subjected to frequency decomposition to extract the corresponding high-frequency signal and low-frequency signal, thereby obtaining the corresponding high-frequency feature map and low-frequency feature map. The high-frequency signal and the low-frequency signal in the first convolution feature map are expressed as follows:

f(x, y) = i(x, y) · r(x, y)

where f(x, y) is the first convolution feature map, i(x, y) is the low-frequency signal representing the background features, and r(x, y) is the high-frequency signal representing the defect features. Since the product in this formula cannot be separated directly, the logarithm of both sides is taken:

ln f(x, y) = ln i(x, y) + ln r(x, y)
the frequency domain expression is:
wherein,representing a first convolution characteristic map->The frequency domain representation after fourier transformation is performed,for each point in the feature map, +.>For low-frequency signals->The frequency domain representation after fourier transformation is performed,for high-frequency signals->And carrying out frequency domain expression results after Fourier transformation.
Both sides of the formula are processed with a filtering function to obtain:

H(u, v)F(u, v) = H(u, v)I(u, v) + H(u, v)R(u, v)

where H(u, v) is the filter transfer function; this allows the high-frequency signal to be separated from the low-frequency signal.
An inverse Fourier transform is then performed on both sides of the formula to obtain:

h_f(x, y) = h_i(x, y) + h_r(x, y)

where h_f(x, y) denotes the result of the inverse Fourier transform of H(u, v)F(u, v), h_i(x, y) denotes the result of the inverse Fourier transform of H(u, v)I(u, v), and h_r(x, y) denotes the result of the inverse Fourier transform of H(u, v)R(u, v).
Taking the exponential of both sides simultaneously yields:

g(x, y) = exp(h_i(x, y)) · exp(h_r(x, y))

where g(x, y) is the image filtered by the filter transfer function. It can therefore be seen from the above that the factor determining the overall filtering effect is entirely the choice of filter, i.e. the selection of the filter transfer function H(u, v).
In this embodiment, a preset high-pass filter is used to perform filtering processing on the first convolution feature map to obtain a high-frequency feature map of the first convolution feature map, and a preset low-pass filter is used to perform filtering processing on the first convolution feature map to obtain a low-frequency feature map of the first convolution feature map.
The calculation formulas of the high-frequency feature map and the low-frequency feature map are as follows:

F_L(u, v) = H_L(u, v)F(u, v), F_H(u, v) = H_H(u, v)F(u, v)

where F_L is the low-frequency feature map, F_H is the high-frequency feature map, H_L(u, v) is the low-pass filter transfer function, and H_H(u, v) is the high-pass filter transfer function, with

H_H(u, v) = 1 − H_L(u, v)

and

H_L(u, v) = 1 / (1 + [D(u, v)/D_0]^(2t))

where D(u, v) is the distance from the point (u, v) to the filter center, D_0 is the cutoff distance, and t controls the conversion rate of the high-frequency stop band, with a value range of [5, 15].
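A hedged NumPy sketch of such a transfer function, assuming the Butterworth form 1/(1 + (D/D_0)^(2t)) with the high-pass filter as its complement; grid size and parameter values are illustrative only:

```python
import numpy as np

def butterworth_lowpass(shape, d0, t):
    """Low-pass transfer function H_L(u, v) = 1 / (1 + (D/D0)^(2t)).

    d0 : cutoff distance from the filter centre
    t  : order controlling the stop-band transition rate (range 5..15)
    """
    h, w = shape
    u = np.arange(h)[:, None] - h // 2
    v = np.arange(w)[None, :] - w // 2
    d = np.sqrt(u ** 2 + v ** 2)        # D(u, v): distance to filter centre
    return 1.0 / (1.0 + (d / d0) ** (2 * t))

hl = butterworth_lowpass((9, 9), d0=2.0, t=5)
hh = 1.0 - hl                           # complementary high-pass filter
```

The filter equals 1 at the centre (pure low frequency passes untouched) and decays monotonically with distance; larger t gives a sharper stop-band transition.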
Optionally, calculating the association degree between the high-frequency feature map and the low-frequency feature map to obtain an association degree matrix, including:
performing convolution operation on the low-frequency characteristic map to obtain a low-frequency convolution characteristic map, performing global average pooling operation on the low-frequency convolution characteristic map to obtain a low-frequency pooling characteristic map, and performing projection conversion on the low-frequency pooling characteristic map to obtain a low-frequency projection characteristic map;
Performing projection conversion on the high-frequency characteristic map to obtain a high-frequency projection characteristic map, performing activation operation on the high-frequency projection characteristic map, and calculating to obtain an activated high-frequency characteristic matrix;
multiplying the low-frequency projection feature map with the activated high-frequency feature matrix, and calculating the corresponding association degree to obtain an association degree matrix.
In this embodiment, a convolution operation is performed on the low-frequency feature map to obtain a low-frequency convolution feature map, where the convolution kernel of the convolution operation has a size of 3×3. A global average pooling operation is performed on the low-frequency convolution feature map to obtain a low-frequency pooled feature map, and projection conversion is performed on the low-frequency pooled feature map to obtain a low-frequency projection feature map of size 1×C. Projection conversion is performed on the high-frequency feature map, converting the high-frequency feature map of size C×H×W into a high-frequency projection feature map of size C×HW. An activation operation is then performed on the high-frequency projection feature map and the activated high-frequency feature matrix is calculated, where the activation function is the Swish activation function, and the calculation formula is as follows:

A_H = F_H′ ⊗ σ(F_H′)

where A_H is the activated high-frequency feature matrix, F_H′ is the high-frequency projection feature map, σ is the sigmoid function, and ⊗ denotes element-wise multiplication. The derivative of the Swish activation function is constantly greater than 0, and when the input is a value near 0 the output has a certain smoothness, which facilitates optimization and generalization during training.
The low-frequency projection feature map is multiplied by the activated high-frequency feature matrix, and the corresponding association degree is calculated to obtain the association degree matrix. The calculation formula is as follows:

M_a = P_L × A_H

where M_a is the association degree matrix, P_L is the low-frequency projection feature map, and A_H is the activated high-frequency feature matrix.
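The pooling, projection, Swish activation and matrix multiplication that produce the association degree matrix can be sketched as follows; the 3×3 convolution on the low-frequency branch is omitted here for brevity, which is a simplification of this example:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def swish(x):
    return x * sigmoid(x)               # Swish activation: x * sigmoid(x)

def association_matrix(f_low, f_high):
    """Association degree between low- and high-frequency feature maps.

    f_low, f_high : feature maps of shape (C, H, W)
    Returns a (1, H*W) association degree matrix.
    """
    c, h, w = f_low.shape
    p_low = f_low.mean(axis=(1, 2)).reshape(1, c)   # GAP + projection -> 1 x C
    p_high = f_high.reshape(c, h * w)               # projection -> C x HW
    a_high = swish(p_high)                          # activated high-freq matrix
    return p_low @ a_high                           # (1 x C)(C x HW) -> 1 x HW

rng = np.random.default_rng(3)
f_low = rng.standard_normal((4, 8, 8))
f_high = rng.standard_normal((4, 8, 8))
m_a = association_matrix(f_low, f_high)
```

Each entry of `m_a` scores one spatial position of the high-frequency map against the pooled low-frequency channel descriptor.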
S204: and performing activation treatment on the association degree matrix to obtain a weight activation matrix, using the weight activation matrix to enhance the fusion enhancement feature map to obtain an enhancement feature map, fusing the enhancement feature map with the first convolution feature map to obtain a fusion feature map, and detecting the fusion feature map to obtain a detection result.
In step S204, the fused enhancement feature map is enhanced by using the association degree matrix, so as to obtain an enhancement feature map, the enhancement feature map is fused with the first convolution feature map, so as to obtain a fused feature map, and defect detection is performed on the fused feature map, so as to obtain a detection result, wherein the association degree between the low-frequency signal and the high-frequency signal in the feature map can be modeled by using the association degree matrix, so as to obtain association degrees between different frequency signals, and further increase the difference between the low-frequency signal and the high-frequency signal.
In this embodiment, the association degree matrix is activated to obtain a weight activation matrix, where the activation function is the sigmoid activation function, and the calculation formula is:

W_a = σ(M_a)

where W_a is the weight activation matrix, M_a is the association degree matrix, and σ is the sigmoid activation function.
The weight activation matrix is used to strengthen the fusion enhancement feature map to obtain the enhancement feature map. Before enhancement, projection conversion is performed on the weight activation matrix to obtain a frequency difference weight matrix; the frequency difference weight matrix is dot-multiplied with the fusion enhancement feature map to obtain the enhancement feature map, and the enhancement feature map is additively fused with the first convolution feature map to obtain the fusion feature map. The calculation formula is as follows:

F_fuse = F_1 + W_f ⊙ F_e

where F_fuse is the fusion feature map, F_1 is the first convolution feature map, W_f is the frequency difference weight matrix, ⊙ denotes dot multiplication, and F_e is the fusion enhancement feature map.
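A NumPy sketch of this final enhancement-and-fusion step; the reshape used for the projection conversion to 1×H×W is an assumption of this example:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fuse(f1, f_enh, m_a):
    """F_fuse = F_1 + W_f (.) F_e  (hypothetical sketch).

    f1    : first convolution feature map, shape (C, H, W)
    f_enh : fusion enhancement feature map, shape (C, H, W)
    m_a   : association degree matrix, shape (1, H*W)
    """
    c, h, w = f1.shape
    w_act = sigmoid(m_a)                 # weight activation matrix
    w_f = w_act.reshape(1, h, w)         # projection conversion to 1 x H x W
    return f1 + w_f * f_enh              # dot-multiply, then additive fusion

rng = np.random.default_rng(4)
f1 = rng.standard_normal((4, 8, 8))
f_enh = rng.standard_normal((4, 8, 8))
m_a = rng.standard_normal((1, 64))
f_fuse = fuse(f1, f_enh, m_a)
```

The residual form means the fused map always retains the original first convolution features, with the frequency-difference weights only gating how much enhancement is added.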
The fusion feature map is detected to obtain the detection result; during detection, a U-net network is used. The U-net is a U-shaped network whose structure comprises two main parts: a front-end encoder path and a back-end decoder path. The encoder part consists of 5 convolutional layers, each with two convolution kernels of size 3×3. Between each pair of 3×3 convolution kernels, the fusion feature extraction module of the method of the present invention, a batch normalization (BN) layer, a linear rectification unit (ReLU) layer and a 2×2 max pooling operation for downsampling are added. Correspondingly, the decoder path mainly comprises 5 consecutive feature extraction blocks consisting of a convolution layer, a ReLU activation layer and a pooling layer; a 2× upsampling operation is performed between every two feature extraction blocks using a 2×2 convolution, so as to restore the feature map and recover more detailed information, while skip connections fuse high-level semantic features with shallow features so that more information is retained. In the feature map output by the last feature extraction block of the decoder path, Softmax is used as the classification layer, and each component feature vector is mapped to the number of classes to be detected by a 1×1 convolution, achieving pixel-by-pixel classification.
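A shape-only sketch of the encoder–decoder symmetry described above (5 levels, 2×2 max pooling down, 2× upsampling back, skip links by channel concatenation). The channel counts and spatial size are assumptions of this example, and the convolutions themselves are omitted:

```python
import numpy as np

def maxpool2(x):
    """2x2 max pooling on a (C, H, W) map."""
    c, h, w = x.shape
    return x.reshape(c, h // 2, 2, w // 2, 2).max(axis=(2, 4))

def upsample2(x):
    """2x nearest-neighbour upsampling on a (C, H, W) map."""
    return x.repeat(2, axis=1).repeat(2, axis=2)

x = np.zeros((16, 64, 64))              # assumed feature map after first conv
skips = []
for _ in range(4):                      # encoder: 4 pooling steps between 5 blocks
    skips.append(x)
    x = maxpool2(x)
for skip in reversed(skips):            # decoder: upsample and fuse skip features
    x = upsample2(x)
    x = np.concatenate([x, skip], axis=0)   # skip link: channel concatenation
```

The spatial resolution returns to its input size while the skip concatenations accumulate extra channels, illustrating why the decoder recovers detail the pooling discarded.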
Before detection is performed with the U-net network, the network is trained. During training, a high-definition industrial camera is used to collect the quartz glass images to be detected, including multiple types of quartz glass images of different specifications and different defect types. For each image, the location information of the defects is manually annotated for subsequent defect detection and quality assessment. The data set comprises 3000 quartz glass images covering various defect targets such as scratches, color spots, pits, bubbles and contamination; the image resolution is 512×512, with 2000 images randomly selected as the training set and 1000 images as the test set. The invention uses the Adam optimizer to train the model under the PyTorch framework, with two NVIDIA Quadro M5000 graphics processing units (GPUs) on the Windows 10 operating system. The initial training parameters of the network are shown in Table 1:
TABLE 1
The invention adopts overall accuracy (OA), mean intersection over union (mIoU) and average F1 score (AF) as evaluation indexes, trains and tests on the quartz glass data set, and compares the detection accuracy with current advanced semantic segmentation models. The method provided by the invention remarkably improves the detection accuracy for quartz glass defects; the detection results are shown in Table 2:
TABLE 2
From the comparison of defect detection accuracy among the defect detection models in Table 2, the detection accuracy of the defect detection model of the present method is higher than that of the prior-art detection methods in terms of overall accuracy, mean intersection over union and average F1 score, so the quartz glass detection method of the present application has significant beneficial effects.
Optionally, the fusion enhancement feature map is enhanced by using a weight activation matrix, so as to obtain an enhancement feature map, which includes:
performing projection conversion on the weight activation matrix to obtain a frequency difference weight matrix;
and carrying out weighted enhancement on the fusion enhancement feature map by using the frequency difference weight matrix to obtain the enhancement feature map.
In this embodiment, projection conversion is performed on the weight activation matrix, converting it into a frequency difference weight matrix between the low-frequency signal and the high-frequency signal of size 1×H×W. The frequency difference weight matrix is used to perform weighted enhancement on the fusion enhancement feature map to obtain the enhancement feature map; during weighted enhancement, the frequency difference weight matrix is dot-multiplied with the fusion enhancement feature map.
It should be noted that, the quartz glass detection method based on the frequency spectrum provided by the embodiment of the application can be applied to various computer intelligent processing tasks, including but not limited to intelligent processing tasks of various application scenes such as target detection, target classification and the like, and is not particularly limited.
The above-mentioned intelligent processing task is specifically applicable to image analysis, detection and object classification scenarios including but not limited to micro-scale and nano-scale of semiconductors, for example, and is specifically applicable to semiconductor detection, and the embodiments of the present application are not limited thereto. For example, if the method is applied to the target detection application scene, the target detection processing can be performed by utilizing the output image characteristics finally obtained by the quartz glass detection method based on the frequency spectrum provided by the application, so as to obtain the target detection result.
Referring to fig. 3, a flow chart of the spectrum-based quartz glass detection method according to a second embodiment of the present application is shown. A quartz glass image to be detected is obtained, and convolution layer 1 is used to perform convolution feature extraction on it to obtain a first convolution feature map F_1. Convolution feature extraction is performed on F_1 through convolution layer 2 to obtain a second convolution feature map F_2. Global average pooling and maximum pooling are performed on the second convolution feature map, the processed results are added and fused to obtain a global fusion feature vector, and feature extraction and activation are performed with a multi-layer perceptron (MLP) to obtain the channel weight vector W_c. The channel weight vector W_c is multiplied by the second convolution feature map F_2 to obtain the first enhancement feature map F_e1. A convolution operation and an activation operation are performed on the first enhancement feature map in the spatial dimension to obtain the spatial weight matrix W_s, and W_s is multiplied by the second convolution feature map F_2 to obtain the second enhancement feature map F_e2. The first enhancement feature map F_e1 and the second enhancement feature map F_e2 are fused to obtain the fusion enhancement feature map F_e.
The high-frequency feature map F_H and the low-frequency feature map F_L are extracted from the first convolution feature map F_1. A global average pooling operation is performed on the low-frequency feature map F_L to obtain the low-frequency pooled feature map P_L. Dimension conversion is performed on the high-frequency feature map F_H to obtain the high-frequency projection feature map F_H′, which is activated with the Swish activation function to obtain the activated high-frequency feature matrix A_H. The association degree matrix M_a of the high-frequency and low-frequency feature maps is calculated from the activated high-frequency feature matrix A_H and the low-frequency pooled feature map P_L. The association degree matrix M_a is activated with the sigmoid activation function to obtain the weight activation matrix W_a. The weight activation matrix W_a is fused with the fusion enhancement feature map F_e to obtain the enhancement feature map F_enh; the enhancement feature map F_enh is fused with the first convolution feature map F_1 to obtain the fusion feature map F_fuse, and detection is performed based on the fusion feature map F_fuse.
Referring to fig. 4, a comparison chart of the detection effects of the spectrum-based quartz glass detection method and other defect detection models on an original image obtained by a high-definition industrial camera is provided in a third embodiment of the present application. The original image is a quartz glass image to be detected. As can be seen from fig. 4, the DeepLabV3-Plus model yields 3 detection results on the quartz glass to be detected, the HRNet model yields 3 detection results, the HMANet model yields 5 detection results, and the model of the present method yields 8 detection results; the comparison result diagram shows that the present method has significant beneficial effects.
A quartz glass image to be detected is obtained, and convolution feature extraction is performed on it to obtain a first convolution feature map. Convolution feature extraction is performed on the first convolution feature map to obtain a second convolution feature map; first downsampling is performed on the second convolution feature map to obtain a first enhancement feature map, second downsampling is performed on the second convolution feature map to obtain a second enhancement feature map, and the first enhancement feature map is fused with the second enhancement feature map to obtain a fusion enhancement feature map. The high-frequency feature map and the low-frequency feature map in the first convolution feature map are extracted, and the association degree between the high-frequency feature map and the low-frequency feature map is calculated to obtain an association degree matrix. The association degree matrix is activated to obtain a weight activation matrix, the fusion enhancement feature map is enhanced with the weight activation matrix to obtain an enhancement feature map, the enhancement feature map is fused with the first convolution feature map to obtain a fusion feature map, and the fusion feature map is detected to obtain a detection result. By enhancing the differences between the feature maps of different spectrum signals according to their degree of association in the quartz glass image, the accuracy of quartz glass detection is further improved.
Fig. 5 shows a block diagram of a spectrum-based quartz glass detection apparatus according to a fourth embodiment of the present application, which corresponds to the spectrum-based quartz glass detection method of the above embodiment, and the above quartz glass detection apparatus is applied to the above server. For convenience of explanation, only portions relevant to the embodiments of the present application are shown. Referring to fig. 5, the quartz glass detection device 50 includes: the device comprises an acquisition module 51, an enhancement module 52, an extraction module 53 and a detection module 54.
The obtaining module 51 is configured to obtain a quartz glass image to be detected, and perform convolution feature extraction on the quartz glass image to be detected to obtain a first convolution feature map.
The enhancement module 52 is configured to perform convolution feature extraction on the first convolution feature map to obtain a second convolution feature map, perform first downsampling on the second convolution feature map to obtain a first enhancement feature map, perform second downsampling on the second convolution feature map to obtain a second enhancement feature map, and fuse the first enhancement feature map with the second enhancement feature map to obtain a fused enhancement feature map;
the extracting module 53 is configured to extract the high-frequency feature map and the low-frequency feature map in the first convolution feature map, calculate a degree of association between the high-frequency feature map and the low-frequency feature map, and obtain a degree of association matrix.
The detection module 54 is configured to perform activation processing on the association degree matrix to obtain a weight activation matrix, enhance the fusion enhancement feature map by using the weight activation matrix to obtain an enhancement feature map, fuse the enhancement feature map with the first convolution feature map to obtain a fusion feature map, and detect the fusion feature map to obtain a detection result.
Optionally, the enhancement module 52 includes:
and the channel characteristic unit is used for respectively carrying out global average pooling and global maximum pooling on each channel characteristic in the second convolution characteristic diagram to obtain a mean pooled characteristic value and a maximum pooled characteristic value corresponding to each channel, and fusing the mean pooled characteristic value and the maximum pooled characteristic value corresponding to each channel to obtain a global fused characteristic vector corresponding to each channel.
And the activating unit is used for carrying out feature extraction on the global fusion feature vector to obtain a feature extraction vector, and carrying out activating operation on the feature extraction vector to obtain a channel weight vector.
The first calculation unit is used for calculating a first enhancement feature map according to the channel weight vector and the second convolution feature map.
Optionally, the enhancement module 52 includes:
the space weight obtaining unit is used for carrying out convolution operation and activation operation on the first enhancement feature map in the space dimension to obtain a space weight matrix;
The first calculation unit is used for calculating a second enhancement feature map according to the space weight matrix and the second convolution feature map.
Optionally, the extracting module 53 includes:
and the first filtering unit is used for filtering the first convolution characteristic map by using a preset high-pass filter to obtain a high-frequency characteristic map of the first convolution characteristic map.
And the second filtering unit is used for filtering the first convolution characteristic map by using a preset low-pass filter to obtain a low-frequency characteristic map of the first convolution characteristic map.
Optionally, the extracting module 53 includes:
the first projection unit is used for carrying out convolution operation on the low-frequency characteristic map to obtain a low-frequency convolution characteristic map, carrying out global average pooling operation on the low-frequency convolution characteristic map to obtain a low-frequency pooling characteristic map, and carrying out projection conversion on the low-frequency pooling characteristic map to obtain a low-frequency projection characteristic map.
The activating unit is used for carrying out projection conversion on the high-frequency characteristic diagram to obtain a high-frequency projection characteristic diagram, carrying out activating operation on the high-frequency projection characteristic diagram, and calculating to obtain an activated high-frequency characteristic matrix.
And the association degree calculating unit is used for multiplying the low-frequency projection feature map with the activated high-frequency feature matrix and calculating the corresponding association degree to obtain an association degree matrix.
Optionally, the detection module 54 includes:
and the second projection unit is used for performing projection conversion on the weight activation matrix to obtain a frequency difference weight matrix.
And the weighting unit is used for weighting and enhancing the fusion enhancement feature map by using the frequency difference weight matrix to obtain the enhancement feature map.
It should be noted that, since the information interaction and execution processes between the above modules are based on the same concept as the method embodiments of the present application, their specific functions and technical effects may be found in the method embodiment section and are not repeated here.
Fig. 6 is a schematic structural diagram of a terminal device according to a fifth embodiment of the present application. As shown in fig. 6, the terminal device of this embodiment includes: at least one processor (only one shown in fig. 6), a memory, and a computer program stored in the memory and executable on the at least one processor, the processor executing the computer program to perform the steps of any of the various spectrum-based quartz glass detection method embodiments described above.
The terminal device may include, but is not limited to, a processor, a memory. It will be appreciated by those skilled in the art that fig. 6 is merely an example of a terminal device and is not limiting of the terminal device, and that the terminal device may comprise more or less components than shown, or may combine some components, or different components, e.g. may further comprise a network interface, a display screen, input means, etc.
The processor may be a CPU, but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory includes a readable storage medium, an internal memory, etc., where the internal memory may be a memory of the terminal device, and the internal memory provides an environment for the operation of an operating system and computer readable instructions in the readable storage medium. The readable storage medium may be a hard disk of the terminal device, and in other embodiments may be an external storage device of the terminal device, for example, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash Card (Flash Card), etc. that are provided on the terminal device. Further, the memory may also include both an internal storage unit of the terminal device and an external storage device. The memory is used to store an operating system, application programs, boot loader (BootLoader), data, and other programs such as program codes of computer programs, and the like. The memory may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of the functional units and modules is illustrated; in practical application, the above functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit; the integrated units may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing them from each other, and are not used to limit the protection scope of the present application. For the specific working process of the units and modules in the above device, reference may be made to the corresponding process in the foregoing method embodiment, which is not described herein again.

The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the method of the above embodiment by instructing the related hardware through a computer program; the computer program may be stored in a computer readable storage medium, and when executed by a processor, may implement the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc.
The computer readable medium may include at least: any entity or device capable of carrying computer program code, a recording medium, a computer Memory, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium. Such as a U-disk, removable hard disk, magnetic or optical disk, etc. In some jurisdictions, computer readable media may not be electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The present application may also be implemented by a computer program product for implementing all or part of the steps of the method embodiments described above, when the computer program product is run on a terminal device, causing the terminal device to execute the steps of the method embodiments described above.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts that are not described or illustrated in detail in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are intended only to illustrate the technical solutions of the present application, not to limit them. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present application, and are intended to be included within the scope of the present application.
Claims (10)
1. A method for detecting quartz glass based on frequency spectrum, characterized in that the method comprises:
acquiring a quartz glass image to be detected, and carrying out convolution feature extraction on the quartz glass image to be detected to obtain a first convolution feature map;
performing convolution feature extraction on the first convolution feature map to obtain a second convolution feature map, performing first downsampling on the second convolution feature map to obtain a first enhancement feature map, performing second downsampling on the second convolution feature map to obtain a second enhancement feature map, and fusing the first enhancement feature map and the second enhancement feature map to obtain a fused enhancement feature map;
extracting a high-frequency characteristic diagram and a low-frequency characteristic diagram in the first convolution characteristic diagram, and calculating the association degree between the high-frequency characteristic diagram and the low-frequency characteristic diagram to obtain an association degree matrix;
and performing activation processing on the association degree matrix to obtain a weight activation matrix, enhancing the fused enhancement feature map using the weight activation matrix to obtain an enhancement feature map, fusing the enhancement feature map with the first convolution feature map to obtain a fusion feature map, and detecting the fusion feature map to obtain a detection result.
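Outside the claim language, the final activation, enhancement, and fusion steps of claim 1 can be sketched in NumPy. The sigmoid activation, the additive (residual-style) fusion, and all tensor shapes are illustrative assumptions; the claim does not fix any of these choices:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical shapes: C channels over an H x W spatial grid.
rng = np.random.default_rng(0)
C, H, W = 4, 8, 8
assoc = rng.standard_normal((H, W))          # association degree matrix
fused_enh = rng.standard_normal((C, H, W))   # fused enhancement feature map
first_conv = rng.standard_normal((C, H, W))  # first convolution feature map

weight_act = sigmoid(assoc)                  # activation processing -> weights in (0, 1)
enhanced = fused_enh * weight_act            # enhancement, broadcast over channels
fusion = enhanced + first_conv               # residual-style fusion (one plausible choice)
```

A downstream detection head (not sketched here) would then operate on `fusion` to produce the detection result.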
2. The method of claim 1, wherein the performing first downsampling on the second convolution feature map to obtain a first enhancement feature map comprises:
respectively carrying out global average pooling and global maximum pooling on each channel characteristic in the second convolution characteristic diagram to obtain an average pooled characteristic value and a maximum pooled characteristic value corresponding to each channel, and fusing the average pooled characteristic value and the maximum pooled characteristic value corresponding to each channel to obtain a global fused characteristic vector corresponding to each channel;
performing feature extraction on the global fusion feature vector to obtain a feature extraction vector, and performing an activation operation on the feature extraction vector to obtain a channel weight vector;
and calculating to obtain a first enhancement feature map according to the channel weight vector and the second convolution feature map.
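The channel-weighting steps of claim 2 resemble a squeeze-and-excitation style channel attention. A minimal NumPy sketch follows; the additive fusion of the pooled values, the single random-weight fully connected layer, and the sigmoid activation are assumptions, not the claimed implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
C, H, W = 4, 8, 8
x = rng.standard_normal((C, H, W))           # second convolution feature map

avg_pool = x.mean(axis=(1, 2))               # global average pooling, one value per channel
max_pool = x.max(axis=(1, 2))                # global max pooling, one value per channel
global_vec = avg_pool + max_pool             # fusion by addition (one plausible choice)

# Feature extraction via a small fully connected layer (random stand-in weights).
W_fc = rng.standard_normal((C, C)) * 0.1
feat = W_fc @ global_vec
channel_weights = sigmoid(feat)              # activation -> channel weight vector

first_enh = x * channel_weights[:, None, None]  # reweight each channel of the feature map
```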
3. The method of claim 1, wherein the performing second downsampling on the second convolution feature map to obtain a second enhancement feature map comprises:
performing a convolution operation and an activation operation on the first enhancement feature map in the spatial dimension to obtain a spatial weight matrix;
and calculating a second enhancement feature map according to the spatial weight matrix and the second convolution feature map.
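Claim 3 describes a spatial-attention style weighting. A minimal sketch, assuming a naive single-kernel convolution over the channel-averaged map and a sigmoid activation (neither choice is specified by the claim):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv2d_same(img, kernel):
    """Naive 'same' 2-D convolution with zero padding (single channel)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

rng = np.random.default_rng(2)
C, H, W = 4, 8, 8
first_enh = rng.standard_normal((C, H, W))   # first enhancement feature map
second_conv = rng.standard_normal((C, H, W)) # second convolution feature map

# Collapse channels, convolve in the spatial dimension, then activate.
kernel = rng.standard_normal((3, 3)) * 0.1   # stand-in for learned convolution weights
spatial = conv2d_same(first_enh.mean(axis=0), kernel)
spatial_weights = sigmoid(spatial)           # spatial weight matrix, H x W

second_enh = second_conv * spatial_weights   # broadcast the weights over all channels
```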
4. The quartz glass detection method of claim 1, wherein the extracting of the high-frequency feature map and the low-frequency feature map from the first convolution feature map comprises:
filtering the first convolution feature map by using a preset high-pass filter to obtain a high-frequency feature map of the first convolution feature map;
and filtering the first convolution feature map by using a preset low-pass filter to obtain a low-frequency feature map of the first convolution feature map.
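Claim 4 leaves the "preset" filters unspecified; a common choice is a box (or Gaussian) kernel for the low-pass branch and a Laplacian kernel for the high-pass branch. A sketch under that assumption, applied to a single channel:

```python
import numpy as np

def conv2d_same(img, kernel):
    """Naive 'same' 2-D convolution with zero padding (single channel)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

# Classic fixed kernels as stand-ins for the preset filters.
low_pass = np.ones((3, 3)) / 9.0                      # 3x3 box blur
high_pass = np.array([[0, -1, 0],
                      [-1, 4, -1],
                      [0, -1, 0]], dtype=float)       # Laplacian

rng = np.random.default_rng(3)
first_conv = rng.standard_normal((8, 8))              # one channel of the first convolution feature map
low_freq = conv2d_same(first_conv, low_pass)          # low-frequency feature map
high_freq = conv2d_same(first_conv, high_pass)        # high-frequency feature map
```

As a sanity check, the Laplacian responds with zero on constant regions (away from the zero-padded border), while the box filter preserves them.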
5. The quartz glass detection method of claim 1, wherein the calculating the degree of association between the high frequency feature map and the low frequency feature map to obtain the degree of association matrix comprises:
performing convolution operation on the low-frequency characteristic map to obtain a low-frequency convolution characteristic map, performing global average pooling operation on the low-frequency convolution characteristic map to obtain a low-frequency pooling characteristic map, and performing projection conversion on the low-frequency pooling characteristic map to obtain a low-frequency projection characteristic map;
performing projection conversion on the high-frequency characteristic map to obtain a high-frequency projection characteristic map, performing activation operation on the high-frequency projection characteristic map, and calculating to obtain an activated high-frequency characteristic matrix;
multiplying the low-frequency projection feature map with the activated high-frequency feature matrix, and calculating the corresponding association degree to obtain an association degree matrix.
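One plausible reading of claim 5 in NumPy, with the 1x1 channel convolution, the two projections, and the activation all replaced by random stand-in weights and a sigmoid (all assumptions; the claim does not fix these operators):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(4)
C, H, W, D = 4, 8, 8, 6
low_freq = rng.standard_normal((C, H, W))    # low-frequency feature map
high_freq = rng.standard_normal((C, H, W))   # high-frequency feature map

# Low-frequency branch: 1x1 convolution across channels, global average pooling,
# then projection conversion to a D-dimensional space.
W_conv = rng.standard_normal((C, C)) * 0.1
low_conv = np.einsum('oc,chw->ohw', W_conv, low_freq)
low_pooled = low_conv.mean(axis=(1, 2))                  # one value per channel
P_low = rng.standard_normal((D, C)) * 0.1
low_proj = P_low @ low_pooled                            # low-frequency projection, D-dim

# High-frequency branch: projection conversion, then activation.
P_high = rng.standard_normal((D, C)) * 0.1
high_proj = np.einsum('dc,chw->dhw', P_high, high_freq)
high_act = sigmoid(high_proj)                            # activated high-frequency feature matrix

# Multiply the branches and reduce over the projection axis -> association degree matrix.
assoc = np.einsum('d,dhw->hw', low_proj, high_act)
```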
6. The quartz glass detection method of claim 1, wherein the enhancing of the fused enhancement feature map using the weight activation matrix to obtain an enhancement feature map comprises:
performing projection conversion on the weight activation matrix to obtain a frequency difference weight matrix;
and performing weighted enhancement on the fused enhancement feature map by using the frequency difference weight matrix to obtain an enhancement feature map.
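A minimal sketch of claim 6, assuming the projection conversion is a learned linear map (a random stand-in here) and the weighted enhancement is elementwise:

```python
import numpy as np

rng = np.random.default_rng(5)
C, H, W = 4, 8, 8
weight_act = rng.uniform(size=(H, W))        # weight activation matrix (values in (0, 1))
fused_enh = rng.standard_normal((C, H, W))   # fused enhancement feature map

# Projection conversion: a learned linear map along the rows (random stand-in weights).
P = np.eye(H) + rng.standard_normal((H, H)) * 0.05
freq_diff = P @ weight_act                   # frequency difference weight matrix, H x W

enhanced = fused_enh * freq_diff             # elementwise weighted enhancement per channel
```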
7. A spectrum-based quartz glass detection apparatus, characterized in that the quartz glass detection apparatus comprises:
the acquisition module is used for acquiring a quartz glass image to be detected, and carrying out convolution feature extraction on the quartz glass image to be detected to obtain a first convolution feature map;
the enhancement module is used for performing convolution feature extraction on the first convolution feature map to obtain a second convolution feature map, performing channel dimension enhancement on the second convolution feature map to obtain a first enhancement feature map, performing spatial dimension enhancement on the first enhancement feature map to obtain a second enhancement feature map, and fusing the first enhancement feature map and the second enhancement feature map to obtain a fused enhancement feature map;
the extraction module is used for extracting a high-frequency characteristic diagram of the high-frequency signal and a low-frequency characteristic diagram of the low-frequency signal in the first convolution characteristic diagram, and calculating the association degree between the high-frequency characteristic diagram and the low-frequency characteristic diagram to obtain an association degree matrix;
the detection module is used for performing activation processing on the association degree matrix to obtain a weight activation matrix, enhancing the fused enhancement feature map using the weight activation matrix to obtain an enhancement feature map, fusing the enhancement feature map with the first convolution feature map to obtain a fusion feature map, and detecting the fusion feature map to obtain a detection result.
8. The quartz glass detection apparatus of claim 7, wherein the reinforcement module comprises:
the channel feature unit is used for respectively carrying out global average pooling and global maximum pooling on each channel feature in the second convolution feature map to obtain an average pooled feature value and a maximum pooled feature value corresponding to each channel, and fusing the average pooled feature value and the maximum pooled feature value corresponding to each channel to obtain a global fused feature vector corresponding to each channel;
the activating unit is used for carrying out feature extraction on the global fusion feature vector to obtain a feature extraction vector, and carrying out activating operation on the feature extraction vector to obtain a channel weight vector;
and the calculating unit is used for calculating to obtain a first enhancement feature map according to the channel weight vector and the second convolution feature map.
9. A terminal device, characterized in that it comprises a processor, a memory and a computer program stored in the memory and executable on the processor, which processor, when executing the computer program, implements the quartz glass detection method according to any of claims 1 to 6.
10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the quartz glass detection method according to any of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310897267.4A CN116645365B (en) | 2023-07-21 | 2023-07-21 | Quartz glass detection method, device, equipment and medium based on frequency spectrum |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116645365A CN116645365A (en) | 2023-08-25 |
CN116645365B true CN116645365B (en) | 2023-11-17 |
Family
ID=87640350
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310897267.4A Active CN116645365B (en) | 2023-07-21 | 2023-07-21 | Quartz glass detection method, device, equipment and medium based on frequency spectrum |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116645365B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117705827B (en) * | 2024-02-06 | 2024-04-12 | 上海强华实业股份有限公司 | Method for optimizing quartz glass defect detection based on multivariable fine burning energy consumption |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113807368A (en) * | 2021-09-24 | 2021-12-17 | 上海大风实验室设备有限公司 | Glass instrument detection method and device based on YOLO algorithm and related equipment |
US11222217B1 (en) * | 2020-08-14 | 2022-01-11 | Tsinghua University | Detection method using fusion network based on attention mechanism, and terminal device |
CN114936993A (en) * | 2022-05-13 | 2022-08-23 | 常熟理工学院 | High-resolution and pixel relation attention-enhancing strong fusion remote sensing image segmentation method |
CN115578590A (en) * | 2022-10-13 | 2023-01-06 | 深圳云天励飞技术股份有限公司 | Image identification method and device based on convolutional neural network model and terminal equipment |
CN115937571A (en) * | 2022-10-24 | 2023-04-07 | 江苏长欣车辆装备有限公司 | Device and method for detecting sphericity of glass for vehicle |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021247034A1 (en) * | 2020-06-05 | 2021-12-09 | Aetherai Ip Holding Llc | Object detection method and convolution neural network for the same |
US20230162480A1 (en) * | 2021-11-24 | 2023-05-25 | GM Global Technology Operations LLC | Frequency-based feature constraint for a neural network |
Non-Patent Citations (2)
Title |
---|
Glass defect detection based on deep learning; Li Jiangang; China Master's Theses Full-text Database; pp. 1-49 *
Research on glass defect feature extraction methods; Li Simin; Modern Industrial Economy and Informationization (No. 86); pp. 70-71 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | PB01 | Publication | |
 | SE01 | Entry into force of request for substantive examination | |
 | GR01 | Patent grant | |