CN114324361B - Metallographic structure degradation rating method and device


Info

Publication number
CN114324361B
Authority
CN
China
Prior art keywords: features, classification, image, metallographic, deep learning
Prior art date
Legal status: Active
Application number
CN202111674757.5A
Other languages
Chinese (zh)
Other versions
CN114324361A (en)
Inventor
卢伟
朱家乐
杨建华
Current Assignee
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date
Filing date
Publication date
Application filed by Dalian University of Technology
Priority to CN202111674757.5A
Publication of CN114324361A
Application granted
Publication of CN114324361B
Active legal-status: Current
Anticipated expiration legal-status


Classifications

    • Y: General tagging of new technological developments; general tagging of cross-sectional technologies spanning over several sections of the IPC; technical subjects covered by former USPC cross-reference art collections [XRACs] and digests
    • Y02: Technologies or applications for mitigation or adaptation against climate change
    • Y02P: Climate change mitigation technologies in the production or processing of goods
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Image Analysis (AREA)

Abstract

The application provides a metallographic structure degradation rating method and device applied to an edge controller. After a metallographic picture shot by a metallographic microscope is obtained, depth features carrying global image information, GLCM features describing texture and principle-based mechanism classification features are extracted from it, and the picture is input into a preset rating model (a combined classification model of an image deep learning model and a neural network classifier) for inference to determine the metallographic structure degradation rating result. The method and device achieve rapid grading of metallographic structure degradation with an on-site metallographic microscope and edge equipment, and improve the grading accuracy.

Description

Metallographic structure degradation rating method and device
Technical Field
The application relates to the technical field of power industry, in particular to a metallographic structure degradation grading method and device.
Background
Carbon steels are widely used in the electric power industry, and No. 20 steel is a principal material for many important parts of power station boilers. No. 20 seamless steel pipe supplied to the GB3087 standard is used to manufacture low- and medium-pressure boiler pipe fittings; No. 20 seamless steel pipe supplied to the GB5310 standard is used to manufacture boiler pipe fittings operating at higher pressures and steam parameters. In general, No. 20 steel is mainly used for boiler heating-surface tubes, steam pipelines and headers with wall temperatures not exceeding 450 °C.
However, the pearlite in the structure of No. 20 steel spheroidizes during long-term service at high temperature, and the degree of pearlite spheroidization has therefore long been used as one of the criteria for evaluating service reliability. At present, grading of No. 20 steel metallographic structure degradation relies mainly on manual classification, which is inefficient; some metallographic pictures are complex and often contain several structures of similar grade, so the classification is influenced by subjective judgment and the accuracy is not high.
Disclosure of Invention
In view of the above problems, the present application provides a method and an apparatus for grading degradation of a metallographic structure, which utilize a metallographic microscope and edge equipment on site to realize rapid grading of degradation of the metallographic structure, and improve accuracy of degradation grading.
In order to achieve the above object, the present application provides the following technical solutions:
a metallographic structure degradation rating method applied to an edge controller, comprising:
acquiring a metallographic picture shot by a metallographic microscope, wherein the metallographic picture comprises depth features of image global information, GLCM features for describing textures and principle-based mechanism classification features;
inputting the metallographic pictures into a preset rating model for reasoning calculation, and determining a metallographic structure degradation rating result, wherein the preset rating model is a combined classification model of an image deep learning model and a neural network classifier.
Further, the method for constructing the preset rating model comprises the following steps:
acquiring a sample data set, wherein the sample data set comprises metallographic pictures meeting preset conditions;
preprocessing the sample data set by utilizing computer vision and an image processing technology to obtain a preprocessed image data set;
dividing the image data set into a training set, a testing set and a verification set according to a preset proportion;
constructing a keras framework, extracting depth features from the training set and the testing set, extracting GLCM texture description features by using a gray level co-occurrence matrix, and extracting image mechanism classification features based on classification mechanisms;
fusing the depth features, the GLCM texture description features and the image mechanism classification features as inputs of a neural network classifier to construct the neural network classifier;
based on the deep learning model, verifying the deep learning model and the neural network classifier according to the verification set, and selecting a deep learning model and neural network classifier combination with accuracy meeting preset conditions on the verification set as the preset rating model.
Further, the extracting depth features includes:
and inputting the training set and the image data set in the test set into an image deep learning network classification model to obtain corresponding depth characteristics.
Further, the extracting GLCM texture descriptive features using the gray level co-occurrence matrix includes:
extracting energy features, contrast characteristics and entropy features from the image data sets in the training set and the test set by using a gray level co-occurrence matrix;
and combining the energy characteristic, the contrast characteristic and the entropy characteristic according to a preset sequence to generate the GLCM texture description characteristic.
Further, extracting an image mechanism classification feature based on the classification mechanism includes:
extracting the image mechanism classification features from the image dataset in the training set and the testing set based on classification mechanisms.
Further, the verifying, based on the deep learning model, of the deep learning model and the neural network classifier according to the verification set, and the selecting of a deep learning model and neural network classifier combination whose accuracy on the verification set meets a preset condition as the preset rating model, includes:
based on the deep learning model, verifying the deep learning model and the neural network classifier according to the verification set to obtain a plurality of combined classification models of the deep learning model and the corresponding neural network classifier;
and for different combined classification models, determining verification sets corresponding to the classification models, classifying by using a classifier, and taking the combined classification model which enables the classification accuracy in the verification sets to meet the preset conditions as the preset rating model.
A metallographic structure degradation rating device applied to an edge controller, the device comprising:
the first processing unit is used for acquiring a metallographic picture shot by a metallographic microscope, wherein the metallographic picture comprises depth features of image global information, GLCM features for describing textures and principle-based mechanism classification features;
the second processing unit is used for inputting the metallographic pictures into a preset rating model for reasoning calculation and determining a metallographic structure degradation rating result, and the preset rating model is a combined classification model of an image deep learning model and a neural network classifier.
Further, the second processing unit is specifically configured to:
acquiring a sample data set, wherein the sample data set comprises metallographic pictures meeting preset conditions;
preprocessing the sample data set by utilizing computer vision and an image processing technology to obtain a preprocessed image data set;
dividing the image data set into a training set, a testing set and a verification set according to a preset proportion;
constructing a keras framework, extracting depth features from the training set and the testing set, extracting GLCM texture description features by using a gray level co-occurrence matrix, and extracting image mechanism classification features based on classification mechanisms;
fusing the depth features, the GLCM texture description features and the image mechanism classification features as inputs of a neural network classifier to construct the neural network classifier;
based on the deep learning model, verifying the deep learning model and the neural network classifier according to the verification set, and selecting a deep learning model and neural network classifier combination with accuracy meeting preset conditions on the verification set as the preset rating model.
A storage medium comprising a stored program, wherein the program, when run, controls a device in which the storage medium is located to perform the metallographic structure degradation rating method as described above.
An electronic device comprising at least one processor, and at least one memory, bus connected to the processor; the processor and the memory complete communication with each other through the bus; the processor is configured to invoke the program instructions in the memory to perform the metallographic structure degradation rating method as described above.
The metallographic structure degradation rating method and device are applied to an edge controller. After a metallographic picture shot by a metallographic microscope is obtained, depth features of global image information, GLCM features describing texture and principle-based mechanism classification features are extracted from it, and the picture is input into a preset rating model (a combined classification model of an image deep learning model and a neural network classifier) for inference to determine the metallographic structure degradation rating result. With the method and device, rapid grading of metallographic structure degradation is achieved using an on-site metallographic microscope and edge equipment, and the grading accuracy is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of a metallographic structure degradation rating system disclosed in an embodiment of the present application;
fig. 2 is a schematic flow chart of a metallographic structure degradation grading method disclosed in an embodiment of the present application;
FIG. 3 is a flowchart illustrating a method for constructing a preset rating model according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a metallographic structure degradation rating device disclosed in an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The application provides a metallographic structure degradation rating method and device, which are applied to the metallographic structure degradation rating system shown in fig. 1. The metallographic structure degradation rating system comprises a metallographic microscope 10, an edge device 20 and a display device 30. Metallographic pictures are shot by the metallographic microscope 10; the edge device 20 then extracts depth features containing global image information, GLCM features describing texture and principle-based mechanism classification features, and uses them to classify the shot metallographic pictures; finally, the classification result is output directly by the edge device 20 through the display device 30.
The aim of the metallographic structure degradation rating method and device provided by the application is to realize rapid grading of metallographic structure degradation with an on-site metallographic microscope and edge equipment, and to improve the grading accuracy.
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
Referring to fig. 2, a schematic flow chart of a metallographic structure degradation grading method is provided in an embodiment of the present application. As shown in fig. 2, an embodiment of the present application provides a metallographic structure degradation grading method, applied to an edge controller, including the following steps:
s201: acquiring a metallographic picture shot by a metallographic microscope, wherein the metallographic picture comprises depth features of image global information, GLCM features for describing textures and principle-based mechanism classification features;
the metallographic structure picture is shot by using a metallographic microscope on site, and the metallographic structure picture shot by the metallographic microscope is obtained by determining the standard resolution, wherein the magnification of the metallographic structure microstructure is 300< a <1000 times. And collecting metallographic structure pictures with the same specification and size obtained by a metallographic microscope under the magnification. In the process of subsequent application, depth features containing image global information, GLCM features describing textures and principle-based mechanism classification features are required to be included in metallographic pictures.
In the embodiment of the application, the edge controller extracts the depth features containing global image information, the GLCM features describing texture and the principle-based mechanism classification features, and classifies the shot metallographic pictures with them. The classification model is loaded on the on-site edge device; after the on-site metallographic microscope shoots a picture, it is transmitted directly to the edge device for inference, and finally the edge device outputs the classification result through the display device.
S202: inputting the metallographic pictures into a preset rating model for reasoning calculation, and determining a metallographic structure degradation rating result, wherein the preset rating model is a combined classification model of an image deep learning model and a neural network classifier.
In this embodiment of the present application, as shown in fig. 3, the method for constructing the preset rating model specifically includes the following steps:
s301: acquiring a sample data set, wherein the sample data set comprises metallographic pictures meeting preset conditions;
the preset condition is that the standard resolution is determined, and the standard resolution a is 300< a <1000 times of microstructure magnification of the metallographic structure. And collecting metallographic structure pictures with the same specification and size obtained by a metallographic microscope under the magnification, and binding the rated grade with the pictures after the rating of the ageing damage by an expert so as to construct a data set V0.
S302: preprocessing the sample data set by utilizing computer vision and an image processing technology to obtain a preprocessed image data set;
In the embodiment of the application, the pictures are preprocessed as follows: the image is converted to grayscale and filtered to weaken the influence of defocus and noise points, and its contrast is then increased, giving a grayscale image P1; the grayscale image P1 is binarized and noise points and isolated points are removed, so that the binarized image P2 retains only the carbide-concentrated part, namely the pearlite part of the original image; P2 is downsampled to compress the picture to a fixed size, giving image P3.
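A minimal sketch of this preprocessing is given below, assuming OpenCV is used; the median filter, Otsu thresholding, morphological opening and output size are illustrative assumptions rather than parameters taken from the patent.

```python
import cv2
import numpy as np

def preprocess(image_bgr, out_size=(224, 224)):
    """Preprocess one metallographic picture into P1 (gray), P2 (binary), P3 (resized binary)."""
    # Grayscale + median filtering to weaken defocus and noise, then contrast enhancement -> P1
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)
    p1 = cv2.equalizeHist(gray)                      # increase contrast

    # Binarize so that only the carbide-concentrated (pearlite) regions remain -> P2
    _, p2 = cv2.threshold(p1, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    p2 = cv2.morphologyEx(p2, cv2.MORPH_OPEN,        # remove isolated noise points
                          cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3)))

    # Downsample to a fixed size -> P3
    p3 = cv2.resize(p2, out_size, interpolation=cv2.INTER_AREA)
    return p1, p2, p3
```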
S303: dividing the image data set into a training set, a testing set and a verification set according to a preset proportion;
It should be noted that, in the embodiment of the present application, after all the images in the data set V0 are processed according to the above three steps, the resulting P3 images form the data set V1, which is divided in a 7:2:1 ratio into a training set TR0, a test set T0 and a validation set C0. The pictures in the training set and the test set are downsampled to the same size, and each picture in the test set is randomly rotated by 90 or 180 degrees and added to the training set, giving the image deep learning training set Tr1 and test set T1.
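A minimal sketch of this split and rotation augmentation, assuming the images are held as NumPy arrays in memory; the random seed and list representation are illustrative assumptions.

```python
import random
import numpy as np

def split_and_augment(images, labels, ratios=(0.7, 0.2, 0.1), seed=0):
    """Split data set V1 into TR0/T0/C0 (7:2:1) and build the augmented sets Tr1/T1."""
    rng = random.Random(seed)
    idx = list(range(len(images)))
    rng.shuffle(idx)
    n_tr = int(ratios[0] * len(idx))
    n_te = int(ratios[1] * len(idx))
    tr, te, va = idx[:n_tr], idx[n_tr:n_tr + n_te], idx[n_tr + n_te:]

    tr1_x = [images[i] for i in tr]
    tr1_y = [labels[i] for i in tr]
    # Rotate each test picture by 90 or 180 degrees and add the copy to the training set,
    # as described in the embodiment above.
    for i in te:
        k = rng.choice([1, 2])                       # 1 -> 90 degrees, 2 -> 180 degrees
        tr1_x.append(np.rot90(images[i], k))
        tr1_y.append(labels[i])

    t1 = ([images[i] for i in te], [labels[i] for i in te])
    c0 = ([images[i] for i in va], [labels[i] for i in va])
    return (tr1_x, tr1_y), t1, c0
```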
S304: constructing a keras framework, extracting depth features from the training set and the testing set, extracting GLCM texture description features by using a gray level co-occurrence matrix, and extracting image mechanism classification features based on classification mechanisms;
in an embodiment of the present application, the extracting depth features includes: and inputting the training set and the image data set in the test set into an image deep learning network classification model to obtain corresponding depth characteristics.
It should be noted that, in the embodiment of the present application, an image deep learning network structure is constructed. The image deep learning model is a convolutional neural network, which uses a 26-layer network architecture to form a deep network, and the network structure is characterized in that:
the deep learning network comprises 21 layers of convolution layers, and the neural network is characterized in that the output of the upper layer is used as the input of the lower layer, namely the kth characteristic diagram of the n-1 layerThe n-th layer input is obtained through the operation of the convolution kernel g and the offset term b>The specific formula is as follows:
to reduce the parameters for ease of computation, the downsampling layer is added after the 2 nd and 15 th convolution layers. The calculation formula is as follows:
where down (·) is a downsampling function, taking the n×n maximum pooling as an example, i.e. taking the maximum in the n×n range to represent the region.
In the embodiment of the application, in order to simplify the model and prevent a sudden increase in the number of fully connected parameters, a global average pooling layer is used in the layer before the fully connected layers to replace part of the fully connected network, reducing the parameter scale of the fully connected layers. Global average pooling computes the average of all pixels of each feature map, so that each feature map corresponds to one numerical value and all feature maps are compressed into one feature vector.
The last two layers of the model are full-connection layers, and vectors are input into the full-connection layers for classification. The calculation formula of the input vector X and the output vector Y of each layer of the full connection layer is as follows:
Y=w*X+b
wherein w and b are respectively the weights and biases of the neural network, obtained through training. The output is a vector e whose dimension equals the number of classification categories. This vector is passed through a Softmax function to give the depth feature, where Softmax is calculated as

Softmax(e_i) = exp(e_i) / Σ_j exp(e_j)
the output can be regarded as the degradation level probability of the image given by the image deep learning network, and can also be regarded as the similarity of the image to each type, namely the depth characteristic.
The image deep learning network is trained with the acquired training set Tr1, and the network is optimized with a cross-entropy loss function and stochastic gradient descent, giving the image deep learning network classification model M0.
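A hedged Keras sketch of a network of this shape follows: 21 convolution layers with max pooling after the 2nd and 15th, global average pooling, and two fully connected layers ending in Softmax (21 + 2 + 1 + 2 = 26 layers), compiled with stochastic gradient descent and a cross-entropy loss. The filter counts, kernel sizes, input shape and number of degradation grades are illustrative assumptions, not values given in the patent.

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_rating_cnn(input_shape=(224, 224, 1), num_classes=5):
    """One way to reach the described 26 layers: 21 conv, 2 pool, GAP, 2 dense (Softmax)."""
    inputs = keras.Input(shape=input_shape)
    x = inputs
    filters = 32
    for i in range(1, 22):                       # 21 convolution layers
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
        if i in (2, 15):                         # downsampling after the 2nd and 15th conv layers
            x = layers.MaxPooling2D(2)(x)
            filters *= 2
    x = layers.GlobalAveragePooling2D()(x)       # replaces part of the fully connected network
    x = layers.Dense(64, activation="relu")(x)   # first fully connected layer
    outputs = layers.Dense(num_classes, activation="softmax")(x)  # second FC layer + Softmax
    model = keras.Model(inputs, outputs)
    model.compile(optimizer=keras.optimizers.SGD(learning_rate=1e-2),
                  loss="sparse_categorical_crossentropy",   # cross-entropy with integer grades
                  metrics=["accuracy"])
    return model
```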
In this embodiment of the present application, the extracting GLCM texture description features using a gray level co-occurrence matrix includes: extracting energy features, contrast characteristics and entropy features from the image data sets in the training set and the test set by using a gray level co-occurrence matrix; and combining the energy characteristic, the contrast characteristic and the entropy characteristic according to a preset sequence to generate the GLCM texture description characteristic.
It should be noted that the picture P3 obtained after preprocessing is input into the image deep learning network classification model M0 to obtain the corresponding classification probability feature A1.
Based on the grayscale image P1 obtained in the preprocessing, a gray-level co-occurrence matrix (Grey-level Co-occurrence Matrix, GLCM) is extracted: take any point (x, y) in the N×N image and another point (x+a, y+b) offset from it, and let the gray values of this point pair be (g1, g2). As the point (x, y) moves over the whole image, different values of (g1, g2) are obtained; if the number of gray levels is k, there are k² possible combinations of (g1, g2). For the whole picture, the number of occurrences of each (g1, g2) value is counted and arranged into a square matrix, which is then normalized by the total number of occurrences of (g1, g2) into the probability of occurrence P(g1, g2); such a square matrix is called the gray-level co-occurrence matrix G. Finally, the GLCM texture description features of the image are calculated from this matrix.
The features extracted in the embodiment of the application comprise Energy, Contrast and Entropy, wherein:
Energy: the energy describes the uniformity of the gray distribution and the coarseness of the texture in the image. If the values of the elements in the gray-level co-occurrence matrix fluctuate little, this index is comparatively small; otherwise it is comparatively large. A large value indicates a coarse texture, a small value a fine texture. It is calculated as:
Energy = Σ_g1 Σ_g2 P(g1, g2)²
Contrast: the larger the contrast, the deeper the texture grooves and the clearer the image; conversely, shallow texture grooves give a blurred image. It is calculated as:
Contrast = Σ_g1 Σ_g2 (g1 − g2)² P(g1, g2)
Entropy: the larger the value, the more dispersed the element values in the matrix. If there is no texture in the image this value is small; if the texture in the image is complex, the value is large. It is calculated as:
Entropy = − Σ_g1 Σ_g2 P(g1, g2) log P(g1, g2)
through the steps, energy, contrast and entropy features are extracted by using the gray level co-occurrence matrix, and finally the features are combined into the GLCM texture feature A2 in sequence.
In this embodiment of the present application, the extracting an image mechanism classification feature based on a classification mechanism includes: extracting the image mechanism classification features from the image dataset in the training set and the testing set based on classification mechanisms.
It should be noted that, in the embodiment of the present application, the mechanism classification features are extracted from the image P2 obtained after preprocessing. For example, the spheroidization rating criterion for pearlite in No. 20 steel used in thermal power plants (Table 1) describes that, as the spheroidization grade deepens, the lamellar carbide gradually decreases until it disappears, the spherical carbide gradually grows, and the carbide in the pearlite disperses. The embodiment of the application therefore uses the separation ratio and the degree of granulation to evaluate the spheroidization grade: in the image processing, the ratio of the carbide area in the pearlite to the blank area represents the separation ratio, and the average area of the carbide particles represents the degree of granulation. Finally, the degree of granulation and the separation ratio are combined into the image mechanism classification feature A3.
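A sketch of the mechanism feature extraction under the assumption that the binarized image P2 marks carbide (pearlite) pixels as non-zero: the separation ratio is taken as carbide area over blank area, and the granulation degree as the average area of connected carbide particles found with OpenCV connected-component analysis.

```python
import cv2
import numpy as np

def mechanism_features(p2):
    """Compute the mechanism classification feature A3 = [separation ratio, granulation degree]
    from the binarized image P2 (carbide/pearlite pixels are non-zero)."""
    carbide_area = int(np.count_nonzero(p2))
    blank_area = p2.size - carbide_area
    separation_ratio = carbide_area / max(blank_area, 1)    # carbide area vs blank area

    # Granulation degree: average area of individual carbide particles
    num, _, stats, _ = cv2.connectedComponentsWithStats((p2 > 0).astype(np.uint8), connectivity=8)
    particle_areas = stats[1:, cv2.CC_STAT_AREA]             # skip label 0 (background)
    granulation = float(particle_areas.mean()) if len(particle_areas) else 0.0

    return np.array([separation_ratio, granulation])
```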
S305: fusing the depth features, the GLCM texture description features and the image mechanism classification features as inputs of a neural network classifier to construct the neural network classifier;
According to the embodiment of the application, after a picture is processed as above, the depth feature A1, the GLCM texture description feature A2 and the image mechanism classification feature A3 are obtained, and these are then combined in order into the fusion feature A4.
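Assuming A1, A2 and A3 are fixed-length vectors, the fusion step reduces to an ordered concatenation, sketched below.

```python
import numpy as np

def fuse_features(a1, a2, a3):
    """Concatenate depth feature A1, GLCM feature A2 and mechanism feature A3 into A4."""
    return np.concatenate([np.ravel(a1), np.ravel(a2), np.ravel(a3)])
```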
S306: based on the deep learning model, verifying the deep learning model and the neural network classifier according to the verification set, and selecting a deep learning model and neural network classifier combination with accuracy meeting preset conditions on the verification set as the preset rating model.
In this embodiment of the present application, based on the deep learning model, according to the verification set, verifying the deep learning model and the neural network classifier, and selecting a combination of the deep learning model and the neural network classifier, where accuracy on the verification set meets a preset condition, as the preset rating model, where the method includes: based on the deep learning model, verifying the deep learning model and the neural network classifier according to the verification set to obtain a plurality of combined classification models of the deep learning model and the corresponding neural network classifier; and for different combined classification models, determining verification sets corresponding to the classification models, classifying by using a classifier, and taking the combined classification model which enables the classification accuracy in the verification sets to meet the preset conditions as the preset rating model.
It should be noted that, in the embodiment of the present application, after all the original metallographic pictures corresponding to the training set TR0, the test set T0 and the verification set C0 are preprocessed, the corresponding fusion features A4 are extracted as described above, forming the training set TR1, the test set T1 and the verification set C2. The fully connected neural network classifier M1 is trained using the obtained training set TR1 and test set T1.
Then, verification is repeated several times to obtain a plurality of combined classification models, each consisting of an image deep learning model and a corresponding neural network classifier. For the different combined classification models, the verification set C2 corresponding to each model combination is obtained through training and testing and is classified with the classifier, and the combination with the highest classification accuracy on the verification set is taken as the image deep learning model M0 and the corresponding fully connected classifier M1.
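A hedged sketch of the fully connected classifier M1 and of the selection of the best (M0, M1) combination on the verification set C2; the hidden-layer size, optimizer settings and candidate-list representation are illustrative assumptions.

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_fused_classifier(fused_dim, num_classes=5):
    """Fully connected classifier M1 that takes the fused feature A4 as input."""
    model = keras.Sequential([
        keras.Input(shape=(fused_dim,)),
        layers.Dense(32, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="sgd", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
    return model

def select_best(candidates):
    """candidates: list of (M0, M1, val_fused_x, val_y); keep the pair with the highest
    classification accuracy on its verification set C2."""
    best, best_acc = None, -1.0
    for m0, m1, vx, vy in candidates:
        _, acc = m1.evaluate(vx, vy, verbose=0)
        if acc > best_acc:
            best, best_acc = (m0, m1), acc
    return best, best_acc
```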
Using the obtained image deep learning model M0 and fully connected classifier M1, the picture to be rated is processed as above to obtain the corresponding fusion feature A4, and A4 is input into the fully connected classifier M1 to obtain the corresponding degradation grade.
Finally, after the model is trained, the trained model is transferred from the server to the site: the deep learning model is converted from the keras model on the server side to an rknn model on the edge side, and the acceleration capability of the edge device environment is used to improve the hardware inference speed. The edge device is connected to the metallographic microscope to obtain images, and the edge controller is connected to a nixie-tube display to output the classification result, forming a complete metallographic structure degradation rating system.
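Combining the sketches above, a hypothetical edge-side inference routine might look as follows; the input normalization and the way the camera image and display are accessed are assumptions, since the patent does not specify the device interfaces.

```python
import numpy as np

def rate_degradation(image_bgr, m0, m1):
    """Edge-side inference: preprocess, extract A1/A2/A3, fuse to A4, classify with M1."""
    p1, p2, p3 = preprocess(image_bgr)
    a1 = m0.predict(p3[np.newaxis, ..., np.newaxis] / 255.0, verbose=0)[0]  # depth feature A1
    a2 = glcm_features(p1)                                                  # GLCM feature A2
    a3 = mechanism_features(p2)                                             # mechanism feature A3
    a4 = fuse_features(a1, a2, a3)                                          # fused feature A4
    probs = m1.predict(a4[np.newaxis, :], verbose=0)[0]
    return int(np.argmax(probs))                                            # degradation grade
```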
According to the embodiment of the application, computer vision and image processing technologies are used to process the No. 20 steel metallographic structure picture: a keras framework is first built and depth features are extracted from the input picture, GLCM texture description features are then extracted with the gray-level co-occurrence matrix, and mechanism classification features are extracted based on the classification mechanism. Finally, the depth features, GLCM texture description features and mechanism features are fused as the input of a neural network classifier, which outputs the rating result. By verifying the deep learning models, the combination of deep learning model and neural network classifier with the highest accuracy on the verification set is found and used as the final model combination. Meanwhile, the classification method is deployed to the inspection site through the edge device, and image data are acquired directly from the on-site optical microscope and computed on the edge device, which greatly improves the accuracy and speed of classification; this data-driven classification method effectively avoids the subjectivity and individual variability of manual classification, reduces the amount of labour and improves working efficiency.
The embodiment of the application provides a metallographic structure degradation rating method applied to an edge controller. After a metallographic picture shot by a metallographic microscope is obtained, depth features of global image information, GLCM features describing texture and principle-based mechanism classification features are extracted from it, and the picture is input into a preset rating model (a combined classification model of an image deep learning model and a neural network classifier) for inference to determine the metallographic structure degradation rating result. The embodiment of the application achieves rapid grading of metallographic structure degradation with the on-site metallographic microscope and edge equipment, and improves the grading accuracy.
Referring to fig. 4, based on a metallographic structure degradation grading method disclosed in the foregoing embodiment, the present embodiment correspondingly discloses a metallographic structure degradation grading device, applied to an edge controller, the device includes:
the first processing unit 401 is configured to obtain a metallographic image captured by a metallographic microscope, where the metallographic image includes depth features of global image information, GLCM features describing textures, and principle-based mechanism classification features;
the second processing unit 402 is configured to input the metallographic image into a preset rating model for performing inference calculation, and determine a metallographic structure degradation rating result, where the preset rating model is a combined classification model of an image deep learning model and a neural network classifier.
Further, the second processing unit 402 is specifically configured to:
acquiring a sample data set, wherein the sample data set comprises metallographic pictures meeting preset conditions;
preprocessing the sample data set by utilizing computer vision and an image processing technology to obtain a preprocessed image data set;
dividing the image data set into a training set, a testing set and a verification set according to a preset proportion;
constructing a keras framework, extracting depth features from the training set and the testing set, extracting GLCM texture description features by using a gray level co-occurrence matrix, and extracting image mechanism classification features based on classification mechanisms;
fusing the depth features, the GLCM texture description features and the image mechanism classification features as inputs of a neural network classifier to construct the neural network classifier;
based on the deep learning model, verifying the deep learning model and the neural network classifier according to the verification set, and selecting a deep learning model and neural network classifier combination with accuracy meeting preset conditions on the verification set as the preset rating model.
The metallographic structure degradation rating device comprises a processor and a memory, wherein the first processing unit, the second processing unit and the like are stored in the memory as program units, and the processor executes the program units stored in the memory to realize corresponding functions.
The processor includes one or more kernels, and the kernels fetch the corresponding program units from the memory. By adjusting the kernel parameters, rapid grading of metallographic structure degradation is achieved with the on-site metallographic microscope and edge equipment, and the grading accuracy is improved.
The embodiment of the application provides a storage medium, on which a program is stored, which when executed by a processor, implements the metallographic structure degradation rating method.
The embodiment of the application provides a processor which is used for running a program, wherein the metallographic structure degradation grading method is executed when the program runs.
An embodiment of the present application provides an electronic device, as shown in fig. 5, where the electronic device 50 includes at least one processor 501, and at least one memory 502 and a bus 503 connected to the processor; wherein, the processor 501 and the memory 502 complete communication with each other through the bus 503; the processor 501 is configured to invoke the program instructions in the memory 502 to execute the metallographic structure degradation rating method described above.
The electronic device herein may be a server, a PC, a PAD, a mobile phone, etc.
The present application also provides a computer program product adapted to perform, when executed on a data processing device, a program initialized with the method steps of:
acquiring a metallographic picture shot by a metallographic microscope, wherein the metallographic picture comprises depth features of image global information, GLCM features for describing textures and principle-based mechanism classification features;
inputting the metallographic pictures into a preset rating model for reasoning calculation, and determining a metallographic structure degradation rating result, wherein the preset rating model is a combined classification model of an image deep learning model and a neural network classifier.
Further, the method for constructing the preset rating model comprises the following steps:
acquiring a sample data set, wherein the sample data set comprises metallographic pictures meeting preset conditions;
preprocessing the sample data set by utilizing computer vision and an image processing technology to obtain a preprocessed image data set;
dividing the image data set into a training set, a testing set and a verification set according to a preset proportion;
constructing a keras framework, extracting depth features from the training set and the testing set, extracting GLCM texture description features by using a gray level co-occurrence matrix, and extracting image mechanism classification features based on classification mechanisms;
fusing the depth features, the GLCM texture description features and the image mechanism classification features as inputs of a neural network classifier to construct the neural network classifier;
based on the deep learning model, verifying the deep learning model and the neural network classifier according to the verification set, and selecting a deep learning model and neural network classifier combination with accuracy meeting preset conditions on the verification set as the preset rating model.
Further, the extracting depth features includes:
and inputting the training set and the image data set in the test set into an image deep learning network classification model to obtain corresponding depth characteristics.
Further, the extracting GLCM texture descriptive features using the gray level co-occurrence matrix includes:
extracting energy features, contrast characteristics and entropy features from the image data sets in the training set and the test set by using a gray level co-occurrence matrix;
and combining the energy characteristic, the contrast characteristic and the entropy characteristic according to a preset sequence to generate the GLCM texture description characteristic.
Further, extracting an image mechanism classification feature based on the classification mechanism includes:
extracting the image mechanism classification features from the image dataset in the training set and the testing set based on classification mechanisms.
Further, the verifying, based on the deep learning model, of the deep learning model and the neural network classifier according to the verification set, and the selecting of a deep learning model and neural network classifier combination whose accuracy on the verification set meets a preset condition as the preset rating model, includes:
based on the deep learning model, verifying the deep learning model and the neural network classifier according to the verification set to obtain a plurality of combined classification models of the deep learning model and the corresponding neural network classifier;
and for different combined classification models, determining verification sets corresponding to the classification models, classifying by using a classifier, and taking the combined classification model which enables the classification accuracy in the verification sets to meet the preset conditions as the preset rating model.
The present application is described in terms of methods, apparatus (systems), computer program products, flowcharts, and/or block diagrams in accordance with embodiments of the present application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, the device includes one or more processors (CPUs), memory, and a bus. The device may also include input/output interfaces, network interfaces, and the like.
The memory may include volatile memory such as random access memory (RAM), and/or non-volatile memory such as read-only memory (ROM) or flash memory (flash RAM), among other forms of computer readable media; the memory includes at least one memory chip. Memory is an example of a computer-readable medium.
Computer readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of storage media for a computer include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium which can be used to store information that can be accessed by a computing device. Computer-readable media, as defined herein, do not include transitory computer-readable media, such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article or apparatus that comprises an element.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and changes may be made to the present application by those skilled in the art. Any modifications, equivalent substitutions, improvements, etc. which are within the spirit and principles of the present application are intended to be included within the scope of the claims of the present application.

Claims (6)

1. A metallographic structure degradation rating method, characterized by being applied to an edge controller, comprising:
acquiring a metallographic picture shot by a metallographic microscope, wherein the metallographic picture comprises depth features of image global information, GLCM features for describing textures and principle-based mechanism classification features;
inputting the metallographic pictures into a preset rating model for reasoning calculation, determining a metallographic structure degradation rating result, wherein the preset rating model is a combined classification model of an image deep learning model and a neural network classifier,
the construction method of the preset rating model comprises the following steps:
acquiring a sample data set, wherein the sample data set comprises metallographic pictures meeting preset conditions;
preprocessing the sample data set by utilizing computer vision and an image processing technology to obtain a preprocessed image data set;
dividing the image data set into a training set, a testing set and a verification set according to a preset proportion;
constructing a keras framework, extracting depth features from the training set and the testing set, extracting GLCM texture description features by using a gray level co-occurrence matrix, and extracting image mechanism classification features based on classification mechanisms;
fusing the depth features, the GLCM texture description features and the image mechanism classification features as inputs of a neural network classifier to construct the neural network classifier;
based on a deep learning model, verifying the deep learning model and the neural network classifier according to the verification set, and selecting a deep learning model and neural network classifier combination with accuracy meeting preset conditions on the verification set as the preset rating model;
wherein extracting image mechanism classification features based on the classification mechanism comprises:
extracting the image mechanism classification features from the image dataset in the training set and the testing set based on classification mechanisms;
the deep learning model is based on the deep learning model, the deep learning model and the neural network classifier are verified according to the verification set, and a combination of the deep learning model and the neural network classifier, on which the accuracy meets the preset condition, on the verification set is selected as the preset rating model, and the method comprises the following steps:
based on the deep learning model, verifying the deep learning model and the neural network classifier according to the verification set to obtain a plurality of combined classification models of the deep learning model and the corresponding neural network classifier;
and for different combined classification models, determining verification sets corresponding to the classification models, classifying by using a classifier, and taking the combined classification model which enables the classification accuracy in the verification sets to meet the preset conditions as the preset rating model.
2. The method of claim 1, wherein the extracting depth features comprises:
and inputting the training set and the image data set in the test set into an image deep learning network classification model to obtain corresponding depth characteristics.
3. The method of claim 1, wherein extracting GLCM texture descriptive features using a gray level co-occurrence matrix comprises:
extracting energy features, contrast features and entropy features from the image dataset in the training set and the test set using a gray scale co-occurrence matrix;
and combining the energy characteristic, the contrast characteristic and the entropy characteristic according to a preset sequence to generate the GLCM texture description characteristic.
4. A metallographic structure degradation rating apparatus, characterized in that it is applied to an edge controller, the apparatus comprising:
the first processing unit is used for acquiring a metallographic picture shot by a metallographic microscope, wherein the metallographic picture comprises depth features of image global information, GLCM features for describing textures and principle-based mechanism classification features;
the second processing unit is used for inputting the metallographic pictures into a preset rating model for reasoning calculation and determining a metallographic structure degradation rating result, wherein the preset rating model is a combined classification model of an image deep learning model and a neural network classifier;
wherein the second processing unit is specifically configured to:
acquiring a sample data set, wherein the sample data set comprises metallographic pictures meeting preset conditions;
preprocessing the sample data set by utilizing computer vision and an image processing technology to obtain a preprocessed image data set;
dividing the image data set into a training set, a testing set and a verification set according to a preset proportion;
constructing a keras framework, extracting depth features from the training set and the testing set, extracting GLCM texture description features by using a gray level co-occurrence matrix, and extracting image mechanism classification features based on classification mechanisms;
fusing the depth features, the GLCM texture description features and the image mechanism classification features as inputs of a neural network classifier to construct the neural network classifier;
based on the deep learning model, verifying the deep learning model and the neural network classifier according to the verification set to obtain a plurality of combined classification models of the deep learning model and the corresponding neural network classifier;
for different combined classification models, determining verification sets corresponding to the classification models, classifying by using a classifier, and taking the combined classification model which enables the classification accuracy in the verification sets to meet a preset condition as the preset rating model;
extracting the image mechanism classification features from the image dataset in the training set and the testing set based on classification mechanisms.
5. A storage medium comprising a stored program, wherein the program, when run, controls a device in which the storage medium is located to perform the metallographic structure degradation rating method according to any one of claims 1 to 3.
6. An electronic device comprising at least one processor, and at least one memory, bus coupled to the processor; the processor and the memory complete communication with each other through the bus; the processor is configured to invoke program instructions in the memory to perform the metallographic structure degradation rating method of any one of claims 1 to 3.
CN202111674757.5A 2021-12-31 2021-12-31 Metallographic structure degradation rating method and device Active CN114324361B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111674757.5A CN114324361B (en) 2021-12-31 2021-12-31 Metallographic structure degradation rating method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111674757.5A CN114324361B (en) 2021-12-31 2021-12-31 Metallographic structure degradation rating method and device

Publications (2)

Publication Number Publication Date
CN114324361A CN114324361A (en) 2022-04-12
CN114324361B true CN114324361B (en) 2024-03-15

Family

ID=81021080

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111674757.5A Active CN114324361B (en) 2021-12-31 2021-12-31 Metallographic structure degradation rating method and device

Country Status (1)

Country Link
CN (1) CN114324361B (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019114147A1 (en) * 2017-12-15 2019-06-20 华为技术有限公司 Image aesthetic quality processing method and electronic device

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017113232A1 (en) * 2015-12-30 2017-07-06 中国科学院深圳先进技术研究院 Product classification method and apparatus based on deep learning
CN108647718A (en) * 2018-05-10 2018-10-12 江苏大学 A kind of different materials metallographic structure is classified the method for grading automatically
CN108921058A (en) * 2018-06-19 2018-11-30 厦门大学 Fish identification method, medium, terminal device and device based on deep learning
CN109034217A (en) * 2018-07-10 2018-12-18 成都先进金属材料产业技术研究院有限公司 Grain size intelligence ranking method based on image recognition depth learning technology
CN109886329A (en) * 2019-02-18 2019-06-14 中国铁建重工集团有限公司 Rock crusher level detection method, detection system and heading equipment
CN110619355A (en) * 2019-08-28 2019-12-27 武汉科技大学 Automatic steel material microstructure identification method based on deep learning
CN111008650A (en) * 2019-11-13 2020-04-14 江苏大学 Metallographic structure automatic rating method based on deep convolution countermeasure neural network
CN112001446A (en) * 2020-08-25 2020-11-27 中国特种设备检测研究院 Method and device for determining aging grade of high-chromium martensite heat-resistant steel structure
CN112132086A (en) * 2020-09-29 2020-12-25 中国特种设备检测研究院 Multi-scale martensite microstructure aging and damage grading method
CN112884001A (en) * 2021-01-15 2021-06-01 广东省特种设备检测研究院珠海检测院 Carbon steel graphitization automatic grading method and system
CN113255814A (en) * 2021-06-09 2021-08-13 大连理工大学 Edge calculation-oriented image classification method based on feature selection

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Fabric weave structure classification based on fusion of LBP and GLCM; 景军锋; 邓淇英; 李鹏飞; 张蕾; 张宏伟; Journal of Electronic Measurement and Instrumentation (09); full text *
Research on quality detection of vermicelli based on microstructure features; 郭明恩; 孙祖莉; 王洪来; 欧宗瑛; Chinese Journal of Scientific Instrument (12); full text *
Objective evaluation of fabric pilling based on a SOM network; 康雪娟; 张赞赞; Computer and Modernization (06); full text *
Automatic metallographic rating method based on texture features and ensemble learning; 许帧英; 张琦; 朱建栋; Information Technology (No. 09); pp. 71-74, fig. 1 *
Identification of meat floss and meat-powder floss based on computer vision; 胡孟晗; 董庆利; 刘阳泰; 刘宝林; 王芳芳; Food and Fermentation Industries (No. 04); full text *
Automatic evaluation of fabric seam pucker using image processing; 张宁; 潘如如; 高卫东; Journal of Textile Research (04); full text *

Also Published As

Publication number Publication date
CN114324361A (en) 2022-04-12

Similar Documents

Publication Publication Date Title
CN111723732B (en) Optical remote sensing image change detection method, storage medium and computing equipment
CN108230278B (en) Image raindrop removing method based on generation countermeasure network
CN105657402A (en) Depth map recovery method
CN109657600B (en) Video area removal tampering detection method and device
CN110895814B (en) Aero-engine hole-finding image damage segmentation method based on context coding network
CN107784288A (en) A kind of iteration positioning formula method for detecting human face based on deep neural network
CN112258470B (en) Intelligent industrial image critical compression rate analysis system and method based on defect detection
CN111798469A (en) Digital image small data set semantic segmentation method based on deep convolutional neural network
Dotel et al. Disaster assessment from satellite imagery by analysing topographical features using deep learning
CN110751195A (en) Fine-grained image classification method based on improved YOLOv3
CN110555461A (en) scene classification method and system based on multi-structure convolutional neural network feature fusion
CN106600613A (en) Embedded GPU-based improved LBP infrared target detection method
CN112132086A (en) Multi-scale martensite microstructure aging and damage grading method
CN115861210A (en) Transformer substation equipment abnormity detection method and system based on twin network
CN112802048B (en) Method and device for generating layer generation countermeasure network with asymmetric structure
CN113723553A (en) Contraband detection method based on selective intensive attention
CN114324361B (en) Metallographic structure degradation rating method and device
CN114419078B (en) Surface defect region segmentation method and device based on convolutional neural network
CN117315387A (en) Industrial defect image generation method
CN113344110B (en) Fuzzy image classification method based on super-resolution reconstruction
CN113034432B (en) Product defect detection method, system, device and storage medium
CN115273089A (en) Optical character restoration method based on condition generation countermeasure network
CN114463300A (en) Steel surface defect detection method, electronic device, and storage medium
Liu et al. Surface defect detection method of hot-rolled steel strip based on improved SSD model
CN112862655A (en) JPEG image steganalysis method based on channel space attention mechanism

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant