CN114565193A - Coal spontaneous combustion tendency prediction method based on machine learning - Google Patents

Coal spontaneous combustion tendency prediction method based on machine learning Download PDF

Info

Publication number
CN114565193A
Authority
CN
China
Prior art keywords
spontaneous combustion
prediction model
coal
point temperature
cross point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210399315.2A
Other languages
Chinese (zh)
Inventor
宋泽阳
张利冬
赵珊珊
王瑶涵
张浩
惠绍棠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian University of Science and Technology
Original Assignee
Xian University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian University of Science and Technology filed Critical Xian University of Science and Technology
Priority to CN202210399315.2A
Publication of CN114565193A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/18 Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/06 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/061 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using biological neurons, e.g. biological neurons connected to an integrated circuit
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/084 Backpropagation, e.g. using gradient descent
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y04 INFORMATION OR COMMUNICATION TECHNOLOGIES HAVING AN IMPACT ON OTHER TECHNOLOGY AREAS
    • Y04S SYSTEMS INTEGRATING TECHNOLOGIES RELATED TO POWER NETWORK OPERATION, COMMUNICATION OR INFORMATION TECHNOLOGIES FOR IMPROVING THE ELECTRICAL POWER GENERATION, TRANSMISSION, DISTRIBUTION, MANAGEMENT OR USAGE, i.e. SMART GRIDS
    • Y04S10/00 Systems supporting electrical power generation, transmission or distribution
    • Y04S10/50 Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computing Systems (AREA)
  • Molecular Biology (AREA)
  • Business, Economics & Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Strategic Management (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Neurology (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Operations Research (AREA)
  • Human Resources & Organizations (AREA)
  • Pure & Applied Mathematics (AREA)
  • Economics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Probability & Statistics with Applications (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Development Economics (AREA)
  • Algebra (AREA)
  • Game Theory and Decision Science (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Medical Informatics (AREA)
  • Quality & Reliability (AREA)

Abstract

The embodiment of the invention discloses a coal spontaneous combustion tendency prediction method based on machine learning. The method comprises the following steps: acquiring a sample set; dividing the sample set into a training set, a verification set and a test set; constructing a coal spontaneous combustion cross point temperature regression prediction model by using a machine learning method (a multilayer perceptron (MLP) or a Random Forest (RF)) based on the training set; obtaining a learning curve of the model based on the training set and the verification set, and evaluating the state of the model; and, based on the test set, obtaining the coal spontaneous combustion cross point temperature by using the model and predicting the risk level of coal spontaneous combustion tendency. The method expresses the influence of complex physicochemical processes and external factors through a machine learning model, constructs a regression prediction model relating the cross point temperature to the inherent attributes of the coal and to external factors, and improves the fitting and generalization capability of the model by adopting K-fold cross-validation and a grid search method, thereby predicting the coal spontaneous combustion cross point temperature with wide applicability and accurate prediction results.

Description

Coal spontaneous combustion tendency prediction method based on machine learning
Technical Field
The invention relates to the technical field of energy, in particular to a coal spontaneous combustion tendency prediction method based on machine learning.
Background
Coal is the fossil fuel with the most abundant reserves and the widest geographical distribution on Earth, and is one of the indispensable energy sources for human production and life. Spontaneous combustion of coal is a very serious natural disaster that severely affects coal production and causes great economic and resource losses. Coal spontaneous combustion also emits harmful particles and toxic gases, pollutes the environment and threatens people's health, so predicting the coal spontaneous combustion tendency is of great significance for preventing the harm caused by coal spontaneous combustion. However, since the intrinsic processes involving mass and heat transfer and low-temperature oxidation reactions are very complex, and various external factors affect coal spontaneous combustion differently, it is very challenging to predict under what conditions coal spontaneous combustion will occur and what harmful substances will be released.
At present, the spontaneous combustion problem is usually studied through laboratory experiments, large-scale experiments, numerical simulation and the like. For example, numerical models coupling heat and mass transfer equations with oxygen transport have been developed to study the coal spontaneous combustion process, and such numerical calculation methods are easier to interpret than the black-box operation of machine learning. However, existing numerical calculation methods consider only a few factors; although some numerical models take coal-related factors into account, they mainly focus on the kinetic parameters of coal spontaneous combustion, while other factors (such as volatile matter content, sulfur content, particle size and the like) are not considered, that is, it is difficult for numerical calculation models to consider the influence of all these factors at the same time. Moreover, some models are limited by very small sample data from a single coal mine, and the prediction accuracy for the spontaneous combustion tendency of different coal types is not considered.
In some instances, only a single factor affecting coal spontaneous combustion is considered: for example, more reactive coal is more prone to fire; but other factors may also have an impact on coal spontaneous combustion (including negative and positive effects): for example, moisture content has both negative effects on the risk of spontaneous combustion (reduction of reaction area, oxidation rate and temperature through desorption and evaporation) and positive effects (release of heat through adsorption, condensation and wetting). Therefore, a method for predicting the coal spontaneous combustion tendency that covers different coal qualities, different coal mines and multiple factors is needed.
Disclosure of Invention
In order to solve the above technical problems, embodiments of the present invention are expected to provide a coal spontaneous combustion tendency prediction method based on machine learning, which can express the influence of complex physicochemical processes and external factors through a machine learning model, construct a regression prediction model relating the cross point temperature to the inherent attributes of the coal and to external factors, and improve the generalization ability of the model by using a feature engineering method, thereby predicting the coal spontaneous combustion cross point temperature with wide applicability and accurate prediction results.
The technical scheme of the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a coal spontaneous combustion tendency prediction method based on machine learning, including:
acquiring a sample set, wherein the sample set comprises a plurality of sample data, and each sample data at least comprises characteristic information of coal and meteorological condition information;
dividing a sample set into a training set, a verification set and a test set;
based on a training set, constructing a coal spontaneous combustion cross point temperature regression prediction model by using a machine learning method, wherein the coal spontaneous combustion cross point temperature regression prediction model is a multilayer perceptron (MLP) prediction model or a Random Forest (RF) prediction model;
obtaining a learning curve of the coal spontaneous combustion cross point temperature regression prediction model based on the training set and the verification set, and adjusting parameters of the coal spontaneous combustion cross point temperature regression prediction model;
and based on the test set, obtaining the coal spontaneous combustion cross point temperature by using a coal spontaneous combustion cross point temperature regression prediction model, and predicting the risk level of coal spontaneous combustion tendency according to the coal spontaneous combustion cross point temperature.
Optionally, the characteristic information of the coal includes moisture, ash, volatile matter, fixed carbon, carbon element, oxygen element, nitrogen element, hydrogen element, sulfur element and particle size; the meteorological condition information includes oxygen concentration and ambient temperature.
Optionally, if the coal spontaneous combustion cross point temperature regression prediction model is a multilayer perceptron (MLP) prediction model, based on the training set, the machine learning method is used to construct the coal spontaneous combustion cross point temperature regression prediction model, including:
the method comprises the following steps: construction of a Square loss function
Figure BDA0003598984290000031
Wherein L (k) is a loss function of a multi-layer perceptron (MLP) prediction model,
Figure BDA0003598984290000032
for the ith output of the neural network,
Figure BDA0003598984290000033
is the ith real value of the neural network;
step two: constructing a cost function
Figure BDA0003598984290000034
Wherein J (w) is a cost function of a multilayer perceptron (MLP) prediction model, n is sample capacity, lambda is a regularization term coefficient, L is the total number of layers of the neural network, SLThe number of the L-th layer units is;
step three: construction of modified Linear Unit (ReLU) function
Figure BDA0003598984290000035
Wherein f (x) is a ReLU function, and x is an input value of a hidden layer neuron;
step four: constructing an identity function f (x) x;
wherein, f (x) is an identity function, and x is an input value of an output layer neuron;
step five: forward propagation and backward propagation algorithms
Assuming that the number of input layer nodes in a multilayer perceptron (MLP) prediction model is 12, the number of hidden layer nodes is 50, and the number of output layer nodes is 1, wherein the threshold value of the jth neuron of the output layer is thetajThe threshold of h node of the hidden layer is represented by gammahThe connection weight between the ith node of the input layer and the h neuron of the hidden layer is represented as vihThe weight of the connection between the h-th neuron of the hidden layer and the j-th neuron of the output layer is whj
The forward propagation process for the multilayer perceptron (MLP) prediction model is as follows:
input of h node of hidden layer:
Figure BDA0003598984290000036
input of jth node of output layer:
Figure BDA0003598984290000037
wherein, bhFor the output of the h-th node of the hidden layer, the activation functions of the hidden layer and the output layer are respectively a ReLu function and an identity function, so that the output of the multilayer perceptron (MLP) prediction model is as follows:
Figure BDA0003598984290000038
the BP algorithm is based on a gradient descent strategy, parameters are adjusted in the negative gradient direction of a target, for an error L (k), a learning rate eta is given, and for a process of back propagation of a multilayer perceptron (MLP) prediction model, the method comprises the following steps:
the process of transferring the output layer to the hidden layer is as follows:
Figure BDA0003598984290000041
for the identity function: f' (x) ═ 1;
from the above formula one can obtain:
Figure BDA0003598984290000042
similarly, the transmission process from the hidden layer to the input layer is as follows: for the ReLU function:
Figure BDA0003598984290000043
Figure BDA0003598984290000044
step six: hyper-parametric regulation
The selection of the number of the hidden layer units is judged by Mean Square Error (MSE), the number of the best hidden layer units is determined to be 50 units according to the MSE result, and the MSE formula is as follows:
Figure BDA0003598984290000045
where MSE is the mean square error, m is the sample capacity,
Figure BDA0003598984290000046
in order to output the predicted value of the target,
Figure BDA0003598984290000047
to output the target experimental values.
Optionally, if the coal spontaneous combustion cross point temperature regression prediction model is a Random Forest (RF) prediction model, based on a training set, the coal spontaneous combustion cross point temperature regression prediction model is constructed by using a machine learning method, including:
the method comprises the following steps: bagging (parallel integration)
Extracting a training set from an original sample set, and extracting n training sets from the original sample set by using a Bootstrap method in each roundTraining samples, and performing m rounds of extraction to obtain m training sets which are independent; using a training set to correspondingly obtain a decision tree model each time, wherein m training sets obtain m decision tree models; calculating the mean of the m decision tree models as the final output result of the Random Forest (RF) prediction model:
Figure BDA0003598984290000051
wherein f (x) is a predicted value of the coal spontaneous combustion cross point temperature, M is the number of trees of a Random Forest (RF) prediction model, fm(x) The predicted result of the mth decision tree;
step two: out-of-bag data errors
Approximately one third of the data in the sample set is not sampled, referred to as out-of-bag data (OOB) may be used to detect the generalization capability of the model; the remaining data is called in-bag data, and the smaller the value of the out-of-bag error, the stronger the generalization ability of the model:
Figure BDA0003598984290000052
therein, MSEOOBIt is the out-of-bag data error n that is the sample size of the data set,
Figure BDA0003598984290000053
is an experimental value of the data outside the bag,
Figure BDA0003598984290000054
is the predicted value of the data outside the bag;
step three: hyper-parametric regulation
A Random Forest (RF) model based on a regression tree is constructed by utilizing a Python tool kit, a Bagging algorithm of the model selects parallel integration, and the optimal tree is determined by utilizing data outside a bag.
Optionally, a learning curve of the coal spontaneous combustion cross point temperature regression prediction model is obtained based on the training set and the verification set, and parameters of the coal spontaneous combustion cross point temperature regression prediction model are adjusted, including:
calculating the average absolute error, the root mean square error, the average absolute percentage error and the decision coefficient of the coal spontaneous combustion cross point temperature regression prediction model according to the training set and the verification set;
and evaluating the coal spontaneous combustion cross point temperature regression prediction model according to the average absolute error, the root-mean-square error, the average absolute percentage error and the decision coefficient of the coal spontaneous combustion cross point temperature regression prediction model.
Optionally, the mean absolute error is
MAE = \frac{1}{n}\sum_{i=1}^{n}\left|y_i - \hat{y}_i\right|
the root mean square error is
RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}
the mean absolute percentage error is
MAPE = \frac{100\%}{n}\sum_{i=1}^{n}\left|\frac{y_i - \hat{y}_i}{y_i}\right|
and the coefficient of determination is
R^2 = 1 - \frac{\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}{\sum_{i=1}^{n}\left(y_i - \bar{y}\right)^2}
where n is the number of sample data, y_i is the experimental value of the output target, \hat{y}_i is the predicted value of the output target, and \bar{y} is the average of the true values.
Optionally, predicting a coal spontaneous combustion tendency risk level according to the coal spontaneous combustion cross point temperature, comprising:
if the temperature of the coal spontaneous combustion cross point is less than 140 ℃, predicting the risk level of coal spontaneous combustion tendency to be high;
if the coal spontaneous combustion cross point temperature is greater than or equal to 140 ℃ and less than or equal to 160 ℃, predicting the coal spontaneous combustion tendency risk level to be middle;
and if the coal spontaneous combustion cross point temperature is more than 160 ℃, predicting the risk level of coal spontaneous combustion tendency to be low.
The embodiment of the invention provides a coal spontaneous combustion tendency prediction method based on machine learning. Since machine learning models (a multilayer perceptron (MLP) and a Random Forest (RF)) are used to represent the effects of complex physicochemical processes and external factors, the invention constructs, on the basis of these machine learning models, a regression prediction model relating the cross point temperature (CPT) to 13 input features concerning the intrinsic properties of coal and external factors, and screens the input features of the model using a feature engineering method; meanwhile, K-fold cross-validation is adopted to reduce the probability of over-fitting or under-fitting, and the hyper-parameters are adjusted by a grid search method to improve the generalization capability of the model. Therefore, the method is beneficial to quickly predicting the spontaneous combustion tendency of coal of different coal types under different environmental conditions, and has wide applicability and accurate prediction results.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic flow chart of a method for predicting coal spontaneous combustion tendency based on machine learning according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a multi-layered perceptron (MLP) prediction model according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a Random Forest (RF) prediction model according to a second embodiment of the present invention;
fig. 4 is a schematic diagram of a prediction result of the crossover point temperature obtained by using 204 sets of samples according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the following embodiments of the present invention may be implemented individually, or may be implemented in combination with each other, and the embodiments of the present invention are not limited in this respect.
Fig. 1 is a schematic flow chart of a method for predicting coal spontaneous combustion tendency based on machine learning according to an embodiment of the present invention, which may be implemented by a prediction apparatus, which may be implemented in the form of hardware and/or software, and may be configured in an electronic device (e.g., a personal computer, a tablet computer, etc.). As shown in fig. 1, the method comprises the steps of:
s110, obtaining a sample set, wherein the sample set comprises a plurality of sample data, and each sample data at least comprises characteristic information of coal and meteorological condition information.
There are many factors that affect coal spontaneous combustion. In the present invention, these factors are classified into at least two major groups: characteristic information of coal and meteorological condition information. Optionally, each sample data may further include characteristic information of the coal pile.
Specifically, the characteristic information of the coal includes moisture, ash, volatile matter, fixed carbon, carbon, oxygen, nitrogen, hydrogen, sulfur, and particle size; the meteorological condition information includes oxygen concentration and ambient temperature.
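As an illustrative sketch only (the column names and numeric values below are hypothetical placeholders, not data from the disclosure), a single sample record combining the characteristic information listed above with the cross point temperature (CPT) regression target could be organized as follows:

```python
import pandas as pd

# Hypothetical column names for the 12 input features plus the CPT target.
# The values are placeholders, not measured data.
sample = pd.DataFrame([{
    "moisture_pct": 6.2,               # proximate analysis
    "ash_pct": 11.5,
    "volatile_matter_pct": 32.8,
    "fixed_carbon_pct": 49.5,
    "carbon_pct": 72.1,                # ultimate analysis
    "oxygen_pct": 9.8,
    "nitrogen_pct": 1.2,
    "hydrogen_pct": 4.6,
    "sulfur_pct": 0.8,
    "particle_size_mm": 0.18,
    "oxygen_concentration_pct": 21.0,  # meteorological conditions
    "ambient_temperature_c": 30.0,
    "cpt_c": 152.0,                    # regression target: cross point temperature
}])
print(sample.T)
```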
In general, the number of sample data included in one sample set may vary from several tens to several thousands.
And S120, dividing the sample set into a training set, a verification set and a test set.
The training set is mainly used for debugging the model; the verification set is mainly used for checking the training effect; the test set is mainly used for testing the actual learning ability of the model.
For small-scale sample sets, a common split is 60% training set, 20% verification set and 20% test set; for large-scale sample sets, it is sufficient to ensure that the verification set and the test set contain enough samples. In one embodiment, the fewer the hyper-parameters, or the easier they are to adjust, the smaller the proportion assigned to the verification set and the more assigned to the training set.
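A minimal sketch of the 60/20/20 split described above, assuming the sample set is held in a pandas DataFrame named df with a cpt_c target column (both names are hypothetical, carried over from the earlier sketch):

```python
from sklearn.model_selection import train_test_split

# X: the 12 input features, y: the cross point temperature (regression target)
X = df.drop(columns=["cpt_c"])
y = df["cpt_c"]

# First split off the 20% test set, then split the remainder 75/25
# so the final proportions are 60% train / 20% verification / 20% test.
X_trainval, X_test, y_trainval, y_test = train_test_split(
    X, y, test_size=0.20, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(
    X_trainval, y_trainval, test_size=0.25, random_state=42)

print(len(X_train), len(X_val), len(X_test))
```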
S130, building a coal spontaneous combustion cross point temperature regression prediction model by using a machine learning method based on the training set, wherein the coal spontaneous combustion cross point temperature regression prediction model is a multilayer perceptron (MLP) prediction model or a Random Forest (RF) prediction model.
For step S130, the present invention provides two exemplary schemes, which respectively illustrate a method for constructing a multi-layer perceptron (MLP) prediction model and a method for constructing a Random Forest (RF) prediction model.
Example one, a method of constructing a multi-level perceptron (MLP) prediction model.
The multilayer perceptron (MLP) prediction model, also known as an artificial neural network (ANN), is a neural network structure composed of an input layer, one or more hidden layers, and an output layer. Each layer of the multilayer perceptron (MLP) prediction model is fully connected to the next layer, and except for the input nodes, each node is a neuron (or processing unit) with a nonlinear activation function. In the present invention, the multilayer perceptron (MLP) prediction model is trained using the back propagation algorithm (BP algorithm). The BP algorithm uses the output error to adjust and modify the connection weights and thresholds of the network, and has strong nonlinear mapping capability, self-learning and self-adaptive capability, generalization capability and fault-tolerance capability. Fig. 2 is a schematic structural diagram of a multilayer perceptron (MLP) prediction model according to an embodiment of the present invention. As shown in fig. 2, the training method of the three-layer multilayer perceptron (MLP) prediction model is as follows:
1) Square loss function
The square loss function measures the degree of inconsistency between the predicted value and the true value of the model. It is a non-negative real-valued function, and the smaller the loss function, the better the robustness of the model.
L(k) = \frac{1}{2}\sum_{i}\left(\hat{y}_i^{(k)} - y_i^{(k)}\right)^2
wherein L(k) is the loss function of the multilayer perceptron (MLP) prediction model, \hat{y}_i^{(k)} is the ith output of the neural network, and y_i^{(k)} is the ith real value of the neural network.
2) Cost function
In machine learning, the cost function acts on the entire training data set; it is the average error over the entire sample data set, i.e., the average of all the loss function values.
J(w) = \frac{1}{n}\sum_{k=1}^{n} L(k) + \frac{\lambda}{2n}\sum_{l=1}^{L-1}\sum_{i=1}^{S_l}\sum_{j=1}^{S_{l+1}}\left(w_{ji}^{(l)}\right)^2
wherein J(w) is the cost function of the multilayer perceptron (MLP) prediction model, n is the sample size, \lambda is the regularization term coefficient, L is the total number of layers of the neural network, and S_l is the number of units in the lth layer.
3) Rectified linear unit (ReLU) function
The rectified linear unit function is used as the hidden layer activation function to calculate the hidden layer activation values. The ReLU function fits data quickly, alleviates the vanishing-gradient problem, and reduces the probability of overfitting.
f(x) = \max(0, x)
wherein f(x) is the ReLU function and x is the input value of a hidden layer neuron.
4) Identity function
The identity function is mainly used for a regression model and is used as an activation function of an output layer to calculate an activation value of the layer.
f(x)=x;
Where f (x) is an identity function and x is the input value to the neurons in the output layer.
5) Forward propagation and backward propagation algorithms
In the multilayer perceptron (MLP) prediction model, the number of input layer nodes is 12, the number of hidden layer nodes is 50, and the number of output layer nodes is 1. The threshold of the jth neuron of the output layer is denoted by \theta_j, the threshold of the hth node of the hidden layer is denoted by \gamma_h, the connection weight between the ith node of the input layer and the hth neuron of the hidden layer is v_{ih}, and the connection weight between the hth neuron of the hidden layer and the jth neuron of the output layer is w_{hj}.
Forward propagation is the process of information passing from the input layer to the output layer, i.e., the process of obtaining a predicted value from the input features. The forward propagation process of the model is as follows:
input of the hth node of the hidden layer:
\alpha_h = \sum_{i=1}^{12} v_{ih} x_i
input of the jth node of the output layer:
\beta_j = \sum_{h=1}^{50} w_{hj} b_h
wherein b_h = f(\alpha_h - \gamma_h) is the output of the hth node of the hidden layer; the activation functions of the hidden layer and the output layer are the ReLU function and the identity function respectively, so the output of the multilayer perceptron (MLP) prediction model is:
\hat{y}_j = f(\beta_j - \theta_j) = \beta_j - \theta_j
back propagation is short for "error back propagation" and is an algorithm used in conjunction with optimization methods. And minimizing a loss function by utilizing the value of the error adjustment weight calculated by the BP algorithm, thereby improving the prediction precision of the model. The BP algorithm is based on a gradient descent strategy, parameters are adjusted in the direction of the negative gradient of a target, for an error L (k), a learning rate eta is given, and for a model back propagation process, the method comprises the following steps:
the process of transferring the output layer to the hidden layer is as follows:
Figure BDA0003598984290000104
Figure BDA0003598984290000105
Figure BDA0003598984290000106
for the identity function: f' (x) ═ 1;
from the above formula one can obtain:
Figure BDA0003598984290000107
Δwhj=ηgibh
Δθj=-ηgj
similarly, the transmission process from the hidden layer to the input layer is as follows:
for the ReLU function:
Figure BDA0003598984290000108
Figure BDA0003598984290000111
Δvih=ηehxi
Δr=-ηeh
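The forward pass and the gradient updates above can be illustrated with a minimal NumPy sketch of one training step for the 12-50-1 network described here; the random initial weights, the single feature vector and its target value are toy placeholders, not the trained model of the invention:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 12, 50, 1
eta = 0.01                                  # learning rate

v = rng.normal(0, 0.1, (n_in, n_hidden))    # input -> hidden weights v_ih
gamma = np.zeros(n_hidden)                  # hidden thresholds gamma_h
w = rng.normal(0, 0.1, (n_hidden, n_out))   # hidden -> output weights w_hj
theta = np.zeros(n_out)                     # output thresholds theta_j

x = rng.random(n_in)                        # one (placeholder) feature vector
y = np.array([150.0])                       # its (placeholder) CPT value

# Forward propagation
alpha = x @ v                               # hidden-layer inputs alpha_h
b = np.maximum(0.0, alpha - gamma)          # ReLU hidden outputs b_h
beta = b @ w                                # output-layer inputs beta_j
y_hat = beta - theta                        # identity activation at the output

# Back propagation (squared loss, gradient descent)
g = y - y_hat                               # output-layer gradient g_j
e = (alpha - gamma > 0) * (w @ g)           # hidden-layer gradient e_h (ReLU derivative)
w += eta * np.outer(b, g)                   # delta w_hj = eta * g_j * b_h
theta += -eta * g                           # delta theta_j = -eta * g_j
v += eta * np.outer(x, e)                   # delta v_ih = eta * e_h * x_i
gamma += -eta * e                           # delta gamma_h = -eta * e_h

print("prediction before update:", y_hat.item(), "target:", y.item())
```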
6) Hyper-parameter adjustment
The number of hidden layer units is selected according to the mean square error (MSE); the best number of hidden layer units is determined to be 50 according to the MSE results, where the MSE formula is:
MSE = \frac{1}{m}\sum_{i=1}^{m}\left(\hat{y}_i - y_i\right)^2
wherein MSE is the mean square error, m is the sample size, \hat{y}_i is the predicted value of the output target, and y_i is the experimental value of the output target.
In one embodiment, in order to prevent the multilayer perceptron (MLP) prediction model from overfitting, enhance its generalization capability and improve its prediction performance, K-fold cross-validation can be added to the multilayer perceptron (MLP) prediction model. The algorithm divides the data set into K parts of equal size; in each iteration, one part is randomly selected as the test set and the remaining K-1 parts are used as the training set. After several iterations, a loss function is used to select the optimal model and parameters. The mean square error is chosen as the loss function for K-fold cross-validation, and the value of K is tuned over the fold range [1, 10] using GridSearchCV in the scikit-learn library.
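A minimal sketch of this step under stated assumptions: scikit-learn's MLPRegressor is used here as a stand-in for the MLP described above, GridSearchCV performs the grid search under 5-fold splitting with a (negated) mean squared error score, and the parameter grid shown is illustrative rather than the patented configuration:

```python
from sklearn.model_selection import GridSearchCV, KFold
from sklearn.neural_network import MLPRegressor

# Candidate settings; hidden_layer_sizes=(50,) mirrors the 50 hidden units above.
param_grid = {
    "hidden_layer_sizes": [(25,), (50,), (100,)],
    "alpha": [1e-4, 1e-3, 1e-2],           # L2 regularization coefficient (lambda)
}

mlp = MLPRegressor(activation="relu", solver="adam",
                   max_iter=2000, random_state=42)

search = GridSearchCV(
    estimator=mlp,
    param_grid=param_grid,
    scoring="neg_mean_squared_error",       # MSE used as the selection loss
    cv=KFold(n_splits=5, shuffle=True, random_state=42),
)
search.fit(X_train, y_train)                # X_train, y_train from the split sketch above
print(search.best_params_, -search.best_score_)
```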
Example two, a method for constructing a Random Forest (RF) prediction model.
Random Forest (RF) prediction models are a machine learning method proposed by Breiman in 2001 that uses an ensemble learning approach to combine multiple decision trees to determine the best classification or regression, where the individual trees can be constructed according to the classification and regression tree (CART) model. Fig. 3 is a schematic structural diagram of a Random Forest (RF) prediction model according to a second embodiment of the present invention. As shown in FIG. 3, for a given data set P, a bootstrap resampling method is used to randomly form M sample sets P_m (m = 1, 2, …, M) from P. About one third of the data in P is not sampled; these data, called out-of-bag (OOB) data, can be used to assess the generalization capability of the model, while the remaining data are called in-bag data, and this sampling process is also called bootstrap aggregating (Bagging). A corresponding decision tree is then established for each sample set P_m, with a randomly selected subset of the features used as split candidates at the nodes, and the final prediction result is obtained by voting or by calculating the average over all individual trees. The training method of the Random Forest (RF) prediction model comprises the following steps:
1) Bagging (parallel integration)
Bagging is the best-known representative of parallel ensemble learning methods. Training sets are extracted from the original sample set: in each round, n training samples are drawn from the original sample set by the Bootstrap method (within one training set, some samples may be drawn multiple times while others may not be drawn at all), and m rounds of extraction are performed to obtain m mutually independent training sets. Each training set is used to build one decision tree model, so the m training sets yield m decision tree models. Since a regression model is built here, the mean of the m decision tree models is calculated as the final output of the RF model:
f(x) = \frac{1}{M}\sum_{m=1}^{M} f_m(x)
wherein f(x) is the predicted value of the coal spontaneous combustion cross point temperature, M is the number of trees in the Random Forest (RF) prediction model, and f_m(x) is the prediction result of the mth decision tree.
2) Out-of-bag data error
Approximately one third of the data in the sample set is not sampled; these data, referred to as out-of-bag (OOB) data, can be used to assess the generalization capability of the model, while the remaining data are called in-bag data. The smaller the out-of-bag error, the stronger the generalization ability of the model:
MSE_{OOB} = \frac{1}{n}\sum_{i=1}^{n}\left(y_i^{OOB} - \hat{y}_i^{OOB}\right)^2
wherein MSE_{OOB} is the out-of-bag data error, n is the sample size of the data set, y_i^{OOB} is the experimental value of the out-of-bag data, and \hat{y}_i^{OOB} is the predicted value of the out-of-bag data.
3) Hyper-parameter adjustment
An RF model based on regression trees is built by using a Python toolkit; the Bagging algorithm of the model uses parallel integration, and the optimal number of trees is determined using the out-of-bag data.
In one embodiment, in order to improve the accuracy and generalization capability of the RF model, K-fold cross-validation is also added to the RF model; the mean square error is selected as its loss function, and the number of cross-validation folds is tuned over the range [1, 10] using GridSearchCV in the scikit-learn library.
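A minimal sketch of this step, assuming scikit-learn's RandomForestRegressor: the out-of-bag predictions are retained so the OOB error can be compared across candidate numbers of trees, as described above (the candidate forest sizes and variable names are illustrative assumptions):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

best_n, best_oob_mse = None, np.inf
for n_trees in (50, 100, 200, 400):        # illustrative candidate forest sizes
    rf = RandomForestRegressor(
        n_estimators=n_trees,
        oob_score=True,                    # keep out-of-bag predictions
        bootstrap=True,
        random_state=42,
    )
    rf.fit(X_train, y_train)               # X_train, y_train from the split sketch above
    oob_mse = np.mean((y_train - rf.oob_prediction_) ** 2)
    if oob_mse < best_oob_mse:
        best_n, best_oob_mse = n_trees, oob_mse

print("best number of trees:", best_n, "OOB MSE:", best_oob_mse)
```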
S140, obtaining a learning curve of the coal spontaneous combustion cross point temperature regression prediction model based on the training set and the verification set, and adjusting parameters of the coal spontaneous combustion cross point temperature regression prediction model.
In one embodiment, after the coal spontaneous combustion cross point temperature regression prediction model is built by using a machine learning method, the average absolute error, the root mean square error, the average absolute percentage error and the decision coefficient of the coal spontaneous combustion cross point temperature regression prediction model can be calculated according to a training set and a verification set; and evaluating the coal spontaneous combustion cross point temperature regression prediction model according to the average absolute error, the root mean square error, the average absolute percentage error and the decision coefficient of the coal spontaneous combustion cross point temperature regression prediction model.
Specifically, the learning curve plots the accuracy on the training set and the accuracy on the verification set, and the change of these accuracies is used to judge whether the model is in a high-variance (overfitting) state or a high-bias (underfitting) state, so that the parameters of the model can be adjusted accordingly.
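A minimal sketch of drawing such a learning curve with scikit-learn's learning_curve helper; the estimator could be the tuned MLP or the RF model from the earlier sketches (the variable names search, X_trainval and y_trainval are assumptions carried over from those sketches):

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.model_selection import learning_curve

train_sizes, train_scores, val_scores = learning_curve(
    estimator=search.best_estimator_,      # tuned model from the GridSearchCV sketch
    X=X_trainval, y=y_trainval,
    train_sizes=np.linspace(0.1, 1.0, 8),
    cv=5,
    scoring="neg_mean_squared_error",
)

# A persistent gap between the two curves suggests overfitting (high variance);
# two curves that both remain high in error suggest underfitting (high bias).
plt.plot(train_sizes, -train_scores.mean(axis=1), "o-", label="training MSE")
plt.plot(train_sizes, -val_scores.mean(axis=1), "o-", label="verification MSE")
plt.xlabel("number of training samples")
plt.ylabel("mean squared error")
plt.legend()
plt.show()
```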
In one embodiment, the mean absolute error (MAE) has a value range of [0, +∞); the smaller the value, the smaller the error between the predicted value and the experimental value and the better the accuracy of the prediction model.
The root mean square error (RMSE) is the sample standard deviation of the differences between the predicted and experimental values. It represents the degree of dispersion of the samples; for a nonlinear fit, the smaller the value, the better.
The mean absolute percentage error (MAPE) is a percentage value that represents the average deviation of the predicted value from the experimental value. The smaller the value, the better the accuracy of the prediction model.
The coefficient of determination (R²) has a value range of [0, 1]; the closer it is to 1, the stronger the fitting ability of the model.
These four index equations are defined as follows:
MAE = \frac{1}{n}\sum_{i=1}^{n}\left|y_i - \hat{y}_i\right|
RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}
MAPE = \frac{100\%}{n}\sum_{i=1}^{n}\left|\frac{y_i - \hat{y}_i}{y_i}\right|
R^2 = 1 - \frac{\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}{\sum_{i=1}^{n}\left(y_i - \bar{y}\right)^2}
where n is the number of sample data, y_i is the experimental value of the output target, \hat{y}_i is the predicted value of the output target, and \bar{y} is the average of the true values.
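The four indices can be computed from the verification-set predictions with a short sketch like the following; scikit-learn is used where it provides the metric directly, and the variable names (search, X_val, y_val) are assumptions carried over from the earlier sketches:

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

y_pred = search.best_estimator_.predict(X_val)   # verification-set predictions

mae = mean_absolute_error(y_val, y_pred)
rmse = np.sqrt(mean_squared_error(y_val, y_pred))
mape = np.mean(np.abs((y_val - y_pred) / y_val)) * 100.0
r2 = r2_score(y_val, y_pred)

print(f"MAE={mae:.2f} °C, RMSE={rmse:.2f}, MAPE={mape:.2f}%, R2={r2:.2f}")
```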
S150, based on the test set, obtaining the coal spontaneous combustion cross point temperature by using a coal spontaneous combustion cross point temperature regression prediction model, and predicting the risk level of coal spontaneous combustion tendency according to the coal spontaneous combustion cross point temperature.
In one embodiment, the relationship between the coal auto-ignition crossover point temperature and the risk level of coal auto-ignition propensity is shown in Table 1.
TABLE 1
Coal spontaneous combustion cross point temperature (CPT) | Description | Coal spontaneous combustion risk level
< 120 °C; 120 °C - 140 °C | Higher propensity to spontaneous combustion | High
140 °C - 160 °C | Moderate propensity to spontaneous combustion | Medium
> 160 °C | Low propensity to spontaneous combustion | Low
As can be seen from table 1:
if the temperature of the coal spontaneous combustion cross point is less than 140 ℃, determining that the risk level of coal spontaneous combustion tendency is high;
if the coal spontaneous combustion cross point temperature is greater than or equal to 140 ℃ and less than or equal to 160 ℃, determining the coal spontaneous combustion tendency risk level as middle;
and if the coal spontaneous combustion cross point temperature is more than 160 ℃, determining that the coal spontaneous combustion tendency risk level is low.
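A minimal sketch of mapping a predicted cross point temperature to the risk level defined in Table 1; the threshold values come from the table above, while the function name is arbitrary:

```python
def cpt_to_risk_level(cpt_celsius: float) -> str:
    """Map a predicted cross point temperature (CPT, in degrees C) to a risk level."""
    if cpt_celsius < 140.0:
        return "high"      # higher propensity to spontaneous combustion
    if cpt_celsius <= 160.0:
        return "medium"    # moderate propensity to spontaneous combustion
    return "low"           # low propensity to spontaneous combustion

print(cpt_to_risk_level(152.0))   # -> "medium"
```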
Machine learning (ML) has a significant impact and wide application in classification, pattern recognition, regression prediction and other fields, and provides a new solution to the problem of predicting the coal spontaneous combustion tendency. Since ML automatically learns from data and performs tasks such as prediction and decision-making, and has unmatched advantages in trend identification and prediction assistance, the scheme provided by the invention helps to manage coal spontaneous combustion risks from different angles and with respect to different influencing factors, and helps to establish a coal spontaneous combustion risk early-warning system. The method can effectively carry out corresponding spontaneous combustion risk treatment for different coal risk levels, reduce the spontaneous combustion risk, reduce losses in the coal storage and processing processes, and effectively reduce the risk of accidents such as fire, dust and gas explosions caused by coal spontaneous combustion.
Fig. 4 is a schematic diagram of the crossover point temperature prediction results obtained by using 204 sets of samples according to an embodiment of the present invention. As shown in fig. 4, the present study proposes a method for predicting the coal spontaneous combustion tendency using a multilayer perceptron (MLP) prediction model and a Random Forest (RF) prediction model based on 204 sets of CPT experimental data. The studies show that both the multilayer perceptron (MLP) prediction model (MAE = 8.99 °C, RMSE = 12.37, MAPE = 14.69%, R² = 0.58) and the Random Forest (RF) prediction model (MAE = 8.52 °C, RMSE = 11.23, MAPE = 13.23%, R² = 0.74) can predict CPT from internal and external influencing factors with an error of less than 10%. The Random Forest (RF) prediction model has better generalization capability, and one of the potential reasons may be that the structure of the input features is a tree-like structure similar to that of RF. The machine learning method can predict the spontaneous combustion tendency of coal, and solves the problem of predicting the coal spontaneous combustion tendency under different environmental conditions and for different coal types.
The prediction capabilities of a multi-layer perceptron (MLP) prediction model and a Random Forest (RF) prediction model for different coal samples are slightly different, mainly due to the range of CPT values, the size of data sets, and the like. Furthermore, the predictive power of the machine learning model depends on the sample size and the number of input features.
The embodiment of the invention provides a coal spontaneous combustion tendency prediction method based on machine learning, which comprises the following steps: acquiring a sample set, wherein the sample set comprises a plurality of sample data, and each sample data at least comprises characteristic information of coal and meteorological condition information; dividing the sample set into a training set, a verification set and a test set; based on the training set, constructing a coal spontaneous combustion cross point temperature regression prediction model by using a machine learning method, wherein the coal spontaneous combustion cross point temperature regression prediction model is a multilayer perceptron (MLP) prediction model or a Random Forest (RF) prediction model; and, based on the test set, obtaining the coal spontaneous combustion cross point temperature by using the coal spontaneous combustion cross point temperature regression prediction model, and predicting the risk level of coal spontaneous combustion tendency according to the coal spontaneous combustion cross point temperature. Specifically, a CPT prediction model with a causal relation is first constructed by utilizing a multilayer perceptron (MLP) and a Random Forest (RF) based on 204 sets of CPT experimental data and 13 characteristic variables (internal and external influencing factors of coal), so as to predict the spontaneous combustion tendency of different coal types; K-fold cross-validation and learning curves are then used to prevent and evaluate over- or under-fitting, and feature engineering is used to analyze the correlation between the characteristic variables to improve the accuracy of the model. Therefore, the method is beneficial to quickly predicting the spontaneous combustion tendency of coal of different coal types under different environmental conditions, and has wide applicability and accurate prediction results.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention.

Claims (7)

1. A coal spontaneous combustion tendency prediction method based on machine learning is characterized by comprising the following steps:
acquiring a sample set, wherein the sample set comprises a plurality of sample data, and each sample data at least comprises characteristic information of coal and meteorological condition information;
dividing the sample set into a training set, a verification set and a test set;
based on the training set, constructing a coal spontaneous combustion cross point temperature regression prediction model by using a machine learning method, wherein the coal spontaneous combustion cross point temperature regression prediction model is a multilayer perceptron (MLP) prediction model or a Random Forest (RF) prediction model;
obtaining a learning curve of the coal spontaneous combustion cross point temperature regression prediction model based on the training set and the verification set, and adjusting parameters of the coal spontaneous combustion cross point temperature regression prediction model;
and based on the test set, obtaining the coal spontaneous combustion cross point temperature by using the coal spontaneous combustion cross point temperature regression prediction model, and predicting the risk level of coal spontaneous combustion tendency according to the coal spontaneous combustion cross point temperature.
2. The method according to claim 1, wherein the characteristic information of the coal includes moisture, ash, volatile matter, fixed carbon, carbon element, oxygen element, nitrogen element, hydrogen element, sulfur element, and particle size; the meteorological condition information includes oxygen concentration and ambient temperature.
3. The method of claim 1, wherein if the coal auto-ignition cross point temperature regression prediction model is a multilayer perceptron (MLP) prediction model, the constructing the coal auto-ignition cross point temperature regression prediction model using a machine learning method based on the training set comprises:
the method comprises the following steps: construction of a Square loss function
Figure FDA0003598984280000011
Wherein L (k) is a loss function of a multi-layer perceptron (MLP) prediction model,
Figure FDA0003598984280000012
for the ith output of the neural network,
Figure FDA0003598984280000013
is the ith real value of the neural network;
step two: constructing a cost function
Figure FDA0003598984280000014
Wherein J (w) is a cost function of a multilayer perceptron (MLP) prediction model, n is sample capacity, lambda is a regularization term coefficient, L is the total number of layers of the neural network, SLThe number of the L-th layer units is;
step three: construction of modified Linear Unit (ReLU) function
Figure FDA0003598984280000021
Wherein f (x) is a ReLU function, and x is an input value of a hidden layer neuron;
step four: constructing an identity function f (x) x;
wherein, f (x) is an identity function, and x is an input value of an output layer neuron;
step five: forward propagation and backward propagation algorithms
Assuming that the number of input layer nodes in a multilayer perceptron (MLP) prediction model is 12, the number of hidden layer nodes is 50, and the number of output layer nodes is 1, wherein the threshold value of the jth neuron of the output layer is thetajThe threshold of h node of the hidden layer is represented by gammahShowing that the connection weight between the ith node of the input layer and the h neuron of the hidden layer is vihThe weight of the connection between the h-th neuron of the hidden layer and the j-th neuron of the output layer is whj
The forward propagation process for the multi-layer perceptron (MLP) prediction model is as follows:
input of h node of hidden layer:
Figure FDA0003598984280000022
input of jth node of output layer:
Figure FDA0003598984280000023
wherein, bhFor the output of the h-th node of the hidden layer, the activation functions of the hidden layer and the output layer are respectively a ReLu function and an identity function, so that the output of the multilayer perceptron (MLP) prediction model is as follows:
Figure FDA0003598984280000024
the BP algorithm is based on a gradient descent strategy, parameters are adjusted in the direction of the negative gradient of a target, for an error L (k), a learning rate eta is given, and for a process of back propagation of a multilayer perceptron (MLP) prediction model, the method comprises the following steps:
the process of transferring the output layer to the hidden layer is as follows:
Figure FDA0003598984280000025
for the identity function: f' (x) ═ 1;
according to the above formulaIt is possible to obtain:
Figure FDA0003598984280000026
similarly, the transmission process from the hidden layer to the input layer is as follows: for the ReLU function:
Figure FDA0003598984280000031
Figure FDA0003598984280000032
step six: hyper-parametric regulation
The selection of the number of the hidden layer units is judged by Mean Square Error (MSE), the number of the best hidden layer units is determined to be 50 units according to the MSE result, and the MSE formula is as follows:
Figure FDA0003598984280000033
where MSE is the mean square error, m is the sample capacity,
Figure FDA0003598984280000034
in order to output the predicted value of the target,
Figure FDA0003598984280000035
to output the target experimental values.
4. The method of claim 1, wherein if the coal auto-ignition cross point temperature regression prediction model is a Random Forest (RF) prediction model, the building the coal auto-ignition cross point temperature regression prediction model using a machine learning method based on the training set comprises:
the method comprises the following steps: bagging (parallel integration)
Extracting training sets from the original sample set, extracting n training samples from the original sample set in each round by using a Bootstrap method, and performing m rounds of extraction to obtain m training sets which are mutually independent; corresponding to each training setObtaining m decision tree models from m training sets in one decision tree model; calculating the mean of the m decision tree models as the final output result of the Random Forest (RF) prediction model:
Figure FDA0003598984280000036
wherein f (x) is a predicted value of the coal spontaneous combustion cross point temperature, M is the number of trees of a Random Forest (RF) prediction model, fm(x) The predicted result of the mth decision tree;
step two: out-of-bag data error
Approximately one third of the data in the sample set is not sampled, referred to as out-of-bag data (OOB) may be used to detect the generalization capability of the model; the remaining data is called in-bag data, and the smaller the value of the out-of-bag error, the stronger the generalization ability of the model:
Figure FDA0003598984280000041
therein, MSEOOBIt is the out-of-bag data error n that is the sample size of the data set,
Figure FDA0003598984280000042
is an experimental value of the data outside the bag,
Figure FDA0003598984280000043
is the predicted value of the data outside the bag;
step three: hyper-parametric regulation
A Random Forest (RF) model based on a regression tree is constructed by utilizing a Python tool kit, a Bagging algorithm of the model selects parallel integration, and the optimal tree is determined by utilizing data outside a bag.
5. The method according to claim 3 or 4, wherein the obtaining of the learning curve of the coal auto-ignition cross point temperature regression prediction model based on the training set and the validation set, and the adjusting of the parameters of the coal auto-ignition cross point temperature regression prediction model comprise:
calculating the average absolute error, the root mean square error, the average absolute percentage error and the decision coefficient of the coal spontaneous combustion cross point temperature regression prediction model according to the training set and the verification set;
and evaluating the coal spontaneous combustion cross point temperature regression prediction model according to the average absolute error, the root-mean-square error, the average absolute percentage error and the decision coefficient of the coal spontaneous combustion cross point temperature regression prediction model.
6. The method of claim 5,
mean absolute error
MAE = \frac{1}{n}\sum_{i=1}^{n}\left|y_i - \hat{y}_i\right|
Root mean square error
RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}
Mean absolute percentage error
MAPE = \frac{100\%}{n}\sum_{i=1}^{n}\left|\frac{y_i - \hat{y}_i}{y_i}\right|
Coefficient of determination
R^2 = 1 - \frac{\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}{\sum_{i=1}^{n}\left(y_i - \bar{y}\right)^2}
where n is the number of sample data, y_i is the experimental value of the output target, \hat{y}_i is the predicted value of the output target, and \bar{y} is the average of the true values.
7. The method of claim 1, wherein predicting a coal auto-ignition propensity risk level based on the coal auto-ignition cross-point temperature comprises:
if the coal spontaneous combustion cross point temperature is less than 140 ℃, predicting that the risk level of coal spontaneous combustion tendency is high;
if the coal spontaneous combustion cross point temperature is greater than or equal to 140 ℃ and less than or equal to 160 ℃, predicting the coal spontaneous combustion tendency risk level to be middle;
and if the coal spontaneous combustion cross point temperature is more than 160 ℃, predicting the risk level of coal spontaneous combustion tendency to be low.
CN202210399315.2A 2022-04-15 2022-04-15 Coal spontaneous combustion tendency prediction method based on machine learning Pending CN114565193A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210399315.2A CN114565193A (en) 2022-04-15 2022-04-15 Coal spontaneous combustion tendency prediction method based on machine learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210399315.2A CN114565193A (en) 2022-04-15 2022-04-15 Coal spontaneous combustion tendency prediction method based on machine learning

Publications (1)

Publication Number Publication Date
CN114565193A true CN114565193A (en) 2022-05-31

Family

ID=81720759

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210399315.2A Pending CN114565193A (en) 2022-04-15 2022-04-15 Coal spontaneous combustion tendency prediction method based on machine learning

Country Status (1)

Country Link
CN (1) CN114565193A (en)


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115062834A (en) * 2022-06-09 2022-09-16 佛山众陶联供应链服务有限公司 Method for screening Lasso regression prediction firing curve based on orthogonal test
CN115081698A (en) * 2022-06-09 2022-09-20 佛山众陶联供应链服务有限公司 Method, apparatus and computer storage medium for predicting firing curve based on degree of deformation
CN115062834B (en) * 2022-06-09 2022-12-27 佛山众陶联供应链服务有限公司 Method for screening Lasso regression prediction firing curve based on orthogonal test
CN115081698B (en) * 2022-06-09 2023-04-07 佛山众陶联供应链服务有限公司 Method, apparatus and computer storage medium for predicting firing curve based on degree of deformation
CN117787728A (en) * 2024-02-27 2024-03-29 贵州省煤层气页岩气工程技术研究中心 Coal mine roadway gas explosion risk level evaluation method based on visualization
CN117787728B (en) * 2024-02-27 2024-04-30 贵州省煤层气页岩气工程技术研究中心 Coal mine roadway gas explosion risk level evaluation method based on visualization
CN117969600A (en) * 2024-03-28 2024-05-03 山东科技大学 Titanium powder explosion hazard detection method
CN117969600B (en) * 2024-03-28 2024-06-07 山东科技大学 Titanium powder explosion hazard detection method

Similar Documents

Publication Publication Date Title
CN114565193A (en) Coal spontaneous combustion tendency prediction method based on machine learning
Qu et al. A neural-network-based method for RUL prediction and SOH monitoring of lithium-ion battery
Zhu et al. A combined machine learning algorithms and DEA method for measuring and predicting the efficiency of Chinese manufacturing listed companies
Jia et al. A novel optimized GA–Elman neural network algorithm
Gallo Artificial neural networks tutorial
CN102360455B (en) Solar array expansion reliability assessment method based on expert knowledge and neural network
CN113011796B (en) Edible oil safety early warning method based on&#39; analytic hierarchy process-neural network
de Araújo Padilha et al. A multi-level approach using genetic algorithms in an ensemble of least squares support vector machines
Rolls Advantages of dilution in the connectivity of attractor networks in the brain
Gupta et al. Notice of violation of ieee publication principles; artificial neural networks: Its techniques and applications to forecasting
CN116644970A (en) Photovoltaic power prediction method based on VMD decomposition and lamination deep learning
Verma et al. Real-time classification of national and international students for ICT and mobile technology: an experimental study on Indian and Hungarian university
Yan et al. Trustworthiness evaluation and retrieval-based revision method for case-based reasoning classifiers
Altun et al. Identification of dynamic loads on structural component with artificial neural networks
CN112257942A (en) Stress corrosion cracking prediction method and system
Dodo et al. Prediction of energy content of biomass based on hybrid machine learning ensemble algorithm
CN108537581B (en) Energy consumption time series prediction method and device based on GMDH selective combination
Weihong et al. Optimization of BP neural network classifier using genetic algorithm
Telipenko et al. Results of research on development of an intellectual information system of bankruptcy risk assessment of the enterprise
Putra et al. Implementation of neural network to determine the new college students
Aarti et al. Grey relational classification algorithm for software fault proneness with SOM clustering
Rahman et al. Implementation of artificial neural network on regression analysis
Safi et al. A neural network approach for predicting forest fires
Omar et al. Improved classification performance for multiple multilayer perceptron (MMLP) network using voting technique
Chai et al. Production Forecast of Coalbed Methane Based on GA Optimized BP Neural Network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination