CN108446799B - Residual pressure turbine device power generation power prediction method based on Elman neural network - Google Patents
- Publication number
- CN108446799B (application number CN201810198547.5A)
- Authority
- CN
- China
- Prior art keywords
- model
- neural network
- layer
- input
- error
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/04—Manufacturing
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y04—INFORMATION OR COMMUNICATION TECHNOLOGIES HAVING AN IMPACT ON OTHER TECHNOLOGY AREAS
- Y04S—SYSTEMS INTEGRATING TECHNOLOGIES RELATED TO POWER NETWORK OPERATION, COMMUNICATION OR INFORMATION TECHNOLOGIES FOR IMPROVING THE ELECTRICAL POWER GENERATION, TRANSMISSION, DISTRIBUTION, MANAGEMENT OR USAGE, i.e. SMART GRIDS
- Y04S10/00—Systems supporting electrical power generation, transmission or distribution
- Y04S10/50—Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications
Abstract
The invention discloses a method for predicting the generated power of a residual pressure turbine (TRT) device based on an Elman neural network. The method comprises the following steps. Step one: interpolate the unevenly sampled data to form a sequence with equal time intervals; analyze the overall trend and correlation coefficient of each candidate variable; select the six variables most correlated with the generated energy as input variables; and preprocess the input variables (interpolation and normalization) on python and Matlab platforms so that the sampling frequency is uniform. Step two: select an Elman neural network as the basic prediction model, trained with the back-propagation-through-time (BPTT) learning algorithm and combined with a sliding-window model, for predicting the generated power of the residual pressure turbine device. Step three: after initializing the model, train it with the normalized training samples and use the trained model to predict the generated power of the residual pressure turbine device. The invention gives TRT generated-power prediction higher precision and a smaller mean square error, and can be used for online real-time prediction.
Description
Technical Field
The invention belongs to the field of industrial process monitoring, modeling and simulation, and particularly relates to a method for predicting the power generation capacity of a residual pressure turbine device by an Elman-BPTT neural network.
Background Art
The steel industry currently makes common use of blast-furnace residual-pressure recovery devices to recover part of the process energy. Capacity prediction for the blast-furnace Top Gas Pressure Recovery Turbine (TRT) is an important basis for optimizing and controlling blast-furnace power generation, and accurate capacity (model) prediction is essential to improving Real-Time Optimization (RTO). Because RTO uses a static model, when a disturbance occurs the controlled system can only be re-optimized once it reaches steady state again, so the optimization is delayed; combining RTO with capacity prediction resolves the mismatch between the long RTO nonlinear steady-state optimization period and the short loop-control period. The direct online optimization method uses nonlinear model predictive control to optimize economic performance indexes online over a finite horizon, and this method requires an accurate nonlinear process model.
The current research on similar industrial processes is mainly divided into two main ways.
The first is traditional mechanism-based modeling: on the basis of analyzing the working mechanism of the TRT system, a dynamic mathematical model of each unit is established, and a generating-capacity model of the TRT system is derived by combining the unit models with an empirical formula for the blast-furnace gas generation process and the material balance relations. However, because the blast-furnace ironmaking process is complex, it is difficult to establish an accurate mechanism model, and the predicted value may differ greatly from the true value.
The second is data-driven modeling, which mainly uses statistical methods and machine learning methods. Statistical methods include partial least squares, principal component analysis, autoregressive analysis, and regression analysis; machine learning methods include neural network models, support vector machines, and the like. When a large amount of data is available for analysis or for training a neural network, the prediction can be expected to achieve a good effect.
However, data-driven modeling has its own drawbacks, and conventional data-modeling approaches struggle with certain characteristics of big data in the process industry. First, most existing data-modeling methods focus on regularly sampled data and cannot model irregularly sampled data directly; the irregular data must be preprocessed, and this step loses some information. Second, current data-modeling methods mainly extract static spatial statistics, do not consider time dynamics, and cannot extract operating-condition features from historical dynamic big data, whereas an industrial model needs dynamic analysis. Third, existing data-modeling methods require either uncontaminated data or modeling on preprocessed data; otherwise, outliers strongly affect the model parameters and can even lead to model mismatch.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a method for predicting the power generation power of a residual pressure turbine device based on an Elman model.
A method for predicting the power generation power of a residual pressure turbine device based on an Elman neural network,
the method comprises the following steps: carrying out interpolation processing on unevenly sampled data to form a sequence with equal time intervals, analyzing the overall trend and correlation coefficient of each input variable, selecting six variables with high correlation degree with generated energy as input variables, wherein the input variables are gas flow, gas inlet pressure, gas inlet temperature, carbon monoxide content, carbon dioxide content and hydrogen content in gas, and carrying out interpolation and normalization preprocessing on the input variables by utilizing a python and Matlab platform to ensure that the sampling frequency is uniform;
step two: selecting an Elman neural network as a basic model for prediction, adopting an error back propagation learning algorithm (BPTT) and combining a sliding window model for predicting the generated power of the residual pressure turbine device;
step three: and after initializing the model, training the model by using the normalized training sample, and using the trained model for predicting the power generation power of the residual pressure turbine device.
The correlation coefficients in step one are as follows:
gas flow, unit m³/h, correlation coefficient 0.9294;
gas inlet pressure, unit kPa, correlation coefficient -0.0397;
gas inlet temperature, unit ℃, correlation coefficient 0.0352;
CO volume content, unit %, correlation coefficient 0.1278;
CO2 volume content, unit %, correlation coefficient 0.1935;
H2 volume content, unit %, correlation coefficient 0.1645;
the correlation coefficient lies between -1 and 1, and the larger its absolute value, the stronger the correlation;
the normalization method in the first step is as follows:
the normalized result is between 0 and 1.
The structure of the Elman neural network in step two is as follows:
the Elman neural network is composed of four layers, namely an input layer, a hidden layer, a context (carry) layer and an output layer:
1) from input layer to hidden layer:
x(k) = f(w1·xc(k) + w2·u(k-1));
2) from hidden layer to output layer:
y(k) = g(w3·x(k));
3) the context layer stores short-term information:
xc(k) = x(k-1);
4) learning index function: using the sum of squared error function, training ends when E <0.0001
5) The weight value updating method comprises the following steps:
w1 is the connection weight from the context (carry) layer to the hidden layer, w2 from the input layer to the hidden layer, and w3 from the hidden layer to the output layer. All three weights are adjusted repeatedly; error is the difference between prediction and reality during training, and η (eta) denotes the degree of adjustment at each feedback step, set to 0.001. The weights are updated through continuous feedback until the learning-index function value meets the requirement, completing the training of the neural network:
w1(n) = w1(n-1) - η·2·error·w3·xc,
w2(n) = w2(n-1) - η·2·error·w3·u,
w3(n) = w3(n-1) - η·2·error·w3·xc.
The sliding window model principle of the second step is as follows:
the sliding window model is established on the assumption that the current output depends on the current input, the mapping rule between the input and the output is obtained through historical data, a certain amount of training set samples are preset according to the assumption, then the sample data is continuously updated, the earliest data point is discarded, and the ELMAN neural network continuously updates the structural parameters and gives the latest predicted value along with the sliding of the window.
The model in the third step is suitable for the characteristics of time variation, dynamics, nonlinearity and strong inertia of the blast furnace gas residual pressure recovery turbine power generation.
The invention gives TRT generated-power prediction higher precision and a smaller mean square error, and can be used for online real-time prediction.
Drawings
The invention is further illustrated with reference to the following figures and examples.
FIG. 1 is a schematic diagram of the structure of an Elman neural network;
fig. 2 is a schematic view of a sliding window.
Detailed Description
In the blast-furnace ironmaking process the chemical reactions are relatively slow, so the current furnace condition is strongly correlated with historical furnace conditions; this correlation appears as a dynamic time series, and the neural network must therefore memorize historical information dynamically. For this reason a Recurrent Neural Network (RNN) is used to predict TRT generated power. Within the recurrent-network family, and given the characteristics of TRT operation, we select the Elman neural network, which is sensitive to historical data. It can be regarded as a recurrent network with local memory units and local feedback connections: in the hidden layer, the output is fed back to the input through the delay and storage of the context (carry) layer, which makes the Elman network sensitive to historical states, and the internal feedback also enhances its ability to handle dynamic information.
Currently, in industrial data mining, the Elman neural network has great potential for analyzing time series. The invention applies this neural network model to the prediction of TRT generated power with good results, and it has high application value for industrial fields that need to analyze dynamic time series.
The invention provides a method for predicting the power generation capacity of a residual pressure turbine device based on an Elman model, which comprises the following steps:
the method comprises the following steps: the method is characterized in that 6 variables with high correlation degree with the generated energy are determined as input variables (coal gas flow, coal gas inlet pressure, coal gas inlet temperature, carbon monoxide content in coal gas, carbon dioxide content and hydrogen content) by combining analysis of related data and considering the whole trend and correlation coefficient of each dependent variable, and preprocessing such as interpolation and normalization are carried out on the input variables by utilizing a python and Matlab platform, so that the sampling frequency is uniform.
Step two: the Elman neural network is selected as a basic model for prediction (shown in figure 1), and an error back propagation learning algorithm (BPTT) is adopted and combined with a sliding window model (shown in figure 2) to be used for predicting the generated power of the TRT device.
Step three: and after initializing the model, training the model by using the normalized training sample, and using the trained model for predicting the generated power of the TRT device.
1. In step one, the unevenly sampled data are interpolated to form a sequence with equal time intervals, and 6 variables related to the generated power of the TRT device are selected through analysis of the relevant data and of the overall trend and correlation coefficient of each candidate variable. The correlation coefficients are shown in the following table:
name of variable | Unit of | Correlation coefficient | |
X1 | Gas flow | m3/h | 0.9294 |
X2 | Gas inlet pressure | kPa | -0.0397 |
X3 | Gas inlet temperature | ℃ | 0.0352 |
X4 | CO content | % | 0.1278 |
X5 | CO2Content (wt.) | % | 0.1935 |
X6 | H2Content (wt.) | % | 0.1645 |
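As a hedged illustration of how such coefficients can be obtained, the sketch below computes the Pearson correlation between one candidate input and the generated power. The toy arrays are invented for the example and are not the plant measurements behind the table.

```python
import numpy as np

def pearson(x, y):
    """Pearson correlation coefficient between two equally long series."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc * yc).sum() / np.sqrt((xc * xc).sum() * (yc * yc).sum()))

# toy data: gas flow (hypothetical units) vs. generated power
gas_flow = [100.0, 120.0, 140.0, 160.0, 180.0]
power    = [9.8, 12.1, 13.9, 16.2, 18.0]
r = pearson(gas_flow, power)   # close to 1 for this near-linear toy data
```

A variable such as gas flow, with |r| close to 1, would be kept as an input; a variable with |r| near 0 carries little linear information about the power.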
The normalization in step one scales each input variable so that the normalized result lies between 0 and 1.
2. The structure of the Elman neural network in step two is as follows:
the Elman neural network is composed of four layers, namely an input layer, a hidden layer, a context (carry) layer and an output layer.
(1) From input layer to hidden layer:
x(k) = f(w1·xc(k) + w2·u(k-1))
(2) From hidden layer to output layer:
y(k) = g(w3·x(k))
(3) The context layer stores short-term information:
xc(k) = x(k-1)
(4) learning index function: using the sum of squared error function, training ends when E <0.0001
(5) The weight value updating method comprises the following steps:
w1 is the connection weight from the context (carry) layer to the hidden layer, w2 from the input layer to the hidden layer, and w3 from the hidden layer to the output layer. In the method all three weights are adjusted repeatedly; error is the difference between prediction and reality during training, and η (eta) denotes the degree of adjustment at each feedback step, set to 0.001. The weights are updated through continuous feedback until the learning-index function value meets the requirement, completing the training of the neural network:
w1(n) = w1(n-1) - η·2·error·w3·xc
w2(n) = w2(n-1) - η·2·error·w3·u
w3(n) = w3(n-1) - η·2·error·w3·xc.
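A hedged sketch of one training step of such a network is given below. It follows the structure described above (context layer xc(k) = x(k-1), sigmoid hidden layer, linear output), but the layer sizes, the initialization and the exact gradient expressions are illustrative assumptions; in particular the updates use standard gradient directions rather than the literal equations above. Only the learning rate η = 0.001 follows the text.

```python
import numpy as np

class ElmanSketch:
    """Minimal Elman network with a context (carry) layer xc(k) = x(k-1),
    sigmoid hidden units and a linear output. Sizes, initialization and
    gradient details are assumptions for illustration."""

    def __init__(self, n_in, n_hid, eta=0.001, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0.0, 0.1, (n_hid, n_hid))  # context -> hidden
        self.w2 = rng.normal(0.0, 0.1, (n_hid, n_in))   # input  -> hidden
        self.w3 = rng.normal(0.0, 0.1, n_hid)           # hidden -> output
        self.xc = np.zeros(n_hid)                       # context state
        self.eta = eta                                  # 0.001, as in the text

    def step(self, u_prev, target):
        """One pass x(k) = f(w1·xc(k) + w2·u(k-1)), y(k) = w3·x(k),
        then a squared-error gradient update of all three weights."""
        x = 1.0 / (1.0 + np.exp(-(self.w1 @ self.xc + self.w2 @ u_prev)))
        y = float(self.w3 @ x)
        error = y - target                               # prediction - reality
        delta_h = 2.0 * error * self.w3 * x * (1.0 - x)  # backprop into hidden
        self.w3 = self.w3 - self.eta * 2.0 * error * x
        self.w2 = self.w2 - self.eta * np.outer(delta_h, u_prev)
        self.w1 = self.w1 - self.eta * np.outer(delta_h, self.xc)
        self.xc = x                                      # becomes x(k-1) next step
        return y, error

net = ElmanSketch(n_in=6, n_hid=8)                       # six input variables
u = np.full(6, 0.5)                                      # constant toy input
errs = [abs(net.step(u, 1.0)[1]) for _ in range(200)]    # |error| shrinks over steps
```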
3. The sliding window model principle of the second step is as follows:
the sliding window model is based on the assumption that the current output depends on the current input, and the mapping rule between the input and the output can be obtained through historical data. Based on this assumption, we preset a certain amount of training set samples, then continuously update the sample data and discard the oldest data point. As the window slides, the ELMAN neural network will continuously update its structural parameters and give up-to-date predicted values.
The model is suitable for the characteristics of time variation, dynamics, nonlinearity and strong inertia of the blast furnace gas residual pressure recovery turbine power generation.
Examples
The steel industry currently commonly uses blast furnace residual pressure recovery devices to recover part of the energy. The capacity prediction based on Top Gas Pressure Recovery Turbine (TRT) of the blast furnace is an important basis for optimizing and controlling the power generation of the blast furnace. Accurate capacity prediction (or model prediction) is essential to improve Real-time optimization (RTO).
We verified the accuracy of the proposed model using data from Liuzhou Steel (Liugang) for April to June 2016. Below we describe the implementation steps in detail with reference to the specific procedure:
the method comprises the following steps: the method is characterized in that 6 variables with high correlation degree with the generated energy are determined as input variables (coal gas flow, coal gas inlet pressure, coal gas inlet temperature, carbon monoxide content in coal gas, carbon dioxide content and hydrogen content) by combining analysis of related data and considering the whole trend and correlation coefficient of each dependent variable, and preprocessing such as interpolation and normalization are carried out on the input variables by utilizing a python and Matlab platform, so that the sampling frequency is uniform.
Step two: the method comprises the steps of selecting an Elman neural network as a basic model for prediction, adopting an error back propagation learning algorithm (BPTT) and combining a sliding window model for predicting the generating power of the TRT device.
Step three: and after initializing the model, training the model by using the normalized training sample, and using the trained model for predicting the generated power of the TRT device.
1. In step one, the unevenly sampled data are interpolated to form a sequence with equal time intervals, and 6 variables related to the generated power of the TRT device are selected through analysis of the relevant data and of the overall trend and correlation coefficient of each candidate variable. The normalization in step one scales each input variable so that the normalized result lies between 0 and 1.
2. The structure of the Elman neural network in step two is as follows:
the Elman neural network is composed of four layers, namely an input layer, a hidden layer, a context (carry) layer and an output layer.
(1) From input layer to hidden layer:
x(k) = f(w1·xc(k) + w2·u(k-1))
(2) From hidden layer to output layer:
y(k) = g(w3·x(k))
(3) The context layer stores short-term information:
xc(k) = x(k-1)
(4) learning index function: using the sum of squared error function, training ends when E <0.0001
(5) The weight value updating method comprises the following steps:
w1 is the connection weight from the context (carry) layer to the hidden layer, w2 from the input layer to the hidden layer, and w3 from the hidden layer to the output layer.
In the method all three weights are adjusted repeatedly; error is the difference between prediction and reality during training, and η (eta) denotes the degree of adjustment at each feedback step, set to 0.001. The weights are updated through continuous feedback until the learning-index function value meets the requirement, completing the training of the neural network:
w1(n) = w1(n-1) - η·2·error·w3·xc
w2(n) = w2(n-1) - η·2·error·w3·u
w3(n) = w3(n-1) - η·2·error·w3·xc.
3. The sliding window model principle of the second step is as follows:
the sliding window model is based on the assumption that the current output depends on the current input, and the mapping rule between the input and the output can be obtained through historical data. Based on this assumption, we preset a certain amount of training set samples, then continuously update the sample data and discard the oldest data point. As the window slides, the ELMAN neural network will continuously update its structural parameters and give up-to-date predicted values.
Data at 500 time points were predicted; under this prediction, the result of the Elman algorithm is stable with essentially no fluctuation. We verify the accuracy of the model prediction with two indexes: the prediction hit ratio and the mean square error (MSE).
In the actual production process a prediction error below 0.1 meets the requirement. The hit rate of the proposed model reaches 93.2% and the mean square error is 0.0018, a precision that fully meets the needs of actual production.
The above-described embodiments are intended to illustrate rather than to limit the invention, and any modifications and variations of the present invention are within the spirit and scope of the claims.
Claims (3)
1. A method for predicting the power generation power of a residual pressure turbine device based on an Elman neural network is characterized in that,
the method comprises the following steps: carrying out interpolation processing on unevenly sampled data to form a sequence with equal time intervals, analyzing the overall trend and correlation coefficient of each input variable, selecting six variables with high correlation degree with generated energy as input variables, wherein the input variables are gas flow, gas inlet pressure, gas inlet temperature, carbon monoxide content, carbon dioxide content and hydrogen content in gas, and carrying out interpolation and normalization preprocessing on the input variables by utilizing a python and Matlab platform to ensure that the sampling frequency is uniform;
step two: selecting an Elman neural network as a basic model for prediction, adopting an error back propagation learning algorithm (BPTT) and combining a sliding window model for predicting the generated power of the residual pressure turbine device;
step three: after initializing the model, training the model by using a normalized training sample, and using the trained model for predicting the power generation power of the residual pressure turbine device;
the correlation coefficients in step one are as follows:
gas flow, unit m³/h, correlation coefficient 0.9294;
gas inlet pressure, unit kPa, correlation coefficient -0.0397;
gas inlet temperature, unit ℃, correlation coefficient 0.0352;
CO volume content, unit %, correlation coefficient 0.1278;
CO2 volume content, unit %, correlation coefficient 0.1935;
H2 volume content, unit %, correlation coefficient 0.1645;
the correlation coefficient lies between -1 and 1, and the larger its absolute value, the stronger the correlation;
the normalization method in the first step is as follows:
the normalized result is between 0 and 1;
the structure of the Elman neural network in the step two is as follows:
the ELman neural network is composed of four layers, namely an input layer, a hidden layer, a carrying layer and an output layer:
1) from input layer to hidden layer:
x(k)=f(w1xc(k)+w2(u(k-1)));
2) from hidden layer to output layer:
y(k)=g(w3x(k));
3) the receiving layer stores short-term information:
xc(k)=x(k-1);
4) learning index function: using the sum of squared error function, the training is ended when E <0.0001
5) The weight value updating method comprises the following steps:
w1 is the connection weight from the context (carry) layer to the hidden layer, w2 from the input layer to the hidden layer, and w3 from the hidden layer to the output layer; the three weights are adjusted repeatedly, error is the difference between prediction and reality during training, and η (eta) denotes the degree of adjustment at each feedback step, set to 0.001; the weights are updated through continuous feedback until the learning-index function value meets the requirement, completing the training of the neural network:
w1(n) = w1(n-1) - η·2·error·w3·xc,
w2(n) = w2(n-1) - η·2·error·w3·u,
w3(n) = w3(n-1) - η·2·error·w3·xc.
2. The method of claim 1, wherein the sliding window model of step two is based on the following principle:
the sliding window model is built on an assumption that the current output depends on the current input, the mapping rule between the input and the output is obtained through historical data, according to the assumption, a certain amount of training set samples are preset, then the sample data are continuously updated, the earliest data points are discarded, and the Elman neural network continuously updates the structural parameters and gives the latest predicted value along with the sliding of the window.
3. The method of claim 1, wherein the model of step three is adapted to time-varying, dynamic, nonlinear, strong inertial characteristics of blast furnace gas top pressure recovery turbine power generation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810198547.5A CN108446799B (en) | 2018-03-12 | 2018-03-12 | Residual pressure turbine device power generation power prediction method based on Elman neural network |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108446799A CN108446799A (en) | 2018-08-24 |
CN108446799B true CN108446799B (en) | 2021-08-03 |
Family
ID=63194059
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810198547.5A Active CN108446799B (en) | 2018-03-12 | 2018-03-12 | Residual pressure turbine device power generation power prediction method based on Elman neural network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108446799B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116542882B (en) * | 2023-07-06 | 2023-09-19 | 浙江大学 | Photovoltaic power generation smoothing method, system and storage medium |
CN118070003B (en) * | 2024-04-19 | 2024-07-23 | 中国西安卫星测控中心 | Spacecraft telemetry data interpolation method based on neural network |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102994672A (en) * | 2012-11-30 | 2013-03-27 | 武汉钢铁(集团)公司 | Automatic control method for top pressure of TRT (blast furnace top gas recovery turbine unit) system |
CN104268638A (en) * | 2014-09-11 | 2015-01-07 | 广州市香港科大霍英东研究院 | Photovoltaic power generation system power predicting method of elman-based neural network |
CN106709197A (en) * | 2016-12-31 | 2017-05-24 | 浙江大学 | Molten iron silicon content predicting method based on slide window T-S fuzzy neural network model |
2018
- 2018-03-12 CN CN201810198547.5A patent/CN108446799B/en active Active
Non-Patent Citations (3)
Title |
---|
Gas/Liquid Two-Phase Flow Regime Recognition by Combining the Features of Shannon's Entropy with the Improved Elman Network; Jing Liu et al.; 2009 Fifth International Conference on Natural Computation; 20091228; pp. 53-57 * |
Neural Networks Based Predictive Control for TRT; Chunjie Yang et al.; 2005 International Conference on Neural Networks and Brain; 20060410; pp. 1041-1044 * |
Research on Power Prediction Methods for Grid-Connected Photovoltaic Power Plants; Shi Lei; China Master's Theses Full-text Database, Engineering Science and Technology II; 20170315; pp. 19-29 * |
Also Published As
Publication number | Publication date |
---|---|
CN108446799A (en) | 2018-08-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110705692B (en) | Nonlinear dynamic industrial process product prediction method of space-time attention network | |
CN107526927B (en) | Blast furnace molten iron quality online robust soft measurement method | |
CN111444942B (en) | Intelligent forecasting method and system for silicon content of blast furnace molten iron | |
CN113761787B (en) | Online prediction method and system for silicon content of blast furnace molten iron based on deep migration network | |
CN108446799B (en) | Residual pressure turbine device power generation power prediction method based on Elman neural network | |
CN109992921A (en) | A kind of online soft sensor method and system of the coal-fired plant boiler thermal efficiency | |
Lin et al. | Model of hot metal silicon content in blast furnace based on principal component analysis application and partial least square | |
CN105574297B (en) | Self adaptation blast furnace molten iron silicon content trend prediction method | |
CN104899425A (en) | Variable selection and forecast method of silicon content in molten iron of blast furnace | |
CN106156434A (en) | Sliding window time difference Gaussian process regression modeling method based on the low and deep structure of local time | |
Zhou et al. | Fast just-in-time-learning recursive multi-output LSSVR for quality prediction and control of multivariable dynamic systems | |
CN107299170A (en) | A kind of blast-melted quality robust flexible measurement method | |
CN111832703B (en) | Irregular sampling dynamic sequence modeling method for process manufacturing industry | |
CN114841073A (en) | Instant learning semi-supervised soft measurement modeling method based on local label propagation | |
Li et al. | Dual ensemble online modeling for dynamic estimation of hot metal silicon content in blast furnace system | |
CN110222825B (en) | Cement product specific surface area prediction method and system | |
Hu et al. | A simplified recursive dynamic PCA based monitoring scheme for imperial smelting process | |
Li et al. | Adaptive soft sensor based on a moving window just-in-time learning LS-SVM for distillation processes | |
Wu et al. | Time series online prediction algorithm based on least squares support vector machine | |
Tian et al. | A new incremental learning modeling method based on multiple models for temperature prediction of molten steel in LF | |
CN115688865A (en) | Industrial soft measurement method for long and short term memory network for flue gas of desulfurization process | |
Meng et al. | Prediction of Silicon Content of Hot Metal in Blast Furnace Based on Optuna-GBDT | |
Jiu-sun et al. | Subspace method for identification and control of blast furnace ironmaking process | |
CN112380779B (en) | Robust soft measurement method and system for sintering end point | |
Jiang et al. | A trend prediction method based on fusion model and its application |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||