CN115374998A - Load prediction method based on attention mechanism and CNN-BiGRU - Google Patents


Info

Publication number
CN115374998A
CN115374998A
Authority
CN
China
Prior art keywords
data
attention
cnn
bigru
load prediction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210793199.2A
Other languages
Chinese (zh)
Inventor
吴晓刚
赵汉鹰
吴新华
陶毓锋
唐雅洁
张雪松
叶吉超
季青锋
陈文进
张俊
陈菁伟
张若伊
祝巍蔚
陈楠
张有鑫
周逸之
杜倩昀
蒋舒婷
张滨滨
徐植
叶碧琦
徐文
胡建鹏
李志浩
龚迪阳
林达
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Electric Power Research Institute of State Grid Zhejiang Electric Power Co Ltd
Lishui Power Supply Co of State Grid Zhejiang Electric Power Co Ltd
Original Assignee
Electric Power Research Institute of State Grid Zhejiang Electric Power Co Ltd
Lishui Power Supply Co of State Grid Zhejiang Electric Power Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Electric Power Research Institute of State Grid Zhejiang Electric Power Co Ltd, Lishui Power Supply Co of State Grid Zhejiang Electric Power Co Ltd filed Critical Electric Power Research Institute of State Grid Zhejiang Electric Power Co Ltd
Priority to CN202210793199.2A priority Critical patent/CN115374998A/en
Publication of CN115374998A publication Critical patent/CN115374998A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/06Energy or water supply
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02JCIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J3/00Circuit arrangements for ac mains or ac distribution networks
    • H02J3/003Load forecast, e.g. methods or systems for forecasting future load demand
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y04INFORMATION OR COMMUNICATION TECHNOLOGIES HAVING AN IMPACT ON OTHER TECHNOLOGY AREAS
    • Y04SSYSTEMS INTEGRATING TECHNOLOGIES RELATED TO POWER NETWORK OPERATION, COMMUNICATION OR INFORMATION TECHNOLOGIES FOR IMPROVING THE ELECTRICAL POWER GENERATION, TRANSMISSION, DISTRIBUTION, MANAGEMENT OR USAGE, i.e. SMART GRIDS
    • Y04S10/00Systems supporting electrical power generation, transmission or distribution
    • Y04S10/50Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Economics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • General Physics & Mathematics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Game Theory and Decision Science (AREA)
  • Development Economics (AREA)
  • Public Health (AREA)
  • Water Supply & Treatment (AREA)
  • Operations Research (AREA)
  • Primary Health Care (AREA)
  • Power Engineering (AREA)
  • Quality & Reliability (AREA)
  • Artificial Intelligence (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a load prediction method based on an attention mechanism and CNN-BiGRU. A bidirectional GRU algorithm and an attention mechanism are added in the preprocessing stage of the historical load data to extract load features, and the improved feature extraction raises the prediction accuracy of the short-term load prediction model; a convolutional neural network algorithm is then used to build the short-term load prediction model and achieve accurate prediction of the short-term load.

Description

Load prediction method based on attention mechanism and CNN-BiGRU
Technical Field
The invention relates to the field of short-term load prediction, in particular to a load prediction method based on an attention mechanism and CNN-BiGRU.
Background
Short-term load forecasting has always been an important guarantee that the power system can provide qualified and reliable electric energy to users. Short-term forecasting for the Nanning power grid mainly predicts the power load about one week into the future, and its accuracy directly affects the work of the planning, marketing, market trading, scheduling and other departments of the power system. The key to short-term power load prediction lies in establishing the load prediction model, but the uncertainty and randomness of the short-term power load place ever higher requirements on that model. As users' demand for electricity diversifies, the factors affecting short-term forecasting accuracy become more complex.
A feature extraction model and a prediction model are designed according to the factors influencing load prediction accuracy. The factors to be considered when designing a short-term load prediction model are: the characteristics of the historical load data; meteorological factors; the date type; appropriate data feature extraction and prediction algorithms; and fault tolerance. The traditional load prediction workflow consists of the following steps: data preprocessing; constructing the load prediction model; training the model; fine-tuning the model parameters; and testing.
In recent years, with the development of artificial intelligence algorithms, work on short-term load prediction has focused mainly on building the prediction model itself, using artificial neural networks, long short-term memory networks, extreme learning machines, support vector machines and the like, while the earlier data preprocessing stage has received less attention, even though it often has a large influence on the final prediction result. Appropriate load data processing can not only improve the prediction result but also save computation by reducing the complexity of the prediction model.
For example, Chinese patent publication No. CN111382891A, "A short-term load prediction method and a short-term load prediction apparatus", discloses a method comprising: acquiring load data of a past period and external parameters of a prediction period; decomposing the past load data by wavelet decomposition into a second-level detail component, a third-level approximation component and a third-level detail component; obtaining the second-level detail component of the predicted load data from the second-level detail component; obtaining the third-level approximation and detail components of the predicted load data from the corresponding components of the past load data together with the external parameters; and reconstructing these components into the predicted load data of the prediction period by wavelet reconstruction. That method can predict short-term load accurately. However, it completes the prediction segment by segment after splitting out the detail components and lacks any preprocessing of the split data, even though early-stage preprocessing often determines the quality of the later prediction result.
Disclosure of Invention
The present invention addresses the above problems by providing a load prediction method based on an attention mechanism and CNN-BiGRU. A bidirectional GRU algorithm and an attention mechanism are added in the preprocessing stage of the historical load data to extract load features, and the improved feature extraction raises the prediction accuracy of the short-term load prediction model; a convolutional neural network algorithm is then used to build the short-term load prediction model and achieve accurate prediction of the short-term load.
The technical problem of the invention is mainly solved by the following technical scheme:
a load prediction method based on an attention mechanism and CNN-BiGRU comprises the following steps:
s1, data acquisition: collecting historical load data of a power grid, and storing the historical load data as time series data;
s2, data preprocessing: inputting data into a CNN layer for feature extraction; the CNN layer comprises an input layer, a convolution layer, an excitation layer, a pooling layer and a full-connection FC layer; before inputting a prediction model, eliminating the influence of data dimension difference on the model in advance, namely normalizing the data;
s3, establishing a prediction model: constructing an Attention module; building a BiGRU module; carrying out model performance verification;
s4, load prediction: and inputting the processed data into a prediction model and outputting a load prediction curve.
A bidirectional GRU algorithm and an attention mechanism are added in the preprocessing stage of the historical load data to extract load features, and the improved feature extraction raises the prediction accuracy of the short-term load prediction model; a convolutional neural network algorithm is then used to build the short-term load prediction model and achieve accurate prediction of the short-term load.
Preferably, the convolutional layer is defined as:
S(i,j) = (x*w)(i,j) = Σ_m Σ_n x(i+m, j+n) w(m,n);
wherein w is the convolution kernel of the convolutional neural network and x is the input, whose dimensionality is consistent with that of the matrix w. The convolutional layer is the module that distinguishes CNN from other neural network algorithms; it performs feature extraction on the data passing through it and can mine deep features out of the data.
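For illustration, the convolutional-layer formula above can be sketched in NumPy as follows (valid padding and stride 1 are assumed; the function name and toy data are illustrative, not from the patent):

```python
import numpy as np

def conv2d(x, w):
    """Discrete 2-D correlation S(i,j) = sum_m sum_n x(i+m, j+n) * w(m, n),
    matching the convolutional-layer definition (valid padding, stride 1)."""
    kh, kw = w.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    S = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # element-wise product of the kernel with the window at (i, j)
            S[i, j] = np.sum(x[i:i+kh, j:j+kw] * w)
    return S

x = np.arange(16, dtype=float).reshape(4, 4)   # toy input feature map
w = np.ones((2, 2)) / 4.0                      # 2x2 averaging kernel
S = conv2d(x, w)
print(S.shape)  # (3, 3)
```

A 4×4 input convolved with a 2×2 kernel yields a 3×3 feature map, consistent with the valid-padding size rule.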
Preferably, the excitation layer applies a non-linear mapping to the convolution output of the neurons, and a ReLU function is adopted as the excitation function of the convolutional neural network: ReLU(x) = max(0, x). The ReLU function is generally chosen because it speeds up the iteration of the model.
Preferably, in step S2, the normalization formula is:
X = (x − x_min) / (x_max − x_min);
wherein x_max is the maximum value of the data, x_min is the minimum value of the data, X is the normalized data set matrix, and x is the input data set sample matrix. Before the data are input into the prediction model, the influence of data dimension differences on the model must be eliminated in advance.
Preferably, the method for building the Attention module in step S3 includes:
s51, distributing attention values to the input information;
and S52, calculating a weighted average value of the input information based on the attention distribution value.
The function of the Attention mechanism is to assign weights to the features output by the CNN layer: key features receive higher weights and non-key features receive smaller weights, thereby improving the prediction accuracy of the model.
Preferably, the attention value assignment comprises: setting a vector q, introducing a scoring function, and assigning attention values by computing the correlation between each input vector and q, as follows:
α_i = softmax(s(x_i, q)) = exp(s(x_i, q)) / Σ_{j=1}^{n} exp(s(x_j, q));
where α_i denotes the attention distribution of the i-th vector and s(x_i, q) is the attention scoring function. Attention is assigned according to the correlation between the input vector and q: the greater the correlation, the greater the attention, and vice versa.
Preferably, the weighted average is calculated as follows:
att(x, q) = Σ_{i=1}^{n} α_i x_i;
where α_i is the degree of attention received by the i-th vector. Weighted averaging raises the weight of the key features; the vectors obtained after weighted averaging have more distinct features, which facilitates the subsequent training and tuning of the model.
Preferably, the BiGRU module is a bidirectional gated recurrent unit comprising two GRU models, one learning the data features in the backward (historical) direction and the other in the forward (future) direction. Historical and future information can thus be used simultaneously and fused during model training, which yields superior performance.
Preferably, in step S3, the data produced by the preprocessing stage is split into a training set and a test set, and the training data and test data are used respectively to verify the loss function and the iteration speed of the model experimentally; this procedure verifies whether the model meets the expected requirements.
The invention has the beneficial effects that:
a load prediction method based on an attention mechanism and CNN-BiGRU is provided, in which a bidirectional GRU algorithm and an attention mechanism are added in the preprocessing stage of the historical load data to extract load features, the improved feature extraction raises the prediction accuracy of the short-term load prediction model, and a convolutional neural network algorithm is used to build the short-term load prediction model and achieve accurate prediction of the short-term load.
Drawings
FIG. 1 is a diagram of the mechanism structure of Attention;
FIG. 2 is a diagram of a BiGRU model structure;
FIG. 3 is a flow chart of the method;
FIG. 4 is a diagram of a prediction model;
FIG. 5 is a graph of model loss values;
fig. 6 is a short term load prediction graph.
Detailed Description
It should be understood that the examples are only for illustrating the present invention and are not intended to limit the scope of the present invention. Further, it should be understood that various changes or modifications of the present invention can be made by those skilled in the art after reading the teaching of the present invention, and these equivalents also fall within the scope of the claims appended to the present application.
The technical scheme of the invention is further specifically described by the following embodiments.
A load prediction method based on an attention mechanism and CNN-BiGRU is realized in the following parts.
Part 1: data preprocessing.
Step one: data acquisition.
Eight days of historical grid load data are collected and stored as time-series data. The data length is 8 × 96 = 768 points, the time unit is minutes (m), and the sampling period T is 15 minutes.
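The patent does not specify how the 768 stored points are turned into model samples; a common approach, sketched here under the assumption of a one-day (96-point) input window and a one-step-ahead target, is a sliding window:

```python
import numpy as np

# 8 days x 96 samples/day at a 15-minute sampling period = 768 load points
# (synthetic values stand in for the collected historical grid load data)
load = np.random.default_rng(0).uniform(200, 600, size=8 * 96)

def make_windows(series, width=96, horizon=1):
    """Slide a window over the series: each sample pairs the previous `width`
    points (inputs) with the point `horizon` steps ahead (target). The window
    width and horizon are illustrative choices, not specified by the patent."""
    X, y = [], []
    for start in range(len(series) - width - horizon + 1):
        X.append(series[start:start + width])
        y.append(series[start + width + horizon - 1])
    return np.array(X), np.array(y)

X, y = make_windows(load)
print(X.shape, y.shape)  # (672, 96) (672,)
```

With 768 points and a 96-point window, 672 input/target pairs are produced for training and testing.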
Step two: feature extraction.
The data collected in step one is input into a CNN layer for feature extraction. The CNN model consists of an input layer, a convolutional layer, an excitation layer, a pooling layer and a fully connected FC layer. The convolutional layer is the module that distinguishes CNN from other neural network algorithms. It is defined as:
S(i,j) = (x*w)(i,j) = Σ_m Σ_n x(i+m, j+n) w(m,n);
where w is the convolution kernel of the convolutional neural network and x is the input, whose dimensionality is consistent with that of the matrix w; the convolutional layer performs feature extraction on the data passing through it and can mine deep features out of the data.
The role of the excitation layer is to apply a non-linear mapping to the convolution output of the neurons. To speed up model iteration, the ReLU function is generally adopted as the excitation function of the convolutional neural network. It is defined as ReLU(x) = max(0, x).
The pooling layer is a structurally simple module that selects among the data features extracted by the convolutional layer. It reduces the amount of computation to a certain degree, effectively avoids over-fitting, and processing by the pooling layer effectively improves the fault tolerance of the model.
The fully connected FC layer acts like a multi-class classifier, and this module contains most of the computation in the CNN model. It performs linear and non-linear processing on the data: the linear operations act mainly on the input data, while the non-linear operations act mainly on the mappings of data between layers. The layer also acts as a memory in which the features extracted by the convolutional layer, the pooled data and the data to be output by the model are stored.
Step three: normalization processing.
Before the data are input into the prediction model, the influence of data dimension differences on the model must be eliminated in advance. The normalization formula is:
X = (x − x_min) / (x_max − x_min);
wherein x_max is the maximum value of the data, x_min is the minimum value of the data, X is the normalized data set matrix, and x is the input data set sample matrix.
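The min-max normalization above can be sketched in NumPy as follows (the function name and example load values are illustrative):

```python
import numpy as np

def min_max_normalize(x):
    """Min-max normalization X = (x - x_min) / (x_max - x_min), mapping the
    load samples into [0, 1] so that differing data dimensions do not bias
    the prediction model."""
    x = np.asarray(x, dtype=float)
    x_min, x_max = x.min(), x.max()
    return (x - x_min) / (x_max - x_min)

loads = np.array([320.0, 410.0, 365.0, 500.0, 275.0])  # hypothetical MW values
X = min_max_normalize(loads)
print(X.min(), X.max())  # 0.0 1.0
```

After normalization the minimum sample maps to 0 and the maximum to 1; predictions can be mapped back with the inverse transform x = X · (x_max − x_min) + x_min.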
Part 2: establishing the prediction model.
Step one: constructing the Attention module.
The function of the Attention mechanism is to assign weights to the features output by the CNN: key features receive higher weights and non-key features receive smaller weights, thereby improving the prediction accuracy of the model.
The Attention mechanism is calculated in two steps:
1) Assign attention values to the input information. A vector q is set, a scoring function is introduced, and attention values are assigned by computing the correlation between each input vector and q: the greater the correlation, the greater the attention, and the smaller the correlation, the smaller the attention. It is defined as follows:
α_i = softmax(s(x_i, q)) = exp(s(x_i, q)) / Σ_{j=1}^{n} exp(s(x_j, q));
where α_i denotes the attention distribution of the i-th vector and s(x_i, q) is the attention scoring function.
2) Based on step 1), calculate a weighted average of the input information:
att(x, q) = Σ_{i=1}^{n} α_i x_i;
where α_i is the degree of attention received by the i-th vector.
FIG. 1 shows the structure of the Attention mechanism, in which x = [x_1, x_2, …, x_n] is the input vector, h_n denotes the output of the hidden layer, a_n denotes the attention probability distribution value output by the Attention mechanism for the hidden layer, and y is the output value of the Attention mechanism.
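The two-step Attention calculation above can be sketched in NumPy; a dot product is assumed here as the scoring function s(x_i, q), since the patent leaves the scoring function abstract:

```python
import numpy as np

def attention_pool(X, q):
    """Two-step Attention: (1) a softmax over the scores s(x_i, q) yields the
    attention distribution alpha_i; (2) the weighted average
    att = sum_i alpha_i * x_i pools the inputs. The dot product used as
    s(x_i, q) is an assumption, not specified by the patent."""
    scores = X @ q                      # s(x_i, q) for each input vector x_i
    scores -= scores.max()              # shift for numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()   # softmax -> alpha_i
    return alpha, alpha @ X             # distribution and weighted average

X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # three input vectors
q = np.array([1.0, 0.0])                             # query vector
alpha, att = attention_pool(X, q)
print(alpha.sum())  # 1.0
```

Vectors more correlated with q receive larger weights, so the pooled vector emphasizes the key features, as described above.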
Step two: building the BiGRU module.
The BiGRU is a bidirectional gated recurrent unit composed of two GRU models, one learning the data features in the backward (historical) direction and the other in the forward (future) direction; historical and future information can thus be used simultaneously and fused during model training, which yields superior performance. The model structure is shown in FIG. 2.
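A minimal NumPy sketch of such a bidirectional GRU (forward pass only, with randomly initialized weights; the layer sizes are illustrative) may help clarify how the two directions are fused by concatenation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(x_t, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step: update gate z, reset gate r, candidate state h~."""
    z = sigmoid(Wz @ x_t + Uz @ h)
    r = sigmoid(Wr @ x_t + Ur @ h)
    h_tilde = np.tanh(Wh @ x_t + Uh @ (r * h))
    return (1 - z) * h + z * h_tilde

def bigru(X, params_f, params_b, hidden):
    """Bidirectional GRU: one GRU reads the sequence forward, the other
    backward; their hidden states are concatenated at each time step."""
    T = len(X)
    hf, hb = np.zeros(hidden), np.zeros(hidden)
    out_f, out_b = [], []
    for t in range(T):
        hf = gru_step(X[t], hf, *params_f)
        out_f.append(hf)
    for t in reversed(range(T)):
        hb = gru_step(X[t], hb, *params_b)
        out_b.append(hb)
    out_b.reverse()
    return np.array([np.concatenate([f, b]) for f, b in zip(out_f, out_b)])

rng = np.random.default_rng(1)
d_in, hidden, T = 4, 8, 10
make = lambda: tuple(rng.normal(scale=0.1, size=s)
                     for s in [(hidden, d_in), (hidden, hidden)] * 3)
H = bigru(rng.normal(size=(T, d_in)), make(), make(), hidden)
print(H.shape)  # (10, 16)
```

Each of the 10 time steps yields a 16-dimensional state: 8 units from the forward GRU and 8 from the backward GRU, so every output position carries both historical and future context.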
Step three: model performance verification.
After the load prediction model based on the attention mechanism and CNN-BiGRU is established, the parameters of the model are adjusted according to the process shown in FIG. 3. The complete prediction model is shown in FIG. 4.
The data processed in the data preprocessing stage is split into a training set and a test set.
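A chronological split is the usual choice for load time series, since shuffling would leak future load into training; the 7:3 ratio below is an illustrative assumption, not specified by the patent:

```python
import numpy as np

def train_test_split(X, y, train_ratio=0.7):
    """Chronological train/test split: the first train_ratio of samples
    train the model, the remainder test it (no shuffling, to avoid leaking
    future load into training). The 0.7 ratio is an assumption."""
    n_train = int(len(X) * train_ratio)
    return (X[:n_train], y[:n_train]), (X[n_train:], y[n_train:])

X = np.arange(20).reshape(10, 2)   # toy samples
y = np.arange(10)                  # toy targets
(train_X, train_y), (test_X, test_y) = train_test_split(X, y)
print(len(train_X), len(test_X))  # 7 3
```

The training portion is used to fit the model and observe the loss function; the held-out portion checks generalization and iteration behaviour as described in the verification step.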
Step four: load prediction.
The processed data, of length 8 × 96 with a time unit of minutes (m) and a sampling period T of 15 minutes, is input into the prediction model; the output load prediction curve is shown in FIG. 6. Although the actual load curve fluctuates strongly and its amplitude changes are large, the predicted curve essentially fits the actual load curve.

Claims (9)

1. A load prediction method based on an attention mechanism and CNN-BiGRU is characterized by comprising the following steps:
s1, data acquisition: collecting historical load data of a power grid, and storing the historical load data as time series data;
s2, data preprocessing: inputting data into a CNN layer for feature extraction; the CNN layer comprises an input layer, a convolution layer, an excitation layer, a pooling layer and a full-connection FC layer; before inputting a prediction model, eliminating the influence of data dimension difference on the model in advance, namely normalizing the data;
s3, establishing a prediction model: constructing an Attention module; building a BiGRU module; carrying out model performance verification;
s4, load prediction: and inputting the processed data into a prediction model and outputting a load prediction curve.
2. The attention mechanism and CNN-BiGRU based load prediction method of claim 1, wherein: the convolutional layer is defined as:
S(i,j) = (x*w)(i,j) = Σ_m Σ_n x(i+m, j+n) w(m,n);
wherein w is the convolution kernel of the convolutional neural network, x is the input, and the dimension of x is consistent with the dimension of the matrix w.
3. The attention mechanism and CNN-BiGRU based load prediction method of claim 1, wherein: the excitation layer applies a non-linear mapping to the convolution output of the neurons, and a ReLU function is used as the excitation function of the convolutional neural network; the function is as follows:
ReLU(x) = max(0, x).
4. the attention mechanism and CNN-BiGRU based load prediction method of claim 1, wherein: in step S2, the normalization formula is:
X = (x − x_min) / (x_max − x_min);
wherein x_max is the maximum value of the data, x_min is the minimum value of the data, X is the normalized data set matrix, and x is the input data set sample matrix.
5. The attention mechanism and CNN-BiGRU based load prediction method of claim 1, wherein: the method for building the Attention module in the step S3 comprises the following steps:
s51, distributing attention values to the input information;
and S52, calculating a weighted average value of the input information based on the attention distribution value.
6. The attention mechanism and CNN-BiGRU based load prediction method of claim 5, wherein: the attention value assignment includes: setting a vector q, introducing a function, and performing attention value allocation by calculating the correlation between an input vector and the q, wherein the process is as follows:
α_i = softmax(s(x_i, q)) = exp(s(x_i, q)) / Σ_{j=1}^{n} exp(s(x_j, q));
where α_i denotes the attention distribution of the i-th vector and s(x_i, q) is the attention scoring function.
7. The attention mechanism and CNN-BiGRU based load prediction method of claim 5, wherein: the weighted average is calculated as follows:
att(x, q) = Σ_{i=1}^{n} α_i x_i;
where α_i is the degree of attention received by the i-th vector.
8. The attention mechanism and CNN-BiGRU based load prediction method of claim 1, wherein: the BiGRU module is a bidirectional gating cycle unit and comprises two GRU models which are respectively used for learning historical data features backwards and learning future data features forwards.
9. The attention mechanism and CNN-BiGRU based load prediction method of claim 1, wherein: in step S3, the data processed in the data preprocessing stage is divided into a training set and a test set, and the loss function and the iteration speed of the model are verified by using the training data and the test data, respectively.
CN202210793199.2A 2022-07-05 2022-07-05 Load prediction method based on attention mechanism and CNN-BiGRU Pending CN115374998A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210793199.2A CN115374998A (en) 2022-07-05 2022-07-05 Load prediction method based on attention mechanism and CNN-BiGRU


Publications (1)

Publication Number Publication Date
CN115374998A true CN115374998A (en) 2022-11-22

Family

ID=84061663

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210793199.2A Pending CN115374998A (en) 2022-07-05 2022-07-05 Load prediction method based on attention mechanism and CNN-BiGRU

Country Status (1)

Country Link
CN (1) CN115374998A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117154723A (en) * 2023-10-31 2023-12-01 国网山东省电力公司信息通信公司 Platform short-term load prediction method and system based on multi-source data and model fusion
CN117154723B (en) * 2023-10-31 2024-04-02 国网山东省电力公司信息通信公司 Platform short-term load prediction method and system based on multi-source data and model fusion

Similar Documents

Publication Publication Date Title
CN110705743B (en) New energy consumption electric quantity prediction method based on long-term and short-term memory neural network
CN111860982A (en) Wind power plant short-term wind power prediction method based on VMD-FCM-GRU
CN110212528B (en) Power distribution network measurement data missing reconstruction method
Liu et al. Heating load forecasting for combined heat and power plants via strand-based LSTM
CN114462718A (en) CNN-GRU wind power prediction method based on time sliding window
CN112381673B (en) Park electricity utilization information analysis method and device based on digital twin
CN114792156A (en) Photovoltaic output power prediction method and system based on curve characteristic index clustering
CN111898825A (en) Photovoltaic power generation power short-term prediction method and device
CN111242355A (en) Photovoltaic probability prediction method and system based on Bayesian neural network
CN115358437A (en) Power supply load prediction method based on convolutional neural network
CN116345555A (en) CNN-ISCA-LSTM model-based short-term photovoltaic power generation power prediction method
CN115374998A (en) Load prediction method based on attention mechanism and CNN-BiGRU
CN113762591B (en) Short-term electric quantity prediction method and system based on GRU and multi-core SVM countermeasure learning
CN117370766A (en) Satellite mission planning scheme evaluation method based on deep learning
CN113487064A (en) Photovoltaic power prediction method and system based on principal component analysis and improved LSTM
CN112836876A (en) Power distribution network line load prediction method based on deep learning
CN116845875A (en) WOA-BP-based short-term photovoltaic output prediction method and device
CN111476402A (en) Wind power generation capacity prediction method coupling meteorological information and EMD technology
CN113642784B (en) Wind power ultra-short-term prediction method considering fan state
CN113779861B (en) Photovoltaic Power Prediction Method and Terminal Equipment
CN112766537B (en) Short-term electric load prediction method
CN115238951A (en) Power load prediction method and device
CN114897204A (en) Method and device for predicting short-term wind speed of offshore wind farm
CN113837490A (en) Stock closing price prediction method for generating confrontation network based on wavelet denoising
CN113112085A (en) New energy station power generation load prediction method based on BP neural network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination