CN112561058B - Short-term photovoltaic power prediction method based on Stacking-integrated learning - Google Patents

Short-term photovoltaic power prediction method based on Stacking-integrated learning

Info

Publication number
CN112561058B
Authority
CN
China
Prior art keywords
prediction
input
gate
learner
output
Prior art date
Legal status
Active
Application number
CN202011469711.5A
Other languages
Chinese (zh)
Other versions
CN112561058A (en)
Inventor
黎湛联
陈嘉铭
丁伟锋
陈顺
蔡涌烽
Current Assignee
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date
Filing date
Publication date
Application filed by Guangdong University of Technology
Priority to CN202011469711.5A
Publication of CN112561058A
Application granted
Publication of CN112561058B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00: Administration; Management
    • G06Q10/04: Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214: Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y04: INFORMATION OR COMMUNICATION TECHNOLOGIES HAVING AN IMPACT ON OTHER TECHNOLOGY AREAS
    • Y04S: SYSTEMS INTEGRATING TECHNOLOGIES RELATED TO POWER NETWORK OPERATION, COMMUNICATION OR INFORMATION TECHNOLOGIES FOR IMPROVING THE ELECTRICAL POWER GENERATION, TRANSMISSION, DISTRIBUTION, MANAGEMENT OR USAGE, i.e. SMART GRIDS
    • Y04S10/00: Systems supporting electrical power generation, transmission or distribution
    • Y04S10/50: Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications

Abstract

The invention discloses a short-term photovoltaic power prediction method based on Stacking ensemble learning. A first-layer prediction model, composed of base learners built on several different models, is trained on a sample set of historical photovoltaic prediction data; the prediction results of the first-layer prediction model are then fed into a meta learner to train a second-layer prediction model; finally, the input variables of the day to be predicted are passed through the first-layer and second-layer prediction models to obtain the prediction result. By taking into account the differences in training principles among different deep learning models, the method gives full play to the strengths of each model and markedly improves prediction accuracy compared with conventional single-model prediction.

Description

Short-term photovoltaic power prediction method based on Stacking-integrated learning
Technical Field
The invention relates to the field of photovoltaic power prediction, and in particular to a short-term photovoltaic power prediction method based on Stacking ensemble learning.
Background
At present, photovoltaic power prediction methods fall mainly into physical models, statistical models and artificial intelligence models. Physical models study the weather evolution process through mathematical modelling and make predictions from physical models of photoelectric and wind power conversion; they do not need large amounts of data, but the resulting models are complex, computationally expensive and have poor anti-interference capability. Statistical models predict power from statistical relationships among historical measurement data, which effectively alleviates the problem of prediction delay; however, they place high demands on the processing of the raw data and on the stationarity of the time series, and they struggle to reflect the influence of nonlinear factors. Artificial intelligence models, with their strong nonlinear fitting capability, have therefore been applied to photovoltaic power prediction in recent years. However, traditional machine learning models are mostly shallow neural networks, which makes it difficult to mine the deep features linking photovoltaic power to influencing factors such as total solar radiation intensity and scattered horizontal radiation, and hence difficult to improve prediction accuracy and generalization capability.
Compared with traditional machine learning, deep learning is less disturbed by noise and can fully mine the correlations within the data, providing strong support for predicting the generation power of renewable energy. However, performing photovoltaic prediction with only a single method is problematic: because the hypothesis space of the photovoltaic prediction problem is large, multiple hypotheses may achieve the same performance on the training set, and relying on a single model can lead to poor generalization performance due to randomness.
Some studies use combined prediction to further improve model prediction accuracy. Patent document CN109767353A (published 17 May 2019) discloses a photovoltaic power generation power prediction method based on probability distribution functions: a general distribution fitting method is used to fit the probability density function and cumulative distribution function of the photovoltaic power generation power for the corresponding interval, inverse transform sampling is performed on the probability density function to obtain a number of photovoltaic power generation power values corresponding to a known irradiance, and the average of these power values over the photovoltaic output scenario set is taken as the predicted photovoltaic power generation power for that irradiance.
However, most combined prediction schemes simply average the predictions of several algorithm models, or of models of the same algorithm with different parameters. Such averaging cannot represent the differences in how different prediction algorithms observe the data, does not allow each algorithm to train a better model by drawing on the strengths of the others, lacks sufficient theoretical support and rests on a thin rationale.
Therefore, how to provide a short-term photovoltaic power prediction method that solves the above technical problems is an urgent need for those skilled in the art.
Disclosure of Invention
The invention provides a short-term photovoltaic power prediction method based on Stacking ensemble learning, which aims to overcome the technical defects of existing prediction models in the prior art, namely low prediction accuracy, poor generalization capability and insufficient stability and accuracy.
In order to achieve the above purpose, the technical scheme adopted by the invention is as follows:
a short-term photovoltaic power prediction method based on Stacking-ensemble learning comprises the following steps:
s1, acquiring historical photovoltaic power generation power data and historical weather forecast data and performing data preprocessing, so as to construct a photovoltaic forecast data sample set;
s2, dividing the photovoltaic prediction data sample set into a basic model training set and a meta learner training set;
s3, inputting the basic model training set into a plurality of basic learners for training, and establishing a plurality of basic learner prediction models, so that training of a first layer of prediction models is completed;
s4, inputting the meta learner training set into a plurality of base learner prediction models of the first layer prediction model, outputting respective prediction results by each base learner prediction model, and inputting the prediction results of the first layer prediction model into a meta learner for training, so that training of the meta learner prediction model in the second layer prediction model is completed;
s5, inputting the input variable of the day to be predicted into the trained first layer of prediction model and outputting a prediction result, and taking a plurality of prediction results of the first layer of prediction model as the input variable of the meta-learner prediction model of the trained second layer of prediction model to obtain the final predicted photovoltaic output power.
Preferably, in step S1, the data preprocessing yields time series of the historical photovoltaic power generation data and the historical weather forecast data, comprising the photovoltaic power generation power P, the total solar radiation intensity G, the scattered horizontal radiation intensity D, the wind speed W_s, the temperature T, the relative humidity H, the wind direction W_d and the daily rainfall R; these time series are then taken as the photovoltaic prediction data sample set.
Preferably, in step S2, the time series of the historical weather forecast data are taken as the input variables of the photovoltaic prediction data sample set, and the time series of the historical photovoltaic power generation power is taken as the output variable, forming the photovoltaic prediction data sample set S = {P_n, G_n, D_n, W_sn, T_n, H_n, W_dn, R_n}, where P_n is the predicted (target) value corresponding to the n-th sample and x_n = {G_n, D_n, W_sn, T_n, H_n, W_dn, R_n} is the input feature vector corresponding to the n-th sample.
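Purely as an illustration, and not as part of the claimed method, the construction of this sample set can be sketched in Python; the function name build_sample_set and the use of NumPy arrays for the time series are assumptions made only for this example:

    import numpy as np

    def build_sample_set(P, G, D, Ws, T, H, Wd, R):
        """Assemble the photovoltaic prediction data sample set.

        Each row of X is an input feature vector
        x_n = {G_n, D_n, W_sn, T_n, H_n, W_dn, R_n} taken from the weather
        forecast time series; y_n = P_n is the corresponding photovoltaic
        power. All arguments are 1-D arrays of equal length (one entry per
        time step of the preprocessed time series).
        """
        X = np.column_stack([G, D, Ws, T, H, Wd, R])  # shape (N, 7)
        y = np.asarray(P)                             # shape (N,)
        return X, y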
Preferably, in step S3, the plurality of base learners comprises a long short-term memory learner, a gated recurrent unit learner and a recurrent neural network learner.
Preferably, in step S3, the step of establishing the long short-term memory (LSTM) learner prediction model comprises:
S311, taking the input feature vector x_t as the input at time t.
S312, setting the long short-term memory unit to comprise a forget gate, an input gate and an output gate, where the forget gate f_t at time t, the input gate i_t at time t, the output gate g_t at time t, the cell state update c_t and the prediction output ŷ_t^LSTM of the long short-term memory learner are given by:
f_t = σ(U_f x_t + W_f h_{t-1} + M_f c_{t-1} + b_f)    (1)
i_t = σ(U_i x_t + W_i h_{t-1} + M_i c_{t-1} + b_i)    (2)
g_t = σ(U_o x_t + W_o h_{t-1} + M_o c_{t-1} + b_o)    (3)
c_t = f_t ⊙ c_{t-1} + i_t ⊙ tanh(U_c x_t + W_c h_{t-1} + b_c)    (4)
ŷ_t^LSTM = g_t ⊙ tanh(c_t)    (5)
where σ is the sigmoid activation function, which maps a value to a value between 0 and 1, x_t is the input at the current time, h_{t-1} is the hidden state at the previous time, c_{t-1} is the cell state at the previous time, U_f, U_i, U_o, U_c are the input weight matrices of the forget gate, input gate, output gate and cell state update, respectively, W_f, W_i, W_o, W_c are the corresponding recurrent weight matrices, M_f, M_i, M_o are the cell state weight matrices of the forget gate, input gate and output gate, b_f, b_i, b_o, b_c are the biases of the forget gate, input gate, output gate and cell state update, respectively, and ⊙ denotes element-wise multiplication.
S313, taking ŷ_t^LSTM as the prediction result output at time t.
Preferably, in step S3, the step of establishing the gated recurrent unit (GRU) learner prediction model comprises:
S321, taking the input feature vector x_t as the input at time t.
S322, the gated recurrent unit comprises an input layer, a hidden layer and an output layer; the core of the gated recurrent unit is the two gates of the hidden layer, which selectively let information influence the final result by controlling the historical data. The hidden layer comprises an update gate and a reset gate, where z_t and r_t denote the update gate and the reset gate at time t, as in equations (6)-(9); the hidden state h_t is then updated from z_t and the candidate state h̃_t:
z_t = σ(U_z x_t + W_z h_{t-1} + b_z)    (6)
r_t = σ(U_r x_t + W_r h_{t-1} + b_r)    (7)
h̃_t = tanh(U x_t + W (r_t ⊙ h_{t-1}) + b)    (8)
ŷ_t^GRU = h_t = (1 - z_t) ⊙ h_{t-1} + z_t ⊙ h̃_t    (9)
where σ is the sigmoid activation function, which maps a value to a value between 0 and 1, h_t and h_{t-1} are the activation states at time t and time t-1, respectively, h̃_t is the candidate activation, W_z, W_r are the recurrent weight matrices of the update gate and the reset gate, W is the hidden layer-hidden layer connection weight, U_z, U_r are the input weight matrices of the update gate and the reset gate, U is the input-hidden layer connection weight matrix, b_z, b_r are the biases of the update gate and the reset gate, and b is the bias.
S323, taking ŷ_t^GRU as the prediction result output at time t.
Preferably, in step S3, the step of establishing the recurrent neural network (RNN) learner prediction model comprises:
S331, taking the input feature vector x_t as the input at time t.
S332, the input of the recurrent neural network is x_t and the output is ŷ_t^RNN; the hidden layer h, called the memory cell, has the ability to store information, and its output influences the input at the next time. Assuming that the input at time t is x_t, the output is ŷ_t^RNN and the hidden state is h_t, the hidden state is updated from the input x_t at the current time together with the hidden state at the previous time, as follows:
h_t = f(U x_t + W h_{t-1} + b)    (10)
ŷ_t^RNN = softmax(V h_t + c)    (11)
S_i = exp(Q_i) / Σ_{j=1..C} exp(Q_j)    (12)
where Q_i denotes the output of the unit preceding the classifier, i is the class index, C is the total number of classes, S_i is the ratio of the exponential of the current element to the sum of the exponentials of all elements, b and c are biases, U, V, W are the weight matrices of the input-hidden layer, hidden layer-output and hidden layer-hidden layer connections, respectively, h_t denotes the hidden layer at time t, the values of the weight matrices U, V, W are shared across all times, and f(x) is the tanh function or the ReLU function among the nonlinear activation functions.
S333, taking ŷ_t^RNN as the prediction result output at time t.
Preferably, in step S4, the meta learner prediction model is a long short-term memory (LSTM) learner prediction model.
Preferably, the input variable of the day to be predicted is the input feature vector x_t = {G_t, D_t, W_st, T_t, H_t, W_dt, R_t} of the weather forecast for that day.
Compared with the prior art, the technical scheme of the invention has the following beneficial effects: the Stacking ensemble learning method provided by the invention takes into account the differences in data observation and training principles among different algorithms, and gives full play to the strengths of each model in the prediction process. Moreover, the stronger the learning ability of each base learner and the lower the correlation between the base learners, the better the final prediction effect. Compared with traditional single-model prediction, the short-term photovoltaic power prediction method based on Stacking ensemble learning therefore achieves higher prediction accuracy.
Drawings
Fig. 1 is a schematic flow chart of the short-term photovoltaic power prediction method based on Stacking ensemble learning according to an embodiment of the present invention.
Fig. 2 compares the prediction effect of the short-term photovoltaic power prediction method based on Stacking ensemble learning with that of conventional single-model prediction methods, according to an embodiment of the present invention.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the present patent;
for the purpose of better illustrating the embodiments, certain elements of the drawings may be omitted, enlarged or reduced and do not represent the actual product dimensions;
it will be appreciated by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted.
The technical scheme of the invention is further described below with reference to the accompanying drawings and examples.
As shown in fig. 1, the method for predicting short-term photovoltaic power based on Stacking-ensemble learning provided by the application comprises the following steps:
s1, acquiring historical photovoltaic power generation power data and historical weather forecast data and performing data preprocessing, so as to construct a photovoltaic forecast data sample set;
s2, dividing the data sample set into a basic model training set and a meta learner training set;
s3, inputting the basic model training set into a plurality of basic learners for training, and establishing a plurality of basic learner prediction models, so that training of a first layer of prediction models is completed;
S4, inputting the meta learner training set into the plurality of base learner prediction models of the first layer prediction model, each base learner prediction model outputting its own prediction result, and inputting the prediction results of the first layer prediction model into the meta learner for training, so that training of the meta learner prediction model in the second layer prediction model is completed;
s5, inputting the input variable of the day to be predicted into the trained first layer of prediction model and outputting a prediction result, and taking the prediction result of the first layer of prediction model as the input variable of the meta-learner prediction model of the trained second layer of prediction model to obtain the final predicted photovoltaic output power.
In step S1, the data preprocessing yields time series of the historical photovoltaic power generation data and the historical weather forecast data, comprising the photovoltaic power generation power P, the total solar radiation intensity G, the scattered horizontal radiation intensity D, the wind speed W_s, the temperature T, the relative humidity H, the wind direction W_d and the daily rainfall R; these time series are then taken as the photovoltaic prediction data sample set.
In step S1, the photovoltaic power station used to collect data in this embodiment is the Alice Springs photovoltaic power station in Australia. The station consists of 22 photovoltaic panels rated at 250 W each, giving an array rating of 5.5 kW, and feeds the grid through an inverter. The time resolution is 5 min, i.e. 288 data points per day.
In step S2, the time series of the historical weather forecast data are used as the input variables of the photovoltaic prediction data sample set, and the time series of the historical photovoltaic power generation power is used as the output variable. The photovoltaic prediction data sample set covers two years, with a total of 172800 historical photovoltaic power generation and weather forecast data points. 86400 samples are taken as the base model training set and the other 86400 samples as the meta learner training set. Specifically, the photovoltaic prediction data sample set S = {P_n, G_n, D_n, W_sn, T_n, H_n, W_dn, R_n} is formed, where P_n is the predicted (target) value corresponding to the n-th sample and x_n = {G_n, D_n, W_sn, T_n, H_n, W_dn, R_n} is the input feature vector corresponding to the n-th sample.
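As a rough illustration of this split (a sketch only, assuming the two-year data have already been arranged chronologically into arrays X and y as in the earlier sketch; the variable names are illustrative):

    # 172800 samples in total at 5-minute resolution (288 points per day).
    N_BASE = 86400
    X_base, y_base = X[:N_BASE], y[:N_BASE]    # base model training set (first layer)
    X_meta, y_meta = X[N_BASE:], y[N_BASE:]    # meta learner training set (second layer)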
In step S3, the plurality of base learners comprises a long short-term memory learner (Long Short-Term Memory, LSTM), a gated recurrent unit learner (Gated Recurrent Unit, GRU) and a recurrent neural network learner (Recurrent Neural Network, RNN).
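Purely as an illustration (the patent does not prescribe any particular software framework), the three base learners could be built along the following lines; the Keras API, the layer sizes, the optimizer and the window length of 12 time steps are all assumptions made for this sketch:

    import tensorflow as tf
    from tensorflow.keras import layers, models

    def make_base_learner(cell: str, timesteps: int, n_features: int = 7) -> tf.keras.Model:
        """Build one first-layer base learner (LSTM, GRU or simple RNN).

        The recurrent layer plays the role of the gate equations below; a
        Dense(1) head maps the final hidden state to the predicted power.
        """
        rnn_layer = {"lstm": layers.LSTM, "gru": layers.GRU, "rnn": layers.SimpleRNN}[cell]
        model = models.Sequential([
            layers.Input(shape=(timesteps, n_features)),
            rnn_layer(64),
            layers.Dense(1),
        ])
        model.compile(optimizer="adam", loss="mse")
        return model

    base_learners = {name: make_base_learner(name, timesteps=12) for name in ("lstm", "gru", "rnn")}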
In step S3, the step of establishing the long short-term memory (LSTM) learner prediction model comprises:
S311, taking the input feature vector x_t as the input at time t.
S312, setting the long short-term memory unit to comprise a forget gate, an input gate and an output gate, where the forget gate f_t at time t, the input gate i_t at time t, the output gate g_t at time t, the cell state update c_t and the prediction output ŷ_t^LSTM of the long short-term memory learner are given by:
f_t = σ(U_f x_t + W_f h_{t-1} + M_f c_{t-1} + b_f)    (1)
i_t = σ(U_i x_t + W_i h_{t-1} + M_i c_{t-1} + b_i)    (2)
g_t = σ(U_o x_t + W_o h_{t-1} + M_o c_{t-1} + b_o)    (3)
c_t = f_t ⊙ c_{t-1} + i_t ⊙ tanh(U_c x_t + W_c h_{t-1} + b_c)    (4)
ŷ_t^LSTM = g_t ⊙ tanh(c_t)    (5)
where σ is the sigmoid activation function, which maps a value to a value between 0 and 1, x_t is the input at the current time, h_{t-1} is the hidden state at the previous time, c_{t-1} is the cell state at the previous time, U_f, U_i, U_o, U_c are the input weight matrices of the forget gate, input gate, output gate and cell state update, respectively, W_f, W_i, W_o, W_c are the corresponding recurrent weight matrices, M_f, M_i, M_o are the cell state weight matrices of the forget gate, input gate and output gate, b_f, b_i, b_o, b_c are the biases of the forget gate, input gate, output gate and cell state update, respectively, and ⊙ denotes element-wise multiplication.
S313, taking ŷ_t^LSTM as prediction result 1 output at time t.
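A minimal NumPy sketch of a single step of this LSTM cell, mirroring equations (1)-(5) as written above (the parameter dictionary p and its key names are assumptions for the example; in practice the weights are learned during training):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def lstm_step(x_t, h_prev, c_prev, p):
        """One LSTM step per Eqs. (1)-(5); p holds the weight matrices and biases."""
        f_t = sigmoid(p["Uf"] @ x_t + p["Wf"] @ h_prev + p["Mf"] @ c_prev + p["bf"])    # (1) forget gate
        i_t = sigmoid(p["Ui"] @ x_t + p["Wi"] @ h_prev + p["Mi"] @ c_prev + p["bi"])    # (2) input gate
        g_t = sigmoid(p["Uo"] @ x_t + p["Wo"] @ h_prev + p["Mo"] @ c_prev + p["bo"])    # (3) output gate
        c_t = f_t * c_prev + i_t * np.tanh(p["Uc"] @ x_t + p["Wc"] @ h_prev + p["bc"])  # (4) cell state update
        y_t = g_t * np.tanh(c_t)   # (5) prediction output, also used as the next hidden state h_t
        return y_t, c_t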
In step S3, the step of establishing the gated recurrent unit (GRU) learner prediction model comprises:
S321, taking the input feature vector x_t as the input at time t.
S322, the gated recurrent unit comprises an input layer, a hidden layer and an output layer; the core of the gated recurrent unit is the two gates of the hidden layer, which selectively let information influence the final result by controlling the historical data. The hidden layer comprises an update gate and a reset gate, where z_t and r_t denote the update gate and the reset gate at time t, as in equations (6)-(9); the hidden state h_t is then updated from z_t and the candidate state h̃_t:
z_t = σ(U_z x_t + W_z h_{t-1} + b_z)    (6)
r_t = σ(U_r x_t + W_r h_{t-1} + b_r)    (7)
h̃_t = tanh(U x_t + W (r_t ⊙ h_{t-1}) + b)    (8)
ŷ_t^GRU = h_t = (1 - z_t) ⊙ h_{t-1} + z_t ⊙ h̃_t    (9)
where σ is the sigmoid activation function, which maps a value to a value between 0 and 1, h_t and h_{t-1} are the activation states at time t and time t-1, respectively, h̃_t is the candidate activation, W_z, W_r are the recurrent weight matrices of the update gate and the reset gate, W is the hidden layer-hidden layer connection weight, U_z, U_r are the input weight matrices of the update gate and the reset gate, U is the input-hidden layer connection weight matrix, b_z, b_r are the biases of the update gate and the reset gate, and b is the bias.
S323, taking ŷ_t^GRU as prediction result 2 output at time t.
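Analogously, one step of the GRU per equations (6)-(9), reusing the sigmoid helper and the NumPy import from the LSTM sketch above (again with assumed parameter names):

    def gru_step(x_t, h_prev, p):
        """One GRU step per Eqs. (6)-(9); p holds the weight matrices and biases."""
        z_t = sigmoid(p["Uz"] @ x_t + p["Wz"] @ h_prev + p["bz"])           # (6) update gate
        r_t = sigmoid(p["Ur"] @ x_t + p["Wr"] @ h_prev + p["br"])           # (7) reset gate
        h_cand = np.tanh(p["U"] @ x_t + p["W"] @ (r_t * h_prev) + p["b"])   # (8) candidate activation
        h_t = (1.0 - z_t) * h_prev + z_t * h_cand                           # (9) new state = prediction output
        return h_t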
In step S3, the step of establishing the recurrent neural network (RNN) learner prediction model comprises:
S331, taking the input feature vector x_t as the input at time t.
S332, the input of the recurrent neural network is x_t and the output is ŷ_t^RNN; the hidden layer h, called the memory cell, has the ability to store information, and its output influences the input at the next time. Assuming that the input at time t is x_t, the output is ŷ_t^RNN and the hidden state is h_t, the hidden state is updated from the input x_t at the current time together with the hidden state at the previous time, as follows:
h_t = f(U x_t + W h_{t-1} + b)    (10)
ŷ_t^RNN = softmax(V h_t + c)    (11)
S_i = exp(Q_i) / Σ_{j=1..C} exp(Q_j)    (12)
where Q_i denotes the output of the unit preceding the classifier, i is the class index, C is the total number of classes, S_i is the ratio of the exponential of the current element to the sum of the exponentials of all elements, b and c are biases, U, V, W are the weight matrices of the input-hidden layer, hidden layer-output and hidden layer-hidden layer connections, respectively, h_t denotes the hidden layer at time t, the values of the weight matrices U, V, W are shared across all times, and f(x) is the tanh function or the ReLU function among the nonlinear activation functions.
S333, taking ŷ_t^RNN as prediction result 3 output at time t.
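And one step of the simple RNN per equations (10)-(12), on the same assumptions; for the regression task considered here the softmax output of equation (11) would in practice be replaced by a linear output head, so this is only a literal transcription of the equations:

    def softmax(q):
        e = np.exp(q - np.max(q))   # Eq. (12): S_i = exp(Q_i) / sum_j exp(Q_j)
        return e / e.sum()

    def rnn_step(x_t, h_prev, p):
        """One simple-RNN step per Eqs. (10)-(11); p holds the weight matrices and biases."""
        h_t = np.tanh(p["U"] @ x_t + p["W"] @ h_prev + p["b"])   # (10) hidden state update
        y_t = softmax(p["V"] @ h_t + p["c"])                     # (11) output layer
        return y_t, h_t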
In step S4, the meta learner prediction model is a long short-term memory (LSTM) learner prediction model. In step S5, the input variable of the day to be predicted is the input feature vector x_t = {G_t, D_t, W_st, T_t, H_t, W_dt, R_t} of the weather forecast for that day.
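Putting steps S3 to S5 together, the two-layer Stacking procedure could be organised as follows. This is a sketch only: it assumes the base_learners and make_base_learner helper from the earlier Keras sketch, that the training data have been windowed into (samples, timesteps, features) arrays named X_base_seq, X_meta_seq and X_day_seq with y_base and y_meta aligned to those windows, and that the training hyperparameters are illustrative:

    import numpy as np

    # S3: train the first-layer base learners on the base model training set.
    for model in base_learners.values():
        model.fit(X_base_seq, y_base, epochs=50, batch_size=128, verbose=0)

    # S4: run the trained base learners on the meta learner training set;
    # their three outputs form the input of the second-layer (meta) LSTM learner.
    meta_inputs = np.column_stack(
        [m.predict(X_meta_seq).ravel() for m in base_learners.values()]
    )                                                       # shape (n_meta, 3)
    meta_learner = make_base_learner("lstm", timesteps=1, n_features=3)
    meta_learner.fit(meta_inputs[:, None, :], y_meta, epochs=50, batch_size=128, verbose=0)

    # S5: feed the weather forecast features of the day to be predicted through
    # the first layer, then stack the three predictions as the meta learner input.
    base_preds = np.column_stack(
        [m.predict(X_day_seq).ravel() for m in base_learners.values()]
    )
    pv_power_forecast = meta_learner.predict(base_preds[:, None, :]).ravel()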
Table 1 shows the comparison of model error indexes provided in the examples of the present invention.
It can be seen from Table 1 that the short-term photovoltaic power prediction method based on Stacking ensemble learning achieves higher prediction accuracy.
Referring to Fig. 2, it can be seen that the prediction of the short-term photovoltaic power prediction method based on Stacking ensemble learning is closer to the actual values.
The same or similar reference numerals correspond to the same or similar components;
the terms describing the positional relationship in the drawings are merely illustrative, and are not to be construed as limiting the present patent;
it is to be understood that the above examples of the present invention are provided by way of illustration only and not by way of limitation of the embodiments of the present invention. Other variations or modifications of the above teachings will be apparent to those of ordinary skill in the art. It is not necessary here nor is it exhaustive of all embodiments. Any modification, equivalent replacement, improvement, etc. which come within the spirit and principles of the invention are desired to be protected by the following claims.

Claims (6)

1. A short-term photovoltaic power prediction method based on Stacking-integrated learning comprises the following steps:
s1, acquiring historical photovoltaic power generation power data and historical weather forecast data and performing data preprocessing, so as to construct a photovoltaic forecast data sample set;
s2, dividing the photovoltaic prediction data sample set into a basic model training set and a meta learner training set;
s3, inputting the basic model training set into a plurality of basic learners for training, and establishing a plurality of basic learner prediction models, so that training of a first layer of prediction models is completed;
in step S3, the plurality of base learners comprises a long short-term memory learner, a gated recurrent unit learner and a recurrent neural network learner;
in step S3, the step of establishing the long short-term memory learner comprises:
S311, taking the input feature vector x_t as the input at time t;
S312, setting the long short-term memory learner to comprise a forget gate, an input gate and an output gate, wherein the forget gate f_t at time t, the input gate i_t at time t, the output gate g_t at time t, the cell state update c_t and the prediction output ŷ_t^LSTM of the long short-term memory learner are given by:
f_t = σ(U_f x_t + W_f h_{t-1} + M_f c_{t-1} + b_f)    (1)
i_t = σ(U_i x_t + W_i h_{t-1} + M_i c_{t-1} + b_i)    (2)
g_t = σ(U_o x_t + W_o h_{t-1} + M_o c_{t-1} + b_o)    (3)
c_t = f_t ⊙ c_{t-1} + i_t ⊙ tanh(U_c x_t + W_c h_{t-1} + b_c)    (4)
ŷ_t^LSTM = g_t ⊙ tanh(c_t)    (5)
wherein σ is the sigmoid activation function, which maps a value to a value between 0 and 1, x_t is the input at the current time, h_{t-1} is the hidden state at the previous time, c_{t-1} is the cell state at the previous time, U_f, U_i, U_o, U_c are the input weight matrices of the forget gate, input gate, output gate and cell state update, respectively, W_f, W_i, W_o, W_c are the corresponding recurrent weight matrices, M_f, M_i, M_o are the cell state weight matrices of the forget gate, input gate and output gate, b_f, b_i, b_o, b_c are the biases of the forget gate, input gate, output gate and cell state update, respectively, and ⊙ denotes element-wise multiplication;
S313, taking ŷ_t^LSTM as the prediction result output at time t;
in step S3, the step of establishing the gated recurrent unit learner prediction model comprises:
S321, taking the input feature vector x_t as the input at time t;
S322, the gated recurrent unit comprises an input layer, a hidden layer and an output layer; the core of the gated recurrent unit is the two gates of the hidden layer, which selectively let information influence the final result by controlling the historical data; the hidden layer comprises an update gate and a reset gate, wherein z_t and r_t denote the update gate and the reset gate at time t, as in equations (6)-(9), and the hidden state h_t is then updated from z_t and the candidate state h̃_t:
z_t = σ(U_z x_t + W_z h_{t-1} + b_z)    (6)
r_t = σ(U_r x_t + W_r h_{t-1} + b_r)    (7)
h̃_t = tanh(U x_t + W (r_t ⊙ h_{t-1}) + b)    (8)
ŷ_t^GRU = h_t = (1 - z_t) ⊙ h_{t-1} + z_t ⊙ h̃_t    (9)
wherein σ is the sigmoid activation function, which maps a value to a value between 0 and 1, h_t and h_{t-1} are the activation states at time t and time t-1, respectively, h̃_t is the candidate activation, W_z, W_r are the recurrent weight matrices of the update gate and the reset gate, W is the hidden layer-hidden layer connection weight, U_z, U_r are the input weight matrices of the update gate and the reset gate, U is the input-hidden layer connection weight matrix, b_z, b_r are the biases of the update gate and the reset gate, and b is the bias;
S323, taking ŷ_t^GRU as the prediction result output at time t;
in step S3, the step of establishing the recurrent neural network learner prediction model comprises:
S331, taking the input feature vector x_t as the input at time t;
S332, the input of the recurrent neural network is x_t and the output is ŷ_t^RNN; the hidden layer h, called the memory cell, has the ability to store information, and its output influences the input at the next time; assuming that the input at time t is x_t, the output is ŷ_t^RNN and the hidden state is h_t, the hidden state is updated from the input x_t at the current time together with the hidden state at the previous time, as follows:
h_t = f(U x_t + W h_{t-1} + b)    (10)
ŷ_t^RNN = softmax(V h_t + c)    (11)
S_i = exp(Q_i) / Σ_{j=1..C} exp(Q_j)    (12)
wherein Q_i denotes the output of the unit preceding the classifier, i is the class index, C is the total number of classes, S_i is the ratio of the exponential of the current element to the sum of the exponentials of all elements, b and c are biases, U, V, W are the weight matrices of the input-hidden layer, hidden layer-output and hidden layer-hidden layer connections, respectively, h_t denotes the hidden layer at time t, the values of the weight matrices U, V, W are shared across all times, and f(x) is the tanh function or the ReLU function among the nonlinear activation functions;
S333, taking ŷ_t^RNN as the prediction result output at time t;
s4, inputting the meta learner training set into a plurality of base learner prediction models of the first layer prediction model, outputting respective prediction results by each base learner prediction model, and inputting the prediction results of the first layer prediction model into a meta learner for training, so that training of the meta learner prediction model in the second layer prediction model is completed;
s5, inputting the input variable of the day to be predicted into the trained first layer of prediction model, outputting the prediction results, and taking a plurality of prediction results of the first layer of prediction model as the input variable of the meta-learner prediction model of the trained second layer of prediction model to obtain the final predicted photovoltaic output power.
2. The method for short-term photovoltaic power prediction based on Stacking-ensemble learning as claimed in claim 1, wherein in step S1, time series of the historical photovoltaic power generation data and the historical weather forecast data are obtained by the data preprocessing, the time series comprising the photovoltaic power generation power P, the total solar radiation intensity G, the scattered horizontal radiation intensity D, the wind speed W_s, the temperature T, the relative humidity H, the wind direction W_d and the daily rainfall R, and the time series are then taken as the photovoltaic prediction data sample set.
3. The Stacking-ensemble learning-based short-term photovoltaic power prediction method as claimed in claim 2, wherein in step S2, the time series of historical weather forecast data is used as an input variable of a photovoltaic prediction data sample set, and the time series of historical photovoltaic power generation power is used as an output variable of the photovoltaic prediction data sample set.
4. The Stacking-ensemble learning-based short-term photovoltaic power prediction method as claimed in claim 3, wherein a photovoltaic prediction data sample set S = {P_n, G_n, D_n, W_sn, T_n, H_n, W_dn, R_n} is formed, wherein P_n is the predicted (target) value corresponding to the n-th sample and x_n = {G_n, D_n, W_sn, T_n, H_n, W_dn, R_n} is the input feature vector corresponding to the n-th sample.
5. The Stacking-ensemble learning-based short-term photovoltaic power prediction method as claimed in claim 4, wherein in step S4, the meta learner prediction model is a long short-term memory learner prediction model.
6. The method for short-term photovoltaic power prediction based on Stacking-ensemble learning as claimed in claim 2, wherein the input variable of the day to be predicted is the input feature vector x_t = {G_t, D_t, W_st, T_t, H_t, W_dt, R_t} of the weather forecast for that day.
CN202011469711.5A 2020-12-15 2020-12-15 Short-term photovoltaic power prediction method based on Stacking-integrated learning Active CN112561058B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011469711.5A CN112561058B (en) 2020-12-15 2020-12-15 Short-term photovoltaic power prediction method based on Stacking-integrated learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011469711.5A CN112561058B (en) 2020-12-15 2020-12-15 Short-term photovoltaic power prediction method based on Stacking-integrated learning

Publications (2)

Publication Number Publication Date
CN112561058A CN112561058A (en) 2021-03-26
CN112561058B true CN112561058B (en) 2023-06-06

Family

ID=75062981

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011469711.5A Active CN112561058B (en) 2020-12-15 2020-12-15 Short-term photovoltaic power prediction method based on Stacking-integrated learning

Country Status (1)

Country Link
CN (1) CN112561058B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112990593A (en) * 2021-03-29 2021-06-18 广东工业大学 Transformer fault diagnosis and state prediction method based on CSO-ANN-EL algorithm
CN113191091B (en) * 2021-06-03 2022-12-09 上海交通大学 Wind speed prediction method, system and equipment based on hybrid deep learning mechanism
CN113820079A (en) * 2021-07-28 2021-12-21 中铁工程装备集团有限公司 Hydraulic cylinder leakage fault diagnosis method based on cyclostationary theory and Stacking model
CN113866638A (en) * 2021-08-24 2021-12-31 陈九廷 Battery parameter inference method, device, equipment and medium
CN114330935B (en) * 2022-03-10 2022-07-29 南方电网数字电网研究院有限公司 New energy power prediction method and system based on multiple combination strategies integrated learning
CN114692999A (en) * 2022-04-26 2022-07-01 厦门大学 Seawater surface temperature prediction method based on ensemble learning
CN115577857B (en) * 2022-11-15 2023-04-28 南方电网数字电网研究院有限公司 Method and device for predicting output data of energy system and computer equipment
CN117522626A (en) * 2023-11-15 2024-02-06 河北大学 Photovoltaic output prediction method based on feature selection and abnormal multi-model fusion

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108734331A (en) * 2018-03-23 2018-11-02 武汉理工大学 Short-term photovoltaic power generation power prediction method based on LSTM and system
CN110705789A (en) * 2019-09-30 2020-01-17 国网青海省电力公司经济技术研究院 Photovoltaic power station short-term power prediction method
CN111008728A (en) * 2019-11-01 2020-04-14 深圳供电局有限公司 Method for predicting short-term output of distributed photovoltaic power generation system

Also Published As

Publication number Publication date
CN112561058A (en) 2021-03-26

Similar Documents

Publication Publication Date Title
CN112561058B (en) Short-term photovoltaic power prediction method based on Stacking-integrated learning
CN113128793A (en) Photovoltaic power combination prediction method and system based on multi-source data fusion
CN110909919A (en) Photovoltaic power prediction method of depth neural network model with attention mechanism fused
CN109214575A (en) A kind of super short-period wind power prediction technique based on small wavelength short-term memory network
CN115293415A (en) Multi-wind-farm short-term power prediction method considering time evolution and space correlation
CN113052469B (en) Method for calculating wind-solar-water-load complementary characteristic of small hydropower area lacking measurement runoff
CN113554466B (en) Short-term electricity consumption prediction model construction method, prediction method and device
Liu et al. Heating load forecasting for combined heat and power plants via strand-based LSTM
CN114781723A (en) Short-term photovoltaic output prediction method based on multi-model fusion
CN116384583A (en) Photovoltaic power prediction method based on multiple neural networks
CN116341613A (en) Ultra-short-term photovoltaic power prediction method based on Informar encoder and LSTM
CN117233870B (en) Short-term precipitation set forecasting and downscaling method based on multiple meteorological elements
CN113837434A (en) Solar photovoltaic power generation prediction method and device, electronic equipment and storage medium
CN116187540B (en) Wind power station ultra-short-term power prediction method based on space-time deviation correction
CN109447843B (en) Photovoltaic output prediction method and device
CN117113054A (en) Multi-element time sequence prediction method based on graph neural network and transducer
CN111488974A (en) Deep learning neural network-based ocean wind energy downscaling method
CN115099497A (en) CNN-LSTM-based real-time flood forecasting intelligent method
CN112232714B (en) Deep learning-based risk assessment method for distribution network under incomplete structural parameters
CN115764855A (en) Real-time adjustable capacity and available electric quantity prediction method for electric vehicle quick charging station
Li et al. Short-term Power Load Forecasting based on Feature Fusion of Parallel LSTM-CNN
Wan et al. Turbine location wind speed forecast using convolutional neural network
CN113902492A (en) Time-sharing electricity price prediction method and system
CN113449466A (en) Solar radiation prediction method and system for optimizing RELM based on PCA and chaos GWO
Ding et al. Photovoltaic array power prediction model based on EEMD and PSO-KELM

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant