CN110232483A - Deep learning load forecasting method, device and terminal device - Google Patents

Deep learning load forecasting method, device and terminal device

Info

Publication number
CN110232483A
CN110232483A (application CN201910527965.9A); granted publication CN110232483B
Authority
CN
China
Prior art keywords
sub
time window
traveling time
load
default
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910527965.9A
Other languages
Chinese (zh)
Other versions
CN110232483B (en)
Inventor
王颖
荆志朋
邵华
张章
张倩茅
任志刚
齐晓光
张丽洁
袁博
刘芮
习朋
朱士加
赵洪山
任惠
闫西慧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
North China Electric Power University
Economic and Technological Research Institute of State Grid Hebei Electric Power Co Ltd
Original Assignee
North China Electric Power University
Economic and Technological Research Institute of State Grid Hebei Electric Power Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by North China Electric Power University and Economic and Technological Research Institute of State Grid Hebei Electric Power Co Ltd
Priority to CN201910527965.9A
Publication of CN110232483A
Application granted
Publication of CN110232483B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/049 Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/06 Energy or water supply

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Strategic Management (AREA)
  • Human Resources & Organizations (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Marketing (AREA)
  • Biophysics (AREA)
  • General Business, Economics & Management (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Tourism & Hospitality (AREA)
  • Game Theory and Decision Science (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Development Economics (AREA)
  • Public Health (AREA)
  • Water Supply & Treatment (AREA)
  • Primary Health Care (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The present invention is applicable to the technical field of data prediction and provides a deep learning load forecasting method, device and terminal device. The deep learning load forecasting method includes: dividing a forecast interval into multiple first sub-forecast intervals using a first preset moving time window, and dividing historical load data into multiple first sub-load training data, wherein the length of the historical load data corresponds to the length of the forecast interval; training deep learning prediction models of the multiple first sub-forecast intervals with the multiple first sub-load training data, and obtaining a predicted load value of each sub-forecast interval; and determining a final predicted value according to the predicted values of the sub-forecast intervals. By adjusting the granularity with which the forecast interval is cut and selecting appropriate historical load data, the above deep learning load forecasting method is suitable for load forecasting over different durations, and the predicted values obtained are more accurate and reliable.

Description

Deep learning load forecasting method, device and terminal device
Technical field
The invention belongs to the field of data prediction, and more particularly relates to a deep learning load forecasting method, device and terminal device.
Background art
Load forecasting is the scientific prediction of the load over the coming hours, days, months or years, made according to historical load variation patterns in combination with factors such as weather, temperature, economy and politics.
At present, traditional load forecasting methods mostly require different forecasting methods for different forecast horizons, so the load forecasting models constructed in this way are complex and have low versatility. Moreover, with the continuous growth in the penetration of new energy sources (such as wind power and photovoltaic generation) and the electricity-use shifting of flexible, controllable loads, the randomness and uncertainty of power loads keep increasing, which places more stringent requirements on accurate and reliable load forecasting.
Summary of the invention
In view of this, the embodiments of the present invention provide a deep learning load forecasting method, device and terminal device, to solve the problems in the prior art that the constructed load forecasting models are complex, have low versatility, and produce inaccurate and unreliable load forecasts.
A first aspect of the embodiments of the present invention provides a deep learning load forecasting method, comprising:
dividing a forecast interval into multiple first sub-forecast intervals using a first preset moving time window, and dividing historical load data into multiple first sub-load training data, wherein the length of the historical load data corresponds to the length of the forecast interval;
training deep learning prediction models of the multiple first sub-forecast intervals with the multiple first sub-load training data, and obtaining a predicted load value of each sub-forecast interval;
determining a final predicted value according to the predicted values of the sub-forecast intervals.
Preferably, the first preset moving time window is obtained by a clustering algorithm, as follows:
randomly obtaining a group of second preset moving time windows, and dividing the historical load data into multiple groups of second sub-load training data using the group of second preset moving time windows, wherein the multiple groups of second sub-load training data serve as the input of the clustering algorithm;
finding the smallest number of clusters according to the measurement criteria of the clustering algorithm;
selecting the minimum number of clusters, and taking the moving time window of the maximum length corresponding to the minimum as the first preset moving time window.
Preferably, the mean silhouette coefficient and the gap statistic are used as the measurement criteria of the clustering algorithm.
Preferably, the group of second preset moving time windows is set in one of the following ways:
the initial length of each second preset moving time window is the same; the historical load data are divided into one group of second sub-load training data with the second preset moving time window, and, after the length of the second preset moving time window is increased by a preset step ΔT, the historical load data are divided again according to the increased current length of the second preset moving time window, until multiple groups of second sub-load training data are obtained; or
the initial length of each second preset moving time window is different; the length of each second preset moving time window is set according to the rate of change of the historical load data, and, after the length of the second preset moving time window is increased by a preset step ΔT, the historical load data are divided according to the increased current length of the second preset moving time window, until multiple groups of second sub-load training data are obtained.
Preferably, the first preset moving time window moves in one of the following ways:
when the current first preset moving time window performs its next move, if the preset moving step Δt satisfies Δt = T_win,i, the starting position of the next first preset moving time window joins the end position of the current first preset moving time window, where T_win,i is the length of the current first preset moving time window; or
when the current first preset moving time window performs its next move, if the preset moving step Δt satisfies Δt < T_win,i, the next first preset moving time window overlaps the current first preset moving time window, where T_win,i is the length of the current first preset moving time window;
the second preset moving time window moves in one of the following ways:
when the current second preset moving time window performs its next move, if the preset moving step Δt satisfies Δt = T_win,i, the starting position of the next second preset moving time window joins the end position of the current second preset moving time window, where T_win,i is the length of the current second preset moving time window; or
when the current second preset moving time window performs its next move, if the preset moving step Δt satisfies Δt < T_win,i, the starting position of the next second preset moving time window lies within the current second preset moving time window, i.e. the two windows overlap, where T_win,i is the length of the current second preset moving time window.
Preferably, before the deep learning prediction models of the multiple first sub-forecast intervals are trained with the multiple first sub-load training data and the predicted load value of each sub-forecast interval is obtained, the deep learning load forecasting method further includes:
preprocessing the multiple first sub-load training data, correcting abnormal data in the multiple first sub-load training data or filling in missing data in the multiple first sub-load training data;
normalizing the preprocessed multiple first sub-load training data;
and the training of the deep learning prediction models of the multiple first sub-forecast intervals with the multiple first sub-load training data, and the obtaining of the predicted load value of each sub-forecast interval, are specifically:
preprocessing the multiple first sub-load training data, correcting abnormal data in the multiple first sub-load training data or filling in missing data in the multiple first sub-load training data;
normalizing the preprocessed multiple first sub-load training data;
training the deep learning prediction models of the multiple first sub-forecast intervals with the normalized multiple first sub-load training data, and obtaining the predicted load value of each sub-forecast interval.
Preferably, the deep learning prediction model is a deep LSTM prediction model built with the Keras deep learning framework, and the training process of the deep LSTM prediction model is as follows:
obtaining the first sub-load training data of the i-th sub-forecast interval;
searching the LSTM hyperparameters with a grid search;
establishing the deep LSTM prediction model with the first sub-load training data and the LSTM hyperparameters.
Preferably, the algorithm of the deep learning prediction model is a deep auto-encoder neural network.
A second aspect of the embodiments of the present invention provides a deep learning load forecasting device, comprising:
a cutting module, configured to divide a forecast interval into multiple first sub-forecast intervals and to divide historical load data into multiple first sub-load training data;
a prediction module, configured to train the deep learning prediction models of the multiple first sub-forecast intervals with the multiple first sub-load training data and to obtain the predicted load value of each sub-forecast interval;
a determining module, configured to determine a final predicted value according to the predicted load values of the sub-forecast intervals.
A third aspect of the embodiments of the present invention provides a terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of any one of the deep learning load forecasting methods described above.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of any one of the deep learning load forecasting methods described above.
The embodiments of the present invention propose a deep learning load forecasting method in which a preset moving time window divides the forecast interval into multiple sub-forecast intervals and divides the historical load data into multiple sub-load training data; the corresponding multiple sub-forecast intervals are trained with the multiple sub-load training data, and the granularity with which the forecast interval is cut and the selection of appropriate historical load data are adjusted by adjusting the length of the first preset moving time window, so that the method is suitable for load forecasting over different durations.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the embodiments or in the description of the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a flow diagram of a deep learning load forecasting method provided by the present invention;
Fig. 2 is a specific example diagram of a deep learning load forecasting method provided by the present invention;
Fig. 3 is a schematic diagram of time window cutting provided by an embodiment of the present invention;
Fig. 4 is a load forecast comparison diagram provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of a deep learning load forecasting device provided by an embodiment of the present invention;
Fig. 6 is a schematic diagram of a terminal device provided by an embodiment of the present invention.
Specific embodiments
In the following description, specific details such as particular system structures and techniques are presented for the purpose of illustration rather than limitation, so as to provide a thorough understanding of the embodiments of the present invention. However, it will be clear to those skilled in the art that the present invention can also be implemented in other embodiments without these specific details. In other cases, detailed descriptions of well-known systems, devices, circuits and methods are omitted so that unnecessary detail does not obscure the description of the present invention.
In order to illustrate the technical solutions of the present invention, specific embodiments are described below.
Fig. 1 shows a flow diagram of a deep learning load forecasting method provided by the present invention. Referring to Fig. 1, the deep learning load forecasting method provided by the present invention is described in detail as follows.
Step S101: dividing a forecast interval into multiple first sub-forecast intervals using a first preset moving time window, and dividing historical load data into multiple first sub-load training data, wherein the length of the historical load data corresponds to the length of the forecast interval.
Traditional load forecasting methods usually handle short-term load forecasting by clustering the historical load data with different clustering algorithms in order to find load classes similar to the forecast day, and then training a two-stage cascaded BP neural network according to the load classes similar to the forecast day. For medium- and long-term load forecasting, a functional non-parametric regression load forecasting algorithm is generally constructed from functional data analysis theory and non-parametric regression methods; the forecast curve of the non-parametric regression model is then corrected by quadratic programming, and the forecast curve of the specified forecast interval is finally obtained. Traditional load forecasting methods therefore usually design different load forecasting methods for load forecasting tasks of different durations; these methods either have low versatility or build load forecasting models that are quite complex.
Specifically, in the embodiments of the present invention, the forecast interval and the historical load data are first divided with a first preset moving time window, so that a load forecast interval of longer duration can be divided into shorter load forecast intervals, which helps simplify the load forecasting model and unify multiple load forecasting methods.
The historical load data represent the load data over a past period of time; the length of the historical load data corresponds to the length of the forecast interval, and historical load data of corresponding length are selected according to the duration of the specific forecast interval. According to the purpose of load forecasting, the duration of the forecast interval can be classified as ultra-short-term, short-term, medium-term or long-term: the forecast interval of ultra-short-term forecasting is generally controlled within the coming hour; short-term load forecasting refers to daily or weekly load forecasting; medium-term load forecasting refers to load forecasting from a month up to a year; and long-term load forecasting refers to load forecasting over the next three to five years or even longer. The duration of the forecast interval is set independently according to actual needs, and the embodiments of the present invention are not limited in this respect.
Preferably, the first preset moving time window can be obtained by a clustering algorithm. The process of obtaining the first preset moving time window by the clustering algorithm is: randomly obtaining a group of second preset moving time windows, and dividing the historical load data into multiple groups of second sub-load training data using the group of second preset moving time windows, wherein the multiple groups of second sub-load training data serve as the input of the clustering algorithm; finding the smallest number of clusters according to the measurement criteria of the clustering algorithm; and selecting the minimum number of clusters and taking the moving time window of the maximum length corresponding to the minimum as the first preset moving time window.
Specifically, clustering is a statistical analysis method for studying classification problems and is also an important data mining algorithm; using a clustering algorithm can deeply mine the inner relationships among the historical load data and thereby improve the accuracy of the load forecast.
Clustering algorithms can be divided into partitioning methods, hierarchical methods, density-based methods, grid-based methods, model-based methods and so on. The specific clustering algorithm can be chosen freely, and different measurement criteria can be set according to actual needs in order to find the smallest number of clusters; using the moving time window of the maximum length corresponding to the smallest number of clusters as the first preset moving time window can reduce the complexity of the constructed load forecasting model.
Preferably, the mean silhouette coefficient and the gap statistic are used as the measurement criteria of the clustering algorithm.
The silhouette coefficient is an evaluation method for the quality of a clustering result; it combines the two factors of cohesion and separation to evaluate the influence of different clustering algorithms, or of different runs of the same clustering algorithm, on the clustering result. Specifically, in the embodiment of the present invention, the silhouette coefficient of the i-th sub-load training data sample is
s_i = (b_i − a_i) / max(a_i, b_i)
where:
a_i is, for the i-th sub-load training data sample, the average distance from it to every other sample in the class to which it belongs;
b_i is, for the i-th sub-load training data sample, the average distance from it to all samples in the nearest other class. Here "distance" denotes dissimilarity: the larger the distance, the higher the dissimilarity, a condition that the Euclidean distance satisfies. The distance between the above samples is therefore computed with the Euclidean distance, namely
d(x, y) = [ Σ_{j=1}^{n} (x_j − y_j)² ]^(1/2)
where d(x, y) denotes the Euclidean distance between two data samples x and y, and n is the length of the two samples.
The value of the silhouette coefficient lies in the interval [−1, 1]; the larger the value, the better both the cohesion and the separation.
Averaging the silhouette coefficients of all sub-load training data samples gives the silhouette coefficient of the clustering result.
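As an illustration only (not part of the patent text), the following is a minimal sketch of computing the mean silhouette coefficient for candidate cluster numbers with scikit-learn; the array name `segments` and the choice of KMeans as the clustering algorithm are assumptions for the example.

```python
# Mean silhouette coefficient for candidate cluster numbers; `segments` is
# assumed to be a 2-D array whose rows are equal-length sub-load training
# samples produced by one second preset moving time window.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def mean_silhouette_by_k(segments, k_max=10, random_state=0):
    """Return {k: mean silhouette coefficient} for k = 2..k_max."""
    scores = {}
    for k in range(2, min(k_max, len(segments) - 1) + 1):
        labels = KMeans(n_clusters=k, n_init=10,
                        random_state=random_state).fit_predict(segments)
        # silhouette_score averages s_i = (b_i - a_i) / max(a_i, b_i) over all
        # samples, using the Euclidean distance by default.
        scores[k] = silhouette_score(segments, labels, metric="euclidean")
    return scores
```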
The gap statistic determines the number of clusters in a clustering problem. It compares the total within-cluster variation for different values of the cluster number k with its expected value under a reference distribution of the load data. The reference data are generated with a Monte Carlo method: for each sample dimension, its maximum and minimum values are computed, and uniformly distributed random numbers between the minimum and the maximum are then generated. The total within-cluster variation is computed for both the real data and the reference data using different cluster numbers k. For a given cluster number k, the gap statistic is computed as
G_n(k) = E*_n[log(W_k)] − log(W_k), with W_k = Σ_{r=1}^{k} D_r / (2 n_r)
where E*_n denotes the expectation over reference samples of size n; C_r denotes the r-th cluster, n_r = |C_r|, and D_r denotes the sum of the Euclidean distances between the sample points within class C_r.
The calculation steps of this evaluation index are as follows:
1) cluster the actual multiple sub-load training data sets, varying the cluster number k = 1, 2, …, k_max, and compute the corresponding W_k;
2) generate the reference data sets and cluster them, varying the cluster number k = 1, 2, …, k_max, and compute the corresponding G_n(k);
3) let w̄ = (1/B) Σ_b log(W*_kb), where B is the number of reference data sets generated; then compute the standard deviation sd_k = [ (1/B) Σ_b (log(W*_kb) − w̄)² ]^(1/2) and set s_k = sd_k · (1 + 1/B)^(1/2).
Finally, the smallest cluster number k satisfying G_n(k) ≥ G_n(k+1) − s_(k+1) is selected; this k is the optimal number of clusters, and the moving time window of the maximum length corresponding to this cluster number is selected as the first preset moving time window.
When the two measurement criteria select different minimum cluster numbers, the relatively smaller cluster number is preferred as the optimal number of clusters, from which the length of the first preset moving time window is determined.
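Likewise, a minimal sketch of the gap-statistic selection of the cluster number, again assuming KMeans clustering and a 2-D array `segments` of equal-length sub-load training samples; it follows the standard gap-statistic formulation reconstructed above rather than any code taken from the patent.

```python
import numpy as np
from sklearn.cluster import KMeans

def _log_wk(data, k, random_state=0):
    km = KMeans(n_clusters=k, n_init=10, random_state=random_state).fit(data)
    # inertia_ equals W_k when D_r is taken as squared Euclidean distances.
    return np.log(km.inertia_)

def gap_statistic(segments, k_max=10, n_refs=10, random_state=0):
    """Return (chosen k, list of Gap(k)) using uniform Monte Carlo references."""
    rng = np.random.default_rng(random_state)
    lo, hi = segments.min(axis=0), segments.max(axis=0)
    gaps, s_k = [], []
    for k in range(1, k_max + 1):
        log_wk = _log_wk(segments, k)
        ref_logs = np.array([_log_wk(rng.uniform(lo, hi, size=segments.shape), k)
                             for _ in range(n_refs)])
        gaps.append(ref_logs.mean() - log_wk)
        s_k.append(ref_logs.std() * np.sqrt(1.0 + 1.0 / n_refs))
    for k in range(1, k_max):           # smallest k with Gap(k) >= Gap(k+1) - s_{k+1}
        if gaps[k - 1] >= gaps[k] - s_k[k]:
            return k, gaps
    return k_max, gaps
```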
Step S102: training the deep learning prediction models of the multiple first sub-forecast intervals with the multiple first sub-load training data, and obtaining the predicted load value of each sub-forecast interval.
Specifically, training the deep learning prediction models of the multiple first sub-forecast intervals with the multiple first sub-load training data makes the predicted load value obtained for each sub-forecast interval more accurate.
Step S103: determining a final predicted value according to the predicted values of the sub-forecast intervals.
Specifically, the predicted values of the sub-forecast intervals are integrated in chronological order, and the integrated predicted values are denormalized to obtain the final predicted value.
The present invention provides a deep learning load forecasting algorithm that includes: dividing a forecast interval into multiple first sub-forecast intervals using a first preset moving time window, and dividing historical load data into multiple first sub-load training data, wherein the length of the historical load data corresponds to the length of the forecast interval; training the deep learning prediction models of the multiple first sub-forecast intervals with the multiple first sub-load training data, and obtaining the predicted load value of each sub-forecast interval; and determining a final predicted value according to the predicted values of the sub-forecast intervals.
It can be seen that, in the present invention, the forecast interval and the historical load data can be cut with the first preset moving time window, and the deep learning prediction model of each first sub-forecast interval can be trained with the first sub-load training data obtained after cutting. A forecast interval of longer length can thus be cut into forecast intervals of shorter length, so that a single deep learning load forecasting method can realize load forecasting for forecast intervals of different durations; the accuracy of the forecast results can also be increased by adjusting the granularity of the sub-forecast intervals after cutting.
On the basis of the above embodiments:
As a preferred embodiment, the group of second preset moving time windows is set in one of the following ways:
the initial length of each second preset moving time window is the same; the historical load data are divided into one group of second sub-load training data with the second preset moving time window, and, after the length of the second preset moving time window is increased by a preset step ΔT, the historical load data are divided again according to the increased current length of the second preset moving time window, until multiple groups of second sub-load training data are obtained; or
the initial length of each second preset moving time window is different; the length of each second preset moving time window is set according to the rate of change of the historical load data, and, after the length of the second preset moving time window is increased by a preset step ΔT, the historical load data are divided according to the increased current length of the second preset moving time window, until multiple groups of second sub-load training data are obtained.
Specifically, the historical load data divided by each second preset moving time window are clustered separately; by referring to the multiple clustering results, a better length of the first preset time window can be chosen, which in turn optimizes the trained deep learning prediction model and yields more accurate forecast results.
When the length of each second preset moving time window is sufficiently large after the preset step ΔT has been added, the length of the second preset moving time window is no longer increased; the preset step ΔT is determined according to the actual situation, and the embodiments of the present invention are not limited in this respect.
As a preferred embodiment, the first preset moving time window moves in one of the following ways:
when the current first preset moving time window performs its next move, if the preset moving step Δt satisfies Δt = T_win,i, the starting position of the next first preset moving time window joins the end position of the current first preset moving time window, where T_win,i is the length of the current first preset moving time window; or
when the current first preset moving time window performs its next move, if the preset moving step Δt satisfies Δt < T_win,i, the next first preset moving time window overlaps the current first preset moving time window, where T_win,i is the length of the current first preset moving time window;
the second preset moving time window moves in one of the following ways:
when the current second preset moving time window performs its next move, if the preset moving step Δt satisfies Δt = T_win,i, the starting position of the next second preset moving time window joins the end position of the current second preset moving time window, where T_win,i is the length of the current second preset moving time window; or
when the current second preset moving time window performs its next move, if the preset moving step Δt satisfies Δt < T_win,i, the starting position of the next second preset moving time window lies within the current second preset moving time window, i.e. the two windows overlap, where T_win,i is the length of the current second preset moving time window.
Specifically, the moving step Δt satisfies Δt ≤ T_win,i so that the historical load data can be cut completely without omission; when the moving step Δt satisfies Δt < T_win,i, the forecast interval and the historical load data are divided at a finer granularity, so that this deep learning load forecasting method is suitable for load forecasting over forecast intervals of different durations.
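For illustration only, the following is a minimal sketch of cutting a load series with a moving time window of length T_win and moving step Δt ≤ T_win; the function and variable names are assumptions and are not taken from the patent.

```python
import numpy as np

def cut_with_moving_window(load, win_len, step):
    """Cut a 1-D load series into sub-series with a moving time window.

    win_len: window length T_win,i (in samples)
    step:    moving step Δt (in samples); step == win_len gives back-to-back
             windows, step < win_len gives overlapping windows.
    """
    assert 0 < step <= win_len, "Δt must satisfy 0 < Δt <= T_win"
    load = np.asarray(load, dtype=float)
    segments, start = [], 0
    while start + win_len <= len(load):
        segments.append(load[start:start + win_len])
        start += step
    return np.stack(segments) if segments else np.empty((0, win_len))

# Example: a week of 15-minute data (7 * 96 points) cut into 1-day windows
# that move half a day at a time.
week = np.arange(7 * 96, dtype=float)
days = cut_with_moving_window(week, win_len=96, step=48)
```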
As a preferred embodiment, before the deep learning prediction models of the multiple first sub-forecast intervals are trained with the multiple first sub-load training data and the predicted load value of each sub-forecast interval is obtained, the deep learning load forecasting method further includes:
preprocessing the multiple first sub-load training data, correcting abnormal data in the multiple first sub-load training data or filling in missing data in the multiple first sub-load training data;
normalizing the preprocessed multiple first sub-load training data;
and the training of the deep learning prediction models of the multiple first sub-forecast intervals with the multiple first sub-load training data, and the obtaining of the predicted load value of each sub-forecast interval, are specifically:
preprocessing the multiple first sub-load training data, correcting abnormal data in the multiple first sub-load training data or filling in missing data in the multiple first sub-load training data;
normalizing the preprocessed multiple first sub-load training data;
training the deep learning prediction models of the multiple first sub-forecast intervals with the normalized multiple first sub-load training data, and obtaining the predicted load value of each sub-forecast interval.
Specifically, after the historical load data of the same length as the forecast interval are divided into multiple sub-load training data with the first preset moving time window, problems such as manual operation and equipment ageing during sampling mean that the historical load data may contain missing values and abnormal data, i.e. each sub-load training data set may contain missing values and abnormal data. Therefore, the sub-load training data corresponding to each sub-forecast interval must first be preprocessed, correcting the abnormal data or filling in the missing data.
Since deep learning prediction algorithms are relatively sensitive to the scale of the data, the preprocessed data are further normalized:
x'_i = (x_i − min x_i) / (max x_i − min x_i)
where X' denotes the normalized data matrix; t_k denotes the length of the k-th sub-data set; x_i denotes the i-th row vector; and min x_i and max x_i denote the minimum and maximum values of x_i.
Specifically, the preprocessed data are mapped into the range 0 to 1, which makes data processing more convenient and faster.
Of course, besides normalizing the divided sub-load training data, other forms of processing can also be applied; the embodiments of the present invention are not limited in this respect.
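A minimal sketch of the row-wise min-max normalization described above and of its inverse (used later to denormalize the integrated predictions); it assumes each row of `X` is one sub-load training series, and the helper names are illustrative.

```python
import numpy as np

def normalize_rows(X):
    """Map each row of X into [0, 1]; keep (min, max) for later denormalization.
    Assumes each row contains at least two distinct values."""
    X = np.asarray(X, dtype=float)
    row_min = X.min(axis=1, keepdims=True)
    row_max = X.max(axis=1, keepdims=True)
    return (X - row_min) / (row_max - row_min), row_min, row_max

def denormalize_rows(X_norm, row_min, row_max):
    """Inverse of normalize_rows, applied to the model outputs."""
    return X_norm * (row_max - row_min) + row_min
```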
As a preferred embodiment, the deep learning prediction model is a deep LSTM prediction model built with the Keras deep learning framework, and the training process of the deep LSTM prediction model is as follows:
obtaining the first sub-load training data of the i-th sub-forecast interval;
searching the LSTM hyperparameters with a grid search;
establishing the deep LSTM prediction model with the first sub-load training data and the LSTM hyperparameters.
Specifically, Keras is a high-level neural network API (Application Program Interface) written in pure Python (a computer programming language); its purpose is to support fast experimentation and to turn ideas into results quickly, and it has the advantages of being user-friendly, modular and easy to extend.
LSTM stands for deep long short-term memory network (Long Short-Term Memory); it is an improved RNN (Recurrent Neural Network) model. Compared with a standard RNN, an LSTM is better suited to processing and predicting important events with relatively long intervals and delays in a time series.
As a preferred embodiment, the algorithm of the deep learning prediction model is a deep auto-encoder neural network.
Of course, besides the LSTM and the deep auto-encoder neural network, the algorithm of the deep learning prediction model can also be replaced by other types of neural networks; the embodiments of the present invention are not limited in this respect.
Please refer to Fig. 2, which is a specific example diagram of a deep learning load forecasting method provided by the present invention. The steps of the specific example are as follows:
Selecting the input load data:
This includes determining the interval length T of the load forecast and, for different interval lengths (short-term or medium- and long-term load forecasting) and for the desired load resolution of the forecast (i.e. the interval between adjacent load data points), selecting different historical load data. For example, to forecast the load of the coming day, historical load data for the same days of the week can be selected.
Cutting the data set with the moving time window:
Let the length of the forecast interval be T and the length of the i-th second preset moving time window be T_win,i. The second preset moving time window is designed and then moved from the starting endpoint of the historical load data interval T to its end; during the movement, the historical load data covered by each second preset moving time window are cut out and stored separately as second sub-load training data. Cutting the data set with the second preset moving time window serves as a preliminary cutting: the second preset moving time window divides the historical training data of the same length as the forecast interval into multiple second sub-load training data, and the multiple second sub-load training data obtained after cutting serve as the input of the clustering algorithm, so that the first preset moving time window can be found with the clustering algorithm.
Determining the optimal cutting with the clustering algorithm:
The length of the time window should be set neither too short nor too long; a suitable length of the first preset moving time window is determined by the clustering algorithm.
First, the multiple second sub-load training data obtained after cutting with the second preset moving time windows are used as the input of the clustering algorithm, and the optimal number of clusters is found using the mean silhouette coefficient and the gap statistic as the measurement criteria.
Second, the minimum number of clusters and the maximum moving time window length corresponding to this minimum are selected.
Finally, the moving time window of the maximum length corresponding to the minimum number of clusters is taken as the first preset moving time window.
When the two measurement criteria select different minimum numbers of clusters, the relatively smaller number of clusters is preferred as the optimal number of clusters, from which a suitable length of the first preset moving time window is determined.
Normalizing the optimally cut data:
After the historical load data of the same length as the forecast interval are divided into multiple first sub-load training data with the suitable first moving time window, the first sub-load training data corresponding to each sub-forecast interval must first be preprocessed, correcting the abnormal data or filling in the missing data.
Since deep learning prediction algorithms are relatively sensitive to the scale of the data, the preprocessed data are further normalized; the normalized data lie between 0 and 1, which makes data processing easier.
After the data are normalized, other features can be added on the basis of each first sub-load training data set to improve the accuracy of the forecast; the input and output dimensions of the deep learning network can be determined according to the sampling frequency, the other features and each first sub-load training data set.
Modeling and forecasting each sub-forecast interval:
After the sub-load training data have been preprocessed and normalized, a prediction model is established for the corresponding sub-forecast interval, specifically:
first, determining the original data of the i-th sub-forecast interval;
second, searching the LSTM hyperparameters with a grid search;
then, establishing the prediction model according to the LSTM hyperparameters;
finally, obtaining the predicted load value of the i-th sub-interval from the prediction model.
A deep LSTM model is built with the Keras deep learning framework. Because several hyperparameters need to be determined when the LSTM model is built, such as the number of hidden states, the loss function and the number of training iterations, a grid search is used to find the optimal values of the hyperparameters, and the deep LSTM prediction model is then established.
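A minimal sketch of building such a deep LSTM with Keras and grid-searching two of its hyperparameters; the layer sizes, the candidate grid and the helper names are assumptions for illustration and are not values taken from the patent.

```python
import itertools
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_lstm(n_steps, n_features, units, loss="mse"):
    """Two stacked LSTM layers followed by a single output value."""
    model = keras.Sequential([
        layers.LSTM(units, return_sequences=True, input_shape=(n_steps, n_features)),
        layers.LSTM(units),
        layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss=loss)
    return model

def grid_search_lstm(X_train, y_train, X_val, y_val,
                     units_grid=(32, 64), epochs_grid=(50, 100)):
    """Return the best (units, epochs) pair by validation loss."""
    best, best_loss = None, np.inf
    for units, epochs in itertools.product(units_grid, epochs_grid):
        model = build_lstm(X_train.shape[1], X_train.shape[2], units)
        model.fit(X_train, y_train, epochs=epochs, batch_size=32, verbose=0)
        loss = model.evaluate(X_val, y_val, verbose=0)
        if loss < best_loss:
            best, best_loss = (units, epochs), loss
    return best, best_loss
```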
Integrating the predicted values of the sub-forecast intervals, denormalizing them, and outputting the forecast load data:
Specifically, assuming there are k sub-forecast intervals, it is necessary to judge whether the index i of the i-th sub-forecast interval is greater than k; when i > k, the predicted values of the sub-forecast intervals are integrated, which ensures that the predicted value of every sub-forecast interval is counted.
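A minimal sketch of this integration step, assuming each sub-interval prediction is a 1-D array already in chronological order and that the min-max statistics kept at normalization time are available per sub-interval; the names are illustrative.

```python
import numpy as np

def integrate_predictions(sub_predictions, mins, maxs):
    """Concatenate the k sub-interval predictions in chronological order and
    undo the min-max normalization (one (min, max) pair per sub-interval)."""
    restored = [np.asarray(p) * (mx - mn) + mn
                for p, mn, mx in zip(sub_predictions, mins, maxs)]
    return np.concatenate(restored)
```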
As it can be seen that in this deep learning load prediction specific example, according to the length of different forecast intervals, thus it is possible to vary cutting The fine granularity in section, such as forecast interval are long, total forecast interval fine granularity highland can be divided into multiple careful sons Forecast interval;Thus it is suitable for the load prediction of different durations, the prediction algorithm of different prediction lengths can be unified, reduces load The complexity of prediction model building, enhances its versatility;When since the suitable first default movement has been determined using clustering algorithm Between window, it is right and before the deep learning prediction model using first sub- weight training data the first sub- forecast interval of training First sub- weight training data are pre-processed and have been normalized, and then improve prediction progress and data processing speed.
As a preferred embodiment, the above deep learning load forecasting method is verified by forecasting the load of the coming day, where the resolution of the historical load data is 15 minutes. Assume the length of the forecast interval is T and the actual load data are x_t, t = 1, 2, …, T; denote the load data predicted by the deep learning model as x̂_t, t = 1, 2, …, T. One day of data is predicted, so T is 96. The accuracy of the forecast is measured with the following three evaluation indices:
(1) the root-mean-square error error_1;
(2) the mean absolute error error_2;
(3) the forecast accuracy η.
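The exact formulas of the three indices are not reproduced in this text; as an illustration, the following sketch uses standard assumed definitions (root-mean-square error, mean absolute error, and a mean relative error in percent), and the function name is illustrative.

```python
import numpy as np

def evaluate_forecast(actual, predicted):
    """Assumed standard error metrics for a one-day forecast of T = 96 points."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    err = actual - predicted
    rmse = np.sqrt(np.mean(err ** 2))               # assumed form of error_1
    mae = np.mean(np.abs(err))                      # assumed form of error_2
    eta = np.mean(np.abs(err) / actual) * 100.0     # assumed form of η, in percent
    return rmse, mae, eta
```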
Following the steps described above, the optimal time window length is first found with the clustering algorithm; in this example the optimal number of clusters is 4, and the division of the sub-forecast intervals is shown in Fig. 3. A grid search is then used to train the hyperparameters of the deep neural network with the sub-load training data corresponding to each forecast interval. Finally, the load data are predicted with the trained LSTM model based on the historical load data. This algorithm is also compared with a traditional BP neural network, and the forecast results of the two are shown in Fig. 4.
The errors of the algorithm designed in the invention are error_1 = 9271.51 and error_2 = 75.34, and the forecast errors of the traditional BP neural network are error_1 = 26192.09 and error_2 = 128.77; by comparison, the errors of this algorithm relative to the traditional neural network are reduced by Δerror_1 = 16920.58 and Δerror_2 = 53.43, respectively. Furthermore, the forecast accuracy index of the LSTM algorithm is η = 2.19%, while that of the BP neural network is η = 5.15%; the forecast accuracy is improved by 2.96%.
It can be seen that, with the deep LSTM load forecasting method based on a moving time window, the predicted load values obtained are more accurate and reliable. Applying this deep learning load forecasting method to a power system can provide a solid reference for the economic dispatch of the power system and a reliable basis for formulating load dispatching policies and setting regional time-of-use electricity prices.
In addition, this deep learning load forecasting method can also be applied to forecasting the gas consumption of residential areas and the like; the embodiments of the present invention do not limit this.
It should be understood that the sequence numbers of the steps in the above embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic and should not constitute any limitation on the implementation of the embodiments of the present invention.
Fig. 5 is a structural schematic diagram of a deep learning load forecasting device provided by an embodiment of the present invention. Referring to Fig. 5, the device may include a cutting module 50, a prediction module 51 and an output module 52.
The cutting module 50 is used to divide a forecast interval into multiple first sub-forecast intervals and to divide historical load data into multiple first sub-load training data; the prediction module 51 is used to train the deep learning prediction models of the multiple first sub-forecast intervals with the multiple first sub-load training data and to obtain the predicted load value of each sub-forecast interval; the output module 52 is used to determine a final predicted value according to the predicted load values of the sub-forecast intervals.
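As an illustration only, the three modules could be mirrored by a small class such as the following; the class and method names are assumptions, and the model training and prediction are supplied as user callbacks rather than taken from the patent.

```python
# Illustrative mapping of the cutting / prediction / output modules to a class.
import numpy as np

class LoadForecaster:
    def __init__(self, win_len, step, train_fn, predict_fn):
        self.win_len, self.step = win_len, step
        self.train_fn, self.predict_fn = train_fn, predict_fn  # user-supplied

    def cut(self, series):                      # cutting module 50
        n = len(series)
        return [series[s:s + self.win_len]
                for s in range(0, n - self.win_len + 1, self.step)]

    def predict(self, history):                 # prediction module 51
        return [self.predict_fn(self.train_fn(seg)) for seg in self.cut(history)]

    def output(self, sub_predictions):          # output module 52
        return np.concatenate(sub_predictions)
```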
For an introduction to the deep learning load forecasting device in the embodiment of the present invention, please refer to the foregoing embodiments of the deep learning load forecasting method; details are not repeated here in the embodiments of the present invention.
Fig. 6 is a schematic diagram of a terminal device provided by an embodiment of the present invention. As shown in Fig. 6, the terminal device 6 of this embodiment includes a processor 60, a memory 61, and a computer program 62 stored in the memory 61 and executable on the processor 60. When executing the computer program 62, the processor 60 implements the steps in each of the above embodiments of the deep learning load forecasting method, such as the steps shown in Fig. 1; alternatively, when executing the computer program 62, the processor 60 implements the functions of the modules/units in each of the above device embodiments, such as the functions of modules 50 to 52 shown in Fig. 5.
Illustratively, the computer program 62 can be divided into one or more modules/units, which are stored in the memory 61 and executed by the processor 60 to carry out the present invention. The one or more modules/units can be a series of computer program instruction segments capable of completing specific functions, and these instruction segments are used to describe the execution process of the computer program 62 in the load forecasting device 6. For example, the computer program 62 can be divided into a cutting module, a prediction module and an output module, whose specific functions are as follows:
the cutting module is used to divide a forecast interval into multiple first sub-forecast intervals and to divide historical load data into multiple first sub-load training data; the prediction module is used to train the deep learning prediction models of the multiple first sub-forecast intervals with the multiple first sub-load training data and to obtain the predicted load value of each sub-forecast interval; the output module is used to determine a final predicted value according to the predicted load values of the sub-forecast intervals.
The load forecasting device 6 can be a computing device such as a desktop computer, a notebook computer, a palmtop computer or a cloud server. The load forecasting device may include, but is not limited to, the processor 60 and the memory 61. Those skilled in the art will understand that Fig. 6 is only an example of the terminal device 6 and does not constitute a limitation of the terminal device 6; it may include more or fewer components than shown, or combine certain components, or have different components; for example, the terminal device can also include input and output devices, network access devices, buses and so on.
The processor 60 can be a central processing unit (Central Processing Unit, CPU), another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or another programmable logic device, a discrete gate or transistor logic device, discrete hardware components and so on. A general-purpose processor can be a microprocessor, or the processor can be any conventional processor.
The memory 61 can be an internal storage unit of the terminal device 6, such as the hard disk or memory of the load forecasting device 6. The memory 61 can also be an external storage device of the load forecasting device 6, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card or a flash card equipped on the load forecasting device 6. Further, the memory 61 can include both the internal storage unit and the external storage device of the load forecasting device 6. The memory 61 is used to store the computer program and other programs and data needed by the load forecasting device; the memory 61 can also be used to temporarily store data that has been output or is about to be output.
It is apparent to those skilled in the art that, for convenience and brevity of description, the division of the functional units and modules described above is only illustrative. In practical applications, the above functions can be assigned to different functional units and modules as required, i.e. the internal structure of the device can be divided into different functional units or modules to complete all or part of the functions described above. The functional units in the embodiments can be integrated in one processing unit, or each unit can exist physically alone, or two or more units can be integrated in one unit; the integrated unit can be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for the convenience of distinguishing them from each other and are not intended to limit the protection scope of the present application. For the specific working process of the units and modules in the above system, reference can be made to the corresponding processes in the foregoing method embodiments, and details are not repeated here.
In the above embodiments, each embodiment is described with its own emphasis; for parts that are not detailed or described in a certain embodiment, reference can be made to the relevant descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are executed in hardware or in software depends on the specific application and the design constraints of the technical solution. Professionals may use different methods to implement the described functions for each specific application, but such implementations should not be considered to be beyond the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed device/terminal device and method can be implemented in other ways. For example, the device/terminal device embodiments described above are only illustrative; for example, the division of the modules or units is only a logical functional division, and there can be other division manners in actual implementation, e.g. multiple units or components can be combined or integrated into another system, or some features can be ignored or not executed. In addition, the mutual couplings, direct couplings or communication connections shown or discussed can be indirect couplings or communication connections through some interfaces, devices or units, and can be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, i.e. they can be located in one place or distributed over multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention can be integrated in one processing unit, or each unit can exist physically alone, or two or more units can be integrated in one unit. The integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
If the integrated module/unit is implemented in the form of a software functional unit and is sold or used as an independent product, it can be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the above embodiments of the present invention can also be completed by instructing the relevant hardware through a computer program. The computer program can be stored in a computer-readable storage medium, and when executed by a processor, the computer program can implement the steps of each of the above method embodiments. The computer program includes computer program code, which can be in source code form, object code form, an executable file, certain intermediate forms and so on. The computer-readable medium can include any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electric carrier signal, a telecommunication signal, a software distribution medium and so on. It should be noted that the content contained in the computer-readable medium can be appropriately increased or decreased according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer-readable media do not include electric carrier signals and telecommunication signals.
The above embodiments are only used to illustrate the technical solutions of the present invention and are not intended to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments can still be modified, or some of the technical features can be replaced by equivalents; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and should all be included within the protection scope of the present invention.

Claims (11)

1. A deep learning load forecasting method, characterized by comprising:
dividing a forecast interval into multiple first sub-forecast intervals with a first preset traveling time window, and dividing historical load data into multiple first sub-load training data, wherein the length of the historical load data corresponds to the length of the forecast interval;
training the deep learning prediction models of the multiple first sub-forecast intervals with the multiple first sub-load training data, and obtaining the predicted load value of each sub-forecast interval;
determining a final predicted value according to the predicted value of each sub-forecast interval.
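A minimal Python sketch of the partitioning described in claim 1, assuming an hourly load series; split_by_window, the simulated history, the one-day window length, and the window-mean persistence that stands in for the per-interval deep learning models are all illustrative choices, not taken from the patent:
import numpy as np

def split_by_window(series, window_len, step=None):
    # Cut a 1-D load series into sub-series with a traveling time window.
    # step == window_len gives adjacent windows; step < window_len gives overlapping ones.
    step = window_len if step is None else step
    return [series[s:s + window_len]
            for s in range(0, len(series) - window_len + 1, step)]

rng = np.random.default_rng(0)
hours = np.arange(24 * 365)
history = 1000 + 200 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 30, hours.size)

window_len = 24                                            # assumed first preset traveling time window
sub_training_sets = split_by_window(history, window_len)   # first sub-load training data

# One model per sub-forecast interval; a window-mean persistence stands in for each trained
# deep learning model, and the per-interval predictions are concatenated into the final forecast.
sub_predictions = [float(w.mean()) for w in sub_training_sets[-7:]]
final_forecast = np.repeat(sub_predictions, window_len)
print(final_forecast.shape)                                # (168,) -> one week of hourly values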
2. The deep learning load forecasting method according to claim 1, wherein the first preset traveling time window is obtained by a clustering algorithm as follows:
randomly obtaining a group of second preset traveling time windows, and dividing the historical load data into multiple groups of second sub-load training data with the group of second preset traveling time windows, wherein the multiple groups of second sub-load training data serve as the input of the clustering algorithm;
finding the smallest number of clusters according to the evaluation metric of the clustering algorithm;
selecting the minimum number of clusters, and taking the traveling time window of maximum length corresponding to that minimum value as the first preset traveling time window.
3. The deep learning load forecasting method according to claim 2, wherein the mean silhouette coefficient and the gap statistic are used as the evaluation metric of the clustering algorithm.
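A sketch of the window-length selection in claims 2 and 3 using scikit-learn's KMeans and the mean silhouette coefficient (the gap statistic has no stock scikit-learn implementation and is omitted here); the candidate lengths, the simulated series, and the range of cluster counts are assumptions:
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def to_windows(series, length):
    # Reshape the series into one window per row (truncating any remainder).
    n = len(series) // length
    return series[:n * length].reshape(n, length)

def best_cluster_count(X, k_range=range(2, 8)):
    # Number of clusters with the highest mean silhouette coefficient.
    scores = {k: silhouette_score(X, KMeans(n_clusters=k, n_init=10,
                                            random_state=0).fit_predict(X))
              for k in k_range}
    return max(scores, key=scores.get)

rng = np.random.default_rng(1)
t = np.arange(24 * 90)
history = 1000 + 200 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 30, t.size)

candidate_lengths = [6, 12, 24, 48]      # lengths of the second preset traveling time windows
k_by_length = {L: best_cluster_count(to_windows(history, L)) for L in candidate_lengths}

k_min = min(k_by_length.values())        # smallest number of clusters over all candidates
first_window = max(L for L, k in k_by_length.items() if k == k_min)   # longest such window
print(k_by_length, "-> first preset traveling time window:", first_window)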
4. The deep learning load forecasting method according to claim 2, wherein the group of second preset traveling time windows is set as follows:
the initial lengths of the second preset traveling time windows are identical, the historical load data is divided into one group of second sub-load training data with the second preset traveling time window, and after a preset step size ΔT is added to the length of the second preset traveling time window, the historical load data is divided again according to the increased current length of the second preset traveling time window, until multiple groups of second sub-load training data are obtained; or
the initial lengths of the second preset traveling time windows are different, the length of the second preset traveling time window is set according to the rate of change of the historical load data, and after a preset step size ΔT is added to the length of the second preset traveling time window, the historical load data is divided according to the increased current length of the second preset traveling time window, until multiple groups of second sub-load training data are obtained.
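A small sketch of the two ways claim 4 describes for building the group of second preset traveling time windows; the concrete step ΔT, the change-rate threshold, and the window counts are illustrative assumptions, and the change-rate rule is only one possible reading of the second scheme:
import numpy as np

def lengths_identical_start(start=6, delta_t=6, count=6):
    # Scheme 1: every window starts from the same length and grows by delta_t each round.
    return [start + i * delta_t for i in range(count)]

def lengths_from_change_rate(series, delta_t=6, count=6, base=6, threshold=0.02):
    # Scheme 2 (assumed reading): the initial length depends on how fast the load changes;
    # a faster-changing series starts from a shorter window, then grows by delta_t as above.
    change_rate = np.mean(np.abs(np.diff(series))) / np.mean(series)
    start = base if change_rate > threshold else 2 * base
    return [start + i * delta_t for i in range(count)]

print(lengths_identical_start())          # [6, 12, 18, 24, 30, 36]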
5. The deep learning load forecasting method according to any one of claims 2 to 4, wherein the first preset traveling time window moves as follows:
when the current first preset traveling time window performs its next move, if the preset moving step size Δt satisfies Δt = T_win,i, the starting position of the next first preset traveling time window adjoins the end position of the current first preset traveling time window, where T_win,i is the length of the current first preset traveling time window; or
when the current first preset traveling time window performs its next move, if the preset moving step size Δt satisfies Δt < T_win,i, the next first preset traveling time window overlaps the current first preset traveling time window, where T_win,i is the length of the current first preset traveling time window;
and the second preset traveling time window moves as follows:
when the current second preset traveling time window performs its next move, if the preset moving step size Δt satisfies Δt = T_win,i, the starting position of the next second preset traveling time window adjoins the end position of the current second preset traveling time window, where T_win,i is the length of the current second preset traveling time window; or
when the current second preset traveling time window performs its next move, if the preset moving step size Δt satisfies Δt < T_win,i, the next second preset traveling time window overlaps the current second preset traveling time window, where T_win,i is the length of the current second preset traveling time window.
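The two move modes of claim 5 reduce to one stride parameter: with step Δt equal to the window length T_win,i the windows are adjacent, and with a smaller Δt they overlap. The series length and window sizes in the example call are illustrative only:
def window_starts(series_len, window_len, step):
    # Start indices of the traveling time window over a series of series_len samples.
    # step == window_len: the next window begins where the current one ends.
    # step <  window_len: consecutive windows overlap by window_len - step samples.
    return list(range(0, series_len - window_len + 1, step))

print(window_starts(96, 24, 24))   # adjacent:    [0, 24, 48, 72]
print(window_starts(96, 24, 12))   # overlapping: [0, 12, 24, 36, 48, 60, 72]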
6. The deep learning load forecasting method according to claim 5, wherein before the deep learning prediction models of the multiple first sub-forecast intervals are trained with the multiple first sub-load training data and the predicted load value of each sub-forecast interval is obtained, the deep learning load forecasting method further comprises:
preprocessing the multiple first sub-load training data, that is, correcting abnormal data in the multiple first sub-load training data and filling in missing data in the multiple first sub-load training data;
normalizing the preprocessed multiple first sub-load training data;
and the training of the deep learning prediction models of the multiple first sub-forecast intervals with the multiple first sub-load training data and the obtaining of the predicted load value of each sub-forecast interval specifically comprise:
preprocessing the multiple first sub-load training data, that is, correcting abnormal data in the multiple first sub-load training data and filling in missing data in the multiple first sub-load training data;
normalizing the preprocessed multiple first sub-load training data;
training the deep learning prediction models of the multiple first sub-forecast intervals with the normalized multiple first sub-load training data, and obtaining the predicted load value of each sub-forecast interval.
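A sketch of the preprocessing chain in claim 6 using pandas; the z-score rule for abnormal data, linear interpolation for missing samples, and min-max normalization are illustrative choices, since the claim only requires that abnormal data be corrected, missing data filled, and the result normalized:
import pandas as pd

def preprocess(load: pd.Series, z_thresh: float = 3.0) -> pd.Series:
    # Correct abnormal samples, fill missing ones, then scale to [0, 1].
    z = (load - load.mean()) / load.std()
    cleaned = load.mask(z.abs() > z_thresh)                # treat extreme points as missing
    cleaned = cleaned.interpolate(limit_direction="both")  # fill the gaps
    return (cleaned - cleaned.min()) / (cleaned.max() - cleaned.min())

# Toy series with one gap and one spike; a lower threshold makes the spike count as abnormal.
hourly = pd.Series([1000.0, 1020.0, None, 5000.0, 990.0, 1010.0, 1005.0])
print(preprocess(hourly, z_thresh=1.5))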
7. The deep learning load forecasting method according to any one of claims 1 to 4, wherein the deep learning prediction model is a deep LSTM prediction model built with the Keras deep learning framework, and the deep learning prediction model is trained as follows:
obtaining the first sub-load training data of the i-th sub-forecast interval;
searching for the LSTM hyperparameters by grid search;
establishing the deep LSTM prediction model using the first sub-load training data and the LSTM hyperparameters.
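A compact sketch of claim 7: a deep LSTM built with Keras whose hyperparameters are chosen by grid search. The grid, the 24-hour input window, the epoch count, and the synthetic sub-interval data are all assumptions; the patent names neither the exact hyperparameters nor the grid:
import itertools
import numpy as np
from tensorflow import keras

def build_lstm(units, n_layers, lr, steps_in):
    # Stacked LSTM regressor: a (steps_in, 1) load window in, one load value out.
    model = keras.Sequential([keras.layers.Input(shape=(steps_in, 1))])
    for i in range(n_layers):
        model.add(keras.layers.LSTM(units, return_sequences=(i < n_layers - 1)))
    model.add(keras.layers.Dense(1))
    model.compile(optimizer=keras.optimizers.Adam(lr), loss="mse")
    return model

def grid_search(X, y, grid):
    # Exhaustive search over the hyperparameter grid; lowest validation MSE wins.
    best, best_loss = None, np.inf
    for units, n_layers, lr in itertools.product(grid["units"], grid["layers"], grid["lr"]):
        model = build_lstm(units, n_layers, lr, X.shape[1])
        hist = model.fit(X, y, epochs=20, batch_size=32,
                         validation_split=0.2, verbose=0)
        val = min(hist.history["val_loss"])
        if val < best_loss:
            best, best_loss = {"units": units, "layers": n_layers, "lr": lr}, val
    return best

# Synthetic first sub-load training data for one sub-forecast interval: 24 past hours -> next hour.
rng = np.random.default_rng(2)
series = 1000 + 200 * np.sin(2 * np.pi * np.arange(24 * 60) / 24) + rng.normal(0, 30, 24 * 60)
series = (series - series.min()) / (series.max() - series.min())
X = np.array([series[i:i + 24] for i in range(len(series) - 24)])[..., None]
y = series[24:]

print(grid_search(X, y, {"units": [32, 64], "layers": [1, 2], "lr": [1e-3]}))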
8. The deep learning load forecasting method according to any one of claims 1 to 4, wherein the algorithm of the deep learning prediction model is a deep autoencoder neural network.
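Claim 8 allows a deep autoencoder to take the place of the LSTM. One assumed arrangement, not spelled out in the patent, is an encoder-decoder trained to reconstruct the input load window while a small dense head on the bottleneck produces the forecast:
from tensorflow import keras

def deep_autoencoder_forecaster(steps_in):
    # Encoder compresses the load window, decoder reconstructs it, a dense head forecasts.
    inputs = keras.Input(shape=(steps_in,))
    encoded = keras.layers.Dense(64, activation="relu")(inputs)
    encoded = keras.layers.Dense(16, activation="relu")(encoded)
    decoded = keras.layers.Dense(64, activation="relu")(encoded)
    decoded = keras.layers.Dense(steps_in, name="reconstruction")(decoded)
    forecast = keras.layers.Dense(1, name="forecast")(encoded)
    model = keras.Model(inputs, [decoded, forecast])
    model.compile(optimizer="adam",
                  loss={"reconstruction": "mse", "forecast": "mse"})
    return model

model = deep_autoencoder_forecaster(24)
model.summary()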
9. A deep learning load prediction device, characterized by comprising:
a division module, configured to divide a forecast interval into multiple first sub-forecast intervals and to divide historical load data into multiple first sub-load training data;
a prediction module, configured to train the deep learning prediction models of the multiple first sub-forecast intervals with the multiple first sub-load training data and to obtain the predicted load value of each sub-forecast interval;
a determining module, configured to determine a final predicted value according to the predicted load value of each sub-forecast interval.
10. A terminal device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the deep learning load forecasting method according to any one of claims 1 to 8.
11. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the deep learning load forecasting method according to any one of claims 1 to 8.
CN201910527965.9A 2019-06-18 2019-06-18 Deep learning load prediction method and device and terminal equipment Active CN110232483B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910527965.9A CN110232483B (en) 2019-06-18 2019-06-18 Deep learning load prediction method and device and terminal equipment


Publications (2)

Publication Number Publication Date
CN110232483A (en) 2019-09-13
CN110232483B (en) 2021-05-04

Family

ID=67859762

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910527965.9A Active CN110232483B (en) 2019-06-18 2019-06-18 Deep learning load prediction method and device and terminal equipment

Country Status (1)

Country Link
CN (1) CN110232483B (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170206615A1 (en) * 2012-01-23 2017-07-20 Earth Networks, Inc. Optimizing and controlling the energy consumption of a building
CN103606013A (en) * 2013-12-06 2014-02-26 国家电网公司 User annual power consumption prediction method based on support vector machine
CN103901305A (en) * 2014-03-31 2014-07-02 广东电网公司电力科学研究院 Online early warning method for power equipment
CN108009673A (en) * 2017-11-24 2018-05-08 国网北京市电力公司 Novel load Forecasting Methodology and device based on deep learning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SUN Da-shuai, MA Li-xin, WANG Shou-zheng: "The Design of Short-term Load Forecast Systems Based on the Theory of Complex", 2010 International Conference on Intelligent System Design and Engineering Application *
WANG Ying: "Analysis of Power Load Forecasting in Typical Developed and Developing Countries", China Master's Theses Full-text Database, Economics and Management Sciences *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111126565A (en) * 2019-11-28 2020-05-08 广东电网有限责任公司 Method and device for predicting block load density index based on deep learning
CN112862143A (en) * 2019-11-28 2021-05-28 新奥数能科技有限公司 Load and price prediction method
CN110701732A (en) * 2019-12-10 2020-01-17 南昌掘策数据服务有限公司 Energy consumption data analysis method and system and energy saving method and system of central air conditioner
CN111419249A (en) * 2020-03-26 2020-07-17 心图熵动科技(苏州)有限责任公司 Generation method and prediction system of depression prediction model
CN111694830A (en) * 2020-06-12 2020-09-22 复旦大学 Missing data completion method based on deep ensemble learning
CN113761317A (en) * 2020-07-28 2021-12-07 北京沃东天骏信息技术有限公司 Bullet screen based data processing method and device
CN112330010A (en) * 2020-11-03 2021-02-05 长安大学 Power consumer load interval prediction method based on deep learning
CN112734106A (en) * 2021-01-08 2021-04-30 深圳市国电科技通信有限公司 Method and device for predicting energy load
CN115174237A (en) * 2022-07-08 2022-10-11 河北科技大学 Method and device for detecting malicious traffic of Internet of things system and electronic equipment
CN116995673A (en) * 2023-09-26 2023-11-03 宁德时代新能源科技股份有限公司 Power load prediction method, power load prediction model training method and device
CN116995673B (en) * 2023-09-26 2024-02-20 宁德时代新能源科技股份有限公司 Power load prediction method, power load prediction model training method and device

Also Published As

Publication number Publication date
CN110232483B (en) 2021-05-04

Similar Documents

Publication Publication Date Title
CN110232483A (en) Deep learning load forecasting method, device and terminal device
CN108846517B (en) Integration method for predicting quantile probabilistic short-term power load
WO2017076154A1 (en) Method and apparatus for predicting network event and establishing network event prediction model
CN108364106A (en) Expense report risk forecast method, device, terminal device and storage medium
CN111160617B (en) Power daily load prediction method and device
CN110751318A (en) IPSO-LSTM-based ultra-short-term power load prediction method
CN105760952A (en) Load prediction method based on Kalman filtering and self-adaptive fuzzy neural network
CN109102155B (en) Ultra-short-term node marginal electricity price probability prediction method and system
Tian et al. A network traffic prediction method based on IFS algorithm optimised LSSVM
CN116307215A (en) Load prediction method, device, equipment and storage medium of power system
CN102768701A (en) High-voltage switch cabinet insulator electric field optimization method based on quantum genetic algorithm
CN109508826A (en) Schedulable capacity prediction method for electric vehicle clusters based on gradient boosted decision trees
CN109633448A (en) Method, apparatus and terminal device for identifying cell health state
Lee et al. Probabilistic wind power forecasting based on the laplace distribution and golden search
CN111985719A (en) Power load prediction method based on improved long-term and short-term memory network
CN116187835A (en) Data-driven method and system for estimating the theoretical line loss interval of a transformer area
CN113516275A (en) Power distribution network ultra-short term load prediction method and device and terminal equipment
AU2021106200A4 (en) Wind power probability prediction method based on quantile regression
CN115310650A (en) Low-complexity high-precision time sequence multi-step prediction method and system
CN117094535B (en) Artificial intelligence-based energy supply management method and system
Langsari et al. Optimizing COCOMO II parameters using particle swarm method
CN113762591A (en) Short-term electricity consumption prediction method and system based on GRU and multi-kernel SVM adversarial learning
CN111259340B (en) Saturation load prediction method based on logistic regression
CN113033898A (en) Electrical load prediction method and system based on K-means clustering and BI-LSTM neural network
CN116470491A (en) Photovoltaic power probability prediction method and system based on copula function

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant