CN110084406A - Load forecasting method and device based on autoencoder and meta-learning strategy - Google Patents
- Publication number
- CN110084406A (application CN201910267480.0A)
- Authority
- CN
- China
- Prior art keywords
- time series
- algorithm
- self
- load
- feature
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/245—Query processing
- G06F16/2458—Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
- G06F16/2474—Sequence data queries, e.g. querying versioned data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/06—Energy or water supply
Abstract
The present invention provides a load forecasting method and device based on an autoencoder and a meta-learning strategy. The method comprises: receiving a target time for which the load of a target energy system is to be predicted; obtaining historical data of the target energy system prior to the target time, and extracting time-series features from the historical data with an autoencoder; selecting, from multiple algorithms, a load forecasting algorithm matching the time-series features according to a classification model obtained by meta-learning; and outputting the load value of the target energy system at the target time using the selected load forecasting algorithm. The invention solves the prior-art problem that predicting an energy load with load forecasting algorithms requires trying many of them, which is time-consuming and yields low accuracy.
Description
Technical field
The present invention relates to the field of energy technology, and in particular to a load forecasting method and device based on an autoencoder and a meta-learning strategy.
Background technique
In the prior art, forecasting energy demand in advance helps guarantee actual supply to users while reducing energy waste. However, whether a linear or a nonlinear model is used, no single model can fully adapt to the irregularity of the underlying data; moreover, the weights and thresholds of the various neural networks are not necessarily trained to optimal values. As a result, a single load forecasting algorithm has limited forecasting accuracy and large prediction deviation, which is unfavorable for later scheduling optimization. No effective solution to these problems in the prior art has yet been found.
Summary of the invention
Embodiments of the invention provide a load forecasting method and device based on an autoencoder and a meta-learning strategy.
According to one embodiment of the invention, a load forecasting method based on an autoencoder and a meta-learning strategy is provided, comprising: receiving a target time for which the load of a target energy system is to be predicted; obtaining historical data of the target energy system prior to the target time, and extracting time-series features from the historical data with an autoencoder; selecting, according to a classification model, a load forecasting algorithm matching the time-series features from multiple algorithm models; and outputting the load value of the target energy system at the target time using the selected load forecasting algorithm.
Optionally, before selecting the load forecasting algorithm matching the time-series features from the multiple algorithm models according to the classification model, the method further comprises: determining an algorithm pool containing multiple time-series forecasting algorithm models, and a meta-learning training set containing multiple time series; extracting the time-series features of each time series in the meta-learning training set with the autoencoder; traversing the algorithm models in the pool to obtain the optimal algorithm for each time series in the training set, and taking each series' optimal algorithm as its label information; and training the classification model with each series' time-series features as input and the corresponding label information as output.
Optionally, extracting the time-series features of each time series in the meta-learning training set with the autoencoder comprises: for each time series, obtaining the coding vectors of the K days of historical data closest to the sample date, where the sample date is that series' date to be predicted and K is a preset value; and building the K coding vectors into one feature vector, which is taken as the time-series features.
Optionally, building the K coding vectors into one feature vector comprises: determining the start position and end position of each of the K coding vectors; and concatenating the K coding vectors in sample-date order, joining the end position of each coding vector to the start position of the next, to form a single feature vector.
Optionally, extracting the time-series features of the historical data with the autoencoder comprises one of the following: feeding the historical data into the autoencoder and outputting the time-series features of the historical data; or feeding the K days of data closest to the current time into the autoencoder to obtain K coding vectors, and combining the K coding vectors into the time-series features, where K is the preset dimension of the time-series features.
Optionally, selecting the load forecasting algorithm matching the time-series features from the multiple algorithm models according to the classification model comprises: computing, according to the classification model, the matching degree of each algorithm model with the time-series features; selecting the one or more algorithm models with the highest matching degree as load forecasting algorithms; and assigning each selected load forecasting algorithm a weight according to its matching degree.
According to another embodiment of the invention, a load forecasting device based on an autoencoder and a meta-learning strategy is provided, comprising: a receiving module for receiving a target time for which the load of a target energy system is to be predicted; an obtaining module for obtaining historical data of the target energy system prior to the target time and extracting the time-series features of the historical data with an autoencoder; a selecting module for selecting, according to a classification model, a load forecasting algorithm matching the time-series features from multiple algorithm models; and an output module for outputting the load value of the target energy system at the target time using the load forecasting algorithm.
Optionally, the device further comprises: a determining module for determining, before the selecting module selects the load forecasting algorithm matching the time-series features, an algorithm pool containing multiple time-series forecasting algorithm models and a meta-learning training set containing multiple time series; an extraction module for extracting the time-series features of each time series in the meta-learning training set with the autoencoder; a processing module for traversing the algorithm models in the pool, obtaining the optimal algorithm for each time series in the training set, and taking each series' optimal algorithm as its label information; and a training module for training the classification model with each series' time-series features as input and the corresponding label information as output.
Optionally, the extraction module comprises: an acquiring unit for obtaining, for each time series, the coding vectors of the K days of historical data closest to the sample date, where the sample date is that series' date to be predicted and K is a preset value; and a construction unit for building the K coding vectors into one feature vector and taking the feature vector as the time-series features.
Optionally, the construction unit comprises: a determining subunit for determining the start position and end position of each of the K coding vectors; and a constructing subunit for concatenating the K coding vectors in sample-date order, joining the end position of each coding vector to the start position of the next, into a single feature vector.
Optionally, the obtaining module comprises one of the following: an output unit for feeding the historical data into the autoencoder and outputting the time-series features of the historical data; or a combining unit for feeding the K days of data closest to the current time into the autoencoder to obtain K coding vectors and combining the K coding vectors into the time-series features, where K is the preset dimension of the time-series features.
Optionally, the selecting module comprises: a computing unit for computing, according to the classification model, the matching degree of each algorithm model with the time-series features; and a selecting unit for selecting the one or more algorithm models with the highest matching degree as load forecasting algorithms and assigning each load forecasting algorithm a weight according to its matching degree.
According to still another embodiment of the invention, a storage medium is provided in which a computer program is stored, the computer program being arranged to perform, when run, the steps of any of the above method embodiments. According to still another embodiment, an electronic device is provided, comprising a memory and a processor, the memory storing a computer program and the processor being arranged to run the computer program so as to perform the steps of any of the above method embodiments.
With the invention, the time-series features of the historical data are extracted with an autoencoder, a load forecasting algorithm matching the time-series features is selected from multiple algorithm models according to a classification model, and the load value at the target time is then predicted. By combining the autoencoder with a meta-learning strategy, the prediction model that best matches the historical data can be chosen. This solves the prior-art problem that predicting an energy load requires trying many forecasting algorithms, which is time-consuming and yields low accuracy; prediction deviation is reduced and the accuracy of energy-load prediction is improved.
Brief description of the drawings
The drawings described here provide a further understanding of the invention and form part of this application; the illustrative embodiments of the invention and their descriptions explain the invention and do not improperly limit it. In the drawings:
Fig. 1 is a hardware structure block diagram of a load forecasting network terminal based on an autoencoder and a meta-learning strategy according to an embodiment of the invention;
Fig. 2 is a flowchart of the load forecasting method based on an autoencoder and a meta-learning strategy according to an embodiment of the invention;
Fig. 3 is the network structure of the autoencoder of an embodiment of the invention;
Fig. 4 is a structural block diagram of the load forecasting device based on an autoencoder and a meta-learning strategy according to an embodiment of the invention.
Specific embodiment
To enable those skilled in the art to better understand the solution of the invention, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. The described embodiments are obviously only a part, not all, of the embodiments of the invention; all other embodiments obtained by those of ordinary skill in the art on this basis without creative work fall within the scope of protection of the invention.
It should be noted that the terms "first", "second", etc. in the specification, claims and drawings are used to distinguish similar objects, not to describe a particular order or precedence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments described here can be implemented in orders other than those illustrated or described. Moreover, the terms "comprise" and "have" and any variants thereof are intended to cover non-exclusive inclusion: a process, method, system, product or device containing a series of steps or units is not necessarily limited to the steps or units explicitly listed, but may include other steps or units not explicitly listed or inherent to that process, method, product or device.
Embodiment 1
The method embodiments provided in this application may be executed in a server, a network terminal, a computer terminal or a similar computing device. Taking execution on a network terminal as an example, Fig. 1 is a hardware structure block diagram of a load forecasting network terminal based on an autoencoder and a meta-learning strategy according to an embodiment of the invention. As shown in Fig. 1, the network terminal 10 may include one or more processors 102 (only one is shown in Fig. 1; the processor 102 may include, but is not limited to, a microcontroller MCU, a programmable logic device such as an FPGA, or another processing unit) and a memory 104 for storing data; optionally, the network terminal may further include a transmission device 106 for communication functions and an input/output device 108. Those of ordinary skill in the art will understand that the structure shown in Fig. 1 is only illustrative and does not limit the structure of the network terminal: for example, the network terminal 10 may include more or fewer components than shown in Fig. 1, or a configuration different from that shown in Fig. 1.
The memory 104 may be used to store computer programs, for example the software programs and modules of application software, such as the computer program corresponding to the load forecasting method based on an autoencoder and a meta-learning strategy in the embodiment of the invention. The processor 102 performs various function applications and data processing, i.e. implements the above method, by running the computer programs stored in the memory 104. The memory 104 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102; such remote memory may be connected to the network terminal 10 over a network. Examples of such networks include, but are not limited to, the internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The transmission device 106 is used to receive or send data via a network. Specific examples of such a network may include a wireless network provided by the communication provider of the network terminal 10. In one example, the transmission device 106 includes a network interface controller (NIC), which can be connected to other network devices through a base station so as to communicate with the internet. In one example, the transmission device 106 may be a radio frequency (RF) module used to communicate with the internet wirelessly.
This embodiment provides a load forecasting method based on an autoencoder and a meta-learning strategy. Fig. 2 is a flowchart of the method according to an embodiment of the invention; as shown in Fig. 2, the flow includes the following steps:
Step S202: receive a target time for which the load of a target energy system is to be predicted;
Step S204: obtain historical data of the target energy system prior to the target time, and extract the time-series features of the historical data with an autoencoder;
The target time of this embodiment may be a time point or a time period. For example, in units of days, if the target time is 3-25 and the current time is 3-21, the historical data is the segment of data up to 3-21;
Step S206: select, according to a classification model, a load forecasting algorithm matching the time-series features from multiple algorithm models;
Step S208: output the load value of the target energy system at the target time using the load forecasting algorithm.
Through the above steps, the time-series features of the historical data are extracted with an autoencoder, a load forecasting algorithm matching those features is selected from multiple algorithm models according to a classification model, and the load value at the target time is then predicted. By combining the autoencoder with a meta-learning strategy, the prediction model best matching the historical data can be chosen, which solves the prior-art problem that predicting an energy load requires trying many forecasting algorithms with high time cost and low accuracy, reduces prediction deviation, and improves the accuracy of energy-load prediction.
Optionally, the executing subject of the above steps may be a data processing device such as a server or a terminal, and specifically a processor, an algorithm module, etc., but is not limited thereto.
The application scenario of this embodiment covers artificial-intelligence scenarios such as energy forecasting and traffic forecasting; the energy may specifically be heat, or consumable resources such as cooling, water, electricity or gas.
In this embodiment, the multiple algorithm models include ARIMA (autoregressive integrated moving average), ARMA (autoregressive moving average), neural networks, regression trees, and the like. Before selecting the load forecasting algorithm matching the time-series features from the multiple algorithm models according to the classification model, the classification model must also be provided, either obtained from a third party (e.g. purchased from a supplier) or trained on sample data.
The load prediction of this embodiment is essentially a time-series forecasting problem. Suppose N load forecasting algorithms (e.g. ARIMA, ARMA, neural network, regression tree) {f1, f2, ..., fN} are available, and there are M time series {t1, t2, ..., tM} (each containing data for more than 10 time cycles). Then for each time series t, by traversing the N algorithms, one algorithm f can be found whose validated error on that series is the smallest. Thus for each time series the algorithm that performs best on that series can be found, and its name can be taken as the label matching that series.
In addition, this embodiment introduces the autoencoder. For each time series, some features of the series can be extracted, such as its kurtosis, skewness, length, range, variance and mean. Assuming an equal number of features is extracted from every series, the M series of unequal lengths are converted into M meta-features of equal length. In summary, each time series corresponds to one meta-feature vector (the number of meta-features is the same for all series) and one label. With such samples, a machine-learning method can be used to establish the relationship between the meta-features and the labels (the names of forecasting algorithms), denoted F, i.e. the classification model. When a new time-series forecasting problem arrives, its time-series features are extracted as meta-features, F predicts from these meta-features the most suitable algorithm for the series, and that algorithm is then used to forecast the series.
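The statistics listed above can be sketched as a fixed-length meta-feature vector. This is a minimal illustration with hand-crafted statistics only; the patent ultimately uses autoencoder codings as meta-features, and the function name is hypothetical:

```python
import numpy as np

def meta_features(series):
    """Fixed-length meta-feature vector for a variable-length series:
    kurtosis, skewness, length, range, variance, mean."""
    t = np.asarray(series, dtype=float)
    m, s = t.mean(), t.std()
    z = (t - m) / s
    return np.array([(z ** 4).mean(),     # kurtosis (non-excess)
                     (z ** 3).mean(),     # skewness
                     float(len(t)),       # length
                     t.max() - t.min(),   # range
                     t.var(),             # variance
                     m])                  # mean

# Two series of different lengths map to vectors of the same dimension,
# which is what lets a single classifier consume them.
short = meta_features(np.sin(np.linspace(0.0, 4 * np.pi, 100)))
long_ = meta_features(np.sin(np.linspace(0.0, 4 * np.pi, 500)))
assert short.shape == long_.shape == (6,)
```

Equal-length meta-feature vectors are the precondition for training the classification model F on series of unequal length.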
Before selecting the load forecasting algorithm matching the time-series features from the multiple algorithm models according to the classification model, a classification model is trained, using the meta-learning strategy, on the basis of a time-series training set and a pool of load forecasting algorithms. In an optional implementation of this embodiment, the time-series training set serves as the sample data for training the classification model, as follows:
S11: determine an algorithm pool containing multiple time-series forecasting algorithm models, and a meta-learning training set containing multiple time series.
This embodiment is described taking whole-day load prediction as an example. For instance, if the load data granularity is hourly, each group of data for this model contains 24 points and the prediction target is the next day's 24 values. Suppose there are M time series in total as the analysis sample (training set), denoted {t1, t2, ..., tM}. Each time series is long enough, e.g. each has more than 10 days of data. The sample load data of this embodiment are fully cleaned and contain no missing values. A set of load forecasting algorithms has been preset, denoted {f1, f2, ..., fN}. The model assumes the minimum time unit of all load data is the same, e.g. all hourly data.
S12: extract the time-series features of each time series in the meta-learning training set with the autoencoder.
Optionally, this comprises: for each time series, obtaining the coding vectors of the K days of historical data closest to the sample date, where the sample date is that series' date to be predicted and K is a preset value; and building the K coding vectors into one feature vector, which is taken as the time-series features.
In a specific implementation, building the K coding vectors into one feature vector comprises: determining the start position and end position of each of the K coding vectors; and concatenating the K coding vectors in sample-date order, joining the end position of each coding vector to the start position of the next, to form a single feature vector.
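The concatenation just described can be sketched in a few lines; K, the per-day coding dimension d, and the random codes are assumed values for illustration:

```python
import numpy as np

K, d = 5, 8                                      # K days, d-dimensional coding vector per day
rng = np.random.default_rng(0)
codes = [rng.normal(size=d) for _ in range(K)]   # coding vectors, oldest day first

# The end of each day's vector joins the start of the next day's vector.
feature = np.concatenate(codes)

assert feature.shape == (K * d,)
assert np.array_equal(feature[:d], codes[0])     # first day opens the feature vector
assert np.array_equal(feature[-d:], codes[-1])   # last day closes it
```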
S13: traverse the algorithm models in the pool, obtain the optimal algorithm for each time series in the meta-learning training set, and take each series' optimal algorithm as its label information. The time series are fed into each algorithm model in turn, the error of each model on each series is computed from the model outputs, and the model with the smallest error is taken as the load forecasting algorithm matching that sample.
S14: train the classification model with each series' time-series features as input and the corresponding label information as output. For example, with 10 groups of sample data and three algorithm models A, B and C: if groups 1-5 match model A, groups 6-8 match model B and groups 9-10 match model C, then the time-series features of groups 1-5 are the input label data and the identifier (name) of model A is the corresponding output label data; the other groups (6-10) are handled similarly, and training yields the classification model.
Optionally, extracting the time-series features of the historical data with the autoencoder comprises one of the following: feeding the historical data into the autoencoder and outputting the time-series features of the historical data; or splitting the K days of data closest to the current time into K groups (one day of data per group), feeding them into the autoencoder to obtain K coding vectors, and combining the K coding vectors into the time-series features, where K is the preset dimension of the time-series features.
The autoencoder (Autoencoders) of this embodiment is a neural network with a special structure. Fig. 3 shows the network structure of the autoencoder of an embodiment of the present invention (a stacked autoencoder is illustrated; a variational autoencoder or similar variant could equally be applied to this model). It comprises an input layer (Inputs), an output layer (Outputs) and hidden layers (Hidden), with the following properties: (a) the objective of the autoencoder is to make the output vector as close to the input vector as possible; (b) the network is symmetric about its middle layer (called the coding layer), and in practice the weights of the network are symmetric about this layer; (c) the central coding layer has a special meaning: it represents the features extracted from the input. A common use of autoencoders is precisely to extract features of the input, so it can be used here to extract the features of the time series (i.e. the meta-feature of the time series).
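To make the feature-extraction step concrete, the following is a minimal sketch of such an autoencoder in Python with NumPy: one tanh coding layer trained so that the output reproduces the input, with the coding-layer activations serving as the extracted feature. The layer sizes, learning rate, and epoch count are illustrative assumptions, not values from the patent.

```python
import numpy as np

def train_autoencoder(X, code_dim, epochs=500, lr=0.1, seed=0):
    """Train a one-hidden-layer autoencoder on the day-vectors in X
    (shape: n_days x points_per_day) and return an encoder function.
    The coding layer's activations are the extracted feature."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0.0, 0.1, (d, code_dim))   # encoder weights
    W2 = rng.normal(0.0, 0.1, (code_dim, d))   # decoder weights
    for _ in range(epochs):
        H = np.tanh(X @ W1)                    # coding layer
        Y = H @ W2                             # reconstruction
        err = Y - X                            # objective: output == input
        W2 -= lr * (H.T @ err) / n
        W1 -= lr * (X.T @ (err @ W2.T * (1.0 - H ** 2))) / n
    return lambda x: np.tanh(np.atleast_2d(x) @ W1)[0]

days = np.random.default_rng(1).random((30, 24))   # 30 days of hourly loads
encode = train_autoencoder(days, code_dim=5)
print(encode(days[0]).shape)                       # (5,)
```

A stacked or variational autoencoder, as the text allows, would add further symmetric layers around this coding layer; property (b) above then applies to the full weight stack.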
Optionally, selecting, from the multiple algorithm models according to the classification model, the load forecasting algorithm matching the time-series feature includes: computing, according to the classification model, the matching degree of each algorithm model with the time-series feature; and selecting the one or more algorithm models with the highest matching degree as the load forecasting algorithm, assigning each load forecasting algorithm a weight according to its matching degree. For example, the three load forecasting algorithms with the highest matching degree are selected with weights 0.5, 0.3 and 0.2 respectively; after the three algorithms produce their load values, each value is multiplied by its weight and the results are summed to obtain the final load value.
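The weighted combination described above can be sketched in a few lines of Python; the 0.5/0.3/0.2 weights are the example from the text, while the load values are made up for illustration:

```python
def combine_forecasts(load_values, weights):
    """Combine the load values of the top-k matching algorithms,
    weighting each by its (normalized) matching degree."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(w * v for w, v in zip(weights, load_values))

# Suppose the three best-matching algorithms predicted 100, 110 and 90 kW:
final = combine_forecasts([100.0, 110.0, 90.0], [0.5, 0.3, 0.2])
print(round(final, 6))  # 101.0
```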
The scheme of this embodiment is described in detail below with a specific example, comprising:
Model training steps:
(1) Cut the load data by day:
The original sample load data is one-dimensional; it is cut into multiple groups using one day as the standard length, so that each group contains one day of data. If the data is at hourly granularity, each day is cut into a 24-point vector; if at 15-minute granularity, into a 96-point vector; other time granularities are handled similarly.
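Step (1) amounts to a simple reshape; a minimal Python sketch (granularities other than hourly only change `points_per_day`):

```python
def cut_daily(load, points_per_day):
    """Cut a one-dimensional load series into one vector per day,
    dropping any incomplete trailing day."""
    n_days = len(load) // points_per_day
    return [load[i * points_per_day:(i + 1) * points_per_day]
            for i in range(n_days)]

hourly_week = list(range(7 * 24))   # 7 days of hourly readings
days = cut_daily(hourly_week, 24)   # hourly data -> 24-point vectors
print(len(days), len(days[0]))      # 7 24
# 15-minute data would use cut_daily(load, 96) instead.
```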
(2) Label each time series:
For each time series, traverse the algorithms in the set {f1, f2, ..., fN} and take the name of the algorithm with the smallest error as the label of that time series. The label set is denoted {l1, l2, ..., lM}. All time series are thereby divided into N classes.
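Step (2) can be sketched as follows; the two toy forecasters and the one-point holdout error measure are illustrative assumptions standing in for the real algorithm pool {f1, ..., fN} and error metric:

```python
def pick_best_algorithm(series, algorithms, holdout=1):
    """Return the name of the algorithm whose holdout forecast error is
    smallest on this series -- used as the series' meta-learning label."""
    train, test = series[:-holdout], series[-holdout:]
    def error(forecast):
        preds = forecast(train, holdout)
        return sum(abs(p - t) for p, t in zip(preds, test)) / holdout
    return min(algorithms, key=lambda name: error(algorithms[name]))

# Two toy forecasters standing in for the algorithm pool:
pool = {
    "naive": lambda tr, h: [tr[-1]] * h,             # repeat last value
    "mean":  lambda tr, h: [sum(tr) / len(tr)] * h,  # repeat the mean
}
print(pick_best_algorithm([1, 2, 3, 4, 5], pool))    # naive (wins on a trend)
```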
(3) For each load data sequence, extract its feature with an autoencoder, comprising:
(3.1) The number of neurons in the input layer and output layer of the autoencoder must equal the dimension of the input vector, and the number of neurons in the coding layer must equal the given dimension D. The number of other layers and the number of neurons each layer contains can be treated as hyperparameters, with the optimal number of layers and neurons chosen by cross-validation (or, to reduce computation, set to reasonable fixed values). The number of layers should not be too high; for example, the whole network should not exceed ten layers.
(3.2) The autoencoders corresponding to all load data sequences are set to the same number of layers and the same number of neurons per layer.
(3.3) For each load data sequence (already cut into days), train an autoencoder, then obtain the codings of the K days closest to the current date (e.g. K = 10), and join these K coding vectors end to end into one vector. This vector is the meta-feature; the meta-features of all load data are denoted {feature1, feature2, ..., featureM}.
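The end-to-end joining of the K coding vectors in step (3.3) is a plain concatenation in date order, e.g.:

```python
K = 3                                              # the text's example uses K = 10
codes = [[0.0, 0.5], [1.0, 1.5], [2.0, 2.5]]       # one coding vector per recent day
meta_feature = [x for code in codes for x in code] # joined end to end, oldest first
print(meta_feature)  # [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
```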
(4) With {feature1, feature2, ..., featureM} obtained above as input features and {l1, l2, ..., lM} as output labels, learn a classification model F with a machine learning algorithm such as a decision tree or an SVM.
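The text suggests a decision tree or SVM for F (e.g. scikit-learn's `DecisionTreeClassifier` or `SVC`). As a dependency-free illustration of the same fit/predict interface, here is a nearest-centroid stand-in; the meta-feature vectors and labels below are made up:

```python
class NearestCentroidMeta:
    """Stand-in for the classification model F: maps a meta-feature
    vector to the label (name) of the best-suited algorithm."""
    def fit(self, features, labels):
        self.centroids = {}
        for lab in set(labels):
            rows = [f for f, l in zip(features, labels) if l == lab]
            self.centroids[lab] = [sum(c) / len(rows) for c in zip(*rows)]
        return self

    def predict(self, feature):
        dist = lambda c: sum((a - b) ** 2 for a, b in zip(feature, c))
        return min(self.centroids, key=lambda lab: dist(self.centroids[lab]))

F = NearestCentroidMeta().fit(
    [[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [0.9, 1.1]],  # meta-features
    ["A", "A", "B", "B"])                              # best algorithm per series
print(F.predict([0.05, 0.02]))  # A
```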
(5) Store the classification model F; store the number of layers and neurons per layer of the autoencoder; store the value K.
Model application steps:
(1) When a new load data sequence needs to be predicted (e.g. data of more than 10 days), first cut it into groups of length one day.
(2) Using the autoencoder structure saved in training step (5), train an autoencoder (a previously trained autoencoder may also be reused). Then select the codings of the K days (K was stored) closest to the current date and join them end to end into one vector. Input this vector into model F to obtain the optimal prediction algorithm f suited to this data.
(3) Predict the future value of the load with f.
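Putting application steps (1)-(3) together, under the same illustrative stand-ins used above (in the real system the encoder, classifier F, and algorithm pool are the artifacts stored at training time):

```python
def predict_load(new_load, points_per_day, K, encoder, F, pool, horizon):
    """End-to-end application step: cut into days, encode the K most
    recent days, classify to choose an algorithm, then forecast with it."""
    n_days = len(new_load) // points_per_day
    days = [new_load[i * points_per_day:(i + 1) * points_per_day]
            for i in range(n_days)]
    feature = [x for day in days[-K:] for x in encoder(day)]
    best = F.predict(feature)             # name of the best-suited algorithm
    return pool[best](new_load, horizon)

# Toy stand-ins for the stored artifacts:
encoder = lambda day: [sum(day) / len(day)]    # 1-dim "code" per day
class StubF:                                    # classifier that always says "naive"
    def predict(self, feature): return "naive"
pool = {"naive": lambda series, h: [series[-1]] * h}
print(predict_load(list(range(48)), 24, 2, encoder, StubF(), pool, 1))  # [47]
```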
It should be noted that, for simplicity of description, the foregoing method embodiments are expressed as a series of action combinations. However, those skilled in the art should understand that the present invention is not limited by the described order of actions, since according to the present invention some steps may be performed in other orders or simultaneously. Furthermore, those skilled in the art should also know that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.
Through the above description of the embodiments, those skilled in the art can clearly understand that the method of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, or by hardware, though in many cases the former is the better implementation. Based on this understanding, the essence of the technical solution of the present invention, or the part contributing to the prior art, can be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc), including several instructions that cause a terminal device (which may be a mobile phone, a computer, a server, a network device, etc.) to execute the method of each embodiment of the present invention.
Embodiment 2
This embodiment further provides a load prediction device based on an autoencoder and a meta-learning strategy. The device is used to implement the above embodiments and preferred implementations; descriptions already given are not repeated. As used below, the term "module" may implement a combination of software and/or hardware with a predetermined function. Although the device described in the following embodiment is preferably implemented in software, implementation in hardware or a combination of software and hardware is also possible and contemplated.
Fig. 4 is a structural block diagram of a load prediction device based on an autoencoder and a meta-learning strategy according to an embodiment of the present invention. As shown in Fig. 4, the device includes:
a receiving module 40, configured to receive a target time to be predicted for a target energy system;
an obtaining module 42, configured to obtain historical data of the target energy system before the target time, and to extract the time-series feature of the historical data with an autoencoder;
a selecting module 44, configured to select, from multiple algorithm models according to a classification model, a load forecasting algorithm matching the time-series feature;
an output module 46, configured to output, using the load forecasting algorithm, the load value of the target energy system at the target time.
Optionally, the device further includes: a determining module, configured to determine, before the selecting module selects the load forecasting algorithm matching the time-series feature from the multiple algorithm models according to the classification model, an algorithm pool including multiple time-series forecasting algorithm models and a meta-learning training set including multiple time-series sets; an extraction module, configured to extract, with an autoencoder, the time-series feature of each time series in the meta-learning training set; a processing module, configured to traverse the multiple algorithm models in the algorithm pool, obtain the optimal algorithm of each time series in the meta-learning training set, and take the optimal algorithm of each time series as its label information; and a training module, configured to take each time-series feature as input and the corresponding label information as output, and obtain the classification model by training.
Optionally, the extraction module includes: an acquiring unit, configured to obtain, for each time series, the coding vectors of the K days of historical data closest to a sample date, where the sample date is the date to be predicted of the time series and K is a preset value; and a construction unit, configured to build the K coding vectors into one feature vector and determine the feature vector as the time-series feature.
Optionally, the construction unit includes: a determining subunit, configured to determine the start position and end position of each of the K coding vectors; and a construction subunit, configured to connect the K coding vectors in the chronological order of the sample dates, joining the end position of each coding vector to the start position of the next, to build one feature vector.
Optionally, the obtaining module includes one of the following: an output unit, configured to input the historical data into the autoencoder and output the time-series feature of the historical data; and a combining unit, configured to input the data of the K days closest to the current time in the historical data into the autoencoder, obtain K coding vectors, and combine the K coding vectors into the time-series feature, where K is the preset dimension of the time-series feature.
Optionally, the selecting module includes: a computing unit, configured to compute, according to the classification model, the matching degree of each of the multiple algorithm models with the time-series feature; and a selecting unit, configured to select the one or more algorithm models with the highest matching degree as the load forecasting algorithm and assign each load forecasting algorithm a weight according to its matching degree.
It should be noted that each of the above modules can be implemented by software or hardware. In the latter case, this can be achieved in, but is not limited to, the following ways: the above modules are all located in the same processor, or the above modules are located in different processors in any combination.
Embodiment 3
The embodiments of the present invention further provide a storage medium in which a computer program is stored, wherein the computer program is arranged to execute, when run, the steps of any of the above method embodiments.
Optionally, in this embodiment, the above storage medium may be arranged to store a computer program for executing the following steps:
S1: receive a target time to be predicted for a target energy system;
S2: obtain historical data of the target energy system before the target time, and extract the time-series feature of the historical data with an autoencoder;
S3: select, from multiple algorithm models according to a classification model, a load forecasting algorithm matching the time-series feature;
S4: output, using the load forecasting algorithm, the load value of the target energy system at the target time.
Optionally, in this embodiment, the above storage medium may include, but is not limited to: a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disc, or any other medium that can store a computer program.
The embodiments of the present invention further provide an electronic device including a memory and a processor. A computer program is stored in the memory, and the processor is arranged to run the computer program to execute the steps of any of the above method embodiments.
Optionally, the above electronic device may further include a transmission device and an input/output device, both connected to the above processor.
Optionally, in this embodiment, the above processor may be arranged to execute the following steps through the computer program:
S1: receive a target time to be predicted for a target energy system;
S2: obtain historical data of the target energy system before the target time, and extract the time-series feature of the historical data with an autoencoder;
S3: select, from multiple algorithm models according to a classification model, a load forecasting algorithm matching the time-series feature;
S4: output, using the load forecasting algorithm, the load value of the target energy system at the target time.
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the above embodiments and optional implementations; they are not repeated here.
Obviously, those skilled in the art should understand that each of the above modules or steps of the present invention can be implemented by a general-purpose computing device. They can be concentrated on a single computing device or distributed over a network formed by multiple computing devices. Optionally, they can be implemented by program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device; in some cases, the steps shown or described can be executed in an order different from that herein, or they can be made into individual integrated circuit modules, or multiple of the modules or steps can be made into a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above are only preferred embodiments of the present invention and are not intended to limit the present invention. For those skilled in the art, the present invention may have various modifications and variations. Any modification, equivalent replacement, improvement, etc. made within the principle of the present invention shall be included within the protection scope of the present invention.
Claims (10)
1. A load forecasting method based on an autoencoder and a meta-learning strategy, characterized by comprising:
receiving a target time to be predicted for a target energy system;
obtaining historical data of the target energy system before the target time, and extracting the time-series feature of the historical data with an autoencoder;
selecting, from multiple algorithm models according to a classification model, a load forecasting algorithm matching the time-series feature; and
outputting, using the load forecasting algorithm, the load value of the target energy system at the target time.
2. The method according to claim 1, characterized in that before selecting, from the multiple algorithm models according to the classification model, the load forecasting algorithm matching the time-series feature, the method further comprises:
determining an algorithm pool including multiple time-series forecasting algorithm models, and a meta-learning training set including multiple time-series sets;
extracting, with an autoencoder, the time-series feature of each time series in the meta-learning training set;
traversing the multiple algorithm models in the algorithm pool, obtaining the optimal algorithm of each time series in the meta-learning training set, and taking the optimal algorithm of each time series as label information; and
taking each time-series feature as input and the corresponding label information as output, and obtaining the classification model by training.
3. The method according to claim 2, characterized in that extracting, with the autoencoder, the time-series feature of each time series in the meta-learning training set comprises:
obtaining, for each time series, the coding vectors of the K days of historical data closest to a sample date, wherein the sample date is the date to be predicted of the time series, and K is a preset value; and
building the K coding vectors into one feature vector, and determining the feature vector as the time-series feature.
4. The method according to claim 3, characterized in that building the K coding vectors into one feature vector comprises:
determining the start position and end position of each of the K coding vectors; and
connecting the K coding vectors in the chronological order of the sample dates, joining the end position of each coding vector to the start position of the next, to build one feature vector.
5. The method according to claim 1, characterized in that extracting the time-series feature of the historical data with the autoencoder comprises one of the following:
inputting the historical data into the autoencoder, and outputting the time-series feature of the historical data; and
inputting the data of the K days closest to the current time in the historical data into the autoencoder, obtaining K coding vectors, and combining the K coding vectors into the time-series feature, wherein K is the preset dimension of the time-series feature.
6. The method according to claim 1, characterized in that selecting, from the multiple algorithm models according to the classification model, the load forecasting algorithm matching the time-series feature comprises:
computing, according to the classification model, the matching degree of each of the multiple algorithm models with the time-series feature; and
selecting the one or more algorithm models with the highest matching degree as the load forecasting algorithm, and assigning each load forecasting algorithm a weight according to its matching degree.
7. A load prediction device based on an autoencoder and a meta-learning strategy, characterized by comprising:
a receiving module, configured to receive a target time to be predicted for a target energy system;
an obtaining module, configured to obtain historical data of the target energy system before the target time, and to extract the time-series feature of the historical data with an autoencoder;
a selecting module, configured to select, from multiple algorithm models according to a classification model, a load forecasting algorithm matching the time-series feature; and
an output module, configured to output, using the load forecasting algorithm, the load value of the target energy system at the target time.
8. The device according to claim 7, characterized in that the device further comprises:
a determining module, configured to determine, before the selecting module selects the load forecasting algorithm matching the time-series feature from the multiple algorithm models according to the classification model, an algorithm pool including multiple algorithm models and a meta-learning training set including multiple time-series sets;
an extraction module, configured to extract, with an autoencoder, the time-series feature of each time series in the meta-learning training set;
a processing module, configured to traverse the multiple algorithm models in the algorithm pool, obtain the optimal algorithm of each time series in the meta-learning training set, and take the optimal algorithm of each time series as its label information; and
a training module, configured to take each time-series feature as input and the corresponding label information as output, and obtain the classification model by training.
9. A storage medium, characterized in that a computer program is stored in the storage medium, wherein the computer program is arranged to execute, when run, the method according to any one of claims 1 to 6.
10. An electronic device, comprising a memory and a processor, characterized in that a computer program is stored in the memory, and the processor is arranged to run the computer program to execute the method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910267480.0A CN110084406B (en) | 2019-04-03 | 2019-04-03 | Load prediction method and device based on self-encoder and meta-learning strategy |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110084406A true CN110084406A (en) | 2019-08-02 |
CN110084406B CN110084406B (en) | 2021-09-24 |
Family
ID=67414161
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910267480.0A Active CN110084406B (en) | 2019-04-03 | 2019-04-03 | Load prediction method and device based on self-encoder and meta-learning strategy |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110084406B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102779223A (en) * | 2011-05-13 | 2012-11-14 | 富士通株式会社 | Method and device for forecasting short-term power load |
US20130204820A1 (en) * | 2012-02-08 | 2013-08-08 | Qualcomm Incorporated | Methods and apparatus for spiking neural computation |
CN103699947A (en) * | 2014-01-16 | 2014-04-02 | 湖南大学 | Meta learning-based combined prediction method for time-varying nonlinear load of electrical power system |
CN107590567A (en) * | 2017-09-13 | 2018-01-16 | 南京航空航天大学 | A kind of Recognition with Recurrent Neural Network short-term load forecasting method based on comentropy cluster and notice mechanism |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110909826A (en) * | 2019-12-10 | 2020-03-24 | 新奥数能科技有限公司 | Diagnosis monitoring method and device for energy equipment and electronic equipment |
CN111859054A (en) * | 2020-07-23 | 2020-10-30 | 中国科学院计算机网络信息中心 | Method and device for processing meteorological satellite data |
CN111859054B (en) * | 2020-07-23 | 2023-12-26 | 中国科学院计算机网络信息中心 | Meteorological satellite data processing method and device |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | 
| SE01 | Entry into force of request for substantive examination | 
| GR01 | Patent grant | 