This application is related to U.S. provisional patent applications 60/649,677, 60/649,877, 60/649,876 and 60/649,803, all filed on 3/2/2005, which are hereby incorporated by reference in their entirety.
Detailed Description
The invention provides a neural network-based Very Short Term Load Predictor (VSTLP). Aspects of VSTLPs are described in more detail in U.S. patent applications 2004/0249775 to Chen, 2004/0246643 to Chen, 2004/0257059 to Mansingh et al., 2004/0260489 to Mansingh et al., and 2004/0257858 to Mansingh et al., and these applications are incorporated herein by reference. Using the VSTLP requires determining the actual load data source, training neural networks off-line, and then performing on-line prediction with on-line training/tuning to improve prediction accuracy.
Load data for up to five whole-day load curves is used for off-line neural network training. One year of load data for the specified time period is stored in the VSTLP database, where ten years of holiday load data are also stored. The dates of these load curves identify potential sources of load data; in other words, a load data source is specified by date. The determination of the actual load data sources for neural network training is achieved in two stages. In the first stage, the user specifies the date of one or more data sources. In the second stage, the remaining best-match dates, those with the closest load curves, are looked up using the exemplary algorithm described below. The date of the first load data source may be specified by the user, and the first set of load data is used to generate a reference load curve. Additional date data sets, for example four, may be specified by the user or looked up by the proposed algorithm.
Referring to FIG. 1, a Very Short Term Load Predictor (VSTLP) module 1000 is a tool that predicts very short term system loads. VSTLP module 1000 uses a set of neural networks to predict system load over short term periods at short intervals, for example predicting system load in 1-minute increments for the next 30 minutes. If available, VSTLP module 1000 uses past load data and short term load forecast (STLF) data from an STLF module (not shown) to predict the upcoming load trend for the next 15 minutes.
VSTLP module 1000 utilizes Artificial Neural Network (ANN) techniques to predict load requirements. Because loads vary widely between seasons, differ between weekdays and weekends, and often vary dynamically from day to day, the ANN-based VSTLP module 1000 includes, among other features: a function for distinguishing between different seasons (e.g., summer and winter); a function for distinguishing weekends and holidays from workdays; a function for distinguishing off-peak times from peak times; a function for dynamically predicting the load values of the next cycle (15 one-minute values) in accordance with a recent time period (e.g., the past 15 minutes); and a function for conforming to the time-averaged load values predicted by the STLF module or an equivalent external source.
Because the STLF module already incorporates recorded and simulated weather information, VSTLP module 1000 need not directly consider or input weather information. However, it should be understood that the day pattern may be derived from a weather-adaptive load prediction or manually entered. Weather change information is thus not used directly in the ANN-based VSTLP 1000 (although other embodiments may use it); instead, the hourly load values predicted by the STLF are used to adjust the per-minute load values predicted by the ANN-based VSTLP 1000.
To capture load behavior across seasons, weekday/weekend/holiday distinctions, and off-peak/peak times, a neural network may be trained to acquire the load pattern exhibited during a particular season, on a particular day, and at a particular time of day. Initially, the year is divided into spring (months 1 to 3), summer (months 4 to 6), autumn (months 7 to 9), and winter (months 10 to 12). This division may vary depending on the actual load characteristics of the particular location to which the VSTLP module 1000 is applied. For example, the load properties on weekends (Saturday and Sunday) differ from those on weekdays (Monday through Friday). The division between weekdays and weekends can likewise be adjusted based on the actual environment; for example, weekends may include Saturday, Sunday, and Monday, while weekdays may include Tuesday through Friday. Generally, the load characteristics on weekdays and weekends are recurring. However, the load characteristics on holidays are very different from those on regular weekdays/weekends, so holidays may be specially considered, particularly major holidays such as Thanksgiving and Christmas. The load characteristics of the days before and after a holiday are also affected. For example, when collecting data for training a neural network associated with VSTLP 1000, the interval from 6.
The ANN-based VSTLP structure is shown in FIG. 1. The decision algorithm 1010 of FIG. 1 processes predicted load values from the corresponding ANN VSTLP modules 1012 through 1028 (weekday, month, weekend, and holiday) at the end of each time limit and minimizes the effects of switching between the ANN VSTLP modules. The decision algorithm 1010 may be implemented by a neural network. As shown in FIG. 2, for each individual day, six neural networks (NN1, NN2, NN3, NN4, NN5, NN6) may be used to cover the 24-hour period. In this embodiment, each neural network is responsible for a 4-hour period (sub-period), although other time allocations and numbers of neural networks may be employed. For example, NN1 through NN6 cover 12:00 am to 4:00 am, 4:00 am to 8:00 am, 8:00 am to 12:00 pm, 12:00 pm to 4:00 pm, 4:00 pm to 8:00 pm, and 8:00 pm to 12:00 am, respectively. To ensure a smooth transition from one 4-hour period to the next, an additional half hour is added at each end of each 4-hour period; for example, NN1 covers 11:30 pm to 4:30 am and NN2 covers 3:30 am to 8:30 am. Dividing the whole day into six different 4-hour periods reflects the fact that the load changes dynamically throughout the day. It should be understood that different time allocations, overlaps, and numbers of neural networks may be used; such partitioning may vary according to the actual load patterns. Using multiple neural networks for the different time periods of the day allows the load to be predicted more accurately. Thus, as also shown in FIG. 2, each ANN VSTLP module 1012-1028 shown in FIG. 1 has multiple ANNs to predict loads corresponding to a particular time period.
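The sub-period assignment described above can be sketched as follows; the exact boundary times and the half-hour overlap handling are illustrative assumptions inferred from the example given (NN1 covering 11:30 pm to 4:30 am), not a definitive implementation:

```python
# Map a minute-of-day (0..1439) to the 4-hour sub-period networks NN1..NN6.
# Boundaries and overlap handling are assumptions for illustration only.

PERIOD_MINUTES = 4 * 60          # each network owns a 4-hour block
OVERLAP = 30                     # extra half hour at each end for smooth hand-off

def owning_network(minute_of_day):
    """Return the index (1..6) of the network primarily responsible for this minute."""
    return minute_of_day // PERIOD_MINUTES + 1

def covering_networks(minute_of_day):
    """Return all networks whose extended (overlapped) window covers this minute."""
    nets = []
    for idx in range(6):
        start = idx * PERIOD_MINUTES - OVERLAP
        end = (idx + 1) * PERIOD_MINUTES + OVERLAP
        m = minute_of_day
        # windows wrap around midnight (e.g. NN1 starts at 11:30 pm)
        if start <= m < end or start <= m - 1440 < end or start <= m + 1440 < end:
            nets.append(idx + 1)
    return nets
```

For example, 4:00 am (minute 240) is owned by NN2 but also falls inside NN1's extended window, which is what makes the hand-off between adjacent networks smooth.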
The decision algorithm in the daily ANN-based VSTLP structure shown in FIG. 2 processes the predicted load values from the corresponding ANN VSTLP module at the end of each time limit and minimizes the effect of ANN VSTLP module switching. Decision algorithm unit 2010 may also be implemented by a neural network and may employ linear or non-linear decision algorithms as well as neural network transfer functions. Each NN is implemented with one or two hidden layers, depending on the complexity of the load dynamics it must learn. More specifically, one or more neurons may be used in each neural network with varying weights, biases, and linear and non-linear transfer functions. It should be understood that the inputs are affected by the weights and biases assigned to each neuron.
In the following equations, the load is represented by P. The predicted load values for the next 15 minutes may be expressed as a function of the current load value and the previous N one-minute load values:

P_{n+i} = f_i(P_n, P_{n-1}, P_{n-2}, ..., P_{n-N}, n), 1 ≤ i ≤ M
Here, P_{n+i} (1 ≤ i ≤ M) is the predicted load for the i-th step (minute) into the future from the current time n. P_n, P_{n-1}, P_{n-2}, ..., P_{n-N} are the actual load values for the current time and the previous N minutes. In this description, M is 15. The choice of N depends on the complexity of the load dynamics and is determined by trial-and-error experimentation together with any available knowledge of the load dynamics.
It is observed that in the above equation the dynamics change with time. However, the time-varying effect can be ignored within any single time period when the whole 24-hour day (23 or 25 hours when DST changes occur) is suitably divided, since the load dynamics vary between different individual time periods. Thus, multiple NNs are used to address the time-varying load dynamics. Within each individual time period, the load dynamics can be simply expressed as:

P_{n+i} = f_i(P_n, P_{n-1}, P_{n-2}, ..., P_{n-N}), 1 ≤ i ≤ M
the above equation can be rewritten as a vector format as follows.
Because f_1, f_2, ..., f_M are unknown, the exact form of these functions is also unknown. However, with the available historical load data, a feed-forward neural network with appropriate layers can be trained to approximate these functions.
In the neural network approximation, θ is a parameter vector containing the weights between adjacent layers and all neuron biases; θ is adjusted to minimize the difference, expressed in a performance index, between the calculated and actual future load values.
The neural network is trained off-line using historical load data. After neural network training and validation are complete, it is ready for on-line use. The weights may be adjusted on a daily basis so that they account only for the load characteristics exhibited on the previous day. The weights may also be updated off-line.
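As a rough illustration of the off-line training step, the sketch below trains a small one-hidden-layer feed-forward network to map the last N+1 load values to the next M values. The layer sizes, learning rate, and the synthetic curve standing in for historical load data are all assumptions for illustration, not the patented configuration:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, HIDDEN = 15, 15, 20          # assumed window sizes and hidden-layer width

# Synthetic "historical load": a smooth daily-looking curve plus noise.
t = np.arange(4000)
load = 100 + 20 * np.sin(2 * np.pi * t / 1440) + rng.normal(0, 0.5, t.size)

# Build (inputs, targets): inputs are P_n..P_{n-N}, targets are P_{n+1}..P_{n+M}.
X = np.stack([load[i - N:i + 1] for i in range(N, load.size - M)])
Y = np.stack([load[i + 1:i + 1 + M] for i in range(N, load.size - M)])
mu, sd = load.mean(), load.std()
X, Y = (X - mu) / sd, (Y - mu) / sd    # normalize for stable training

W1 = rng.normal(0, 0.1, (N + 1, HIDDEN)); b1 = np.zeros(HIDDEN)
W2 = rng.normal(0, 0.1, (HIDDEN, M));     b2 = np.zeros(M)

def forward(X):
    H = np.tanh(X @ W1 + b1)          # hidden layer
    return H, H @ W2 + b2             # linear output layer

def mse(X, Y):
    return float(np.mean((forward(X)[1] - Y) ** 2))

lr = 0.05
loss0 = mse(X, Y)
for _ in range(200):                  # plain batch gradient descent
    H, out = forward(X)
    err = (out - Y) / X.shape[0]
    gW2 = H.T @ err; gb2 = err.sum(0)
    dH = err @ W2.T * (1 - H ** 2)    # backprop through tanh
    gW1 = X.T @ dH; gb1 = dH.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1
```

After training, the held-out second data set mentioned below would be used to check for over-training before the network is put into on-line use.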
When the actual values P_n, P_{n-1}, P_{n-2}, ..., P_{n-N} are available, the predicted load values P_{n+i} (1 ≤ i ≤ M) can be calculated immediately. When P_n, P_{n-1}, P_{n-2}, ..., P_{n-N} are not available, the estimates generated by the ANN-based VSTLP at previous times are used instead to predict further future loads. This can be repeated until all the minute-by-minute predicted values for the entire hour are calculated, as shown in FIG. 3.
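The substitution of earlier predictions for unavailable actual values can be sketched as follows; `predict_next` is a hypothetical stand-in for the trained neural network, and the one-value-at-a-time horizon is a simplification of the 15-value network output:

```python
from collections import deque

def predict_hour(recent_actuals, predict_next, n_inputs=16, horizon=60):
    """Roll a one-step predictor forward for `horizon` minutes.

    recent_actuals : latest actual 1-minute loads (oldest first)
    predict_next   : stand-in for the trained ANN; maps a window of
                     n_inputs values to the next 1-minute load
    """
    window = deque(recent_actuals[-n_inputs:], maxlen=n_inputs)
    predictions = []
    for _ in range(horizon):
        p = predict_next(list(window))
        predictions.append(p)
        window.append(p)   # predicted value substitutes for the
                           # not-yet-available actual value
    return predictions
```

With a toy predictor `lambda w: w[-1] + 1` and an initial window ending at 15, the loop produces 16, 17, ... for the rest of the hour, each step consuming its own earlier outputs.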
An adaptive scaling unit 3010 in the next-hour ANN-based VSTLP of FIG. 3 processes the raw per-minute predicted load values produced by the four ANN-based VSTLP modules 3012-3018. Each of these VSTLP modules 3012-3018 is responsible for a 15-minute prediction time period, and its output is scaled based on the hourly predicted load value issued by the STLF.
Assume that the number of inputs, M, is not greater than 15. At any instant n when the prediction of the next hour's load values begins, the first ANN-based VSTLP calculates the next 15 minutes of load values, P_{n+i} (1 ≤ i ≤ 15), based on the actual previous load values. When predicting load values for the time period from n+16 to n+30, the actual load values for the preceding period are not yet available, so the second ANN-based VSTLP uses some of the available per-minute predicted load values P_{n+i} (1 ≤ i ≤ 15); in this manner, the second ANN-based VSTLP predicts the load values P_{n+i} (16 ≤ i ≤ 30). Similarly, the third and fourth ANN-based VSTLPs each produce another 15 minutes of predicted load values, P_{n+i} (31 ≤ i ≤ 45) and P_{n+i} (46 ≤ i ≤ 60). These four ANN-based VSTLPs collectively yield the next 60 minutes of predicted load values. However, if the timestamps associated with some of these predicted load values exceed the current hour, those values are not employed in the adaptive scaling. If the current time is i minutes after the hour, then for the time period from n-i+1 to n the actual values P_{n-i+1}, P_{n-i+2}, P_{n-i+3}, ..., P_n are available, while for the remainder of the hour only the predicted values P_{n+k} (1 ≤ k ≤ 60-i) are available. Depending on the final application, the predicted load values P_{n+k} (60-i+1 ≤ k ≤ 60) are discarded, or the corresponding time period is not predicted at all.
For example, let the scaling factor be s_n and let the STLF hourly predicted load value for the current hour be P_stlf. To match the per-minute predicted loads during this hour to the STLF hourly predicted load with satisfactory accuracy, the following equation is used:

Σ_{k=0}^{i-1} P_{n-k} + s_n · Σ_{k=1}^{60-i} P_{n+k} = 60 · P_stlf
Therefore,

s_n = (60 · P_stlf − Σ_{k=0}^{i-1} P_{n-k}) / Σ_{k=1}^{60-i} P_{n+k}
the modified predicted load per minute value for the future time period from n +1 to n +60-i in the current hour is then s n P n+k (k is more than or equal to 1 and less than or equal to 60-i). However, it should be understood that Sn changes in time as the predicted 15 minute sliding window updates every minute. S. the n Is also a performance marker for ANN-based VSTLP. If S is n With a small fluctuation around 1 (or substantially around 1), and assuming that the hourly load pattern is fully captured by STLF with satisfactory accuracy, this indicates that the ANN-based VSTLP performs reasonably well in the sense that the predicted load values per minute produced by the ANN-based VSTLP are consistent with the predicted load values per hour of STLF. In addition, it should be noted that inconsistent loads must be addressed and separately processed or filtered out to minimize or eliminate their effect on ANN-based VSTLP prediction accuracy. In addition, in some instances, STLF should be more than the most load-predicted oneThe large prediction period is still long.
The historical load data stored in HFD/HIS 1030 must be formatted appropriately before it can be used to train the ANN-based VSTLP neural networks. This may be accomplished through an interface program that retrieves the historical data from the HFD/HIS, reformats it, and sends it to the main program responsible for neural network training. The historical data is preferably divided into two different sets: the first set is used to train the neural networks, while the second set is used to evaluate the performance of the trained neural networks and thereby prevent over-training. Based on the performance evaluation, the number of training sessions can be specified and the final training performed with all available and useful historical data. Information related to the ANN-based VSTLP 1000, such as the number of layers, the number of neurons in each layer, the activation functions, and the weights and biases, is stored in a file.
Structural and parameter information related to the ANN-based VSTLP 1000 is retrieved to initialize its neural networks. As with off-line VSTLP training, the interface program retrieves historical load data from HFD/HIS 5030, or from another online application that temporarily stores historical load data, and sends it to the initialized neural networks for load prediction. The resulting per-minute predicted load values are used for generation, scheduling, and display purposes. These load values are also stored for later performance evaluation.
The weights and biases of the neural networks can be updated online using the actual load values from the immediate past together with the predicted load values generated by the neural networks. This exploits the fact that recent load characteristics can be used to improve the accuracy of predicting the load in the near future. After a certain period of time (e.g., the time period defined in the daily ANN-based VSTLP structure), the updated weights and biases are discarded and the initial weights and biases are reinstated. The weights and biases can also be updated monthly, weekly, daily, hourly, or even at shorter intervals, depending on the actual environment.
The algorithms of the first and second exemplary embodiments take two conditions into account: the candidate load curve should be similar to the user-specified reference load curve, and the candidate load curve should display a degree of deviation from the reference load curve. The first condition ensures that the selected load curves are close to the reference curve, so that these load curves exhibit the dominant load trend, which the neural network captures well through training. The second condition requires that the selected load curves vary around the reference curve, so that the neural network generalizes well after training is completed.
The first load curve specified by the user is regarded as the reference load curve and is denoted C_r(t), t ∈ (0, 1440] (t in minutes). Daylight saving time (DST) can be treated as follows: for the long DST day, the load data of the repeated hour is not used; for the short DST day, the load data of the second hour is repeated for the skipped time period beginning at 2:00. With this treatment, every day has a load curve comprising 1440 one-minute load data points or 288 five-minute load data points. For explanation, but without loss of generality, the load curve is assumed to have 1440 one-minute load data points.
Let C_i(t), i ∈ [1, N] with N ≤ 4, denote the load curves for the best-match dates (still to be identified by the algorithm of the exemplary embodiment). Any date in a year can be represented by the subscript k, 1 ≤ k ≤ 365, with the superscript n representing the year, so that C_k^n(t) denotes the load curve stored in the VSTLP database for any date over which a search may be performed. The objective is to search among the C_k^n(t)'s for the C_i(t)'s.
Referring to FIG. 4, a first exemplary algorithm 400 utilizes both whole load curve matching and partial load curve matching. The user provides a reference date, and the system retrieves the reference load curve C_r(t) (element 402). The system may make the DST adjustment, t ∈ (1, 1440], as previously described (element 404). The system determines a reduced date set based on the day of the week of the next day (i.e., the day after the reference day) (element 406). Once the date set is determined, load data for these dates is retrieved from the VSTLP database (element 408); the data may be stored in the special tables that hold the yearly load data. The best matching dates (and thus the load data for neural network training) may be selected on the current day, and the trained neural networks used to predict the load characteristics of the next day. The screening step that determines the reduced date set Ω is described in more detail in FIG. 6, described below.
The system loops through the reduced date set Ω (element 410). The system may retrieve one load curve at a time, C_k^n(t), from the VSTLP database (element 412), and may make the necessary DST adjustments, t ∈ (1, 1440], for each set of date data as previously described (element 413). For each set of date data, the system calculates a global difference measure and a match measure for each curve C_k^n(t) (element 414). The integrated square of the difference between the reference curve C_r(t) and C_k^n(t) is:

A_1(n, k) = Σ_{t=1}^{1440} [C_k^n(t) − C_r(t)]^2
The match measure A_2(n, k) of curve C_k^n(t) against the reference curve C_r(t) is computed in a similar manner, with larger values of A_2(n, k) indicating a closer match.
referring to FIG. 5, the system pairs all A 1 (n, k) sorting to pick the desired lowest value, whereby A 1 (n 1 ,k 1 )≤A 1 (n 2 ,k 2 )≤A 1 (n 3 ,k 3 )≤A 1 (n 4 ,k 4 )≤A 1 (n, k) where the pair of (n, k) identifies not four identification dates (n) 1 ,k 1 )、(n 2 ,k 2 )、(n 3 ,k 3 )、 (n 4 ,k 4 ) Unique date (date k in year n), and definition D 1 = {(n 1 ,k 1 ),(n 2 ,k 2 ),(n 3 ,k 3 ),(n 4 ,k 4 ) A reference date (element 516).According to the exemplary embodiment herein, the desired number is 4, but the present invention is not limited to four sets.
The system also sorts all A_2(n, k) to pick the four maximum values, so that A_2(n_5, k_5) ≥ A_2(n_6, k_6) ≥ A_2(n_7, k_7) ≥ A_2(n_8, k_8) ≥ A_2(n, k), where (n, k) ranges over the pairs other than the four identified dates (n_5, k_5), (n_6, k_6), (n_7, k_7), (n_8, k_8), each pair identifying a unique date, and defines D_2 = {(n_5, k_5), (n_6, k_6), (n_7, k_7), (n_8, k_8)} (element 518). Similarly, according to the exemplary embodiment herein, the desired number is four, but the present invention is not limited to four dates. The system then generates the union of dates D = D_1 ∪ D_2, removing any duplicate dates, so that D has at least four dates and, according to the exemplary embodiment, at most eight dates (element 520).
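The whole-curve screening can be sketched as below. A1 follows the integrated-squared-difference definition; the exact form of the match measure A2 is not given here, so a normalized correlation is assumed purely for illustration:

```python
import math

def a1(curve, ref):
    # integrated (summed) squared difference against the reference curve
    return sum((c - r) ** 2 for c, r in zip(curve, ref))

def a2(curve, ref):
    # ASSUMED form of the match measure: normalized correlation, larger = better
    num = sum(c * r for c, r in zip(curve, ref))
    den = math.sqrt(sum(c * c for c in curve)) * math.sqrt(sum(r * r for r in ref))
    return num / den

def best_dates(candidates, ref, keep=4):
    """candidates: {(year, day): curve}. Return the union D = D1 | D2."""
    by_a1 = sorted(candidates, key=lambda d: a1(candidates[d], ref))
    by_a2 = sorted(candidates, key=lambda d: a2(candidates[d], ref), reverse=True)
    d1, d2 = set(by_a1[:keep]), set(by_a2[:keep])
    return d1 | d2        # at least `keep`, at most 2 * keep dates
```

Taking the set union mirrors element 520: duplicate dates selected by both measures collapse to a single entry.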
The system then cycles through the date set D for the dates identified by the pairs (n, k) (element 522). The system according to the first exemplary embodiment may calculate the following local match measure (element 524). The local match measure matches curve C_k^n(t) against the reference curve C_r(t) over the time period T = {t | N_1 ≤ t ≤ N_2} for which a neural network is trained to be responsible for load prediction; larger values of the local match measure A_3(n, k) indicate a closer local match.
as described above, the entire day is divided into a number of non-overlapping time segments, and for each time segment there is a neural network designed and trained to be responsible for providing load predictions. As described in more detail with reference to the second exemplary embodiment, the local match measure is used when loading the training and predictionThe degree is more reasonable than the local measure of difference. The system is to all A 3 (N, k) sorting to pick the maximum of N (depending on how many automatically generated matching dates N ≦ 4 are required) and find the corresponding date- -depending on the three matching criteria applied: a1, A2, A3 serve as best match dates providing best load curve matching (element 526).
Referring to FIG. 6, the system determines the reduced date set based on the day of the week of the next day (i.e., the base day), according to the reduced date data set selection embodiment 600 (element 406). If the system determines that the next day falls between Tuesday and Thursday inclusive, then any workday between Tuesday and Thursday, including those of the same week, may be included in the selected set of dates (element 602). If the system determines that the next day is a Monday, then the previous Monday, Sunday, and Tuesday are included in the selected set of dates (element 604). If the system determines that the next day is a Friday, then the previous Thursday, Friday, and Saturday are included in the selected set of dates (element 606). If the system determines that the next day is either weekend day, then the previous Friday, weekend, and Monday are included in the selected set of dates (element 608). These rules allow the system to take the day of the week into account when selecting the data set.
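A minimal sketch of the FIG. 6 weekday rules (elements 602-608), reading the day names as weekday classes eligible for the candidate set; this simplified mapping is an assumption for illustration:

```python
# 0 = Monday ... 6 = Sunday
MON, TUE, WED, THU, FRI, SAT, SUN = range(7)

def eligible_weekdays(next_day):
    """Weekday classes whose historical dates may enter the reduced date set."""
    if TUE <= next_day <= THU:           # element 602: mid-week days
        return {TUE, WED, THU}
    if next_day == MON:                   # element 604
        return {SUN, MON, TUE}
    if next_day == FRI:                   # element 606
        return {THU, FRI, SAT}
    return {FRI, SAT, SUN, MON}           # element 608: weekend days
```

A date-set builder would then scan the stored year of load data and keep only dates whose weekday is in the returned set (and apply the holiday rules of element 610 separately).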
The system also takes holidays into account when selecting dates for the reduced data set. If the system determines that the next day is a holiday, the selected set of dates includes previous occurrences of the same holiday and dates no more than two days earlier or later than the holiday (element 610). In addition, the system prevents the reference date itself from being included in the data set (element 612). The system thus provides a data set that accounts for both the day of the week and holidays relative to the reference date.
An incremental load curve is formed from each load curve. From load curve C_k^n(t), the incremental load curve ΔC_k^n(t), t ∈ (1, 1440], is formed. For illustration, the increment interval in the exemplary embodiment is 1 minute, but the invention is not limited to 1-minute increments; various increments may be used, for example 5 minutes. Neural network based very short term load prediction provides a method that can also train and predict based on load increments, and to better apply this method, curve matching based on load increments is an alternative choice.
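Assuming the increment is the difference between consecutive samples (the definition is implied rather than stated here), forming the incremental curve can be sketched as:

```python
def incremental_curve(load_curve):
    """ΔC(t) = C(t) - C(t-1); the result is one point shorter than the input."""
    return [b - a for a, b in zip(load_curve, load_curve[1:])]
```

Matching on increments compares the shape (minute-to-minute movement) of two days rather than their absolute levels, which is why it is offered as an alternative screening choice.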
The second exemplary algorithm 700 also utilizes whole load curve matching as well as partial load curve matching. The second exemplary algorithm 700 performs the same steps as described for the first exemplary algorithm 400 with reference to FIG. 4, and continues from element 414 of FIG. 4 to element 716 of FIG. 7.
The system sorts all A_1(n, k) to pick the desired number of lowest values, so that A_1(n_1, k_1) ≤ A_1(n_2, k_2) ≤ A_1(n_3, k_3) ≤ A_1(n_4, k_4) ≤ A_1(n, k), where (n, k) ranges over the pairs other than the four identified dates (n_1, k_1), (n_2, k_2), (n_3, k_3), (n_4, k_4), each pair identifying a unique date (date k in year n), and defines D_1 = {(n_1, k_1), (n_2, k_2), (n_3, k_3), (n_4, k_4)} (element 716). According to the exemplary embodiment herein, the desired number is four, but the present invention is not limited to four dates.
The system also sorts all A_2(n, k) to pick the four maximum values, so that A_2(n_5, k_5) ≥ A_2(n_6, k_6) ≥ A_2(n_7, k_7) ≥ A_2(n_8, k_8) ≥ A_2(n, k), where (n, k) ranges over the pairs other than the four identified dates (n_5, k_5), (n_6, k_6), (n_7, k_7), (n_8, k_8), each pair identifying a unique date, and defines D_2 = {(n_5, k_5), (n_6, k_6), (n_7, k_7), (n_8, k_8)} (element 718). Similarly, according to the exemplary embodiment herein, the desired number is four, but the present invention is not limited to four dates. The system then generates the union of dates D = D_1 ∪ D_2, removing any duplicate dates, so that D has at least four dates and, according to the exemplary embodiment, at most eight dates (element 720).
The system then cycles through the date set D for each date identified by the pair (n, k) (element 722) and, according to the second exemplary embodiment, calculates the following local difference measure (element 724). Over the time period T = {t | N_1 ≤ t ≤ N_2} for which a neural network is trained to be responsible for load prediction, the integrated square of the difference between the reference incremental curve ΔC_r(t) and ΔC_k^n(t) is:

A_3(n, k) = Σ_{t=N_1}^{N_2} [ΔC_k^n(t) − ΔC_r(t)]^2
The day is divided into a number of non-overlapping time segments, and for each time segment there is a neural network designed and trained to be responsible for providing load predictions. When load increments are used for curve matching, the local difference measure is more appropriate than the local match measure of the first embodiment. The system sorts all A_3(n, k) to pick the N smallest values (depending on how many automatically generated matching dates are required, N ≤ 4) (element 726). The system then finds the corresponding dates, which, according to the three matching criteria applied (A1, A2, A3), serve as the best match dates providing the best load curve matching.
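The per-segment screening of the second embodiment can be sketched as follows; N_1 and N_2 delimit the segment a particular neural network is responsible for, and the function names are illustrative:

```python
def local_difference(delta_candidate, delta_ref, n1, n2):
    # integrated squared difference of the incremental curves over [n1, n2]
    return sum((delta_candidate[t] - delta_ref[t]) ** 2 for t in range(n1, n2 + 1))

def pick_best(deltas, delta_ref, n1, n2, n_keep):
    """Return the n_keep dates whose incremental curves differ least
    from the reference over the segment (smallest A3 first)."""
    ranked = sorted(deltas,
                    key=lambda d: local_difference(deltas[d], delta_ref, n1, n2))
    return ranked[:n_keep]
```

Because the measure is restricted to one segment, each segment's neural network can be trained on the dates that match best within its own hours of the day.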
The systems and methods employed in the first exemplary embodiment 400 and the second exemplary embodiment 700 may be implemented in various ways. The exemplary embodiments use sets of power generation data over different time periods, but the present invention is not limited to power generation data. As understood by those skilled in the art, the ANN VSTLP may be used for various simulation and training purposes.
The systems and methods may be implemented in hardwired or programmable hardware, and may be implemented in software utilizing various components to realize the embodiments described herein. Aspects disclosed in the exemplary embodiments may be utilized independently or in conjunction with other exemplary embodiments. Moreover, it will be understood that the foregoing is only illustrative of the principles of the invention, and that various modifications can be made by those skilled in the art without departing from the scope and spirit of the invention. Those skilled in the art will appreciate that the present invention can be practiced by embodiments other than those described, which are presented for purposes of illustration and not of limitation, and that the present invention is limited only by the claims that follow.