CN111709588A - Power consumption prediction method and system - Google Patents
Power consumption prediction method and system
- Publication number
- CN111709588A CN111709588A CN202010578304.1A CN202010578304A CN111709588A CN 111709588 A CN111709588 A CN 111709588A CN 202010578304 A CN202010578304 A CN 202010578304A CN 111709588 A CN111709588 A CN 111709588A
- Authority
- CN
- China
- Prior art keywords
- power consumption
- low
- data
- frequency component
- pass filter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/06—Energy or water supply
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y04—INFORMATION OR COMMUNICATION TECHNOLOGIES HAVING AN IMPACT ON OTHER TECHNOLOGY AREAS
- Y04S—SYSTEMS INTEGRATING TECHNOLOGIES RELATED TO POWER NETWORK OPERATION, COMMUNICATION OR INFORMATION TECHNOLOGIES FOR IMPROVING THE ELECTRICAL POWER GENERATION, TRANSMISSION, DISTRIBUTION, MANAGEMENT OR USAGE, i.e. SMART GRIDS
- Y04S10/00—Systems supporting electrical power generation, transmission or distribution
- Y04S10/50—Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Economics (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Strategic Management (AREA)
- Human Resources & Organizations (AREA)
- General Physics & Mathematics (AREA)
- General Business, Economics & Management (AREA)
- Tourism & Hospitality (AREA)
- Marketing (AREA)
- General Health & Medical Sciences (AREA)
- Software Systems (AREA)
- Entrepreneurship & Innovation (AREA)
- Mathematical Physics (AREA)
- Computing Systems (AREA)
- Molecular Biology (AREA)
- Evolutionary Computation (AREA)
- Development Economics (AREA)
- Data Mining & Analysis (AREA)
- Game Theory and Decision Science (AREA)
- Computational Linguistics (AREA)
- Biophysics (AREA)
- General Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Operations Research (AREA)
- Quality & Reliability (AREA)
- Artificial Intelligence (AREA)
- Life Sciences & Earth Sciences (AREA)
- Public Health (AREA)
- Water Supply & Treatment (AREA)
- Primary Health Care (AREA)
- Supply And Distribution Of Alternating Current (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
The invention provides a power consumption prediction method comprising the following steps: applying a stationary wavelet transform to the power consumption data collected by a smart meter, then normalizing the transformed data by scaling it proportionally into the range 0-1; the stationary wavelet transform converts the power consumption data into a plurality of input vectors with identical feature dimensions; and feeding the input vectors in sequence into a trained power consumption prediction model built on a GRU network, which uses the historical data points of the previous n time steps to predict the power consumption over the next n time steps. The method makes effective use of the power consumption data collected by the smart meter and predicts consumption directly from the signal source, so the prediction results are more timely and targeted.
Description
Technical Field
The invention relates to the field of power consumption prediction, in particular to a power consumption prediction method and a power consumption prediction system.
Background
With the current large-scale roll-out of power sector reform, enterprise power consumption prediction plays an important role in corporate power purchasing decisions and also provides effective management support for the relevant administrative departments. Research shows that, to meet consumer demand during peak hours, power supply companies generally raise generation capacity to cover potential peaks; predicting future power consumption therefore allows power plants and supply networks to formulate reasonable generation and transmission plans and reduce the waste of power resources. Through intelligent upgrading of the grid, the traditional power grid is fused with advanced information and intelligence technologies, bringing about a fundamental transformation of the power industry.
In load forecasting, prediction models built around a recurrent neural network (RNN) are typical representatives of short-term load prediction methods, but they struggle to extract latent high-dimensional features from a historical sequence and tend to lose important information when the sequence is too long. In December 2019, a CNN-GRU short-term power load prediction method based on an attention mechanism was proposed: historical load data are taken as input; a CNN framework composed of one-dimensional convolution and pooling layers extracts high-dimensional features reflecting the complex dynamics of the load; the feature vectors are arranged as a time series and fed into a GRU network, which models and learns the internal dynamics of the features; an attention mechanism is introduced to assign different weights to the GRU hidden states through mapping weighting and a learned parameter matrix, reducing the loss of historical information and strengthening the influence of important information, before the short-term load prediction is finally produced.
However, the smart meter is the representative edge device for collecting user electricity consumption information in a smart grid system, and the data it currently collects are low-dimensional and highly volatile, while other related feature information is unavailable, making the above method difficult to apply to short-term electricity consumption prediction. How to predict power consumption by effectively exploiting a GRU network on smart meter data has therefore become an urgent problem.
Disclosure of Invention
The invention aims to provide a power consumption prediction method and system based on the wavelet transform and a combined wavelet_GRU (gated recurrent unit) model. Several low-frequency approximation parts and high-frequency detail parts with identical feature dimensions are extracted from the power consumption signal collected by the smart meter and fed into a power consumption prediction model built on a GRU network to accurately predict the power consumption over the next n time steps. The method makes effective use of the power consumption data collected by the smart meter and predicts consumption directly from the signal source, so the prediction results are more timely and targeted.
To achieve the above object, with reference to fig. 1, the present invention provides a method for predicting power consumption, which includes the following steps:
S1, applying a stationary wavelet transform to the power consumption data collected by the smart meter, then normalizing the transformed data by scaling it proportionally into the range 0-1; the stationary wavelet transform converts the power consumption data into a plurality of input vectors with identical feature dimensions;
S2, feeding the input vectors in sequence into a trained power consumption prediction model built on a GRU network, which uses the historical data points of the previous n time steps to predict the power consumption over the next n time steps, where n is a positive integer greater than 1;
the power consumption prediction model consists of a single hidden layer of GRU neurons with a fully connected layer on top that maps the GRU output to the desired target features.
Further, in step S1, applying the stationary wavelet transform to the collected power consumption data comprises:
S11, constructing a Daubechies wavelet transform model, taking into account both the data dimension and the smoothness requirement;
S12, feeding the power consumption data into the Daubechies wavelet transform model; at each order the data passes through a high-pass filter and a low-pass filter and the filters are then upsampled, the transform yielding a plurality of low-frequency approximation parts and high-frequency detail parts;
wherein the decomposition coefficient of the Daubechies wavelet transform model is determined by the data dimension and smoothness.
Further, the Daubechies wavelet transform model comprises an input terminal, a first-order filter bank and a second-order filter bank connected in sequence, the first-order filter bank comprising a first high-pass filter and a first low-pass filter, and the second-order filter bank comprising a second high-pass filter and a second low-pass filter;
the original signal is fed into the first-order filter bank through the input terminal; the original signal is convolved with the first high-pass filter to obtain the first high-frequency component contained in it, which serves as the first high-frequency output; the original signal is convolved with the first low-pass filter to obtain the first low-frequency component contained in it, which serves both as the first low-frequency output and as the input signal of the second-order filter bank;
the first low-frequency component is convolved with the second high-pass filter to obtain the second high-frequency component contained in it, which serves as the second high-frequency output; and the first low-frequency component is convolved with the second low-pass filter to obtain the second low-frequency component contained in it, which serves as the second low-frequency output.
Further, in step S1, the transformed data is normalized using the following formula:

x_scaled = (x - minX) / (maxX - minX)

where minX is the minimum value of the data set X, maxX is the maximum value of the data set X, x is a sample of the data set, and x_scaled is the value of x after normalization.
Further, the time step is in units of days.
Further, the prediction method also comprises:
S3, performing inverse normalization on the prediction results, then reconstructing and verifying the restored data.
Further, each neuron in the GRU network-based electricity usage prediction model includes a reset gate for combining new input information with previous memory and an update gate for defining the amount of previous memory saved to the current time step.
Based on this method, the invention also provides a power consumption prediction system comprising a stationary wavelet transform processing module, a normalization processing module and a power consumption prediction model built on a GRU network;
the stationary wavelet transform processing module applies a stationary wavelet transform to the collected power consumption data and converts it into a plurality of input vectors with identical feature dimensions;
the normalization processing module normalizes the transformed data by scaling it proportionally into the range 0-1 and then feeds the processed input vectors into the power consumption prediction model;
the power consumption prediction model trains on the imported input vectors using the historical data points of the previous n time steps provided by the GRU network to predict the power consumption over the next n time steps, where n is a positive integer greater than 1;
the power consumption prediction model consists of a single hidden layer of GRU neurons with a fully connected layer on top that maps the GRU output to the desired target features.
Compared with the prior art, the technical scheme of the invention has the following remarkable beneficial effects:
(1) The method makes effective use of the power consumption data collected by the smart meter and predicts consumption directly from the signal source, so the prediction results are more timely and targeted.
(2) For the limited power consumption data available, the stationary wavelet transform effectively extracts several low-frequency approximation parts and high-frequency detail parts with identical feature dimensions from the signal collected by the smart meter, so that the signal can be fed directly into the power consumption prediction model for training.
(3) Experiments show that, after balancing data dimension and smoothness, the Daubechies wavelet with a decomposition coefficient of 2 gives the best prediction performance.
It should be understood that all combinations of the foregoing concepts and additional concepts described in greater detail below can be considered as part of the inventive subject matter of this disclosure unless such concepts are mutually inconsistent. In addition, all combinations of claimed subject matter are considered a part of the presently disclosed subject matter.
The foregoing and other aspects, embodiments and features of the present teachings can be more fully understood from the following description taken in conjunction with the accompanying drawings. Additional aspects of the present invention, such as features and/or advantages of exemplary embodiments, will be apparent from the description which follows, or may be learned by practice of specific embodiments in accordance with the teachings of the present invention.
Drawings
The drawings are not intended to be drawn to scale. In the drawings, each identical or nearly identical component that is illustrated in various figures may be represented by a like numeral. For purposes of clarity, not every component may be labeled in every drawing. Embodiments of various aspects of the present invention will now be described, by way of example, with reference to the accompanying drawings, in which:
FIG. 1 is a flow chart of a power usage prediction method of the present invention.
FIG. 2 is a block diagram of a power usage prediction system according to the present invention.
Fig. 3 is a diagram of a digital implementation model of a stationary wavelet transform.
Fig. 4 is a schematic structural diagram of each neuron of the GRU network.
Detailed Description
In order to better understand the technical content of the present invention, specific embodiments are described below with reference to the accompanying drawings.
In this disclosure, aspects of the present invention are described with reference to the accompanying drawings, in which a number of illustrative embodiments are shown. Embodiments of the present disclosure are not necessarily defined to include all aspects of the invention. It should be appreciated that the various concepts and embodiments described above, as well as those described in greater detail below, may be implemented in any of numerous ways, as the disclosed concepts and embodiments are not limited to any one implementation. In addition, some aspects of the present disclosure may be used alone, or in any suitable combination with other aspects of the present disclosure.
With reference to fig. 1, the present invention provides a power consumption prediction method, which includes the following steps:
S1, applying a stationary wavelet transform to the power consumption data collected by the smart meter, then normalizing the transformed data by scaling it proportionally into the range 0-1; the stationary wavelet transform converts the power consumption data into a plurality of input vectors with identical feature dimensions.
S2, feeding the input vectors in sequence into a trained power consumption prediction model built on a GRU network, which uses the historical data points of the previous n time steps to predict the power consumption over the next n time steps, where n is a positive integer greater than 1.
The power consumption prediction model consists of a single hidden layer of GRU neurons with a fully connected layer on top that maps the GRU output to the desired target features.
Based on this method, the invention also provides a combined wavelet_GRU model based on the wavelet transform and the GRU (gated recurrent unit), composed of a stationary wavelet transform stage and a GRU model; the model framework is shown in fig. 2.
Wavelet transforms fall roughly into two categories: the continuous wavelet transform (CWT) and the discrete wavelet transform (DWT). Since the electricity consumption data are recorded daily, the discrete wavelet transform is better suited to this limited electricity consumption data.
The discrete wavelet transform is defined as:

W(m, n) = α^(-m/2) · Σ_x f(x) · φ*(α^(-m)·x - n·τ)

where α is the scale parameter, τ is the translation parameter, φ*(x) is the complex conjugate of the wavelet function, m is a scaling constant (representing the number of decomposition levels), n is a translation constant, and m and n are both integers.
Since the power consumption prediction model based on the GRU network requires all input feature dimensions to be the same, the invention adopts a variant of the discrete wavelet transform: the stationary wavelet transform (SWT). The stationary wavelet transform restores the translation invariance that the discrete wavelet transform loses through downsampling, and satisfies the requirement that all feature dimensions of the GRU model's expected input be identical. In the stationary wavelet transform, the filters are upsampled after the high-pass and low-pass filtering of each order, replacing the downsampling of the signal that follows the filters in the conventional discrete wavelet transform.
The decomposition coefficient of the Daubechies wavelet transform model depends on the data dimension and smoothness. After balancing the dimensionality and smoothness of the data, a Daubechies wavelet transform with a decomposition coefficient of 2 is adopted; the transform yields two low-frequency approximation parts and two high-frequency detail parts, corresponding to cA1, cA2, cD1 and cD2 in fig. 2.
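As an illustration of this decomposition, the following sketch uses PyWavelets to compute a level-2 stationary wavelet transform with the db2 Daubechies wavelet; the reading of "decomposition coefficient 2" as db2 at two levels, the series length, and the random placeholder data are assumptions made for the example.

```python
import numpy as np
import pywt

# Placeholder for daily smart-meter readings; pywt.swt requires the length
# to be a multiple of 2**level (pad the series if it is not).
daily_kwh = np.random.rand(128)

# Level-2 stationary (undecimated) wavelet transform with the db2 wavelet.
# pywt.swt returns the deepest level first: [(cA2, cD2), (cA1, cD1)].
(cA2, cD2), (cA1, cD1) = pywt.swt(daily_kwh, wavelet="db2", level=2)

# Every sub-band keeps the original length, so the four parts can be stacked
# into input vectors with identical feature dimensions for the GRU model.
features = np.stack([cA1, cA2, cD1, cD2], axis=-1)   # shape: (128, 4)
```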
The digital implementation model of the stationary wavelet transform with a decomposition coefficient of 2 comprises an input terminal, a first-order filter bank and a second-order filter bank connected in sequence, where the first-order filter bank comprises a first high-pass filter and a first low-pass filter and the second-order filter bank comprises a second high-pass filter and a second low-pass filter.
The original signal is fed into the first-order filter bank through the input terminal. The original signal is convolved with the first high-pass filter to obtain the first high-frequency component contained in it, which serves as the first high-frequency output; the original signal is convolved with the first low-pass filter to obtain the first low-frequency component contained in it, which serves both as the first low-frequency output and as the input signal of the second-order filter bank.
The first low-frequency component is convolved with the second high-pass filter to obtain the second high-frequency component contained in it, which serves as the second high-frequency output; and the first low-frequency component is convolved with the second low-pass filter to obtain the second low-frequency component contained in it, which serves as the second low-frequency output.
FIG. 3 is a diagram of the digital implementation model of the stationary wavelet transform, where x[n] is the original signal, g_i[n] is the low-pass filter of order i, and h_j[n] is the high-pass filter of order j. The working process is as follows:
After the original signal is convolved with the high-pass filter, the high-frequency component x_{i,H}[n] of the signal is obtained; this high-frequency component is the i-th high-frequency output. After the original signal is convolved with the low-pass filter, the low-frequency component x_{j,L}[n] of the signal is obtained; this low-frequency component is then used as the input to the next-order (j+1) filter bank. Repeating these two steps performs the multi-order (here, two-order) stationary wavelet transform.
After the stationary wavelet transform has been applied to the electricity consumption data, the data are normalized, i.e. scaled proportionally to lie between 0 and 1. Specifically, the transformed data are normalized with the following formula:

x_scaled = (x - minX) / (maxX - minX)

where minX is the minimum value of the data set X, maxX is the maximum value of the data set X, x is a sample of the data set, and x_scaled is the value of x after normalization.
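A plain-NumPy sketch of this scaling and its inverse (the function names are illustrative; the inverse is used later when the predictions are mapped back to the original scale):

```python
import numpy as np

def minmax_scale(x):
    """Min-max normalization to [0, 1], as in the formula above."""
    x = np.asarray(x, dtype=float)
    x_min, x_max = x.min(), x.max()
    return (x - x_min) / (x_max - x_min), x_min, x_max

def minmax_inverse(x_scaled, x_min, x_max):
    """Undo the scaling, restoring values to the original range."""
    return np.asarray(x_scaled, dtype=float) * (x_max - x_min) + x_min
```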
The processed data are then fed sequentially into the GRU model for training. The GRU prediction model is a neural network consisting of a single hidden layer with multiple GRU neurons and a fully connected layer on top that maps the GRU output to the desired target features. The network is given the historical data points of the previous n time steps and is trained to predict the power usage over the next n time steps. Preferably, the time step is one day.
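A minimal PyTorch sketch of such a model follows; the hidden size, the four input features (cA1, cA2, cD1, cD2), and the class name are assumptions for illustration rather than values fixed by the patent.

```python
import torch
import torch.nn as nn

class WaveletGRU(nn.Module):
    """Single GRU hidden layer with a fully connected output head."""

    def __init__(self, n_features=4, hidden_size=64, horizon=7):
        super().__init__()
        self.gru = nn.GRU(input_size=n_features, hidden_size=hidden_size,
                          batch_first=True)
        self.fc = nn.Linear(hidden_size, horizon)   # maps GRU output to targets

    def forward(self, x):                 # x: (batch, n_steps, n_features)
        out, _ = self.gru(x)              # out: (batch, n_steps, hidden_size)
        return self.fc(out[:, -1, :])     # forecast for the next `horizon` steps

model = WaveletGRU(n_features=4, hidden_size=64, horizon=7)
x = torch.randn(32, 7, 4)   # 32 samples, previous n = 7 daily steps, 4 sub-bands
y_hat = model(x)            # shape: (32, 7), the next 7 daily values
```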
The GRU provides long-term memory capability, so that n-day-ahead predictions can be made for the daily electricity load. Each neuron of the GRU model includes a reset gate and an update gate: the reset gate determines how the new input information is combined with the previous memory, and the update gate defines how much of the previous memory is carried over to the current time step. The structure of each GRU neuron is shown in fig. 4.
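For reference, the gates of a standard GRU cell can be written as follows (one common formulation with bias terms omitted; x_t is the current input and h_{t-1} the previous memory; in this form the update gate z_t directly controls how much of the previous memory is kept, matching the description above):

z_t = σ(W_z·x_t + U_z·h_{t-1})                 (update gate)
r_t = σ(W_r·x_t + U_r·h_{t-1})                 (reset gate)
h̃_t = tanh(W_h·x_t + U_h·(r_t ⊙ h_{t-1}))      (candidate memory)
h_t = z_t ⊙ h_{t-1} + (1 - z_t) ⊙ h̃_t          (new hidden state)

where σ is the sigmoid function and ⊙ denotes element-wise multiplication.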
In some examples, with reference to fig. 2, the prediction method further comprises:
and S3, normalizing the prediction result data, and reconstructing and verifying the prediction result data after the normalization processing.
Although the present invention has been described with reference to the preferred embodiments, it is not intended to be limited thereto. Those skilled in the art can make various changes and modifications without departing from the spirit and scope of the invention. Therefore, the protection scope of the present invention should be determined by the appended claims.
Claims (8)
1. A power consumption prediction method is characterized by comprising the following steps:
S1, applying a stationary wavelet transform to the power consumption data collected by the smart meter, then normalizing the transformed data by scaling it proportionally into the range 0-1; the stationary wavelet transform converts the power consumption data into a plurality of input vectors with identical feature dimensions;
S2, feeding the input vectors in sequence into a trained power consumption prediction model built on a GRU network, which uses the historical data points of the previous n time steps to predict the power consumption over the next n time steps, where n is a positive integer greater than 1;
the power consumption prediction model consists of a single hidden layer of GRU neurons with a fully connected layer on top that maps the GRU output to the desired target features.
2. The power consumption prediction method according to claim 1, wherein in step S1, applying the stationary wavelet transform to the collected power consumption data comprises:
S11, constructing a Daubechies wavelet transform model, taking into account both the data dimension and the smoothness requirement;
S12, feeding the power consumption data into the Daubechies wavelet transform model; at each order the data passes through a high-pass filter and a low-pass filter and the filters are then upsampled, the transform yielding a plurality of low-frequency approximation parts and high-frequency detail parts;
wherein the decomposition coefficient of the Daubechies wavelet transform model is determined by the data dimension and smoothness.
3. The power consumption prediction method according to claim 2, wherein the Daubechies wavelet transform model comprises an input terminal, a first-order filter bank and a second-order filter bank connected in sequence, the first-order filter bank comprising a first high-pass filter and a first low-pass filter, and the second-order filter bank comprising a second high-pass filter and a second low-pass filter;
the original signal is fed into the first-order filter bank through the input terminal; the original signal is convolved with the first high-pass filter to obtain the first high-frequency component contained in it, which serves as the first high-frequency output; the original signal is convolved with the first low-pass filter to obtain the first low-frequency component contained in it, which serves both as the first low-frequency output and as the input signal of the second-order filter bank;
the first low-frequency component is convolved with the second high-pass filter to obtain the second high-frequency component contained in it, which serves as the second high-frequency output; and the first low-frequency component is convolved with the second low-pass filter to obtain the second low-frequency component contained in it, which serves as the second low-frequency output.
4. The power consumption prediction method according to claim 1, wherein in step S1, the transformed data is normalized using the following formula:
x_scaled = (x - minX) / (maxX - minX)
where minX is the minimum value of the data set X, maxX is the maximum value of the data set X, x is a sample of the data set, and x_scaled is the value of x after normalization.
5. The power usage prediction method of claim 1, wherein the time steps are in units of days.
6. The power consumption prediction method of claim 1, further comprising:
and S3, normalizing the prediction result data, and reconstructing and verifying the prediction result data after the normalization processing.
7. The power usage prediction method of claim 1, wherein each neuron in the power usage prediction model created based on the GRU network includes a reset gate for combining new input information with previous memory and an update gate for defining an amount of previous memory saved to a current time step.
8. A power consumption prediction system, characterized by comprising a stationary wavelet transform processing module, a normalization processing module and a power consumption prediction model built on a GRU network;
the stationary wavelet transform processing module applies a stationary wavelet transform to the collected power consumption data and converts it into a plurality of input vectors with identical feature dimensions;
the normalization processing module normalizes the transformed data by scaling it proportionally into the range 0-1 and then feeds the processed input vectors into the power consumption prediction model;
the power consumption prediction model trains on the imported input vectors using the historical data points of the previous n time steps provided by the GRU network to predict the power consumption over the next n time steps, where n is a positive integer greater than 1;
the power consumption prediction model consists of a single hidden layer of GRU neurons with a fully connected layer on top that maps the GRU output to the desired target features.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010578304.1A CN111709588B (en) | 2020-06-23 | 2020-06-23 | Power consumption prediction method and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010578304.1A CN111709588B (en) | 2020-06-23 | 2020-06-23 | Power consumption prediction method and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111709588A true CN111709588A (en) | 2020-09-25 |
CN111709588B CN111709588B (en) | 2023-08-15 |
Family
ID=72541562
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010578304.1A Active CN111709588B (en) | 2020-06-23 | 2020-06-23 | Power consumption prediction method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111709588B (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112559827A (en) * | 2020-12-08 | 2021-03-26 | 上海上实龙创智能科技股份有限公司 | Measurement parameter prediction and sewage treatment control method based on deep learning |
CN112763967A (en) * | 2020-12-11 | 2021-05-07 | 国网辽宁省电力有限公司鞍山供电公司 | BiGRU-based intelligent electric meter metering module fault prediction and diagnosis method |
CN113554466A (en) * | 2021-07-26 | 2021-10-26 | 国网四川省电力公司电力科学研究院 | Short-term power consumption prediction model construction method, prediction method and device |
CN114118530A (en) * | 2021-11-04 | 2022-03-01 | 杭州经纬信息技术股份有限公司 | Prediction method and device based on multi-household power consumption prediction model |
CN114399078A (en) * | 2021-11-30 | 2022-04-26 | 国网山东省电力公司日照供电公司 | Load day-ahead prediction and ultra-short-term prediction collaborative prediction method and system |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110070229A (en) * | 2019-04-26 | 2019-07-30 | 中国计量大学 | The short term prediction method of home electrical load |
CN111242377A (en) * | 2020-01-15 | 2020-06-05 | 重庆大学 | Short-term wind speed prediction method integrating deep learning and data denoising |
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110070229A (en) * | 2019-04-26 | 2019-07-30 | 中国计量大学 | The short term prediction method of home electrical load |
CN111242377A (en) * | 2020-01-15 | 2020-06-05 | 重庆大学 | Short-term wind speed prediction method integrating deep learning and data denoising |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112559827A (en) * | 2020-12-08 | 2021-03-26 | 上海上实龙创智能科技股份有限公司 | Measurement parameter prediction and sewage treatment control method based on deep learning |
CN112763967A (en) * | 2020-12-11 | 2021-05-07 | 国网辽宁省电力有限公司鞍山供电公司 | BiGRU-based intelligent electric meter metering module fault prediction and diagnosis method |
CN113554466A (en) * | 2021-07-26 | 2021-10-26 | 国网四川省电力公司电力科学研究院 | Short-term power consumption prediction model construction method, prediction method and device |
CN113554466B (en) * | 2021-07-26 | 2023-04-28 | 国网四川省电力公司电力科学研究院 | Short-term electricity consumption prediction model construction method, prediction method and device |
CN114118530A (en) * | 2021-11-04 | 2022-03-01 | 杭州经纬信息技术股份有限公司 | Prediction method and device based on multi-household power consumption prediction model |
CN114399078A (en) * | 2021-11-30 | 2022-04-26 | 国网山东省电力公司日照供电公司 | Load day-ahead prediction and ultra-short-term prediction collaborative prediction method and system |
CN114399078B (en) * | 2021-11-30 | 2024-09-10 | 国网山东省电力公司日照供电公司 | Collaborative prediction method and system for load day-ahead prediction and ultra-short-term prediction |
Also Published As
Publication number | Publication date |
---|---|
CN111709588B (en) | 2023-08-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111709588B (en) | Power consumption prediction method and system | |
Yildiz et al. | An improved residual-based convolutional neural network for very short-term wind power forecasting | |
Dou et al. | Hybrid model for renewable energy and loads prediction based on data mining and variational mode decomposition | |
CN109242212A (en) | A kind of wind-powered electricity generation prediction technique based on change Mode Decomposition and length memory network | |
CN112668611B (en) | Kmeans and CEEMD-PE-LSTM-based short-term photovoltaic power generation power prediction method | |
CN116316591A (en) | Short-term photovoltaic power prediction method and system based on hybrid bidirectional gating cycle | |
CN116881639B (en) | Electricity larceny data synthesis method based on generation countermeasure network | |
CN113361803A (en) | Ultra-short-term photovoltaic power prediction method based on generation countermeasure network | |
CN115065078A (en) | Energy storage capacity configuration method and system in micro-grid environment | |
CN116561567A (en) | Short-term photovoltaic power prediction model based on variation modal decomposition, construction method and application method | |
CN117477539A (en) | Short-term load prediction method based on time depth convolution network | |
JP7529334B1 (en) | Energy time series data forecasting system | |
CN115374908A (en) | Short-term load prediction method and system suitable for flexible interconnection real-time control of distribution network | |
CN116596129A (en) | Electric vehicle charging station short-term load prediction model construction method | |
CN111932007B (en) | Power prediction method and device for photovoltaic power station and storage medium | |
CN117833216A (en) | Photovoltaic power station generated power prediction method and device based on hybrid neural network | |
Koo et al. | Short-term electric load forecasting based on wavelet transform and GMDH | |
CN113779861B (en) | Photovoltaic Power Prediction Method and Terminal Equipment | |
Areekul et al. | Neural-wavelet approach for short term price forecasting in deregulated power market | |
CN114912545A (en) | Power load classification method based on optimized VMD algorithm and DBN network | |
Ning et al. | An ANN and wavelet transformation based method for short term load forecast | |
Prashanthi et al. | A comparative study of the performance of machine learning based load forecasting methods | |
Bunnoon et al. | Wavelet and neural network approach to demand forecasting based on whole and electric sub-control center area | |
Chaturvedi et al. | Neural-Wavelet Based Hybrid Model for Short-Term Load Forecasting | |
Li et al. | A new deep learning method for classification of power quality disturbances using DWT-MRA in utility smart grid |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||