CN115545294A - ISSA-HKELM-based short-term load prediction method - Google Patents


Info

Publication number
CN115545294A
Authority
CN
China
Prior art keywords
hkelm
issa
distribution
value
sparrow
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211199181.6A
Other languages
Chinese (zh)
Inventor
司文旭
万俊杰
方严
李新颜
何艳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Acrel Co Ltd
Jiangsu Acrel Electrical Manufacturing Co Ltd
Original Assignee
Acrel Co Ltd
Jiangsu Acrel Electrical Manufacturing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Acrel Co Ltd, Jiangsu Acrel Electrical Manufacturing Co Ltd filed Critical Acrel Co Ltd
Priority to CN202211199181.6A
Publication of CN115545294A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G06N20/10 Machine learning using kernel methods, e.g. support vector machines [SVM]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/004 Artificial life, i.e. computing arrangements simulating life
    • G06N3/006 Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/06 Energy or water supply
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y04 INFORMATION OR COMMUNICATION TECHNOLOGIES HAVING AN IMPACT ON OTHER TECHNOLOGY AREAS
    • Y04S SYSTEMS INTEGRATING TECHNOLOGIES RELATED TO POWER NETWORK OPERATION, COMMUNICATION OR INFORMATION TECHNOLOGIES FOR IMPROVING THE ELECTRICAL POWER GENERATION, TRANSMISSION, DISTRIBUTION, MANAGEMENT OR USAGE, i.e. SMART GRIDS
    • Y04S10/00 Systems supporting electrical power generation, transmission or distribution
    • Y04S10/50 Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Economics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Strategic Management (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Marketing (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Molecular Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computational Linguistics (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Game Theory and Decision Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Development Economics (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Water Supply & Treatment (AREA)
  • Primary Health Care (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention relates to a short-term load prediction method based on ISSA-HKELM, which comprises the following steps: firstly, aiming at the defects of the kernel extreme learning machine (KELM), combining a Gaussian kernel function and a polynomial kernel function to construct a hybrid kernel extreme learning machine (HKELM) with stronger generalization capability; secondly, aiming at the tendency of the sparrow search algorithm to fall into local extrema, introducing an adaptive t-distribution strategy and a dynamic adaptive weight to improve the sparrow search algorithm; then, optimizing the parameters of the HKELM with the improved sparrow search algorithm (ISSA) and establishing an ISSA-HKELM prediction model; and finally, performing short-term load prediction with the established ISSA-HKELM model. Compared with the prior art, the method has the advantages of good prediction accuracy and robustness.

Description

ISSA-HKELM-based short-term load prediction method
Technical Field
The invention relates to a short-term load prediction method, in particular to a short-term load prediction method based on an improved sparrow search algorithm (ISSA) and an optimized hybrid kernel extreme learning machine (HKELM).
Background
Load prediction is an important basis for energy planning, economic operation and energy management. It generally comprises long-term, medium-term and short-term load prediction, where short-term load prediction refers to predicting the load of the coming day or week. Short-term load is strongly influenced by factors such as weather, equipment conditions and important social activities, which makes accurate prediction difficult. Industrial, residential and utility electric loads have different characteristics, and weather changes give the electric load great volatility and seasonality; accurate load prediction is therefore the basis for power system expansion planning, operation and maintenance.
In the field of short-term power load prediction, prediction methods mainly fall into traditional statistical methods and machine learning methods. The traditional methods mainly comprise time-series methods, regression analysis, Kalman filtering and the like; most of them judge the future trend of the load by fitting historical data, which is simple but cannot reflect the nonlinear characteristics of the load. The machine learning methods mainly comprise BP neural networks, support vector machines (SVM), extreme learning machines (ELM) and the like; although their prediction accuracy is higher than that of the traditional methods, they still suffer from complex procedures, poor stability and difficult parameter tuning. The kernel extreme learning machine (KELM), a novel kernel-function-based single-hidden-layer feedforward neural network algorithm, performs well on nonlinear regression, but most traditional KELMs adopt a single kernel function and have weak generalization capability and prediction performance. The sparrow search algorithm, a novel swarm intelligence optimization algorithm, has the advantages of simple parameter setting and fast convergence, but its population diversity decreases in the later iteration stage and it is prone to falling into local extrema.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide a short-term load prediction method based on ISSA-HKELM, which has good prediction accuracy and robustness.
The purpose of the invention can be realized by the following technical scheme:
according to one aspect of the invention, a method for predicting short-term load based on ISSA-HKELM is provided, which comprises the following steps:
firstly, aiming at the defects of a kernel extreme learning machine KELM, combining a Gaussian kernel function and a polynomial kernel function to construct a hybrid kernel extreme learning machine HKELM with stronger generalization capability;
secondly, aiming at the problem that the sparrow search algorithm is easy to fall into a local extreme value, a self-adaptive t distribution strategy and a dynamic self-adaptive weight are introduced to improve the sparrow search algorithm;
then, optimizing the parameters of the hybrid kernel extreme learning machine HKELM by adopting the improved sparrow search algorithm ISSA and establishing an ISSA-HKELM prediction model;
and finally, performing short-term load prediction by adopting the established ISSA-HKELM model.
As a preferred technical scheme, the input and output variables of the ISSA-HKELM prediction model are selected as follows:
taking the historical load data of the day before the forecast day, the highest temperature, the lowest temperature, the average temperature, the relative humidity and the rainfall of the day before the forecast day and the highest temperature, the lowest temperature, the average temperature, the relative humidity and the rainfall of the day as input quantities of a model;
and taking the load data of the day of the forecast day as an output quantity.
As a preferable technical scheme, the hybrid kernel extreme learning machine HKELM effectively solves the problem of the weak generalization capability of the kernel extreme learning machine.
As a preferred technical solution, the adaptive t-distribution strategy specifically includes:
the sparrow position update with the adaptive t distribution is shown in the following equation:
x_i^t = x_i + x_i · t(k)
where x_i is the position of the ith sparrow individual; x_i^t is the position of the sparrow after t-distribution mutation; and t(k) is the t distribution whose degree-of-freedom parameter equals the iteration number k.
As a preferred technical scheme, when the current iteration number k is small, the t distribution approximates Cauchy-distribution mutation: the t-distribution operator takes large values with high probability and the mutation step is large, so the algorithm has good global search capability. In the middle of the iteration, the t distribution transitions from Cauchy-distribution mutation toward Gaussian-distribution mutation and the operator takes intermediate values, so the algorithm balances global and local search. When the iteration number k is large in the later stage, the t distribution approximates Gaussian-distribution mutation: the operator takes small values with high probability and the mutation step is small.
As a preferred technical solution, the dynamic adaptive weight specifically includes:
the idea of inertia weight is introduced into the position update formula of the discoverer in the sparrow search algorithm, namely, a dynamic weight factor ω is introduced into the discoverer's position update.
As a preferred technical solution, the value of the dynamic weight factor ω is related to the maximum number of iterations K and the current number of iterations K, and decreases as the current number of iterations K increases; in the initial stage of iteration, when the iteration times are small, the value of omega is large; and in the later iteration stage, when the iteration times are increased, the omega value is smaller.
As a preferred technical solution, the calculation formula of the dynamic weight factor ω and the improved discoverer position update are as follows:
ω = 1 - k/K
x_{i,j}^{k+1} = ω·x_{i,j}^k·exp(-i/(z·K)) + rand·(x_{b,j}^k - x_{i,j}^k),  if R2 < ST
x_{i,j}^{k+1} = ω·x_{i,j}^k + Q·L,  if R2 ≥ ST
where x_{b,j}^k denotes the jth-dimension global optimal solution of the previous generation; k denotes the current iteration number; j = 1, 2, …, d; x_{i,j}^k denotes the position of the ith sparrow in the jth dimension at the kth iteration; K denotes the maximum number of iterations; z is a random number in (0, 1]; rand is a random number between 0 and 1; R2 ∈ [0, 1] denotes the early-warning value; ST ∈ [0.5, 1] denotes the safety threshold; Q denotes a random value following a normal distribution; and L is a 1×d matrix whose elements are all 1.
As a preferred technical scheme, the specific process of the improved sparrow search algorithm ISSA for optimizing the parameters of the hybrid kernel extreme learning machine HKELM comprises the following steps:
Step 1: dividing the power distribution network data into a training set and a test set and performing normalization;
Step 2: initializing the parameters: the sparrow population size N, the maximum iteration number K, the safety threshold ST, the discoverer proportion PD, the alerter proportion SD, the t-distribution mutation probability p, and the optimization intervals of the model parameters;
Step 3: calculating the fitness value: the model learns the training samples to obtain the predicted value y'_i, and the mean square error between the predicted value y'_i and the true value y_i is taken as the fitness value;
Step 4: calculating and sorting the fitness values to obtain the best and worst individuals and their positions, selecting the sparrows with better fitness values as discoverers and the rest as followers;
Step 5: updating the positions of the discoverer, the follower and the alerter according to their respective position update formulas, calculating the corresponding fitness values after updating, comparing them with the current optimal fitness value, and updating the global optimal information;
Step 6: if rand < p, where rand is a random number between 0 and 1 and p is set to 0.5, performing the t-distribution mutation operation on the sparrows; otherwise, returning to Step 3;
Step 7: calculating the fitness values of the sparrows after t-distribution mutation; if the fitness of the new solution after the mutation operation is better than the global optimum, replacing the previous global optimum with the mutated value, otherwise retaining the previous global optimum;
Step 8: checking the termination condition: if the maximum number of iterations is reached, outputting the model parameters to obtain the optimal ISSA-HKELM model; otherwise, returning to Step 3.
As a preferred technical scheme, the root mean square error, the average absolute percentage error, the average absolute error and the fitting degree are selected as evaluation indexes of the ISSA-HKELM model.
Compared with the prior art, the invention has the following advantages:
1) The method optimizes parameters of HKELM through optimizing capability of ISSA (Improved Sparrow Search Algorithm), and mines useful information in load data through the HKELM, so that the load data is well fitted, and ultra-short-term power load prediction is accurately and efficiently completed.
2) Aiming at the weak generalization capability of the KELM model, a hybrid kernel extreme learning machine (HKELM) with stronger generalization capability is constructed by combining a Gaussian kernel function and a polynomial kernel function. Since the prediction performance of the HKELM model depends on its parameters, the sparrow search algorithm is adopted to optimize and select them. Aiming at the reduced population diversity and the tendency of the sparrow search algorithm to fall into local extrema in the later iteration stage, a short-term power load prediction model that optimizes the HKELM with an improved sparrow search algorithm is provided, which enhances the stability of model prediction and improves prediction accuracy.
Drawings
FIG. 1 is a topology diagram of the kernel extreme learning machine;
FIG. 2 is a graph of the polynomial kernel function;
FIG. 3 is a graph of the Gaussian kernel function;
FIG. 4 is a graph of the mixed kernel function;
FIG. 5 is a graph of the t, Gaussian and Cauchy distribution functions;
FIG. 6 is a flow chart of the ISSA-HKELM model;
FIG. 7 is a comparison of predicted results of different models.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, shall fall within the scope of protection of the present invention.
The invention relates to a short-term power load prediction method that optimizes a hybrid kernel extreme learning machine with an improved sparrow search algorithm. Firstly, a Gaussian kernel function and a polynomial kernel function are combined to construct a hybrid kernel extreme learning machine (HKELM) with stronger generalization capability. Then, aiming at the tendency of the sparrow search algorithm to fall into local extrema, an adaptive t-distribution strategy and a dynamic adaptive weight are introduced to improve it. Finally, the ISSA-HKELM model is established for short-term load prediction, and its prediction results are compared with those of the LSTM, DBN, KELM, HKELM, PSO-HKELM and SSA-HKELM models. The method comprises the following steps:
A. selection of input and output variables of the prediction model:
The power load data has dynamic characteristics. Taking one day as the unit, the historical load is divided into 96 points per day (one point every 15 minutes) to capture the load fluctuation pattern. The 96 historical load points of the day before the forecast day, together with the highest temperature, lowest temperature, average temperature, relative humidity and rainfall of both the day before the forecast day and the forecast day itself, form the model inputs (106 input quantities); the 96 load points of the forecast day form the outputs (96 output quantities) for model training and prediction.
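The input/output layout described above can be sketched in Python; this is a minimal illustration, and the function name and array layout are assumptions, not part of the patent:

```python
import numpy as np

def build_samples(load, weather):
    """Assemble ISSA-HKELM training samples as described above.

    load    : (D, 96) array, 96 load points per day (15-minute resolution)
    weather : (D, 5) array, [t_max, t_min, t_avg, humidity, rainfall] per day
    Returns X of shape (D-1, 106) and Y of shape (D-1, 96): each input row is
    the previous day's 96 load points plus the previous day's and the forecast
    day's 5 weather factors (96 + 5 + 5 = 106); the output is that day's load.
    """
    X, Y = [], []
    for d in range(1, load.shape[0]):
        X.append(np.concatenate([load[d - 1], weather[d - 1], weather[d]]))
        Y.append(load[d])
    return np.asarray(X), np.asarray(Y)
```

With D days of data this yields D-1 samples, since the first day has no preceding day to serve as input.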
B. Hybrid kernel extreme learning machine (HKELM)
The kernel extreme learning machine (KELM), a novel kernel-function-based single-hidden-layer feedforward neural network algorithm, performs well on nonlinear regression; the output of the network can be expressed as:
f(x)=h(x)β=Hβ (1)
where x represents the input data, f(x) represents the prediction output of the KELM model, h(x) represents the mapping function of the hidden layer, H represents the matrix obtained by kernel-function mapping of the input samples x_i, and β represents the weights between the hidden layer and the output layer. The kernel matrix of KELM is expressed as follows:
Ω_ELM = HH^T,  Ω_ELM(i, j) = h(x_i)·h(x_j) = K(x_i, x_j)   (2)
KELM replaces HH^T in the ELM with the kernel matrix Ω_ELM and maps the input samples into a high-dimensional hidden-layer feature space through a kernel function, where K(x_i, x_j) denotes the kernel function. Common kernel functions mainly fall into the following four types:
1) Gaussian kernel function
K(x_i, x_j) = exp(-||x_i - x_j||^2 / (2σ^2))   (3)
2) Polynomial kernel function
K(x_i, x_j) = (m(x_i · x_j) + n)^d,  d = 1, 2, …, N   (4)
3) Linear kernel function
K(x_i, x_j) = x_i · x_j   (5)
4) Perceptron kernel function
K(x_i, x_j) = tanh(β(x_i · x_j) + b)   (6)
The KELM topology is shown in FIG. 1, where x is the input vector of the model and x_1, x_2, …, x_N are the training-set samples. To prevent overfitting of the model, an identity matrix I_0 scaled by a penalty coefficient C is added to the kernel matrix Ω_ELM before β is computed, which gives the kernel extreme learning machine better generalization capability; the output and the output weight vector are shown in equations (7) and (8).
f(x) = [K(x, x_1), K(x, x_2), …, K(x, x_N)] · (I_0/C + Ω_ELM)^(-1) · T   (7)
β = (I_0/C + Ω_ELM)^(-1) · T   (8)
where T is the target output matrix.
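Equations (7) and (8) amount to a single regularized linear solve. The following NumPy sketch shows KELM training and prediction under these formulas with a Gaussian kernel; the function names are illustrative, not from the patent:

```python
import numpy as np

def gaussian_kernel(A, B, sigma):
    """Gaussian kernel matrix K(a, b) = exp(-||a - b||^2 / (2 sigma^2))."""
    # pairwise squared distances via the expansion ||a-b||^2 = a.a + b.b - 2 a.b
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def kelm_fit(X, T, C=100.0, sigma=1.0):
    """Solve beta = (I_0/C + Omega_ELM)^(-1) T, per Eq. (8)."""
    omega = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(np.eye(len(X)) / C + omega, T)

def kelm_predict(Xtr, beta, Xnew, sigma=1.0):
    """f(x) = [K(x, x_1), ..., K(x, x_N)] beta, per Eq. (7)."""
    return gaussian_kernel(Xnew, Xtr, sigma) @ beta
```

The penalty coefficient C trades training fit against smoothness: larger C weakens the regularization term I_0/C and pushes the model toward interpolating the training targets.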
The performance of the KELM is determined by its kernel function; here a combination of a polynomial kernel and a Gaussian kernel is used as the kernel function of the KELM to improve model performance. The polynomial kernel is a global kernel function with poor local learning capability. In the polynomial kernel curves of FIG. 2, the test point is X = 0.5, m = 1, n = 1, and d takes the values 1, 2, 3 and 4; the curves show that, owing to its global character, the polynomial kernel assigns large influence even to points far from the test point.
The Gaussian kernel is a local kernel function with good local learning capability, but it cannot accurately and effectively predict samples beyond a certain range. In the Gaussian kernel curves of FIG. 3, the test point is X = 0.5 and σ takes the values 0.1, 0.2, 0.3 and 0.4; the curves show that, owing to its local character, the Gaussian kernel assigns large influence only to points close to the test point.
The advantages of both can be combined in a mixed kernel function, so the two kernels are linearly combined:
K(x_i, x_j) = λ(m(x_i · x_j) + n)^d + (1 - λ)exp(-||x_i - x_j||^2 / (2σ^2))   (9)
In the mixed-kernel curves of FIG. 4, the test point is X = 0.5, λ takes the values 0.1, 0.2, 0.3 and 0.4, σ = 0.1 in the Gaussian kernel, and m = 1, n = 1, d = 2 in the polynomial kernel. The mixed kernel influences not only the sample points near the test point but also, to a certain degree, sample points at a distance from it; it therefore effectively combines the advantages of the Gaussian and polynomial kernels and makes up for the deficiencies of a single kernel.
Here σ, m, n and d are the mixed-kernel parameters and λ is the weight coefficient of the polynomial kernel, with d = 2. When the hybrid kernel extreme learning machine (HKELM) is trained, σ, m, n, λ and the penalty coefficient C are optimized by the improved sparrow search algorithm.
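A minimal sketch of the mixed kernel, assuming the convention that λ weights the polynomial term and (1 - λ) the Gaussian term, as the text describes λ as the weight coefficient of the polynomial kernel:

```python
import numpy as np

def mixed_kernel(A, B, lam, sigma, m=1.0, n=1.0, d=2):
    """Mixed kernel K = lam * polynomial + (1 - lam) * Gaussian.

    lam near 1 emphasizes the global polynomial term; lam near 0
    emphasizes the local Gaussian term.
    """
    poly = (m * (A @ B.T) + n) ** d
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    gauss = np.exp(-d2 / (2 * sigma**2))
    return lam * poly + (1 - lam) * gauss
```

Plugging this kernel into the KELM training solve in place of the single Gaussian kernel yields the HKELM; the ISSA then searches over (σ, m, n, λ, C).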
C. Optimizing the hybrid kernel extreme learning machine with the improved sparrow search algorithm (ISSA-HKELM)
The sparrow search algorithm (SSA) is a new swarm intelligence optimization algorithm proposed by Xue et al. in 2020, inspired mainly by the foraging and anti-predation behaviors of sparrows. The sparrow population matrix is as follows:
X = [x_1, x_2, …, x_N]^T,  x_i = [x_{i,1}, x_{i,2}, …, x_{i,d}]   (10)
where N denotes the size of the sparrow population, i = 1, 2, …, N, and d denotes the dimension of the variables.
The fitness matrix for a sparrow is expressed as follows:
F_x = [f(x_1), f(x_2), …, f(x_N)]^T   (11)
f(x_i) = [f(x_{i,1}), f(x_{i,2}), …, f(x_{i,d})]   (12)
where each value of F_x represents the fitness value of an individual. The sparrow population is divided into discoverers, followers and alerters. In each iteration, the sparrows with relatively good fitness values are selected as discoverers, generally 10%-20% of the population, which are mainly responsible for leading the population toward food; the rest act as followers, while the alerters are chosen at random as 10%-20% of the whole population.
The position update of the discoverer is as follows:
x_{i,j}^{k+1} = x_{i,j}^k · exp(-i/(z·K)),  if R2 < ST
x_{i,j}^{k+1} = x_{i,j}^k + Q·L,  if R2 ≥ ST   (13)
where k denotes the current iteration number; j = 1, 2, …, d; x_{i,j}^k denotes the position of the ith sparrow in the jth dimension at the kth iteration; K denotes the maximum number of iterations; z is a random number in (0, 1]; R2 ∈ [0, 1] denotes the early-warning value; ST ∈ [0.5, 1] denotes the safety threshold; Q denotes a random value following a normal distribution; and L is a 1×d matrix whose elements are all 1. When R2 < ST, the early-warning value is below the safety threshold: the sparrow population is in a safe place with no natural enemies nearby, and the discoverers can search the area widely. When R2 ≥ ST, the early-warning value exceeds the safety threshold: danger exists around the population, and the whole population needs to move to a safe place.
The follower's position update formula is as follows:
x_{i,j}^{k+1} = Q · exp((x_worst^k - x_{i,j}^k)/i^2),  if i > N/2
x_{i,j}^{k+1} = x_P^{k+1} + |x_{i,j}^k - x_P^{k+1}| · A^+ · L,  otherwise   (14)
where x_worst denotes the current global worst position; x_P denotes the optimal position occupied by the discoverer; A denotes a 1×d matrix whose elements are randomly assigned the value 1 or -1, and A^+ = A^T(AA^T)^(-1). When i > N/2, the ith follower has a poor fitness value and needs to fly to other areas to search for food.
The position update formula of the alerter is as follows:
x_{i,j}^{k+1} = x_best^k + β · |x_{i,j}^k - x_best^k|,  if f_i > f_g
x_{i,j}^{k+1} = x_{i,j}^k + M · (|x_{i,j}^k - x_worst^k| / ((f_i - f_w) + ε)),  if f_i = f_g   (15)
where x_best denotes the current global optimal position; β is a step-size control parameter, a random number obeying a normal distribution; M ∈ [-1, 1] is a random value; f_i denotes the fitness value of the current sparrow individual; and f_g and f_w denote the current global best and worst fitness values, respectively. To prevent the denominator (f_i - f_w) + ε from becoming 0, ε is set to a small constant, here 10^-8. When f_i > f_g, the current sparrow lies at the edge of the population and is easily exposed to danger; when f_i = f_g, the sparrow in the middle of the population perceives the danger and needs to move closer to the other sparrows.
Aiming at the defects of the Sparrow Search Algorithm, an Improved Sparrow Search Algorithm (ISSA) is proposed by improving the SSA: and a self-adaptive t distribution strategy and dynamic self-adaptive weight are introduced on the basis of SSA, so that the algorithm is prevented from falling into a local extreme value, and the convergence speed and the convergence precision of the algorithm are improved.
(1) Adaptive t-distribution strategy
The t distribution, short for Student's distribution, has a distribution function whose curve shape is determined by the degree-of-freedom parameter n. As can be seen from FIG. 5, when the degree of freedom n is small, the middle of the t-distribution curve is low and flat, the two tails are heavy, and the whole curve is smooth; when n = 1, the t distribution is the Cauchy distribution, i.e., t(n = 1) → C(0, 1). When n is large, the middle of the curve is higher and the whole curve is steeper; when n tends to infinity, the t distribution is the Gaussian distribution, i.e., t(n = ∞) → N(0, 1).
The sparrow position is updated with the adaptive t distribution as shown in the following equation:
x_i^t = x_i + x_i · t(k)   (16)
where x_i is the position of the ith sparrow individual; x_i^t is the position of the sparrow after t-distribution mutation; and t(k) is the t distribution whose degree-of-freedom parameter equals the iteration number k. This formula makes full use of the information of the current sparrow population. Early in the search, when the iteration number k is small, the t distribution approximates Cauchy-distribution mutation: the t-distribution operator takes large values with high probability and the mutation step is large, giving the algorithm good global search capability. In the middle of the search, the t distribution transitions from Cauchy-distribution mutation toward Gaussian-distribution mutation and the operator takes intermediate values, so the algorithm balances global and local search. Late in the search, when k is large, the t distribution approximates Gaussian-distribution mutation: the operator takes small values with high probability and the mutation step is small, giving the algorithm good local search capability. This strategy improves the comprehensive search capability of the algorithm.
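Equation (16) maps directly onto NumPy's Student-t sampler, with the iteration number supplying the degrees of freedom:

```python
import numpy as np

def t_mutation(x, k, rng=None):
    """Adaptive t-distribution mutation, Eq. (16): x' = x + x * t(k).

    Degrees of freedom = current iteration k, so early iterations behave
    like heavy-tailed Cauchy perturbations (global search) and late ones
    like Gaussian perturbations (local search).
    """
    rng = rng or np.random.default_rng()
    return x + x * rng.standard_t(df=k, size=x.shape)
```

In the ISSA this mutation is applied to a sparrow with probability p (here 0.5), and the mutated position is kept only if its fitness improves on the global optimum.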
(2) Dynamic adaptive weights
As can be seen from the discoverer's position update formula, as the algorithm iterates the discoverers converge toward the global optimal solution, the search space shrinks, and the algorithm easily falls into a local extremum. Therefore, the invention introduces the idea of inertia weight into the discoverer position update formula and improves it, namely, a dynamic weight factor ω is introduced into the discoverer's position update. The value of ω is related to the maximum iteration number K and the current iteration number k and decreases as k increases. Early in the iteration, when k is small, ω is large, so global search is performed well; later, as k grows, ω becomes small, so local search is performed well.
The global optimal solution of the previous generation is also introduced into the finder position update formula, so that both the previous-generation finder position and the previous-generation global optimum influence the current finder position, reducing the risk of the algorithm falling into a local optimum. The formula for the weight coefficient ω and the improved finder position update are as follows:
[Equation images for the weight coefficient ω and the improved finder position update are not reproduced in the text.]
In the formulas, the best-position symbol denotes the j-th dimension global optimal solution of the previous generation; rand denotes a random number between 0 and 1.
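The exact formula for ω is given only as an image in the source; purely as an illustration of the behaviour described above (large in early iterations, small in late ones), a simple linear decay can stand in:

```python
def dynamic_weight(k, K, w_max=0.9, w_min=0.4):
    """Illustrative dynamic weight factor: decreases from w_max to w_min
    as the current iteration k approaches the maximum iteration K.
    This linear decay is a stand-in; the patent's exact formula is
    given only as an equation image."""
    return w_max - (w_max - w_min) * k / K

K = 100
ws = [dynamic_weight(k, K) for k in range(1, K + 1)]
```

Early in the search ω stays near w_max (favouring global exploration); by iteration K it has decayed to w_min (favouring local exploitation).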
As shown in fig. 6, the prediction model based on the hybrid kernel extreme learning machine optimized by the improved sparrow search algorithm (ISSA-HKELM) is obtained by using the improved sparrow search algorithm (ISSA) to optimize the penalty coefficient C, the hybrid kernel parameters σ, m and n, and the weight coefficient λ of the hybrid kernel extreme learning machine (HKELM). The specific steps are as follows:
1) Divide the load data into a training set and a test set and normalize them.
2) Initialize the parameters: sparrow population size N, maximum iteration number K, safety threshold ST, finder proportion PD, scout proportion SD, t-distribution mutation probability p, and the search intervals of the model parameters.
3) Calculate the fitness value: the model learns the training samples, and the mean square error (MSE) between the predicted value y'_i and the true value y_i is taken as the fitness value. The MSE is computed as

MSE = (1/n) Σ_{i=1}^{n} (y'_i − y_i)²

4) Calculate and sort the fitness values to obtain the best and worst individuals and their positions; select the sparrows with better fitness values as finders and the rest as followers.
5) Update the positions of the followers, scouts and finders according to equations (13), (14) and (15) respectively; calculate the updated fitness values, compare them with the current best fitness value, and update the global best information.
6) If rand < p, apply the t-distribution mutation to the sparrow according to equation (16); otherwise return to step 3).
7) Calculate the fitness values of the sparrows after the t-distribution mutation; if the fitness of the new solution after the t mutation is better than the global optimum, replace the previous global optimum with the mutated value; otherwise keep the previous optimum.
8) Termination condition: if the maximum iteration number is reached, output the model parameters to obtain the optimal HKELM model; otherwise return to step 3).
9) Perform short-term power load prediction with the trained hybrid kernel extreme learning machine model.
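The nine steps above can be condensed into the following sketch, assuming a simplified position update in place of equations (13)-(15) and a toy sphere function standing in for the HKELM training MSE (function and parameter names are ours, not the patent's):

```python
import numpy as np

def issa_optimize(fitness, bounds, n_pop=20, K=50, p_t=0.5, seed=0):
    """Condensed sketch of steps 1)-9): improved sparrow search with
    t-distribution mutation. The real finder/follower/scout updates
    (equations (13)-(15)) are simplified here to a random move toward
    the best individual; `fitness` would be the HKELM training MSE."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = len(lo)
    pop = rng.uniform(lo, hi, size=(n_pop, dim))       # step 2: initialize
    fit = np.array([fitness(x) for x in pop])          # step 3: fitness
    best = pop[fit.argmin()].copy()
    best_fit = float(fit.min())
    for k in range(1, K + 1):
        for i in fit.argsort():                        # step 4: rank by fitness
            # step 5 (simplified): random move toward the current best
            cand = pop[i] + rng.uniform(-1.0, 1.0, dim) * (best - pop[i])
            if rng.random() < p_t:                     # step 6: t-mutation
                cand = cand + cand * rng.standard_t(df=k, size=dim)
            cand = np.clip(cand, lo, hi)
            if fitness(cand) < fit[i]:                 # step 7: greedy accept
                pop[i], fit[i] = cand, fitness(cand)
        if fit.min() < best_fit:                       # update global best
            best_fit = float(fit.min())
            best = pop[fit.argmin()].copy()
    return best, best_fit                              # step 8: output

# Toy stand-in for the 5 HKELM parameters (C, sigma, m, n, lambda):
# minimise a sphere function with optimum at 2.0 in each dimension.
lo, hi = np.zeros(5), np.full(5, 4.0)
best, best_fit = issa_optimize(lambda x: float(np.sum((x - 2.0) ** 2)), (lo, hi))
```

In the full method, step 9) would then train the HKELM with the returned parameters and run it on the test set.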
The root mean square error (RMSE), mean absolute percentage error (MAPE), mean absolute error (MAE) and fitting degree (R²) are selected as the evaluation indices of the model.
RMSE = sqrt( (1/n) Σ_{i=1}^{n} (y'_i − y_i)² )

MAPE = (1/n) Σ_{i=1}^{n} |(y'_i − y_i) / y_i| × 100%

MAE = (1/n) Σ_{i=1}^{n} |y'_i − y_i|

R² = 1 − Σ_{i=1}^{n} (y_i − y'_i)² / Σ_{i=1}^{n} (y_i − ȳ)²

where ȳ is the mean of the true values y_i.
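Assuming the standard definitions of the four indices (the source gives the formulas only as equation images), they can be computed as:

```python
import numpy as np

def metrics(y_true, y_pred):
    """Compute RMSE, MAE, MAPE (%) and the fitting degree R²,
    following the standard definitions of these indices."""
    e = y_pred - y_true
    rmse = np.sqrt(np.mean(e ** 2))
    mae = np.mean(np.abs(e))
    mape = np.mean(np.abs(e / y_true)) * 100.0
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return rmse, mae, mape, r2

# Small worked example with symmetric errors of magnitude 10
y = np.array([100.0, 200.0, 300.0, 400.0])
yp = np.array([110.0, 190.0, 310.0, 390.0])
rmse, mae, mape, r2 = metrics(y, yp)
```

For this example both RMSE and MAE equal 10, since every absolute error is 10; MAPE weights each error by the true value, and R² compares the residual sum of squares against the variance of the true series.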
The example analysis of the short-term load prediction model constructed by the invention is based on the historical load at 96 time points per day in a certain area from January 1, 2012 to January 17, 2015, together with relevant factors influencing the load, including the maximum temperature, minimum temperature, average temperature, relative humidity and rainfall; the multidimensional data are divided into input and output nodes for load prediction. The data are split into training data, covering January 1, 2012 to January 16, 2015, and test data, covering the last day. The prediction results of the seven models are shown in fig. 7, and the evaluation index values are listed in table 1.
TABLE 1
Model        RMSE      MAE       MAPE      R²
LSTM         582.93    487.44    7.27%     0.95502
DBN          2126.80   1695.07   26.32%    0.83482
KELM         212.87    186.69    2.53%     0.99643
HKELM        234.69    212.90    2.9758%   0.99632
PSO-HKELM    103.20    82.62     1.22%     0.99649
SSA-HKELM    78.68     64.62     0.94%     0.99746
ISSA-HKELM   61.71     49.82     0.77%     0.99791
As the evaluation index values in table 1 show, compared with the LSTM, DBN, KELM, HKELM, PSO-HKELM and SSA-HKELM models, the ISSA-HKELM model proposed in this example reduces the root mean square error (RMSE), mean absolute percentage error (MAPE) and mean absolute error (MAE) of the prediction results to a certain extent, where a smaller error index indicates a better prediction effect; its fitting degree (R²) is the highest, where a higher fitting degree indicates a more accurate prediction. In conclusion, the method effectively suppresses the randomness of the prediction, reduces the error of the predicted values and markedly improves the prediction accuracy, which further demonstrates that the ISSA-HKELM model has certain advantages in the field of short-term power load prediction.
While the invention has been described with reference to specific embodiments, the invention is not limited thereto, and various equivalent modifications and substitutions can be easily made by those skilled in the art within the technical scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A short-term load prediction method based on ISSA-HKELM is characterized by comprising the following steps:
firstly, aiming at the defects of a kernel extreme learning machine KELM, combining a Gaussian kernel function and a polynomial kernel function to construct a hybrid kernel extreme learning machine HKELM with stronger generalization capability;
secondly, aiming at the problem that the sparrow search algorithm is easy to fall into a local extreme value, a self-adaptive t distribution strategy and a dynamic self-adaptive weight are introduced to improve the sparrow search algorithm;
thirdly, optimizing the parameters of the hybrid kernel extreme learning machine HKELM by using the improved sparrow search algorithm ISSA and establishing the ISSA-HKELM prediction model;
and finally, performing short-term load prediction by adopting the established ISSA-HKELM model.
2. The ISSA-HKELM based short-term load prediction method according to claim 1, characterized in that the input and output variables of the ISSA-HKELM prediction model are selected as follows:
taking the historical load data of the day before the forecast day, the highest temperature, the lowest temperature, the average temperature, the relative humidity and the rainfall of the day before the forecast day and the highest temperature, the lowest temperature, the average temperature, the relative humidity and the rainfall of the day as input quantities of a model;
and taking the load data of the day of the forecast day as an output quantity.
3. The ISSA-HKELM based short-term load prediction method according to claim 1, characterized in that the hybrid kernel extreme learning machine HKELM effectively solves the problem of the weak generalization capability of the kernel extreme learning machine.
4. The ISSA-HKELM based short-term load prediction method according to claim 1, wherein the adaptive t-distribution strategy is specifically as follows:
the sparrow position update with the adaptive t-distribution mutation is shown in the following equation:

x_i^t = x_i + x_i · t(k)

wherein x_i is the position of the i-th sparrow individual, x_i^t is the position of the sparrow after the t-distribution mutation, and t(k) represents the t distribution whose degrees-of-freedom parameter is the iteration number k.
5. The ISSA-HKELM-based short-term load prediction method according to claim 4, characterized in that when the current iteration number k is small, the t distribution is close to Cauchy-distribution mutation, the t-distribution operator produces large values with high probability, and the position-mutation step is large, so the algorithm has good global search capability; in the middle stage of iteration, the t distribution transitions from Cauchy-distribution mutation to Gaussian-distribution mutation and the operator's probability values are moderate, so the algorithm balances global and local search capability; in the late stage, when the iteration number k is large, the t distribution is close to Gaussian-distribution mutation, large values of the t-distribution operator become improbable, and the position-mutation step is small.
6. The ISSA-HKELM based short-term load prediction method according to claim 1, wherein the dynamic adaptive weights are specifically:
the idea of inertia weight is introduced into the finder position update formula of the sparrow search algorithm and improved, i.e., a dynamic weight factor ω is introduced into the finder position update.
7. The ISSA-HKELM based short-term load prediction method according to claim 6, characterized in that the value of the dynamic weight factor ω is related to the maximum number of iterations K and the current number of iterations K, and decreases with the increase of the current number of iterations K; in the initial stage of iteration, when the iteration times are small, the value of omega is large; and in the later iteration stage, when the iteration times are increased, the omega value is smaller.
8. The ISSA-HKELM based short-term load prediction method according to claim 6, wherein the calculation formula of the dynamic weight factor ω and the improved updating manner of the finder position are as follows:
[Equation images for the dynamic weight factor ω and the improved finder position update are not reproduced in the text.]
wherein the best-position symbol denotes the j-th dimension global optimal solution of the previous generation; k represents the current iteration number; j = 1, 2, …, d; X_{i,j}^k denotes the position of the i-th sparrow in the j-th dimension at the k-th iteration; K represents the maximum iteration number; z is a random number in (0, 1]; R_2 represents the early-warning value, R_2 ∈ [0, 1]; ST represents the safety threshold, ST ∈ [0.5, 1]; Q represents a random value following a normal distribution; L is a 1 × d matrix whose elements are all 1.
9. The ISSA-HKELM-based short-term load prediction method according to claim 1, wherein the improved sparrow search algorithm ISSA optimizes the parameters of the hybrid kernel extreme learning machine HKELM as follows:
Step 1: divide the power distribution network data into a training set and a test set and normalize them;
Step 2: initialize the parameters: sparrow population size N, maximum iteration number K, safety threshold ST, finder proportion PD, scout proportion SD, t-distribution mutation probability p, and the search intervals of the model parameters;
Step 3: calculate the fitness value: the model learns the training samples, and the mean square error between the predicted value y'_i and the true value y_i is taken as the fitness value;
Step 4: calculate and sort the fitness values to obtain the best and worst individuals and their positions; select the sparrows with better fitness values as finders and the rest as followers;
Step 5: update the positions of the followers, scouts and finders according to their respective position update formulas; calculate the updated fitness values, compare them with the current best fitness value, and update the global best information;
Step 6: if rand < p, apply the t-distribution mutation to the sparrow, where rand is a random number between 0 and 1 and p takes the value 0.5; otherwise return to step 3;
Step 7: calculate the fitness values of the sparrows after the t-distribution mutation; if the fitness of the new solution after the t mutation is better than the global optimum, replace the previous global optimum with the mutated value; otherwise keep the previous optimum;
Step 8: termination condition: if the maximum iteration number is reached, output the model parameters to obtain the optimal ISSA-HKELM model; otherwise return to step 3.
10. The ISSA-HKELM-based short-term load prediction method according to claim 1 or 9, characterized in that the root mean square error, the mean absolute percentage error, the mean absolute error and the fitting degree are selected as evaluation indexes of the ISSA-HKELM model.
CN202211199181.6A 2022-09-29 2022-09-29 ISSA-HKELM-based short-term load prediction method Pending CN115545294A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211199181.6A CN115545294A (en) 2022-09-29 2022-09-29 ISSA-HKELM-based short-term load prediction method


Publications (1)

Publication Number Publication Date
CN115545294A true CN115545294A (en) 2022-12-30

Family

ID=84731776

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211199181.6A Pending CN115545294A (en) 2022-09-29 2022-09-29 ISSA-HKELM-based short-term load prediction method

Country Status (1)

Country Link
CN (1) CN115545294A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116432531A (en) * 2023-04-17 2023-07-14 北方工业大学 Bearing residual service life prediction method based on improved nuclear extreme learning machine



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination