CN103020728A - Method for predicting short-term substation power quality in an electrical power system - Google Patents

Method for predicting short-term substation power quality in an electrical power system

Info

Publication number
CN103020728A
CN103020728A (application CN201210434409A)
Authority
CN
China
Prior art keywords
neural network
layer
output
input
function
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN 201210434409
Other languages
Chinese (zh)
Inventor
薛俊茹
宋锐
张海宁
孔祥鹏
丛贵斌
刘可
赵世昌
王轩
梁英
李春来
杨�嘉
马勇飞
杨军
杨立滨
张�杰
张展
杜永涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Yitongyu Science & Technology Development Co Ltd
QINGDAO ELECTRIC POWER RESEARCH INSTITUTE
QINGHAI DIANYAN TECHNOLOGY Co Ltd
QINGHAI ELECTRIC POWER CO Ltd
State Grid Corp of China SGCC
Original Assignee
Beijing Yitongyu Science & Technology Development Co Ltd
QINGDAO ELECTRIC POWER RESEARCH INSTITUTE
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Yitongyu Science & Technology Development Co Ltd, QINGDAO ELECTRIC POWER RESEARCH INSTITUTE filed Critical Beijing Yitongyu Science & Technology Development Co Ltd
Priority to CN 201210434409 priority Critical patent/CN103020728A/en
Publication of CN103020728A publication Critical patent/CN103020728A/en
Pending legal-status Critical Current


Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention relates to the technical field of power grid management, and specifically to a method for predicting short-term substation power quality in an electrical power system. The disclosed method comprises the steps of: (1) introducing a BP (back-propagation) neural network; (2) learning with a three-layer BP neural network of BP neurons; and (3) designing the BP neural network. The method predicts short-term substation power quality in the electrical power system with a neural network. Its advantages are that it can model multiple variables without requiring complex correlation assumptions about the input vector, and that it does not depend on expert advice: using only observed data, it samples and approximates the hidden nonlinear input/output relation through learning during training. Research in recent years shows that, compared with statistical techniques and expert-system methods, predicting short-term substation power quality with neural network technology can achieve higher precision.

Description

Method for predicting short-term substation power quality in an electrical power system
Technical field
The present invention relates to the technical field of power grid management, and specifically to a method for predicting short-term substation power quality in an electrical power system.
Background technology
Prediction of short-term substation power-quality data in an electrical power system can reach high precision. The power system load is affected in many ways: on the one hand, load variation contains random fluctuations caused by unknown and uncertain factors; on the other hand, it also shows periodic regularity, which gives load curves their similarity, so the power-quality data of a substation fluctuates as well.
Summary of the invention
The object of the invention is to remedy the deficiencies of the prior art by providing a method for predicting short-term substation power quality in an electrical power system.
The method for predicting short-term substation power quality in an electrical power system of the present invention is achieved by the following technical scheme. A method for predicting short-term substation power quality in an electrical power system, characterized in that the method comprises the following steps:
(1) BP neural network introduction step: a BP neural network, i.e. an error back-propagation neural network, is adopted; it is the most widely used kind of artificial neural network. In practical applications of artificial neural networks, BP neural networks are widely used in fields such as function approximation, pattern recognition and classification, data compression and prediction, and 80%-90% of artificial neural network models are BP neural networks or variations of them;
(2) BP neuron and three-layer BP neural network learning step
The BP neuron is the most basic building block of the BP neural network. Let x1, x2, …, xn be the inputs of a BP neuron, w1, w2, …, wn the connection weights between BP neurons, θ the threshold, f the neuron transfer function and y the neuron output; then

y = f( Σ(i=1..n) wi·xi − θ )   (1)

The BP neural network input vector is X = (x1, x2, …, xn); the target vector is Y = (y1, y2, …, yq); the input vector of the middle-layer units is S = (s1, s2, …, sp) and the middle-layer output vector is B = (b1, b2, …, bp); the input vector of the output-layer units is L = (l1, l2, …, lq) and the network output vector is C = (c1, c2, …, cq). The connection weights from the input layer to the middle layer are wij (i = 1, 2, …, n; j = 1, 2, …, p); the connection weights from the middle layer to the output layer are vjt (j = 1, 2, …, p; t = 1, 2, …, q); the thresholds of the middle-layer units are θj (j = 1, 2, …, p) and the thresholds of the output-layer units are γt (t = 1, 2, …, q); the learning-rate parameters are α and β (0 < α < 1, 0 < β < 1).
The learning process of the three-layer BP neural network is as follows:
1) Initialization: give each connection weight wij and vjt and each threshold θj and γt a random value in the interval [−1, +1];
2) Choose at random one group of input and target samples (X, Y) and present it to the BP neural network;
3) From the input sample X, the connection weights wij and the thresholds θj, calculate the input sj of each middle-layer unit, then calculate the output bj of each middle-layer unit through the transfer function f:

sj = Σ(i=1..n) wij·xi − θj ,  j = 1, 2, …, p   (2)

bj = f(sj) ,  j = 1, 2, …, p   (3)

4) From the middle-layer outputs bj, the connection weights vjt and the thresholds γt, calculate the input lt of each output-layer unit, then calculate the response ct of each output-layer unit through the transfer function:

lt = Σ(j=1..p) vjt·bj − γt ,  t = 1, 2, …, q   (4)

ct = f(lt) ,  t = 1, 2, …, q   (5)

5) From the network target vector Y and the actual network output C, calculate the generalization error dt of each output-layer unit:

dt = (yt − ct)·ct·(1 − ct) ,  t = 1, 2, …, q   (6)

6) From the connection weights vjt, the output-layer generalization errors dt and the middle-layer outputs bj, calculate the generalization error ej of each middle-layer unit:

ej = [ Σ(t=1..q) dt·vjt ]·bj·(1 − bj) ,  j = 1, 2, …, p   (7)

7) Use the output-layer generalization errors dt and the middle-layer outputs bj to correct the connection weights vjt and the thresholds γt:

vjt := vjt + α·dt·bj ,  j = 1, 2, …, p; t = 1, 2, …, q   (8)

γt := γt − α·dt ,  t = 1, 2, …, q   (9)

8) Use the middle-layer generalization errors ej and the inputs xi of the input-layer units to correct the connection weights wij and the thresholds θj:

wij := wij + β·ej·xi ,  i = 1, 2, …, n; j = 1, 2, …, p   (10)

θj := θj − β·ej ,  j = 1, 2, …, p   (11)

9) Choose at random the next learning sample vector and present it to the network; return to step 3) until all m training samples have been trained;
10) Again choose at random one group of input and target samples from the m learning samples and return to step 3), until the network global error E is less than a preset minimum value, i.e. the network converges; if the number of learning iterations exceeds a preset value, the network fails to converge;
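The learning steps 1) to 10) above can be sketched as a short program. The following is an illustrative NumPy re-implementation, not the patent's own code: the layer sizes, learning rates and the single made-up training sample are assumptions for the demonstration, and logsig is used as the transfer function for both layers.

```python
import numpy as np

def logsig(n):
    """S-type logarithmic transfer function, values in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-n))

def train_step(x, y, w, theta, v, gamma, alpha=0.5, beta=0.5):
    """One pass of learning steps 3)-8) for a single sample (x, y)."""
    s = w.T @ x - theta           # eq. (2): middle-layer inputs
    b = logsig(s)                 # eq. (3): middle-layer outputs
    l = v.T @ b - gamma           # eq. (4): output-layer inputs
    c = logsig(l)                 # eq. (5): network outputs
    d = (y - c) * c * (1 - c)     # eq. (6): output-layer generalization error
    e = (v @ d) * b * (1 - b)     # eq. (7): middle-layer generalization error
    v += alpha * np.outer(b, d)   # eq. (8): correct middle->output weights
    gamma -= alpha * d            # eq. (9): correct output-layer thresholds
    w += beta * np.outer(x, e)    # eq. (10): correct input->middle weights
    theta -= beta * e             # eq. (11): correct middle-layer thresholds
    return c

rng = np.random.default_rng(0)
n, p, q = 4, 6, 2                        # layer sizes (arbitrary for the demo)
w = rng.uniform(-1.0, 1.0, (n, p))       # step 1): random values in [-1, +1]
theta = rng.uniform(-1.0, 1.0, p)
v = rng.uniform(-1.0, 1.0, (p, q))
gamma = rng.uniform(-1.0, 1.0, q)
x = rng.uniform(0.0, 1.0, n)             # one made-up training sample
y = np.array([0.2, 0.8])
for _ in range(5000):                    # steps 2)-10), one sample repeated
    c = train_step(x, y, w, theta, v, gamma)
print("converged:", bool(np.max(np.abs(y - c)) < 0.05))
```

Repeatedly presenting one sample suffices to show the weight and threshold corrections driving the output toward the target; with a real training set, steps 9) and 10) would cycle over all m samples instead.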
(3) BP neural network design step
The design of the BP neural network mainly covers the input layer, the hidden layer, the output layer and the transfer functions between the layers; the software used is MATLAB;
1) Number of network layers: the BP neural network is a multilayer feed-forward network. According to neural network theory, any given continuous function can be realized by a three-layer BP neural network by adjusting the connection weights between the input-layer and hidden-layer nodes and between the hidden-layer and output-layer nodes. The BP neural network chosen in this scheme therefore has a three-layer structure: an input layer, an output layer and one hidden layer;
2) Input and output data are chosen as follows: if a substation records one point every 3 minutes, there are 480 points in one day. The continuous 5-day data of the first day to the fifth day are taken as the sample input and the data of the 9th day as the sample output; the mapping relation is established to obtain the data model of this index;
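The sample construction just described can be sketched as follows. This is an illustrative Python fragment: the measurement series is synthetic, since the patent publishes no data, and the helper `make_sample` is a name introduced here.

```python
import numpy as np

POINTS_PER_DAY = 480  # one measurement every 3 minutes: 24*60/3 = 480

def make_sample(series, first_day):
    """Build one (input, output) sample pair from a per-point series:
    input = days first_day .. first_day+4 (5 consecutive days),
    output = day first_day+8 (the 9th day counted from first_day)."""
    day = lambda k: series[k * POINTS_PER_DAY:(k + 1) * POINTS_PER_DAY]
    x = np.concatenate([day(first_day + i) for i in range(5)])
    t = day(first_day + 8)
    return x, t

# synthetic 10-day record of one power-quality index (illustrative only)
series = np.sin(np.linspace(0.0, 20.0 * np.pi, 10 * POINTS_PER_DAY))
x, t = make_sample(series, first_day=0)
print(x.shape, t.shape)  # input holds 5*480 points, output holds 480 points
```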
3) Transfer function: the transfer function is an important component of the BP neural network. BP neural networks usually adopt the S-type logarithmic or tangent function, and in some specific situations a purely linear (purelin) function may also be adopted.
The logsig function, i.e. the S-type logarithmic function, returns values in the interval (0, 1):

logsig(n) = 1 / (1 + e^(−n))   (12)

The tansig function, i.e. the S-type tangent function, returns values in the interval (−1, 1):

tansig(n) = 2 / (1 + e^(−2n)) − 1   (13)

Here the BP neural network adopts the logsig function from the input layer to the hidden layer and the tansig function from the hidden layer to the output layer;
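Equations (12) and (13) can be checked numerically. The sketch below is illustrative Python rather than the MATLAB functions the patent names; MATLAB's logsig and tansig compute the same formulas.

```python
import numpy as np

def logsig(n):
    """Eq. (12): S-type logarithmic function, values in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-n))

def tansig(n):
    """Eq. (13): S-type tangent function, values in (-1, 1)."""
    return 2.0 / (1.0 + np.exp(-2.0 * n)) - 1.0

n = np.linspace(-5.0, 5.0, 101)
assert np.all((logsig(n) > 0) & (logsig(n) < 1))     # range of eq. (12)
assert np.all((tansig(n) > -1) & (tansig(n) < 1))    # range of eq. (13)
assert np.allclose(tansig(n), np.tanh(n))            # tansig is tanh
assert np.allclose(tansig(n), 2.0 * logsig(2.0 * n) - 1.0)  # rescaled logsig
```

The last two identities show why either function works as an S-type transfer function: tansig is simply the hyperbolic tangent, i.e. logsig rescaled from (0, 1) to (−1, 1).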
Creation of the BP neural network: net1 = newff(p, t, [40,20], {'logsig','tansig'});
Here newff() is the BP neural network creation function; p is the input data and t the output data; [40,20] means the BP neural network hidden layer has 40 nodes and the output layer has 20 nodes; {'logsig','tansig'} means the transfer function of the hidden layer is 'logsig' and the transfer function of the output layer is 'tansig';
4) Setting the parameters of the model:
net1.trainParam.epochs = 1000;  % maximum number of training iterations
net1.trainParam.lr = 0.01;      % learning rate
net1.trainParam.goal = 0.00001; % target training error
5) Training of the BP neural network: the BP neural network must be trained after it is created;
net2=train(net1,p,t);
Here train() is the BP neural network training function; net1 is the network just created; p is the input training sample data; t is the target training sample data; net2 is the trained BP neural network;
6) Testing of the BP neural network: the performance of the BP neural network depends mainly on whether it generalizes well. Generalization ability cannot be tested on the training-sample data; it must be tested with test-sample data outside the training samples;
Y=sim(net2,ptest);
Here sim() simulates the BP neural network; net2 is the trained network; ptest is the input test sample data; Y is the test result of the BP neural network;
E=Y-L;
Here Y is the test result of the BP neural network, L is the target output of the test samples, and E is the error of the BP neural network on the test sample data;
mae(E), i.e. the mean absolute error, is a performance evaluation function of the BP neural network;
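The error evaluation of this step can be sketched in Python (illustrative only: `mae` here mirrors MATLAB's mean-absolute-error performance function, and the test values are made up).

```python
import numpy as np

def mae(E):
    """Mean absolute error: the performance measure named above."""
    return np.mean(np.abs(E))

Y = np.array([0.98, 1.02, 0.95, 1.01])  # network test results (made up)
L = np.array([1.00, 1.00, 1.00, 1.00])  # target test outputs (made up)
E = Y - L                               # per-point test error
print(round(mae(E), 3))                 # prints 0.025
```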
7) Save the model with good generalization ability obtained above; in subsequent data prediction, every 5 consecutive days of data can be used as input, and the model predicts the data of the following 9th day.
Compared with the prior art, the method for predicting short-term substation power quality in an electrical power system of the present invention has the following beneficial effects. The present invention adopts a neural network to predict short-term substation power quality in the electrical power system. Its advantage is that it can model multiple variables without making complex correlation assumptions about the input variables. It does not rely on expert experience; using only observed data, it samples and approximates the implicit nonlinear input/output relation through learning during training. Recent studies show that, relative to the two prior-art methods of statistical techniques and expert systems, using neural network technology to predict short-term substation power quality in an electrical power system can achieve higher precision.
Embodiment
The technical scheme of the method for predicting short-term substation power quality in an electrical power system of the present invention is further described below in conjunction with an embodiment.
Embodiment 1.
The method for predicting short-term substation power quality in an electrical power system of this embodiment comprises steps (1) to (3) exactly as set out in the summary of the invention above.

Claims (1)

1. transformer station's short-term powerquality Forecasting Methodology in the electric system, it is characterized in that: described method comprises the steps:
(1) the BP neural network is introduced step and is adopted the BP neural network, and namely the error back propagation neural network is a kind of neural network most widely used in the artificial neural network; In the practical application of artificial neural network, the BP neural network is widely used in the fields such as approximation of function, pattern-recognition and classification, data compression, prediction, and the artificial nerve network model of 80%-90% all is the variation of adopting BP neural network or BP neural network;
(2) three layers of BP neural network learning of BP neuron step BP neuron is the most basic ingredient of BP neural network, wherein,
Figure 946115DEST_PATH_IMAGE001
Be the neuronic input of BP,
Figure 571000DEST_PATH_IMAGE002
Represent the connection weights between the BP neuron, Be threshold value, fBe the neuronic transition function of BP, y is the neuronic output of BP, has
Figure 713454DEST_PATH_IMAGE004
(1)
BP neural network input vector is
Figure 19714DEST_PATH_IMAGE005
BP neural network object vector is
Figure 69578DEST_PATH_IMAGE006
BP neural network middle layer elements input vector is
Figure 418520DEST_PATH_IMAGE007
, BP neural network output vector is
Figure 666968DEST_PATH_IMAGE008
BP neural network output layer unit input vector is
Figure 138269DEST_PATH_IMAGE009
, BP neural network output vector is
Figure 737747DEST_PATH_IMAGE010
BP neural network input layer to the connection weights in middle layer are
Figure 884520DEST_PATH_IMAGE011
,
Figure 253053DEST_PATH_IMAGE012
,
Figure 567360DEST_PATH_IMAGE013
BP neural network middle layer to the connection weights of output layer are
Figure 591816DEST_PATH_IMAGE014
,
Figure 282561DEST_PATH_IMAGE013
,
Figure 895814DEST_PATH_IMAGE015
Each unit, BP neural network middle layer is output as
Figure 386881DEST_PATH_IMAGE016
,
Figure 898634DEST_PATH_IMAGE013
Each unit of BP neural network output layer is output as
Figure 455386DEST_PATH_IMAGE017
, Parameter
Figure 454621DEST_PATH_IMAGE019
The learning process of three layers of BP neural network is as follows:
1), initialization connects weights to each
Figure 188090DEST_PATH_IMAGE020
With
Figure 548533DEST_PATH_IMAGE014
, threshold value
Figure 763744DEST_PATH_IMAGE021
With
Figure 856333DEST_PATH_IMAGE017
Give the random value in the interval [1 ,+1];
2), choose at random one group of input and target sample
Figure 139416DEST_PATH_IMAGE022
, Offer the BP neural network;
3), with the input sample , connect weights
Figure 228967DEST_PATH_IMAGE023
And threshold value
Figure 5205DEST_PATH_IMAGE021
Calculate the input of each unit, middle layer
Figure 379554DEST_PATH_IMAGE024
, then use
Figure 489462DEST_PATH_IMAGE024
Calculate the output of each unit, middle layer by transport function
Figure 923854DEST_PATH_IMAGE025
:
Figure 119212DEST_PATH_IMAGE026
Figure 297252DEST_PATH_IMAGE013
(2)
Figure 929277DEST_PATH_IMAGE013
(3)
4), utilize the output in middle layer
Figure 402809DEST_PATH_IMAGE025
, connect weights
Figure 384541DEST_PATH_IMAGE014
And threshold value
Figure 203461DEST_PATH_IMAGE017
Calculate the output of each unit of output layer
Figure DEST_PATH_IMAGE028
, the response of then calculating each unit of output layer by transport function
Figure 166607DEST_PATH_IMAGE029
:
Figure DEST_PATH_IMAGE030
Figure 529367DEST_PATH_IMAGE015
(4)
Figure 111527DEST_PATH_IMAGE031
Figure 784954DEST_PATH_IMAGE015
(5)
5), utilize the network objectives vector
Figure 59946DEST_PATH_IMAGE006
, the actual output of network , each unit vague generalization error of calculating output layer
Figure DEST_PATH_IMAGE033
:
Figure 41993DEST_PATH_IMAGE034
Figure 569926DEST_PATH_IMAGE015
(6)
6), utilize the connection weights
Figure 78137DEST_PATH_IMAGE014
, output layer the vague generalization error Output with the middle layer
Figure DEST_PATH_IMAGE036
Calculate the vague generalization error of each unit, middle layer
Figure 5696DEST_PATH_IMAGE037
:
Figure 721891DEST_PATH_IMAGE038
Figure 276369DEST_PATH_IMAGE013
(7)
7), utilize the vague generalization error of each unit of output layer
Figure 439366DEST_PATH_IMAGE039
Output with each unit, middle layer Revise the connection weights
Figure 291654DEST_PATH_IMAGE041
And threshold value :
Figure 700650DEST_PATH_IMAGE013
Figure 554205DEST_PATH_IMAGE015
Figure 757653DEST_PATH_IMAGE044
(8)
Figure 380265DEST_PATH_IMAGE045
Figure 73283DEST_PATH_IMAGE013
Figure 476451DEST_PATH_IMAGE015
Figure 483591DEST_PATH_IMAGE044
(9)
8), utilize the vague generalization error of each unit, middle layer
Figure 966568DEST_PATH_IMAGE037
, the input of each unit of input layer
Figure 96067DEST_PATH_IMAGE046
Revise the connection weights
Figure 658635DEST_PATH_IMAGE047
And threshold value
Figure 266203DEST_PATH_IMAGE021
:
Figure 898227DEST_PATH_IMAGE012
Figure 965670DEST_PATH_IMAGE013
(10)
Figure DEST_PATH_IMAGE050
Figure 890955DEST_PATH_IMAGE012
(11)
9), choose at random next learning sample vector and offer network, turn back to step 3), until Individual training sample training is complete;
10), again from Choose at random one group of input sample and target sample in the individual learning sample, return step 3), until network global error E is less than predefined minimal value, i.e. a network convergence; If the study number of times is greater than predefined value, network just can't be restrained;
(3) design procedure of BP neural network
The design of BP neural network mainly comprises the several aspects of the transport function between input layer, hidden layer, output layer and each layer, and the software of use is matlab;
1), network number of plies BP neural network is a kind of multilayer feedforward type network, theory according to neural network, given any one continuous function, all can be realized by three layers of BP neural network, between each node of input layer and hidden layer, can be connected by the weights of adjusting between each node of hidden layer and output layer; The BP neural network that this programme is chosen is three-decker, i.e. input layer, output layer and a hidden layer;
2), inputoutput data is chosen as follows, if certain transformer station is to record a point in per 3 minutes, 480 points were just arranged so in one day, with the input of five days continuous 5 day data of first day to the as sample, with the output of the 9th day data as sample, set up mapping relations, obtain the data model of this index;
3), transition function BP neural network transport function is the important component part of BP neural network, and the BP neural network often adopts logarithm or the tan of S type, also may adopt pure linearity (purelin) function in some specific situation;
The logsig function, i.e. the logarithmic function of S type, function return value is positioned in the interval [0,1];
Figure 223180DEST_PATH_IMAGE053
(12)
The tansig function, i.e. the S-shaped tangent function, returns values in the interval [-1, 1]:
tansig(n) = 2 / (1 + e^(-2n)) - 1    (13)
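The two transfer functions of equations (12) and (13) can be checked with a short sketch (Python/NumPy here, standing in for MATLAB's built-in logsig and tansig):

```python
import numpy as np


def logsig(n):
    """S-shaped logarithmic function, eq. (12); output in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-n))


def tansig(n):
    """S-shaped tangent function, eq. (13); output in (-1, 1).

    Algebraically identical to tanh(n), written out as eq. (13) defines it.
    """
    return 2.0 / (1.0 + np.exp(-2.0 * n)) - 1.0
```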
From the input layer to the hidden layer, and from the hidden layer to the output layer, the BP neural network adopts the logsig function and the tansig function respectively as transfer functions;
The creation of the BP neural network: net1 = newff(p, t, [40,20], {'logsig','tansig'});
Wherein, newff() is the creation function of the BP neural network; p is the input data and t is the output data; [40,20] means the BP neural network has 40 hidden-layer nodes and 20 output-layer nodes; {'logsig','tansig'} means the transfer function of the hidden layer of the BP neural network is 'logsig' and the transfer function of the output layer is 'tansig';
4), set the parameters of the model:
net1.trainParam.epochs = 1000;   % maximum number of training iterations
net1.trainParam.lr = 0.01;       % learning rate
net1.trainParam.goal = 0.00001;  % target training error
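Taken together, the three-layer network created above computes the following forward pass (a Python/NumPy sketch of what the text specifies: logsig at the hidden layer, tansig at the output layer, with 40 hidden and 20 output nodes; the weight matrices here are random stand-ins for the values that training would produce):

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_HIDDEN, N_OUT = 2400, 40, 20  # 5 days * 480 points in; sizes [40,20] from the text
W1 = rng.standard_normal((N_HIDDEN, N_IN)) * 0.01   # input -> hidden weights (stand-ins)
b1 = np.zeros(N_HIDDEN)                             # hidden-layer biases
W2 = rng.standard_normal((N_OUT, N_HIDDEN)) * 0.01  # hidden -> output weights (stand-ins)
b2 = np.zeros(N_OUT)                                # output-layer biases


def forward(x):
    """Forward pass of the three-layer BP network."""
    h = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))  # logsig hidden layer, eq. (12)
    return np.tanh(W2 @ h + b2)               # tansig output layer, eq. (13)


y = forward(np.zeros(N_IN))  # one prediction vector of 20 values in (-1, 1)
```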
5), training of the BP neural network. After creation, the BP neural network must be trained:
net2=train(net1,p,t);
Wherein, train() is the training function of the BP neural network; net1 is the BP neural network just created; p is the input training sample data of the BP neural network; t is the target training sample data of the BP neural network; net2 is the trained BP neural network;
6), testing of the BP neural network. The quality of a BP neural network is judged mainly by whether it has good generalization ability; generalization cannot be tested with the training sample data, so test sample data outside the training set must be used:
Y=sim(net2,ptest);
Wherein, sim() simulates the BP neural network; net2 is the trained BP neural network; ptest is the input test sample data of the BP neural network; Y is the test result of the BP neural network;
E=Y-L;
Wherein, Y is the test result of the BP neural network, L is the output (target) test sample data of the BP neural network, and E is the absolute error on the test sample data;
mae(E) computes the mean absolute error, a performance evaluation function of the BP neural network;
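The mean absolute error used as the performance measure is simply the mean of the element-wise absolute errors (a Python stand-in for MATLAB's mae()):

```python
import numpy as np


def mae(e):
    """Mean absolute error over all elements of the error array E = Y - L."""
    return np.mean(np.abs(e))
```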
7), save the model shown above to have good generalization ability; in subsequent data prediction, the data of every 5 consecutive days serve as input, and the model predicts the data of the 9th day counted from the first input day.
CN 201210434409 2012-11-05 2012-11-05 Method for predicating short-term substation power quality in electrical power system Pending CN103020728A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201210434409 CN103020728A (en) 2012-11-05 2012-11-05 Method for predicating short-term substation power quality in electrical power system


Publications (1)

Publication Number Publication Date
CN103020728A true CN103020728A (en) 2013-04-03

Family

ID=47969310

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201210434409 Pending CN103020728A (en) 2012-11-05 2012-11-05 Method for predicating short-term substation power quality in electrical power system

Country Status (1)

Country Link
CN (1) CN103020728A (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103761688B (en) * 2014-01-28 2017-02-01 国家电网公司 Regional-power-grid-oriented power quality problem early warning method
CN104598986B (en) * 2014-12-12 2018-01-19 国家电网公司 Methods of electric load forecasting based on big data
CN104598986A (en) * 2014-12-12 2015-05-06 国家电网公司 Big data based power load prediction method
CN105320987B (en) * 2015-09-21 2018-08-31 航天东方红卫星有限公司 A kind of satellite telemetering data intelligent interpretation method based on BP neural network
CN105320987A (en) * 2015-09-21 2016-02-10 航天东方红卫星有限公司 Satellite telemetry data intelligent interpretation method based on BP neural network
CN105243454A (en) * 2015-11-06 2016-01-13 广州威沃电子有限公司 Big data-based electrical load prediction system
CN105550425A (en) * 2015-12-09 2016-05-04 天津大学 Logarithm normalization method for improving local prediction ability of neural network
CN106405333B (en) * 2016-10-13 2020-01-21 国网山东省电力公司威海供电公司 Method and device for predicting power quality of distributed power grid
CN106405333A (en) * 2016-10-13 2017-02-15 国网山东省电力公司威海供电公司 Distributed power grid electric energy quality prediction method and apparatus
CN106529095A (en) * 2016-12-12 2017-03-22 广州市扬新技术研究有限责任公司 Photovoltaic power generation prediction research system based on Matlab
CN107392306A (en) * 2017-05-26 2017-11-24 浙江理工大学 The training learning method and system of the neutral net of pattern specifications parameter inference pattern
CN108090563A (en) * 2017-12-15 2018-05-29 烟台港股份有限公司 A kind of electric flux Forecasting Methodology based on BP neural network
CN110991161A (en) * 2018-09-30 2020-04-10 北京国双科技有限公司 Similar text determination method, neural network model obtaining method and related device
CN110991161B (en) * 2018-09-30 2023-04-18 北京国双科技有限公司 Similar text determination method, neural network model obtaining method and related device
CN109919193A (en) * 2019-01-31 2019-06-21 中国科学院上海光学精密机械研究所 A kind of intelligent stage division, system and the terminal of big data
CN111832703A (en) * 2020-06-29 2020-10-27 中南大学 Sampling interval perception long-short term memory network-based process manufacturing industry irregular sampling dynamic sequence modeling method
CN117406137A (en) * 2023-12-12 2024-01-16 国网辽宁省电力有限公司抚顺供电公司 Method and system for monitoring lightning leakage current of power transmission line
CN117406137B (en) * 2023-12-12 2024-05-28 国网辽宁省电力有限公司抚顺供电公司 Method and system for monitoring lightning leakage current of power transmission line


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
ASS Succession or assignment of patent right

Owner name: QINGHAI ELECTRIC POWER CORPORATION QINGDAO ELECTRI

Free format text: FORMER OWNER: BEIJING YITONGYU TECHNOLOGY DEVELOPMENT CO., LTD.

Effective date: 20130820

Owner name: STATE ELECTRIC NET CORP.

Free format text: FORMER OWNER: QINGDAO ELECTRIC POWER RESEARCH INSTITUTE

Effective date: 20130820

C41 Transfer of patent application or patent right or utility model
COR Change of bibliographic data

Free format text: CORRECT: ADDRESS; FROM: 810008 XINING, QINGHAI PROVINCE TO: 100031 DONGCHENG, BEIJING

TA01 Transfer of patent application right

Effective date of registration: 20130820

Address after: 100031 West Chang'an Avenue, Beijing, No. 86

Applicant after: State Grid Corporation of China

Applicant after: Qinghai Electric Power Co., Ltd.

Applicant after: Qingdao Electric Power Research Institute

Applicant after: Beijing Yitongyu Science & Technology Development Co., Ltd.

Applicant after: Qinghai Dianyan Technology Co., Ltd.

Address before: 54 Xining West Road, Qinghai, No. 8, No. 810008

Applicant before: Qingdao Electric Power Research Institute

Applicant before: Beijing Yitongyu Science & Technology Development Co., Ltd.

C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20130403