CN202472729U - SIMF Elman network structure - Google Patents

SIMF Elman network structure

Info

Publication number
CN202472729U
CN202472729U · CN2011205339127U · CN201120533912U
Authority
CN
China
Prior art keywords
simf
elman
network
network structure
layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2011205339127U
Other languages
Chinese (zh)
Inventor
李焱
党小超
郝占军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwest Normal University
Original Assignee
Northwest Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwest Normal University filed Critical Northwest Normal University
Priority to CN2011205339127U priority Critical patent/CN202472729U/en
Application granted granted Critical
Publication of CN202472729U publication Critical patent/CN202472729U/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current


Abstract

An SIMF (Seasonal Input Multilayer Feedback) Elman network structure relates to network structures and comprises a network flow monitoring device (1), a data processing device (2), and an SIMF Elman neural network (3); the network flow monitoring device (1) is connected to the data processing device (2), and the data processing device (2) is connected to the SIMF Elman neural network (3). The beneficial effects of the SIMF Elman network structure are as follows: it introduces a chaos mechanism into the SIMF Elman network and exploits the mechanism's inherent global ergodic motion to escape the local minima encountered during weight optimization; it studies the chaotic characteristics of the Tent map and constructs a corresponding chaotic training algorithm with good ergodic uniformity that can search for the globally optimal solution at high speed; and it thereby avoids the slow convergence and the susceptibility to local minima of the standard Elman neural network.

Description

Seasonal Input Multilayer Feedback (SIMF) Elman network structure
Technical field:
The utility model relates to a network structure, and specifically to a seasonal input multilayer feedback (SIMF) Elman network structure.
Background technology:
The Elman neural network is a typical locally recurrent time-delay feedback network. It is built on the BP neural network by introducing feedback signals that store the internal state, which gives its mappings a dynamic character. Besides input-layer, hidden-layer, and output-layer units, it has a special context layer that memorizes the hidden-layer output of the previous moment; this can be regarded as a one-step delay, through which the delayed, stored hidden-layer output is fed back into the hidden-layer input. This self-connection makes the network sensitive to historical data, and the internal feedback increases the network's ability to process dynamic information, thereby achieving the goal of dynamic modeling.
The mathematical model of the Elman neural network (see Fig. 1): let x(k) be the network input, x′(k) the hidden-layer output, and x_c(k) the context-layer output; let W^1, W^2, W^3 be the connection weights from the input layer to the hidden layer, from the context layer to the hidden layer, and from the hidden layer to the output layer, respectively; let f(·) be the transfer function of the hidden-layer neurons, often the sigmoid function; and let α (0 ≤ α < 1) be the self-feedback gain factor of the context layer. The mathematical model of the network is:
y(k) = g(W^3 x′(k))    (1)
x′(k) = f(W^1 x(k) + W^2 x_c(k))    (2)
x_c(k) = x′(k−1) + α x_c(k−1)    (3)
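A minimal sketch of one forward step of eqs. (1)-(3), assuming a sigmoid hidden transfer function f and an identity output function g; the layer sizes and the random initialization are illustrative, not specified by the patent:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

class ElmanCell:
    """One step of the standard Elman model, eqs. (1)-(3)."""
    def __init__(self, m, r, n, alpha=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.uniform(-0.5, 0.5, (r, m))  # input -> hidden
        self.W2 = rng.uniform(-0.5, 0.5, (r, r))  # context -> hidden
        self.W3 = rng.uniform(-0.5, 0.5, (n, r))  # hidden -> output
        self.alpha = alpha
        self.xc = np.zeros(r)      # context memory x_c
        self.x_prev = np.zeros(r)  # hidden output of the previous step

    def step(self, x):
        self.xc = self.x_prev + self.alpha * self.xc       # eq. (3)
        x_hid = sigmoid(self.W1 @ x + self.W2 @ self.xc)   # eq. (2)
        y = self.W3 @ x_hid                                # eq. (1), g = identity
        self.x_prev = x_hid
        return y

net = ElmanCell(m=3, r=5, n=2)
y = net.step(np.array([0.1, 0.2, 0.3]))
print(y.shape)  # (2,)
```

Because the context state persists across calls, repeated calls to `step` with the same input generally give different outputs, which is the dynamic memory the text describes.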
Traditional neural network models study only trend-bearing time-series forecasting. Moreover, traditional networks mostly adopt the BP algorithm or neural-network-based hybrid algorithms, and both have shortcomings: the BP gradient-descent algorithm must compute derivatives of the objective function, which limits its range of application, while hybrid algorithms often complicate otherwise simple problems and are also prone to local convergence.
The utility model content:
The purpose of the utility model is to provide a seasonal input multilayer feedback Elman network structure. It introduces a chaos mechanism into the SIMF Elman network and uses the mechanism's inherent global ergodic motion to escape the local minima that arise during weight optimization. It studies the chaotic characteristics of the Tent map and constructs a corresponding chaotic training algorithm; the algorithm has good ergodic uniformity, can search for the globally optimal solution, and searches quickly, thereby also avoiding the slow convergence and the tendency to fall into local minima of the standard Elman neural network.
To solve the problems of the background art, the utility model adopts the following technical scheme: it comprises a network traffic monitoring device 1, a data processing device 2, and an SIMF Elman neural network 3; the network traffic monitoring device 1 is connected to the data processing device 2, and the data processing device 2 is connected to the SIMF Elman neural network 3.
The SIMF Elman network model of the utility model:
y(k) = g(W^3 x′(k) + W^5 x_c2(k))    (4)
x′(k) = f(W^1 x(k) + W^2 x_c1(k) + W^4 x_c2(k))    (5)
x_c1(k) = x′(k−1) + α x_c1(k−1)    (6)
x_c2(k) = y(k−1) + β x_c2(k−1)    (7)
where W^1, W^2, W^3, W^4, W^5 are the connection weights from the input layer to the hidden layer, from context layer 1 to the hidden layer, from the hidden layer to the output layer, from context layer 2 to the hidden layer, and from context layer 2 to the output layer, respectively; x(k) is the network input, x′(k) the hidden-layer output, x_c1(k) and x_c2(k) the outputs of context layers 1 and 2, and y(k) the network output; α (0 ≤ α < 1) and β (0 ≤ β < 1) are the self-feedback gain factors of context layers 1 and 2.
The introduction of context layer 2 and of the seasonal-period algorithm makes the SIMF Elman network fully recurrent: the system can not only dynamically recall internal state information but also dynamically track changes in the output, further improving the network's approximation capability and dynamic mapping ability.
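A minimal sketch of one forward step of the SIMF Elman model, eqs. (4)-(7), assuming a sigmoid hidden transfer function f, an identity output function g, and illustrative layer sizes and initialization (none of which the patent fixes):

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

class SIMFElmanCell:
    """One step of the SIMF Elman model with its two context layers."""
    def __init__(self, m, r, n, alpha=0.5, beta=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.uniform(-0.5, 0.5, (r, m))  # input -> hidden
        self.W2 = rng.uniform(-0.5, 0.5, (r, r))  # context 1 -> hidden
        self.W3 = rng.uniform(-0.5, 0.5, (n, r))  # hidden -> output
        self.W4 = rng.uniform(-0.5, 0.5, (r, n))  # context 2 -> hidden
        self.W5 = rng.uniform(-0.5, 0.5, (n, n))  # context 2 -> output
        self.alpha, self.beta = alpha, beta
        self.xc1 = np.zeros(r)   # context 1: memory of hidden state
        self.xc2 = np.zeros(n)   # context 2: memory of network output
        self.x_prev = np.zeros(r)
        self.y_prev = np.zeros(n)

    def step(self, x):
        self.xc1 = self.x_prev + self.alpha * self.xc1   # eq. (6)
        self.xc2 = self.y_prev + self.beta * self.xc2    # eq. (7)
        x_hid = sigmoid(self.W1 @ x + self.W2 @ self.xc1
                        + self.W4 @ self.xc2)            # eq. (5)
        y = self.W3 @ x_hid + self.W5 @ self.xc2         # eq. (4)
        self.x_prev, self.y_prev = x_hid, y
        return y

net = SIMFElmanCell(m=3, r=5, n=2)
y1 = net.step(np.array([0.1, 0.2, 0.3]))
y2 = net.step(np.array([0.2, 0.3, 0.4]))
```

Context layer 2 is what distinguishes this from the standard Elman cell: the previous output y(k−1) is fed back to both the hidden layer (through W^4) and the output layer (through W^5), giving the full recurrence described above.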
The utility model introduces a chaos mechanism into the SIMF Elman network and uses the mechanism's inherent global ergodic motion to escape the local minima that arise during weight optimization. It studies the chaotic characteristics of the Tent map and constructs a corresponding chaotic training algorithm; the algorithm has good ergodic uniformity, can search for the globally optimal solution, and searches quickly, thereby also avoiding the slow convergence and the tendency to fall into local minima of the standard Elman neural network.
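The chaotic training algorithm is built on the Tent map. A minimal sketch of generating a Tent-map search sequence follows; the re-seeding guard and the mapping of chaotic values onto candidate weights are practical illustrations, not the patent's exact procedure:

```python
def tent_sequence(x0, length, mu=2.0, eps=1e-12):
    """Iterate the Tent map: x -> mu*x if x < 0.5, else mu*(1 - x).

    At mu = 2 the map is chaotic and its orbit covers (0, 1) almost
    uniformly, which is the ergodic uniformity the training algorithm
    relies on. In floating point the mu = 2 orbit can collapse to 0,
    so a tiny perturbation re-seeds it (a common practical fix).
    """
    seq, x = [], x0
    for _ in range(length):
        x = mu * x if x < 0.5 else mu * (1.0 - x)
        if x < eps:          # orbit collapsed; nudge it back into (0, 1)
            x = eps + 0.1
        seq.append(x)
    return seq

# Map chaotic values in (0, 1) onto candidate weights in [-1, 1],
# e.g. for a global search over one connection weight.
chaos = tent_sequence(0.37, 200)
candidates = [-1.0 + 2.0 * v for v in chaos]
```

A chaotic search of this kind visits the whole weight range instead of following a gradient, which is how it escapes the local minima mentioned above.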
Description of drawings:
Fig. 1 is a structural schematic of the background art of the utility model;
Fig. 2 is a structural schematic of the utility model;
Fig. 3 is a detailed structural schematic of Fig. 2.
Embodiment:
With reference to Figs. 1-3, this embodiment adopts the following technical scheme: it comprises a network traffic monitoring device 1, a data processing device 2, and an SIMF Elman neural network 3; the network traffic monitoring device 1 is connected to the data processing device 2, and the data processing device 2 is connected to the SIMF Elman neural network 3.
The SIMF Elman network model of this embodiment:
y(k) = g(W^3 x′(k) + W^5 x_c2(k))    (4)
x′(k) = f(W^1 x(k) + W^2 x_c1(k) + W^4 x_c2(k))    (5)
x_c1(k) = x′(k−1) + α x_c1(k−1)    (6)
x_c2(k) = y(k−1) + β x_c2(k−1)    (7)
where W^1, W^2, W^3, W^4, W^5 are the connection weights from the input layer to the hidden layer, from context layer 1 to the hidden layer, from the hidden layer to the output layer, from context layer 2 to the hidden layer, and from context layer 2 to the output layer, respectively; x(k) is the network input, x′(k) the hidden-layer output, x_c1(k) and x_c2(k) the outputs of context layers 1 and 2, and y(k) the network output; α (0 ≤ α < 1) and β (0 ≤ β < 1) are the self-feedback gain factors of context layers 1 and 2.
The introduction of context layer 2 and of the seasonal-period algorithm makes the SIMF Elman network fully recurrent: the system can not only dynamically recall internal state information but also dynamically track changes in the output, further improving the network's approximation capability and dynamic mapping ability.
The learning algorithm of this embodiment:
Let the network have m input-layer nodes, r nodes in context layer 1, and n nodes in context layer 2, and write ŷ(k) for the target output. The error function of the system at step k is defined as:

E(k) = (1/2) (ŷ(k) − y(k))^T (ŷ(k) − y(k))    (8)
Computing the partial derivative of E(k) with respect to each connection weight and correcting the weights with the dynamic BP algorithm gives the basic learning algorithm of the SIMF Elman network:

Δw^1_ji = σ_1 Σ_{h=1..n} [(ŷ_h(k) − y_h(k)) w^3_hj f′_j(·)] x_i(k),   j = 1, 2, …, r; i = 1, 2, …, m    (9)

The learning steps for w^2, w^3, w^4, w^5 are obtained in the same way.
Setting the partial derivatives of E(k) with respect to each connection weight to zero yields the improved Elman learning algorithm:

Δw^1_ji = σ_1 Σ_{h=1..n} [(ŷ_h(k) − y_h(k)) w^3_hj f′_j(·)] x_i(k),   j = 1, …, r; i = 1, …, m    (10)
Δw^2_jl = σ_2 Σ_{h=1..n} [(ŷ_h(k) − y_h(k)) w^3_hj f′_j] ∂x′_j(k)/∂w^2_jl,   j = 1, …, r; l = 1, …, r    (11)
Δw^3_hj = σ_3 (ŷ_h(k) − y_h(k)) x′_j(k),   h = 1, …, n; j = 1, …, r    (12)
Δw^4_jq = σ_4 Σ_{h=1..n} [(ŷ_h(k) − y_h(k)) w^3_hj f′_j] ∂x′_j(k)/∂w^4_jq,   j = 1, …, r; q = 1, …, n    (13)
Δw^5_hq = σ_5 (ŷ_h(k) − y_h(k)) x_c2,q(k),   h = 1, …, n; q = 1, …, n    (14)

where:

∂x′_j(k)/∂w^2_jl = f′_j(·) x′_l(k−1) + α ∂x′_j(k−1)/∂w^2_jl,   j = 1, …, r; l = 1, …, r    (15)
∂x′_j(k)/∂w^4_jq = f′_j(·) y_q(k−1) + β ∂x′_j(k−1)/∂w^4_jq,   j = 1, …, r; q = 1, …, n    (16)

Here σ_1, σ_2, σ_3, σ_4, σ_5 are the learning steps for w^1, w^2, w^3, w^4, w^5, respectively, and m, n, r are the numbers of input-layer, output-layer, and hidden-layer neurons. Equations (15) and (16) constitute the dynamic recurrence relations of the gradients ∂x′_j(k)/∂w^2_jl and ∂x′_j(k)/∂w^4_jq, which makes effective identification of high-order systems possible.
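Of the updates above, those for the output-layer weights, eqs. (12) and (14), involve no recurrent gradient and reduce to outer products of the output error with x′(k) and x_c2(k). A minimal NumPy sketch; the dimensions, example values, and step sizes are illustrative assumptions:

```python
import numpy as np

def output_layer_updates(y_hat, y, x_hid, xc2, sigma3=0.1, sigma5=0.1):
    """Weight increments for W^3 (hidden -> output) and W^5 (context 2 ->
    output), following eqs. (12) and (14):
        dW3[h, j] = sigma3 * (y_hat[h] - y[h]) * x_hid[j]
        dW5[h, q] = sigma5 * (y_hat[h] - y[h]) * xc2[q]
    Outer products build the full increment matrices at once."""
    err = y_hat - y                       # (yhat_h(k) - y_h(k))
    dW3 = sigma3 * np.outer(err, x_hid)   # shape (n, r)
    dW5 = sigma5 * np.outer(err, xc2)     # shape (n, n)
    return dW3, dW5

dW3, dW5 = output_layer_updates(
    y_hat=np.array([1.0, 0.0]),           # target output
    y=np.array([0.8, 0.1]),               # actual output
    x_hid=np.array([0.3, 0.6, 0.2]),      # hidden-layer output x'(k)
    xc2=np.array([0.5, 0.4]))             # context-layer-2 output
print(dW3.shape, dW5.shape)  # (2, 3) (2, 2)
```

The increments for W^1, W^2, and W^4 additionally require the recurrences (15) and (16), so they must carry the gradient state forward from step to step.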
In addition, the improved Elman network adds feedback connections from the output layer, through a one-step delay, back to the hidden layer and the output layer. Together with the introduction of the seasonal-period algorithm, gradient descent is used to back-propagate the weight changes and errors, which gives the network the ability to process dynamic information and facilitates modeling of dynamic processes.
Output-layer weight change. For the weight w2_kj connecting hidden output a1_j to output unit k:

Δw2_kj = −η ∂E/∂w2_kj = −η (∂E/∂a2_k)(∂a2_k/∂w2_kj) = η (t_k − a2_k) f′_2 a1_j = η δ_k a1_j,   where δ_k = (t_k − a2_k) f′_2.

Similarly, for the bias: Δb2_k = −η ∂E/∂b2_k = −η (∂E/∂a2_k)(∂a2_k/∂b2_k) = η (t_k − a2_k) f′_2 = η δ_k.
Hidden-layer weight change. For the weight w1_ij connecting input p_j to hidden unit i:

Δw1_ij = −η ∂E/∂w1_ij = −η Σ_{k=1..s2} (∂E/∂a2_k)(∂a2_k/∂a1_i)(∂a1_i/∂w1_ij) = η Σ_{k=1..s2} (t_k − a2_k) f′_2 w2_ki f′_1 p_j = η δ_i p_j,

where δ_i = e_i f′_1 and e_i = Σ_{k=1..s2} δ_k w2_ki.
The seasonal-period algorithm for the input and output nodes of this embodiment

(1) General selection method

The target vector is P(i, j), i = 1, 2, …, 24 (24 data points); the input vector is P(i, j−1), i = 1, 2, …, 24 (24 data points). That is, the input layer of the network uses 24 neurons, each mapping one of the previous day's 24 hourly flows, so the input matrix is a 1 × 24 row vector P; the output layer uses 24 neurons, mapping the next day's 24 hourly flows, so the output matrix is a 1 × 24 row vector T. The relation between the input and output variables is shown in Table 1.
Table 1. Input/output vector relation for the general selection method (the table appears only as an image in the original).
There are 24 input points and 24 output points; this set is called the first group of data.
(2) Seasonal-cycle selection method

Suppose we wish to predict the values of the next M (M ≥ 1) moments from the past N (N ≥ 1) data points, i.e. to perform an M-step prediction. We take N adjacent samples and map them to M values, which represent the predicted samples for the M moments following the window. The network traffic time series has a periodic characteristic with a period of one day, so the collected traffic data are modeled accordingly. The forecast model for the following day is: the target vector is P(i, j), i = 1, 2, …, 24 (24 data points); the input vector is P(i, j−2), P(i, j−1), i = 1, 2, …, 24 (48 data points in total). That is, the input layer of the network uses 48 neurons, mapping the previous 2 days' 48 hourly flows (a seasonal input window of 2 periods), so the input matrix is a 1 × 48 row vector P; the output layer uses 24 neurons, mapping the next day's 24 hourly flows, so the output matrix is a 1 × 24 row vector T. The relation between the input and output variables is shown in Table 2.
Table 2. Input/output vector relation for the seasonal-cycle selection method (the table appears only as an image in the original).
There are 48 input points and 24 output points; this set is called the second group of data.
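Both window choices can be generated by a single routine. A minimal sketch, assuming the hourly traffic series is stored as a flat array; the function name and layout are illustrative:

```python
import numpy as np

def seasonal_windows(hourly, window_days=2):
    """Build (input, target) pairs from an hourly series with a 24-hour
    period: each input is the previous `window_days` days flattened, and
    each target is the following day. window_days=2 reproduces the
    48-input/24-output second group of data; window_days=1 reproduces
    the 24-input/24-output first group."""
    days = np.asarray(hourly, dtype=float).reshape(-1, 24)  # one row per day
    X, T = [], []
    for d in range(window_days, len(days)):
        X.append(days[d - window_days:d].ravel())  # e.g. P(i, j-2), P(i, j-1)
        T.append(days[d])                          # target day P(i, j)
    return np.array(X), np.array(T)

traffic = np.arange(7 * 24)          # one week of dummy hourly samples
X, T = seasonal_windows(traffic, window_days=2)
print(X.shape, T.shape)  # (5, 48) (5, 24)
```

Each row of X is one training input (a 1 × 48 vector for the two-day window) and the matching row of T is the 1 × 24 target, as in the tables above.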
(3) Selection of training parameters
Before the connection weights are adjusted, each weight must first be given a random initial value. In general one hopes the initial weights make each neuron's accumulated input state value close to zero, but they must not all be equal, otherwise the system cannot be trained. During training the learning rate can be adjusted automatically; the adjustment formula (reproduced only as an image in the original) scales the learning rate by an increment factor or a decrement factor according to the change of SSE, the sum of the network's output errors. The choice of the initial learning rate η(0) is largely arbitrary and is usually taken as 0.01; the increment factor k_inc and the decrement factor k_dec have initial values 1.05 and 0.7. Adjusting these two variables can reduce the error.
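A sketch of one plausible reading of that adjustment rule, using the stated defaults; since the exact formula appears only as an image in the original, the increase/decrease criterion here is an assumption:

```python
def adjust_learning_rate(eta, sse, sse_prev, k_inc=1.05, k_dec=0.7):
    """Assumed rule: grow eta by the increment factor k_inc while the
    output-error sum SSE is falling, shrink it by the decrement factor
    k_dec when SSE rises."""
    return eta * k_inc if sse < sse_prev else eta * k_dec

eta = 0.01                                                # eta(0)
eta = adjust_learning_rate(eta, sse=0.8, sse_prev=1.0)    # error fell
print(round(eta, 6))  # 0.0105
```

With these factors the learning rate grows slowly while training improves and drops sharply when the error worsens, which damps oscillation without stalling progress.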
This embodiment introduces a chaos mechanism into the SIMF Elman network and uses the mechanism's inherent global ergodic motion to escape the local minima that arise during weight optimization. It studies the chaotic characteristics of the Tent map and constructs a corresponding chaotic training algorithm; the algorithm has good ergodic uniformity, can search for the globally optimal solution, and searches quickly, thereby also avoiding the slow convergence and the tendency to fall into local minima of the standard Elman neural network.

Claims (1)

1. A seasonal input multilayer feedback Elman network structure, characterized in that it comprises a network traffic monitoring device (1), a data processing device (2), and an SIMF Elman neural network (3); the network traffic monitoring device (1) is connected to the data processing device (2), and the data processing device (2) is connected to the SIMF Elman neural network (3).
CN2011205339127U 2011-12-20 2011-12-20 SIMF Elman network structure Expired - Fee Related CN202472729U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011205339127U CN202472729U (en) 2011-12-20 2011-12-20 SIMF Elman network structure


Publications (1)

Publication Number Publication Date
CN202472729U true CN202472729U (en) 2012-10-03

Family

ID=46920873

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011205339127U Expired - Fee Related CN202472729U (en) 2011-12-20 2011-12-20 SIMF Elman network structure

Country Status (1)

Country Link
CN (1) CN202472729U (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103267826A (en) * 2013-04-17 2013-08-28 沈阳大学 Soft measurement method for online detection of gelatin concentration
CN110044350A (en) * 2019-04-15 2019-07-23 北京航空航天大学 The MEMS gyro random drift modeling method of application enhancements dynamic recurrent neural network


Similar Documents

Publication Publication Date Title
Qu et al. Multi-step wind speed forecasting based on a hybrid decomposition technique and an improved back-propagation neural network
Gensler et al. Deep Learning for solar power forecasting—An approach using AutoEncoder and LSTM Neural Networks
Chang et al. An improved neural network-based approach for short-term wind speed and power forecast
CN102930358B (en) A kind of neural net prediction method of photovoltaic power station power generation power
Li et al. Multi-reservoir echo state computing for solar irradiance prediction: A fast yet efficient deep learning approach
CN103106544B (en) A kind of photovoltaic generation prognoses system based on T-S Fuzzy neutral net
CN105426956A (en) Ultra-short-period photovoltaic prediction method
CN109978283B (en) Photovoltaic power generation power prediction method based on branch evolution neural network
CN112598180A (en) Distributed regional wind power prediction method
CN103489038A (en) Photovoltaic ultra-short-term power prediction method based on LM-BP neural network
CN104992248A (en) Microgrid photovoltaic power station generating capacity combined forecasting method
CN104537415A (en) Non-linear process industrial fault prediction and identification method based on compressed sensing and DROS-ELM
CN102102626A (en) Method for forecasting short-term power in wind power station
Başaran et al. Systematic literature review of photovoltaic output power forecasting
Chitsazan et al. Wind speed forecasting using an echo state network with nonlinear output functions
CN102915511A (en) Safety monitoring method for neural network model of power-loaded chaotic phase space
CN103927460A (en) Wind power plant short-term wind speed prediction method based on RBF
Park et al. Multi-layer RNN-based short-term photovoltaic power forecasting using IoT dataset
CN107292462A (en) A kind of short-term load forecasting method, apparatus and system
CN111709109A (en) Photovoltaic absorption capacity calculation method and device considering source-load time sequence correlation
CN107977735A (en) A kind of municipal daily water consumption Forecasting Methodology based on deep learning
CN107145968A (en) Photovoltaic apparatus life cycle cost Forecasting Methodology and system based on BP neural network
CN202472729U (en) SIMF Elman network structure
Li et al. Application of ARIMA and LSTM in relative humidity prediction
Tajjour et al. Power generation forecasting of a solar photovoltaic power plant by a novel transfer learning technique with small solar radiation and power generation training data sets

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20121003

Termination date: 20121220