CN105978732A - Method and system for optimizing parameters of minimum complexity echo state network based on particle swarm - Google Patents
- Publication number
- CN105978732A CN105978732A CN201610478359.9A CN201610478359A CN105978732A CN 105978732 A CN105978732 A CN 105978732A CN 201610478359 A CN201610478359 A CN 201610478359A CN 105978732 A CN105978732 A CN 105978732A
- Authority
- CN
- China
- Prior art keywords
- particle
- optimal solution
- state network
- simplest
- echo state
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L41/00—Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
- H04L41/12—Discovery or management of network topologies
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L41/00—Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
- H04L41/08—Configuration management of networks or network elements
- H04L41/0803—Configuration setting
- H04L41/0823—Configuration setting characterised by the purposes of a change of settings, e.g. optimising configuration for enhancing reliability
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L41/00—Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
- H04L41/14—Network analysis or design
- H04L41/145—Network analysis or design involving simulating, designing, planning or modelling of a network
Abstract
The invention relates to a method and system for optimizing the parameters of a minimum complexity echo state network based on particle swarm optimization. The method comprises the following steps: (1) establishing a minimum complexity echo state network model; (2) setting initialization parameters of the model; (3) establishing a fitness function; (4) computing the particle objective function; (5) updating the individual optimal solution and structural parameters of each particle; (6) judging whether the end condition is reached; and (7) outputting the optimal solution of the particle swarm. The minimum complexity echo state network model used by the invention comprises an input layer, a reservoir and an output layer, and has a deterministic reservoir structure in which the neurons are connected in a ring, which enhances the stability of the reservoir topology and simplifies computation. Several parameters of the minimum complexity echo state network are optimized by exploiting the fast computation and strong global search ability of the particle swarm optimization algorithm, yielding the optimal solution of the particle swarm and improving prediction accuracy.
Description
Technical field
The present invention relates to the field of machine learning, and in particular to a method and system for optimizing the parameters of a minimum complexity echo state network based on particle swarm optimization.
Background art
The echo state network ESN (Echo State Network) is a novel recurrent neural network. Its unique dynamic reservoir structure gives the network good short-term memory capacity. Compared with traditional recurrent neural networks, the greatest advantage of the ESN is that it simplifies the training process: it solves the problems that conventional recurrent network structures are hard to determine and their training algorithms are overly complex, and it also overcomes the memory fading problem of recurrent networks. However, some problems remain, such as:
(1) the random internal topology of the reservoir maps inputs into an unknown high-dimensional space;
(2) the ESN structural parameters are chosen by experience, without theoretical support.
These shortcomings prevent the ESN from becoming a widely used tool. To remedy them, the literature (Rodan A, Tino P. Minimum Complexity Echo State Network [J]. IEEE Transactions on Neural Networks, 2011, 22.) proposed a minimum complexity echo state network MESN (Minimum Complexity Echo State Network) with a ring reservoir topology, in which all weights of the input weight matrix and of the internal weight matrix are set to the same magnitude. This guarantees that only a few free parameters need to be adjusted, and the deterministic reservoir topology avoids repeated experiments to obtain a well-performing ESN structure. For nonlinear system approximation tasks, the method considerably reduces the amount of computation while maintaining accuracy.
Compared with the classical echo state network, the MESN greatly simplifies the ESN structure and solves the first problem above, namely the random internal reservoir topology mapping into an unknown high-dimensional space. However, its parameters, such as the neuron count, spectral radius, input weights, internal weights and feedback weights, are still mostly selected by trial and error over a given parameter space, or chosen by experience, which is blind and uncertain. How to find optimal parameters and thereby solve the parameter selection problem has therefore become a focus of research.
Summary of the invention
The technical problem to be solved by the present invention is to provide a method and system for optimizing the parameters of a minimum complexity echo state network based on particle swarm optimization. The method introduces the particle swarm optimization algorithm into the minimum complexity echo state network to optimize its key parameters, effectively accelerating computation and improving prediction accuracy.
The technical solution adopted by the present invention to solve the technical problem is a method for optimizing the parameters of a minimum complexity echo state network based on particle swarm optimization, comprising the following steps:
Step 1: establish a minimum complexity echo state network model, replacing the random topology of the original echo state network with a ring topology;
Step 2: set the initialization parameters of the minimum complexity echo state network model, said initialization parameters including the structural parameters, the individual optimal solution of each particle, the swarm optimal solution, the maximum iteration count and the number of particles in the swarm;
Step 3: randomly assign the structural parameters and optimal solutions of the model, establish the fitness function of the model, and compute the fitness value of each particle;
Step 4: compute the particle objective function from the fitness values computed in step 3 and the individual and swarm optimal solutions initialized in step 2;
Step 5: update the individual optimal solutions and structural parameters in the model according to the position and velocity update formulas of the particle swarm algorithm, and increment the iteration count by 1;
Step 6: judge whether the iteration count exceeds the maximum iteration count; if so, go to step 7, otherwise repeat steps 3 to 5;
Step 7: output the swarm optimal solution.
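The seven steps above amount to one generic particle swarm minimization loop. The sketch below is illustrative only, not code from the patent: the fitness function is a stand-in (sum of squares) for the MESN training/test error, and the standard PSO update rules are assumed.

```python
import random

def pso_minimize(fitness, dim=5, n_particles=20, max_iter=50, c1=2.0, c2=2.0):
    # Step 2: initialize positions (structural parameters) and velocities
    pos = [[random.random() for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # individual optimal solutions
    pbest_fit = [fitness(p) for p in pos]          # Step 3: fitness of each particle
    g = min(range(n_particles), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]   # swarm optimal solution
    for _ in range(max_iter):                      # Step 6: iterate until the limit
        for i in range(n_particles):
            f = fitness(pos[i])
            if f < pbest_fit[i]:                   # Step 4: update individual best
                pbest_fit[i], pbest[i] = f, pos[i][:]
                if f < gbest_fit:                  # ...and swarm best
                    gbest_fit, gbest = f, pos[i][:]
            for d in range(dim):                   # Step 5: velocity/position update
                r1, r2 = random.random(), random.random()
                vel[i][d] += c1 * r1 * (pbest[i][d] - pos[i][d]) \
                           + c2 * r2 * (gbest[d] - pos[i][d])
                pos[i][d] += vel[i][d]
    return gbest, gbest_fit                        # Step 7: output swarm optimum

# Usage with the stand-in fitness over the 5-dimensional solution space
best, best_fit = pso_minimize(lambda x: sum(v * v for v in x))
```

In the patent, each 5-dimensional position would encode one candidate set of MESN structural parameters and the fitness would be the combined training and test error.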
Further, the minimum complexity echo state network model in step 1 comprises an input layer, a reservoir and an output layer. The input vector, state vector and output vector corresponding to the input layer, reservoir and output layer are, respectively: K input units u(n) = (u_1(n), ..., u_K(n)); N reservoir units x(n) = (x_1(n), ..., x_N(n)); and L output units y(n) = (y_1(n), ..., y_L(n)).
Further, the reservoir comprises multiple neurons, each having an excitation state, and the excitation state x(n+1) of the reservoir neurons is updated by the state update equation:
x(n+1) = f(W^in u(n+1) + W x(n) + W^back y(n))
where u(n) are the K input units; x(n) are the N reservoir units; y(n) are the L output units; W^in and W denote the N × K input weight matrix and the N × N reservoir connection matrix, respectively; and W^back is an N × L feedback connection matrix. All nonzero elements of W^in and of W are set to the same absolute value, equal to a nonzero value a ∈ (-1, 1), with the weight signs generated by a random method or by a Logistic map; f(·) denotes the excitation function of the reservoir neurons.
Further, the excitation function of the reservoir neurons is a sine function or a sigmoid function.
Further, given the excitation state x(n+1) of the reservoir neurons, the output of the model is:
y(n+1) = f^out(W^out(u(n+1), x(n+1), y(n)))
where u(n) are the K input units; x(n) are the N reservoir units; y(n) are the L output units; f^out is an output function; and W^out denotes the L × (K+N+L) output connection matrix.
Further, the fitness function in step 3 comprises the errors of the MESN training stage and test stage and can be expressed as:
Fitness = f_1(Error_train) + f_2(Error_test)
The training stage must consider the approximation ability of the ESN model under the given structural parameters, reflected by the training error Error_train; the prediction stage must consider the generalization ability of the ESN model under these structural parameters, reflected by the test error Error_test. Here f_1(·) and f_2(·) are the excitation functions of Error_train and Error_test respectively, and Fitness is the fitness value of each particle.
Further, the detailed procedure for computing the objective function in step 4 is:
(1) for each particle at a given moment, compare the fitness value of the particle's current solution x_id^k with the fitness value of its individual optimal solution p_id; if the fitness value of the current solution x_id^k is smaller than that of the individual optimal solution p_id, replace p_id with the current solution; otherwise p_id remains unchanged;
(2) compare the fitness value of the particle's resulting individual optimal solution p_id with the fitness value of the swarm optimal solution p_gd; if the fitness value of p_id is smaller than that of p_gd, take p_id as the new swarm optimal solution p_gd; otherwise p_gd remains unchanged.
Further, the structural parameters in step 2 include the neuron count, input weight matrix, feedback weight matrix, reservoir weights and spectral radius, where the neuron count is initialized starting from 10 and varies in the range 10 to 1000, and the remaining parameters are initialized to random numbers in [0, 1].
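The initialization ranges just described can be sketched as follows. This is an illustrative helper, not code from the patent; the dictionary keys are hypothetical names for the five structural parameters.

```python
import random

def init_structural_params():
    # Neuron count starts at 10 and may range over [10, 1000] during the search;
    # the other four structural parameters start as random numbers in [0, 1].
    return {
        "neurons": 10,
        "input_weight": random.random(),
        "feedback_weight": random.random(),
        "reservoir_weight": random.random(),
        "spectral_radius": random.random(),
    }
```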
Further, the position and velocity update formulas in step 5 are as follows:
v_id^(k+1) = v_id^k + c_1 r_1 (p_id - x_id^k) + c_2 r_2 (p_gd - x_id^k)
x_id^(k+1) = x_id^k + v_id^(k+1)
where r_1, r_2 are random numbers generated in the interval [0, 1]; d = 1, 2, 3, 4, 5, i.e. the dimension of the solution space (the number of free variables) is 5; i = 1, 2, ..., M, where M is the number of particles in the swarm; c_1 and c_2 are acceleration factors; k denotes the iteration count; v_id^k denotes the velocity of the d-th component of the i-th particle at the k-th iteration; x_id^k denotes the d-th structural-parameter component of the i-th particle at the k-th iteration, with x_i1 = neuron count, x_i2 = input weight matrix, x_i3 = feedback weight matrix, x_i4 = reservoir weights, and x_i5 = spectral radius; p_id denotes the d-th component of the individual optimal solution of the i-th particle; and p_gd denotes the d-th component of the swarm optimal solution.
A system for optimizing the parameters of a minimum complexity echo state network based on particle swarm optimization comprises a modeling unit, an initialization unit, a function establishing unit, a computing unit, an updating unit, a judging unit and an output unit;
the modeling unit is used to establish a minimum complexity echo state network model, replacing the random topology of the original echo state network with a ring topology;
the initialization unit is used to set the initialization parameters of the model, said initialization parameters including the structural parameters, the individual optimal solution of each particle, the swarm optimal solution, the maximum iteration count and the number of particles in the swarm;
the function establishing unit is used to randomly assign the structural parameters and optimal solutions of the model, establish the fitness function of the model, and compute the fitness value of each particle;
the computing unit is used to compute the particle objective function from the computed fitness values and the initialized individual and swarm optimal solutions;
the updating unit is used to update the individual optimal solutions and structural parameters in the model according to the position and velocity update formulas of the particle swarm algorithm;
the judging unit is used to judge whether the end condition is reached;
the output unit is used to output the swarm optimal solution.
Beneficial effects of the present invention: the invention introduces the particle swarm optimization algorithm into the minimum complexity echo state network to optimize its key parameters. The minimum complexity echo state network model comprises an input layer, a reservoir and an output layer, and has a deterministic reservoir structure in which the neurons are connected in a ring, which enhances the stability of the reservoir topology and greatly simplifies computation. At the same time, exploiting the fast computation and strong global search ability of the particle swarm optimization algorithm, several parameters of the minimum complexity echo state network are optimized to find a globally optimal solution, improving prediction accuracy.
Brief description of the drawings
Fig. 1 is a structural diagram of the minimum complexity echo state network;
Fig. 2 is a flow chart of the method of the present invention;
Fig. 3 shows the effect of the present invention approximating the NARMA system with a swarm size of 20;
Fig. 4 shows the error span of the present invention approximating the NARMA system with a swarm size of 20;
Fig. 5 shows the performance evaluation of the present invention approximating the NARMA system with a swarm size of 20.
Detailed description of the invention
The principles and features of the present invention are described below with reference to the accompanying drawings; the examples serve only to explain the invention and are not intended to limit its scope.
The particle swarm optimization algorithm PSO (Particle Swarm Optimization) is simple in principle, has few parameters, converges quickly and has strong global search ability. The algorithm is inspired by the foraging behavior of birds: each foraging bird is a particle that flies through the search space at a certain velocity, which it adjusts dynamically according to its own flight experience and that of its companions. Every particle has an adaptive value determined by the objective function, which is used to evaluate how good the particle is. The search proceeds iteratively within a swarm formed by such randomly initialized particles.
The present invention uses the particle swarm optimization algorithm to optimize five key reservoir parameters of the minimum complexity echo state network: the neuron count, input weight matrix, feedback weight matrix, reservoir weights and spectral radius.
As shown in Figs. 1 to 5, the invention provides a method for optimizing the parameters of a minimum complexity echo state network based on particle swarm optimization, comprising the following steps:
Step 1: as shown in Fig. 1, establish a minimum complexity echo state network model, replacing the random topology of the original echo state network with a ring topology. The three-layer model consists of an input layer, a reservoir and an output layer, whose corresponding input vector, state vector and output vector are, respectively: K input units u(n) = (u_1(n), ..., u_K(n)); N reservoir units x(n) = (x_1(n), ..., x_N(n)); and L output units y(n) = (y_1(n), ..., y_L(n)). The model uses a single nonlinear neuron type and a ring delay to replace the large number of randomly interconnected nonlinear neurons of a conventional reservoir, simplifying the physical topology. This makes the system easier to realize in hardware, because it contains only two elements: a single nonlinear neuron type and a ring delay. Each neuron in the reservoir has an excitation state, and the excitation state x(n+1) of the reservoir neurons is updated by the state update equation:
x(n+1) = f(W^in u(n+1) + W x(n) + W^back y(n))
where W^in and W denote the N × K input weight matrix and the N × N reservoir connection matrix respectively, and W^back is an N × L feedback connection matrix. All nonzero elements of W^in and of W are set to the same absolute value, equal to a nonzero value a ∈ (-1, 1), with the weight signs generated by a random method or by a Logistic map; f denotes the excitation function of the reservoir neurons, usually a sine function or a sigmoid function.
Given the reservoir excitation state x(n+1), the output of the model is computed by the following equation:
y(n+1) = f^out(W^out(u(n+1), x(n+1), y(n)))
where u(n) are the K input units; x(n) are the N reservoir units; y(n) are the L output units; f^out is an output function; and W^out denotes the L × (K+N+L) output connection matrix.
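The ring reservoir update and the readout can be sketched as follows. This is a minimal illustration under assumed simplifications, not the patent's implementation: a scalar input and output (K = L = 1), tanh as the sigmoid-type activation, an identity f^out, and example weight values chosen arbitrarily (the sign pattern of W^in is also omitted).

```python
import numpy as np

def mesn_step(x, u, y, w_in, w_back, r):
    """One reservoir update: x(n+1) = f(W^in u(n+1) + W x(n) + W^back y(n)).

    In the ring topology, W has a single nonzero weight r feeding each
    neuron from its predecessor (with wrap-around), so W x(n) is just a
    weighted rotation of the state vector."""
    wx = r * np.roll(x, 1)                       # neuron i receives r * x[i-1]
    return np.tanh(w_in * u + wx + w_back * y)   # sigmoid-type activation

def mesn_output(w_out, u, x, y):
    """Readout y(n+1) = f_out(W^out [u; x; y]) with identity f_out."""
    z = np.concatenate(([u], x, [y]))            # (K + N + L)-dimensional vector
    return float(w_out @ z)

# Usage: a 5-neuron ring, assumed weights
x = np.zeros(5)
x = mesn_step(x, u=0.3, y=0.0, w_in=0.5, w_back=0.1, r=0.5)
w_out = np.full(1 + 5 + 1, 0.1)
y1 = mesn_output(w_out, 0.3, x, 0.0)
```

Note that only W^out is trained in an ESN; W^in, W and W^back stay fixed, which is why the search space for PSO reduces to the few scalar structural parameters.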
Step 2: set the initialization parameters of the minimum complexity echo state network model, said initialization parameters including (1) the structural parameters: neuron count, input weight matrix, feedback weight matrix, reservoir weights and spectral radius, where the neuron count is initialized starting from 10 and varies in the range 10 to 1000, and the remaining parameters are initialized to random numbers in [0, 1]; (2) the optimal solutions: the individual optimal solution p_id of each particle and the swarm optimal solution p_gd; (3) the number of particles in the swarm and the maximum iteration count.
Step 3: randomly assign the structural parameters and optimal solutions of the minimum complexity echo state network model, establish the fitness function of the model, and compute the fitness value of each particle. To guarantee the generalization ability of the echo state network, the whole training data set must be divided into a training set and a prediction set: the MESN training stage is carried out first, followed by the prediction stage. The fitness function comprises the errors of the MESN training stage and test stage and can be expressed as:
Fitness = f_1(Error_train) + f_2(Error_test)
The training stage must consider the approximation ability of the ESN model under the given structural parameters, reflected by the training error Error_train. The prediction stage must consider the generalization ability of the ESN model under these structural parameters, reflected by the test error Error_test. Here f_1(·) and f_2(·) are the excitation functions of Error_train and Error_test respectively, and Fitness is the fitness value of each particle.
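The fitness formula above can be sketched directly. This is illustrative: NRMSE is assumed as the error measure because the performance evaluation later in the text reports NRMSE, and f_1, f_2 are taken as the identity since the patent leaves their exact form open.

```python
import numpy as np

def nrmse(y_pred, y_true):
    # Normalized root mean square error, a common ESN error measure
    return float(np.sqrt(np.mean((y_pred - y_true) ** 2) / np.var(y_true)))

def fitness(err_train, err_test, f1=lambda e: e, f2=lambda e: e):
    """Fitness = f1(Error_train) + f2(Error_test): the training error reflects
    approximation ability, the test error reflects generalization ability."""
    return f1(err_train) + f2(err_test)
```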
Step 4: compute the particle objective function from the fitness values computed in step 3 and the initialization parameters of step 2. The detailed procedure for computing the particle objective function is:
(1) for each particle at a given moment, compare the fitness value of the particle's current solution x_id^k with the fitness value of its individual optimal solution p_id; if the fitness value of the current solution x_id^k is smaller than that of the individual optimal solution p_id, replace p_id with the current solution; otherwise p_id remains unchanged;
(2) compare the fitness value of the particle's resulting individual optimal solution p_id with the fitness value of the swarm optimal solution p_gd; if the fitness value of p_id is smaller than that of p_gd, take p_id as the new swarm optimal solution p_gd; otherwise p_gd remains unchanged.
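The two comparison rules of step 4 can be sketched as one helper; lower fitness is better. This is an illustrative sketch, not code from the patent.

```python
def update_bests(fit, x, pbest, pbest_fit, gbest, gbest_fit):
    """(1) If the particle's current solution beats its individual best p_id,
        replace p_id; otherwise p_id stays unchanged.
    (2) If the (possibly updated) p_id beats the swarm best p_gd,
        replace p_gd; otherwise p_gd stays unchanged."""
    if fit < pbest_fit:
        pbest, pbest_fit = list(x), fit
    if pbest_fit < gbest_fit:
        gbest, gbest_fit = list(pbest), pbest_fit
    return pbest, pbest_fit, gbest, gbest_fit
```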
Step 5: update the individual optimal solutions and structural parameters according to the position and velocity update formulas of the particle swarm algorithm, and increment the iteration count by 1.
The position and velocity update formulas are as follows:
v_id^(k+1) = v_id^k + c_1 r_1 (p_id - x_id^k) + c_2 r_2 (p_gd - x_id^k)
x_id^(k+1) = x_id^k + v_id^(k+1)
where r_1, r_2 are random numbers generated in the interval [0, 1]; d = 1, 2, 3, 4, 5, i.e. the dimension of the solution space (the number of free variables) is 5; i = 1, 2, ..., M, where M is the number of particles in the swarm; c_1 and c_2 are acceleration factors; k denotes the iteration count; v_id^k denotes the velocity of the d-th component of the i-th particle at the k-th iteration; x_id^k denotes the d-th structural-parameter component of the i-th particle at the k-th iteration, with x_i1 = neuron count, x_i2 = input weight matrix, x_i3 = feedback weight matrix, x_i4 = reservoir weights, and x_i5 = spectral radius; p_id denotes the d-th component of the individual optimal solution of the i-th particle; and p_gd denotes the d-th component of the swarm optimal solution.
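One iteration of step 5 can be sketched as follows, assuming the standard PSO velocity and position update equations (the formulas themselves do not survive in this text, so this reconstruction is an assumption consistent with the symbols r_1, r_2, c_1, c_2, p_id and p_gd defined above).

```python
import random

def pso_step(x, v, pbest, gbest, c1=2.0, c2=2.0):
    """One velocity/position update over the d = 1..5 dimensions
    (neuron count, input weight, feedback weight, reservoir weight,
    spectral radius), per the assumed standard PSO formulas:
        v_id <- v_id + c1*r1*(p_id - x_id) + c2*r2*(p_gd - x_id)
        x_id <- x_id + v_id"""
    new_x, new_v = [], []
    for d in range(len(x)):
        r1, r2 = random.random(), random.random()
        vd = v[d] + c1 * r1 * (pbest[d] - x[d]) + c2 * r2 * (gbest[d] - x[d])
        new_v.append(vd)
        new_x.append(x[d] + vd)
    return new_x, new_v
```

When the particle already sits on both its individual best and the swarm best, the attraction terms vanish and the velocity carries over unchanged.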
Step 6: judge whether the iteration count exceeds the maximum iteration count; if so, go to step 7, otherwise repeat steps 3 to 5.
Step 7: output the swarm optimal solution.
A system for optimizing the parameters of a minimum complexity echo state network based on particle swarm optimization comprises a modeling unit, an initialization unit, a function establishing unit, a computing unit, an updating unit, a judging unit and an output unit;
the modeling unit is used to establish a minimum complexity echo state network model, replacing the random topology of the original echo state network with a ring topology;
the initialization unit is used to set the initialization parameters of the model, said initialization parameters including the structural parameters, the individual optimal solution of each particle, the swarm optimal solution, the maximum iteration count and the number of particles in the swarm;
the function establishing unit is used to randomly assign the structural parameters and optimal solutions of the model, establish the fitness function of the model, and compute the fitness value of each particle;
the computing unit is used to compute the particle objective function from the computed fitness values and the initialized individual and swarm optimal solutions;
the updating unit is used to update the individual optimal solutions and structural parameters in the model according to the position and velocity update formulas of the particle swarm algorithm;
the judging unit is used to judge whether the end condition is reached;
the output unit is used to output the swarm optimal solution.
Figs. 3 and 4 respectively show the approximation effect and the error span of the present invention on the NARMA system with a swarm size of 20; it can be seen that the method effectively follows the future trend of the NARMA system.
Fig. 5 shows the performance evaluation of the present invention approximating the NARMA system with a swarm size of 20. The NRMSE is smallest at the 16th iteration, i.e. the MESN there has the best nonlinear approximation ability, and the optimized results are: reservoir size 620, input weight matrix weight 0.55, feedback weight matrix weight 0.21, reservoir weight matrix weight 0.48 and spectral radius 0.77.
Beneficial effects of the present invention: the invention introduces the particle swarm optimization algorithm into the minimum complexity echo state network to optimize its key parameters. The minimum complexity echo state network model comprises an input layer, a reservoir and an output layer, and has a deterministic reservoir structure in which the neurons are connected in a ring, which enhances the stability of the reservoir topology and greatly simplifies computation. At the same time, exploiting the fast computation and strong global search ability of the particle swarm optimization algorithm, several parameters of the minimum complexity echo state network are optimized to find a globally optimal solution, improving prediction accuracy.
The above are only preferred embodiments of the present invention and are not intended to limit it; any modification, equivalent substitution or improvement made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.
Claims (10)
1. A method for optimizing the parameters of a minimum complexity echo state network based on particle swarm optimization, characterized by comprising the following steps:
Step 1: establish a minimum complexity echo state network model, replacing the random topology of the original echo state network with a ring topology;
Step 2: set the initialization parameters of the minimum complexity echo state network model, said initialization parameters including the structural parameters, the individual optimal solution of each particle, the swarm optimal solution, the maximum iteration count and the number of particles in the swarm;
Step 3: randomly assign the structural parameters and optimal solutions of the model, establish the fitness function of the model, and compute the fitness value of each particle;
Step 4: compute the particle objective function from the fitness values computed in step 3 and the individual and swarm optimal solutions initialized in step 2;
Step 5: update the individual optimal solutions and structural parameters in the model according to the position and velocity update formulas of the particle swarm algorithm, and increment the iteration count by 1;
Step 6: judge whether the iteration count exceeds the maximum iteration count; if so, go to step 7, otherwise repeat steps 3 to 5;
Step 7: output the swarm optimal solution.
2. A method for optimizing the parameters of a minimum complexity echo state network based on particle swarm optimization, characterized in that the minimum complexity echo state network model in said step 1 comprises an input layer, a reservoir and an output layer, and the input vector, state vector and output vector corresponding to the input layer, reservoir and output layer are, respectively: K input units u(n) = (u_1(n), ..., u_K(n)); N reservoir units x(n) = (x_1(n), ..., x_N(n)); and L output units y(n) = (y_1(n), ..., y_L(n)).
3. A method for optimizing the parameters of a minimum complexity echo state network based on particle swarm optimization, characterized in that said reservoir comprises multiple neurons, each having an excitation state, and the excitation state x(n+1) of the reservoir neurons is:
x(n+1) = f(W^in u(n+1) + W x(n) + W^back y(n))
where u(n) are the K input units; x(n) are the N reservoir units; y(n) are the L output units; W^in and W denote the N × K input weight matrix and the N × N reservoir connection matrix respectively; W^back is an N × L feedback connection matrix; all nonzero elements of W^in and of W are set to the same absolute value, equal to a nonzero value a ∈ (-1, 1), with the weight signs generated by a random method or by a Logistic map; and f(·) denotes the excitation function of the reservoir neurons.
4. A method for optimizing the parameters of a minimum complexity echo state network based on particle swarm optimization, characterized in that the excitation function of the neurons in said reservoir is a sine function or a sigmoid function.
5. A method for optimizing the parameters of a minimum complexity echo state network based on particle swarm optimization, characterized in that, given the excitation state x(n+1) of the neurons in said reservoir, the output of the model is:
y(n+1) = f^out(W^out(u(n+1), x(n+1), y(n)))
where u(n) are the K input units; x(n) are the N reservoir units; y(n) are the L output units; f^out is an output function; and W^out denotes the L × (K+N+L) output connection matrix.
6. A method for optimizing the parameters of a minimum complexity echo state network based on particle swarm optimization, characterized in that the fitness function in said step 3 comprises the errors of the training stage and test stage of the minimum complexity echo state network model, expressed as:
Fitness = f_1(Error_train) + f_2(Error_test)
The training stage must consider the approximation ability of the ESN model under the given structural parameters, reflected by the training error Error_train; the prediction stage must consider the generalization ability of the ESN model under these structural parameters, reflected by the test error Error_test; f_1(·) and f_2(·) are the excitation functions of Error_train and Error_test respectively, and Fitness is the fitness value of each particle.
7. A method for optimizing the parameters of a minimum complexity echo state network based on particle swarm optimization, characterized in that the detailed procedure for computing the particle objective function in said step 4 is:
(1) for each particle at a given moment, compare the fitness value of the particle's current solution x_id^k with the fitness value of its individual optimal solution p_id; if the fitness value of the current solution x_id^k is smaller than that of the individual optimal solution p_id, replace p_id with the current solution; otherwise p_id remains unchanged;
(2) compare the fitness value of the particle's resulting individual optimal solution p_id with the fitness value of the swarm optimal solution p_gd; if the fitness value of p_id is smaller than that of p_gd, take p_id as the new swarm optimal solution p_gd; otherwise p_gd remains unchanged.
8. A method for optimizing the parameters of a minimum complexity echo state network based on particle swarm optimization, characterized in that the structural parameters in said step 2 include the neuron count, input weight matrix, feedback weight matrix, reservoir weights and spectral radius, where the neuron count is initialized starting from 10 and varies in the range 10 to 1000, and the remaining parameters are initialized to random numbers in [0, 1].
A method for optimizing the parameters of a minimum-complexity echo state network based on particle swarm optimization, characterized in that the position and velocity update formulas in step 5 are as follows:

v_id^(k+1) = v_id^k + c1·r1·(pid − x_id^k) + c2·r2·(pgd − x_id^k)
x_id^(k+1) = x_id^k + v_id^(k+1)

where r1, r2 are random numbers generated in the interval [0, 1]; d = 1, 2, 3, 4, 5, i.e. the dimension of the solution space (the number of independent variables) is 5; i = 1, 2, …, M, where M is the number of particles in the swarm; c1 and c2 are the acceleration factors; k denotes the iteration number; v_id^k denotes the d-th velocity component of the i-th particle at the k-th iteration; x_id^k denotes the d-th component of the i-th particle's structural parameters at the k-th iteration; pid denotes the d-th component of the individual optimal solution of the i-th particle; and pgd denotes the d-th component of the swarm optimal solution.
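The update formulas above can be sketched for a single particle as follows; the function name is hypothetical and the default c1 = c2 = 2.0 is a conventional choice, not one fixed by the claim.

```python
import random

def pso_update(v, x, pbest, gbest, c1=2.0, c2=2.0):
    """One velocity/position update for a single particle, following the
    update rule above: c1, c2 are the acceleration factors, and r1, r2
    are fresh random numbers in [0, 1] for every component d."""
    new_v, new_x = [], []
    for d in range(len(x)):  # d runs over the 5 structural-parameter components
        r1, r2 = random.random(), random.random()
        vd = v[d] + c1 * r1 * (pbest[d] - x[d]) + c2 * r2 * (gbest[d] - x[d])
        new_v.append(vd)       # v_id^(k+1)
        new_x.append(x[d] + vd)  # x_id^(k+1) = x_id^k + v_id^(k+1)
    return new_v, new_x
```

A sanity check on the rule: if a particle sits exactly at both pid and pgd with zero velocity, both attraction terms vanish and the particle stays put.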
10. A system for optimizing the parameters of a minimum-complexity echo state network based on particle swarm optimization, characterized in that it comprises a modeling unit, an initialization unit, a function-building unit, a computing unit, an updating unit, a judging unit and an output unit;
the modeling unit is configured to build a minimum-complexity echo state network model, replacing the random topology of the original echo state network with a ring topology;
the initialization unit is configured to set the initialization parameters of the minimum-complexity echo state network model, the initialization parameters including the structural parameters, the particle individual optimal solutions, the swarm optimal solution, the maximum number of iterations and the number of particles in the swarm;
the function-building unit is configured to randomly assign the structural parameters and the optimal solutions of the minimum-complexity echo state network model, establish the fitness function of the model, and calculate the fitness value of each particle;
the computing unit is configured to calculate the particle objective function from the calculated fitness value of each particle together with the particle individual optimal solutions and the swarm optimal solution of the initialization parameters;
the updating unit is configured to update the particle individual optimal solutions and the structural parameters in the minimum-complexity echo state network model according to the position and velocity update formulas of the particle swarm algorithm;
the judging unit is configured to judge whether the termination condition is reached; and
the output unit is configured to output the swarm optimal solution.
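The cooperation of these units can be sketched as one optimization loop. This is a minimal sketch under stated assumptions: the inertia weight w and the coefficients c1 = c2 = 1.5 are common PSO choices not fixed by the claims, and the caller-supplied `fitness` function stands in for the ESN training-error fitness that the function-building unit would compute from the minimum-complexity echo state network.

```python
import random

def optimize(fitness, dim=5, n_particles=20, max_iter=50, w=0.7, c1=1.5, c2=1.5):
    """Run the unit pipeline end to end and return the swarm optimum."""
    # initialization unit: random positions, zero velocities
    xs = [[random.random() for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(x) for x in xs]                  # individual optimal solutions
    pfit = [fitness(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pfit[i])
    gbest, gfit = list(pbest[g]), pfit[g]          # swarm optimal solution
    for _ in range(max_iter):                      # judging unit: iteration limit
        for i in range(n_particles):               # updating unit
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            f = fitness(xs[i])                     # computing unit
            if f < pfit[i]:
                pbest[i], pfit[i] = list(xs[i]), f
            if pfit[i] < gfit:
                gbest, gfit = list(pbest[i]), pfit[i]
    return gbest, gfit                             # output unit
```

With the ESN in place of the toy fitness, each position vector would first be decoded into the five structural parameters (neuron count clamped to [10, 1000], the rest to [0, 1]) before training and scoring the reservoir.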
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610478359.9A CN105978732B (en) | 2016-06-27 | 2016-06-27 | Method and system for optimizing parameters of minimum-complexity echo state network based on particle swarm optimization |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105978732A true CN105978732A (en) | 2016-09-28 |
CN105978732B CN105978732B (en) | 2019-08-16 |
Family
ID=57019846
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610478359.9A Active CN105978732B (en) | Method and system for optimizing parameters of minimum-complexity echo state network based on particle swarm optimization | 2016-06-27 | 2016-06-27 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105978732B (en) |
Non-Patent Citations (4)
Title |
---|
JINCHAO FAN AND MIN HAN: "Online Designed of Echo State Network Based on Particle Swarm Optimization for System Identification", The 2nd Intelligent Control and Information Processing * |
WANG JUAN: "Research on the Topology of Echo State Networks", China Master's Theses Full-text Database * |
WANG HESHAN: "Structural Parameter Optimization of Echo State Networks and Its Applications", China Doctoral Dissertations Full-text Database * |
MA TAO: "Research on the Simplest-Reservoir Echo State Network and Its Applications", China Master's Theses Full-text Database * |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018090580A1 (en) * | 2016-11-17 | 2018-05-24 | 北京智芯微电子科技有限公司 | Method and apparatus for sensing optical access network service stream and computer storage medium |
CN106777752A (en) * | 2016-12-30 | 2017-05-31 | 华东交通大学 | Optimal setting method for high-speed train tracking operation curves |
CN106777752B (en) * | 2016-12-30 | 2019-04-02 | 华东交通大学 | Optimal setting method for high-speed train tracking operation curves |
CN107273970A (en) * | 2017-05-11 | 2017-10-20 | 西安交通大学 | Reconfigurable platform supporting online-learning convolutional neural networks and construction method thereof |
CN107273970B (en) * | 2017-05-11 | 2020-06-19 | 西安交通大学 | Reconfigurable platform of convolutional neural network supporting online learning and construction method thereof |
CN109472070A (en) * | 2018-10-27 | 2019-03-15 | 北京化工大学 | Soft-sensing method for ESN furnace operation variables based on PLS |
CN111062170A (en) * | 2019-12-03 | 2020-04-24 | 广东电网有限责任公司 | Transformer top layer oil temperature prediction method |
CN111523573A (en) * | 2020-04-10 | 2020-08-11 | 重庆大学 | Bridge structure state evaluation method and system based on multi-parameter fusion |
CN112859924A (en) * | 2021-01-27 | 2021-05-28 | 大连大学 | Unmanned aerial vehicle trajectory planning method combining artificial interference and ESN-PSO |
CN112859924B (en) * | 2021-01-27 | 2023-11-28 | 大连大学 | Unmanned aerial vehicle track planning method combining artificial interference and ESN-PSO |
CN112947055A (en) * | 2021-03-04 | 2021-06-11 | 北京交通大学 | Method for tracking and controlling displacement speed of magnetic suspension train based on echo state network |
CN112947055B (en) * | 2021-03-04 | 2022-09-09 | 北京交通大学 | Method for tracking and controlling displacement speed of magnetic suspension train based on echo state network |
CN113033878A (en) * | 2021-03-05 | 2021-06-25 | 西北大学 | Landslide displacement prediction method based on multi-topology hierarchical cooperative particle swarm LSTM |
CN113033878B (en) * | 2021-03-05 | 2023-07-25 | 西北大学 | Landslide displacement prediction method based on multi-topology grading collaborative particle swarm LSTM |
CN116451763A (en) * | 2023-03-17 | 2023-07-18 | 北京工业大学 | Effluent NH4-N prediction method based on EDDESN |
CN116451763B (en) * | 2023-03-17 | 2024-04-12 | 北京工业大学 | Effluent NH4-N prediction method based on EDDESN |
Also Published As
Publication number | Publication date |
---|---|
CN105978732B (en) | 2019-08-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105978732A (en) | Method and system for optimizing parameters of minimum complexity echo state network based on particle swarm | |
CN108090658A (en) | Arc fault diagnostic method based on time domain charactreristic parameter fusion | |
Saadat et al. | Training echo state neural network using harmony search algorithm | |
CN102622418B (en) | Prediction device and equipment based on BP (Back Propagation) nerve network | |
CN103971160B (en) | particle swarm optimization method based on complex network | |
CN110163410A (en) | It is a kind of based on neural network-time series line loss power predicating method | |
CN113361777B (en) | Runoff prediction method and system based on VMD decomposition and IHHO optimization LSTM | |
CN106777449A (en) | Distribution Network Reconfiguration based on binary particle swarm algorithm | |
CN110444022A (en) | The construction method and device of traffic flow data analysis model | |
CN104200096A (en) | Lightning arrester grading ring optimization method based on differential evolutionary algorithm and BP neural network | |
CN104050505A (en) | Multilayer-perceptron training method based on bee colony algorithm with learning factor | |
Li et al. | Dual-stage hybrid learning particle swarm optimization algorithm for global optimization problems | |
CN110852435A (en) | Neural evolution calculation model | |
Mojarrad et al. | Particle swarm optimization with chaotic velocity clamping (CVC-PSO) | |
CN104598657B (en) | A kind of gene die body reconstructing method based on memetic algorithms | |
Chen et al. | A Spark-based Ant Lion algorithm for parameters optimization of random forest in credit classification | |
CN110378464A (en) | The management method and device of the configuration parameter of artificial intelligence platform | |
Parsa et al. | Multi-objective hyperparameter optimization for spiking neural network neuroevolution | |
CN109697511B (en) | Data reasoning method and device and computer equipment | |
Taik et al. | Hybrid particle swarm and neural network approach for streamflow forecasting | |
Karayiannis | Learning algorithms for reformulated radial basis neural networks | |
CN109308523A (en) | A kind of multilayer perceptron training method based on dove colony optimization algorithm | |
CN111062485A (en) | Novel AUTOML frame | |
Niu et al. | Neural architecture search based on particle swarm optimization | |
Fu et al. | Study of DNN Network Architecture Search for Robot Vision |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||