CN108197653A - Time series classification method based on a convolutional echo state network - Google Patents

Time series classification method based on a convolutional echo state network

Info

Publication number
CN108197653A
Authority
CN
China
Prior art keywords
echo state
convolution
echo
state network
network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810004157.XA
Other languages
Chinese (zh)
Inventor
马千里
沈礼锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology (SCUT)
Priority to CN201810004157.XA
Publication of CN108197653A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/24 - Classification techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/045 - Combinations of networks

Abstract

The invention discloses a time series classification method based on a convolutional echo state network. An echo state network has a temporal kernel and echo state behavior: the former means that the echo state network maps the input signal into the high-dimensional space of the reservoir, and the latter means that the network has short-term memory of historical information. In a convolutional neural network, multi-scale convolutional layers can extract the multi-scale features in the echo state network, and max pooling along the time direction can maintain multi-scale temporal invariance. The present invention combines the echo state network and the convolutional neural network and proposes a convolutional echo state network model, which applies operations such as multi-scale convolution and max pooling along the time direction to the state representation output by the echo state network, realizing the complementary advantages of the echo state network and the convolutional neural network: the efficient learning mode of the echo state network is kept, while the advantage of the convolutional neural network in feature extraction is also exploited.

Description

Time series classification method based on a convolutional echo state network
Technical field
The present invention relates to the technical fields of reservoir computing and neural network learning, and in particular to a time series classification method based on a convolutional echo state network, suitable for general multivariate time series classification problems.
Background technology
An echo state network (ESN) is a novel recurrent neural network learning method. Existing research on echo state networks has been widely applied to fields such as dynamical system modeling, robot control, and chaotic time series prediction. Its core principle is to model a dynamical system with a reservoir and a simple linear decoder. The reservoir contains a large number of neuron nodes, and the connections between these nodes are randomly initialized and then fixed. When the reservoir receives an external time series as input, it generates high-dimensional echo state signals with short-term memory. The linear decoder of the output layer then establishes various complex nonlinear mappings between the input time series and the output signal inside the echo state network, which makes the approach highly efficient for time series modeling tasks. An echo state network has two important properties: a temporal kernel and echo state behavior. The former means that the echo state network maps the input signal into the high-dimensional space of the reservoir, a process equivalent to the kernel trick in kernel methods, whose function is likewise to perform a high-dimensional projection; the latter means that the network has short-term memory of historical information.
A convolutional neural network (CNN) is a feed-forward deep neural network. It consists of an input layer, convolutional layers, pooling layers, and fully connected layers. A convolutional layer is composed of multiple feature maps, and each feature map is composed of multiple neurons; each neuron is connected through a convolution kernel to a local region of the feature maps of the previous layer, and the convolution operation extracts different features of the input. A pooling layer, which likewise consists of multiple feature maps, follows a convolutional layer; each of its feature maps corresponds uniquely to one feature map of the previous layer, so pooling does not change the number of feature maps. Commonly used pooling methods include max pooling, average pooling, and random pooling; these pooling operations compress the feature information effectively and reduce the risk of over-fitting to some extent. However, the training of convolutional neural networks is very time-consuming, and the requirements on equipment are often relatively high.
In view of the advantages and disadvantages of echo state networks and convolutional neural networks, it is highly desirable to propose a convolutional echo state network for solving time series classification.
Invention content
The purpose of the present invention is to solve the above drawbacks of the prior art and to provide a time series classification method based on a convolutional echo state network.
The purpose of the present invention can be achieved by adopting the following technical scheme:
A time series classification method based on a convolutional echo state network, the time series classification method comprising the following steps:
S1, network initialization: determine the size of the reservoir, randomly generate the input weights of the reservoir and the recurrent connection weights inside the reservoir from a standard normal distribution, and determine the activation function f(z), the input scaling parameter IS, the spectral radius parameter Sr, and the sparsity α;
S2, signal input: input the signal u(n) at the current time step n;
S3, state update: collect the echo state representation X of the input signal;
S4, apply multi-scale convolution along the time direction to the echo state representation X, and collect the multi-scale features in the echo state information;
S5, apply max pooling along the time direction to the multi-scale feature information of the echo state network after convolution, compressing the information while reducing the risk of over-fitting;
S6, add a fully connected layer, which connects the features of the echo state network;
S7, take all features of the fully connected layer as input and classify the original signal with the Softmax function.
Further, in step S2, signal input: the K-dimensional signal u = (u(0), u(1), ..., u(T-1)), the N-dimensional initial reservoir state x(0), and the L-dimensional output sequence y = (y(0), y(1), ..., y(T-1)) are input;
In step S3, state update: the state update of the whole echo state network is expressed by the following formula:
x(t+1) = f(W_res x(t) + W_in u(t+1))   (1)
where W_in obeys a uniform distribution on [-IS, IS] and W_res is determined by the following formula:
W_res = Sr · W / λ_max(W)   (2)
where the elements of the matrix W are randomly generated in [-0.5, 0.5] and λ_max(W) is the largest eigenvalue of W;
The echo state representation X of the input signal is collected.
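For illustration only, the following minimal NumPy sketch shows one possible reading of steps S1 to S3 under formulas (1) and (2); the function names init_reservoir and collect_states, the random seed, and the default argument values are illustrative assumptions and are not part of the patent.

import numpy as np

def init_reservoir(K, N, IS=0.1, Sr=0.99, alpha=0.01, seed=0):
    # Randomly generate the input weights W_in and the recurrent weights W_res (step S1).
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-IS, IS, size=(N, K))        # W_in uniform on [-IS, IS]
    W = rng.uniform(-0.5, 0.5, size=(N, N))         # raw recurrent matrix W
    W[rng.random((N, N)) > alpha] = 0.0             # keep roughly a fraction alpha of the connections
    lam_max = np.max(np.abs(np.linalg.eigvals(W)))  # largest eigenvalue magnitude of W
    W_res = Sr * W / lam_max                        # spectral-radius scaling, formula (2)
    return W_in, W_res

def collect_states(u, W_in, W_res, f=np.tanh):
    # Drive the reservoir with the input u of shape (T, K) and stack the echo states into X (steps S2-S3).
    T, N = u.shape[0], W_res.shape[0]
    x = np.zeros(N)
    X = np.zeros((T, N))
    for t in range(T):
        x = f(W_res @ x + W_in @ u[t])              # state update, formula (1)
        X[t] = x
    return X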
Further, the process of step S4 is as follows:
Let x(t) denote the state vector of the echo state network at time t; the echo state representation of length T is described by the following formulas:
X = z_{0:T-1} = x(0) ⊕ x(1) ⊕ ... ⊕ x(T-1)   (3)
z_{t:t+k-1} = x(t) ⊕ x(t+1) ⊕ ... ⊕ x(t+k-1)   (4)
where ⊕ is the concatenation operator, z_{0:T-1} is a T × N matrix, and z_{t:t+k-1} denotes the echo state representation of length k starting from time t; let w_{k,j} ∈ R^{k×N} denote the j-th filter of width k, let the input echo state representation be X ∈ R^{T×N}, and set the sliding stride to 1, so the sliding windows are z_{0:k-1}, z_{1:k}, ..., z_{T-k+1:T}, and the convolution result is:
c_{k,j} = (c_0, c_1, ..., c_{T-k+1})^T   (5)
where c_m, m = 0, 1, ..., T-k+1, denotes the convolution result of the m-th sliding window and is expressed as follows:
c_m = f(α_{k,j} · (w_{k,j} * z_{m:m+k-1}) + b)   (6)
where f is a nonlinear activation function, α_{k,j} is the connection weight between the sliding window of width k and the j-th filter, and * denotes the element-wise multiplication operator.
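As an illustrative sketch only, the following NumPy code applies the multi-scale temporal convolution of step S4, formulas (3) to (6), to an echo state matrix X of shape (T, N); the filter widths, the number of filters per width, and the randomly drawn filter values are assumptions made for the example.

import numpy as np

def multiscale_convolution(X, widths=(2, 3, 5), n_filters=4, f=np.tanh, seed=0):
    # Return one feature map per filter width k, each of shape (number of windows, n_filters).
    rng = np.random.default_rng(seed)
    T, N = X.shape
    features = {}
    for k in widths:
        W_k = rng.standard_normal((n_filters, k, N)) * 0.1  # filters w_{k,j} in R^{k x N}
        alpha = np.ones(n_filters)                           # connection weights alpha_{k,j}
        b = np.zeros(n_filters)
        C = np.zeros((T - k + 1, n_filters))
        for m in range(T - k + 1):                           # sliding windows with stride 1
            z = X[m:m + k]                                   # window z_{m:m+k-1}, shape (k, N)
            for j in range(n_filters):
                # c_m = f(alpha_{k,j} * (w_{k,j} . z_{m:m+k-1}) + b), formula (6)
                C[m, j] = f(alpha[j] * np.sum(W_k[j] * z) + b[j])
        features[k] = C                                      # columns stack the c_{k,j} of formula (5)
    return features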
Further, the time series classification method uses the convolutional neural network as the decoder of the echo state network, and convolves the echo state representation matrix, which contains short-term memory, with multi-scale filters along the time direction.
Compared with the prior art, the present invention has the following advantages and effects:
1. The present invention combines the echo state network (ESN) and the convolutional neural network (CNN) and proposes a new neural network model, the convolutional echo state network (ConvESN), which extracts the echo state representation features containing short-term historical information in the ESN through the convolution and pooling operations of the CNN, achieving high accuracy on time series classification tasks;
2. The convolutional echo state network (ConvESN) keeps the efficiency of the echo state network, whose recurrent layer is generated directly and does not have to be learned during training, while also exploiting the powerful feature extraction capability of convolutional neural networks, realizing the complementary advantages of the echo state network (ESN) and the convolutional neural network (CNN).
Description of the drawings
Fig. 1 is a flow diagram of building the model based on the convolutional echo state network;
Fig. 2 is a schematic diagram of applying multi-scale convolution and max pooling along the time direction to the echo state feature information.
Specific embodiment
To make the purpose, technical scheme, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, rather than all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the protection scope of the present invention.
Embodiment
As shown in Fig. 1, which is the flow diagram of building the model based on the convolutional echo state network in the present invention, the time series classification method includes the following steps:
Step S1, network initialization: determine the size of the reservoir, randomly generate the input weights of the reservoir and the recurrent connection weights inside the reservoir from a standard normal distribution, and determine the activation function f(z), the input scaling parameter IS, the spectral radius parameter Sr, and the sparsity α;
Step S2, signal input: input the signal u(n) at the current time step n;
Step S3, state update: collect the echo state representation X of the input signal;
Step S4, apply multi-scale convolution along the time direction to the echo state representation X, and collect the multi-scale features in the echo state information;
Step S5, apply max pooling along the time direction to the multi-scale feature information of the echo state network after convolution, compressing the information while reducing the risk of over-fitting;
Step S6, add a fully connected layer;
Step S7, classify the original data with the Softmax function.
In order to describe in detail the specific implementation of the time series classification method based on the convolutional echo state network in this embodiment, an action recognition classification data set with time series characteristics is taken as an example. The data set consists of four parts: MSR-Action3D, HDM05, Florence3D-Action, and UTKinect-Action. The steps are as follows:
Step S1, network initialization: set the size of the reservoir to 200, randomly generate the input weights of the reservoir and the recurrent connection weights inside the reservoir from a standard normal distribution, determine the activation function tanh(z), and set the input scaling parameter IS = 0.1, the spectral radius parameter Sr = 0.99, and the sparsity α = 0.01;
Step S2, signal input: input the K-dimensional signal u = (u(0), u(1), ..., u(T-1)), the N-dimensional initial reservoir state x(0), and the L-dimensional output sequence y = (y(0), y(1), ..., y(T-1));
Step S3, state update: the state update of the whole echo state network system is expressed by the following formula:
x(t+1) = f(W_res x(t) + W_in u(t+1))   (1)
where W_in obeys a uniform distribution on [-IS, IS] and W_res is determined by the following formula:
W_res = Sr · W / λ_max(W)   (2)
where the elements of the matrix W are randomly generated in [-0.5, 0.5] and λ_max(W) is the largest eigenvalue of W;
The echo state representation X of the input signal is collected.
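As a usage note only, the illustrative helpers init_reservoir and collect_states sketched earlier could be called with this embodiment's settings (reservoir size 200, tanh, IS = 0.1, Sr = 0.99, α = 0.01) roughly as follows; the input series below is a random placeholder, not data from the named data sets.

import numpy as np

u = np.random.default_rng(1).standard_normal((150, 6))                # placeholder series: T = 150 steps, K = 6 channels
W_in, W_res = init_reservoir(K=6, N=200, IS=0.1, Sr=0.99, alpha=0.01)
X = collect_states(u, W_in, W_res, f=np.tanh)                         # echo state representation X of shape (T, N)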
Step S4, apply multi-scale convolution along the time direction to the echo state representation X, and collect the multi-scale features in the echo state information; let x(t) denote the state vector of the echo state network at time t, then the echo state representation of length T is described by the following formulas:
X = z_{0:T-1} = x(0) ⊕ x(1) ⊕ ... ⊕ x(T-1)   (3)
z_{t:t+k-1} = x(t) ⊕ x(t+1) ⊕ ... ⊕ x(t+k-1)   (4)
where ⊕ is the concatenation operator and z_{0:T-1} is a T × N matrix; similarly, z_{t:t+k-1} denotes the echo state representation of length k starting from time t; let w_{k,j} ∈ R^{k×N} denote the j-th filter of width k, let the input echo state representation be X ∈ R^{T×N}, and set the sliding stride to 1, so the sliding windows are z_{0:k-1}, z_{1:k}, ..., z_{T-k+1:T}, and the convolution result is:
c_{k,j} = (c_0, c_1, ..., c_{T-k+1})^T   (5)
where c_m, m = 0, 1, ..., T-k+1, denotes the convolution result of the m-th sliding window and is expressed as follows:
c_m = f(α_{k,j} · (w_{k,j} * z_{m:m+k-1}) + b)   (6)
where f is a nonlinear activation function, α_{k,j} is the connection weight between the sliding window of width k and the j-th filter, and * denotes the element-wise multiplication operator.
Step S5, apply max pooling along the time direction to the multi-scale feature information of the echo state network after convolution, compressing the information while reducing over-fitting;
The schematic diagram of the above steps S4 and S5 is shown in Fig. 2.
Step S6, the connection of the features in the echo state network is realized by a fully connected layer;
Step S7, taking all features of the fully connected layer as input, classification of the original signal is realized by the Softmax function.
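For illustration only, the following sketch shows one possible form of steps S5 to S7: max pooling over the time axis, a single fully connected layer, and Softmax classification; the randomly drawn, untrained weights are placeholders for the example and are not specified by the patent.

import numpy as np

def classify(features, n_classes, seed=0):
    # features: dict mapping filter width k to a feature map of shape (windows, n_filters), as in step S4.
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([C.max(axis=0) for C in features.values()])  # max pooling along time (step S5)
    W_fc = rng.standard_normal((pooled.size, n_classes)) * 0.1           # fully connected layer (step S6)
    logits = pooled @ W_fc
    exp = np.exp(logits - logits.max())                                  # numerically stable Softmax (step S7)
    return exp / exp.sum()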
The time series classification method uses a convolutional neural network as the decoder of the echo state network, and the decoder in the convolutional echo state network model, here a convolutional neural network, can be flexibly replaced by a decoder of another form. The echo state representation matrix, which contains short-term memory, is convolved with multi-scale filters along the time direction, realizing efficient classification of time series. A novel end-to-end convolutional echo state network model is thus proposed, which efficiently classifies typical time series data. The efficiency of the echo state network in processing time series and the powerful feature extraction capability of the convolutional neural network are fused into a unified framework, a novel way of combining the reservoir computing framework with a deep learning framework.
In summary, this embodiment proposes a convolutional echo state network for solving time series classification, which realizes the complementary advantages of the echo state network and the convolutional neural network: the efficient learning mode of the echo state network is kept, while the powerful feature extraction capability of the convolutional neural network is also exploited.
The above embodiment is a preferred embodiment of the present invention, but the embodiments of the present invention are not limited to it. Any other change, modification, replacement, combination, or simplification made without departing from the spirit and principle of the present invention shall be an equivalent substitution and is included within the protection scope of the present invention.

Claims (4)

  1. A time series classification method based on a convolutional echo state network, characterized in that the time series classification method comprises the following steps:
    S1, network initialization: determine the size of the reservoir, randomly generate the input weights of the reservoir and the recurrent connection weights inside the reservoir from a standard normal distribution, and determine the activation function f(z), the input scaling parameter IS, the spectral radius parameter Sr, and the sparsity α;
    S2, signal input: input the signal u(n) at the current time step n;
    S3, state update: collect the echo state representation X of the input signal;
    S4, apply multi-scale convolution along the time direction to the echo state representation X, and collect the multi-scale features in the echo state information;
    S5, apply max pooling along the time direction to the multi-scale feature information of the echo state network after convolution, compressing the information while reducing the risk of over-fitting;
    S6, add a fully connected layer, which connects the features of the echo state network;
    S7, take all features of the fully connected layer as input and classify the original signal with the Softmax function.
  2. The time series classification method based on a convolutional echo state network according to claim 1, characterized in that, in step S2, signal input: the K-dimensional signal u = (u(0), u(1), ..., u(T-1)), the N-dimensional initial reservoir state x(0), and the L-dimensional output sequence y = (y(0), y(1), ..., y(T-1)) are input;
    In step S3, state update: the state update of the whole echo state network is expressed by the following formula:
    x(t+1) = f(W_res x(t) + W_in u(t+1))   (1)
    where W_in obeys a uniform distribution on [-IS, IS] and W_res is determined by the following formula:
    W_res = Sr · W / λ_max(W)   (2)
    where the elements of the matrix W are randomly generated in [-0.5, 0.5] and λ_max(W) is the largest eigenvalue of W;
    the echo state representation X of the input signal is collected.
  3. The time series classification method based on a convolutional echo state network according to claim 2, characterized in that the process of step S4 is as follows:
    Let x(t) denote the state vector of the echo state network at time t; the echo state representation of length T is described by the following formulas:
    X = z_{0:T-1} = x(0) ⊕ x(1) ⊕ ... ⊕ x(T-1)   (3)
    z_{t:t+k-1} = x(t) ⊕ x(t+1) ⊕ ... ⊕ x(t+k-1)   (4)
    where ⊕ is the concatenation operator, z_{0:T-1} is a T × N matrix, and z_{t:t+k-1} denotes the echo state representation of length k starting from time t; let w_{k,j} ∈ R^{k×N} denote the j-th filter of width k, let the input echo state representation be X ∈ R^{T×N}, and set the sliding stride to 1, so the sliding windows are z_{0:k-1}, z_{1:k}, ..., z_{T-k+1:T}, and the convolution result is:
    c_{k,j} = (c_0, c_1, ..., c_{T-k+1})^T   (5)
    where c_m, m = 0, 1, ..., T-k+1, denotes the convolution result of the m-th sliding window and is expressed as follows:
    c_m = f(α_{k,j} · (w_{k,j} * z_{m:m+k-1}) + b)   (6)
    where f is a nonlinear activation function, α_{k,j} is the connection weight between the sliding window of width k and the j-th filter, and * denotes the element-wise multiplication operator.
  4. The time series classification method based on a convolutional echo state network according to claim 1, characterized in that the time series classification method uses the convolutional neural network as the decoder of the echo state network, and convolves the echo state representation matrix, which contains short-term memory, with multi-scale filters along the time direction.
CN201810004157.XA 2018-01-03 2018-01-03 Time series classification method based on a convolutional echo state network Pending CN108197653A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810004157.XA CN108197653A (en) 2018-01-03 2018-01-03 Time series classification method based on a convolutional echo state network

Publications (1)

Publication Number Publication Date
CN108197653A (en) 2018-06-22

Family

ID=62587694

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810004157.XA Pending Time series classification method based on a convolutional echo state network

Country Status (1)

Country Link
CN (1) CN108197653A (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110119760A (en) * 2019-04-11 2019-08-13 华南理工大学 Sequence classification method based on a hierarchical multi-scale recurrent neural network
CN110119760B (en) * 2019-04-11 2021-08-10 华南理工大学 Sequence classification method based on hierarchical multi-scale recurrent neural network
CN111209853A (en) * 2020-01-05 2020-05-29 天津大学 Optical fiber sensing vibration signal mode identification method based on AdaBoost-ESN algorithm
CN111753776A (en) * 2020-06-29 2020-10-09 重庆交通大学 Structural damage identification method based on echo state and multi-scale convolution combined model
CN111753776B (en) * 2020-06-29 2022-05-10 重庆交通大学 Structural damage identification method based on echo state and multi-scale convolution combined model
CN112737989A (en) * 2020-11-12 2021-04-30 西安理工大学 Method for predicting chaotic baseband wireless communication decoding threshold value by using echo state network
CN112560784A (en) * 2020-12-25 2021-03-26 华南理工大学 Electrocardiogram classification method based on dynamic multi-scale convolutional neural network
CN112560784B (en) * 2020-12-25 2023-06-20 华南理工大学 Electrocardiogram classification method based on dynamic multi-scale convolutional neural network
CN113252790A (en) * 2021-06-21 2021-08-13 四川轻化工大学 Magnetic shoe internal defect detection method based on wide convolution and cyclic neural network
CN113469271A (en) * 2021-07-19 2021-10-01 北京邮电大学 Image classification method based on Echo State Network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180622