CN108830295A - Multivariate Time Series classification method based on Multiple Time Scales echo state network - Google Patents

Multivariate Time Series classification method based on Multiple Time Scales echo state network Download PDF

Info

Publication number
CN108830295A
CN108830295A (application number CN201810440942.XA; granted as CN108830295B)
Authority
CN
China
Prior art keywords
jump
echo state
time scales
time series
follows
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810440942.XA
Other languages
Chinese (zh)
Other versions
CN108830295B (en
Inventor
马千里
陈恩欢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN201810440942.XA priority Critical patent/CN108830295B/en
Publication of CN108830295A publication Critical patent/CN108830295A/en
Application granted granted Critical
Publication of CN108830295B publication Critical patent/CN108830295B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Cable Transmission Systems, Equalization Of Radio And Reduction Of Echo (AREA)

Abstract

The invention discloses a multivariate time series classification method based on a multi-time-scale echo state network. The steps are as follows: M skip reservoirs with different temporal skip-connection lengths serve as an encoder that maps a multivariate time series to a multi-time-scale echo state representation; M convolution-and-pooling layers plus one fully connected layer serve as a decoder, where the convolution-and-pooling layers learn the high-dimensional complex features in the multi-time-scale echo state representation and the fully connected layer fuses the learned features of different time scales into a single multi-time-scale feature of the input time series; a Softmax layer serves as the classifier and produces the classification result from the multi-time-scale feature of the input time series. The model is trained with the negative log-likelihood loss function using back-propagation and a gradient-based optimizer. The classification method of the invention attains high accuracy on multivariate time series classification problems.

Description

Multivariate time series classification method based on a multi-time-scale echo state network
Technical field
The present invention relates to the technical field of time series data mining, and in particular to a multivariate time series classification method based on a multi-time-scale echo state network.
Background technique
In the technical field of time series data mining, the multivariate time series classification task has been widely applied in finance, healthcare, process industry, and many other fields. Compared with univariate time series, multivariate time series generally involve larger data volumes, higher dimensionality, and stronger correlations between variables.
For the multivariate time series classification problem, existing techniques fall broadly into three classes: distance-based methods, feature-based methods, and dynamical-system-based methods. The basic idea of distance-based methods is to rely on a carefully designed distance measure (such as the dynamic time warping distance or the Mahalanobis distance) and to classify a sample according to its relative distance to other samples using a support vector machine or a nearest-neighbor classifier. Such distance-based methods require a suitable hand-designed distance, but no single distance is appropriate for all time series data. The basic idea of feature-based methods is to extract diverse representation features from the time series data (such as statistical features, symbolic features, or segmentation features) and then classify the data with a classifier based on these representations. However, such feature-based methods likewise require hand-designed feature extraction; for a particular task this depends heavily on domain knowledge or expert experience, which is very time-consuming and labor-intensive. Dynamical-system-based methods assume that the time series data are generated by some underlying dynamical system, model that system with machine learning or deep learning models (such as hidden Markov models, conditional random fields, or recurrent neural networks), and then classify accordingly.
However, existing dynamical-system-based methods do not explicitly consider the multi-scale structure commonly present in time series data, nor the multi-scale temporal dependencies it contains; such methods therefore still leave room for improvement on the multivariate time series classification problem.
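As background, the distance-based approach described above can be illustrated with a minimal dynamic-time-warping nearest-neighbor classifier. This sketch is purely illustrative of the prior art the patent contrasts against, not part of the claimed method, and the function names are our own:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance between
    two univariate sequences of floats."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest warping path reaching this cell
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def nn_classify(query, train_seqs, train_labels):
    """1-nearest-neighbor classification under DTW distance."""
    dists = [dtw_distance(query, s) for s in train_seqs]
    return train_labels[int(np.argmin(dists))]
```

As the background notes, the quality of such a classifier hinges entirely on whether the chosen distance suits the data, which motivates the learned representation of the invention.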
In view of the above problems, there is an urgent need for a multivariate time series classification method based on a multi-time-scale echo state network that explicitly learns the multi-scale temporal dependencies contained in time series data, so as to classify it more accurately.
Summary of the invention
The purpose of the present invention is to address the above drawbacks of the prior art by providing a multivariate time series classification method based on a multi-time-scale echo state network.
The purpose of the present invention can be achieved by the following technical scheme:
A multivariate time series classification method based on a multi-time-scale echo state network, the multivariate time series classification method comprising the following steps:
S1. Use M skip reservoirs with different temporal skip-connection lengths as the encoder of the multivariate time series, so that each skip reservoir learns temporal dependencies of a different scale in the input time series, producing a multi-time-scale echo state representation;
S2. Use M convolution-and-pooling layers and one fully connected layer as the decoder of the multi-time-scale echo state representation, wherein the convolution-and-pooling layers learn its high-dimensional complex features and the fully connected layer fuses the learned features of different time scales into a single multi-time-scale feature of the input time series;
S3. Use a Softmax layer as the classifier, producing the classification result from the multi-time-scale feature of the input time series;
S4. Train the model with the negative log-likelihood loss function using back-propagation and the gradient optimizer Adam.
Further, the step S1 includes:
S1.1. Given a D-dimensional multivariate input time series:
U = (u(1), ..., u(t), ..., u(T))^T
where u(t) ∈ R^D and T is the length of the time series. Let the initial state of each skip reservoir be x_i(0) = 0 ∈ R^N, where N is the number of reservoir units. Then the update equation of the echo state x_i(t) of the i-th skip reservoir at time t is:
x_i(t) = f(W_i^in u(t) + W_i^res x_i(t - d_i))
where W_i^in ∈ R^(N×D) is the connection weight between the reservoir units and the input units of the i-th skip reservoir, and W_i^res ∈ R^(N×N) is the recurrent connection weight among the reservoir units of the i-th skip reservoir; both W_i^in and W_i^res are randomly initialized and left untrained; f is the activation function of the skip reservoir; and d_i is the skip length of the i-th skip reservoir. d_i is set for each skip reservoir in an exponentially increasing manner, as follows:
d_i = B^(i-1), i = 1, ..., M
where B ∈ N, B > 1. The skip reservoirs with different skip lengths learn temporal dependencies of different scales in the input time series;
S1.2. Since an updated echo state is obtained at every time step, the echo states of the i-th skip reservoir at all time steps are collected in order to construct a skip echo state representation X_i of the input time series U, defined as:
X_i = (x_i(1), ..., x_i(t), ..., x_i(T)) ∈ R^(N×T)
where x_i(t) = f(W_i^in u(t) + W_i^res x_i(t - d_i)) is the echo state update operation and x_i^n(t) is the activation of the n-th reservoir unit of the i-th skip reservoir at time t. The skip echo state representations produced by the different skip reservoirs encode the dynamic information of the input time series at different time scales. Finally, the M different skip echo state representations are assembled into a set X, which serves as the multi-time-scale echo state representation of the input time series, defined as:
X = {X_1, ..., X_i, ..., X_M}
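The skip reservoir encoding of step S1 can be sketched in NumPy as follows. This is a minimal illustration under our own assumptions: the weight ranges, the spectral-radius scaling of W_res, and the default hyperparameter values are not specified at this level of the description and are chosen here only so the code runs:

```python
import numpy as np

def skip_reservoir_encode(U, M=4, N=100, B=2, seed=0):
    """Encode a (T, D) multivariate series into M skip echo state
    representations X_i of shape (N, T), following steps S1.1-S1.2.
    All weights are random and untrained, as in echo state networks."""
    rng = np.random.default_rng(seed)
    T, D = U.shape
    reps = []
    for i in range(1, M + 1):
        d = B ** (i - 1)                        # skip length d_i = B^(i-1)
        W_in = rng.uniform(-0.1, 0.1, (N, D))   # input weights (assumed range)
        W_res = rng.uniform(-1.0, 1.0, (N, N))
        # scale to spectral radius 0.9 for the echo state property (assumed)
        W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))
        X = np.zeros((N, T))
        for t in range(T):
            # state d_i steps back; zero vector before the sequence starts
            prev = X[:, t - d] if t - d >= 0 else np.zeros(N)
            X[:, t] = np.tanh(W_in @ U[t] + W_res @ prev)
        reps.append(X)
    return reps  # the multi-time-scale representation {X_1, ..., X_M}
```

Because the i-th reservoir only connects state x_i(t) to state x_i(t - d_i), larger skip lengths make the reservoir track correspondingly coarser time scales.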
Further, the step S2 includes:
S2.1. Given a multi-time-scale echo state representation X = {X_1, ..., X_M}, use M different convolution-and-pooling layers to decode each skip echo state representation X_i separately, where each convolutional layer uses filters of J different heights to extract information of several different lengths from the skip echo state representation. For a skip echo state representation X_i, the echo state segment of length l running from time t to time t + l - 1 is defined as:
X_i^(t:t+l-1) = x_i(t) ⊕ x_i(t+1) ⊕ ... ⊕ x_i(t+l-1)
where X_i^(t:t+l-1) ∈ R^(N×l) is a matrix and ⊕ is the concatenation operation. Therefore, given X_i, sliding with stride 1 yields the set Z_i of the T - l + 1 echo state segments of length l, defined as:
Z_i = {X_i^(1:l), X_i^(2:l+1), ..., X_i^(T-l+1:T)}
S2.2. Define the weight of the k-th filter of height l in the i-th convolutional layer as W_(i,k)^l ∈ R^(N×l), where k = 1, ..., K and K is the number of filters of each height. The convolution result c_(i,k)^l of this filter is then:
c_(i,k)^l = (c_(i,k)^l(1), ..., c_(i,k)^l(T-l+1))
where c_(i,k)^l(t) is the result of applying the filter W_(i,k)^l to the t-th length-l echo state segment X_i^(t:t+l-1), computed as:
c_(i,k)^l(t) = f(W_(i,k)^l * X_i^(t:t+l-1) + b_(i,k)^l)
where f is the activation function, * is the element-wise product (summed over all entries), and b_(i,k)^l is the corresponding bias term;
S2.3. The pooling layer uses max pooling, so the pooled result p_(i,k)^l corresponding to the convolution result c_(i,k)^l is:
p_(i,k)^l = max_t c_(i,k)^l(t)
S2.4. All pooled results p_(i,k)^(l_j) are assembled into a one-dimensional vector p_i, defined as:
p_i = (p_(i,1)^(l_1), ..., p_(i,K)^(l_1), ..., p_(i,1)^(l_J), ..., p_(i,K)^(l_J))
where j = 1, ..., J and, for each l_j, k = 1, ..., K; therefore p_i ∈ R^(J·K) is the feature of the input time series learned from the i-th skip echo state representation X_i by the i-th convolution-and-pooling decoder;
S2.5. Use one fully connected layer to fuse all the learned features of different time scales. Its input is constructed by concatenating all the learned features p_i into one long vector p, defined as:
p = p_1 ⊕ p_2 ⊕ ... ⊕ p_M
where ⊕ is the concatenation operation. Its output is the multi-time-scale feature vector of the input time series, computed as:
F_MTS = f(W_fus p + b_fus)
where W_fus and b_fus are the connection weight and the corresponding bias term of the fully connected layer, and f is the activation function.
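The convolution-and-pooling decoding of steps S2.1-S2.4 for one skip echo state representation can be sketched as follows. The filter heights, filter count, random filter initialization, and ReLU activation are illustrative assumptions (in the trained model the filters are learned, not random):

```python
import numpy as np

def conv_pool_decode(X, heights=(2, 3), K=4, seed=0):
    """Decode one skip echo state representation X of shape (N, T):
    for each filter height l, slide K (N, l) filters with stride 1
    over time (S2.1-S2.2), apply ReLU, max-pool over time (S2.3),
    and concatenate everything into the feature vector p_i (S2.4)."""
    rng = np.random.default_rng(seed)
    N, T = X.shape
    feats = []
    for l in heights:
        W = rng.normal(0.0, 0.1, (K, N, l))  # K filters of height l
        b = np.zeros(K)
        for k in range(K):
            # convolution result over the T-l+1 length-l segments
            c = np.array([np.sum(W[k] * X[:, t:t + l]) + b[k]
                          for t in range(T - l + 1)])
            feats.append(np.max(np.maximum(c, 0.0)))  # ReLU then max pooling
    return np.array(feats)  # p_i, of length K * len(heights)
```

In the full model (S2.5), the M vectors p_i are concatenated into p and passed through the fully connected fusion layer F_MTS = f(W_fus p + b_fus).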
Further, the step S3 proceeds as follows:
The input of the Softmax layer is the multi-time-scale feature vector F_MTS of the time series U, and its output is the conditional distribution over class labels, i.e. the classification result, defined as:
P(C_s | F_MTS) = exp(W_s F_MTS) / Σ_(s'=1)^(N_c) exp(W_s' F_MTS)
where C_s is the s-th class of the time series, N_c is the number of classes, and W_s is the connection weight of the Softmax layer for class s.
Further, the step S4 proceeds as follows:
The negative log-likelihood function is used as the loss function of the model, defined as:
L = - Σ_(n=1)^(N_s) Σ_(s=1)^(N_c) δ(r_n, C_s) log P(C_s | U_n)
where U_n is the n-th training sample, C_s is the s-th class of the time series, N_s is the number of training samples, N_c is the number of classes, δ(·) is the Kronecker delta function, and r_n is the true class label of sample U_n.
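Steps S3-S4 amount to a standard softmax classifier trained with the negative log-likelihood; a minimal sketch (the weight shape and function names are illustrative assumptions):

```python
import numpy as np

def softmax_classify(F, W):
    """Softmax layer of step S3: conditional distribution over the
    N_c class labels given the multi-time-scale feature vector F.
    W has shape (N_c, dim(F)), one row per class."""
    z = W @ F                      # class logits W_s F_MTS
    e = np.exp(z - np.max(z))      # subtract max for numerical stability
    return e / e.sum()

def nll_loss(probs, true_class):
    """Negative log-likelihood of step S4 for one sample: the
    Kronecker delta in the double sum simply selects the probability
    assigned to the true class."""
    return -np.log(probs[true_class])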
Compared with the prior art, the present invention has the following advantages and effects:
1. The invention proposes a novel, efficiently trainable general model that attains high classification accuracy on multivariate time series classification problems.
2. The invention encodes the input time series with multiple skip reservoirs whose skip connections span different time scales, so it learns temporal dependencies of different scales and has strong encoding capacity for time series data; at the same time it decodes with multiple convolution-and-pooling layers and one fully connected layer, obtaining a multi-time-scale feature for accurate classification, and thus has strong decoding capacity.
Detailed description of the invention
Fig. 1 is a flow chart of the steps of the multivariate time series classification method based on the multi-time-scale echo state network disclosed in the present invention.
Specific embodiment
In order to make the objectives, technical schemes, and advantages of the embodiments of the present invention clearer, the technical schemes in the embodiments are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative work shall fall within the protection scope of the present invention.
Embodiment
This embodiment uses the UTD-MHAD data set as a concrete example. UTD-MHAD contains 27 classes (27 different action sequences) and 861 samples, of which the training set contains 431 samples and the test set contains 430 samples; each sample has 60 variables, and the sequence lengths range over [37, 121]. The task is action recognition under a cross-subject test protocol: the higher the recognition accuracy of the model on the test set, the better its performance. The samples in the data set are first pre-processed with the popular Savitzky-Golay smoothing filter, and each sample is then zero-padded to the same length, namely the maximum length 121.
As shown in Fig. 1, the method comprises the following steps:
Step S1. For the multivariate time series in the UTD-MHAD data set, use 4 skip reservoirs with different temporal skip-connection lengths as the encoder of the multivariate time series, so that each skip reservoir learns temporal dependencies of a different scale in the input time series, producing a multi-time-scale echo state representation.
S1.1. Given a 60-dimensional multivariate input time series:
U = (u(1), ..., u(t), ..., u(121))^T
where u(t) ∈ R^60 and the length of the time series is 121. Let the initial state of each skip reservoir be the zero vector and the number of reservoir units be 180. Then the update equation of the echo state x_i(t) of the i-th skip reservoir at time t is:
x_i(t) = f(W_i^in u(t) + W_i^res x_i(t - d_i))
where W_i^in is the connection weight between the skip reservoir and the input units of the i-th echo state network, and W_i^res is the recurrent connection weight among the skip reservoir units of the i-th echo state network; both W_i^in and W_i^res are randomly initialized and left untrained. The activation function of the skip reservoir used here is the tanh function, and d_i is the skip length of the i-th skip reservoir, set for each reservoir in an exponentially increasing manner as follows:
d_i = 2^(i-1), i = 1, ..., 4
Here the base of the exponential growth is 2. The skip reservoirs with different skip lengths learn temporal dependencies of different scales in the input time series;
S1.2. Since an updated echo state is obtained at every time step, the echo states of the i-th skip reservoir at all time steps are collected in order to construct a skip echo state representation of the input time series U, defined as:
X_i = (x_i(1), ..., x_i(t), ..., x_i(121)) ∈ R^(180×121)
where x_i(t) = f(W_i^in u(t) + W_i^res x_i(t - d_i)) is the echo state update operation and x_i^n(t) is the activation of the n-th reservoir unit of the i-th skip reservoir at time t. The skip echo state representations produced by the different skip reservoirs encode the dynamic information of the input time series at different time scales. Finally, the 4 different skip echo state representations are assembled into a set X, which serves as the multi-time-scale echo state representation of the input time series, defined as:
X = {X_1, X_2, X_3, X_4}
Step S2. Use 4 convolution-and-pooling layers and one fully connected layer as the decoder of the multi-time-scale echo state representation of step S1: the convolution-and-pooling layers learn its high-dimensional complex features, and the fully connected layer fuses the learned features of different time scales into a single multi-time-scale feature of the input time series.
S2.1. Given a multi-time-scale echo state representation X = {X_1, X_2, X_3, X_4}, use 4 different convolution-and-pooling layers to decode each skip echo state representation X_i separately. Each convolutional layer uses filters of 3 different heights to extract information of several different lengths from the skip echo state representation. For a skip echo state representation X_i, the echo state segment of length l running from time t to time t + l - 1 is defined as:
X_i^(t:t+l-1) = x_i(t) ⊕ x_i(t+1) ⊕ ... ⊕ x_i(t+l-1)
where X_i^(t:t+l-1) ∈ R^(180×l) is a matrix and ⊕ is the concatenation operation. Therefore, given X_i, sliding with stride 1 yields the set Z_i of the 121 - l + 1 echo state segments of length l, defined as:
Z_i = {X_i^(1:l), X_i^(2:l+1), ..., X_i^(121-l+1:121)}
S2.2. Define the weight of the k-th filter of height l in the i-th convolutional layer as W_(i,k)^l ∈ R^(180×l), where k = 1, ..., 54; the number of filters of each height is 54. The convolution result c_(i,k)^l of this filter is then:
c_(i,k)^l = (c_(i,k)^l(1), ..., c_(i,k)^l(121-l+1))
where c_(i,k)^l(t) is the result of applying the filter W_(i,k)^l to the t-th length-l echo state segment X_i^(t:t+l-1), computed as:
c_(i,k)^l(t) = f(W_(i,k)^l * X_i^(t:t+l-1) + b_(i,k)^l)
where the activation function used here is the ReLU function, * is the element-wise product (summed over all entries), and b_(i,k)^l is the corresponding bias term.
S2.3. The pooling layer uses max pooling, so the pooled result p_(i,k)^l corresponding to the convolution result c_(i,k)^l is:
p_(i,k)^l = max_t c_(i,k)^l(t)
S2.4. All pooled results are assembled into a one-dimensional vector p_i, defined as:
p_i = (p_(i,1)^(l_1), ..., p_(i,54)^(l_1), ..., p_(i,1)^(l_3), ..., p_(i,54)^(l_3))
where j = 1, ..., 3 and, for each l_j, k = 1, ..., 54. Therefore p_i ∈ R^162 is the feature of the input time series learned from the i-th skip echo state representation X_i by the i-th convolution-and-pooling decoder.
S2.5. Use one fully connected layer to fuse all the learned features of different time scales. Its input is constructed by concatenating all the learned features p_i into one long vector p, defined as:
p = p_1 ⊕ p_2 ⊕ p_3 ⊕ p_4
where ⊕ is the concatenation operation. Its output is the multi-time-scale feature vector of the input time series, computed as:
F_MTS = ReLU(W_fus p + b_fus)
where W_fus and b_fus are the connection weight and the corresponding bias term of the fully connected layer; the activation function used here is the ReLU function.
Step S3. Use a Softmax layer as the classifier, producing the classification result from the multi-time-scale feature of the input time series obtained in step S2.
The input of the Softmax layer is the multi-time-scale feature vector F_MTS of the time series U, and its output is the conditional distribution over class labels, i.e. the classification result, defined as:
P(C_s | F_MTS) = exp(W_s F_MTS) / Σ_(s'=1)^(27) exp(W_s' F_MTS)
where C_s is the s-th class of the time series, the number of classes here is 27, and W_s is the connection weight of the Softmax layer for class s.
Step S4. Use the negative log-likelihood function as the loss function, and train the model with back-propagation and the gradient optimizer Adam.
The negative log-likelihood function is used as the loss function of the model, defined as:
L = - Σ_(n=1)^(431) Σ_(s=1)^(27) δ(r_n, C_s) log P(C_s | U_n)
where U_n is the n-th training sample, C_s is the s-th class of the time series, the number of training samples here is 431, the number of classes is 27, δ(·) is the Kronecker delta function, and r_n is the true class label of sample U_n. The model is finally trained with back-propagation and the gradient optimizer Adam, and achieves an action recognition accuracy of 96.74% on the 430 test samples.
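The embodiment's hyperparameters imply the following sizes; a small sanity-check sketch of that arithmetic (derived from the numbers stated above, no model training involved):

```python
# Sizes implied by the embodiment: 4 skip reservoirs of 180 units,
# exponential base B = 2, 3 filter heights with 54 filters each.
M, N, B = 4, 180, 2
skip_lengths = [B ** (i - 1) for i in range(1, M + 1)]  # d_i = 2^(i-1)
J, K = 3, 54
p_i_len = J * K        # features per skip reservoir after max pooling
p_len = M * p_i_len    # length of the concatenated fusion input p

print(skip_lengths)    # [1, 2, 4, 8]
print(p_i_len, p_len)  # 162 648
```

So each reservoir contributes a 162-dimensional feature vector p_i, and the fully connected fusion layer receives a 648-dimensional input p.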
The above embodiment is a preferred embodiment of the present invention, but embodiments of the present invention are not limited by it. Any other change, modification, substitution, combination, or simplification made without departing from the spirit and principles of the present invention shall be an equivalent substitution and is included within the protection scope of the present invention.

Claims (5)

1. A multivariate time series classification method based on a multi-time-scale echo state network, characterized in that the multivariate time series classification method comprises the following steps:
S1. using M skip reservoirs with different temporal skip-connection lengths as the encoder of the multivariate time series, so that each skip reservoir learns temporal dependencies of a different scale in the input time series, producing a multi-time-scale echo state representation;
S2. using M convolution-and-pooling layers and one fully connected layer as the decoder of the multi-time-scale echo state representation, wherein the convolution-and-pooling layers learn its high-dimensional complex features and the fully connected layer fuses the learned features of different time scales into a multi-time-scale feature of the input time series;
S3. using a Softmax layer as the classifier, producing the classification result from the multi-time-scale feature of the input time series;
S4. training the model with the negative log-likelihood loss function using back-propagation and the gradient optimizer Adam.
2. The multivariate time series classification method based on a multi-time-scale echo state network according to claim 1, characterized in that the step S1 comprises:
S1.1. given a D-dimensional multivariate input time series:
U = (u(1), ..., u(t), ..., u(T))^T
where u(t) ∈ R^D and T is the length of the time series, letting the initial state of each skip reservoir be x_i(0) = 0 ∈ R^N, where N is the number of reservoir units, the update equation of the echo state x_i(t) of the i-th skip reservoir at time t being:
x_i(t) = f(W_i^in u(t) + W_i^res x_i(t - d_i))
where W_i^in is the connection weight between the reservoir units and the input units of the i-th skip reservoir, W_i^res is the recurrent connection weight among the reservoir units of the i-th skip reservoir, both W_i^in and W_i^res are randomly initialized and left untrained, f is the activation function of the skip reservoir, and d_i is the skip length of the i-th skip reservoir, set for each skip reservoir in an exponentially increasing manner as follows:
d_i = B^(i-1), i = 1, ..., M
where B ∈ N, B > 1, the skip reservoirs with different skip lengths learning temporal dependencies of different scales in the input time series;
S1.2. since an updated echo state is obtained at every time step, collecting the echo states of the i-th skip reservoir at all time steps in order to construct a skip echo state representation X_i of the input time series U, defined as:
X_i = (x_i(1), ..., x_i(t), ..., x_i(T)) ∈ R^(N×T)
where x_i(t) = f(W_i^in u(t) + W_i^res x_i(t - d_i)) is the echo state update operation and x_i^n(t) is the activation of the n-th reservoir unit of the i-th skip reservoir at time t, the skip echo state representations produced by the different skip reservoirs encoding the dynamic information of the input time series at different time scales; finally, the M different skip echo state representations are assembled into a set X, which serves as the multi-time-scale echo state representation of the input time series, defined as:
X = {X_1, ..., X_i, ..., X_M}
3. The multivariate time series classification method based on a multi-time-scale echo state network according to claim 1, characterized in that the step S2 comprises:
S2.1. given a multi-time-scale echo state representation X = {X_1, ..., X_M}, using M different convolution-and-pooling layers to decode each skip echo state representation X_i separately, wherein each convolutional layer uses filters of J different heights to extract information of several different lengths from the skip echo state representation; for a skip echo state representation X_i, the echo state segment of length l running from time t to time t + l - 1 is defined as:
X_i^(t:t+l-1) = x_i(t) ⊕ x_i(t+1) ⊕ ... ⊕ x_i(t+l-1)
where X_i^(t:t+l-1) ∈ R^(N×l) is a matrix and ⊕ is the concatenation operation; therefore, given X_i, sliding with stride 1 yields the set Z_i of the T - l + 1 echo state segments of length l, defined as:
Z_i = {X_i^(1:l), X_i^(2:l+1), ..., X_i^(T-l+1:T)}
S2.2. defining the weight of the k-th filter of height l in the i-th convolutional layer as W_(i,k)^l ∈ R^(N×l), where K is the number of filters of each height, the convolution result c_(i,k)^l of this filter being:
c_(i,k)^l = (c_(i,k)^l(1), ..., c_(i,k)^l(T-l+1))
where c_(i,k)^l(t) is the result of applying the filter W_(i,k)^l to the t-th length-l echo state segment X_i^(t:t+l-1), computed as:
c_(i,k)^l(t) = f(W_(i,k)^l * X_i^(t:t+l-1) + b_(i,k)^l)
where f is the activation function, * is the element-wise product (summed over all entries), and b_(i,k)^l is the corresponding bias term;
S2.3. the pooling layer using max pooling, the pooled result p_(i,k)^l corresponding to the convolution result c_(i,k)^l being:
p_(i,k)^l = max_t c_(i,k)^l(t)
S2.4. assembling all pooled results p_(i,k)^(l_j) into a one-dimensional vector p_i, defined as:
p_i = (p_(i,1)^(l_1), ..., p_(i,K)^(l_1), ..., p_(i,1)^(l_J), ..., p_(i,K)^(l_J))
where j = 1, ..., J and, for each l_j, k = 1, ..., K, so that p_i ∈ R^(J·K) is the feature of the input time series learned from the i-th skip echo state representation X_i by the i-th convolution-and-pooling decoder;
S2.5. using one fully connected layer to fuse all the learned features of different time scales, its input being constructed by concatenating all the learned features p_i into one long vector p, defined as:
p = p_1 ⊕ p_2 ⊕ ... ⊕ p_M
where ⊕ is the concatenation operation, and its output being the multi-time-scale feature vector of the input time series, computed as:
F_MTS = f(W_fus p + b_fus)
where W_fus and b_fus are the connection weight and the corresponding bias term of the fully connected layer, and f is the activation function.
4. The multivariate time series classification method based on a multi-time-scale echo state network according to claim 1, characterized in that the step S3 proceeds as follows:
the input of the Softmax layer is the multi-time-scale feature vector F_MTS of the time series U, and its output is the conditional distribution over class labels, i.e. the classification result, defined as:
P(C_s | F_MTS) = exp(W_s F_MTS) / Σ_(s'=1)^(N_c) exp(W_s' F_MTS)
where C_s is the s-th class of the time series, N_c is the number of classes, and W_s is the connection weight of the Softmax layer for class s.
5. The multivariate time series classification method based on a multi-time-scale echo state network according to claim 1, characterized in that the step S4 proceeds as follows:
the negative log-likelihood function is used as the loss function of the model, defined as:
L = - Σ_(n=1)^(N_s) Σ_(s=1)^(N_c) δ(r_n, C_s) log P(C_s | U_n)
where U_n is the n-th training sample, C_s is the s-th class of the time series, N_s is the number of training samples, N_c is the number of classes, δ(·) is the Kronecker delta function, and r_n is the true class label of sample U_n.
CN201810440942.XA 2018-05-10 2018-05-10 Multivariate time sequence classification method based on multi-time scale echo state network Active CN108830295B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810440942.XA CN108830295B (en) 2018-05-10 2018-05-10 Multivariate time sequence classification method based on multi-time scale echo state network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810440942.XA CN108830295B (en) 2018-05-10 2018-05-10 Multivariate time sequence classification method based on multi-time scale echo state network

Publications (2)

Publication Number Publication Date
CN108830295A true CN108830295A (en) 2018-11-16
CN108830295B CN108830295B (en) 2020-09-22

Family

ID=64148675

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810440942.XA Active CN108830295B (en) 2018-05-10 2018-05-10 Multivariate time sequence classification method based on multi-time scale echo state network

Country Status (1)

Country Link
CN (1) CN108830295B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110309979A * 2019-07-09 2019-10-08 南方电网科学研究院有限责任公司 Electric load forecasting method, device, and equipment based on an echo state network
CN112732907A (en) * 2020-12-28 2021-04-30 华南理工大学 Financial public opinion analysis method based on multi-scale recurrent neural network
CN112798967A (en) * 2020-12-04 2021-05-14 电子科技大学 Long-term and short-term prediction method for solid oxide fuel cell
CN110503130B (en) * 2019-07-19 2021-11-30 西安邮电大学 Present survey image classification method based on feature fusion
CN113723442A (en) * 2021-07-08 2021-11-30 华中科技大学 Electronic nose gas identification method and system, electronic equipment and storage medium
CN117828407A (en) * 2024-03-04 2024-04-05 江西师范大学 Double-stage gating attention time sequence classification method and system for bidirectional jump storage

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107506740A * 2017-09-04 2017-12-22 北京航空航天大学 Human action recognition method based on a three-dimensional convolutional neural network and a transfer learning model
CN107506712A * 2017-08-15 2017-12-22 成都考拉悠然科技有限公司 Human behavior recognition method based on 3D deep convolutional networks

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107506712A * 2017-08-15 2017-12-22 成都考拉悠然科技有限公司 Human behavior recognition method based on 3D deep convolutional networks
CN107506740A * 2017-09-04 2017-12-22 北京航空航天大学 Human action recognition method based on a three-dimensional convolutional neural network and a transfer learning model

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
QIANLI MA ET AL.: "WALKINGWALKing walking: Action Recognition from Action Echoes", Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence (IJCAI-17) *
SUMETH YUENYONG ET AL.: "Evolutionary pre-training for CRJ-type reservoir of echo state networks", Neurocomputing *
WANG XINYING: "Research on multivariate time series prediction methods based on random mapping neural networks", China Doctoral Dissertations Full-text Database, Information Science and Technology Series *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110309979A (en) * 2019-07-09 2019-10-08 南方电网科学研究院有限责任公司 Electric load forecasting method, device and equipment based on echo state network
CN110503130B (en) * 2019-07-19 2021-11-30 西安邮电大学 Crime scene investigation image classification method based on feature fusion
CN112798967A (en) * 2020-12-04 2021-05-14 电子科技大学 Long-term and short-term prediction method for solid oxide fuel cell
CN112732907A (en) * 2020-12-28 2021-04-30 华南理工大学 Financial public opinion analysis method based on multi-scale recurrent neural network
CN112732907B (en) * 2020-12-28 2022-06-10 华南理工大学 Financial public opinion analysis method based on multi-scale recurrent neural network
CN113723442A (en) * 2021-07-08 2021-11-30 华中科技大学 Electronic nose gas identification method and system, electronic equipment and storage medium
CN113723442B (en) * 2021-07-08 2024-02-20 华中科技大学 Electronic nose gas identification method, system, electronic equipment and storage medium
CN117828407A (en) * 2024-03-04 2024-04-05 江西师范大学 Two-stage gated attention time series classification method and system with bidirectional skip storage
CN117828407B (en) * 2024-03-04 2024-05-14 江西师范大学 Two-stage gated attention time series classification method and system with bidirectional skip storage

Also Published As

Publication number Publication date
CN108830295B (en) 2020-09-22

Similar Documents

Publication Publication Date Title
CN108830295A (en) Multivariate time series classification method based on multi-time-scale echo state network
CN108629687B (en) Anti-money laundering method, device and equipment
US7801924B2 (en) Decision tree construction via frequent predictive itemsets and best attribute splits
CN107480726A (en) Scene semantic segmentation method based on fully convolutional networks and long short-term memory units
CN108647226B (en) Hybrid recommendation method based on variational autoencoder
CN110490320B (en) Deep neural network structure optimization method based on fusion of prediction mechanism and genetic algorithm
CN109919188A (en) Time series classification method based on sparse local attention mechanism and convolutional echo state network
CN111695527A (en) Mongolian online handwriting recognition method
KR20200032258A (en) Finding k extreme values in constant processing time
CN107665248A (en) Text classification method and device based on deep learning hybrid model
JP2018132969A (en) Sentence preparation device
CN111400494B (en) Sentiment analysis method based on GCN-Attention
Rajamohana et al. An effective hybrid cuckoo search with harmony search for review spam detection
CN116643989A (en) Defect prediction method using graph structure for deep semantic understanding
CN113836896A (en) Patent text abstract generation method and device based on deep learning
CN111882042A (en) Automatic searching method, system and medium for neural network architecture of liquid state machine
Ratajczak et al. Sum-product networks for structured prediction: Context-specific deep conditional random fields
CN111241271B (en) Text sentiment classification method, device and electronic equipment
Andrews et al. Fast scalable and accurate discovery of DAGs using the best order score search and grow-shrink trees
CN117151222B (en) Domain-knowledge-guided emergency case entity attribute and relation extraction method, electronic equipment and storage medium
Kilimci et al. Sentiment analysis based churn prediction in mobile games using word embedding models and deep learning algorithms
CN116757773A (en) Clothing electronic commerce sales management system and method thereof
CN112950019B (en) Sentiment classification method for electricity retailer evaluations based on joint attention mechanism
CN112686306B (en) ICD operation classification automatic matching method and system based on graph neural network
CN113850185A (en) Multi-classification method, device, terminal and storage medium for underground acoustic emission source

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant