CN111475546A - Financial time-series prediction method based on a dual-stage attention mechanism generative adversarial network - Google Patents
Financial time-series prediction method based on a dual-stage attention mechanism generative adversarial network
- Publication number
- CN111475546A (publication number); CN202010275757.7A (application number)
- Authority
- CN
- China
- Prior art keywords
- data
- model
- attention mechanism
- financial time
- stage
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06F16/2474 — Information retrieval; special types of queries; sequence data queries, e.g. querying versioned data
- G06N3/044 — Neural networks; recurrent networks, e.g. Hopfield networks
- G06N3/045 — Neural networks; combinations of networks
- G06N3/048 — Neural networks; activation functions
- G06Q40/06 — Finance; asset management; financial planning or analysis
Abstract
The invention discloses a financial time-series prediction method based on a generative adversarial network (GAN) with a dual-stage attention mechanism, belonging to the field of financial time-series prediction, and comprising the following steps: acquiring raw financial time-series data and preprocessing it; introducing an input attention mechanism and a temporal attention mechanism into the generative model of a GAN to construct a dual-stage attention mechanism GAN model; feeding the training-set data into the dual-stage attention mechanism GAN model in sequence to train it, obtaining a trained model; and feeding the test data into the trained model in sequence to obtain the prediction result for the financial time series. The input attention mechanism introduced into the generative model weights the input features; the temporal attention mechanism emphasizes the moments with a large influence on the prediction result and captures the long-term dependencies of the financial time series. Introducing the two attention mechanisms into the generative model improves the prediction accuracy of the model.
Description
Technical Field
The invention relates to the field of financial time-series prediction, and in particular to a financial time-series prediction method based on a generative adversarial network with a dual-stage attention mechanism.
Background
Financial time-series prediction estimates the future development of financial data from its historical patterns and trends. A financial time series is a kind of time series, and its prediction is of great significance to governments, investment institutions and investors; how to extract features and improve prediction accuracy has long been a research hotspot in both the financial and computing fields.
Conventional financial time-series prediction techniques mainly include the autoregressive moving-average model, the autoregressive conditional heteroskedasticity model and multiple linear regression. These methods are theoretically mature and relatively simple to apply, but they cannot handle nonlinear fitting well, because they are linear analysis methods and cannot capture the nonlinear relationships among the data.
Generative adversarial networks have been validated in many domains, but they have rarely been applied to financial time-series prediction. The main difficulties are that the input features are hard to select adaptively and the long-term dependencies of a financial time series are hard to capture.
Disclosure of Invention
To address the problems in the prior art, the invention discloses a financial time-series prediction method based on a generative adversarial network with a dual-stage attention mechanism, comprising the following steps:
S1, acquiring raw financial time-series data, preprocessing it to obtain preprocessed financial time-series data, and dividing the preprocessed data proportionally into training-set data and test-set data;
S2, introducing an input attention mechanism and a temporal attention mechanism into the generative model of a generative adversarial network to construct a dual-stage attention mechanism GAN model;
S3, feeding the training-set data into the dual-stage attention mechanism GAN model in sequence to train it, obtaining a trained dual-stage attention mechanism GAN model;
S4, feeding the test data into the trained dual-stage attention mechanism GAN model in sequence to obtain the prediction result for the financial time series.
Further, the financial time-series data are preprocessed as follows:
S1-1, using third-party software to check whether the raw financial time-series data contain missing values; if no data are missing, the data are treated as complete financial time-series data and S1-2 is performed; if data are missing, the missing values are filled with the average over a fixed period before the date of the missing data to obtain complete financial time-series data, and S1-2 is performed;
S1-2, normalizing the complete financial time-series data to obtain normalized financial time-series data.
Further, the normalization formula is:
X* = (X − Xmin) / (Xmax − Xmin)
where X* is the normalized data, X is the preprocessed data, and Xmin and Xmax are the minimum and maximum of the preprocessed data.
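As a minimal sketch of this min-max normalization (an illustration, not the patented implementation; the function name is ours, and the constant-series guard is an added assumption to avoid division by zero):

```python
def min_max_normalize(values):
    """Min-max normalization: X* = (X - X_min) / (X_max - X_min)."""
    x_min, x_max = min(values), max(values)
    span = x_max - x_min
    if span == 0:
        # Constant series: map every value to 0 to avoid dividing by zero.
        return [0.0 for _ in values]
    return [(v - x_min) / span for v in values]
```

Each output value lands in [0, 1], so input factors with very different magnitudes contribute on a comparable scale during training.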
Further, the dual-stage attention mechanism GAN model is constructed as follows:
S2-1, introducing an input attention mechanism before the encoder and a temporal attention mechanism before the decoder to construct a generative model based on an LSTM encoder-decoder;
S2-2, constructing a discriminative model from several convolutional layers and fully connected layers;
S2-3, the generative model uses the sum of the Wasserstein distance from WGAN and the mean squared error between the real and predicted data as its objective function, and the discriminative model uses the WGAN discriminator objective as its objective function.
Further, the dual-stage attention mechanism GAN model is trained as follows:
S1, setting the parameters of the dual-stage attention mechanism GAN model;
S2, feeding the historical data in the training set into the input attention mechanism to obtain the attention-weighted data;
S3, feeding the attention-weighted data into the encoder to obtain the encoded data;
S4, feeding the encoded data into the temporal attention mechanism to obtain the temporally weighted data;
S5, feeding the temporally weighted data into the decoder to obtain the predicted value;
S6, feeding the predicted value and the real value from the training set into the discriminative model of the dual-stage attention mechanism GAN model to obtain discrimination information;
S7, feeding the discrimination information back to the generative and discriminative models to adjust their parameters and updating both models; if the updated models have not converged, return to S2 and continue the adversarial training; if they have converged, training is finished.
Owing to the above technical scheme, in the financial time-series prediction method based on a dual-stage attention mechanism generative adversarial network provided by the invention, the input attention mechanism introduced into the generative model weights the input features, emphasizing the features with a large influence on subsequent analysis and extracting the input features adaptively; the temporal attention mechanism introduced into the generative model weights the different time steps, emphasizing the moments with a large influence on the prediction result and capturing the long-term dependencies of the financial time series. Introducing the two attention mechanisms into the generative model improves its prediction accuracy.
Drawings
To illustrate the embodiments of the present application or the prior-art technical solutions more clearly, the drawings needed in their description are briefly introduced below. Obviously, the following drawings show only some embodiments described in the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flowchart illustrating the overall design of the financial time series prediction method of the present invention;
FIG. 2 is a diagram of a generative model of the financial time series prediction method of the present invention;
FIG. 3 is a diagram of a discriminant model architecture for a financial time series prediction method according to the present invention.
Detailed Description
To make the technical solutions and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings:
FIG. 1 is the overall design flowchart of the financial time-series prediction method of the present invention. A financial time-series prediction method based on a generative adversarial network with a dual-stage attention mechanism comprises the following steps:
S1, acquiring the CSI 300 (Shanghai-Shenzhen 300) index from 2002 to 2019 as raw financial time-series data, preprocessing it, and using two thirds of the preprocessed financial time-series data as training-set data and the remaining third as test-set data;
S2, introducing an input attention mechanism and a temporal attention mechanism into the generative model of a generative adversarial network to construct a dual-stage attention mechanism GAN model;
S3, feeding the training-set data into the dual-stage attention mechanism GAN model in sequence to train it, obtaining a trained dual-stage attention mechanism GAN model;
S4, feeding the test data into the trained dual-stage attention mechanism GAN model in sequence to obtain the prediction result for the financial time series.
Further, the financial time-series data are preprocessed as follows:
S1-1, using third-party software to check whether the raw financial time-series data contain missing values; if no data are missing, the data are treated as complete financial time-series data and S1-2 is performed; if data are missing, the missing values are filled with the average over the five days before the date of the missing data to obtain complete financial time-series data, and S1-2 is performed;
S1-2, to prevent the differing magnitudes of the raw input factors from affecting the training process, the complete financial time-series data are normalized as
X* = (X − Xmin) / (Xmax − Xmin)
where X* is the normalized data, X is the data obtained in S1-1, and Xmin and Xmax are the minimum and maximum of the data obtained in S1-1.
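The five-day mean imputation of S1-1 can be sketched as follows, with `None` marking a missing trading day; the function name is illustrative, and the sketch assumes at least one earlier value exists before the first gap:

```python
def fill_missing(series, window=5):
    """Replace each missing value (None) with the mean of up to `window`
    preceding, already-filled values, as in step S1-1."""
    filled = []
    for value in series:
        if value is None:
            prev = filled[-window:]
            # Assumption: the first gap has at least one preceding value.
            value = sum(prev) / len(prev)
        filled.append(value)
    return filled
```

Because gaps are filled left to right, a run of consecutive missing days propagates the earlier values forward rather than failing.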
Further, the training-set and test-set data are constructed as follows: the first two thirds of the preprocessed financial time-series data form the training set, which comprises the historical data X = {x1, x2, ..., xT} fed into the generative model of the dual-stage attention mechanism GAN, and the prediction targets Y = {yT+1, yT+2, ..., ym}, which serve as the real data fed into the discriminative model, where T is the length of the time window and n is the number of input features. The remaining third of the preprocessed financial time-series data forms the test set, in the same form as the historical data X.
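Slicing the series into history windows X with next-step targets Y, and taking the two-thirds/one-third split, can be sketched as below (function names are ours; the patent does not prescribe this exact slicing code):

```python
def make_windows(series, t_len):
    """Slide a window of length t_len over the series; each window is a
    history X and the value that follows it is the target y."""
    xs, ys = [], []
    for i in range(len(series) - t_len):
        xs.append(series[i:i + t_len])
        ys.append(series[i + t_len])
    return xs, ys

def split_train_test(xs, ys, train_frac=2 / 3):
    """First two thirds of the windows for training, the rest for testing."""
    cut = int(len(xs) * train_frac)
    return (xs[:cut], ys[:cut]), (xs[cut:], ys[cut:])
```

The split is taken chronologically (no shuffling), so the test set always lies after the training set in time, as required for time-series evaluation.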
Further, the dual-stage attention mechanism GAN model is constructed as follows:
S2-1, introducing an input attention mechanism before the encoder and a temporal attention mechanism before the decoder to construct a generative model based on an LSTM encoder-decoder;
S2-2, constructing a discriminative model from several convolutional layers and fully connected layers;
S2-3, the objective function of the generative model is the sum of the Wasserstein distance from WGAN and the mean squared error between the real and predicted data, and the objective function of the discriminative model is the same as the WGAN discriminator objective.
FIG. 2 shows the structure of the generative model of the financial time-series prediction method. The generative network introduces an input attention mechanism and a temporal attention mechanism on top of an LSTM-based encoder-decoder model. The encoder is a 3-layer LSTM network with 20 neurons per layer and tanh activation, and the decoder is likewise a 3-layer LSTM network with 20 neurons per layer and tanh activation.
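A minimal Keras sketch of the encoder and decoder stacks described above (3-layer LSTM, 20 units, tanh), assuming TensorFlow 2.x; the attention layers sitting before each stack are omitted here, and the helper name is illustrative, not from the patent:

```python
import tensorflow as tf

def lstm_stack():
    """Three stacked LSTM layers, 20 neurons each, tanh activation,
    as described for both the encoder and the decoder."""
    return tf.keras.Sequential([
        tf.keras.layers.LSTM(20, activation="tanh", return_sequences=True),
        tf.keras.layers.LSTM(20, activation="tanh", return_sequences=True),
        tf.keras.layers.LSTM(20, activation="tanh", return_sequences=True),
    ])

encoder = lstm_stack()  # input attention is applied before this stack
decoder = lstm_stack()  # temporal attention is applied before this stack
```

`return_sequences=True` keeps the per-time-step hidden states, which the temporal attention mechanism needs from the encoder.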
The input attention mechanism is computed as
e_t^k = v_e^T tanh(W_e [h_{t-1}; c_{t-1}] + W_β x^k)
where h_{t-1} is the hidden state and c_{t-1} the cell state of the encoder LSTM unit at the previous time step, and v_e, W_e and W_β are parameters to be learned. The scores are then normalized with a softmax function:
α_t^k = exp(e_t^k) / Σ_{j=1}^n exp(e_t^j)
where α_t^k is the input attention weight of the k-th input feature at time t.
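The softmax weighting over the n input features can be illustrated in plain Python. This is a toy sketch: the scores stand in for the e_t^k produced by the learned attention network, and the function names are ours:

```python
import math

def softmax(scores):
    """Numerically stable softmax, as used to normalize attention scores."""
    peak = max(scores)
    exps = [math.exp(s - peak) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def apply_input_attention(x_t, scores):
    """Scale each input feature x_t^k by its attention weight alpha_t^k."""
    alphas = softmax(scores)
    return [a * x for a, x in zip(alphas, x_t)]
```

Equal scores yield equal weights 1/n, so the weighted input reduces to a uniform scaling; unequal scores shift weight toward the features the attention network deems more informative.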
The temporal attention mechanism is computed as
l_t^i = v_d^T tanh(W_d s_{t-1} + U_d h_i)
where h_i ∈ {h_1, h_2, ..., h_T} are the encoder hidden states at each time step, s_{t-1} is the cell state of the decoder LSTM at the previous time step, and v_d, W_d and U_d are parameters to be learned. The scores are again normalized with a softmax function, as in equation (5):
β_t^i = exp(l_t^i) / Σ_{j=1}^T exp(l_t^j)
where the temporal attention weight β_t^i represents the importance of the encoder hidden state at time i to the prediction result, which is the output of the generative model.
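The temporal attention step — softmax over the T encoder hidden states followed by a weighted sum into a context vector — can be sketched as below; the scores are placeholders for the learned l_t^i, and the function name is illustrative:

```python
import math

def temporal_context(hidden_states, scores):
    """Weight the encoder hidden states h_1..h_T by softmax(scores) and
    sum them into a single context vector for the decoder."""
    peak = max(scores)
    exps = [math.exp(s - peak) for s in scores]
    total = sum(exps)
    betas = [e / total for e in exps]
    dim = len(hidden_states[0])
    return [sum(b * h[j] for b, h in zip(betas, hidden_states))
            for j in range(dim)]
```

A time step with a high score contributes more of its hidden state to the context vector, which is how long-range dependencies are pulled into the prediction.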
FIG. 3 shows the structure of the discriminative model of the financial time-series prediction method. The discriminative model consists of convolutional layers, a pooling layer and fully connected layers, with LeakyReLU activation; the discrimination loss is obtained through the fully connected layer and a sigmoid function.
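A hedged Keras sketch of such a discriminator, assuming TensorFlow 2.x; the filter counts, kernel sizes and dense width here are assumptions, not values from the patent:

```python
import tensorflow as tf

def build_discriminator():
    """Conv + pooling + fully connected stack with LeakyReLU activations
    and a final sigmoid, mirroring the structure of FIG. 3."""
    return tf.keras.Sequential([
        tf.keras.layers.Conv1D(32, 3, padding="same"),  # assumed 32 filters
        tf.keras.layers.LeakyReLU(),
        tf.keras.layers.MaxPool1D(2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64),                      # assumed width 64
        tf.keras.layers.LeakyReLU(),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
```

Note that a standard WGAN critic usually omits the final sigmoid and outputs an unbounded score; it is kept here only because the description above mentions one.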
The loss function comprises a generation loss and a discrimination loss. The generation loss is incurred by the historical data passing through the generative model and consists of the adversarial loss and the content loss; the discrimination loss is the difference loss obtained when the real and generated values pass through the discriminative model.
The adversarial loss and the loss of the discriminative network both use the Wasserstein distance of WGAN, and the content loss is the mean squared error L_MSE between the real and predicted values:
L_MSE = (1/N) Σ_{i=1}^N (y_i − G_θ(x_i))²
where y denotes the real sample data and G_θ(·) denotes the output of the generative model, i.e. the predicted value.
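These two terms — a Wasserstein critic-score term plus the MSE content loss — can be combined as in this sketch; the weighting coefficient `lam` and the function names are assumptions for illustration:

```python
def critic_loss(d_real, d_fake):
    """WGAN critic loss: minimize E[D(fake)] - E[D(real)],
    i.e. the negated Wasserstein distance estimate."""
    return sum(d_fake) / len(d_fake) - sum(d_real) / len(d_real)

def generator_loss(d_fake, y_true, y_pred, lam=1.0):
    """Adversarial term -E[D(fake)] plus the MSE content loss L_MSE,
    weighted by an assumed coefficient lam."""
    adversarial = -sum(d_fake) / len(d_fake)
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return adversarial + lam * mse
```

Driving the MSE term to zero anchors the generator to the real series, while the adversarial term pushes its outputs toward the distribution the critic cannot distinguish from real data.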
Further, the dual-stage attention mechanism GAN model is trained as follows:
S3-1, the model is implemented on the TensorFlow deep-learning platform with the Adam optimizer. The encoder and decoder of the generative model each consist of a 3-layer LSTM with 20 neurons per layer, tanh activation and a learning rate of 0.0006; the discriminative model consists of a 3-layer CNN and one fully connected layer, with LeakyReLU activation and a learning rate of 0.0006;
S3-2, setting the parameters of the dual-stage attention mechanism GAN model;
S3-3, feeding the historical data in the training set into the input attention mechanism to obtain the attention-weighted data;
S3-4, feeding the attention-weighted data into the encoder to obtain the encoded data;
S3-5, feeding the encoded data into the temporal attention mechanism to obtain the temporally weighted data;
S3-6, feeding the temporally weighted data into the decoder to obtain the predicted value;
S3-7, feeding the predicted value and the real value from the training set into the discriminative model of the GAN to obtain discrimination information;
S3-8, feeding the discrimination information back to the generative and discriminative models to adjust their parameters and updating both models; if the updated models have not converged, return to S3-3 and continue the adversarial training; if they have converged, training is finished.
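The alternating updates of the steps above can be abstracted as a loop like the following, where `gen_step` and `disc_step` are placeholders for one optimizer update each and the loss-stability convergence check is an assumption (the patent does not specify its criterion):

```python
def adversarial_train(gen_step, disc_step, data, max_epochs=100, tol=1e-6):
    """Alternate discriminator and generator updates until the combined
    loss stops changing, or max_epochs is reached."""
    prev_total = float("inf")
    for _ in range(max_epochs):
        d_loss = disc_step(data)  # score real vs. predicted values
        g_loss = gen_step(data)   # attention, encode, decode, predict
        total = d_loss + g_loss
        if abs(prev_total - total) < tol:  # crude convergence check
            break
        prev_total = total
    return total
```

In practice each step would run one minibatch through the attention-augmented encoder-decoder and the CNN discriminator, applying the Adam updates with the learning rates given in S3-1.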
The above description covers only preferred embodiments of the present invention, but the scope of the present invention is not limited thereto. Any equivalent substitution or modification that a person skilled in the art could readily conceive within the technical scope disclosed herein, according to the technical solution and inventive concept of the present invention, shall fall within the scope of the present invention.
Claims (5)
1. A financial time-series prediction method based on a generative adversarial network with a dual-stage attention mechanism, characterized by comprising the following steps:
S1, acquiring raw financial time-series data, preprocessing it to obtain preprocessed financial time-series data, and dividing the preprocessed data proportionally into training-set data and test-set data;
S2, introducing an input attention mechanism and a temporal attention mechanism into the generative model of a generative adversarial network to construct a dual-stage attention mechanism GAN model;
S3, feeding the training-set data into the dual-stage attention mechanism GAN model in sequence to train it, obtaining a trained dual-stage attention mechanism GAN model;
S4, feeding the test data into the trained dual-stage attention mechanism GAN model in sequence to obtain the prediction result for the financial time series.
2. The financial time-series prediction method based on a dual-stage attention mechanism generative adversarial network according to claim 1, further characterized in that the financial time-series data are preprocessed as follows:
S1-1, using third-party software to check whether the raw financial time-series data contain missing values; if no data are missing, the data are treated as complete financial time-series data and S1-2 is performed; if data are missing, the missing values are filled with the average over a fixed period before the date of the missing data to obtain complete financial time-series data, and S1-2 is performed;
S1-2, normalizing the complete financial time-series data to obtain normalized financial time-series data.
3. The financial time-series prediction method based on a dual-stage attention mechanism generative adversarial network according to claim 2, further characterized in that the normalization formula is:
X* = (X − Xmin) / (Xmax − Xmin)
where X* is the normalized data, X is the preprocessed data, and Xmin and Xmax are the minimum and maximum of the preprocessed data.
4. The financial time-series prediction method based on a dual-stage attention mechanism generative adversarial network according to claim 1, further characterized in that the dual-stage attention mechanism GAN model is constructed as follows:
S2-1, introducing an input attention mechanism before the encoder and a temporal attention mechanism before the decoder to construct a generative model based on an LSTM encoder-decoder;
S2-2, constructing a discriminative model from several convolutional layers and fully connected layers;
S2-3, the generative model uses the sum of the Wasserstein distance from WGAN and the mean squared error between the real and predicted data as its objective function, and the discriminative model uses the WGAN discriminator objective as its objective function.
5. The financial time-series prediction method based on a dual-stage attention mechanism generative adversarial network according to claim 1, further characterized in that the dual-stage attention mechanism GAN model is trained as follows:
S1, setting the parameters of the dual-stage attention mechanism GAN model;
S2, feeding the historical data in the training set into the input attention mechanism to obtain the attention-weighted data;
S3, feeding the attention-weighted data into the encoder to obtain the encoded data;
S4, feeding the encoded data into the temporal attention mechanism to obtain the temporally weighted data;
S5, feeding the temporally weighted data into the decoder to obtain the predicted value;
S6, feeding the predicted value and the real value from the training set into the discriminative model of the dual-stage attention mechanism GAN model to obtain discrimination information;
S7, feeding the discrimination information back to the generative and discriminative models to adjust their parameters and updating both models; if the updated models have not converged, return to S2 and continue the adversarial training; if they have converged, training is finished.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010275757.7A CN111475546A (en) | 2020-04-09 | 2020-04-09 | Financial time-series prediction method based on a dual-stage attention mechanism generative adversarial network |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010275757.7A CN111475546A (en) | 2020-04-09 | 2020-04-09 | Financial time-series prediction method based on a dual-stage attention mechanism generative adversarial network |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111475546A true CN111475546A (en) | 2020-07-31 |
Family
ID=71751687
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010275757.7A Pending CN111475546A (en) | 2020-04-09 | 2020-04-09 | Financial time-series prediction method based on a dual-stage attention mechanism generative adversarial network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111475546A (en) |
- 2020-04-09: application CN202010275757.7A filed in China; published as CN111475546A (status: active, Pending)
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112069397A (en) * | 2020-08-21 | 2020-12-11 | 三峡大学 | Rumor detection method combining self-attention mechanism with generation of confrontation network |
CN112069397B (en) * | 2020-08-21 | 2023-08-04 | 三峡大学 | Rumor detection method combining self-attention mechanism and generation of countermeasure network |
CN112183826A (en) * | 2020-09-15 | 2021-01-05 | 湖北大学 | Building energy consumption prediction method based on deep cascade generation countermeasure network and related product |
CN112508170A (en) * | 2020-11-19 | 2021-03-16 | 中南大学 | Multi-correlation time sequence prediction system and method based on generation countermeasure network |
CN112508273A (en) * | 2020-12-03 | 2021-03-16 | 中国石油大学(华东) | Residual oil prediction method based on generation countermeasure network |
CN112508273B (en) * | 2020-12-03 | 2023-04-07 | 中国石油大学(华东) | Residual oil prediction method based on generation countermeasure network |
CN112926802A (en) * | 2021-04-01 | 2021-06-08 | 重庆邮电大学 | Time series data countermeasure sample generation method and system, electronic device and storage medium |
CN112926802B (en) * | 2021-04-01 | 2023-05-23 | 重庆邮电大学 | Time sequence data countermeasure sample generation method, system, electronic device and storage medium |
CN113468203A (en) * | 2021-04-29 | 2021-10-01 | 华东师范大学 | Financial user image drawing method based on recurrent neural network and attention mechanism |
CN113468203B (en) * | 2021-04-29 | 2022-10-04 | 华东师范大学 | Financial user image drawing method based on recurrent neural network and attention mechanism |
CN115238941A (en) * | 2022-03-09 | 2022-10-25 | 生态环境部华南环境科学研究所 | Surface water quality prediction method based on two-stage attention weight optimization mechanism |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111475546A (en) | Financial time-series prediction method based on a dual-stage attention mechanism generative adversarial network | |
Liu et al. | Remaining useful life prediction using a novel feature-attention-based end-to-end approach | |
CN112508077B (en) | Social media emotion analysis method and system based on multi-modal feature fusion | |
CN109829541A (en) | Deep neural network incremental training method and system based on learning automaton | |
Li et al. | A wind power forecasting method based on optimized decomposition prediction and error correction | |
CN113159389A (en) | Financial time sequence prediction method based on deep forest generation countermeasure network | |
WO2020143253A1 (en) | Method employing sparse autoencoder to cluster power system operation modes | |
CN111178288B (en) | Human body posture recognition method and device based on local error layer-by-layer training | |
Xie et al. | A novel deep belief network and extreme learning machine based performance degradation prediction method for proton exchange membrane fuel cell | |
CN111222992A (en) | Stock price prediction method of long-short term memory neural network based on attention mechanism | |
CN111210089A (en) | Stock price prediction method of gated cyclic unit neural network based on Kalman filtering | |
CN115964932A (en) | Gas prediction method based on EMD-BilSTM-Attention mechanism transformer digital twin model | |
CN114022311A (en) | Comprehensive energy system data compensation method for generating countermeasure network based on time sequence condition | |
Chou et al. | Imaging time-series with features to enable visual recognition of regional energy consumption by bio-inspired optimization of deep learning | |
CN112101473A (en) | Smoke detection algorithm based on small sample learning | |
CN116643949A (en) | Multi-model edge cloud load prediction method and device based on VaDE clustering | |
CN116227560A (en) | Time sequence prediction model and method based on DTW-former | |
Zhang et al. | Zero-small sample classification method with model structure self-optimization and its application in capability evaluation | |
CN112668543B (en) | Isolated word sign language recognition method based on hand model perception | |
CN117290800B (en) | Timing sequence anomaly detection method and system based on hypergraph attention network | |
CN114580262A (en) | Lithium ion battery health state estimation method | |
Deng et al. | Evolutionary neural architecture search for facial expression recognition | |
Mu et al. | Catalyst optimization design based on artificial neural network | |
CN116543289B (en) | Image description method based on encoder-decoder and Bi-LSTM attention model | |
Wang | Analysis of bank credit risk evaluation model based on BP neural network |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20200731 |