CN115936237A - Time series prediction method, time series prediction device, computer equipment and storage medium - Google Patents
- Publication number
- CN115936237A (application number CN202211662382.5A)
- Authority
- CN
- China
- Prior art keywords
- sequence
- unit
- trend
- decomposition
- attention
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y04—INFORMATION OR COMMUNICATION TECHNOLOGIES HAVING AN IMPACT ON OTHER TECHNOLOGY AREAS
- Y04S—SYSTEMS INTEGRATING TECHNOLOGIES RELATED TO POWER NETWORK OPERATION, COMMUNICATION OR INFORMATION TECHNOLOGIES FOR IMPROVING THE ELECTRICAL POWER GENERATION, TRANSMISSION, DISTRIBUTION, MANAGEMENT OR USAGE, i.e. SMART GRIDS
- Y04S10/00—Systems supporting electrical power generation, transmission or distribution
- Y04S10/50—Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications
Abstract
The embodiment of the invention provides a time series prediction method, a time series prediction device, computer equipment and a storage medium, and relates to the technical field of data processing. The method comprises the following steps: acquiring an initial seasonal part and an initial trend part corresponding to a historical time series; inputting the historical time series into the encoder, and obtaining an encoded output result containing seasonal information via the first cross attention unit; inputting the initial seasonal part into the decoder, deriving a predicted seasonal part via the second cross attention unit and the third cross attention unit, and separating out a remaining trend part; and inputting the initial trend part into the decoder and accumulating it with the remaining trend part to obtain a predicted trend part. The time series prediction method provided by the embodiment adopts progressive decomposition and a cross attention mechanism, so that memory traffic and time complexity are reduced. The method can accurately and efficiently predict long time series.
Description
Technical Field
The invention relates to the technical field of data processing, in particular to a time series prediction method, a time series prediction device, computer equipment and a storage medium.
Background
Time series prediction has been widely used in energy consumption, traffic and economic planning, weather and disease transmission prediction. In these practical engineering applications, there is an urgent need to extend the prediction time to the distant future, which is very significant for long-term planning and early warning.
Based on this, existing time series prediction methods generally construct a prediction model from historical data to perform predictive analysis of future data. A long-term time series prediction method in the prior art extracts key information from historical data based on a self-attention mechanism to predict future data, but the self-attention mechanism has the following problems: it is insensitive to local context, easily encounters a memory bottleneck, and has low computational efficiency.
Disclosure of Invention
The object of the present invention includes, for example, providing a time series prediction method, apparatus, computer device and storage medium capable of accurately and efficiently predicting long time series.
Embodiments of the invention may be implemented as follows:
in a first aspect, an embodiment of the present invention provides a time series prediction method applied to a time series prediction architecture, where the time series prediction architecture includes a multi-layer encoder and a multi-layer decoder, the encoder includes a first cross attention unit, and the decoder includes a second cross attention unit and a third cross attention unit, and the method includes:
acquiring an initial season part and an initial trend part corresponding to the historical time sequence;
inputting the historical time series into the encoder, and obtaining an encoding output result containing season information through the first cross attention unit;
inputting the initial seasonal portion to the decoder, deriving a predicted seasonal portion via the second cross attention unit and the third cross attention unit, and separating out a remaining trend portion;
and inputting the initial trend part into the decoder, and accumulating the initial trend part and the residual trend part to obtain a predicted trend part.
In one embodiment, the obtaining of the initial trend part corresponding to the historical time series includes:
carrying out moving average processing on the historical time sequence to obtain a historical trend sequence corresponding to the historical time sequence;
acquiring the last I/2 elements of the historical trend sequence to obtain a segmentation trend sequence;
obtaining the average value of each element of the historical trend sequence;
filling O average values into the segmentation trend sequence, and then obtaining an initial trend part through a concat function; wherein, I is the length of the historical time sequence, and O is the length of the time to be predicted.
In one embodiment, the obtaining of the initial season part corresponding to the historical time series includes:
subtracting the historical trend sequence from the historical time sequence to obtain a historical season sequence corresponding to the historical time sequence;
acquiring the last I/2 elements of the historical seasonal sequence to obtain a segmentation seasonal sequence;
and filling O zero values into the segmentation seasonal sequence, and then obtaining an initial seasonal part through a concat function.
In one embodiment, the encoder further comprises a first sequence decomposition unit, a first feed-forward unit, and a second sequence decomposition unit, wherein the inputting the historical time sequence into the encoder and obtaining an encoded output result containing season information via the first cross attention unit comprises:
the historical time sequence passes through a first cross attention unit to obtain a first attention sequence;
adding the first attention sequence and the historical time sequence, and then passing through a first sequence decomposition unit to obtain a first decomposition result;
enabling the first decomposition result to pass through a first feedforward unit to obtain a first feedforward sequence;
and adding the first feedforward sequence and the first decomposition result, and then passing through a second sequence decomposition unit to obtain the coding output result.
In one embodiment, the decoder further comprises a third sequence decomposition unit, a fourth sequence decomposition unit, a second feed-forward unit, and a fifth sequence decomposition unit; inputting the initial seasonal portion to the decoder, deriving a predicted seasonal portion via the second cross attention unit and the third cross attention unit, and separating out a remaining trend portion, comprising:
passing the initial seasonal portion through the second cross attention unit to obtain a second attention sequence;
adding the second attention sequence and the historical time sequence, and then passing through the third sequence decomposition unit to obtain a second decomposition result and a first residual part;
passing the second decomposition result through the third cross attention unit to obtain a third attention sequence;
adding the third attention sequence and the second decomposition result, and then passing through the fourth sequence decomposition unit to obtain a third decomposition result and a second residual part;
passing the third decomposition result through the second feedforward unit to obtain a second feedforward sequence;
adding the second feedforward sequence and the third decomposition result, and then passing through the fifth sequence decomposition unit to obtain the predicted season part and a third remaining part;
and accumulating the first remaining part, the second remaining part and the third remaining part to obtain the remaining trend part.
In a second aspect, an embodiment of the present application provides an apparatus for time series decomposition, where the apparatus includes:
the decomposition module is used for decomposing the historical time sequence to obtain a season part and a trend part of the historical time sequence;
the first filling module is used for carrying out segmentation filling processing on the seasonal part to obtain seasonal items;
the second filling module is used for carrying out segmentation filling processing on the trend part to obtain a trend item;
an input module for inputting the seasonal item and the trend item into a decoder;
and the acquisition module is used for controlling the decoder to acquire the season information from the season items through a cross attention mechanism and acquiring the trend information from the trend items in an accumulation mode.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a memory and a processor, where the memory is used to store a computer program, and when the computer program runs on the processor, it performs the time series prediction method provided in the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program which, when executed on a processor, performs the time series prediction method provided in the first aspect.
The beneficial effects of the embodiment of the invention include, for example: by adopting a progressive decomposition method, a season prediction part and a trend prediction part are extracted from a complex time sequence, a cross attention mechanism is used, and a split and merge strategy is used in a cross-stage mode, so that the possibility of repetition in the information integration process is effectively reduced, and the learning capacity of the network is improved. The cross-attention mechanism also reduces memory traffic and reduces time complexity.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
Fig. 1 is a schematic flow chart illustrating a time series prediction method provided in an embodiment of the present application;
FIG. 2 is a schematic structural diagram of a time series prediction model provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of a cross attention unit provided in an embodiment of the present application;
fig. 4 shows a schematic structural diagram of a time series prediction apparatus provided in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, as presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
In the description of the present invention, it should be noted that, if the terms "upper", "lower", "inner", "outer", etc. are used to indicate the orientation or positional relationship based on the orientation or positional relationship shown in the drawings or the orientation or positional relationship which the product of the present invention is used to usually place, it is only for convenience of description and simplification of the description, but it is not intended to indicate or imply that the device or element referred to must have a specific orientation, be constructed and operated in a specific orientation, and thus, should not be construed as limiting the present invention.
Furthermore, the appearances of the terms "first," "second," and the like, if any, are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
It should be noted that the features of the embodiments of the present invention may be combined with each other without conflict.
Example 1
Referring to fig. 1, the present embodiment provides a time series prediction method.
Step S110, acquiring an initial season part and an initial trend part corresponding to the historical time sequence;
time series decomposition is a method for time series analysis, and the idea is to decompose data into different factors so as to achieve the purposes of explaining the data, establishing a mathematical model and predicting the data. Due to future agnostic property in the prediction problem, the historical time sequence is decomposed first, and the decomposed residual sequence replaces the original sequence to make prediction, which is helpful for making better prediction. The time series can be divided into three parts: a trending period portion, a seasonal portion, and a remaining portion that contains any other content in the time series.
That is, a typical time series can be expressed as the product of three parts, as in formula 1:

y_t = S_t × T_t × R_t

where y_t is the original time series, S_t denotes the seasonal part, T_t denotes the trend-cycle part, and R_t denotes the remainder part.
However, how to decompose the time series is a problem to be solved. Based on this, the embodiment of the present application proposes a deep decomposition architecture based on a cross attention mechanism, in which sequence decomposition is embedded into the encoder and decoder as an internal unit of the framework. During prediction, the model alternately performs prediction-result optimization and sequence decomposition, i.e., the trend and periodic terms are gradually separated from the hidden variables, thereby realizing progressive decomposition.
A series decomposition unit (series decomposition block) smooths out periodic fluctuations and highlights the long-term trend based on the moving-average idea. For an input series X ∈ R^{L×d} of length L, the process is formula 2:

X_t = AvgPool(Padding(X))
X_s = X − X_t

In the above formulas, X_t is the extracted trend part, which stores the mean of each sliding window and smooths out the short-term fluctuations of the series; X_s is the seasonal part that remains after the trend is subtracted. Padding ensures the sequence length is unchanged, and AvgPool is the moving average. With X the hidden variable to be decomposed and X_t, X_s the trend and seasonal terms respectively, the series decomposition block above can be written as formula 3:

X_t, X_s = SeriesDecomp(X)
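The series decomposition unit of formulas 2-3 can be sketched in a few lines. The following is a minimal NumPy illustration; the kernel size and the replicate-edge padding scheme are assumptions for demonstration, not the patent's exact configuration:

```python
import numpy as np

def series_decomp(x: np.ndarray, kernel: int = 5):
    """Split a series into trend and seasonal parts via moving average
    (formulas 2-3): X_t = AvgPool(Padding(X)), X_s = X - X_t."""
    # Padding(X): replicate edge values so the output keeps length L
    left = kernel // 2
    right = kernel - 1 - left
    padded = np.concatenate([np.repeat(x[:1], left), x,
                             np.repeat(x[-1:], right)])
    # AvgPool: moving average over the padded series -> trend term X_t
    trend = np.convolve(padded, np.ones(kernel) / kernel, mode="valid")
    seasonal = x - trend  # X_s = X - X_t
    return trend, seasonal
```

On a purely linear series the interior of the trend term reproduces the series itself, and trend plus seasonal always reconstructs the input exactly.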
the basic principle of time-series decomposition and the operation of the sequence decomposition unit are described above, and based on this, the embodiment of the present application provides a time-series decomposition method.
Specifically, in an embodiment, the acquiring an initial trend part corresponding to the historical time series includes: carrying out moving average processing on the historical time sequence to obtain a historical trend sequence corresponding to the historical time sequence; acquiring the last I/2 elements of the historical trend sequence to obtain a segmentation trend sequence; obtaining the average value of each element of the historical trend sequence; filling O average values into the segmentation trend sequence, and then obtaining an initial trend part through a concat function; wherein, I is the length of the historical time sequence, and O is the length of the time to be predicted.
The acquiring of the initial season part corresponding to the historical time sequence comprises:
subtracting the historical trend sequence from the historical time sequence to obtain a historical season sequence corresponding to the historical time sequence; acquiring the last I/2 elements of the historical seasonal sequence to obtain a segmented seasonal sequence; and filling O zero values into the segmentation seasonal sequence, and then obtaining an initial seasonal part through a concat function.
The historical time series first needs to be preprocessed to get input to the encoder and decoder. Specifically, the historical trend sequence and the historical season sequence obtained by decomposing the initial historical time sequence need to be segmented, and if the length of the initial historical time sequence is I, the second half of the initial time sequence, namely the length of I/2, needs to be reserved.
Then the segmented trend sequence and seasonal sequence are filled according to the prediction purpose, where the number of filled elements is the length O of the future to be predicted. The filling method of the initial seasonal part is given by formula 4:

X_des = Concat(X_ens, X_0)

where X_en denotes the historical time series, X_ens denotes the segmented seasonal sequence, and X_0 denotes zero-valued placeholders, of which there are O. X_des is the required initial seasonal part, obtained by concatenating X_ens and X_0 via the concat function.
The initial trend part is then obtained by formula 5:

X_det = Concat(X_ent, X_Mean)

where X_ent denotes the segmented trend sequence and X_Mean denotes O placeholders filled with the mean value of X_en.
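The segmentation and filling of formulas 4-5 can be illustrated as follows. The moving-average decomposition, kernel size, and helper names are assumptions for demonstration:

```python
import numpy as np

def _moving_avg(x, kernel=5):
    # replicate-edge padding keeps the output the same length as x
    left = kernel // 2
    pad = np.concatenate([np.repeat(x[:1], left), x,
                          np.repeat(x[-1:], kernel - 1 - left)])
    return np.convolve(pad, np.ones(kernel) / kernel, mode="valid")

def init_decoder_inputs(x_en, O, kernel=5):
    """Build the decoder's initial seasonal (X_des) and trend (X_det)
    inputs per formulas 4-5."""
    x_en = np.asarray(x_en, dtype=float)
    I = len(x_en)
    trend = _moving_avg(x_en, kernel)      # historical trend sequence
    seasonal = x_en - trend                # historical seasonal sequence
    # keep the last I/2 elements of each decomposed series
    x_ent, x_ens = trend[I // 2:], seasonal[I // 2:]
    # seasonal init: pad O zeros; trend init: pad O copies of mean(X_en)
    x_des = np.concatenate([x_ens, np.zeros(O)])               # formula 4
    x_det = np.concatenate([x_ent, np.full(O, x_en.mean())])   # formula 5
    return x_des, x_det
```

Both outputs have length I − I/2 + O, matching the decoder's expected input length.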
Step S120, inputting the historical time sequence into the encoder, and obtaining an encoding output result containing season information through the first cross attention unit;
specifically, please refer to fig. 2, wherein fig. 2 illustrates a structural diagram of the time series decomposition model provided in the embodiment of the present application. The time series decomposition model includes an encoder 210 and a decoder 220.N represents the number of layers of the encoder and M represents the number of layers of the decoder. K. V and Q respectively represent key, query and value, and are parameters of the attention mechanism.
The encoder 210 further comprises a first sequence decomposition unit, a first feed-forward unit, and a second sequence decomposition unit, the inputting the historical time series into the encoder 210, obtaining an encoded output result containing seasonal information via the first cross attention unit, comprising:
the historical time sequence passes through a first cross attention unit to obtain a first attention sequence; adding the first attention sequence and the historical time sequence, and then passing through a first sequence decomposition unit to obtain a first decomposition result; enabling the first decomposition result to pass through a first feedforward unit to obtain a first feedforward sequence; and adding the first feedforward sequence and the first decomposition result, and then passing through a second sequence decomposition unit to obtain the coding output result.
The input to the encoder 210 is the complete historical time series; the output of the encoder 210 is a matrix containing seasonal information, which is fed to the third cross attention unit of the decoder to improve the decoder's prediction. The processing of the encoder 210 is given by formula 6:

S_en^{l,1}, _ = SeriesDecomp(CrossAttention(X_en^{l−1}) + X_en^{l−1})
S_en^{l,2}, _ = SeriesDecomp(FeedForward(S_en^{l,1}) + S_en^{l,1})

where "_" is the trend part eliminated after sequence decomposition; S_en^{l,1} denotes the component containing seasonal information obtained via the first cross attention unit, and S_en^{l,2} denotes the encoded output result finally output through the feed-forward unit.
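The encoder layer described by formula 6 can be sketched as below. The three callables stand in for the first cross attention unit, the first feed-forward unit, and the sequence decomposition units; their internals are assumptions, and the decomposition is assumed to return (seasonal, trend):

```python
import numpy as np

def encoder_layer(x, cross_attention, feed_forward, decomp):
    """One encoder layer (formula 6): each residual sum passes through a
    sequence-decomposition unit and only the seasonal part is kept; the
    trend part ("_") is discarded."""
    s1, _ = decomp(cross_attention(x) + x)  # first cross attention + decomp
    s2, _ = decomp(feed_forward(s1) + s1)   # feed-forward + second decomp
    return s2                               # encoded output (seasonal info)
```

With zero-valued attention and feed-forward stubs and a mean-subtracting decomposition, the layer reduces to removing the series mean, which makes the data flow easy to verify.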
Step S130, inputting the initial season part into the decoder 220, obtaining a predicted season part via the second cross attention unit and the third cross attention unit, and separating a remaining trend part;
referring to fig. 2, the decoder 220 further includes a third sequence decomposition unit, a fourth sequence decomposition unit, a second feed-forward unit, and a fifth sequence decomposition unit.
Inputting the initial seasonal portion to the decoder 220, deriving a predicted seasonal portion via the second cross attention unit and the third cross attention unit, and separating out a remaining trend portion, comprising: passing the initial seasonal portion through the second cross attention unit to obtain a second attention sequence; adding the second attention sequence and the historical time sequence, and then passing through the third sequence decomposition unit to obtain a second decomposition result and a first residual part; passing the second decomposition result through the third cross attention unit to obtain a third attention sequence; adding the third attention sequence and the second decomposition result, and then passing through the fourth sequence decomposition unit to obtain a third decomposition result and a second residual part; passing the third decomposition result through the second feedforward unit to obtain a second feedforward sequence; adding the second feed-forward sequence and the third decomposition result, and then passing through the fifth sequence decomposition unit to obtain the predicted season part and a third residual part; and accumulating the first remaining part, the second remaining part and the third remaining part to obtain the remaining trend part.
The calculation process of the decoder 220 is a process of separating the trend information from the input information and retaining the season information. Referring to the structures of the dashed box 221 and the dashed box 222, the dashed box 221 includes a second cross attention unit, a third sequence decomposition unit, a third cross attention unit, a fourth sequence decomposition unit, a second feed-forward unit, and a fifth sequence decomposition unit, and is mainly used for gradually separating trend information from input information.
Specifically, the initial seasonal part obtained above is input into the decoder 220, and the operation of each layer of the decoder 220 is shown in formula 7:

S_de^{l,1}, T_de^{l,1} = SeriesDecomp(CrossAttention(X_de^{l−1}) + X_de^{l−1})
S_de^{l,2}, T_de^{l,2} = SeriesDecomp(CrossAttention(S_de^{l,1}, X_en^N) + S_de^{l,1})
S_de^{l,3}, T_de^{l,3} = SeriesDecomp(FeedForward(S_de^{l,2}) + S_de^{l,2})

Formula 7 corresponds to part 221 of the decoder: it progressively extracts seasonal information from the initial seasonal sequence and the encoded output X_en^N, further separating the trend from the seasonal information. Here X_de denotes the input of the decoder; S_de^{l,1} denotes the sequence containing seasonal information obtained by the second cross attention unit and the third sequence decomposition unit, i.e. the second decomposition result, and T_de^{l,1} denotes the trend information separated at that step, i.e. the first remaining part.

Likewise, S_de^{l,2} denotes the third decomposition result and T_de^{l,2} the second remaining part; S_de^{l,3} denotes the resulting predicted seasonal part and T_de^{l,3} the third remaining part. The T_de^{l,i} terms are accumulated at 222.
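The per-layer decoder computation of formula 7, together with the trend accumulation of formula 8, can be sketched as follows. The sub-unit stand-ins and the scalar weights p are illustrative assumptions, and the decomposition is assumed to return (seasonal, trend):

```python
import numpy as np

def decoder_layer(x_de, x_en, t_prev, self_attn, cross_attn, ffn, decomp,
                  p=(1.0, 1.0, 1.0)):
    """One decoder layer: three decomposition steps each peel off a
    remaining trend part, and the trend parts are accumulated with
    weights p (formula 8)."""
    s1, t1 = decomp(self_attn(x_de) + x_de)     # second cross attention unit
    s2, t2 = decomp(cross_attn(s1, x_en) + s1)  # third cross attention unit
    s3, t3 = decomp(ffn(s2) + s2)               # feed-forward unit
    t_out = t_prev + p[0] * t1 + p[1] * t2 + p[2] * t3  # trend accumulation
    return s3, t_out                            # predicted season, trend
```

With trivial stubs, the separated trend is exactly the mean peeled off at the first decomposition step, illustrating how trend mass leaves the seasonal path and enters the accumulator.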
Step S140, inputting the initial trend part into the decoder, and accumulating the initial trend part and the remaining trend part to obtain a predicted trend part.
Specifically, the dashed box 222 includes a plurality of accumulation units for accumulating the separated trend information and the initial trend component input to the decoder 220, that is, accumulating the initial trend component and the first residual component, the second residual component and the third residual component step by step to obtain a final predicted trend component.
The trend information obtained is given by formula 8:

T_de^l = T_de^{l−1} + P_i × T_de^{l,1} + P_{i+1} × T_de^{l,2} + P_{i+2} × T_de^{l,3}

where P_i, P_{i+1} and P_{i+2} denote the weights corresponding to each remaining part.
The cross attention mechanism exploits the periodicity of the seasonal information to aggregate sub-series with similar patterns across different periods; the trend information is extracted step by step from the predicted hidden variables by accumulation. The two operations proceed alternately and promote each other.
Based on the model architecture shown in fig. 2, the model can gradually decompose hidden variables in the prediction process, and obtain the prediction results of season and trend information through a cross attention mechanism and an accumulation mode, so that alternate proceeding and mutual promotion of decomposition and prediction result optimization are realized.
In addition, referring to fig. 3, fig. 3 shows a schematic structural diagram of the cross attention unit provided in an embodiment of the present application, namely the CrossAttention unit. A CrossAttention unit splits its input into two parts. The first part propagates through layer 310, passing through a 1 × 1 convolutional layer, while the second part propagates through layer 320, passing through a self-attention block. Finally, the outputs of the two parts are concatenated by the concat function as the final output of the whole CrossAttention unit.
The use of the CrossAttention unit has, for example, the following advantages:
An input X ∈ R^{L×d} to the CrossAttention unit, where L is the input length and d is the input dimension, is divided into two parts X_1 and X_2 along the feature dimension. X_1 passes through a 1 × 1 convolutional layer and is connected to the end of the CrossAttention unit, while X_2 acts as the input to the self-attention block. The outputs of the two parts are concatenated along the feature dimension as the output of the entire CrossAttention unit.
The output matrix of one stage of CrossAttention is given by formula 9:

CroA(X) = Concat(W_c X_1, A(X_{2,1})W_1, …, A(X_{2,H})W_H)

where A(X_{2,h}) is the scaled dot-product attention of the h-th self-attention head, W_h is a d_h × d_h linear projection matrix, H is the number of heads in the attention mechanism, d_h is the size of each head (assuming every head has the same size), and W_c is the weight matrix of the 1 × 1 convolutional layer.
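A single-head sketch of the CrossAttention unit of fig. 3 (formula 9) is given below; the weight shapes and the omission of multi-head splitting are simplifying assumptions:

```python
import numpy as np

def cross_attention_unit(x, w_c, w_q, w_k, w_v):
    """Split the input along the feature dimension, send one half through
    a 1x1 convolution (a per-position linear map) and the other through
    scaled dot-product self-attention, then concatenate the outputs.
    All weight matrices have shape (d/2, d/2)."""
    d = x.shape[1]
    x1, x2 = x[:, : d // 2], x[:, d // 2:]   # split into two parts
    out1 = x1 @ w_c                          # 1x1 conv path (layer 310)
    q, k, v = x2 @ w_q, x2 @ w_k, x2 @ w_v   # self-attention path (layer 320)
    scores = q @ k.T / np.sqrt(k.shape[1])   # scaled dot product
    attn = np.exp(scores - scores.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)  # softmax over keys
    out2 = attn @ v
    return np.concatenate([out1, out2], axis=1)  # concat -> unit output
```

With a constant input and identity weights, both paths pass the input through unchanged, which provides a quick sanity check on the shapes and the split/concat wiring.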
Compared with the self-attention mechanism, CrossAttention alleviates its memory bottleneck and computational efficiency problems, reducing both memory traffic and time complexity.
For example, given two matrices A ∈ R^{a×b} and B ∈ R^{b×c}, and considering only the multiplications, the time complexity of A × B is a × b × c, denoted T(A × B) = a × b × c. Let X ∈ R^{L×d} be an input matrix consisting of L tokens, and let the self-attention block have H heads. Excluding biases, the output of the h-th self-attention head can be written as equation 10:

A_h(X) = P_h X W_{V,h}

where the scaled dot product P_h is calculated by equation 11:

P_h = Softmax( X W_{Q,h} (X W_{K,h})^T / √d_h )

Thus, ignoring the calculation of softmax, the time complexity of a single self-attention head can be written as equation 12:

T(A_h(X)) = 3Ldd_h + 2dL²

where d_h is the size of a single self-attention head, with Hd_h = d.

The full time complexity of a canonical self-attention block is then equation 13:

T(SA(X)) = H × T(A_h(X)) + Ld²
         = 4d²L + 2HdL²
When calculating the time complexity of CrossAttention, assume CrossAttention divides the input dimension in half. The first part of CrossAttention has only one linear projection layer, which means its time complexity can be written as equation 14:

T(CroA_1(X_1)) = L × (d/2)²

The time complexity of the second part is given by equation 15:

T(CroA_2(X_2)) = 4L(d/2)² + 2HL²(d/2)
              = Ld² + HdL²

The total time complexity of CrossAttention is therefore equation 16:

T(CroA(X)) = 1.25d²L + HdL²
Obviously, as L grows, the time complexity is dominated by the L² term. In forward propagation, the L² coefficient of CrossAttention is Hd/(2Hd) = 50% of that of the canonical self-attention mechanism, and the L coefficient is 1.25d²/(4d²) = 31.25% of it. Thus, compared with the canonical self-attention mechanism, CrossAttention reduces the time complexity by at least 50%, and correspondingly reduces the memory traffic of the self-attention mechanism.
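The complexity comparison can be checked numerically. This sketch encodes equations 13 and 16 as multiplication counts; the sample values of L, d and H below are arbitrary assumptions:

```python
def sa_cost(L, d, H):
    """Multiplication count of a canonical self-attention block (equation 13)."""
    return 4 * d**2 * L + 2 * H * d * L**2

def croa_cost(L, d, H):
    """Multiplication count of the CrossAttention unit (equation 16)."""
    return 1.25 * d**2 * L + H * d * L**2

# For long inputs the L^2 term dominates, so the cost ratio approaches
# (H*d) / (2*H*d) = 50%; the L-coefficient ratio is 1.25/4 = 31.25%.
```

For example, with L = 10000, d = 64 and H = 8, croa_cost / sa_cost is just under 0.5, matching the at-least-50% reduction stated above.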
According to the time sequence prediction method provided by the embodiment, a progressive decomposition method is adopted, a season prediction part and a trend prediction part are extracted from a complex time sequence, a cross attention mechanism is used, and a split and merge strategy is used in a cross-stage mode, so that the repeated possibility in the information integration process is effectively reduced, and the learning capacity of a network is improved. The cross attention mechanism also reduces the memory flow and reduces the time complexity.
Example 2
The embodiment also provides a time series prediction apparatus, the apparatus 400 includes:
an obtaining module 410, configured to obtain an initial season part and an initial trend part corresponding to a historical time sequence;
an encoding module 420, configured to input the historical time series into an encoder, and obtain an encoded output result containing season information via a first cross attention unit;
a decoding module 430 for inputting the initial part of the season into a decoder, deriving a predicted part of the season via a second cross attention unit and a third cross attention unit, and separating a remaining trend part;
and an accumulation module 440, configured to input the initial trend portion into the decoder, and accumulate the initial trend portion and the remaining trend portion to obtain a predicted trend portion.
According to the time sequence prediction device provided by the embodiment, a progressive decomposition method is adopted, a prediction season part and a prediction trend part are extracted from a complex time sequence, a cross attention mechanism is used, and a split and combination strategy is used in a cross-stage mode, so that the repeated possibility in the information integration process is effectively reduced, and the learning capacity of a network is improved. The cross-attention mechanism also reduces memory traffic and reduces time complexity.
Example 3
Furthermore, an embodiment of the present disclosure provides an electronic device, which includes a memory and a processor. The memory stores a computer program that, when executed by the processor, performs the time series prediction method provided in embodiment 1.
The electronic device provided in this embodiment of the present invention may implement the time series prediction method provided in embodiment 1; details are not repeated here.
The electronic device provided by this embodiment adopts a progressive decomposition approach to extract a predicted season part and a predicted trend part from a complex time series, and applies a cross attention mechanism with a cross-stage split-and-merge strategy, which effectively reduces the likelihood of duplicated information during integration and improves the learning capacity of the network. The cross attention mechanism also reduces memory traffic and time complexity.
Example 4
The present application also provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements the time series prediction method provided in embodiment 1.
In this embodiment, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The computer-readable storage medium provided in this embodiment may implement the time series prediction method provided in embodiment 1; details are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of additional identical elements in the process, method, article, or terminal that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are also within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.
Claims (10)
1. A time series prediction method applied to a time series prediction architecture, the architecture comprising a multi-layer encoder and a multi-layer decoder, the encoder comprising a first cross attention unit and the decoder comprising a second cross attention unit and a third cross attention unit, the method comprising:
acquiring an initial season part and an initial trend part corresponding to the historical time sequence;
inputting the historical time series into the encoder, and obtaining an encoding output result containing season information through the first cross attention unit;
inputting the initial season part into the decoder, obtaining a predicted season part through the second cross attention unit and the third cross attention unit, and separating out a remaining trend part;
and inputting the initial trend part into the decoder, and accumulating the initial trend part and the residual trend part to obtain a predicted trend part.
2. The time series prediction method according to claim 1, wherein obtaining the initial trend part corresponding to the historical time series comprises:
performing moving-average processing on the historical time series to obtain a historical trend sequence corresponding to the historical time series;
taking the last I/2 elements of the historical trend sequence to obtain a segmented trend sequence;
obtaining the average value of the elements of the historical trend sequence;
and filling the segmented trend sequence with O copies of the average value, then obtaining the initial trend part through a concat function; wherein I is the length of the historical time series and O is the prediction length.
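The steps of claim 2 (moving average, slice the last I/2 trend elements, pad with O copies of the trend mean, concatenate) can be sketched as follows; the simple moving average with repeated-edge padding is an illustrative assumption, as the claim does not fix the smoothing kernel:

```python
def moving_average(x, kernel=3):
    """Simple moving average with repeated-edge padding (illustrative)."""
    half = kernel // 2
    padded = [x[0]] * half + list(x) + [x[-1]] * half
    return [sum(padded[i:i + kernel]) / kernel for i in range(len(x))]

def init_trend_part(history, O, kernel=3):
    """Build the initial trend part: the last I/2 elements of the
    historical trend sequence, padded with O copies of the trend mean
    (the "concat" of claim 2)."""
    hist_trend = moving_average(history, kernel)
    I = len(hist_trend)
    segment = hist_trend[I - I // 2:]      # last I/2 elements
    mean = sum(hist_trend) / I             # mean of the trend sequence
    return segment + [mean] * O            # concat -> length I/2 + O
```

Padding with the mean (rather than zeros) gives the decoder a neutral, level-preserving starting guess for the unknown future trend.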
3. The time series prediction method of claim 2, wherein obtaining the initial season part corresponding to the historical time series comprises:
subtracting the historical trend sequence from the historical time series to obtain a historical season sequence corresponding to the historical time series;
taking the last I/2 elements of the historical season sequence to obtain a segmented season sequence;
and filling the segmented season sequence with O zero values, then obtaining the initial season part through a concat function.
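The seasonal counterpart of claim 3 is the residual after the trend is removed, sliced and zero-padded the same way; this is a minimal sketch assuming the series and its trend are given as equal-length lists:

```python
def init_season_part(history, hist_trend, O):
    """Initial season part per claim 3: the residual (series minus
    trend), sliced to its last I/2 elements and padded with O zeros
    (the "concat" of claim 3)."""
    season = [x - t for x, t in zip(history, hist_trend)]
    I = len(season)
    return season[I - I // 2:] + [0.0] * O

# Example with a flat trend: the residual carries all the variation.
init_part = init_season_part([1.0, 2.0, 3.0, 4.0], [1.0, 1.0, 1.0, 1.0], O=2)
```

Zero padding is the natural neutral value here, since the seasonal part is defined as a residual centered around the trend.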
4. The time series prediction method of claim 1, wherein the encoder further comprises a first sequence decomposition unit, a first feedforward unit, and a second sequence decomposition unit, and wherein inputting the historical time series into the encoder and obtaining an encoded output result containing season information via the first cross attention unit comprises:
enabling the historical time sequence to pass through a first cross attention unit to obtain a first attention sequence;
adding the first attention sequence and the historical time sequence, and then passing through a first sequence decomposition unit to obtain a first decomposition result;
enabling the first decomposition result to pass through a first feedforward unit to obtain a first feedforward sequence;
and adding the first feedforward sequence and the first decomposition result, and then passing through a second sequence decomposition unit to obtain the encoded output result.
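The encoder data flow of claim 4 can be sketched with the sub-units passed in as callables, since the claim fixes the wiring but not their internals; the stub units in the demo (zero attention, identity feedforward, a decomposition that keeps everything seasonal) are purely illustrative and are not the patented units:

```python
def encoder_layer(x, cross_attention, feed_forward, decompose):
    """One encoder layer per claim 4: attention -> add & decompose ->
    feedforward -> add & decompose; only the seasonal part flows on."""
    attn = cross_attention(x)                               # first attention sequence
    s1, _ = decompose([a + v for a, v in zip(attn, x)])     # first decomposition result
    ff = feed_forward(s1)                                   # first feedforward sequence
    s2, _ = decompose([f + v for f, v in zip(ff, s1)])      # encoded output result
    return s2

# Illustrative stubs (NOT the patented units).
def zero_attn(x):
    return [0.0] * len(x)

def identity_ff(x):
    return list(x)

def keep_seasonal(x):
    return list(x), [0.0] * len(x)

out = encoder_layer([1.0, 2.0], zero_attn, identity_ff, keep_seasonal)
```

Note how each attention or feedforward output is added back to its input (a residual connection) before decomposition, which is what lets each layer refine rather than replace the seasonal signal.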
5. The time series prediction method of claim 1, wherein the decoder further comprises a third sequence decomposition unit, a fourth sequence decomposition unit, a second feedforward unit, and a fifth sequence decomposition unit; and wherein inputting the initial season part into the decoder, obtaining a predicted season part via the second cross attention unit and the third cross attention unit, and separating out a remaining trend part comprises:
passing the initial season part through the second cross attention unit to obtain a second attention sequence;
adding the second attention sequence and the historical time sequence, and then passing through the third sequence decomposition unit to obtain a second decomposition result and a first residual part;
passing the second decomposition result through the third cross attention unit to obtain a third attention sequence;
adding the third attention sequence and the second decomposition result, and then passing through the fourth sequence decomposition unit to obtain a third decomposition result and a second residual part;
passing the third decomposition result through the second feedforward unit to obtain a second feedforward sequence;
adding the second feedforward sequence and the third decomposition result, and then passing through the fifth sequence decomposition unit to obtain the predicted season part and a third remaining part;
and accumulating the first remaining part, the second remaining part and the third remaining part to obtain the remaining trend part.
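The decoder flow of claim 5, including the accumulation of the three trend residuals, can be sketched the same way. The sub-units are again stand-in callables, and the residual connections here follow the usual add-the-layer-input pattern, which is an assumption about how the claim's additions are wired:

```python
def decoder_layer(season, enc_out, self_attn, cross_attn, feed_forward, decompose):
    """One decoder layer per claim 5: two attention steps and a
    feedforward step, each followed by a decomposition; the trend parts
    split off at each decomposition are summed into the remaining
    trend part."""
    a1 = self_attn(season)                                   # second attention sequence
    s1, t1 = decompose([a + v for a, v in zip(a1, season)])  # second decomposition + first remaining part
    a2 = cross_attn(s1, enc_out)                             # third attention sequence
    s2, t2 = decompose([a + v for a, v in zip(a2, s1)])      # third decomposition + second remaining part
    ff = feed_forward(s2)                                    # second feedforward sequence
    s3, t3 = decompose([f + v for f, v in zip(ff, s2)])      # predicted season + third remaining part
    remaining_trend = [x + y + z for x, y, z in zip(t1, t2, t3)]
    return s3, remaining_trend

# Illustrative stubs (NOT the patented units).
def zero_self(x):
    return [0.0] * len(x)

def zero_cross(x, enc):
    return [0.0] * len(x)

def identity_ff(x):
    return list(x)

def keep_seasonal(x):
    return list(x), [0.0] * len(x)

pred_season, rem_trend = decoder_layer(
    [1.0, 2.0], [0.0, 0.0], zero_self, zero_cross, identity_ff, keep_seasonal)
```

The returned `remaining_trend` is what the method of claim 1 adds to the initial trend part to obtain the predicted trend part.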
6. A time series prediction apparatus, characterized in that the apparatus comprises:
the acquisition module is used for acquiring an initial season part and an initial trend part corresponding to the historical time sequence;
the encoding module is used for inputting the historical time sequence into an encoder and obtaining an encoding output result containing season information through a first cross attention unit;
a decoding module for inputting the initial season part into a decoder, obtaining a predicted season part through a second cross attention unit and a third cross attention unit, and separating out a remaining trend part;
and the accumulation module is used for inputting the initial trend part into the decoder and accumulating the initial trend part and the residual trend part to obtain a predicted trend part.
7. The apparatus according to claim 6, wherein the encoding module is further configured to:
enabling the historical time sequence to pass through a first cross attention unit to obtain a first attention sequence;
adding the first attention sequence and the historical time sequence, and then passing through a first sequence decomposition unit to obtain a first decomposition result;
enabling the first decomposition result to pass through a first feedforward unit to obtain a first feedforward sequence;
and adding the first feedforward sequence and the first decomposition result, and then passing through a second sequence decomposition unit to obtain the encoded output result.
8. The apparatus according to claim 6, wherein the decoding module is further configured to:
passing the initial season part through the second cross attention unit to obtain a second attention sequence;
adding the second attention sequence and the historical time sequence, and then passing through a third sequence decomposition unit to obtain a second decomposition result and a first residual part;
passing the second decomposition result through the third cross attention unit to obtain a third attention sequence;
adding the third attention sequence and the second decomposition result, and then passing through a fourth sequence decomposition unit to obtain a third decomposition result and a second residual part;
passing the third decomposition result through a second feedforward unit to obtain a second feedforward sequence;
adding the second feedforward sequence and the third decomposition result, and then passing through a fifth sequence decomposition unit to obtain the predicted season part and a third remaining part;
and accumulating the first remaining part, the second remaining part and the third remaining part to obtain the remaining trend part.
9. An electronic device, comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, performs the time series prediction method of any one of claims 1 to 5.
10. A computer-readable storage medium, characterized in that it stores a computer program which, when run on a processor, performs the time series prediction method of any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211662382.5A CN115936237A (en) | 2022-12-23 | 2022-12-23 | Time series prediction method, time series prediction device, computer equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115936237A true CN115936237A (en) | 2023-04-07 |
Family
ID=86648875
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211662382.5A Pending CN115936237A (en) | 2022-12-23 | 2022-12-23 | Time series prediction method, time series prediction device, computer equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115936237A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116415744A (en) * | 2023-06-12 | 2023-07-11 | 深圳大学 | Power prediction method and device based on deep learning and storage medium |
CN116415744B (en) * | 2023-06-12 | 2023-09-19 | 深圳大学 | Power prediction method and device based on deep learning and storage medium |
CN116955932A (en) * | 2023-09-18 | 2023-10-27 | 北京天泽智云科技有限公司 | Time sequence segmentation method and device based on trend |
CN116955932B (en) * | 2023-09-18 | 2024-01-12 | 北京天泽智云科技有限公司 | Time sequence segmentation method and device based on trend |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||