Disclosure of Invention
The invention aims to provide a method, a system and a storage medium for predicting the passenger flow of a scenic spot, which can improve the prediction precision of the passenger flow in the scenic spot.
In order to achieve the above object, an embodiment of the present invention provides a method for predicting a passenger flow volume in a scenic spot, the method including:
acquiring multiple columns of data of the scenic spot, wherein each column of data comprises a numerical value of at least one influence factor on a time sequence;
defining an attention weight population, wherein the attention weight population comprises a plurality of groups of attention weights, and each group of attention weights comprises weight values corresponding to the influence factors one by one;
weighting the multiple columns of data with each group of attention weights respectively;
respectively inputting the multiple columns of weighted data into a GRU (Gated Recurrent Unit) neural network to obtain corresponding predicted values;
respectively calculating the error between each predicted value and the corresponding standard value;
screening out the two sets of weighted multi-column data with the smallest errors from the plurality of sets of weighted multi-column data;
randomly selecting one weight value from each of the attention weight groups corresponding to the two screened-out sets of data;
carrying out gene recombination on the codes of the two selected weight values;
respectively carrying out gene mutation on the codes of the two weight values after gene recombination so as to update the attention weights;
replacing the attention weights after gene mutation into the attention weight population to update the attention weight population;
updating the number of iterations;
judging whether the number of iterations is greater than or equal to a preset threshold value;
outputting the attention weight with the smallest error as an optimal solution in the case that the number of iterations is judged to be greater than or equal to the threshold value;
in the case that the number of iterations is judged to be smaller than the threshold value, weighting the multiple columns of data with each group of attention weights again and executing the corresponding subsequent steps of the method, until the number of iterations is judged to be greater than or equal to the threshold value;
and adding the optimal solution into the GRU model to predict the passenger flow of the scenic spot.
Optionally, the carrying out gene mutation on the codes of the two weight values after gene recombination to update the attention weights further comprises:
decoding the two weight values after gene mutation to update the attention weights.
Optionally, the carrying out gene recombination on the codes of the two selected weight values comprises:
randomly exchanging at least a part of the codes of the two selected weight values.
Optionally, the carrying out gene mutation on the codes of the two weight values after gene recombination comprises:
respectively converting at least a part of the 0s into 1s and/or the 1s into 0s in the codes of the two weight values.
In another aspect, the invention also provides a system for predicting scenic spot passenger flow, the system comprising a processor configured to perform the method as described in any one of the above.
In yet another aspect, the present invention also provides a storage medium storing instructions for reading by a machine to cause the machine to perform the method as described in any one of the above.
Through the above technical scheme, the method, the system and the storage medium for predicting the scenic spot passenger flow provided by the invention adopt a GRU neural network combined with an attention mechanism, and can therefore accurately predict the future passenger flow of a scenic spot from historical passenger flow data comprising various influence factors. This overcomes the technical problem in the prior art that prediction precision is low because the neural network model does not comprehensively consider the influence of the various factors.
Additional features and advantages of embodiments of the invention will be set forth in the detailed description which follows.
Detailed Description
The following detailed description of embodiments of the invention refers to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating embodiments of the invention, are given by way of illustration and explanation only, not limitation.
In the embodiments of the present invention, unless otherwise specified, directional terms such as "upper", "lower", "top" and "bottom" are generally used with respect to the orientation shown in the drawings, or with respect to the positional relationship of the components in the vertical or gravitational direction.
In addition, if there is a description of "first", "second", etc. in the embodiments of the present invention, that description is for descriptive purposes only and is not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In addition, the technical solutions of the various embodiments can be combined with each other, provided that the combination can be realized by a person skilled in the art; when the technical solutions are contradictory or cannot be realized, the combination should be considered absent and not within the protection scope of the present invention.
Fig. 1 is a flow diagram illustrating a method for predicting scenic spot passenger traffic in accordance with one embodiment of the present invention. In fig. 1, the method may include:
In step S10, multiple columns of data of the scenic spot are acquired, wherein each column of data may comprise the values of at least one influencing factor in a time series. Each column of data may be, for example, at least one of the temperature of the scenic spot at different (historical) time points, the total number of visitors at the same time point in the previous year, the total number of visitors at the same time point two years earlier, the number of tickets booked on the previous day, and the like. In one example of the present invention, the influencing factors may also be represented as in Table 1,
TABLE 1
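The structure of the multi-column data described in step S10 can be sketched as follows. This is a minimal illustration only: the factor names and value ranges are assumptions for demonstration, not taken from Table 1 of the patent.

```python
import random

random.seed(0)
T = 30  # number of historical time points (illustrative)

# Hypothetical influencing factors, one time series per factor
# (names and ranges are illustrative assumptions).
columns = {
    "temperature": [random.uniform(10, 35) for _ in range(T)],
    "prev_year_total": [float(random.randint(1000, 9000)) for _ in range(T)],
    "two_years_ago_total": [float(random.randint(1000, 9000)) for _ in range(T)],
    "tickets_booked_prev_day": [float(random.randint(100, 2000)) for _ in range(T)],
}

# One row per time point, one entry per influencing factor.
rows = list(zip(*columns.values()))
```

Each row of `rows` then corresponds to one time point across all influencing factors, matching the column-wise layout described in step S10.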
In step S11, an attention weight population is defined. The attention weight population comprises a plurality of groups of attention weights, and each group of attention weights comprises weight values corresponding to the influencing factors one by one. For example, the i-th defined group of attention weights may be expressed as: Wi = (Wi1, Wi2, Wi3, ..., Win).
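A minimal sketch of such a population, assuming n influencing factors and a population of m weight groups (both sizes, and the normalization of each group to sum to 1, are illustrative assumptions not stated in the patent):

```python
import random

random.seed(1)
n_factors = 4   # one weight per influencing factor
pop_size = 10   # number of weight groups in the population (illustrative)

def random_weight_group(n):
    """One group Wi = (Wi1, ..., Win), normalized so the weights sum to 1
    (the normalization is an assumed convention)."""
    raw = [random.random() for _ in range(n)]
    total = sum(raw)
    return [w / total for w in raw]

# The attention weight population: a list of weight groups.
population = [random_weight_group(n_factors) for _ in range(pop_size)]
```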
In step S12, the multiple columns of data are weighted with each group of attention weights respectively. In this embodiment, since the technical solution provided by the present invention employs a GRU neural network, step S12 may also be implemented by inputting the multiple groups of attention weights, for example as defined in Table 1 and step S11, into the attention layer of the GRU neural network to complete the weighting operation. Taking the i-th group of attention weights as an example, the weighted multi-column data may be, for example, as shown in Table 2,
TABLE 2
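The weighting of step S12 amounts to scaling each factor column by its corresponding weight value. A minimal sketch (the sample rows and weight group are illustrative; in the patent the operation is performed by the attention layer of the network):

```python
def weight_columns(rows, weights):
    """Scale the j-th factor of every time step by its weight w_j."""
    return [[x * w for x, w in zip(row, weights)] for row in rows]

# Two illustrative time steps with four influencing factors each.
rows = [(20.0, 5000.0, 4800.0, 900.0),
        (22.0, 5100.0, 4700.0, 950.0)]
w_i = [0.4, 0.3, 0.2, 0.1]  # one group of attention weights (illustrative)

weighted = weight_columns(rows, w_i)
print(weighted[0][0])  # 8.0, i.e. 20.0 scaled by weight 0.4
```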
In step S13, the multiple columns of weighted data are respectively input into the GRU neural network to obtain corresponding predicted values.
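The GRU recurrence used in step S13 is standard. As a minimal illustration only, a scalar GRU step can be sketched in pure Python; the parameters here are random placeholders standing in for a trained model, not the weights of the patent's network:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_cell(x, h, p):
    """One GRU step for scalar input x and scalar hidden state h.
    p holds six parameters (random placeholders here)."""
    z = sigmoid(p["wz"] * x + p["uz"] * h)                 # update gate
    r = sigmoid(p["wr"] * x + p["ur"] * h)                 # reset gate
    h_cand = math.tanh(p["wh"] * x + p["uh"] * (r * h))    # candidate state
    return (1 - z) * h + z * h_cand                        # gated update

random.seed(2)
p = {k: random.uniform(-1, 1) for k in ("wz", "uz", "wr", "ur", "wh", "uh")}

h = 0.0
for x in [0.5, 0.8, 0.3]:  # a toy weighted input sequence
    h = gru_cell(x, h, p)
# h is the final hidden state, from which a prediction would be read out.
```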
In step S14, the error of each predicted value from the corresponding standard value is calculated, respectively.
In step S15, the two sets of weighted multi-column data with the smallest errors are screened out from the plurality of sets of weighted multi-column data.
In step S16, one weight value is randomly selected from each of the attention weight groups of the two screened-out sets of data.
In step S17, the codes of the two selected weight values are genetically recombined. For this gene recombination operation, a conventional operation known to those skilled in the art may be used. However, in one example of the present invention, taking into account the complexity of the algorithm and the computational efficiency, the gene recombination operation may, for example, exchange at least a part of the codes of the two selected weight values. Specifically, taking the binary code of one selected weight value as (0,1,0,0,1,1) and the binary code of the other as (1,0,1,0,0,1), the gene recombination operation may be, for example, as shown in Fig. 2.
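The recombination of step S17 can be sketched as follows, using the two binary codes from the text. The single crossover point is an illustrative choice; the patent only requires that at least a part of the codes be exchanged, and Fig. 2 may show a different exchange.

```python
import random

def crossover(code_a, code_b, point=None):
    """Exchange the tails of two binary codes at a crossover point."""
    if point is None:
        point = random.randrange(1, len(code_a))
    child_a = code_a[:point] + code_b[point:]
    child_b = code_b[:point] + code_a[point:]
    return child_a, child_b

a = (0, 1, 0, 0, 1, 1)  # code of the first selected weight value
b = (1, 0, 1, 0, 0, 1)  # code of the second selected weight value

child_a, child_b = crossover(a, b, point=3)
print(child_a)  # (0, 1, 0, 0, 0, 1)
print(child_b)  # (1, 0, 1, 0, 1, 1)
```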
In step S18, the codes of the two weight values after gene recombination are respectively subjected to gene mutation to update the attention weights. For the gene mutation operation, a conventional operation known to those skilled in the art may be used. However, in one example of the present invention, in consideration of the complexity of the algorithm and the computational efficiency, the gene mutation operation may, for example, convert at least a part of the 0s into 1s and/or the 1s into 0s in the codes of the two weight values respectively. Specifically, taking the weight value (1,1,1,0,0,1) after the gene recombination of step S17 as an example, the gene mutation operation may be, for example, as shown in Fig. 3. In addition, in this embodiment, in order to update the group of attention weights, the weight values may be decoded before the attention weights are updated, and the updating operation may then be performed.
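The bit-flip mutation of step S18, followed by decoding back to a real-valued weight, can be sketched as below. The per-bit mutation rate and the fixed-point decoding scheme (reading the bits as a binary fraction in [0, 1)) are assumptions for illustration; the patent does not specify the encoding.

```python
import random

def mutate(code, rate=0.2):
    """Flip each bit (0 -> 1, 1 -> 0) independently with probability `rate`."""
    return tuple(bit ^ 1 if random.random() < rate else bit for bit in code)

def decode(code):
    """Assumed fixed-point decoding: interpret the bits as a binary
    fraction, e.g. (1,1,1,0,0,1) -> 1/2 + 1/4 + 1/8 + 1/64 = 0.890625."""
    return sum(bit * 2 ** -(i + 1) for i, bit in enumerate(code))

random.seed(3)
code = (1, 1, 1, 0, 0, 1)   # recombined code from the step S17 example
mutated = mutate(code)      # gene mutation
weight = decode(mutated)    # decoded weight value used to update the group
print(decode(code))  # 0.890625
```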
In step S19, the attention weight after the gene mutation is replaced into the attention weight population to update the attention weight population.
In step S20, the number of iterations is updated. In this embodiment, the initial number of iterations may be 0. The number of iterations may be increased by, for example, 1 each time the number of iterations is updated.
In step S21, it is determined whether the number of iterations is greater than or equal to a preset threshold.
In step S22, when the number of iterations is determined to be greater than or equal to the threshold value, the attention weight (corresponding to the data in the plurality of columns) with the smallest error is output as the optimal solution.
In the case that the number of iterations is determined to be less than the threshold, the columns of data are weighted again with each set of attention weights (i.e., the step S12 is executed again), and the corresponding steps of the method are executed until the number of iterations is determined to be greater than or equal to the threshold.
In step S23, the optimal solution is added to the GRU model to predict the scenic spot passenger volume.
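Steps S11 to S22 together form a genetic-algorithm search over the attention weights. A compact skeleton of the whole loop is sketched below; the GRU fitness evaluation of steps S12 to S14 is abstracted into a placeholder `evaluate` (a toy quadratic error here), and for brevity the mutation perturbs real-valued weights directly rather than flipping bits of a binary code as in steps S17 and S18:

```python
import random

random.seed(4)

def evaluate(weights):
    """Placeholder for steps S12-S14: weight the data, run the GRU,
    return the prediction error. A toy quadratic error stands in here."""
    target = [0.4, 0.3, 0.2, 0.1]
    return sum((w - t) ** 2 for w, t in zip(weights, target))

def crossover(a, b):
    point = random.randrange(1, len(a))
    return a[:point] + b[point:], b[:point] + a[point:]

def mutate(code, rate=0.1):
    return [w + random.uniform(-rate, rate) for w in code]

population = [[random.random() for _ in range(4)] for _ in range(10)]
max_iter = 15  # the iteration threshold used in the embodiment

for _ in range(max_iter):
    ranked = sorted(population, key=evaluate)   # steps S13-S15: rank by error
    p1, p2 = ranked[0], ranked[1]               # two best weight groups
    c1, c2 = crossover(p1, p2)                  # step S17: recombination
    c1, c2 = mutate(c1), mutate(c2)             # step S18: mutation
    population = ranked[:-2] + [c1, c2]         # step S19: replace into population

best = min(population, key=evaluate)            # step S22: optimal solution
```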
In another aspect, the invention also provides a system for predicting scenic spot passenger traffic, which may include a processor operable to perform any of the methods described above.
In yet another aspect, the present invention also provides a storage medium storing instructions for reading by a machine to cause the machine to perform a method as described in any one of the above.
In addition, in order to further verify the technical effect of the method provided by the present invention, in an embodiment of the present invention, 4 kinds of multi-column data of the same scenic spot are used as source data, and a BP (Back Propagation) neural network, an LSTM (Long Short-Term Memory) network, a GRU (Gated Recurrent Unit) network, an A-LSTM (attention mechanism LSTM) network, an A-GRU (attention mechanism GRU) network, and an IA-GRU (improved attention mechanism GRU, that is, the method provided by the present invention) network are respectively used for prediction. The experimental environment of this embodiment includes an Intel Core i5-9400F processor, an NVIDIA GeForce GTX 1660 Ti GPU, and 16 GB of memory. In this embodiment, the IA-GRU model is provided with 6 layers, including 1 attention layer (performing the weighting operation), 4 GRU layers, and 1 fully-connected layer, and the numbers of neurons in the GRU layers are 128, 64, and 32, respectively. The iteration threshold is set to 15. The Mean Absolute Percentage Error (MAPE) of the prediction results is shown in Table 3,
TABLE 3
As can be seen from table 3, the method provided by the present invention has higher accuracy (the smaller the average absolute percentage error is, the higher the accuracy is) compared with other neural networks in the prior art.
The correlation coefficient R of the prediction results is shown in table 4,
TABLE 4
As can be seen from table 4, the method provided by the present invention has higher accuracy (the larger the correlation coefficient R is, the more accurate the prediction result is) compared with other neural networks in the prior art.
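Since Tables 3 and 4 give only the compared results, the two evaluation metrics themselves can be computed as follows. The sample values below are illustrative, not the experimental results of the embodiment.

```python
import math

def mape(actual, predicted):
    """Mean Absolute Percentage Error: the smaller, the more accurate."""
    return 100.0 / len(actual) * sum(
        abs((a - p) / a) for a, p in zip(actual, predicted))

def pearson_r(actual, predicted):
    """Correlation coefficient R: the closer to 1, the more accurate."""
    n = len(actual)
    ma = sum(actual) / n
    mp = sum(predicted) / n
    cov = sum((a - ma) * (p - mp) for a, p in zip(actual, predicted))
    sa = math.sqrt(sum((a - ma) ** 2 for a in actual))
    sp = math.sqrt(sum((p - mp) ** 2 for p in predicted))
    return cov / (sa * sp)

actual = [100.0, 120.0, 90.0, 110.0]     # illustrative passenger counts
predicted = [98.0, 123.0, 88.0, 112.0]   # illustrative predictions

print(round(mape(actual, predicted), 2))      # 2.14
print(round(pearson_r(actual, predicted), 4)) # 0.9981
```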
Through the above technical scheme, the method, the system and the storage medium for predicting the scenic spot passenger flow provided by the invention adopt a GRU neural network combined with an attention mechanism, and can therefore accurately predict the future passenger flow of a scenic spot from historical passenger flow data comprising various influence factors. This overcomes the technical problem in the prior art that prediction precision is low because the neural network model does not comprehensively consider the influence of the various factors.
Although the embodiments of the present invention have been described in detail with reference to the accompanying drawings, the embodiments of the present invention are not limited to the details of the above embodiments, and various simple modifications can be made to the technical solution of the embodiments of the present invention within the technical idea of the embodiments of the present invention, and the simple modifications all belong to the protection scope of the embodiments of the present invention.
It should be noted that the various features described in the above embodiments may be combined in any suitable manner without departing from the scope of the invention. In order to avoid unnecessary repetition, the embodiments of the present invention will not be described separately for the various possible combinations.
Those skilled in the art can understand that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing related hardware, where the program is stored in a storage medium and includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In addition, various different embodiments of the present invention may be arbitrarily combined with each other, and the embodiments of the present invention should be considered as disclosed in the disclosure of the embodiments of the present invention as long as the embodiments do not depart from the spirit of the embodiments of the present invention.