CN113283936A - Sales forecasting method, sales forecasting device and electronic equipment - Google Patents


Info

Publication number
CN113283936A
CN113283936A (application CN202110595289.6A)
Authority
CN
China
Prior art keywords
sales
prediction
data
historical sales
recurrent neural
Prior art date
Legal status
Pending
Application number
CN202110595289.6A
Other languages
Chinese (zh)
Inventor
何定
刘治
车浩流
Current Assignee
Shenzhen Qianan Technology Co ltd
Shenzhen Thousandshores Technology Co Ltd
Original Assignee
Shenzhen Qianan Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Qianan Technology Co ltd
Priority to CN202110595289.6A
Publication of CN113283936A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/02: Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201: Market modelling; Market analysis; Collecting market data
    • G06Q30/0202: Market predictions or forecasting for commercial activities
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/24: Classification techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/04: Architecture, e.g. interconnection topology
    • G06N3/045: Combinations of networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods
    • G06N3/084: Backpropagation, e.g. using gradient descent
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/02: Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201: Market modelling; Market analysis; Collecting market data

Abstract

The application discloses a sales prediction method, a sales prediction device and an electronic device, wherein the method comprises the following steps: acquiring historical sales data of a product to be predicted; preprocessing the historical sales data to obtain a historical sales sequence; if the time length of the historical sales data is greater than or equal to a set threshold, inputting the historical sales sequence into a trained first recurrent neural model for prediction to obtain a sales prediction result of the product to be predicted; and if the time length of the historical sales data is less than the set threshold, predicting the product to be predicted according to a trained second recurrent neural model to obtain a sales prediction result of the product to be predicted. By predicting historical sales data of different time lengths with different recurrent neural models, the method and the device can improve the generalization capability of the recurrent neural models and further improve the accuracy of sales prediction.

Description

Sales forecasting method, sales forecasting device and electronic equipment
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a sales prediction method, a sales prediction apparatus, an electronic device, and a computer-readable storage medium.
Background
Sales forecasting generally predicts the future sales of a product from its historical sales and is an indispensable indicator in the commodity sales industry. Based on the sales prediction result, a merchant can manage and control stock inventory, reduce inventory cost, and thereby fundamentally control cost and improve profit.
Accurate sales prediction results are important for inventory control. At present, common sales prediction methods are based on deep learning models. Although deep-learning-based sales prediction is more accurate than traditional sales prediction, existing deep learning models still generalize poorly, so the accuracy of sales prediction is difficult to improve further.
Disclosure of Invention
The application provides a sales forecasting method, a sales forecasting device, an electronic device and a computer-readable storage medium, which can mitigate the poor generalization capability of deep learning models and further improve the accuracy of sales forecasting.
In a first aspect, the present application provides a sales prediction method, including:
acquiring historical sales data of a product to be predicted;
preprocessing the historical sales data to obtain a historical sales sequence;
if the time length corresponding to the historical sales data is greater than or equal to a set threshold, inputting the historical sales sequence into a trained first recurrent neural model for prediction to obtain a sales prediction result of the product to be predicted, wherein the first recurrent neural model comprises N layers of recurrent neural networks, N is greater than or equal to 3, and N is an integer;
and if the time length corresponding to the historical sales data is less than the set threshold, predicting the product to be predicted according to a trained second recurrent neural model to obtain a sales prediction result of the product to be predicted, wherein the second recurrent neural model comprises a feature extraction network and a neural prediction network, and the training process of the second recurrent neural model is related to the historical sales sequence.
In a second aspect, the present application provides a sales prediction apparatus, comprising:
the data acquisition module is used for acquiring historical sales data of a product to be predicted;
the data processing module is used for preprocessing the historical sales data to obtain a historical sales sequence;
a first prediction module, configured to, if a time length corresponding to the historical sales data is greater than or equal to a set threshold, input the historical sales sequence into a trained first recurrent neural model for prediction to obtain a sales prediction result of the product to be predicted, where the first recurrent neural model includes N layers of recurrent neural networks, N is greater than or equal to 3, and N is an integer;
and the second prediction module is used for predicting the product to be predicted according to a trained second recurrent neural model to obtain a sales prediction result of the product to be predicted if the time length corresponding to the historical sales data is less than the set threshold, wherein the second recurrent neural model comprises a feature extraction network and a neural prediction network, and the training process of the second recurrent neural model is related to the historical sales sequence.
In a third aspect, the present application provides an electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the method according to the first aspect when executing the computer program.
In a fourth aspect, the present application provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of the first aspect.
In a fifth aspect, the present application provides a computer program product comprising a computer program which, when executed by one or more processors, performs the steps of the method of the first aspect as described above.
Compared with the prior art, the application has the following beneficial effects: the historical sales data of the product to be predicted are divided according to time length, and historical sales data of different lengths are processed with different, purpose-built models: a first recurrent neural model processes historical sales data of longer duration, and a second recurrent neural model processes historical sales data of shorter duration. This not only improves the generalization capability of the recurrent neural models but also further improves the accuracy of the sales prediction method.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a schematic flow chart diagram illustrating a sales forecasting method according to an embodiment of the present disclosure;
FIG. 2 is a schematic structural diagram of a gated recurrent unit provided in an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a monthly recurrent neural network provided in an embodiment of the present application;
FIG. 4 is a schematic structural diagram of a daily recurrent neural network provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a quarterly recurrent neural network provided by an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a first recurrent neural model provided in an embodiment of the present application;
FIG. 7 is a schematic structural diagram of a second recurrent neural model to be trained according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a sales prediction apparatus provided in an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The sales prediction method provided by the embodiments of the present application can be applied to electronic devices such as a mobile phone, a tablet computer, a vehicle-mounted device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA); the embodiments of the present application place no limitation on the specific type of the electronic device.
In order to explain the technical solution proposed in the present application, the following description will be given by way of specific examples.
Fig. 1 shows a schematic flow chart of a sales prediction method provided by the present application, which may be applied to any of the electronic devices described above by way of example and not limitation.
Step 101, obtaining historical sales data of a product to be predicted.
Sales forecasting, as the name implies, predicts sales over a future period based on historical data. Therefore, before making a sales forecast, historical sales data of the product to be predicted needs to be obtained. Typically, historical sales data is recorded in days. Assuming the product to be predicted is an umbrella sold from 2019 to 2020, and its 2021 sales are to be predicted at the end of 2020, the umbrella's sales data for 2019 to 2020 need to be acquired before prediction. Of course, for a product that has been on sale for a long time, only recent historical sales data may be selected. For example, a spare part may have been sold for ten years, but sales data far from the period to be predicted loses its reference value because various factors change over ten years; in that case, the last two years of historical sales data can be selected for prediction, which reduces the difficulty of data processing and improves the accuracy of sales prediction.
And 102, preprocessing the historical sales data to obtain a historical sales sequence.
In the embodiments of the present application, in order to improve the accuracy and efficiency of sales prediction, the historical sales data can be preprocessed after acquisition to obtain a historical sales sequence that is highly correlated with sales prediction.
And 103, if the time length of the historical sales data is greater than or equal to a set threshold, inputting the historical sales sequence into the trained first recurrent neural model for prediction to obtain a sales prediction result of the product to be predicted.
Different products to be predicted have historical sales data of different time lengths because they have been on sale for different periods. For a product to be predicted that has been sold for a long time, the trained first recurrent neural model can be used to predict its sales. Specifically, the historical sales sequence of the product to be predicted can be input into the first recurrent neural model for prediction to obtain a sales prediction result. The first recurrent neural model comprises N layers of recurrent neural networks, where N is an integer and N is not less than 3.
And step 104, if the time length of the historical sales data is less than the set threshold, predicting the product to be predicted according to the trained second recurrent neural model to obtain a sales prediction result of the product to be predicted.
For the product to be predicted with short sale time, the trained second recurrent neural model can be used for prediction. The second recurrent neural model comprises a feature extraction network and a neural prediction network, and the training process of the second recurrent neural model is related to the historical sales volume sequence.
In the embodiments of the present application, products to be predicted are divided according to the time length of their historical sales data, and historical sales data of different time lengths are processed with different models: the first recurrent neural model processes historical sales data of longer duration, and the second recurrent neural model processes historical sales data of shorter duration. This not only improves the generalization capability of the recurrent neural networks but also further improves the accuracy of the sales prediction method.
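As a concrete illustration, this threshold routing can be sketched in a few lines of Python. The 365-day threshold and the stand-in model callables below are assumptions for illustration only and do not appear in the application:

```python
# Hypothetical sketch of the top-level routing: a long history goes to the
# first (N-layer recurrent) model, a short history to the second
# (transfer-learning) model. The models here are toy stand-ins.
THRESHOLD_DAYS = 365  # assumed "set threshold" of one year of history

def predict_sales(history, first_model, second_model, threshold=THRESHOLD_DAYS):
    """Route a daily sales history (list of floats) to the right model."""
    if len(history) >= threshold:
        return first_model(history)   # long history
    return second_model(history)      # short history

# Toy stand-in models: mean of the last 30 days vs. mean of everything.
long_model = lambda h: sum(h[-30:]) / 30
short_model = lambda h: sum(h) / len(h)
```

The real models would of course be trained networks; the point is only that the branch is taken once, up front, on the length of the available history.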
In some embodiments, in order to improve the accuracy of the sales prediction, the step 102 specifically includes:
A. If the historical sales data has missing data, the missing data is filled according to its date interval.
In the present embodiment, the historical sales data is in units of days. If the sales data of one or more days in the historical sales data is zero, the sales data of those days is considered missing, and the missing data can be filled according to its date interval, where the date interval is the number of days between the missing data and the nearest non-missing data.
In some embodiments, in order to restore the actual sales volume, the step a specifically includes:
if the date interval of the missing data is greater than or equal to the specified value, the average value of the historical sales of the month in which the missing data is located can be selected to fill the missing data. For example, assuming that the designated value is 7, if data missing occurs from 4/month 1 to 4/month 20, then for missing data whose dates fall between 4/month 7 and 4/month 14, since the date interval of the missing data is large, even the sales data closest to the date of the missing data is weak in referential property. For this case, the daily sales average value may be calculated from the non-missing data of month 4, and the missing data may be filled according to the daily sales average value.
If the date interval of the missing data is less than the specified value, the nearest non-missing data can be used to fill it. Continuing the example with a specified value of 7 and data missing from April 1 to April 20: the missing data from April 1 to April 6 and from April 15 to April 20 has a small date interval, so sales data close in date retains reference value. That is, the missing data from April 1 to April 6 is filled with the sales data of March 31, and the missing data from April 15 to April 20 is filled with the sales data of April 21.
Of course, a missing day may be equally close to two pieces of non-missing data. For example, if data is missing from April 1 to April 3, then April 2 is two days from both March 31 and April 4; in this case, the sales data of March 31 and April 4 can be averaged, and the missing data of April 2 filled with that average.
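The filling rules of step A can be sketched as follows. The function name, the within-month simplification (nearest non-missing days are searched only inside the list passed in), and the use of 0 as the missing-value marker are illustrative assumptions:

```python
SPECIFIED = 7  # assumed "specified value" for the date interval

def fill_missing(sales):
    """sales: one month of daily values; 0 marks a missing day.
    Long gaps are filled with the month's mean over non-missing days;
    short gaps with the nearest non-missing value (ties are averaged)."""
    non_missing = [i for i, v in enumerate(sales) if v != 0]
    if not non_missing:
        return sales[:]
    month_mean = sum(sales[i] for i in non_missing) / len(non_missing)
    filled = sales[:]
    for i, v in enumerate(sales):
        if v != 0:
            continue
        # date interval: distance to the nearest non-missing day
        dists = sorted((abs(i - j), j) for j in non_missing)
        interval = dists[0][0]
        if interval >= SPECIFIED:
            filled[i] = month_mean
        else:
            ties = [j for d, j in dists if d == interval]
            filled[i] = sum(sales[j] for j in ties) / len(ties)
    return filled
```

A production version would work on dated records and look across month boundaries (as in the March 31 example above); this sketch only shows the interval rule itself.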
B. And performing data smoothing processing and normalization processing on the filled historical sales data to obtain a historical sales sequence.
After the missing data is filled, data smoothing can be applied to the filled historical sales data to avoid abnormally large or abnormally small values on any given day; meanwhile, to improve the optimization efficiency of the recurrent neural model, the historical sales data can be normalized, finally yielding the historical sales sequence.
In some embodiments, the performing data smoothing on the filled historical sales data specifically includes:
for the historical sales data of each day, the historical sales data of the day and several days (for example, two days) before the day can be subjected to mean calculation to obtain a corresponding average value, and the average value is updated to the historical sales data of the day. That is, except for the historical sales data of the last several days, the historical sales data of the rest of the days are subjected to data smoothing by adopting the method, so that the influence caused by abnormal sales data can be reduced.
In some embodiments, in order to improve the convergence rate of the recurrent neural model, the historical sales data may be normalized, and in order to distinguish abnormal sales data from normal sales data, the embodiments of the present application use a probability density function of a normal distribution to normalize the data. Specifically, the step of performing normalization processing on the historical sales data to obtain the historical sales sequence specifically includes:
the median and standard deviation of the historical sales data are first used as the expectation and standard deviation of the probability density formula of the normal distribution, and then the historical sales data for each day are normalized to the probability of occurrence under the normal distribution. The specific formula is as follows:
f(x) = 1/(σ√(2π)) · exp(-(x - μ)² / (2σ²))
where μ is the median of the sales sequence and σ is the standard deviation of the sales sequence.
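Under these definitions (μ the median, σ the standard deviation of the sequence), the normalization can be sketched as follows; note that mapping values to densities gives abnormal (far-from-median) sales small values, which is what distinguishes them from normal sales data:

```python
import math

def normalize(sales):
    """Map each day's sales to its normal-distribution density value,
    using the median of the sequence as mu and its standard deviation
    as sigma, per the formula above."""
    s = sorted(sales)
    n = len(s)
    mu = s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2
    mean = sum(sales) / n
    sigma = math.sqrt(sum((x - mean) ** 2 for x in sales) / n)
    return [math.exp(-(x - mu) ** 2 / (2 * sigma ** 2))
            / (sigma * math.sqrt(2 * math.pi)) for x in sales]
```
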
In some embodiments, in order to further improve the generalization ability and the prediction accuracy of the first recurrent neural model, the step 103 specifically includes:
and B1, inputting the historical sales volume sequence into a daily recurrent neural network to obtain a daily feature vector.
In an embodiment, to extract the feature in the dimension of the day, the B1 specifically includes:
For each month, the historical sales sequence of that month is divided in units of days and input into the day gated recurrent units of the daily recurrent neural network, one day per unit, to obtain the day hidden state output by each day gated recurrent unit; an attention operation is then performed on the day hidden states to obtain the day feature vector of that month.
Referring to fig. 2, fig. 2 shows a schematic diagram of the daily recurrent neural network provided in an embodiment of the present application. The basic unit of the daily recurrent neural network is the day gated recurrent unit. The day gated recurrent unit and the month and quarter gated recurrent units mentioned later have the same structure: all are gated recurrent units (GRUs); the prefixes day, month and quarter merely distinguish their use in different networks and are adopted for clarity of reference. A gated recurrent unit uses an update gate and a reset gate to alleviate the vanishing-gradient problem of standard recurrent networks; these gates determine which information becomes the final output of the unit. A notable property of the two gating mechanisms is that they can preserve information in long sequences, retaining it over time unless it is irrelevant to the prediction.
By way of example and not limitation, to facilitate understanding of how a gated recurrent unit works, its operation is illustrated below using the day gated recurrent unit of fig. 3 as an example. The current day gated recurrent unit obtains two gating states from the hidden state passed in by the previous day gated recurrent unit and the day's historical sales sequence input at the current node:
r = σ(W_r · [h_{t-1}, x_t])

z = σ(W_z · [h_{t-1}, x_t])
where σ is the sigmoid function, which maps data to values in the range 0-1 so they can act as gating signals; W_r is a parameter matrix; x_t is the input vector at time t; h_{t-1} is the hidden vector at time t-1; r is the gate controlling the reset; and z is the gate controlling the update.
After the two gating states are obtained, the reset gate is first applied to obtain the reset data h'_{t-1} = r ⊙ h_{t-1}; then h'_{t-1} and x_t are concatenated, and the result is scaled to the range -1 to 1 through a tanh activation function:

h' = tanh(W · [r ⊙ h_{t-1}, x_t])
and finally, performing two steps of forgetting and memorizing at the same time by utilizing the updating gate control to obtain the day hidden state.
ht=(1-z)⊙ht-1+z⊙h
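The three equations can be sketched as one plain-Python GRU step. Weight shapes are illustrative, and bias terms are omitted for brevity:

```python
import math

def gru_step(x_t, h_prev, Wr, Wz, Wh):
    """One gated recurrent unit step following the equations above.
    x_t, h_prev: vectors as lists; Wr, Wz, Wh: weight matrices (lists of
    rows) applied to the concatenation [h_prev; x_t]. Biases omitted."""
    sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))
    matvec = lambda W, v: [sum(w * e for w, e in zip(row, v)) for row in W]
    concat = h_prev + x_t
    r = [sigmoid(v) for v in matvec(Wr, concat)]           # reset gate
    z = [sigmoid(v) for v in matvec(Wz, concat)]           # update gate
    reset_concat = [ri * hi for ri, hi in zip(r, h_prev)] + x_t
    h_tilde = [math.tanh(v) for v in matvec(Wh, reset_concat)]  # candidate h'
    # h_t = (1 - z) ⊙ h_prev + z ⊙ h'
    return [(1 - zi) * hi + zi * ci for zi, hi, ci in zip(z, h_prev, h_tilde)]
```

With all-zero weights every gate evaluates to 0.5 and the candidate to 0, so the new hidden state is half the previous one, which is a quick sanity check on the update equation.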
After the day hidden states are obtained, an attention operation can be performed on them; applying the attention operation to the hidden states of all days in a month yields the day feature vector of that month.
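The application does not pin down the exact attention form, so the sketch below assumes a simple dot-product scoring followed by a softmax-weighted sum of the day hidden states; the scoring vector `scores_w` is an illustrative learned parameter:

```python
import math

def attention_pool(hidden_states, scores_w):
    """Pool day hidden states into one day feature vector.
    hidden_states: list of vectors (one per day); scores_w: scoring vector.
    Assumed dot-product attention, not necessarily the patent's variant."""
    scores = [sum(w * h for w, h in zip(scores_w, hs)) for hs in hidden_states]
    m = max(scores)                                  # stabilize the softmax
    exp = [math.exp(s - m) for s in scores]
    total = sum(exp)
    alphas = [e / total for e in exp]                # attention weights
    dim = len(hidden_states[0])
    return [sum(a * hs[d] for a, hs in zip(alphas, hidden_states))
            for d in range(dim)]
```
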
In one embodiment, the number of day gated recurrent units is a specified number, for example 31. With 31 day gated recurrent units, a month of fewer than 31 days, such as February (usually only 28 days), can have its sales padded with 0 up to 31 days. For example, the historical sales data of the three days after the end of February is filled with 0.
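The zero-padding for short months can be sketched as:

```python
def pad_month(daily_sales, units=31):
    """Pad one month's daily sales with zeros so every month feeds the
    same fixed number (here 31) of day gated recurrent units."""
    return daily_sales + [0.0] * (units - len(daily_sales))
```
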
And B2, inputting the historical sales sequence into the quarterly recurrent neural network to obtain the quarterly hidden state of each quarter.
In an embodiment, to extract the feature in the dimension of the quarter, the B2 specifically includes:
and aiming at each quarter, inputting the historical sales sequence of each month belonging to the quarter into a quarter gating circulation unit of the quarter cyclic neural network to obtain a quarter hidden state output by the quarter gating circulation unit.
Referring to fig. 4, fig. 4 shows a schematic diagram of the quarterly recurrent neural network provided by an embodiment of the present application. The quarterly recurrent neural network also uses gated recurrent units as basic units and, based on the four quarters of a year, constructs a recurrent neural network comprising four gated recurrent units (namely quarter gated recurrent units), where the output of each unit represents the features of the 3 months of the corresponding quarter. The data processing of each quarter gated recurrent unit is similar to that of the day gated recurrent unit, the difference lying mainly in the input data; that is, the processing of the quarter gated recurrent unit can be inferred by analogy with that of the day gated recurrent unit and is not described again here.
And B3, inputting the historical sales volume sequence of each month and the splicing characteristics of the corresponding month into the monthly recurrent neural network to obtain a sales volume prediction result.
In the embodiments of the present application, the minimum unit of the historical sales data is the day, and the data can further be divided in units of months and quarters. Historical sales data divided by days reflects the periodic variation trend of sales; data divided by months reflects the off-season and peak season of sales; and data divided by quarters reflects the seasonality and growth trend of sales. The three time units reflect the variation of sales from different dimensions, so to improve the accuracy of sales prediction, the sales variation across these dimensions can be combined.
Specifically, the first recurrent neural model may include a daily recurrent neural network, a monthly recurrent neural network, and a quarterly recurrent neural network. The historical sales sequence is input into the daily recurrent neural network to obtain the day feature vectors; the historical sales sequence is input into the quarterly recurrent neural network to obtain the quarterly hidden state of each quarter; and the historical sales sequence of each month, the day feature vector of that month, and the quarterly hidden state of the quarter that month belongs to are spliced to obtain the splicing feature of the month. It can be appreciated that the splicing feature integrates sales variation across different dimensions. Referring to fig. 5, fig. 5 shows a schematic diagram of the monthly recurrent neural network provided by an embodiment of the present application. As can be seen from fig. 5, the historical sales sequence of each month and the splicing feature of the corresponding month are input into the monthly recurrent neural network, which predicts the sales prediction result of the product to be predicted. The data processing of the month gated recurrent unit can likewise be inferred by analogy with that of the day gated recurrent unit and is not described again here.
In some embodiments, to understand the structure of the first recurrent neural model more clearly, reference may be made to fig. 6, which shows the coupling relationships among the daily, monthly, and quarterly recurrent neural networks. The daily recurrent neural network takes the historical sales sequence in units of days and outputs attention-weighted day feature vectors to the monthly recurrent neural network; the quarterly recurrent neural network takes the historical sales sequence of each quarter and outputs each quarter's hidden state to the monthly recurrent neural network. After receiving the day feature vectors and the quarterly hidden states, the monthly recurrent neural network splices each month's sales sequence with the corresponding day feature vector and the hidden state of the quarter the month belongs to, forming the splicing features. Finally, each month's sales sequence and splicing feature serve as the input of the monthly recurrent neural network, so that, combining features of the three dimensions of day, month, and quarter, the monthly recurrent neural network can accurately output the sales prediction result of the product to be predicted.
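The splicing of month-level inputs described above reduces to a concatenation of three vectors; the function name is illustrative:

```python
def splice_month_input(month_seq, day_feature, quarter_hidden):
    """Build the month-level input by concatenating the month's sales
    sequence, its day feature vector, and the hidden state of the
    quarter the month belongs to."""
    return list(month_seq) + list(day_feature) + list(quarter_hidden)
```
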
In summary, for historical sales data whose time length is greater than or equal to the set threshold, processing with the first recurrent neural model fully mines the distinct characteristics of the historical sales data in the three dimensions of days, months, and quarters, better reflects past sales variation, and thus improves the accuracy of sales prediction.
In one embodiment, for the historical sales data with the time length less than the set threshold, because the historical sales data of the product to be predicted is insufficient, if the prediction model is directly constructed and the sales of the product to be predicted is predicted based on the historical sales data, an unstable or overfitting phenomenon may occur, resulting in poor generalization capability of the prediction model. In order to solve the problem of poor generalization capability of the prediction model caused by insufficient historical sales data, the trained second recurrent neural model can be used for predicting the sales of the product to be predicted.
In the embodiment of the present application, referring to fig. 7, fig. 7 shows a schematic structural diagram of a second recurrent neural model to be trained, and specifically, the second recurrent neural model may be obtained by training through the following steps:
and C1, determining a reference product according to the product to be predicted, wherein the time length corresponding to the reference sales data of the reference product is greater than or equal to the set threshold.
To improve the prediction effect, a transfer learning mechanism is introduced: by transferring what is learned from products with more than one year of historical sales to products with less than one year, the sales prediction accuracy for the latter can be improved. Taking a product with more than one year of historical sales as the reference product, the rule for determining the reference product may be: it belongs to the same general international category as the product to be predicted; and/or it is sold by the same merchant as the product to be predicted. With this rule, the determined reference product provides a more meaningful reference. For ease of distinction, the historical sales data of the reference product is recorded as reference sales data.
And C2, selecting reference sales data corresponding to the time of the historical sales data for preprocessing to obtain a reference sales sequence.
After the reference product is determined, its reference sales data needs to be screened and preprocessed. First, the reference sales data to be processed can be selected according to the time period to which the historical sales data belongs. For example, if the historical sales data of the product to be predicted covers months 6, 7 and 8, the reference sales data for months 6, 7 and 8 is selected accordingly. After selection, the reference sales data undergoes the same preprocessing as the historical sales data to obtain a reference sales sequence. For the specific preprocessing steps, reference may be made to the description of step 102 and its refinement steps, which are not repeated here.
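The period-matching selection above can be sketched as a simple filter. The period keys below are illustrative assumptions, not values from the patent.

```python
def select_reference_data(reference_sales, history_periods):
    """Keep only the reference product's sales for periods that the
    product to be predicted actually has history for (months 6-8 in
    the text's example). Period keys here are hypothetical."""
    return {p: v for p, v in reference_sales.items() if p in history_periods}

reference = {"2020-05": 90, "2020-06": 120, "2020-07": 140,
             "2020-08": 130, "2020-09": 100}
selected = select_reference_data(reference, {"2020-06", "2020-07", "2020-08"})
print(sorted(selected))  # ['2020-06', '2020-07', '2020-08']
```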
And C3, inputting the historical sales sequence and the reference sales sequence into a feature extraction network of the second recurrent neural model to be trained for feature extraction, to obtain a historical sales feature vector and a reference sales feature vector respectively.
The feature extraction network may include a bidirectional recurrent neural network and a fully-connected network, an output of the bidirectional recurrent neural network being an input to the fully-connected network. Wherein the basic unit of the bidirectional recurrent neural network is also a gated recurrent unit.
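Since the text names the gated recurrent unit (GRU) as the basic unit of the bidirectional recurrent network, a minimal pure-NumPy sketch may help; the weights are random stand-ins rather than trained values, and the dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal gated recurrent unit -- the basic unit named in the text.
    Weights are random stand-ins, not trained values."""
    def __init__(self, n_in, n_hid):
        s = 0.1
        self.Wz, self.Uz = s * rng.standard_normal((n_hid, n_in)), s * rng.standard_normal((n_hid, n_hid))
        self.Wr, self.Ur = s * rng.standard_normal((n_hid, n_in)), s * rng.standard_normal((n_hid, n_hid))
        self.Wh, self.Uh = s * rng.standard_normal((n_hid, n_in)), s * rng.standard_normal((n_hid, n_hid))
        self.n_hid = n_hid

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)   # update gate
        r = sigmoid(self.Wr @ x + self.Ur @ h)   # reset gate
        h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h))
        return (1 - z) * h + z * h_tilde

def bidirectional_features(seq, fwd, bwd):
    """Run a forward and a backward GRU over the sequence and concatenate
    the final hidden states, as a bidirectional recurrent network does;
    the result would then feed the fully-connected network."""
    hf, hb = np.zeros(fwd.n_hid), np.zeros(bwd.n_hid)
    for x in seq:
        hf = fwd.step(x, hf)
    for x in reversed(seq):
        hb = bwd.step(x, hb)
    return np.concatenate([hf, hb])

seq = [np.array([s]) for s in [1.0, 2.0, 3.0, 2.5]]  # toy sales sequence
feat = bidirectional_features(seq, GRUCell(1, 4), GRUCell(1, 4))
print(feat.shape)  # (8,)
```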
And C4, respectively inputting the historical sales characteristic vector and the reference sales characteristic vector into a discrimination network of a second recurrent neural model to be trained for discrimination to obtain a domain classification result.
The second recurrent neural model is provided with a discrimination network, which outputs a corresponding domain classification result according to the historical sales feature vector and the reference sales feature vector; that is, it outputs which product's historical sales data has a time length greater than or equal to the set threshold, and which has a time length less than the set threshold.
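As a hedged sketch of the domain classification, a logistic classifier can stand in for the discrimination network; the real network is learned during training, and the weights below are illustrative only.

```python
import numpy as np

def domain_classify(feature_vec, w, b):
    """Logistic stand-in for the discrimination network: the probability
    that a feature vector comes from the reference product (history >=
    threshold) rather than the product to be predicted. w and b are
    illustrative fixed weights, not trained values."""
    p = 1.0 / (1.0 + np.exp(-(np.dot(w, feature_vec) + b)))
    return ("reference" if p >= 0.5 else "to_predict"), p

w = np.array([1.0, -0.5])
label, p = domain_classify(np.array([2.0, 0.0]), w, b=0.0)
print(label)  # reference
```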
And C5, inputting the reference sales characteristic vector into a neural prediction network of the second recurrent neural model to be trained for prediction to obtain a training prediction result.
The second recurrent neural model is also provided with a neural prediction network, which outputs a training prediction result according to the reference sales feature vector; the training prediction result is the sales data of the product to be predicted as predicted from the reference sales features. The historical sales data of the product to be predicted is used as the target of the recurrent neural network and compared with the training prediction result to obtain the loss of the neural prediction network; it can be appreciated that the closer the training prediction result is to the historical sales data, the higher the prediction accuracy of the recurrent neural network.
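The comparison between the training prediction result and the historical sales target can be sketched with a mean-squared-error loss; the patent does not name a specific loss function, so MSE is an assumption.

```python
import numpy as np

def prediction_loss(train_prediction, historical_sales):
    """Mean-squared-error stand-in for the neural prediction network's
    loss: the closer the training prediction is to the historical sales
    target, the smaller the loss. MSE is an assumed choice."""
    p = np.asarray(train_prediction, dtype=float)
    t = np.asarray(historical_sales, dtype=float)
    return float(np.mean((p - t) ** 2))

print(prediction_loss([10.0, 12.0], [10.0, 12.0]))  # 0.0 -- perfect prediction
print(prediction_loss([10.0, 12.0], [11.0, 13.0]))  # 1.0
```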
And C6, optimizing the second recurrent neural model to be trained by adopting gradient inversion based on the domain classification result and the training prediction result to obtain the trained second recurrent neural model.
In the present application, the second recurrent neural model combines a feature extraction network, a neural prediction network and a discrimination network. The discrimination network, by attempting to distinguish the product to be predicted from the reference product, drives the feature extraction network to complete feature transfer learning, so that the neural prediction network can output the sales prediction result of the product to be predicted from the reference sales feature vector of the reference product.
In the training optimization iterations of the second recurrent neural model, a Gradient Reversal mechanism is introduced: the second recurrent neural model may be provided with a Gradient Reversal Layer (GRL). After the domain classification result and the training prediction result are obtained, the loss-function gradient of the neural prediction network and that of the discrimination network are calculated separately, and the feature extraction network, the neural prediction network and the discrimination network are then iteratively optimized based on the two gradients and the gradient reversal mechanism. It should be noted that gradient reversal acts only on the discrimination network. During its training, the discrimination network should be unable to distinguish the product to be predicted from the reference product, so the larger its loss, the better; to keep the iterative updates moving in the same direction, the gradient of the discrimination network is inverted. The gradient direction is automatically reversed during backward propagation, while only an identity transformation is performed during forward propagation:
Rλ(x) = x

dRλ(x)/dx = −λI

where Rλ denotes the gradient reversal layer. Rλ(x) = x means that the layer does not change the value during forward propagation and only performs an identity transformation, while dRλ(x)/dx = −λI means that the gradient passed back through the layer is multiplied by a negative number (−λ), which makes the training targets of the networks before and after the gradient reversal layer opposite to each other, so as to achieve an adversarial effect.
The formula for λ is:
λ = 2/(1 + exp(−γ·ρ)) − 1
where γ is a hyper-parameter, generally set to 10, and ρ is the ratio of the current number of training steps to the total number of training steps, increasing from 0 to 1 as training progresses.
A gradient reversal layer may be disposed between the feature extraction network and the discrimination network; during backward propagation, the gradient of the discrimination network's domain classification loss is automatically reversed (with λ varying dynamically with the number of iterations) before being propagated back to the feature extraction network.
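The gradient reversal layer and the λ schedule can be sketched numerically. This is a minimal pure-NumPy illustration; in practice the layer sits inside an automatic-differentiation framework.

```python
import numpy as np

def grl_forward(x):
    # Forward pass: identity transformation, value unchanged.
    return x

def grl_backward(grad, lam):
    # Backward pass: the incoming gradient is multiplied by -lambda,
    # reversing its direction before it reaches the feature extractor.
    return -lam * grad

def grl_lambda(step, total_steps, gamma=10.0):
    # lambda schedule from the text: rho = current step / total steps,
    # lambda = 2 / (1 + exp(-gamma * rho)) - 1, rising from 0 toward 1.
    rho = step / total_steps
    return 2.0 / (1.0 + np.exp(-gamma * rho)) - 1.0

x = np.array([0.5, -1.2])
print(np.allclose(grl_forward(x), x))   # True -- identity on forward
print(grl_lambda(0, 100))               # 0.0 at the start of training
print(round(grl_lambda(100, 100), 4))   # 0.9999 near the end
```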
It should be understood that the training of the second recurrent neural model cannot be completed in a single pass; the above training process is repeated continuously, and training stops once the respective loss-function gradients of the neural prediction network and the discrimination network meet the preset condition, yielding the trained second recurrent neural model.
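The repeat-until-converged loop can be sketched as follows. The stopping rule and the toy gradient dynamics are stand-ins, since the text only says the gradients must "meet the preset condition".

```python
def train_until_converged(step_fn, grad_norms_fn, tol=1e-3, max_epochs=1000):
    """Repeat the training process, stopping once the loss-function
    gradients of both the neural prediction network and the
    discrimination network fall below a preset tolerance (assumed
    stopping rule)."""
    for epoch in range(1, max_epochs + 1):
        step_fn()
        if all(g < tol for g in grad_norms_fn()):
            return epoch
    return max_epochs

# Toy stand-in: both "gradient norms" shrink geometrically each epoch.
state = {"pred_grad": 1.0, "disc_grad": 1.0}

def step():
    state["pred_grad"] *= 0.5
    state["disc_grad"] *= 0.4

epochs = train_until_converged(step, lambda: (state["pred_grad"], state["disc_grad"]), tol=1e-3)
print(epochs)  # 10
```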
In some embodiments, after the trained second recurrent neural model is obtained, the model can be used to predict sales of the product to be predicted, whose historical sales are less than one year, and the specific prediction steps are as follows:
and D1, inputting the reference sales volume sequence into the trained feature extraction network of the second recurrent neural model for feature extraction, and obtaining a feature vector to be predicted.
And D2, inputting the feature vector to be predicted into the neural prediction network of the trained second recurrent neural model for prediction, and obtaining the sales prediction result of the product to be predicted.
The trained second recurrent neural model fully learns the characteristics of the product to be predicted and the characteristics of the reference product, the problem that the prediction model is unstable or over-fitted due to insufficient historical sales data can be solved, and the reliability and stability of prediction can be improved. In the process of prediction, only the reference sales sequence needs to be input into the trained second recurrent neural model. Specifically, when the reference sales volume sequence is processed by the second recurrent neural model, the reference sales volume sequence is input into the feature extraction network to obtain a feature vector to be predicted, the feature vector to be predicted is input into the neural prediction network to be predicted, and a prediction result output by the neural prediction network is a sales volume prediction result of a product to be predicted.
It should be understood that, since the time length of the reference sales data of the reference product introduced in the training process of the second recurrent neural model is greater than or equal to the set threshold, the time length of the sales prediction result obtained when applying the second recurrent neural model may be longer than the time length of the historical sales data of the product to be predicted. For example, assuming that the product to be predicted is M1, with historical sales data for 3 months, and the reference product is M2, with historical sales data for half a year, then when predicting the sales data of M1, sales can be predicted for 3 months, half a year, or a year, without being limited to 3 months.
From the above, for the historical sales data with the time length smaller than the set threshold, the trained second recurrent neural model is used for data processing, so that the phenomenon of overfitting of the second recurrent neural model can be reduced, the reliability of the second recurrent neural model is improved, and the accuracy of sales prediction is improved.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 8 shows a block diagram of a configuration of the sales predicting apparatus 8 according to the embodiment of the present application, which corresponds to the sales predicting method according to the above embodiment, and only the relevant portions according to the embodiment of the present application are shown for convenience of explanation.
Referring to fig. 8, the sales prediction apparatus 8 includes:
a data obtaining module 801, configured to obtain historical sales data of a product to be predicted;
the data processing module 802 is configured to preprocess the historical sales data to obtain a historical sales sequence;
the first prediction module 803 is configured to, if the time length corresponding to the historical sales data is greater than or equal to a set threshold, input the historical sales sequence into a trained first recurrent neural model for prediction to obtain a sales prediction result of a product to be predicted, where the first recurrent neural model includes N layers of recurrent neural networks, N is greater than or equal to 3, and N is an integer;
and the second prediction module 804 is configured to predict a product to be predicted according to a trained second recurrent neural model if the time length corresponding to the historical sales data is smaller than a set threshold, so as to obtain a sales prediction result of the product to be predicted, wherein the second recurrent neural model includes a feature extraction network and a neural prediction network, and a training process of the second recurrent neural model is related to the historical sales sequence.
Optionally, the historical sales data is in units of days, and the data processing module 802 may include:
the first processing unit is used for filling missing data according to a date interval of the missing data if the historical sales data has the missing data, wherein the date interval is a date between the missing data and the nearest non-missing data;
a second processing unit for performing data smoothing and normalization processing on the filled historical sales data to obtain a historical sales sequence
Optionally, the first recurrent neural model includes a daily recurrent neural network, a monthly recurrent neural network, and a quarterly recurrent neural network, and the first prediction module 803 may include:
the first prediction unit is used for inputting the historical sales sequence into the daily recurrent neural network to obtain a daily feature vector;
the second prediction unit is used for inputting the historical sales volume sequence into the quarterly cyclic neural network to obtain the quarterly hidden state of each quarterly;
and the third prediction unit is used for inputting the historical sales sequence of each month and the splicing feature of the corresponding month into the monthly recurrent neural network to obtain the sales prediction result, wherein the splicing feature is obtained by splicing the historical sales sequence of each month, the daily feature vector of the corresponding month, and the quarterly hidden state of the quarter to which the corresponding month belongs.
Optionally, the first prediction unit may be specifically configured to, for each month, divide the historical sales sequence of that month in units of days, input the divided sequences into the daily gated recurrent units of the daily recurrent neural network to obtain the daily hidden state output by each daily gated recurrent unit, and perform an attention operation on the daily hidden states to obtain the daily feature vector of that month.
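The attention operation over the per-day hidden states can be sketched as follows. The patent names an "attention operation" but not its exact form, so dot-product attention with a query vector is used as a common stand-in.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def day_attention(day_hidden_states, query):
    """Dot-product attention over one month's per-day hidden states:
    score each day's state against a query vector, softmax the scores,
    and return the weighted sum as the month's daily feature vector.
    The exact attention form is an assumption."""
    H = np.stack(day_hidden_states)   # (days, hidden)
    weights = softmax(H @ query)      # one weight per day, summing to 1
    return weights @ H                # (hidden,)

rng = np.random.default_rng(1)
states = [rng.standard_normal(4) for _ in range(30)]  # 30 daily hidden states
vec = day_attention(states, rng.standard_normal(4))
print(vec.shape)  # (4,)
```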
Optionally, the second prediction unit may be specifically configured to, for each quarter, input the historical sales sequences of the months belonging to that quarter into a quarterly gated recurrent unit of the quarterly recurrent neural network, to obtain the quarterly hidden state of that quarter output by the quarterly gated recurrent unit.
Optionally, the sales prediction apparatus 8 may include a model training module; the model training module may include:
the reference product determining unit is used for determining a reference product according to the product to be predicted, and the time length corresponding to the reference sales data of the reference product is greater than or equal to a set threshold value;
the data processing module 802 may be further configured to pre-process reference sales data corresponding to the time of the historical sales data to obtain a reference sales sequence;
the first training unit is used for inputting the historical sales sequence and the reference sales sequence into a feature extraction network of the second recurrent neural model to be trained for feature extraction, to obtain a historical sales feature vector and a reference sales feature vector respectively;
the second training unit is used for respectively inputting the historical sales characteristic vector and the reference sales characteristic vector into a discrimination network of a second recurrent neural model to be trained for discrimination to obtain a domain classification result;
the third training unit is used for inputting the reference sales feature vector into a neural prediction network of the second recurrent neural model to be trained for prediction, to obtain a training prediction result;
and the fourth training unit is used for optimizing the second recurrent neural model to be trained on the basis of the domain classification result and the training prediction result to obtain the trained second recurrent neural model.
Optionally, the second prediction module 804 may include:
the feature extraction unit is used for inputting the reference sales sequence into the feature extraction network of the trained second recurrent neural model for feature extraction, to obtain a feature vector to be predicted;
and the sales prediction unit is used for inputting the feature vector to be predicted into the neural prediction network of the trained second recurrent neural model for prediction, to obtain the sales prediction result of the product to be predicted.
It should be noted that, for the information interaction and execution process between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the method embodiment of the present application, and thus reference may be made to the method embodiment section for details, which are not described herein again.
Fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 9, the electronic device 9 of this embodiment includes: at least one processor 90 (only one shown in fig. 9), a memory 91, and a computer program 92 stored in the memory 91 and executable on the at least one processor 90. When executing the computer program 92, the processor 90 implements the steps in any of the above-described sales prediction method embodiments, such as steps 101 to 109 shown in fig. 1.
The processor 90 may be a Central Processing Unit (CPU); it may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or any conventional processor.
The memory 91 may, in some embodiments, be an internal storage unit of the electronic device 9, such as a hard disk or memory of the electronic device 9. In other embodiments, the memory 91 may also be an external storage device of the electronic device 9, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card equipped on the electronic device 9. Further, the memory 91 may include both an internal storage unit and an external storage device of the electronic device 9. The memory 91 is used to store an operating system, application programs, a boot loader (BootLoader), data, and other programs, such as the program code of the computer program. The memory 91 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.
The embodiments of the present application provide a computer program product, which when running on a mobile terminal, enables the mobile terminal to implement the steps in the above method embodiments when executed.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and can implement the steps of the embodiments of the methods described above when the computer program is executed by a processor. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer readable medium may include at least: any entity or device capable of carrying computer program code to a photographing apparatus/electronic device, a recording medium, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium. Such as a usb-disk, a removable hard disk, a magnetic or optical disk, etc. In certain jurisdictions, computer-readable media may not be an electrical carrier signal or a telecommunications signal in accordance with legislative and patent practice.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A sales prediction method, comprising:
acquiring historical sales data of a product to be predicted;
preprocessing the historical sales data to obtain a historical sales sequence;
if the time length of the historical sales data is larger than or equal to a set threshold, inputting the historical sales sequence into a trained first cyclic neural model for prediction to obtain a sales prediction result of the product to be predicted, wherein the first cyclic neural model comprises N layers of cyclic neural networks, N is larger than or equal to 3, and N is an integer;
if the time length of the historical sales data is smaller than a set threshold, predicting the product to be predicted according to a trained second recurrent neural model to obtain a sales prediction result of the product to be predicted, wherein the second recurrent neural model comprises a feature extraction network and a neural prediction network, and the training process of the second recurrent neural model is related to the historical sales sequence.
2. The sales prediction method of claim 1, wherein the historical sales data is in days, and the preprocessing the historical sales data to obtain a historical sales sequence comprises:
if the historical sales data has missing data, filling the missing data according to a date interval of the missing data, wherein the date interval is a date between the missing data and the nearest non-missing data;
and performing data smoothing processing and normalization processing on the filled historical sales data to obtain the historical sales sequence.
3. The sales prediction method of claim 1 or claim 2, wherein the first recurrent neural model comprises a daily recurrent neural network, a monthly recurrent neural network, and a quarterly recurrent neural network, and the inputting the historical sales sequence into the trained first recurrent neural model for prediction to obtain the sales prediction result of the product to be predicted comprises:
inputting the historical sales volume sequence into the daily recurrent neural network to obtain a daily feature vector;
inputting the historical sales volume sequence into the quarterly recurrent neural network to obtain the quarterly hidden state of each quarterly;
inputting the historical sales volume sequence of each month and the splicing characteristics of the corresponding month into the monthly recurrent neural network to obtain the sales volume prediction result, wherein the splicing characteristics are obtained by splicing the historical sales volume sequence of each month, the daily feature vector of the corresponding month, and the quarterly hidden state of the quarter to which the corresponding month belongs.
4. The method of sales prediction of claim 3, wherein said inputting the historical sales sequence into the daily recurrent neural network results in a daily feature vector, comprising:
for each month, dividing the historical sales volume sequence of the month in units of days, inputting the divided historical sales volume sequences into the daily gated recurrent units of the daily recurrent neural network to obtain the daily hidden state output by each daily gated recurrent unit, and performing an attention operation on each daily hidden state to obtain the daily feature vector of the month.
5. The sales prediction method of claim 3, wherein inputting the historical sales sequence into the quarterly recurrent neural network results in quarterly hidden states for each quarter, comprising:
and for each quarter, inputting the historical sales sequences of the months belonging to the quarter into a quarterly gated recurrent unit of the quarterly recurrent neural network, to obtain the quarterly hidden state of the quarter output by the quarterly gated recurrent unit.
6. The sales prediction method of claim 1 or claim 2, wherein the training process of the second recurrent neural model comprises:
determining a reference product according to the product to be predicted, wherein the time length corresponding to the reference sales data of the reference product is greater than or equal to a set threshold;
preprocessing the reference sales data corresponding to the time of the historical sales data to obtain a reference sales sequence;
inputting the historical sales volume sequence and the reference sales volume sequence into a feature extraction network of the second recurrent neural model to be trained for feature extraction to respectively obtain a historical sales volume feature vector and a reference sales volume feature vector;
respectively inputting the historical sales characteristic vector and the reference sales characteristic vector into a discrimination network of the second recurrent neural model to be trained for discrimination to obtain a domain classification result;
inputting the reference sales characteristic vector into a neural prediction network of the second recurrent neural model to be trained for prediction to obtain a training prediction result;
and optimizing the second recurrent neural model to be trained based on the domain classification result and the training prediction result to obtain a trained second recurrent neural model.
7. The sales prediction method of claim 6, wherein the predicting the product to be predicted according to the trained second recurrent neural model to obtain the sales prediction result of the product to be predicted comprises:
inputting the reference sales volume sequence into a feature extraction network of the trained second recurrent neural model for feature extraction to obtain a feature vector to be predicted;
and inputting the feature vector to be predicted into the neural prediction network of the trained second recurrent neural model for prediction to obtain a sales prediction result of the product to be predicted.
8. A sales prediction apparatus, comprising:
the data acquisition module is used for acquiring historical sales data of a product to be predicted;
the data processing module is used for preprocessing the historical sales data to obtain a historical sales sequence;
the first prediction module is used for, if the time length corresponding to the historical sales data is greater than or equal to a set threshold, inputting the historical sales sequence into a trained first recurrent neural model for prediction to obtain a sales prediction result of the product to be predicted, wherein the first recurrent neural model comprises N layers of recurrent neural networks, N being an integer greater than or equal to 3;
and the second prediction module is used for, if the time length corresponding to the historical sales data is less than the set threshold, predicting the product to be predicted according to a trained second recurrent neural model to obtain the sales prediction result of the product to be predicted, wherein the second recurrent neural model comprises a feature extraction network and a neural prediction network, and the training process of the second recurrent neural model is related to the historical sales sequence.
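The routing between the two prediction modules hinges only on whether the historical span reaches the set threshold. A sketch of that dispatch logic — the 90-day value and the function names are illustrative assumptions, since the patent leaves the threshold unspecified:

```python
SET_THRESHOLD_DAYS = 90  # hypothetical value for the "set threshold"

def pick_model(history_span_days: int) -> str:
    """Route to the first model (a deep recurrent model with N >= 3 layers)
    when enough history exists, otherwise to the second (transfer-trained)
    model intended for short-history products."""
    if history_span_days >= SET_THRESHOLD_DAYS:
        return "first_recurrent_model"
    return "second_recurrent_model"
```
This split is the core design choice of the apparatus: long-history products have enough signal to fit a deep recurrent model directly, while short-history products borrow structure from reference products via the adversarially trained second model.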
9. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the sales prediction method according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the sales prediction method according to any one of claims 1 to 7.
CN202110595289.6A 2021-05-28 2021-05-28 Sales forecasting method, sales forecasting device and electronic equipment Pending CN113283936A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110595289.6A CN113283936A (en) 2021-05-28 2021-05-28 Sales forecasting method, sales forecasting device and electronic equipment


Publications (1)

Publication Number Publication Date
CN113283936A 2021-08-20

Family

ID=77282449

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110595289.6A Pending CN113283936A (en) 2021-05-28 2021-05-28 Sales forecasting method, sales forecasting device and electronic equipment

Country Status (1)

Country Link
CN (1) CN113283936A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170076357A1 (en) * 2015-09-15 2017-03-16 Google Inc. Guided purchasing via smartphone
CN109447716A * 2018-11-09 2019-03-08 Sichuan Changhong Electric Co., Ltd. Sales forecasting method and server based on recurrent neural network
WO2019109790A1 * 2017-12-08 2019-06-13 Beijing Jingdong Shangke Information Technology Co., Ltd. Sales volume prediction method and device, and computer-readable storage medium
CN109886747A * 2019-02-22 2019-06-14 NetEase (Hangzhou) Network Co., Ltd. Sales forecasting method, medium, apparatus and computing device
CN111144923A * 2019-11-11 2020-05-12 Computer Network Information Center, Chinese Academy of Sciences Product cycle demand prediction method based on deep recurrent neural network
CN111160968A * 2019-12-27 2020-05-15 Tsinghua University SKU-level commodity sales prediction method and device
CN111612489A * 2019-02-25 2020-09-01 Beijing Didi Infinity Technology and Development Co., Ltd. Order quantity prediction method and device and electronic equipment


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113888235A * 2021-10-22 2022-01-04 Chuangyou Digital Technology (Guangdong) Co., Ltd. Training method of sales forecasting model, sales forecasting method and related device
CN113962750A * 2021-11-16 2022-01-21 Chongqing University of Posts and Telecommunications Multi-scale information automobile sales volume big data prediction method based on attention mechanism
CN113962750B * 2021-11-16 2023-09-19 Shenzhen Nanfang Zhongyue Technology Co., Ltd. Multi-scale information automobile sales volume big data prediction method based on attention mechanism
CN114363193A * 2022-01-04 2022-04-15 Beijing Dajia Internet Information Technology Co., Ltd. Method and device for training resource prediction model and method and device for resource prediction
CN114363193B * 2022-01-04 2024-01-09 Beijing Dajia Internet Information Technology Co., Ltd. Training method and device of resource prediction model, and resource prediction method and device

Similar Documents

Publication Publication Date Title
Weytjens et al. Cash flow prediction: MLP and LSTM compared to ARIMA and Prophet
US20190325514A1 (en) Credit risk prediction method and device based on lstm model
CN113283936A (en) Sales forecasting method, sales forecasting device and electronic equipment
CN107590688A (en) The recognition methods of target customer and terminal device
US11222262B2 (en) Non-Markovian control with gated end-to-end memory policy networks
CN111352965B (en) Training method of sequence mining model, and processing method and equipment of sequence data
CN110020866B (en) Training method and device for recognition model and electronic equipment
CN112988840A (en) Time series prediction method, device, equipment and storage medium
CN116542673B (en) Fraud identification method and system applied to machine learning
CN115860802A (en) Product value prediction method, device, computer equipment and storage medium
CN115860505A (en) Object evaluation method and device, terminal equipment and storage medium
CN114418776A (en) Data processing method, device, terminal equipment and medium
US20160012536A1 (en) Methods and systems for addressing convexity in automated valuation of financial contracts
CN113724017A (en) Pricing method and device based on neural network, electronic equipment and storage medium
CN108256667A (en) Asset data processing method, device, storage medium and computer equipment
CN113435900A (en) Transaction risk determination method and device and server
Cornelissen A study on forecasting SOFR with a recurrent neural network using long short-term memory cells
CN111179070A (en) Loan risk timeliness prediction system and method based on LSTM
CN116151635B (en) Optimization method and device for decision-making of anti-risk enterprises based on multidimensional relation graph
CN114756720B (en) Time sequence data prediction method and device
KR102519878B1 (en) Apparatus, method and recording medium storing commands for providing artificial-intelligence-based risk management solution in credit exposure business of financial institution
US20230419128A1 (en) Methods for development of a machine learning system through layered gradient boosting
US20240119315A1 (en) Methods for development of a machine learning system through layered gradient boosting
Kushwaha et al. Prospective Stock Analysis Model to improve the investment chances using Machine Learning
Choi 6-Parametric factor model with long short-term memory

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 2021-08-20