CN117688377A - Article index prediction method and device, computer storage medium and electronic equipment - Google Patents


Info

Publication number
CN117688377A
CN117688377A
Authority
CN
China
Prior art keywords
historical, future, attribute information, transaction amount, trained
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211098175.1A
Other languages
Chinese (zh)
Inventor
王鑫 (Wang Xin)
张建申 (Zhang Jianshen)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Wodong Tianjun Information Technology Co Ltd
Original Assignee
Beijing Wodong Tianjun Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Wodong Tianjun Information Technology Co Ltd filed Critical Beijing Wodong Tianjun Information Technology Co Ltd
Priority to CN202211098175.1A
Publication of CN117688377A
Legal status: Pending


Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The disclosure relates to the field of computer technology and provides an item index prediction method and apparatus, a computer storage medium, and an electronic device. The item index prediction method includes: acquiring attribute information of a target item and historical index data of the target item, the historical index data including historical prices, historical access amounts, and historical transaction amounts of the target item over a plurality of historical periods; predicting, by a trained index prediction model, a future price of the target item over at least one future period based on the attribute information and the historical prices, and predicting, by the index prediction model, a future access amount of the target item over the at least one future period based on the attribute information and the historical access amounts; and predicting, by the index prediction model, a future transaction amount of the target item over the at least one future period based on the historical transaction amounts, the future price, and the future access amount. The method and apparatus can improve prediction accuracy.

Description

Article index prediction method and device, computer storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of computer technologies, and in particular to an item index prediction method, an item index prediction apparatus, a computer storage medium, and an electronic device.
Background
With the maturation and wide application of technologies such as big data and machine learning, these emerging technologies are beginning to be applied to the prediction of item index data, and such predictions are an important basis for an enterprise to formulate its next operating plan.
Currently, various machine learning models are generally used to predict the future transaction amount of an item based only on its historical transaction amount; however, such schemes tend to have low prediction accuracy.
In view of the foregoing, there is a need in the art to develop a new method and apparatus for predicting an item index.
It should be noted that the information disclosed in the foregoing background section is only for enhancing understanding of the background of the present disclosure.
Disclosure of Invention
The disclosure aims to provide an item index prediction method and apparatus, a computer storage medium, and an electronic device, so as to overcome, at least to some extent, the problem of low prediction accuracy caused by the limitations of the related art.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to a first aspect of the present disclosure, there is provided an item index prediction method, including: acquiring attribute information of a target item and historical index data of the target item, the historical index data including historical prices, historical access amounts, and historical transaction amounts of the target item over a plurality of historical periods; predicting, by a trained index prediction model, a future price of the target item over at least one future period based on the attribute information and the historical prices, and predicting, by the index prediction model, a future access amount of the target item over at least one future period based on the attribute information and the historical access amounts; and predicting, by the index prediction model, a future transaction amount of the target item over at least one future period based on the historical transaction amounts, the future price, and the future access amount.
In an exemplary embodiment of the present disclosure, after predicting the future transaction amount of the target item over at least one future period, the method further comprises: acquiring the inventory quantity of the target item; and determining a stocking quantity of the target item according to the future transaction amount and the inventory quantity, so as to restock the target item according to the stocking quantity.
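As a minimal sketch of this restocking step, assuming the stocking quantity is simply the predicted demand not already covered by current inventory (the text does not specify the exact rule, so this formula is an illustration only):

```python
def stocking_quantity(predicted_transactions, inventory):
    """Hypothetical restocking rule: stock the portion of predicted
    future demand not covered by the current inventory quantity.
    `predicted_transactions` is the list of predicted transaction
    amounts for the upcoming periods; `inventory` is on-hand stock."""
    total_demand = sum(predicted_transactions)
    # Never order a negative quantity.
    return max(0, total_demand - inventory)
```

For example, predicted demand of 12 units against 10 units on hand yields a stocking quantity of 2.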
In an exemplary embodiment of the present disclosure, the index prediction model is trained by: acquiring a training set; the training set comprises attribute information of a plurality of articles and historical index data of the articles; the historical index data comprises historical prices, historical access amounts and historical transaction amounts of the articles in a plurality of historical time periods; inputting the training set into a machine learning model to be trained, and performing iterative training on the machine learning model to be trained by using a gradient descent method; stopping training when the loss function of the machine learning model to be trained tends to converge, and obtaining a trained index prediction model; the loss function is a mean square error loss function.
In an exemplary embodiment of the present disclosure, after acquiring the training set, the method further comprises: performing data expansion processing on the training set; inputting the training set subjected to the data expansion processing into the machine learning model to be trained, and performing iterative training on the machine learning model to be trained by using a gradient descent method; and stopping training when the loss function of the machine learning model to be trained tends to converge, and obtaining the trained index prediction model.
In an exemplary embodiment of the present disclosure, the performing data expansion processing on the training set includes: word embedding representation is carried out on the attribute information of the plurality of articles, so that word embedding vectors corresponding to the attribute information are obtained; word embedding representation is carried out on the historical time periods, so that word embedding vectors corresponding to the historical time periods are obtained; data processing is carried out on the historical transaction amount to obtain a transaction amount processing sequence; and adding the word embedding vector corresponding to each attribute information, the word embedding vector corresponding to each history period and the transaction amount processing sequence into the training set to perform data expansion processing on the training set.
In an exemplary embodiment of the present disclosure, the attribute information of the plurality of items includes a category name; performing word embedding representation on the attribute information of the plurality of items to obtain the word embedding vectors corresponding to each piece of attribute information includes: converting the category names of the plurality of items into pinyin; sorting the plurality of category names in the alphabetical order of the pinyin initials; and generating the word embedding vector corresponding to each category name according to the total number of items and the sort position of the category name.
In an exemplary embodiment of the present disclosure, performing word embedding representation on the historical periods to obtain the word embedding vectors corresponding to each historical period includes: acquiring period features corresponding to the historical period, the period features including at least any two of the following: the month, the day of the week, the ordinal day of the year, and the ordinal day of the month of the historical period; and generating the word embedding vector corresponding to the historical period according to the period features.
In an exemplary embodiment of the disclosure, the processing the historical transaction amount data to obtain a transaction amount processing sequence includes: calculating a plurality of transaction amount averages for each of the items over a plurality of different numbers of historical time periods; and generating the transaction amount processing sequence according to the transaction amount mean values.
According to a second aspect of the present disclosure, there is provided an item index prediction apparatus comprising: an acquisition module for acquiring attribute information of a target item and historical index data of the target item, the historical index data comprising historical prices, historical access amounts, and historical transaction amounts of the target item over a plurality of historical periods; a first prediction module for predicting, by a trained index prediction model, a future price of the target item over at least one future period based on the attribute information and the historical prices, and predicting, by the index prediction model, a future access amount of the target item over at least one future period based on the attribute information and the historical access amounts; and a second prediction module for predicting, by the index prediction model, a future transaction amount of the target item over at least one future period based on the historical transaction amounts, the future price, and the future access amount.
According to a third aspect of the present disclosure, there is provided a computer storage medium having stored thereon a computer program which, when executed by a processor, implements the article index prediction method of the first aspect described above.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the item indicator prediction method of the first aspect described above via execution of the executable instructions.
As can be seen from the above technical solutions, the article index prediction method, the article index prediction device, the computer storage medium, and the electronic device in the exemplary embodiments of the present disclosure have at least the following advantages and positive effects:
in the technical solutions provided in some embodiments of the present disclosure, on the one hand, the attribute information of a target item and its historical index data are obtained, and a trained index prediction model predicts the item's future price over at least one future period from the attribute information and historical prices, and its future access amount over the at least one future period from the attribute information and historical access amounts. Because the price and access amount of the future period are predicted in advance, the transaction amount of the target item can then be predicted based on the price-volume relationship. On the other hand, the index prediction model predicts the future transaction amount of the target item over at least one future period from the historical transaction amounts together with the predicted future price and future access amount. This avoids the low accuracy of related-art schemes that rely on the historical transaction amount alone, takes into account the influence of the price-volume relationship on the transaction amount, improves the prediction accuracy of the future transaction amount, and makes the prediction result interpretable.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
FIG. 1 shows a flow diagram of a method of item indicator prediction in an embodiment of the present disclosure;
FIG. 2 is a flow chart of training to obtain an index prediction model in an embodiment of the disclosure;
FIG. 3 is a schematic flow chart of the data expansion process for the training set;
FIG. 4 is a flowchart illustrating word embedding representation of attribute information of a plurality of items to obtain word embedding vectors corresponding to each attribute information in an embodiment of the present disclosure;
FIG. 5 illustrates an overall block diagram of an item index prediction method in an embodiment of the present disclosure;
FIG. 6 illustrates an overall network architecture diagram of item index prediction in an embodiment of the present disclosure;
FIG. 7 is a schematic diagram showing the structure of an article index prediction device in an exemplary embodiment of the present disclosure;
fig. 8 illustrates a schematic structure of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. One skilled in the relevant art will recognize, however, that the aspects of the disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. in addition to the listed elements/components/etc.; the terms "first" and "second" and the like are used merely as labels, and are not intended to limit the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
In the related art, future transaction amounts are generally predicted with various regression techniques by stacking features built from recent and historical contemporaneous transaction amounts. However, feature stacking is subjective and experience-dependent, and a simple stacked network cannot capture the causes behind historical transaction amounts, so the prediction effect is not ideal.
In an embodiment of the present disclosure, an article index prediction method is provided first, which overcomes, at least to some extent, the defect of lower prediction accuracy in the related art.
Fig. 1 illustrates a flowchart of an item indicator prediction method in an embodiment of the disclosure, where an execution subject of the item indicator prediction method may be a server for item indicator prediction.
Referring to fig. 1, an item index prediction method according to one embodiment of the present disclosure includes the steps of:
step S110, acquiring attribute information of a target object and historical index data of the target object; the historical index data comprises historical prices, historical access amounts and historical transaction amounts of the target object in a plurality of historical time periods;
step S120, predicting the future price of the target object in at least one future period based on the attribute information and the historical price through the trained index prediction model, and predicting the future access amount of the target object in at least one future period based on the attribute information and the historical access amount through the index prediction model;
step S130, predicting future transaction amounts of the target item in at least one future period based on the historical transaction amounts, the future price and the future access amounts through the index prediction model.
In the technical solution provided in the embodiment shown in fig. 1, on the one hand, the attribute information of a target item and its historical index data are obtained, and a trained index prediction model predicts the item's future price over at least one future period from the attribute information and historical prices, and its future access amount over the at least one future period from the attribute information and historical access amounts. Because the price and access amount of the future period are predicted in advance, the transaction amount of the target item can then be predicted based on the price-volume relationship. On the other hand, the index prediction model predicts the future transaction amount of the target item over at least one future period from the historical transaction amounts together with the predicted future price and future access amount. This avoids the low accuracy of related-art schemes that rely on the historical transaction amount alone, takes into account the influence of the price-volume relationship on the transaction amount, improves the prediction accuracy of the future transaction amount, and makes the prediction result interpretable.
The specific implementation of each step in fig. 1 is described in detail below:
Before step S110, it should be noted that the present disclosure may pre-train an index prediction model, where the index prediction model functions as follows: predicting a future price of the item according to the attribute information and the historical price of the item, predicting a future access amount of the item according to the attribute information and the historical access amount of the item, and predicting a future transaction amount of the item according to the historical transaction amount, the future price and the future access amount of the item.
Specifically, the index prediction model can be obtained through training in the following manner:
referring to fig. 2, fig. 2 is a schematic flow chart of training to obtain an index prediction model in an embodiment of the disclosure, including step S201 to step S203:
in step S201, a training set is acquired.
In this step, a training set may be obtained and stored in a Hive table for subsequent model training and prediction. Hive is a Hadoop-based data warehouse tool for data extraction, transformation, and loading that can store, query, and analyze large-scale data kept in Hadoop. It maps structured data files to database tables, provides SQL query capabilities, and converts SQL statements into MapReduce tasks for execution. Its advantages are a low learning cost and fast MapReduce-style statistics through SQL-like statements, without the need to develop dedicated MapReduce applications. Hive is well suited to statistical analysis in a data warehouse.
The training set may include attribute information of a plurality of items and historical index data of the plurality of items.
The attribute information may include the category name of each item. A category is the last level of merchandise classification a customer considers in a purchase decision, from which a brand can be associated and a corresponding purchase choice made; example categories are washing machines, air conditioners, mobile phones, and beer. Each category corresponds to a unique SKU (Stock Keeping Unit) number, the basic unit of inventory in/out metering, which may be counted in pieces, boxes, pallets, and so on.
The historical index data may include a historical price, a historical access amount, and a historical transaction amount for each item over a plurality of historical periods.
The historical access amount may be a PV (Page View) count, i.e., the page browsing or click volume used to measure how many pages site users visit: within a statistics period, each time a user opens or refreshes a page it is counted once, and repeated opens or refreshes of the same page accumulate in the count.
By way of example, each historical period may be one day. Taking seven such one-day historical periods as an example, for item A the historical price may be [10,20,30,20,20,20,10], the historical access amount [1,2,3,2,2,2,1], and the historical transaction amount [1,2,3,2,2,2,1]; for item B the historical price may be [20,20,50,20,40,20,30], the historical access amount [4,2,3,2,3,2,2], and the historical transaction amount [8,2,3,4,2,2,1].
The length of each historical period and the number of historical periods may be set according to the actual situation, and the present disclosure places no particular limitation on them.
After the training set is obtained, step S202 may be performed, where the training set is input into the machine learning model to be trained, and the machine learning model to be trained is iteratively trained by using a gradient descent method.
In this step, the machine learning model to be trained may include two basic algorithm blocks: a TCN block (Temporal Convolutional Network, a temporal convolutional network composed of dilated, causal 1D convolutional layers whose input and output lengths are equal) and an LSTM block (Long Short-Term Memory, a recurrent neural network); the rest of the network is formed by MLPs (Multi-Layer Perceptrons).
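The text does not give the network in code, but the defining property of the TCN block named above, a causal, dilated 1D convolution whose output length equals its input length, can be illustrated in plain Python (the kernel and input values below are hypothetical):

```python
def causal_dilated_conv1d(seq, kernel, dilation=1):
    """Causal, dilated 1D convolution with left zero-padding so the
    output has the same length as the input and position t only sees
    inputs at or before t (the TCN property described in the text)."""
    k = len(kernel)
    pad = (k - 1) * dilation          # left padding keeps the filter causal
    padded = [0.0] * pad + list(seq)
    out = []
    for t in range(len(seq)):
        # Taps are spaced `dilation` steps apart and end at position t.
        out.append(sum(kernel[i] * padded[t + i * dilation] for i in range(k)))
    return out
```

For instance, the kernel [1, 1] produces a causal moving sum: each output uses only the current and previous inputs, never a future one, and stacking such layers with growing dilation gives the long receptive field a TCN relies on.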
The training set can be input into a machine learning model to be trained, and the machine learning model to be trained is iteratively trained by using a gradient descent method.
The gradient descent method is a first-order optimization algorithm, also commonly called the steepest descent method. To find a local minimum of a function with gradient descent, one iteratively steps a specified distance from the current point in the direction opposite to the gradient (or an approximate gradient) at that point.
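A minimal illustration of the iterative search just described, minimizing the hypothetical function f(x) = (x - 3)^2 via its gradient 2(x - 3):

```python
def gradient_descent(grad, x0, step=0.1, iters=100):
    """Plain gradient descent: repeatedly step a fixed distance
    against the gradient evaluated at the current point."""
    x = x0
    for _ in range(iters):
        x -= step * grad(x)
    return x

# Hypothetical example: minimize f(x) = (x - 3)**2, whose gradient is 2*(x - 3).
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

The iterates converge toward x = 3, the minimizer of the example function; in the model training above, the same idea is applied to the network weights with the loss gradient.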
In step S203, when the loss function of the machine learning model to be trained tends to converge, training is stopped, and a trained index prediction model is obtained.
In this step, the model's loss function may be set to the MSE (Mean Squared Error) loss function, which is a regression loss function.
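For reference, the mean squared error named here reduces to a few lines (the prediction and target values in the test are hypothetical):

```python
def mse_loss(predictions, targets):
    """Mean squared error: the average of the squared prediction
    errors, used as the regression loss during training."""
    assert len(predictions) == len(targets)
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(predictions)
```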
When the loss function of the machine learning model to be trained tends to converge, training can be stopped, and a trained index prediction model is obtained. And further, the trained index prediction model can be stored under a pre-designated HDFS path.
HDFS (Hadoop Distributed File System) is a distributed file system designed to run on general-purpose (commodity) hardware. It is highly fault tolerant, suitable for deployment on inexpensive machines, and provides high-throughput data access, making it well suited to applications with large data sets. HDFS relaxes a portion of the POSIX constraints to allow streaming access to file system data. It was originally developed as infrastructure for the Apache Nutch search engine project and is part of the Apache Hadoop Core project.
It should be noted that, after the training set is obtained in the step S201, data expansion processing may be performed on the training set to increase the data amount included in the training set, so as to increase the generalization capability of the finally obtained model, avoid over-fitting of the model, and increase the prediction accuracy of the model.
Referring to fig. 3, fig. 3 shows a flow chart of performing data expansion processing on a training set, including steps S301 to S304:
in step S301, word embedding is performed on attribute information of a plurality of items, and word embedding vectors corresponding to the respective attribute information are obtained.
In this step, taking the category name as the example of item attribute information, refer to fig. 4, which shows a flowchart of performing word embedding representation on the attribute information of a plurality of items to obtain the word embedding vector corresponding to each piece of attribute information in an embodiment of the present disclosure, including steps S401 to S403:
in step S401, the category names of the plurality of items are converted into pinyin.
In this step, the category names of the plurality of items may be converted into pinyin. For example, the category name of item A, "washing machine", converts to the pinyin "xiyiji", and the category name of item B, "mobile phone", converts to "shouji".
In step S402, the plurality of class names are ordered in the alphabetical order of pinyin.
In this step, taking eight items with eight category names as an example, the eight category names may be sorted by the first letter of their pinyin (for example in a-to-z order, or z-to-a order, which may be set according to the actual situation and is not particularly limited in this disclosure) to obtain a sorted sequence.
In step S403, the word embedding vector corresponding to each category name is generated according to the total number of items and the sort position of the category name.
In this step, still taking eight items in total as an example, assuming the category name of item A ranks 8th in the sorted sequence, the word embedding vector corresponding to the category name "washing machine" of item A can be determined to be [0,0,0,0,0,0,0,1].
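Steps S401 to S403 can be sketched as below, assuming the pinyin strings are already available (a real system would need a Chinese-to-pinyin conversion step; only "xiyiji" and "shouji" come from the text, and the other six names are made up so that eight categories can be sorted):

```python
def category_one_hot(pinyin_names):
    """Sort category pinyin names alphabetically (a-to-z; the text sorts
    by first letter, full-string sort is used here to break ties) and
    return a one-hot vector per name whose hot position is its rank."""
    ranked = sorted(pinyin_names)
    n = len(pinyin_names)
    vectors = {}
    for name in pinyin_names:
        vec = [0] * n
        vec[ranked.index(name)] = 1   # hot bit at the sort position
        vectors[name] = vec
    return vectors

# Hypothetical eight-category example; only the first two names
# ("xiyiji" = washing machine, "shouji" = mobile phone) are from the text.
names = ["xiyiji", "shouji", "kongtiao", "bingxiang",
         "dianshi", "pijiu", "diannao", "erji"]
vecs = category_one_hot(names)
```

With these eight names, "xiyiji" sorts last, reproducing the [0,0,0,0,0,0,0,1] vector from the example above.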
Referring next to fig. 3, in step S302, word embedding is performed on the history periods, and word embedding vectors corresponding to the history periods are obtained.
In this step, taking February 1, 2020 as an example historical period, the period features of that period may be obtained, for example: year: 2020; month: February; day of the week: Saturday; the 32nd day of 2020; the 1st day of February. A word embedding vector [6, 32, 1, 2] corresponding to the period can then be generated from these features, where 6 represents Saturday (counting Monday as 1), 32 represents the 32nd day of 2020, 1 represents the 1st day of the month, and 2 represents February.
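These period features can be reproduced with Python's standard datetime module; the feature order [day of week, day of year, day of month, month] follows the example vector [6, 32, 1, 2] for 2020-02-01:

```python
from datetime import date

def period_features(d):
    """Return [day-of-week, day-of-year, day-of-month, month] for one
    historical period (ISO weekday: Monday=1 ... Saturday=6)."""
    return [d.isoweekday(), d.timetuple().tm_yday, d.day, d.month]
```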
In step S303, data processing is performed on the historical transaction amount, and a transaction amount processing sequence is obtained.
In this step, taking one-day historical periods where the historical transaction amounts of item A cover 60 days as an example, the sales average of the previous 7 days, the previous 28 days, and the full 60 days may be calculated, and a transaction amount processing sequence obtained from these three averages, for example: [2, 2.5, 3.5].
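Step S303 can be sketched as computing trailing means over the 7-, 28-, and 60-day windows from the example (the daily amounts passed in are hypothetical):

```python
def transaction_sequence(daily_amounts, windows=(7, 28, 60)):
    """Build the transaction-amount processing sequence as the mean of
    the most recent 7, 28, and 60 daily transaction amounts; the window
    sizes follow the example in the text."""
    seq = []
    for w in windows:
        recent = daily_amounts[-w:]   # trailing window of the last w days
        seq.append(sum(recent) / len(recent))
    return seq
```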
It should be noted that, the specific values may be set or changed according to the actual situation, which is not particularly limited in the disclosure.
In step S304, the word embedding vector corresponding to each attribute information, the word embedding vector corresponding to each history period, and the transaction amount processing sequence are added to the training set, so as to perform data expansion processing on the training set.
In this step, the training set may be added with the word embedding vector corresponding to each attribute information, the word embedding vector corresponding to each history period, and the transaction amount processing sequence, so as to perform data expansion processing on the training set.
After the data expansion processing, the machine learning model to be trained can be trained on the expanded training set, and training is stopped when its loss function tends to converge, yielding the trained index prediction model.
After the index prediction model is trained, referring next to fig. 1, in step S110, attribute information of the target item and historical index data of the target item are acquired.
In this step, the target item is the item for which future transaction amount needs to be predicted.
The attribute information of the target item may be, for example, the item name of the target item, and may be set according to the actual situation, which is not particularly limited in the present disclosure.
The history index data includes a history price, a history access amount, and a history transaction amount of the target item in a plurality of history periods.
For example, taking 23 August 2022 as the prediction date, the historical prices, historical access amounts, and historical transaction amounts may be all those up to the day before the prediction date (i.e., 22 August 2022); this may also be set according to the actual situation, which is not particularly limited in this disclosure.
In step S120, a future price of the target item in at least one future period is predicted based on the attribute information and the historical price by the trained index prediction model, and a future access amount of the target item in the at least one future period is predicted based on the attribute information and the historical access amount by the index prediction model.
In this step, after the attribute information of the target item and the historical index data of the target item are obtained, they may be input into the trained index prediction model. The index prediction model may then predict the future price of the target item in at least one future period from the attribute information and the historical price, and predict the future access amount of the target item in at least one future period from the attribute information and the historical access amount.
After predicting the future price and the future access amount, step S130 may be entered to predict a future transaction amount of the target item in at least one future period based on the historical transaction amount, the future price, and the future access amount by an index prediction model.
In this step, the future transaction amount of the target item in at least one future period of time may be predicted by the index prediction model based on the historical transaction amount, the future price, and the future access amount. For example, a future transaction amount of the target item on a day in the future may be predicted, or a plurality of future transaction amounts of the target item on several days in the future may be predicted.
According to the method and device of the present disclosure, predicting the future transaction amount based on the historical transaction amount, the future price, and the future access amount avoids the problem in the related art that prediction accuracy is low because the future transaction amount is predicted from the historical transaction amount alone. By taking the influence of the price-volume relation on the transaction amount into account, the prediction accuracy of the future transaction amount is improved, and the prediction result is interpretable.
It should be noted that, after the future transaction amount of the target item in at least one future period is predicted, the inventory amount of the target item before the future period (for example, on the day before the future period) may further be acquired, and the stock quantity of the target item may then be determined from the future transaction amount and the inventory amount, so that the target item is stocked according to the stock quantity. For example, when the future transaction amount for the future period is 1000 pieces and the inventory amount of the target item on the day before the future period is 500 pieces, 500 target items may be purchased in advance to avoid a stockout or similar situation.
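The restocking rule implied by the 1000 − 500 = 500 example can be sketched as follows; the exact rule is an assumption, since the disclosure only gives this single worked example:

```python
def stock_quantity(future_transaction_amount: int, inventory_amount: int) -> int:
    """Purchase enough items to cover the part of the predicted future
    transaction amount that the current inventory cannot satisfy."""
    return max(future_transaction_amount - inventory_amount, 0)

print(stock_quantity(1000, 500))  # 500, matching the example in the text
```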
Referring to fig. 5, fig. 5 shows an overall block diagram of an item index prediction method in an embodiment of the present disclosure:
Firstly, the historical price can be encoded through an LSTM network, and the encoded historical price is input into an MLP network, so that the MLP network predicts the future price according to the encoded historical price;
secondly, the historical access quantity can be encoded through an LSTM network, and the encoded historical access quantity is input into an MLP network, so that the MLP network predicts the future access quantity according to the encoded historical access quantity;
and thirdly, the historical transaction amount, the future price, and the future access amount can be encoded through the TCN network, and the encoded vector is input into the MLP network, so that the MLP network predicts the transaction amount of the target item in the future period from the encoded vector, facilitating future stocking according to the transaction amount prediction result.
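The three-stage pipeline of fig. 5 can be sketched in PyTorch as below. All layer sizes, the prediction horizon, and the use of a single causal `Conv1d` as a stand-in for the full TCN are illustrative assumptions; the disclosure does not specify dimensions:

```python
import torch
import torch.nn as nn

class IndexPredictionSketch(nn.Module):
    """Sketch of the Fig. 5 pipeline: LSTM encoders for price and page
    views feed MLP decoders that predict future price and future access
    amount; a Conv1d (standing in for the TCN) fuses historical sales
    with the two predictions to yield the future transaction amount."""

    def __init__(self, hidden=32, horizon=7):
        super().__init__()
        self.price_lstm = nn.LSTM(1, hidden, batch_first=True)
        self.pv_lstm = nn.LSTM(1, hidden, batch_first=True)
        self.price_mlp = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                       nn.Linear(hidden, horizon))
        self.pv_mlp = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                    nn.Linear(hidden, horizon))
        self.tcn = nn.Conv1d(1, hidden, kernel_size=3, padding=2)
        self.sales_mlp = nn.Sequential(nn.Linear(hidden + 2 * horizon, hidden),
                                       nn.ReLU(), nn.Linear(hidden, horizon))

    def forward(self, price, pv, sales):
        # price / pv / sales: (batch, history_len, 1)
        _, (ph, _) = self.price_lstm(price)
        _, (vh, _) = self.pv_lstm(pv)
        future_price = self.price_mlp(ph[-1])              # (batch, horizon)
        future_pv = self.pv_mlp(vh[-1])                    # (batch, horizon)
        enc = self.tcn(sales.transpose(1, 2)).mean(dim=2)  # (batch, hidden)
        fused = torch.cat([enc, future_price, future_pv], dim=1)
        return future_price, future_pv, self.sales_mlp(fused)

model = IndexPredictionSketch()
p, v, s = (torch.randn(4, 60, 1) for _ in range(3))
fp, fv, fs = model(p, v, s)
print(fp.shape, fv.shape, fs.shape)
```

Note how the predicted future price and access amount are fed back in as inputs to the transaction-amount head, which is what lets the model exploit the price-volume relation end to end.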
Referring to fig. 6, fig. 6 illustrates an overall network structure diagram of item index prediction in an embodiment of the present disclosure, specifically,
step 1, a time period (time) can be input into word embedding module 1, and the word embedding vectors corresponding to the time period (comprising the word embedding vector time_emb1 corresponding to the history period and the word embedding vector time_emb2 corresponding to the future period) are obtained from its output; a category name (name.cate) is input into word embedding module 2, and the word embedding vector (cate_emb) corresponding to the category name is obtained from its output; then, the word embedding vector (time_emb1) corresponding to the history period, the word embedding vector (cate_emb) corresponding to the category name, the historical transaction amount (sales), the historical access amount (pv), and the historical price (price) are input into the TCN+MLP-1 module, and a first fusion vector (tcn_layer) is obtained from its output;
Step 2, the historical price can be input into an LSTM-1 module, and the coded historical price (price_lstm) is obtained according to the output of the LSTM-1 module;
step 3, the history pv can be input into an LSTM-2 module, and the encoded history pv (pv_lstm) is obtained according to the output of the LSTM-2 module;
step 4, the time_emb2 output in the step 1, the price_lstm output in the step 2 and the pv_lstm output in the step 3 can be input into an MLP-2 module, and a second fusion vector (MLP _layer) is obtained according to the output of the MLP-2 module;
step 5, the historical transaction amount can be input into an LSTM-3 network, and the coded historical transaction amount (sal_lstm) is obtained according to the output of the LSTM-3 network;
step 6, a first historical transaction amount before the prediction day (denoted sal_yoy1; assuming the transaction amount of 1 August 2022 is to be predicted, sal_yoy1 may be all the historical transaction amounts before 1 August 2021) can be input into the LSTM-4 module, and the encoded historical transaction amount sal_lstm_states is obtained from its output; further, sal_lstm_states and a second historical transaction amount before the prediction day (denoted sal_yoy2; under the same assumption, sal_yoy2 may be the historical transaction amount of 1 August 2021) are input into the LSTM-5 network, and the encoded transaction amount sequence sal_lstm_yoy is obtained from its output; further, the encoded transaction amount sequence may be input into the MLP-3 and MLP-4 modules, and the outputs of the MLP-3 and MLP-4 modules may be multiplied to obtain a third fusion vector;
step 7, the second fusion vector obtained in step 4, the encoded historical transaction amount obtained in step 5, and the third fusion vector obtained in step 6 can be input into the MLP-5 module, and the future transaction amount in the future period is obtained from the output of the MLP-5 module.
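The multiplicative fusion in step 6 (two parallel MLPs whose outputs are multiplied element-wise) can be sketched as follows; the dimensions and the choice of ReLU/Sigmoid activations are illustrative assumptions not stated in the disclosure:

```python
import torch
import torch.nn as nn

class MultiplicativeFusion(nn.Module):
    """Sketch of step 6: MLP-3 and MLP-4 both process the encoded
    year-over-year transaction sequence, and their outputs are
    multiplied element-wise to form the third fusion vector."""

    def __init__(self, in_dim=32, out_dim=16):
        super().__init__()
        self.mlp3 = nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU())
        self.mlp4 = nn.Sequential(nn.Linear(in_dim, out_dim), nn.Sigmoid())

    def forward(self, sal_lstm_yoy):
        # the element-wise product lets one branch act as a learned gate
        # on the other branch's year-over-year features
        return self.mlp3(sal_lstm_yoy) * self.mlp4(sal_lstm_yoy)

fusion = MultiplicativeFusion()
third_fusion_vector = fusion(torch.randn(4, 32))
print(third_fusion_vector.shape)
```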
According to the method, three network structures relating price, traffic, and transaction amount are constructed, and the network model can learn the price-volume relation in an encoding-decoding manner. The predicted price and traffic feed the item index prediction, achieving end-to-end item index prediction, so that a downstream system can stock goods according to the item index prediction results and items can reach consumers in a timely manner.
The disclosure further provides an article index prediction device, and fig. 7 shows a schematic structural diagram of the article index prediction device in an exemplary embodiment of the disclosure; as shown in fig. 7, the item index prediction device 700 may include an acquisition module 710, a first prediction module 720, and a second prediction module 730. Wherein:
an acquisition module 710, configured to obtain attribute information of a target item and historical index data of the target item; the historical index data comprises historical prices, historical access amounts, and historical transaction amounts of the target item in a plurality of historical periods;
A first prediction module 720 for predicting a future price of the target item in at least one future period based on the attribute information and the historical price by means of a trained index prediction model, and predicting a future access amount of the target item in at least one future period based on the attribute information and the historical access amount by means of the index prediction model;
a second prediction module 730 for predicting a future transaction amount of the target item over at least one future period based on the historical transaction amount, the future price, and a future access amount by the index prediction model.
In an exemplary embodiment of the present disclosure, after predicting a future transaction amount of the target item over at least one future period, the acquisition module 710 is configured to:
acquiring the inventory quantity of the target object; and determining the stock quantity of the target object according to the future transaction quantity and the stock quantity so as to stock the target object according to the stock quantity.
In an exemplary embodiment of the present disclosure, the first prediction module 720 is configured to:
acquiring a training set; the training set comprises attribute information of a plurality of articles and historical index data of the articles; the historical index data comprises historical prices, historical access amounts and historical transaction amounts of the articles in a plurality of historical time periods; inputting the training set into a machine learning model to be trained, and performing iterative training on the machine learning model to be trained by using a gradient descent method; stopping training when the loss function of the machine learning model to be trained tends to converge, and obtaining a trained index prediction model; the loss function is a mean square error loss function.
In an exemplary embodiment of the present disclosure, after acquiring the training set, the first prediction module 720 is configured to:
performing data expansion processing on the training set; inputting the training set subjected to the data expansion processing into the machine learning model to be trained, and performing iterative training on the machine learning model to be trained by using a gradient descent method; and stopping training when the loss function of the machine learning model to be trained tends to converge, and obtaining the trained index prediction model.
In an exemplary embodiment of the present disclosure, the first prediction module 720 is configured to:
word embedding representation is carried out on the attribute information of the plurality of articles, so that word embedding vectors corresponding to the attribute information are obtained; word embedding representation is carried out on the historical time periods, so that word embedding vectors corresponding to the historical time periods are obtained; data processing is carried out on the historical transaction amount to obtain a transaction amount processing sequence; and adding the word embedding vector corresponding to each attribute information, the word embedding vector corresponding to each history period and the transaction amount processing sequence into the training set to perform data expansion processing on the training set.
In an exemplary embodiment of the present disclosure, the attribute information of the plurality of items includes a category name; a first prediction module 720 configured to:
converting the category names of the plurality of articles into pinyin; sorting the plurality of category names according to the initial letters of the pinyin; and generating the word embedding vector corresponding to each attribute information according to the total number of the articles and the sorted order of the category names.
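The pinyin-based ordering can be sketched as below. The `to_pinyin` transliteration function is caller-supplied (in practice a package such as pypinyin could be used; a toy lookup table is assumed here), and assigning each category its rank in the sorted order is one plausible reading of "generating word embedding vectors according to the total number and the sorted order":

```python
def category_embedding_ids(category_names, to_pinyin):
    """Sort category names by their pinyin and assign each category an
    integer id equal to its rank, suitable for an embedding lookup."""
    ranked = sorted(category_names, key=to_pinyin)
    return {name: idx for idx, name in enumerate(ranked)}

# toy transliteration table standing in for a real pinyin converter
pinyin = {"苹果": "pingguo", "香蕉": "xiangjiao", "梨": "li"}.get
print(category_embedding_ids(["苹果", "香蕉", "梨"], pinyin))
```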
In an exemplary embodiment of the present disclosure, the first prediction module 720 is configured to:
acquiring period characteristics corresponding to the historical periods, wherein the period characteristics comprise at least any two of the following: the month, the day of the week, the ordinal day of the historical period within the year, and the ordinal day of the historical period within the month; and generating the word embedding vectors corresponding to the historical periods according to the period characteristics.
In an exemplary embodiment of the present disclosure, the first prediction module 720 is configured to:
calculating a plurality of transaction amount averages for each of the items over a plurality of different numbers of historical time periods; and generating the transaction amount processing sequence according to the transaction amount mean values.
The specific details of each module in the above article index prediction device are described in detail in the corresponding article index prediction method, so that the details are not repeated here.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Furthermore, although the steps of the methods in the present disclosure are depicted in a particular order in the drawings, this does not require or imply that the steps must be performed in that particular order or that all illustrated steps be performed in order to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform, etc.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, including several instructions to cause a computing device (may be a personal computer, a server, a mobile terminal, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
The present application also provides a computer-readable storage medium that may be included in the electronic device described in the above embodiments; or may exist alone without being incorporated into the electronic device.
The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable storage medium may transmit, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The computer-readable storage medium carries one or more programs which, when executed by one such electronic device, cause the electronic device to implement the methods described in the embodiments above.
In addition, an electronic device capable of realizing the method is provided in the embodiment of the disclosure.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms, namely: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "system."
An electronic device 800 according to such an embodiment of the present disclosure is described below with reference to fig. 8. The electronic device 800 shown in fig. 8 is merely an example and should not be construed to limit the functionality and scope of use of embodiments of the present disclosure in any way.
As shown in fig. 8, the electronic device 800 is embodied in the form of a general purpose computing device. Components of electronic device 800 may include, but are not limited to: the at least one processing unit 810, the at least one storage unit 820, a bus 830 connecting the different system components (including the storage unit 820 and the processing unit 810), and a display unit 840.
Wherein the storage unit stores program code that is executable by the processing unit 810 such that the processing unit 810 performs steps according to various exemplary embodiments of the present disclosure described in the above section of the present specification. For example, the processing unit 810 may perform the operations as shown in fig. 1: step S110, acquiring attribute information of a target object and historical index data of the target object; the historical index data comprises historical prices, historical access amounts and historical transaction amounts of the target object in a plurality of historical time periods; step S120, predicting future prices of the target object in at least one future period based on the attribute information and the historical prices through a trained index prediction model, and predicting future access amounts of the target object in at least one future period based on the attribute information and the historical access amounts through the index prediction model; step S130, predicting, by the index prediction model, a future transaction amount of the target item in at least one future period based on the historical transaction amount, the future price, and the future access amount.
The storage unit 820 may include readable media in the form of volatile storage units, such as Random Access Memory (RAM) 8201 and/or cache memory 8202, and may further include Read Only Memory (ROM) 8203.
Storage unit 820 may also include a program/utility 8204 having a set (at least one) of program modules 8205, such program modules 8205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 830 may be one or more of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 800 may also communicate with one or more external devices 900 (e.g., keyboard, pointing device, bluetooth device, etc.), one or more devices that enable a user to interact with the electronic device 800, and/or any device (e.g., router, modem, etc.) that enables the electronic device 800 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 850. Also, electronic device 800 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet, through network adapter 860. As shown, network adapter 860 communicates with other modules of electronic device 800 over bus 830. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with electronic device 800, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any adaptations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (11)

1. An article index prediction method, comprising:
acquiring attribute information of a target object and historical index data of the target object; the historical index data comprises historical prices, historical access amounts and historical transaction amounts of the target object in a plurality of historical time periods;
predicting a future price of the target item in at least one future period based on the attribute information and the historical price by a trained index prediction model, and predicting a future access amount of the target item in at least one future period based on the attribute information and the historical access amount by the index prediction model;
Predicting, by the indicator prediction model, a future transaction amount of the target item over at least one future period based on the historical transaction amount, the future price, and a future access amount.
2. The method of claim 1, wherein after predicting a future transaction amount for the target item over at least one future time period, the method further comprises:
acquiring the inventory quantity of the target object;
and determining the stock quantity of the target object according to the future transaction quantity and the stock quantity so as to stock the target object according to the stock quantity.
3. The method according to claim 1 or 2, wherein the index prediction model is trained by:
acquiring a training set; the training set comprises attribute information of a plurality of articles and historical index data of the articles; the historical index data comprises historical prices, historical access amounts and historical transaction amounts of the articles in a plurality of historical time periods;
inputting the training set into a machine learning model to be trained, and performing iterative training on the machine learning model to be trained by using a gradient descent method;
Stopping training when the loss function of the machine learning model to be trained tends to converge, and obtaining a trained index prediction model; the loss function is a mean square error loss function.
4. A method according to claim 3, wherein after the training set is acquired, the method further comprises:
performing data expansion processing on the training set;
inputting the training set subjected to the data expansion processing into the machine learning model to be trained, and performing iterative training on the machine learning model to be trained by using a gradient descent method;
and stopping training when the loss function of the machine learning model to be trained tends to converge, and obtaining the trained index prediction model.
5. The method of claim 4, wherein the performing data expansion processing on the training set comprises:
word embedding representation is carried out on the attribute information of the plurality of articles, so that word embedding vectors corresponding to the attribute information are obtained;
word embedding representation is carried out on the historical time periods, so that word embedding vectors corresponding to the historical time periods are obtained;
data processing is carried out on the historical transaction amount to obtain a transaction amount processing sequence;
And adding the word embedding vector corresponding to each attribute information, the word embedding vector corresponding to each history period and the transaction amount processing sequence into the training set to perform data expansion processing on the training set.
6. The method of claim 5, wherein the attribute information of the plurality of items includes a category name;
the word embedding representation is carried out on the attribute information of the plurality of articles to obtain word embedding vectors corresponding to the attribute information, and the word embedding vectors comprise:
converting the category names of the plurality of articles into pinyin;
sorting the plurality of category names according to the initial letters of the pinyin;
and generating the word embedding vector corresponding to each attribute information according to the total number of the articles and the sorted order of the category names.
7. The method of claim 5, wherein the word embedding the historical periods to obtain word embedding vectors corresponding to the historical periods comprises:
acquiring period characteristics corresponding to the historical periods, wherein the period characteristics comprise at least any two of the following: the month, the day of the week, the ordinal day of the historical period within the year, and the ordinal day of the historical period within the month;
And generating word embedding vectors corresponding to the historical time periods according to the time period characteristics.
8. The method of claim 5, wherein said data processing said historical transaction amount to obtain a transaction amount processing sequence, comprising:
calculating a plurality of transaction amount averages for each of the items over a plurality of different numbers of historical time periods;
and generating the transaction amount processing sequence according to the transaction amount mean values.
9. An article index prediction apparatus, comprising:
the acquisition module is used for acquiring attribute information of the target object and historical index data of the target object; the historical index data comprises historical prices, historical access amounts and historical transaction amounts of the target object in a plurality of historical time periods;
a first prediction module for predicting a future price of the target item in at least one future period based on the attribute information and the historical price by a trained index prediction model, and predicting a future access amount of the target item in at least one future period based on the attribute information and the historical access amount by the index prediction model;
A second prediction module for predicting, by the indicator prediction model, a future transaction amount of the target item over at least one future period based on the historical transaction amount, the future price, and a future access amount.
10. A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the article index prediction method of any one of claims 1 to 8.
11. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the item indicator prediction method of any one of claims 1 to 8 via execution of the executable instructions.
CN202211098175.1A 2022-09-08 2022-09-08 Article index prediction method and device, computer storage medium and electronic equipment Pending CN117688377A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211098175.1A CN117688377A (en) 2022-09-08 2022-09-08 Article index prediction method and device, computer storage medium and electronic equipment


Publications (1)

Publication Number Publication Date
CN117688377A true CN117688377A (en) 2024-03-12



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination