CN109284866A - Commodity order prediction method and device, storage medium, terminal - Google Patents
- Publication number
- CN109284866A (application CN201811038035.9A)
- Authority
- CN
- China
- Prior art keywords
- neural network
- current
- feature
- history
- network characteristics
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0201—Market modelling; Market analysis; Collecting market data
- G06Q30/0202—Market predictions or forecasting for commercial activities
Abstract
A commodity order prediction method and device, storage medium, and terminal. The method comprises: determining current wide neural network features and current deep neural network features; and inputting the current wide neural network features and the current deep neural network features into a wide-and-deep combined model to obtain a predicted order quantity for a commodity. The solution of the present invention makes full use of the feature information obtained from historical order big data and can improve the accuracy of commodity order prediction.
Description
Technical field
The present invention relates to the field of computer technology, and more particularly to a commodity order prediction method and device, a storage medium, and a terminal.
Background art
In modern large-scale logistics systems, accurately predicting commodity orders has strong economic and social significance. Accurate commodity order prediction not only guards against risks such as warehouse overflow and order delays, but also reduces resource waste in daily operations and improves operational efficiency.
As an important input to decision making, commodity order prediction has attracted a large body of related research. For example, the patent with application publication number CN107038492A discloses a method of order forecasting using an ARMA model. However, although an ARMA model can extract some time-series features from historical data, it remains a traditional macroscopic analysis method. Because the complexity of the model is very limited, the types of features it can characterize are relatively simple, so the prediction accuracy of this method is low and its range of application is greatly limited.
The prior art also includes commodity order prediction methods that introduce wavelet neural networks. However, because such a neural network is only a perceptron with a single hidden layer, it likewise suffers from insufficient feature-extraction capability, so the predictions it produces are of limited help in assessing actual commodity order operations.
There is therefore an urgent need for a commodity order prediction method that makes full use of the feature information obtained from historical order big data to improve the accuracy of commodity order prediction.
Summary of the invention
The technical problem to be solved by the present invention is to provide a commodity order prediction method and device, a storage medium, and a terminal that can make full use of the feature information obtained from historical order big data and improve the accuracy of commodity order prediction.
To solve the above technical problem, an embodiment of the present invention provides a commodity order prediction method comprising the following steps: determining current wide neural network features and current deep neural network features; and inputting the current wide neural network features and the current deep neural network features into a wide-and-deep combined model to obtain a predicted order quantity for a commodity; wherein the current wide neural network features and current deep neural network features include current attribute features of the commodity, current external features, historical order data, and joint features of the current attribute features and the current external features.
Optionally, before the wide neural network features and deep neural network features are input into the wide-and-deep combined model, the commodity order prediction method further includes: determining N groups of historical wide neural network features and N groups of historical deep neural network features; determining, by training, K candidate wide-and-deep combined models according to the N groups of historical wide neural network features and N groups of historical deep neural network features, K and N being positive integers; and determining the wide-and-deep combined model from the K candidate wide-and-deep combined models; wherein the historical wide neural network features and historical deep neural network features include historical attribute features of the commodity, historical external features, historical order data, and joint features of the historical attribute features and the historical external features.
Optionally, determining K candidate wide-and-deep combined models by training according to the N groups of historical wide neural network features and N groups of historical deep neural network features includes: setting K groups of hyperparameters, initial hidden-layer weights, and initial output-layer weights, wherein each group of hyperparameters includes the number of hidden layers of the deep neural network and the number of hidden units of the deep neural network; for the k-th group of hyperparameters, performing multiple rounds of iterative computation to successively determine the predicted order quantity of each round and the difference e between each round's predicted order quantity and the actual historical order quantity; and, when the difference e converges, taking the current wide-and-deep combined model as the k-th candidate wide-and-deep combined model corresponding to the k-th group of hyperparameters, where 1 ≤ k ≤ K.
Optionally, performing multiple rounds of iterative computation to successively determine the predicted order quantities includes: computing the n-th round deep network output value from the n-th group of historical deep neural network features and the n-th round hidden-layer weights using a deep neural network algorithm; and performing a weighted summation over the n-th round deep network output value, the n-th group of historical wide neural network features, and the n-th round output-layer weights to determine the n-th round predicted order quantity; wherein the (n+1)-th round hidden-layer weights and (n+1)-th round output-layer weights are determined from the difference e_n between the n-th round predicted order quantity and the actual historical order quantity, 1 ≤ n < N.
Optionally, the (n+1)-th round hidden-layer weights and (n+1)-th round output-layer weights are determined from the difference e_n between the n-th round predicted order quantity and the actual historical order quantity using the following formula:
w_(n+1) = α·δw_(n+1) + w_n;
where w_(n+1) denotes the (n+1)-th round hidden-layer weights and (n+1)-th round output-layer weights, w_n denotes the n-th round hidden-layer weights and n-th round output-layer weights, α denotes a preset learning rate, and δw_(n+1) denotes the update amount of the (n+1)-th round hidden-layer weights and the update amount of the (n+1)-th round output-layer weights.
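The update rule above has the familiar form of a gradient-style weight update. A minimal sketch in plain Python, assuming (this is not stated in the patent) that the update amount δw_(n+1) is taken as the negative gradient of the squared difference e_n for a single linear output:

```python
def update_weights(w, delta_w, alpha=0.01):
    """One round of w_(n+1) = alpha * delta_w + w_n, element-wise."""
    return [wi + alpha * di for wi, di in zip(w, delta_w)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Hypothetical single linear output: prediction = w . x; delta_w is
# taken as the negative squared-error gradient, -2 * e_n * x.
x, w, actual = [1.0, 2.0], [0.5, 0.5], 4.0
for _ in range(200):
    e = dot(w, x) - actual
    w = update_weights(w, [-2.0 * e * xi for xi in x], alpha=0.05)
print(round(dot(w, x), 3))  # converges toward 4.0
```

With this choice of δw, each round shrinks the difference e by a constant factor, which is the convergence behavior the training criterion above relies on.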
Optionally, performing a weighted summation over the n-th round deep network output value, the n-th group of historical wide neural network features, and the n-th round output-layer weights to determine the n-th round predicted order quantity includes: performing the weighted summation over the n-th round deep network output value, the n-th group of historical wide neural network features, and the n-th round output-layer weights to determine an n-th round intermediate predicted order value; and processing the n-th round intermediate predicted order value with an activation function to determine the n-th round predicted order quantity.
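The weighted summation and activation just described can be sketched as follows; the function and parameter names are illustrative, and ReLU is used only because the patent lists a ReLU fully connected network among the candidate deep algorithms:

```python
def relu(v):
    """Rectified linear activation."""
    return max(0.0, v)

def predict_orders(deep_output, wide_features, output_weights, bias=0.0):
    """Weighted summation over the deep network output value and the
    wide neural network features, then an activation function, giving
    one round's predicted order quantity."""
    inputs = [deep_output] + list(wide_features)
    intermediate = sum(w * v for w, v in zip(output_weights, inputs)) + bias
    return relu(intermediate)

# Hypothetical round: deep output 120.0 plus two Boolean wide features.
print(predict_orders(120.0, [1, 0], [1.0, 15.0, -5.0]))  # 135.0
```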
Optionally, the deep neural network algorithm is selected from: a ReLU fully connected network algorithm and a deep convolutional neural network algorithm.
Optionally, determining the wide-and-deep combined model from the K candidate wide-and-deep combined models comprises: determining M groups of historical wide neural network features and M groups of historical deep neural network features, the M groups of historical wide neural network features differing from the N groups of historical wide neural network features, the M groups of historical deep neural network features differing from the N groups of historical deep neural network features, M being a positive integer; verifying each of the K candidate wide-and-deep combined models against the M groups of historical wide neural network features and M groups of historical deep neural network features to determine, for each candidate, the M differences e between its predicted order quantities and the actual historical order quantities; and comparing the M differences e to determine the wide-and-deep combined model.
Optionally, comparing the M differences e to determine the wide-and-deep combined model includes: selecting the candidate wide-and-deep combined model whose M differences e have the smallest mean as the wide-and-deep combined model.
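The selection criterion of this optional step can be sketched as follows; the candidate interface is an assumption, with each candidate represented simply by its M validation predictions:

```python
def select_model(candidate_predictions, actual):
    """Pick the candidate whose mean absolute difference e between its
    M predicted order quantities and the M actual historical order
    quantities is smallest."""
    def mean_diff(name):
        preds = candidate_predictions[name]
        return sum(abs(p - a) for p, a in zip(preds, actual)) / len(actual)
    return min(candidate_predictions, key=mean_diff)

# Hypothetical: K = 2 candidates validated on M = 3 months.
actual = [100, 110, 120]
candidates = {"k1": [90, 115, 130], "k2": [102, 108, 121]}
print(select_model(candidates, actual))  # k2
```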
Optionally, the joint features of the historical attribute features and historical external features are obtained from the historical attribute features and historical external features by way of a cross product.
Optionally, the historical attribute features and historical external features of the commodity are Boolean features converted using one-hot encoding or binning combined with one-hot encoding.
Optionally, the commodity is an automobile, the historical attribute features are selected from one or more of: the automobile's category, model, color, price, displacement, and whether it is a new-energy vehicle; the historical external features are selected from one or more of: the number of holidays in a historical month, the historical monthly average crude oil price, the historical monthly consumer price index, and the historical monthly broad money growth rate; and the historical order data are the order quantities over a past preset duration.
Optionally, the joint features of the current attribute features and current external features are obtained from the current attribute features and current external features by way of a cross product.
Optionally, the current attribute features and current external features of the commodity are Boolean features converted using one-hot encoding or binning combined with one-hot encoding.
Optionally, the commodity is an automobile, the current attribute features are selected from one or more of: the automobile's category, model, color, price, displacement, and whether it is a new-energy vehicle; the current external features are selected from one or more of: the number of holidays in the current month, the current month's average crude oil price, the current month's consumer price index, and the current month's broad money growth index; and the historical order data are the order quantities over a past preset duration.
To solve the above technical problem, an embodiment of the present invention provides a commodity order prediction device comprising: a current feature determining module adapted to determine current wide neural network features and current deep neural network features; and an order quantity determining module adapted to input the current wide neural network features and current deep neural network features into a wide-and-deep combined model to obtain a predicted order quantity for a commodity; wherein the current wide neural network features and current deep neural network features include current attribute features of the commodity, current external features, historical order data, and joint features of the current attribute features and the current external features.
To solve the above technical problem, an embodiment of the present invention provides a storage medium having computer instructions stored thereon, wherein the computer instructions, when run, perform the steps of the above commodity order prediction method.
To solve the above technical problem, an embodiment of the present invention provides a terminal comprising a memory and a processor, the memory storing computer instructions runnable on the processor, wherein the processor, when running the computer instructions, performs the steps of the above commodity order prediction method.
Compared with the prior art, the technical solution of the embodiments of the present invention has the following beneficial effects:
In the embodiments of the present invention, current wide neural network features and current deep neural network features are determined, and the current wide neural network features and current deep neural network features are input into a wide-and-deep combined model to obtain a predicted order quantity for a commodity. Compared with prior-art schemes that predict commodity orders from a single feature, or that use a single-hidden-layer perceptron to predict commodity orders from a small number of features, the above scheme benefits from the deep neural network's strength in capturing the linear and nonlinear regularities contained in time series and the wide neural network's strength in handling correlations between different features, so it makes full use of the feature information obtained from historical order big data and helps improve the accuracy of commodity order prediction.
Further, in the embodiments of the present invention, by using historical wide neural network features and historical deep neural network features different from those used in training to verify the K candidate wide-and-deep combined models and determine the wide-and-deep combined model, the model with the highest accuracy on historical data can be selected from the multiple candidates, which helps to further improve the accuracy of commodity order prediction.
Brief description of the drawings
Fig. 1 is a flowchart of a commodity order prediction method in an embodiment of the present invention;
Fig. 2 is a partial flowchart of another commodity order prediction method in an embodiment of the present invention;
Fig. 3 is a structural schematic diagram of a wide-and-deep combined model in an embodiment of the present invention;
Fig. 4 is a flowchart of a specific embodiment of step S22 in Fig. 2;
Fig. 5 is a flowchart of a specific embodiment of step S42 in Fig. 4;
Fig. 6 is a flowchart of a specific embodiment of step S52 in Fig. 5;
Fig. 7 is a flowchart of a specific embodiment of step S23 in Fig. 2;
Fig. 8 is a structural schematic diagram of a commodity order prediction device in an embodiment of the present invention.
Detailed description
As noted above, existing commodity order prediction methods have low prediction accuracy, which greatly limits their range of application and makes them of little help to actual operations. A commodity order prediction method is therefore needed to improve the accuracy with which commodity orders are predicted.
In one existing commodity order prediction technique, the patent with application publication number CN107038492A discloses a method of order forecasting using an ARMA model. Although an ARMA model can extract some time-series features from historical data, it remains a traditional macroscopic analysis method. Because the complexity of the model is very limited, the types of features it can characterize are relatively simple, so the prediction accuracy of this method is low and its range of application is greatly limited.
In another existing commodity order prediction technique, the patent with application publication number CN103310286A introduces a wavelet neural network into the order forecasting problem. There, the input data of a hidden neuron is historical data that has been translated and scaled by a wavelet basis function mapping. Further, to improve the model's prediction performance, that patent also applies preprocessing operations such as normalization to the data. However, because the neural network is only a perceptron with a single hidden layer, it likewise suffers from insufficient feature-extraction capability, so the predictions it produces are of limited help in assessing actual commodity order operations.
The inventors have found through study that the prior art often predicts commodity orders from a single feature, or uses a single-hidden-layer perceptron to predict commodity orders from a small number of features, and fails to make full use of the feature information available in historical order big data, resulting in low prediction accuracy.
In the embodiments of the present invention, current wide neural network features and current deep neural network features are determined, and the current wide neural network features and current deep neural network features are input into a wide-and-deep combined model to obtain a predicted order quantity for a commodity. Compared with prior-art schemes that predict commodity orders from a single feature, or that use a single-hidden-layer perceptron to predict commodity orders from a small number of features, the above scheme benefits from the deep neural network's strength in capturing the linear and nonlinear regularities contained in time series and the wide neural network's strength in handling correlations between different features, so it makes full use of the feature information obtained from historical order big data and helps improve the accuracy of commodity order prediction.
To make the above objects, features, and beneficial effects of the present invention more comprehensible, specific embodiments of the invention are described in detail below with reference to the accompanying drawings.
Referring to Fig. 1, Fig. 1 is a flowchart of a commodity order prediction method in an embodiment of the present invention. The commodity order prediction method may include steps S11 and S12:
Step S11: determine current wide neural network features and current deep neural network features;
Step S12: input the current wide neural network features and the current deep neural network features into a wide-and-deep combined model to obtain a predicted order quantity for a commodity.
The current wide neural network features and current deep neural network features include current attribute features of the commodity, current external features, historical order data, and joint features of the current attribute features and the current external features.
In a specific implementation of step S11, the current wide neural network features denote the feature data used for prediction by the wide neural network, and may include, for example, the current attribute features of the commodity; further, the current wide neural network features may contain discrete data.
The current deep neural network features denote the feature data used for prediction by the deep neural network, and may include, for example, the current external features and historical order data of the commodity; further, the current deep neural network features may contain continuous data.
Specifically, the current wide neural network features and current deep neural network features may include the current attribute features, current external features, historical order data, and joint features of the current attribute features and current external features of the commodity.
In a specific implementation, during data preparation, the raw data needs to be converted into data usable for neural network model training, i.e., data pairs of the form {(X^(i), y^(i))}. Each data pair describes a feature vector and its logistics order information at one time node, where the superscript (i) indicates the index of the data pair, X denotes a feature vector, and y denotes the logistics order quantity corresponding to X.
The feature vector X may include wide neural network features and deep neural network features. For example, when the time node is the current time node to be predicted, the feature vector X may include current wide neural network features and current deep neural network features; when the time node is an earlier historical time node, the feature vector X may include historical wide neural network features and historical deep neural network features.
Further, each feature vector X can be refined into the following form:
X = [x_d1, x_d2, ..., x_dn, x_w1, x_w2, ..., x_wm];
where x_d1, x_d2, ..., x_dn may denote the historical order data before the current time node. The time interval of the historical data can be determined as needed from the time granularity of the prediction and the deep neural network model. For example, in a monthly logistics order volume forecasting problem, the historical order data are generally also monthly order volumes. The length n of the historical data can be chosen according to data availability; in general, the richer the historical information, the higher the accuracy of the final model.
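The construction of the {(X^(i), y^(i))} pairs described above can be sketched as follows for monthly data; the helper name and toy values are illustrative only:

```python
def make_dataset(monthly_orders, monthly_wide_features, n=12):
    """Build {(X, y)} pairs: X concatenates the n previous monthly
    order quantities (x_d1..x_dn, the deep part) with the wide
    features of the target month (x_w1..x_wm); y is the target
    month's order quantity."""
    pairs = []
    for i in range(n, len(monthly_orders)):
        X = monthly_orders[i - n:i] + monthly_wide_features[i]
        pairs.append((X, monthly_orders[i]))
    return pairs

# Hypothetical toy series: 14 months of orders, one wide feature per month.
orders = [100 + 5 * i for i in range(14)]
wide = [[1 if i % 12 == 0 else 0] for i in range(14)]
pairs = make_dataset(orders, wide, n=12)
print(len(pairs))  # 2 pairs, targeting months 12 and 13
```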
x_w1, x_w2, ..., x_wm may denote the features required for wide neural network training or prediction, and may specifically include, but are not limited to: commodity brand, commodity color, commodity type, electrification rate, vehicle price, month, season, weather conditions, raw material prices, consumer price index, various tax rates, and so on.
Further, taking the commodity to be predicted as an automobile as an example:
The current attribute features can be selected from one or more of: the automobile's category, model, color, price, displacement, and whether it is a new-energy vehicle.
The automobile's category and model indicate the vehicle's classification information and may include, for example, brand information and the series names under a brand. A customer can select a vehicle to meet application needs according to its category and model.
The automobile's color (e.g., exterior color, interior color) may indicate the vehicle's visual characteristics; a customer can select a vehicle to meet visual preferences according to its exterior and interior colors.
The automobile's price and displacement may indicate the vehicle's economic characteristics; a customer can select a vehicle to meet economic needs according to its price and displacement.
Whether the automobile is a new-energy vehicle may indicate the vehicle's energy-supply characteristics; a customer can select a vehicle according to whether it is a new-energy vehicle.
The current external features can be selected from one or more of: the number of holidays in the current month, the current month's average crude oil price, the current month's consumer price index, and the current month's broad money growth index. The number of holidays in the current month may indicate customers' potential purchase intent, while the current month's average crude oil price, consumer price index, and broad money growth index may indicate customers' potential purchasing power.
The historical order data can be the order quantities over a past preset duration, for example, the order data for the past 12 months. Further, if fewer than 12 months precede the current month, the order quantity of the earliest known month can be used for padding.
Further, the current attribute features and current external features of the commodity can be Boolean features converted using one-hot encoding or binning combined with one-hot encoding.
Specifically, because the wide neural network features include a large number of discrete features, in order to exploit the advantages of the wide neural network, one-hot encoding (One Hot Encoding) or binning combined with one-hot encoding can be used to convert the discrete features into multi-class Boolean features, for example using 1 to indicate that the prediction object belongs to a class and 0 to indicate that it does not, or using 0 to indicate that it belongs and 1 to indicate that it does not.
Taking the commodity being an automobile as an example, color can be converted into "is it red" (is_red), "is it black" (is_black), and so on; model can be converted into "is it an MPV" (is_mpv), "is it an SUV" (is_suv), and so on.
In the embodiments of the present invention, converting the current attribute features and current external features of the commodity into Boolean features by one-hot encoding or binning combined with one-hot encoding helps reduce the complexity of analyzing the feature data, thereby improving the accuracy of prediction.
Further, the joint features of the current attribute features and current external features can be obtained from the current attribute features and current external features by way of a cross product. Specifically, after conversion by one-hot encoding or binning combined with one-hot encoding, these individual Boolean features can be crossed to obtain joint features by way of a cross product (Cross Product).
Taking the commodity being an automobile as an example, a color feature and a model feature can be combined by a cross product, for example combining the two features "is it red" and "is it an SUV" to obtain the new feature "is it a red SUV" (is_suv&red), i.e., a new joint feature.
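The cross product of Boolean features can be sketched as follows (the combiner and its naming convention are assumptions):

```python
def cross(features_a, features_b):
    """Cross two Boolean feature dicts by product: each joint feature
    fires (equals 1) only when both of its component features fire."""
    return {
        f"{ka}&{kb}": va * vb
        for ka, va in features_a.items()
        for kb, vb in features_b.items()
    }

# Hypothetical: "is it a red SUV" fires only for red SUVs.
joint = cross({"is_suv": 1, "is_mpv": 0}, {"is_red": 1, "is_black": 0})
print(joint["is_suv&is_red"])  # 1
```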
In the embodiments of the present invention, forming the joint features of the current attribute features and current external features by way of a cross product helps improve the richness and diversity of the features, so that the commodity to be predicted is better described and the accuracy of prediction is improved.
In another specific embodiment of the present invention, before the predicted order quantity of the commodity is obtained in step S12, the method may also include a step of determining the wide-and-deep combined model by training.
Referring to Fig. 2, Fig. 2 is a partial flowchart of another commodity order prediction method in an embodiment of the present invention. This commodity order prediction method may include steps S11 and S12, and may also include steps S21 to S23, each of which is described below.
In step S21, N groups of historical wide neural network features and N groups of historical deep neural network features are determined.
The historical wide neural network features denote the feature data used for wide neural network model training, and may include, for example, the historical attribute features of the commodity; further, the historical wide neural network features may contain discrete data. The historical deep neural network features denote the feature data used for deep neural network model training, and may include, for example, the historical external features and historical order data of the commodity; further, the historical deep neural network features may contain continuous data.
Specifically, the history width neural network characteristics and historical depth neural network characteristics may include the quotient
Historical status feature, historical external feature, History Order data and the historical status feature and historical external feature of product
Union feature.
Further, the historical status feature Yu historical external feature of the commodity, which can be, utilizes one-hot coding or branch mailbox
The boolean's feature converted in conjunction with one-hot coding.
Further, the union feature of the historical status feature and historical external feature can be according to the history
Attributive character, historical external feature, obtain via the mode of apposition.
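The encoding described above can be illustrated with the following Python sketch. The feature names and values are illustrative only, not taken from the patent: it shows one-hot encoding, binning combined with one-hot encoding, and a union (cross) feature formed as a flattened outer product of two encoded features.

```python
def one_hot(value, vocabulary):
    """Encode a categorical value as a Boolean one-hot vector."""
    return [1 if value == v else 0 for v in vocabulary]

def bin_then_one_hot(x, bin_edges):
    """Bin a continuous value, then one-hot encode the bin index."""
    idx = sum(1 for edge in bin_edges if x >= edge)
    return [1 if i == idx else 0 for i in range(len(bin_edges) + 1)]

def cross(a, b):
    """Union feature of two encoded features: outer product, flattened."""
    return [ai * bj for ai in a for bj in b]

color = one_hot("red", ["red", "blue", "white"])        # attribute feature
price = bin_then_one_hot(180_000, [100_000, 200_000])   # binned continuous feature
union = cross(color, price)                             # 3 x 3 = 9-dimensional cross
```

Only the position of the single 1 in each input vector survives into the cross, which is what makes such Boolean union features cheap for the wide part to weight.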
Specifically, the following illustration takes an automobile as the commodity for which the wide & deep combined model is trained.
The historical attribute features may be selected from one or more of the following: the category, model, color, price, and engine displacement of the automobile, and whether it is a new-energy vehicle.
The historical external features may be selected from one or more of the following: the historical number of holidays per month, historical monthly average crude oil price, historical monthly consumer price index, and historical monthly broad money growth rate.
The historical order data may be the number of orders within a past preset period.
In a specific implementation, for more details of step S21, please refer to the description of step S11 in Fig. 1, which is not repeated here.
In step S22, K candidate wide & deep combined models can be determined through training according to the N groups of historical wide-neural-network features and N groups of historical deep-neural-network features, where K and N are positive integers.
Referring to Fig. 3, Fig. 3 is a structural diagram of a wide & deep combined model in an embodiment of the present invention.
The wide & deep combined model may include a wide-neural-network part I and a deep-neural-network part II, where the wide-neural-network part I may include an input layer A and an output layer C, and the deep-neural-network part II may include the input layer A, hidden layers B, and the output layer C.
Further, the wide-neural-network features and the deep-neural-network features 31 can be input separately at the input layer A; hidden-neuron output values 32 can be obtained in the hidden layers B; and a top-layer neuron output value 33 can be obtained at the output layer C.
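A minimal forward pass for the structure in Fig. 3 can be sketched as follows. This is an illustrative NumPy implementation under my own assumptions, not code prescribed by the patent: the deep part II passes its features through ReLU hidden layers B, the wide part I feeds its features directly to the output layer C, and the top-layer neuron combines both by a weighted sum.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def wide_deep_forward(wide_x, deep_x, hidden_weights, out_w_deep, out_w_wide):
    h = deep_x
    for W in hidden_weights:         # hidden layers B of the deep part II
        h = relu(W @ h)              # h ends as the top hidden-layer output
    # output layer C: weighted summation over the deep top-layer output
    # and the wide features, then the top-layer neuron's activation
    z = out_w_deep @ h + out_w_wide @ wide_x
    return relu(z)
```

For example, with one identity hidden layer, deep features [1, 1], a wide feature [1], and output weights [1, 1] and [2], the forward pass yields relu(1 + 1 + 2) = 4.0.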
In a specific implementation, building the wide & deep combined neural network is mainly divided into two parts: deep-neural-network modeling and wide-neural-network modeling. In the deep-neural-network modeling part, hyperparameters need to be preset.
Referring to Fig. 4, Fig. 4 is a flowchart of a specific embodiment of step S22 in Fig. 2. The step of determining K candidate wide & deep combined models through training according to the N groups of historical wide-neural-network features and N groups of historical deep-neural-network features may include steps S41 to S43, described below.
In step S41, K groups of hyperparameters, initial hidden-layer weight values, and initial output-layer weight values are set, where each group of hyperparameters includes the number of hidden layers of the deep neural network and the number of hidden units of the deep neural network.
Specifically, in the deep-neural-network modeling part, one or more groups of hyperparameters need to be preset, such as the number of hidden layers of the deep neural network and the number of hidden units of the deep neural network; the type of activation function of the neurons may also be determined in advance.
More specifically, the number of hidden layers of the deep neural network indicates how many hidden layers B (shown in Fig. 3) are included, and may be selected, for example, from 2 to 5 layers. The number of hidden units indicates how many neurons each of the hidden layers B shown in Fig. 3 contains; different hidden layers may contain the same or different numbers of hidden units, and the number may be selected, for example, from 8 to 12.
A neuron is the basic unit of a neural network. Each neuron has multiple inputs x_i and one output y, where f is the activation function applied to the weighted inputs, which can be expressed by the following formula:
y = f(Σ w_i·x_i);
where w_i denotes the weight value of each input of a layer, also known as a model parameter.
In a specific application of the embodiment of the present invention, the ReLU (rectified linear unit) function can be used as the activation function, which can be expressed by the following formula:
f(x) = max(0, x).
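The neuron and activation formulas above can be illustrated with a toy example (not the patent's implementation):

```python
def neuron(weights, inputs):
    """Single neuron: y = f(sum_i w_i * x_i) with ReLU f(x) = max(0, x)."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return max(0.0, s)
```

With weights [0.5, -1.0] and inputs [2.0, 1.0], the weighted sum is 0.0 and ReLU leaves it at 0.0; a positive weighted sum passes through unchanged.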
In step S42, for the k-th group of hyperparameters, multiple rounds of iterative computation are carried out in sequence, to determine multiple rounds of predicted order numbers and the difference e between each round's predicted order number and the actual historical order number.
Referring to Fig. 5, Fig. 5 is a flowchart of a specific embodiment of step S42 in Fig. 4. The step of carrying out multiple rounds of iterative computation in sequence to determine multiple rounds of predicted order numbers may include steps S51 and S52, described below.
In step S51, according to the n-th group of historical deep-neural-network features and the n-th-round hidden-layer weight values, a deep neural network algorithm is used to output the n-th-round deep-network output value.
Specifically, the n-th-round deep-network output value may be the output value of the top hidden layer among the hidden layers B of the deep-neural-network part II (see Fig. 3).
The deep neural network algorithm may be selected from: a ReLU fully connected network algorithm and a deep convolutional neural network algorithm.
In step S52, a weighted summation is performed according to the n-th-round deep-network output value, the n-th group of historical wide-neural-network features, and the n-th-round output-layer weight values, to determine the n-th-round predicted order number, where the (n+1)-th-round hidden-layer weight values and the (n+1)-th-round output-layer weight values are determined according to the difference e_n between the n-th-round predicted order number and the actual historical order number, 1 ≤ n < N. It is to be understood that, when n ≥ 2, the n-th-round hidden-layer weight values and the n-th-round output-layer weight values are determined according to the difference e_(n-1) between the (n-1)-th-round predicted order number and the actual historical order number.
It should be pointed out that, since the wide-neural-network part I (see Fig. 3) generally involves discrete features, the n-th-round output-layer weight values corresponding to the wide-neural-network part I may be the weight values corresponding to the wide-neural-network features 31.
Referring to Fig. 6, Fig. 6 is a flowchart of a specific embodiment of step S52 in Fig. 5. The step of performing a weighted summation according to the n-th-round deep-network output value, the n-th group of historical wide-neural-network features, and the n-th-round output-layer weight values to determine the n-th-round predicted order number may include steps S61 and S62, described below.
In step S61, a weighted summation is performed according to the n-th-round deep-network output value, the n-th group of historical wide-neural-network features, and the n-th-round output-layer weight values, to determine an n-th-round intermediate predicted order value.
Specifically, the n-th-round intermediate predicted order value may be the top-layer neuron output value (see Fig. 3).
In step S62, the n-th-round intermediate predicted order value is processed using an activation function, to determine the n-th-round predicted order number.
Specifically, the ReLU (rectified linear unit) function can be used as the activation function, which can be expressed by the following formula:
f(x) = max(0, x).
Continuing to refer to Fig. 5, in step S52 the following formula can be used to determine the (n+1)-th-round hidden-layer weight values and the (n+1)-th-round output-layer weight values according to the difference e_n between the n-th-round predicted order number and the actual historical order number:
w_(n+1) = α·δw_(n+1) + w_n;
where w_(n+1) denotes the (n+1)-th-round hidden-layer weight values and (n+1)-th-round output-layer weight values, w_n denotes the n-th-round hidden-layer weight values and n-th-round output-layer weight values, α denotes a preset learning rate, and δw_(n+1) denotes the update amount of the (n+1)-th-round hidden-layer weight values and the update amount of the (n+1)-th-round output-layer weight values.
As can be seen from the above formula, the neural network can be trained with the back-propagation algorithm, in which the model parameters are updated layer by layer according to the error and the chain rule.
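The update rule can be illustrated for a single linear weight. This sketch is an assumption-laden toy, not the patent's training code: the update amount δw is taken to be the negative gradient of a squared error with respect to the weight, obtained from the error e via the chain rule, and all numeric values are illustrative.

```python
def update_weight(w, x, y_actual, alpha):
    """One round of w_{n+1} = alpha * dw_{n+1} + w_n for a linear model y = w*x."""
    y_pred = w * x
    e = y_pred - y_actual       # difference between predicted and actual orders
    dw = -e * x                 # negative gradient of 0.5 * e**2 w.r.t. w
    return alpha * dw + w       # the update formula above

w = 0.0
for _ in range(100):            # successive rounds until the difference e converges
    w = update_weight(w, x=2.0, y_actual=4.0, alpha=0.1)
```

With x = 2 and target 4, the weight converges to 2.0, at which point e = 0 and the weight no longer changes, matching the convergence criterion of step S43.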
Continuing to refer to Fig. 4, in step S42, through multiple rounds of computation based on the n-th-round predicted order number determined in step S52, multiple rounds of predicted order numbers can be determined, and then, from these, the difference e between each round's predicted order number and the actual historical order number can be determined.
In step S43, when the difference e converges, the current wide & deep combined model is determined to be the k-th candidate wide & deep combined model corresponding to the k-th group of hyperparameters, where 1 ≤ k ≤ K.
Specifically, after the k-th group of hyperparameters is chosen, when the error no longer decreases after successive iterations, that is, when the difference e converges, the model can be regarded as converged, thereby determining the k-th candidate wide & deep combined model corresponding to the k-th group of hyperparameters.
Continuing to refer to Fig. 2, in the specific implementation of step S22, by repeating for each of the K groups of hyperparameters the training that determines the k-th candidate model in step S43, K candidate wide & deep combined models can be determined.
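The loop over the K hyperparameter groups can be sketched as follows. The hyperparameter ranges follow the description above (2 to 5 hidden layers, 8 to 12 hidden units); the training function is only a placeholder that demonstrates the stopping rule of step S43 (stop when the error no longer decreases), not the actual network training.

```python
hyperparams = [
    {"hidden_layers": d, "hidden_units": u}
    for d in range(2, 6)        # 2 to 5 hidden layers
    for u in range(8, 13)       # 8 to 12 hidden units per layer
]

def train_until_converged(hp, tol=1e-4):
    """Placeholder training loop: the error shrinks each round until the
    round-to-round improvement falls below tol (difference e converges)."""
    err, prev = 1.0, float("inf")
    while prev - err > tol:
        prev, err = err, err / 2   # stand-in for one training round
    return {"hyperparams": hp, "error": err}

# One candidate wide & deep combined model per hyperparameter group (K = 20 here)
candidates = [train_until_converged(hp) for hp in hyperparams]
```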
In step S23, the wide & deep combined model is determined according to the K candidate wide & deep combined models.
Referring to Fig. 7, Fig. 7 is a flowchart of a specific embodiment of step S23 in Fig. 2. The step of determining the wide & deep combined model according to the K candidate wide & deep combined models may include steps S71 to S73, described below.
In step S71, M groups of historical wide-neural-network features and M groups of historical deep-neural-network features are determined, where the M groups of historical wide-neural-network features are different from the N groups of historical wide-neural-network features, the M groups of historical deep-neural-network features are different from the N groups of historical deep-neural-network features, and M is a positive integer.
Specifically, in an embodiment of the present invention, after the historical wide-neural-network features and historical deep-neural-network features are determined, they can be divided into a training set used to train the wide & deep combined model and a validation set used to validate it.
More specifically, the validation set may include the M groups of historical wide-neural-network features and M groups of historical deep-neural-network features, used to validate the wide & deep combined model; the training set may include the N groups of historical wide-neural-network features and N groups of historical deep-neural-network features, used to train it.
In an embodiment of the present invention, by making the M groups of historical wide-neural-network features different from the N groups of historical wide-neural-network features, and the M groups of historical deep-neural-network features different from the N groups of historical deep-neural-network features, the influence of the training data on the validation of the wide & deep combined model can be avoided, which helps to select, through validation, the wide & deep combined model with higher prediction accuracy among the K candidates.
In step S72, the K candidate wide & deep combined models are each validated according to the M groups of historical wide-neural-network features and M groups of historical deep-neural-network features, to determine the M differences e between the predicted order numbers of the K candidate wide & deep combined models and the actual historical order numbers.
In step S73, the wide & deep combined model is determined by comparing the M differences e.
Further, the candidate wide & deep combined model with the smallest mean of its M differences e is selected as the wide & deep combined model.
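Steps S71 to S73 can be sketched as follows, with toy callables standing in for the K candidate models and a held-out validation set of M examples; the selection criterion is the smallest mean of the M differences e, as stated above.

```python
def mean_abs_diff(model, validation_set):
    """Mean of the M differences e between predicted and actual order numbers."""
    return sum(abs(model(x) - y) for x, y in validation_set) / len(validation_set)

def select_wide_deep_model(candidates, validation_set):
    """Keep the candidate whose M differences e have the smallest mean."""
    return min(candidates, key=lambda m: mean_abs_diff(m, validation_set))

validation_set = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]   # (feature, actual orders)
candidates = [lambda x: 1.5 * x, lambda x: 2.0 * x]     # K = 2 toy "models"
best = select_wide_deep_model(candidates, validation_set)
```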
In another specific embodiment of the present invention, a further portion of historical wide-neural-network features and historical deep-neural-network features can also be reserved to form a test set, used to test the wide & deep combined model after it has been obtained by validating the K candidate wide & deep combined models.
In a preferred embodiment of the present invention, the ratio of the feature data contained in the training set, the validation set, and the test set may be 7:2:1, where the feature data may be the historical wide-neural-network features and historical deep-neural-network features.
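The 7:2:1 split can be sketched as follows, assuming the feature groups have already been shuffled (the patent does not specify the splitting procedure):

```python
def split_7_2_1(samples):
    """Split samples into training, validation, and test sets in a 7:2:1 ratio."""
    n = len(samples)
    train_end = (7 * n) // 10
    val_end = (9 * n) // 10
    return samples[:train_end], samples[train_end:val_end], samples[val_end:]

train_set, val_set, test_set = split_7_2_1(list(range(10)))
```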
It should be pointed out that, in an embodiment of the present invention, the wide & deep combined model can also be updated, regularly or irregularly, according to newly obtained order quantities, so as to improve the accuracy and timeliness of the wide & deep combined model.
In an embodiment of the present invention, by selecting the candidate wide & deep combined model with the smallest mean of the M differences e, the determined wide & deep combined model has the smallest mean error when validated with the validation set, which helps to improve prediction accuracy.
In an embodiment of the present invention, by validating the K candidate wide & deep combined models with historical wide-neural-network features and historical deep-neural-network features different from those used for training, and thereby determining the wide & deep combined model, the more accurate model can be selected from the multiple candidate wide & deep combined models according to historical data, which helps to further improve the accuracy of commodity order prediction.
Continuing to refer to Fig. 1, in the specific implementation of step S12, the current wide-neural-network features and current deep-neural-network features are input into the wide & deep combined model determined in step S23, to obtain the predicted order number of the commodity.
In an embodiment of the present invention, the commodity order number is predicted with a wide & deep combined model according to current wide-neural-network features and current deep-neural-network features. Compared with the prior art, which predicts commodity orders with a single feature, or uses a single-hidden-layer perceptron to predict commodity orders from a small number of features, the scheme of the embodiment of the present invention makes fuller use of the feature information obtained from historical-order big data, because the deep neural network is good at capturing the linear and nonlinear regularities contained in the time series, and the wide neural network is good at handling the correlations between different features; this can therefore help improve the accuracy of commodity order prediction.
Specifically, when modeling, the historical order sequence information and the objective feature information can be considered simultaneously, and a large number of features of different types can be placed into one neural network for training. The deep neural network therein handles the historical order time-series data, and the wide neural network handles the other feature data. The deep neural network can capture well the linear and nonlinear regularities contained in the time series, and the wide neural network can handle well the correlations between different features.
Therefore, compared with the prior art, in an embodiment of the present invention the feature information used is more complete, and the model's ability to extract and process features is stronger, which helps to improve the accuracy and credibility of the prediction results.
Referring to Fig. 8, Fig. 8 is a structural diagram of a commodity order prediction device in an embodiment of the present invention. The commodity order prediction device may include:
a current-feature determining module 81, adapted to determine current wide-neural-network features and current deep-neural-network features, where the current wide-neural-network features and current deep-neural-network features include current attribute features, current external features, and historical order data of the commodity, as well as union features of the current attribute features and current external features;
an order-number determining module 82, adapted to input the current wide-neural-network features and current deep-neural-network features into a wide & deep combined model, to obtain the predicted order number of the commodity;
a history-feature determining module 83, adapted to determine, before the wide-neural-network features and deep-neural-network features are input into the wide & deep combined model, N groups of historical wide-neural-network features and N groups of historical deep-neural-network features, where the historical wide-neural-network features and historical deep-neural-network features include historical attribute features, historical external features, and historical order data of the commodity, as well as union features of the historical attribute features and historical external features;
a candidate-model determining module 84, adapted to determine K candidate wide & deep combined models through training according to the N groups of historical wide-neural-network features and N groups of historical deep-neural-network features, K being a positive integer and N being a positive integer;
a wide & deep model determining module 85, adapted to determine the wide & deep combined model according to the K candidate wide & deep combined models.
Further, the candidate-model determining module may include:
a parameter setting submodule (not shown), adapted to set K groups of hyperparameters, initial hidden-layer weight values, and initial output-layer weight values, where each group of hyperparameters includes the number of hidden layers of the deep neural network and the number of hidden units of the deep neural network;
a difference determining submodule (not shown), adapted to, for the k-th group of hyperparameters, carry out multiple rounds of iterative computation in sequence, to determine multiple rounds of predicted order numbers and the difference e between each round's predicted order number and the actual historical order number;
a candidate-model determining submodule (not shown), adapted to determine, when the difference e converges, that the current wide & deep combined model is the k-th candidate wide & deep combined model corresponding to the k-th group of hyperparameters, where 1 ≤ k ≤ K.
Further, the difference determining submodule may include:
an n-th-round output value outputting unit, adapted to output the n-th-round deep-network output value using a deep neural network algorithm, according to the n-th group of historical deep-neural-network features and the n-th-round hidden-layer weight values;
an n-th-round order number predicting unit, adapted to perform a weighted summation according to the n-th-round deep-network output value, the n-th group of historical wide-neural-network features, and the n-th-round output-layer weight values, to determine the n-th-round predicted order number, where the (n+1)-th-round hidden-layer weight values and the (n+1)-th-round output-layer weight values are determined according to the difference e_n between the n-th-round predicted order number and the actual historical order number, 1 ≤ n < N.
Further, the following formula can be used to determine the (n+1)-th-round hidden-layer weight values and the (n+1)-th-round output-layer weight values according to the difference e_n between the n-th-round predicted order number and the actual historical order number:
w_(n+1) = α·δw_(n+1) + w_n;
where w_(n+1) denotes the (n+1)-th-round hidden-layer weight values and (n+1)-th-round output-layer weight values, w_n denotes the n-th-round hidden-layer weight values and n-th-round output-layer weight values, α denotes a preset learning rate, and δw_(n+1) denotes the update amount of the (n+1)-th-round hidden-layer weight values and the update amount of the (n+1)-th-round output-layer weight values.
Further, the n-th-round order number predicting unit may include:
an intermediate value determining subunit, adapted to perform a weighted summation according to the n-th-round deep-network output value, the n-th group of historical wide-neural-network features, and the n-th-round output-layer weight values, to determine the n-th-round intermediate predicted order value;
an order number predicting subunit, adapted to process the n-th-round intermediate predicted order value using an activation function, to determine the n-th-round predicted order number.
Further, the deep neural network algorithm may be selected from: a ReLU fully connected network algorithm and a deep convolutional neural network algorithm.
Further, the wide & deep model determining module may include:
a history-feature determining submodule, adapted to determine M groups of historical wide-neural-network features and M groups of historical deep-neural-network features, the M groups of historical wide-neural-network features being different from the N groups of historical wide-neural-network features, the M groups of historical deep-neural-network features being different from the N groups of historical deep-neural-network features, and M being a positive integer;
a candidate-model difference determining submodule, adapted to validate each of the K candidate wide & deep combined models according to the M groups of historical wide-neural-network features and M groups of historical deep-neural-network features, to determine M differences e between the predicted order numbers of the K candidate wide & deep combined models and the actual historical order numbers;
a wide & deep model determining submodule, adapted to compare the M differences e and determine the wide & deep combined model.
Further, the wide & deep model determining submodule may include: a wide & deep model determining unit, adapted to select the candidate wide & deep combined model with the smallest mean of the M differences e as the wide & deep combined model.
Further, the union feature of a historical attribute feature and a historical external feature may be obtained from the historical attribute feature and the historical external feature by taking their outer product.
The historical attribute features and historical external features of the commodity may be Boolean features converted by one-hot encoding, or by binning combined with one-hot encoding.
When the commodity is an automobile, the historical attribute features may be selected from one or more of the following: the category, model, color, price, and engine displacement of the automobile, and whether it is a new-energy vehicle; the historical external features are selected from one or more of the following: the historical number of holidays per month, historical monthly average crude oil price, historical monthly consumer price index, and historical monthly broad money growth rate; the historical order data are the order numbers within a past preset period.
Further, the union feature of a current attribute feature and a current external feature may be obtained from the current attribute feature and the current external feature by taking their outer product.
The current attribute features and current external features of the commodity may be Boolean features converted by one-hot encoding, or by binning combined with one-hot encoding.
When the commodity is an automobile, the current attribute features may be selected from one or more of the following: the category, model, color, price, and engine displacement of the automobile, and whether it is a new-energy vehicle; the current external features are selected from one or more of the following: the number of holidays in the current month, the average crude oil price of the current month, the consumer price index of the current month, and the broad money growth rate of the current month; the historical order data are the order numbers within a past preset period.
For the principles, specific implementation, and beneficial effects of the commodity order prediction device, please refer to the related descriptions of the commodity order prediction method shown in Fig. 1 to Fig. 7 above, which are not repeated here.
An embodiment of the present invention further provides a storage medium on which computer instructions are stored; when the computer instructions are run, the steps of the commodity order prediction method shown in Fig. 1 to Fig. 7 are executed. The storage medium may be a computer-readable storage medium, and may include, for example, non-volatile or non-transitory memory, and may also include an optical disc, a mechanical hard disk, a solid-state drive, and the like.
An embodiment of the present invention further provides a terminal, including a memory and a processor, where computer instructions capable of running on the processor are stored on the memory, and the processor, when running the computer instructions, executes the steps of the commodity order prediction method shown in Fig. 1 to Fig. 7. The terminal includes, but is not limited to, terminal devices such as mobile phones, computers, and tablet computers.
Although the present disclosure is as above, the present invention is not limited thereto. Any person skilled in the art can make various changes and modifications without departing from the spirit and scope of the present invention; therefore, the protection scope of the present invention shall be subject to the scope defined by the claims.
Claims (18)
1. A commodity order prediction method, characterized by comprising the following steps:
determining current wide-neural-network features and current deep-neural-network features;
inputting the current wide-neural-network features and the current deep-neural-network features into a wide & deep combined model, to obtain a predicted order number of a commodity;
wherein the current wide-neural-network features and current deep-neural-network features include current attribute features, current external features, and historical order data of the commodity, as well as union features of the current attribute features and current external features.
2. The commodity order prediction method according to claim 1, characterized in that, before the wide-neural-network features and deep-neural-network features are input into the wide & deep combined model, the method further comprises:
determining N groups of historical wide-neural-network features and N groups of historical deep-neural-network features;
determining K candidate wide & deep combined models through training according to the N groups of historical wide-neural-network features and N groups of historical deep-neural-network features, K being a positive integer and N being a positive integer;
determining the wide & deep combined model according to the K candidate wide & deep combined models;
wherein the historical wide-neural-network features and historical deep-neural-network features include historical attribute features, historical external features, and historical order data of the commodity, as well as union features of the historical attribute features and historical external features.
3. The commodity order prediction method according to claim 2, characterized in that determining K candidate wide & deep combined models through training according to the N groups of historical wide-neural-network features and N groups of historical deep-neural-network features comprises:
setting K groups of hyperparameters, initial hidden-layer weight values, and initial output-layer weight values, wherein each group of hyperparameters includes the number of hidden layers of a deep neural network and the number of hidden units of the deep neural network;
for the k-th group of hyperparameters, carrying out multiple rounds of iterative computation in sequence, to determine multiple rounds of predicted order numbers and a difference e between each round's predicted order number and an actual historical order number;
when the difference e converges, determining that the current wide & deep combined model is the k-th candidate wide & deep combined model corresponding to the k-th group of hyperparameters, wherein 1 ≤ k ≤ K.
4. The commodity order prediction method according to claim 3, characterized in that carrying out multiple rounds of iterative computation in sequence to determine multiple rounds of predicted order numbers comprises:
according to the n-th group of historical deep-neural-network features and the n-th-round hidden-layer weight values, outputting the n-th-round deep-network output value using a deep neural network algorithm;
according to the n-th-round deep-network output value, the n-th group of historical wide-neural-network features, and the n-th-round output-layer weight values, performing a weighted summation to determine the n-th-round predicted order number;
wherein the (n+1)-th-round hidden-layer weight values and the (n+1)-th-round output-layer weight values are determined according to the difference e_n between the n-th-round predicted order number and the actual historical order number, 1 ≤ n < N.
5. The commodity order prediction method according to claim 4, characterized in that the following formula is used to determine the (n+1)-th-round hidden-layer weight values and the (n+1)-th-round output-layer weight values according to the difference e_n between the n-th-round predicted order number and the actual historical order number:
w_(n+1) = α·δw_(n+1) + w_n;
wherein w_(n+1) denotes the (n+1)-th-round hidden-layer weight values and (n+1)-th-round output-layer weight values, w_n denotes the n-th-round hidden-layer weight values and n-th-round output-layer weight values, α denotes a preset learning rate, and δw_(n+1) denotes the update amount of the (n+1)-th-round hidden-layer weight values and the update amount of the (n+1)-th-round output-layer weight values.
6. The commodity order prediction method according to claim 4, characterized in that performing a weighted summation according to the n-th-round deep-network output value, the n-th group of historical wide-neural-network features, and the n-th-round output-layer weight values to determine the n-th-round predicted order number comprises:
according to the n-th-round deep-network output value, the n-th group of historical wide-neural-network features, and the n-th-round output-layer weight values, performing a weighted summation to determine the n-th-round intermediate predicted order value;
processing the n-th-round intermediate predicted order value using an activation function, to determine the n-th-round predicted order number.
7. The commodity order prediction method according to claim 4, characterized in that the deep neural network algorithm is selected from: a ReLU fully connected network algorithm and a deep convolutional neural network algorithm.
8. The commodity order prediction method according to claim 2, characterized in that determining the wide & deep combined model according to the K candidate wide & deep combined models comprises:
determining M groups of historical wide-neural-network features and M groups of historical deep-neural-network features, the M groups of historical wide-neural-network features being different from the N groups of historical wide-neural-network features, the M groups of historical deep-neural-network features being different from the N groups of historical deep-neural-network features, and M being a positive integer;
according to the M groups of historical wide-neural-network features and M groups of historical deep-neural-network features, validating each of the K candidate wide & deep combined models, to determine M differences e between the predicted order numbers of the K candidate wide & deep combined models and the actual historical order numbers;
comparing the M differences e to determine the wide & deep combined model.
9. The commodity order prediction method according to claim 8, wherein comparing the M differences e to determine the wide-deep combined model comprises:
selecting, as the wide-deep combined model, the tentative wide-deep combined model whose M differences e have the smallest mean.
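Claims 8 and 9 together describe hold-out validation followed by selection on mean error. A minimal sketch, assuming each model can be scored against the M validation groups; the model labels and precomputed differences below are placeholders invented for illustration:

```python
def select_best_model(models, validate):
    """Claims 8-9 sketch: pick the tentative wide-deep model whose M
    validation differences have the smallest mean. `validate(model)`
    is assumed to return the M differences |predicted - actual|."""
    def mean_error(model):
        diffs = validate(model)
        return sum(diffs) / len(diffs)
    return min(models, key=mean_error)

# Toy run with K=3 stub models and M=2 precomputed differences each.
errors = {"A": [3.0, 5.0], "B": [1.0, 2.0], "C": [4.0, 0.5]}
best = select_best_model(list(errors), lambda m: errors[m])
```

Model "B" wins here because its mean difference (1.5) is below those of "A" (4.0) and "C" (2.25).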
10. The commodity order prediction method according to claim 2, wherein the joint feature of the historical attribute feature and the historical external feature is obtained from the historical attribute feature and the historical external feature by way of an outer product.
11. The commodity order prediction method according to claim 2, wherein the historical attribute feature and the historical external feature of the commodity are Boolean features converted by one-hot encoding, or by binning combined with one-hot encoding.
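Claims 10 and 11 amount to one-hot (or binned-then-one-hot) Boolean encodings crossed via an outer product. A minimal sketch, with feature names and category lists invented for illustration:

```python
import numpy as np

def one_hot(value, categories):
    """Claim-11 style conversion: a categorical value becomes a Boolean vector."""
    return np.array([value == c for c in categories], dtype=float)

def joint_feature(attr_vec, ext_vec):
    """Claim-10 joint feature: outer product of the attribute and external
    feature vectors, flattened into a single crossed feature vector."""
    return np.outer(attr_vec, ext_vec).ravel()

color = one_hot("red", ["red", "blue", "black"])  # attribute feature
holiday_bin = one_hot("high", ["low", "high"])    # binned external feature
crossed = joint_feature(color, holiday_bin)
```

The crossed vector has one Boolean entry per (attribute category, external bin) pair, so exactly one entry is 1 when both inputs are one-hot.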
12. The commodity order prediction method according to claim 2, wherein the commodity is an automobile, and the historical attribute feature is selected from one or more of: the vehicle category, color, price, engine displacement, and whether the automobile is a new-energy vehicle;
the historical external feature is selected from one or more of: the number of holiday days in a historical month, the historical monthly average crude oil price, the historical monthly consumer price index, and the historical monthly broad money growth rate; and
the historical order data are the order quantities within a past preset time period.
13. The commodity order prediction method according to claim 1, wherein the joint feature of the current attribute feature and the current external feature is obtained from the current attribute feature and the current external feature by way of an outer product.
14. The commodity order prediction method according to claim 1, wherein the current attribute feature and the current external feature of the commodity are Boolean features converted by one-hot encoding, or by binning combined with one-hot encoding.
15. The commodity order prediction method according to claim 1, wherein the commodity is an automobile, and the current attribute feature is selected from one or more of: the vehicle category, color, price, engine displacement, and whether the automobile is a new-energy vehicle;
the current external feature is selected from one or more of: the number of holiday days in the current month, the current monthly average crude oil price, the current monthly consumer price index, and the current monthly broad money growth-rate index; and
the historical order data are the order quantities within a past preset time period.
16. A commodity order prediction device, comprising:
a current feature determining module, adapted to determine current wide neural network features and current deep neural network features; and
an order quantity determining module, adapted to input the current wide neural network features and the current deep neural network features into a wide-deep combined model, to obtain a predicted order quantity for a commodity;
wherein the current wide neural network features and the current deep neural network features comprise the current attribute feature of the commodity, a current external feature, historical order data, and a joint feature of the current attribute feature and the current external feature.
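The two modules of claim 16 reduce to a feature-assembly step followed by a forward pass. A toy end-to-end sketch, in which the tiny linear "wide" model and every number are invented placeholders standing in for the full wide-deep combined model:

```python
import numpy as np

def determine_features(attr, external, history):
    """Current-feature module: assemble the wide features from the attribute
    feature, external feature, order history, and their crossed feature."""
    cross = np.outer(attr, external).ravel()
    return np.concatenate([attr, external, history, cross])

def predict(wide_features, weights):
    """Order-quantity module: a linear wide model standing in for the
    full wide-deep combination, clipped to a non-negative order count."""
    return float(max(wide_features @ weights, 0.0))

attr = np.array([1.0, 0.0])         # e.g. one-hot vehicle category
external = np.array([0.0, 1.0])     # e.g. binned holiday-days indicator
history = np.array([120.0, 130.0])  # past monthly order counts
features = determine_features(attr, external, history)
weights = np.full(features.shape, 0.1)  # placeholder trained weights
orders = predict(features, weights)
```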
17. A storage medium having computer instructions stored thereon, wherein, when the computer instructions are run, the steps of the commodity order prediction method according to any one of claims 1 to 15 are performed.
18. A terminal, comprising a memory and a processor, the memory storing computer instructions capable of running on the processor, wherein, when the processor runs the computer instructions, the steps of the commodity order prediction method according to any one of claims 1 to 15 are performed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811038035.9A CN109284866B (en) | 2018-09-06 | 2018-09-06 | Commodity order prediction method and device, storage medium and terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109284866A true CN109284866A (en) | 2019-01-29 |
CN109284866B CN109284866B (en) | 2021-01-29 |
Family
ID=65184101
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811038035.9A Active CN109284866B (en) | 2018-09-06 | 2018-09-06 | Commodity order prediction method and device, storage medium and terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109284866B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101088651B1 (en) * | 2009-09-30 | 2011-12-01 | 성균관대학교산학협력단 | Method and apparatus for context estimating |
CN102385724A (en) * | 2010-08-27 | 2012-03-21 | 上海财经大学 | Spare part assembling demand forecasting information processing method applied to inventory management |
CN103310286A (en) * | 2013-06-25 | 2013-09-18 | 浙江大学 | Product order prediction method and device with time series characteristics |
CN104766144A (en) * | 2015-04-22 | 2015-07-08 | 携程计算机技术(上海)有限公司 | Order forecasting method and system |
US20160019587A1 (en) * | 2012-12-30 | 2016-01-21 | Certona Corporation | Extracting predictive segments from sampled data |
CN106127329A (en) * | 2016-06-16 | 2016-11-16 | 北京航空航天大学 | Order forecast method and device |
KR20170136357A (en) * | 2016-06-01 | 2017-12-11 | 서울대학교산학협력단 | Apparatus and Method for Generating Prediction Model based on Artificial Neural Networks |
CN108053061A (en) * | 2017-12-08 | 2018-05-18 | 天津大学 | A kind of solar energy irradiation level Forecasting Methodology based on improvement convolutional neural networks |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110111311B (en) * | 2019-04-18 | 2021-07-09 | 北京奇艺世纪科技有限公司 | Image quality evaluation method and device |
CN110111311A (en) * | 2019-04-18 | 2019-08-09 | 北京奇艺世纪科技有限公司 | A kind of image quality evaluating method and device |
CN110110932A (en) * | 2019-05-09 | 2019-08-09 | 上汽安吉物流股份有限公司 | Order forecast method and device, logistics system and computer-readable medium |
CN110084438A (en) * | 2019-05-09 | 2019-08-02 | 上汽安吉物流股份有限公司 | Prediction technique and device, the logistics system and computer-readable medium of order |
CN110197309A (en) * | 2019-06-05 | 2019-09-03 | 北京极智嘉科技有限公司 | Order processing method, apparatus, equipment and storage medium |
CN110197309B (en) * | 2019-06-05 | 2021-11-26 | 北京极智嘉科技股份有限公司 | Order processing method, device, equipment and storage medium |
CN110298497A (en) * | 2019-06-11 | 2019-10-01 | 武汉蓝智科技有限公司 | Manufacturing forecast maintenance system and its application method based on big data |
CN110390433A (en) * | 2019-07-22 | 2019-10-29 | 国网河北省电力有限公司邢台供电分公司 | A kind of order forecast method, order forecasting device and terminal device |
CN110738523A (en) * | 2019-10-15 | 2020-01-31 | 北京经纬恒润科技有限公司 | maintenance order quantity prediction method and device |
CN110705805A (en) * | 2019-10-15 | 2020-01-17 | 秒针信息技术有限公司 | Cargo assembling method and device, storage medium and electronic device |
CN110738523B (en) * | 2019-10-15 | 2023-03-24 | 北京经纬恒润科技股份有限公司 | Maintenance order quantity prediction method and device |
JP7495058B2 (en) | 2020-09-24 | 2024-06-04 | アズワン株式会社 | Order forecast model creation method, order forecast model creation device, order forecast method and order forecast device |
CN113222202A (en) * | 2021-06-01 | 2021-08-06 | 携程旅游网络技术(上海)有限公司 | Reservation vehicle dispatching method, reservation vehicle dispatching system, reservation vehicle dispatching equipment and reservation vehicle dispatching medium |
CN113256223A (en) * | 2021-06-18 | 2021-08-13 | 深圳远荣智能制造股份有限公司 | Goods storage method, storage device, terminal equipment and storage medium |
CN117649164A (en) * | 2024-01-30 | 2024-03-05 | 四川宽窄智慧物流有限责任公司 | Gradient distribution method and system for overall cargo management |
CN117649164B (en) * | 2024-01-30 | 2024-04-16 | 四川宽窄智慧物流有限责任公司 | Gradient distribution method and system for overall cargo management |
Also Published As
Publication number | Publication date |
---|---|
CN109284866B (en) | 2021-01-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109284866A (en) | Goods orders prediction technique and device, storage medium, terminal | |
Rafi et al. | A short-term load forecasting method using integrated CNN and LSTM network | |
Gumus et al. | Crude oil price forecasting using XGBoost | |
Wang et al. | Effective electricity energy consumption forecasting using echo state network improved by differential evolution algorithm | |
CN109784979B (en) | Big data driven supply chain demand prediction method | |
CN110060144A (en) | Amount model training method, amount appraisal procedure, device, equipment and medium | |
CN109214601A (en) | Household electric appliances big data Method for Sales Forecast method | |
CN107346502A (en) | A kind of iteration product marketing forecast method based on big data | |
CN107133695A (en) | A kind of wind power forecasting method and system | |
CN110046764A (en) | The method and device of passenger flow forecast amount | |
CN109840628A (en) | A kind of multizone speed prediction method and system in short-term | |
CN108021773A (en) | The more play flood parameters rating methods of hydrological distribution model based on DSS data base read-writes | |
CN111489259A (en) | Stock market risk prediction intelligent implementation method based on deep learning | |
CN115145993A (en) | Railway freight big data visualization display platform based on self-learning rule operation | |
CN106097094A (en) | A kind of man-computer cooperation credit evaluation new model towards medium-sized and small enterprises | |
Li et al. | A Proposed Multi‐Objective, Multi‐Stage Stochastic Programming With Recourse Model for Reservoir Management and Operation | |
CN114004530B (en) | Enterprise electric power credit modeling method and system based on ordering support vector machine | |
CN112508734B (en) | Method and device for predicting power generation capacity of power enterprise based on convolutional neural network | |
Cano-Martínez et al. | Dynamic energy prices for residential users based on Deep Learning prediction models of consumption and renewable generation | |
CN113052630B (en) | Method for configuring electric power equipment by using model and electric power equipment configuration method | |
Ur-Rehman et al. | Dcnn and lda-rf-rfe based short-term electricity load and price forecasting | |
Sharrard | Understanding the Environment of New Business Ventures | |
CN113706197A (en) | Multi-microgrid electric energy transaction pricing strategy and system based on reinforcement and simulation learning | |
Doaei et al. | ANN-DEA approach of corporate diversification and efficiency in bursa Malaysia | |
Kesornsit et al. | Hybrid Machine Learning Model for Electricity Consumption Prediction Using Random Forest and Artificial Neural Networks |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||