CN103617459A - Commodity demand information prediction method under multiple influence factors - Google Patents

Commodity demand information prediction method under multiple influence factors

Info

Publication number
CN103617459A
CN103617459A CN201310656936.5A
Authority
CN
China
Prior art keywords
alpha
function
sigma
zeta
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201310656936.5A
Other languages
Chinese (zh)
Inventor
李敬泉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
The Nanjing smart Logistics Technology Co. Ltd.
Original Assignee
李敬泉
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 李敬泉 filed Critical 李敬泉
Priority to CN201310656936.5A priority Critical patent/CN103617459A/en
Publication of CN103617459A publication Critical patent/CN103617459A/en
Pending legal-status Critical Current

Abstract

The invention discloses a commodity demand information prediction method under multiple influence factors. The method comprises the following steps: historical data are collected first; the data are processed to form a training sample set, where the processing includes smoothing the data, removing singular values, applying fuzzy processing to the various influence factors, and normalizing the data to prevent calculation overflow; the training sample set thus formed is then fed into a support vector machine for learning, and the parameters of the prediction algorithm are adjusted to their optimal values; finally, the external information of the nodes whose demand is to be predicted is input into the fuzzy processing system, and the information to be predicted is obtained. The method comprehensively considers the various factors that influence commodity information and quantifies the influence of each factor, better overcoming the problem that traditional prediction algorithms ignore external information; overall optimization of the prediction algorithm can be achieved, so that commodity information can be grasped and understood more accurately and conveniently.

Description

A commodity demand information prediction method under multiple influence factors
Technical field
The present invention relates to a commodity demand information prediction method under multiple influence factors: taking historical commodity information and external data as the basis, the data are processed so that the commodity information can be predicted. The invention belongs to the technical field of information prediction.
Background technology
At present, with the trend of economic globalization, information is gradually becoming the key factor by which an enterprise gains competitiveness; in other words, whoever obtains information earlier and more accurately occupies an absolute advantage in competition within the same trade. Yet the commodity information prediction methods currently applied in the market seldom consider the influence of external factors on commodity information, and even those methods that do consider it find it difficult to grasp the influence factors accurately and to achieve reasonably accurate long-term prediction. Under these circumstances, it is vital to design a prediction method that achieves overall optimization and takes external influence factors into account when predicting commodity information.
Summary of the invention
Object of the invention: in view of the problems existing in current commodity information prediction methods, the invention provides a commodity demand information prediction method under multiple influence factors. First, historical data need to be collected, ensuring the reliability and sufficiency of the data. Secondly, the data are processed to form the training sample set; the processing includes smoothing the data, removing singular values, applying fuzzy processing to the various influence factors, and normalizing the data to prevent calculation overflow. Then the training sample set thus formed is fed into a support vector machine for learning, and the parameters of the prediction algorithm are adjusted to their optimal values. Next, the external information of the nodes whose demand is to be predicted is input into the fuzzy processing system. Finally, the information to be predicted is obtained. Carrying out commodity information prediction through this process makes it possible to grasp the dynamics of commodity information fairly accurately.
Technical scheme: a commodity demand information prediction method under multiple influence factors, comprising:
1. Judgment of influence factors
The influence factors on a commodity mainly include weather, temperature, season, festivals and holidays, individual preference, special events, etc. These influence factors can be divided into two classes: objective historical data and demand environment data.
(1) Objective historical data
Objective historical data mainly refer to the past sales data of similar products. According to correlation analysis, in product demand prediction the day before the prediction day and the same day of the previous week are relatively well correlated with the prediction day.
(2) Demand environment data
The factors related to the consumption demand environment include weather, temperature, season, date type (working day, festival or holiday), special events, etc.
2. Fuzzy processing of influence factors
The environmental factors are converted into fuzzy quantities through membership functions. For a linear input the number of membership functions can be kept small, while for a nonlinear output relation more membership functions need to be set.
3. Construction of the support vector machine
Prediction is carried out with the SVM, an algorithm that minimizes structural risk. The basic idea of this prediction method can be summarized as follows: first, the input vector x is mapped by a nonlinear transformation to a higher-dimensional space Z; then the optimal linear regression surface is sought in this new space. The nonlinear transformation is realized by defining a suitable inner-product function, which is constructed from a kernel function.
Prediction steps:
(1) Smooth and normalize the objective historical data, apply fuzzy quantization to the demand environment data, and then form the sample set;
(2) Use the training samples to establish the objective function of formula (3-11);
(3) Use the SVM training algorithm to solve (3-11) and obtain the solutions α_i and α_i^*, i = 1, 2, ..., n;
(4) Substitute the obtained Lagrange multipliers into formula (3-12), and use the samples to predict the next day's demand.
Beneficial effects: compared with the prior art, the commodity demand information prediction method under multiple influence factors provided by the invention combines the historical information of the commodity with the historical external information, processes the historical information to form the training sample set, and learns it in the support vector machine. At the same time the external information of the period to be predicted is processed by the fuzzy processing system, the resulting sample set is input into the support vector machine, and the prediction information is then output. This prediction method can better incorporate the various factors that influence commodity information in real life while maximizing the accuracy of the prediction results, thereby providing the relevant personnel with more accurate and timely commodity information, which is of great significance to many industries including the logistics industry.
Accompanying drawing explanation
Fig. 1 is the method flow diagram of the embodiment of the present invention;
Fig. 2 is the schematic diagram of the embodiment of the present invention;
Fig. 3 is the comparative graph of SVM prediction sales volume and true sales volume.
Embodiment
The present invention is further illustrated below in conjunction with specific embodiments. It should be understood that these embodiments are only intended to illustrate the present invention and not to limit its scope; after reading the present invention, modifications of its various equivalent forms made by those skilled in the art all fall within the scope defined by the appended claims.
As shown in Figures 1-2, the commodity demand information prediction method under multiple influence factors comprises two stages, early-stage preparation and prediction, where the early-stage preparation stage comprises the judgment of influence factors, the fuzzy processing of influence factors and the construction of the support vector machine. The detailed process is as follows:
Early-stage preparations:
1. Judgment of influence factors
The variation of commodity information has a certain seasonality, and consumers' choices are diverse and substitutable, so the prediction of a given class of commodity information is affected by many factors. These factors mainly include weather, temperature, season, festivals and holidays, individual preference, special events, etc. They can be divided into two classes: objective historical data and demand environment data.
(1) Objective historical data
Objective historical data mainly refer to the past sales data of similar products and reflect the consumption level of this class of commodity in a given period. According to correlation analysis, in product demand prediction the day before the prediction day and the same day of the previous week are relatively well correlated with the prediction day.
(2) Demand environment data
The factors related to the consumption demand environment include weather, temperature, season, date type (working day, festival or holiday), special events, etc.
2. Fuzzy processing of influence factors
Besides the objective historical data, demand environment data such as weather conditions, temperature conditions and date type all influence demand. If the influence of these environmental factors is to be considered in demand forecasting, the environmental data need to be processed; here the environmental factors are converted into fuzzy quantities through membership functions. The design of the membership functions affects the robustness of the system and the precision of the prediction; in general, for a linear input the number of membership functions can be kept small, while for a nonlinear output relation more membership functions need to be set. (Note: in this document temperature and day of the week are taken as examples to outline the fuzzy processing of the influence factors.)
Trapezoidal distributions are adopted for the temperature membership functions:
(1) The membership function for the low-temperature condition adopts a lower ("smaller-than") trapezoidal distribution:
u_{t1} = \begin{cases} 1, & t < 0 \\ \frac{10 - t}{10 - 0}, & 0 \le t \le 10 \\ 0, & t > 10 \end{cases}
(2) The membership function for the medium-temperature condition adopts a middle trapezoidal distribution:
u_{t2} = \begin{cases} \frac{t - 5}{15 - 5}, & 5 \le t \le 15 \\ \frac{25 - t}{25 - 15}, & 15 \le t \le 25 \\ 0, & \text{otherwise} \end{cases}
(3) The membership function for the high-temperature condition adopts an upper ("larger-than") trapezoidal distribution:
u_{t3} = \begin{cases} 0, & t < 20 \\ \frac{t - 20}{40 - 20}, & 20 \le t \le 40 \\ 1, & t > 40 \end{cases}
For the day-of-week type, the membership functions adopt a half-rectangular distribution:
(1) The membership function for working days is: [formula given as an image in the original]
(2) The membership function for the two-day weekend is: [formula given as an image in the original]
The maximum temperature T_H and the minimum temperature T_L of a day of the same type are substituted into the three temperature membership functions, giving the membership degrees for the three states (low, medium and high temperature): {T_{H1}, T_{H2}, T_{H3}} and {T_{L1}, T_{L2}, T_{L3}}. According to the maximum-membership principle, take T_H = max{T_{H1}, T_{H2}, T_{H3}} and T_L = max{T_{L1}, T_{L2}, T_{L3}}; this determines the fuzzy sets to which the maximum temperature T_H and the minimum temperature T_L belong.
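To make the fuzzification above concrete, the following is a minimal sketch in Python (the language, function names and example temperatures are illustrative assumptions, not part of the patent) of the three trapezoidal membership functions and the maximum-membership selection:

def mu_low(t):
    # lower trapezoid: 1 below 0 degC, falls linearly to 0 at 10 degC
    if t < 0:
        return 1.0
    if t <= 10:
        return (10 - t) / 10.0
    return 0.0

def mu_mid(t):
    # middle trapezoid: rises on [5, 15], falls on [15, 25], 0 elsewhere
    if 5 <= t <= 15:
        return (t - 5) / 10.0
    if 15 < t <= 25:
        return (25 - t) / 10.0
    return 0.0

def mu_high(t):
    # upper trapezoid: 0 below 20 degC, rises linearly to 1 at 40 degC
    if t > 40:
        return 1.0
    if t >= 20:
        return (t - 20) / 20.0
    return 0.0

def fuzzify_temperature(t):
    # return all three membership degrees and the dominant fuzzy set
    degrees = {"low": mu_low(t), "mid": mu_mid(t), "high": mu_high(t)}
    return degrees, max(degrees, key=degrees.get)   # maximum-membership principle

print(fuzzify_temperature(28))   # "high" is dominant for a 28 degC day
print(fuzzify_temperature(3))    # "low" is dominant for a 3 degC day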
On the basis of the fuzzy set of the temperature determined above, the weather factor is taken into account through a temperature-weather quantization parameter method, as shown in Table 1 below:
Table 1 Temperature-weather condition quantization parameters
[table content given as an image in the original]
3. Construction of the support vector machine
Traditional machine learning methods all take empirical risk minimization as their goal; although they can reach very high accuracy on the sample set, they deviate greatly when predicting unknown data, i.e. their generalization ability is poor. Statistical learning theory therefore introduced the concept of the generalization error bound: the true risk is characterized by two parts. The first is the empirical risk, which represents the error of the classifier on the given samples; the second is the confidence risk, which represents the error of the classifier on unknown data samples. The confidence risk is related to two quantities: one is the sample size (obviously, the larger the given sample size, the more likely the learned structure is correct and the smaller the confidence risk); the other is the VC dimension of the classification function (obviously, the larger the VC dimension, the poorer the generalization ability and the larger the confidence risk).
The formula of the generalization error bound is: R(C) ≤ R_emp(C) + φ(n/h)
In this formula R(C) is the true risk, R_emp(C) is the empirical risk and φ(n/h) is the confidence risk. The goal of statistical learning thus changes from minimizing the empirical risk alone to minimizing the sum of the empirical risk and the confidence risk, and SVM is exactly such an algorithm that minimizes structural risk.
The basic idea of this prediction method can be summarized as follows: first, the input vector x is mapped by a nonlinear transformation to a higher-dimensional space Z; then the optimal linear regression surface is sought in this new space. The nonlinear transformation is realized by defining a suitable inner-product function, which is constructed from a kernel function.
Given a data set G = {(x_i, d_i)}_{i=1}^{n}, where d_i is the expected value, x_i is the input data vector and n is the number of training samples, SVM adopts the following formula as the regression function:
y = f(x) = wφ(x) + b    (3-1)
In this formula, φ(x) is the nonlinear mapping from the input vector set to the high-dimensional feature space, and w and b are coefficients; they are estimated by minimizing the risk function R(C):
Minimize: R(C) = \frac{1}{2}\|w\|^2 + C\,\frac{1}{n}\sum_{i=1}^{n} L_\varepsilon(d_i, y_i)    (3-2)
L_\varepsilon(d, y) = \begin{cases} 0, & |d - y| < \varepsilon \\ |d - y| - \varepsilon, & |d - y| \ge \varepsilon \end{cases}    (3-3)
In the formula, \frac{1}{2}\|w\|^2 is a measure of the margin between the support vectors; C is a free constant used to balance the model flatness term against the training error term. ε is the given error parameter, and the error term L_ε(d, y) is the ε-insensitive loss function. This loss function defines in feature space a slab region centered on the hyperplane y = f(x) and of thickness 2ε. When a sample falls inside this region, the difference between the predicted value and the actual value is smaller than ε and the loss is 0; when a sample falls outside this region, it is penalized linearly.
In order to determine the coefficients w and b, the slack variables ζ and ζ^* are introduced:
Minimize: R(w, ζ, ζ^*) = \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{n}(\zeta_i + \zeta_i^*)
Subject to: d_i - w\phi(x_i) - b \le \varepsilon + \zeta_i, \quad w\phi(x_i) + b - d_i \le \varepsilon + \zeta_i^*, \quad \zeta_i, \zeta_i^* \ge 0    (3-4)
According to the Lagrange duality principle, the dual variables α_i, α_i^*, η_i, η_i^*, which satisfy the non-negativity conditions, are introduced, and the following function is constructed:
L = \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{n}(\zeta_i + \zeta_i^*) - \sum_{i=1}^{n}\alpha_i(\varepsilon + \zeta_i - d_i + w\phi(x_i) + b) - \sum_{i=1}^{n}\alpha_i^*(\varepsilon + \zeta_i^* + d_i - w\phi(x_i) - b) - \sum_{i=1}^{n}(\eta_i\zeta_i + \eta_i^*\zeta_i^*)    (3-5)
In this optimization problem the Lagrange function has a saddle point. According to the saddle-point condition, the partial derivatives of L with respect to the original variables (w, b, ζ_i, ζ_i^*) are all 0, that is:
\partial_w L = w - \sum_{i=1}^{n}(\alpha_i - \alpha_i^*)\phi(x_i) = 0    (3-6)
\partial_b L = \sum_{i=1}^{n}(\alpha_i^* - \alpha_i) = 0    (3-7)
\partial_{\zeta_i} L = C - \alpha_i - \eta_i = 0    (3-8)
\partial_{\zeta_i^*} L = C - \alpha_i^* - \eta_i^* = 0    (3-9)
Substituting formulas (3-6), (3-7), (3-8) and (3-9) into (3-5) and applying the Wolfe duality technique yields the following formula:
Maximize: R(\alpha_i, \alpha_i^*) = -\frac{1}{2}\sum_{i,j=1}^{n}(\alpha_i - \alpha_i^*)(\alpha_j - \alpha_j^*)\langle\phi(x_i), \phi(x_j)\rangle - \varepsilon\sum_{i=1}^{n}(\alpha_i + \alpha_i^*) + \sum_{i=1}^{n} d_i(\alpha_i - \alpha_i^*)    (3-10)
Subject to: \sum_{i=1}^{n}(\alpha_i - \alpha_i^*) = 0, \quad \alpha_i, \alpha_i^* \in [0, C]
For a nonlinear regression problem, the problem can be transferred to a new space by constructing new feature vectors; that is, the nonlinear problem is converted by a nonlinear transformation into a linear problem in another higher-dimensional space. In this transformed space only the inner-product operation after the transformation needs to be defined, without knowing the explicit form of the nonlinear transformation; this overcomes the possible curse of dimensionality and reduces the computational complexity. In statistical learning theory, according to the Hilbert-Schmidt principle, any operation that satisfies the Mercer condition can be used as an inner product. The SVM method therefore realizes the linear regression after the nonlinear transformation by introducing a suitable inner-product kernel function K(x_i, x_j).
The choice of the kernel function K(x_i, x_j) determines the structure of the feature space; its value equals the inner product of the two vectors x_i and x_j in the feature space, i.e. K(x_i, x_j) = φ(x_i)·φ(x_j). The choice of kernel function has a great influence on prediction precision. The main kernel functions are the polynomial kernel K(x, x_i) = (x·x_i + 1)^q, the exponential radial basis function K(x, x_i) = exp(-|x - x_i|^2 / σ^2) and the sigmoid kernel K(x, x_i) = tanh(v(x·x_i) + c), where q is the order of the polynomial kernel, σ^2 is the width parameter of the radial basis kernel and c is a constant. Here the comparatively accurate radial basis kernel function is adopted. Therefore, objective function (3-10) becomes:
Maximize: R(\alpha_i, \alpha_i^*) = -\frac{1}{2}\sum_{i,j=1}^{n}(\alpha_i - \alpha_i^*)(\alpha_j - \alpha_j^*)K(x_i, x_j) - \varepsilon\sum_{i=1}^{n}(\alpha_i + \alpha_i^*) + \sum_{i=1}^{n} d_i(\alpha_i - \alpha_i^*)    (3-11)
Subject to: \sum_{i=1}^{n}(\alpha_i - \alpha_i^*) = 0, \quad \alpha_i, \alpha_i^* \in [0, C]
The corresponding regression function, from formula (3-1), is:
y = f(x, \alpha_i, \alpha_i^*) = \sum_{i=1}^{n}(\alpha_i - \alpha_i^*)K(x, x_i) + b    (3-12)
By the properties of the SVM regression function, only a minority of the α_i and α_i^* are non-zero; the vectors corresponding to these non-zero parameters are called support vectors, and the regression function is determined entirely by them.
Prediction steps:
(1) Smooth and normalize the objective historical data, apply fuzzy quantization to the demand environment data, and then form the sample set (a preprocessing sketch follows these steps);
(2) Use the training samples to establish the objective function of formula (3-11);
(3) Use the SVM training algorithm to solve (3-11) and obtain the solutions α_i and α_i^*, i = 1, 2, ..., n;
(4) Substitute the obtained Lagrange multipliers into formula (3-12), and use the samples to predict the next day's demand.
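Step (1) involves smoothing and normalization, which the patent does not spell out in detail. The following is a minimal sketch assuming one common choice, a centered moving average for smoothing and min-max scaling to [-1, 1] (consistent with the range of the scaled sales values in the tables of the example below):

import numpy as np

def smooth(series, window=3):
    # centered moving average; edges are padded by repeating the boundary values
    padded = np.pad(np.asarray(series, dtype=float), window // 2, mode="edge")
    return np.convolve(padded, np.ones(window) / window, mode="valid")

def normalize(series, lo=-1.0, hi=1.0):
    # min-max scaling of the series into [lo, hi]
    s = np.asarray(series, dtype=float)
    return lo + (hi - lo) * (s - s.min()) / (s.max() - s.min())

sales = [1514, 1645, 1423, 1286, 1355, 1287, 1466]
print(normalize(smooth(sales)))   # smoothed sales rescaled to [-1, 1]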
Example demonstration
The input of the agricultural-product dynamic demand forecast model comprises the following variables:
T_max^d: the maximum temperature of the prediction day
T_min^d: the minimum temperature of the prediction day
T_max^(d-1): the maximum temperature of the day before the prediction day
T_min^(d-1): the minimum temperature of the day before the prediction day
W_d: the weather condition of the prediction day (sunny, rainy, etc.)
W_(d-1): the weather condition of the day before the prediction day
Wea_d: the day-of-week type of the prediction day (weekend or working day)
Wea_(d-1): the day-of-week type of the day before the prediction day
S_(d-1): the sales volume of the day before the prediction day
S_(d-2): the sales volume two days before the prediction day
S_(d-7): the sales volume of the same day in the previous week
In the SVM based on the radial basis function (RBF) kernel, the data centers are the support vectors and the network weights are the coefficients (α_i - α_i^*); these parameters are all produced automatically by the SVM. The SVM algorithm can be reduced to a constrained quadratic programming problem, any solution of which is the globally optimal solution. In the prediction of agricultural-product dynamic demand, the input vector is x = [x_1, x_2, ..., x_n]^T = [T_max^d, T_min^d, ..., S_(d-7)]^T, and the output is y = f(x) as given by formula (3-12), where the number of terms in the sum equals the number of support vectors.
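As a concrete illustration of the input layout listed above, the following sketch (the data types and the example values are illustrative assumptions, not taken from the patent's tables) assembles the 11-dimensional input vector for one prediction day:

from dataclasses import dataclass

@dataclass
class DayRecord:
    t_max: float      # fuzzified maximum temperature
    t_min: float      # fuzzified minimum temperature
    weather: float    # temperature-weather quantization parameter (Table 1)
    week_type: int    # day-of-week type: 1 = weekend, 0 = working day

def build_input_vector(day, prev_day, s_prev1, s_prev2, s_prev7):
    # x = [T_max^d, T_min^d, T_max^(d-1), T_min^(d-1), W_d, W_(d-1),
    #      Wea_d, Wea_(d-1), S_(d-1), S_(d-2), S_(d-7)]
    return [day.t_max, day.t_min, prev_day.t_max, prev_day.t_min,
            day.weather, prev_day.weather, day.week_type, prev_day.week_type,
            s_prev1, s_prev2, s_prev7]

x = build_input_vector(DayRecord(0.4, 0.4, 0.25, 1),
                       DayRecord(0.2, 0.25, 0.30, 1),
                       0.669, 0.697, 0.947)
print(x)   # 11-dimensional input for the trained SVM model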
1. Collect the historical data (Table 2) and the prediction data (Table 3); the first 400 groups of data serve as historical data and the last 30 groups as test data.
Table 2
[Table 2 content is given as an image in the original]
Table 3
[Table 3 content is given as an image in the original]
2. Apply fuzzy processing to the historical data (Table 2.1), then smoothing and normalization (Table 2.2), and form the sample set (Table 4);
Table 2.1
0.4 0.4 0.2 0.25 1 1 1514 1525 1624
0.9 0.95 0.4 0.4 0 1 1645 1514 1352
0.2 0.25 0.9 0.95 0 0 1423 1645 1327
0 0.05 0.2 0.25 0 0 1286 1423 1431
0.2 0.25 0 0.05 0 0 1355 1286 1352
0 0.05 0.2 0.25 0 0 1287 1355 1622
0 0 0 0.05 1 0 1466 1287 1553
0.3 0.2 0 0 1 1 1529 1466 1645
0.9 0.9 0.3 0.2 0 1 1563 1529 1423
0.4 0.45 0.9 0.9 0 0 1326 1563 1286
0 0.05 0.4 0.45 0 0 1245 1326 1355
0.9 0.9 0 0.05 0 0 1465 1245 1287
0.9 0.9 0.9 0.9 0 0 1333 1465 1466
0.2 0.2 0.9 0.9 1 0 1487 1333 1529
0.1 0 0.9 0.9 0 1 1382 1385 1395
0.3 0.2 0.1 0 0 0 1332 1382 1333
0.3 0.2 0.3 0.2 0 0 1359 1332 1358
0.3 0.2 0.3 0.2 0 0 1289 1359 1295
0.3 0.2 0.3 0.2 0 0 1296 1289 1395
1 0.9 0.3 0.2 1 0 1329 1296 1385
1 0.9 1 0.9 1 1 1346 1329 1382
Table 2.2
1 1
2 0.439394
3 0.093434
4 0.267677
5 0.09596
6 0.54798
7 0.707071
8 0.792929
9 0.194444
10 -0.0101
11 0.545455
12 0.212121
13 0.60101
14 0.191919
394 0.209596
395 0.277778
396 0.10101
397 0.118687
398 0.20202
399 0.244949
400 0.184343
Table 4
0.4 0.4 0.2 0.25 1 1 0.669192 0.69697 0.94697
0.9 0.95 0.4 0.4 0 1 1 0.669192 0.260101
0.2 0.25 0.9 0.95 0 0 0.439394 1 0.19697
0 0.05 0.2 0.25 0 0 0.093434 0.439394 0.459596
0.2 0.25 0 0.05 0 0 0.267677 0.093434 0.260101
0 0.05 0.2 0.25 0 0 0.09596 0.267677 0.941919
0 0 0 0.05 1 0 0.54798 0.09596 0.767677
0.3 0.2 0 0 1 1 0.707071 0.54798 1
0.9 0.9 0.3 0.2 0 1 0.792929 0.707071 0.439394
0.4 0.45 0.9 0.9 0 0 0.194444 0.792929 0.093434
0 0.05 0.4 0.45 0 0 -0.0101 0.194444 0.267677
0.9 0.9 0 0.05 0 0 0.545455 -0.0101 0.09596
0.9 0.9 0.9 0.9 0 0 0.212121 0.545455 0.54798
0.2 0.2 0.9 0.9 1 0 0.60101 0.212121 0.707071
0.1 0 0.9 0.9 0 1 0.335859 0.343434 0.368687
0.3 0.2 0.1 0 0 0 0.209596 0.335859 0.212121
0.3 0.2 0.3 0.2 0 0 0.277778 0.209596 0.275253
0.3 0.2 0.3 0.2 0 0 0.10101 0.277778 0.116162
0.3 0.2 0.3 0.2 0 0 0.118687 0.10101 0.368687
1 0.9 0.3 0.2 1 0 0.20202 0.118687 0.343434
1 0.9 1 0.9 1 1 0.244949 0.20202 0.335859
3. Apply fuzzy processing to the prediction data (Table 5.1), then smoothing and normalization (Table 5.2), and form the sample set (Table 6);
Table 5.1
0.9 0.9 1 0.9 0 1 1322 1346 1332
0.9 0.9 0.9 0.9 0 0 1266 1322 1359
0.5 0.4 0.9 0.9 0 0 1222 1266 1289
0.1 0 0.5 0.4 0 0 1243 1222 1296
0.1 0 0.1 0 0 0 1322 1243 1329
0.3 0.2 0.1 0 1 0 1265 1322 1346
0.3 0.2 0.3 0.2 1 1 1385 1265 1322
0.5 0.4 0.3 0.2 0 1 1333 1385 1266
0.9 0.9 0.5 0.4 0 0 1243 1333 1222
0.9 0.9 0.9 0.9 0 0 1193 1243 1243
0.9 0.9 0.9 0.9 0 0 1125 1193 1322
0.3 0.2 0.9 0.9 0 0 1086 1125 1265
0.3 0.2 0.3 0.2 1 0 1196 1086 1385
0.1 0 0.3 0.2 1 1 1245 1196 1333
0.3 0.2 0.1 0 0 1 1353 1245 1243
0.3 0.2 0.3 0.2 0 0 1239 1353 1193
0.3 0.2 0.3 0.2 0 0 1193 1239 1125
0.5 0.4 0.3 0.2 0 0 1127 1193 1086
1 0.9 0.5 0.4 0 0 1083 1127 1196
1 0.9 1 0.9 1 0 1022 1083 1245
0.3 0.2 1 0.9 1 1 1158 1022 1353
1 0.9 0.3 0.2 0 1 1222 1158 1239
0.9 0.9 1 0.9 0 0 1085 1222 1193
1 0.9 0.9 0.9 0 0 1023 1085 1127
1 0.9 1 0.9 0 0 957 1023 1083
0.3 0.2 1 0.9 0 0 938 957 1022
0.1 0 0.3 0.2 1 0 1028 938 1158
0.1 0 0.1 0 1 1 1184 1028 1222
0.3 0.2 0.1 0 0 1 1122 1184 1085
0.3 0.2 0.3 0.2 0 0 1138 1122 1023
Table 5.2
1 0.042929
2 -0.06818
3 -0.01515
4 0.184343
5 0.040404
6 0.343434
7 0.212121
8 -0.01515
9 -0.14141
10 -0.31313
11 -0.41162
12 -0.13384
13 -0.0101
14 0.262626
15 -0.02525
16 -0.14141
17 -0.30808
18 -0.41919
19 -0.57323
20 -0.2298
21 -0.06818
22 -0.41414
23 -0.57071
24 -0.73737
25 -0.78535
26 -0.55808
27 -0.16414
28 -0.32071
29 -0.2803
30 -0.4899
Table 6
0.9 0.9 1 0.9 0 1 0.184343 0.244949 0.209596
0.9 0.9 0.9 0.9 0 0 0.042929 0.184343 0.277778
0.5 0.4 0.9 0.9 0 0 -0.06818 0.042929 0.10101
0.1 0 0.5 0.4 0 0 -0.01515 -0.06818 0.118687
0.1 0 0.1 0 0 0 0.184343 -0.01515 0.20202
0.3 0.2 0.1 0 1 0 0.040404 0.184343 0.244949
0.3 0.2 0.3 0.2 1 1 0.343434 0.040404 0.184343
0.5 0.4 0.3 0.2 0 1 0.212121 0.343434 0.042929
0.9 0.9 0.5 0.4 0 0 -0.01515 0.212121 -0.06818
0.9 0.9 0.9 0.9 0 0 -0.14141 -0.01515 -0.01515
0.9 0.9 0.9 0.9 0 0 -0.31313 -0.14141 0.184343
0.3 0.2 0.9 0.9 0 0 -0.41162 -0.31313 0.040404
0.3 0.2 0.3 0.2 1 0 -0.13384 -0.41162 0.343434
0.1 0 0.3 0.2 1 1 -0.0101 -0.13384 0.212121
0.3 0.2 0.1 0 0 1 0.262626 -0.0101 -0.01515
0.3 0.2 0.3 0.2 0 0 -0.02525 0.262626 -0.14141
0.3 0.2 0.3 0.2 0 0 -0.14141 -0.02525 -0.31313
0.5 0.4 0.3 0.2 0 0 -0.30808 -0.14141 -0.41162
1 0.9 0.5 0.4 0 0 -0.41919 -0.30808 -0.13384
1 0.9 1 0.9 1 0 -0.57323 -0.41919 -0.0101
0.3 0.2 1 0.9 1 1 -0.2298 -0.57323 0.262626
1 0.9 0.3 0.2 0 1 -0.06818 -0.2298 -0.02525
0.9 0.9 1 0.9 0 0 -0.41414 -0.06818 -0.14141
1 0.9 0.9 0.9 0 0 -0.57071 -0.41414 -0.30808
1 0.9 1 0.9 0 0 -0.73737 -0.57071 -0.41919
0.3 0.2 1 0.9 0 0 -0.78535 -0.73737 -0.57323
0.1 0 0.3 0.2 1 0 -0.55808 -0.78535 -0.2298
0.1 0 0.1 0 1 1 -0.16414 -0.55808 -0.06818
0.3 0.2 0.1 0 0 1 -0.32071 -0.16414 -0.41414
0.3 0.2 0.3 0.2 0 0 -0.2803 -0.32071 -0.57071
4. Use the training samples to establish the objective function. In this example the kernel function is written in the form K(x, x_i) = exp(-γ|x - x_i|^2) instead of the original radial basis kernel K(x, x_i) = exp(-|x - x_i|^2 / σ^2), where γ and 1/σ^2 play an equivalent role. The three freely adjustable parameters in the algorithm are ε, C and γ.
During testing it was found that the value of the parameter ε has very little effect on the prediction precision, so ε is fixed at 0.01. In actual measurement, the values of the parameters C and γ were found to have a larger effect on the prediction precision. Here, taking the average percentage error e and the prediction accuracy A_c as performance indices, the optimal values of C and γ are obtained by cross-validation. The results are shown in the following table:
Table 7 Effect of the parameters on prediction precision
[table content given as an image in the original]
From Table 7 it can be seen that, with the parameter γ fixed at 1, changing the value of C yields prediction models of different precision. As C increases from 0.01 to 0.1 the prediction accuracy increases; as C continues to increase from 0.1 to 100 the accuracy keeps decreasing, and once C exceeds 100 the rate of decline of the prediction accuracy becomes even larger. With C fixed at 0.1, moving γ away from 0.25, whether decreasing or increasing, degrades the prediction accuracy. The optimal parameter values for this test are therefore ε = 0.01, C = 0.1, γ = 0.25.
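One way to reproduce the parameter search described above is a cross-validated grid search with ε fixed at 0.01. The sketch below uses scikit-learn's GridSearchCV; the grid values, the placeholder data and the scoring function (negative mean absolute percentage error standing in for the accuracy index A_c) are assumptions rather than the patent's exact procedure:

import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# placeholder training set with the shape of the example (400 samples); in
# practice X_train / y_train are the fuzzified and normalized historical data
rng = np.random.default_rng(0)
X_train = rng.random((400, 8))
y_train = rng.random(400)

param_grid = {"C": [0.01, 0.1, 1, 10, 100],
              "gamma": [0.0625, 0.125, 0.25, 0.5, 1]}
search = GridSearchCV(SVR(kernel="rbf", epsilon=0.01), param_grid,
                      scoring="neg_mean_absolute_percentage_error", cv=5)
search.fit(X_train, y_train)
print(search.best_params_)   # the patent's example reports C=0.1, gamma=0.25 as optimal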
5. Use the SVM training algorithm to solve the objective function and obtain the solutions α_i and α_i^*, i = 1, 2, ..., n;
6. Substitute the obtained Lagrange multipliers into formula (3-12), and use the samples to predict the next day's demand, obtaining Table 8.
Table 8
1 1225.065 16 1224.134
2 1183.957 17 1169.204
3 1170.928 18 1103.858
4 1255.546 19 1061.722
5 1325.055 20 1117.751
6 1373.847 21 1231.923
7 1376.96 22 1163.791
8 1276.974 23 1040.04
9 1161.522 24 989.3822
10 1103.24 25 963.1091
11 1095.081 26 988.0927
12 1101.476 27 1195.036
13 1329.1 28 1253.879
14 1335.225 29 1121.063
15 1268.517 30 1106.855
As shown in Fig. 3, the curve of the true sales volume obtained in the test and the curve of the sales volume predicted by the method are very close; it can be seen from this that the sales prediction produced by the method provided by the invention has a relatively high accuracy.

Claims (5)

1. A commodity demand information prediction method under multiple influence factors, characterized in that:
first, commodity historical data and external information need to be collected; the commodity historical data are subjected to data smoothing, fuzzy processing and normalization, while the external information is subjected to fuzzy processing and normalization, forming the training sample set; then the training sample set thus formed is fed into a support vector machine for learning, and the parameters of the prediction algorithm are adjusted to their optimal values; the information to be predicted is thereby obtained.
2. The commodity demand information prediction method under multiple influence factors as claimed in claim 1, characterized in that:
the commodity historical data mainly refer to the past sales data of similar products; according to correlation analysis, in product demand prediction the day before the prediction day and the same day of the previous week are relatively well correlated with the prediction day;
the historical external information mainly refers to factors related to the consumption demand environment, including weather, temperature, season, date type and special events.
3. The commodity demand information prediction method under multiple influence factors as claimed in claim 1, characterized in that the fuzzy processing of the external information means converting the environmental factors into fuzzy quantities through membership functions; for a linear input the number of membership functions can be kept small, while for a nonlinear output relation more membership functions need to be set.
4. The commodity demand information prediction method under multiple influence factors as claimed in claim 1, characterized in that:
in the construction of the support vector machine,
the formula of the generalization error bound is: R(C) ≤ R_emp(C) + φ(n/h),
where R(C) is the true risk, R_emp(C) is the empirical risk and φ(n/h) is the confidence risk;
first, the input vector x is mapped by a nonlinear transformation to a higher-dimensional space Z, and then the optimal linear regression surface is sought in this new space; the nonlinear transformation is realized by defining a suitable inner-product function, which is constructed from a kernel function;
given a data set G = {(x_i, d_i)}_{i=1}^{n}, where d_i is the expected value, x_i is the input data vector and n is the number of training samples, SVM adopts the following formula as the regression function:
y=f(x)=wφ(x)+b (3-1)
in this formula, φ(x) is the nonlinear mapping from the input vector set to the high-dimensional feature space, and w and b are coefficients estimated by minimizing the risk function R(C):
Minimize: R(C) = \frac{1}{2}\|w\|^2 + C\,\frac{1}{n}\sum_{i=1}^{n} L_\varepsilon(d_i, y_i)    (3-2)
L_\varepsilon(d, y) = \begin{cases} 0, & |d - y| < \varepsilon \\ |d - y| - \varepsilon, & |d - y| \ge \varepsilon \end{cases}    (3-3)
in the formula, \frac{1}{2}\|w\|^2 is a measure of the margin between the support vectors; C is a free constant used to balance the model flatness term against the training error term; ε is the given error parameter, and the error term L_ε(d, y) is the ε-insensitive loss function, which defines in feature space a slab region centered on the hyperplane y = f(x) and of thickness 2ε; when a sample falls inside this region, the difference between the predicted value and the actual value is smaller than ε and the loss is 0; when a sample falls outside this region, it is penalized linearly;
in order to determine the coefficients w and b, the slack variables ζ and ζ^* are introduced:
Minimize: R(w, ζ, ζ^*) = \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{n}(\zeta_i + \zeta_i^*)
Subject to: d_i - w\phi(x_i) - b \le \varepsilon + \zeta_i, \quad w\phi(x_i) + b - d_i \le \varepsilon + \zeta_i^*, \quad \zeta_i, \zeta_i^* \ge 0    (3-4)
according to the Lagrange duality principle, the dual variables α_i, α_i^*, η_i, η_i^*, which satisfy the non-negativity conditions, are introduced, and the following function is constructed:
L = \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{n}(\zeta_i + \zeta_i^*) - \sum_{i=1}^{n}\alpha_i(\varepsilon + \zeta_i - d_i + w\phi(x_i) + b) - \sum_{i=1}^{n}\alpha_i^*(\varepsilon + \zeta_i^* + d_i - w\phi(x_i) - b) - \sum_{i=1}^{n}(\eta_i\zeta_i + \eta_i^*\zeta_i^*)    (3-5)
in the optimization problem the Lagrange function has a saddle point; according to the saddle-point condition, the partial derivatives of L with respect to the original variables (w, b, ζ_i, ζ_i^*) are all 0, that is:
\partial_w L = w - \sum_{i=1}^{n}(\alpha_i - \alpha_i^*)\phi(x_i) = 0    (3-6)
\partial_b L = \sum_{i=1}^{n}(\alpha_i^* - \alpha_i) = 0    (3-7)
\partial_{\zeta_i} L = C - \alpha_i - \eta_i = 0    (3-8)
\partial_{\zeta_i^*} L = C - \alpha_i^* - \eta_i^* = 0    (3-9)
substituting formulas (3-6), (3-7), (3-8) and (3-9) into (3-5) and applying the Wolfe duality technique yields:
Maximize: R(\alpha_i, \alpha_i^*) = -\frac{1}{2}\sum_{i,j=1}^{n}(\alpha_i - \alpha_i^*)(\alpha_j - \alpha_j^*)\langle\phi(x_i), \phi(x_j)\rangle - \varepsilon\sum_{i=1}^{n}(\alpha_i + \alpha_i^*) + \sum_{i=1}^{n} d_i(\alpha_i - \alpha_i^*)    (3-10)
Subject to: \sum_{i=1}^{n}(\alpha_i - \alpha_i^*) = 0, \quad \alpha_i, \alpha_i^* \in [0, C]
for a nonlinear regression problem, the problem can be transferred to a new space by constructing new feature vectors, i.e. the nonlinear problem is converted by a nonlinear transformation into a linear problem in another higher-dimensional space; in this transformed space only the inner-product operation after the transformation needs to be defined; according to the Hilbert-Schmidt principle, any operation that satisfies the Mercer condition can be used as an inner product; the SVM method therefore realizes the linear regression after the nonlinear transformation by introducing a suitable inner-product kernel function K(x_i, x_j);
the choice of the kernel function K(x_i, x_j) determines the structure of the feature space; its value equals the inner product of the two vectors x_i and x_j in the feature space, i.e. K(x_i, x_j) = φ(x_i)·φ(x_j); the choice of kernel function has a great influence on prediction precision; the main kernel functions are the polynomial kernel K(x, x_i) = (x·x_i + 1)^q, the exponential radial basis function K(x, x_i) = exp(-|x - x_i|^2 / σ^2) and the sigmoid kernel K(x, x_i) = tanh(v(x·x_i) + c), where q is the order of the polynomial kernel and σ^2 is the width parameter of the radial basis kernel; here the comparatively accurate radial basis kernel function is adopted; therefore, objective function (3-10) becomes:
Maximize: R(\alpha_i, \alpha_i^*) = -\frac{1}{2}\sum_{i,j=1}^{n}(\alpha_i - \alpha_i^*)(\alpha_j - \alpha_j^*)K(x_i, x_j) - \varepsilon\sum_{i=1}^{n}(\alpha_i + \alpha_i^*) + \sum_{i=1}^{n} d_i(\alpha_i - \alpha_i^*)    (3-11)
Subject to: \sum_{i=1}^{n}(\alpha_i - \alpha_i^*) = 0, \quad \alpha_i, \alpha_i^* \in [0, C]
the corresponding regression function, from formula (3-1), is:
y = f(x, \alpha_i, \alpha_i^*) = \sum_{i=1}^{n}(\alpha_i - \alpha_i^*)K(x, x_i) + b    (3-12)
by the properties of the SVM regression function, only a minority of the α_i and α_i^* are non-zero; the vectors corresponding to these non-zero parameters are called support vectors, and the regression function is determined entirely by them.
5. The commodity demand information prediction method under multiple influence factors as claimed in claim 4, characterized in that: the training samples are used to establish the objective function of formula (3-11); the SVM training algorithm is used to solve (3-11) and obtain the solutions α_i and α_i^*, i = 1, 2, ..., n; the obtained Lagrange multipliers are substituted into formula (3-12), and the samples are used to predict the next day's demand.
CN201310656936.5A 2013-12-06 2013-12-06 Commodity demand information prediction method under multiple influence factors Pending CN103617459A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310656936.5A CN103617459A (en) 2013-12-06 2013-12-06 Commodity demand information prediction method under multiple influence factors

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310656936.5A CN103617459A (en) 2013-12-06 2013-12-06 Commodity demand information prediction method under multiple influence factors

Publications (1)

Publication Number Publication Date
CN103617459A true CN103617459A (en) 2014-03-05

Family

ID=50168163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310656936.5A Pending CN103617459A (en) 2013-12-06 2013-12-06 Commodity demand information prediction method under multiple influence factors

Country Status (1)

Country Link
CN (1) CN103617459A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101541030A (en) * 2009-05-06 2009-09-23 华为技术有限公司 Method for predicting data based on support vector machine and equipment thereof
CN102982383A (en) * 2012-05-15 2013-03-20 红云红河烟草(集团)有限责任公司 Energy supply and demand forecasting method based on support vector machine
CN102682219A (en) * 2012-05-17 2012-09-19 鲁东大学 Method for forecasting short-term load of support vector machine

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张昊 (Zhang Hao): "Demand forecasting model for short-life-cycle products based on SVM" (基于SVM的短生命周期产品需求预测模型), China Master's Theses Full-text Database (Electronic Journal) 《中国优秀硕士学位论文全文数据库(电子期刊)》, no. 2, 15 December 2011 (2011-12-15) *

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103971170B (en) * 2014-04-17 2017-09-29 北京百度网讯科技有限公司 The method and apparatus that a kind of change being used for characteristic information is predicted
US10474957B2 (en) 2014-04-17 2019-11-12 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for forecasting characteristic information change
CN103971170A (en) * 2014-04-17 2014-08-06 北京百度网讯科技有限公司 Method and device for forecasting changes of feature information
WO2015158149A1 (en) * 2014-04-17 2015-10-22 北京百度网讯科技有限公司 Method and device for forecasting changes of feature information
RU2670610C9 (en) * 2014-12-12 2018-11-26 Бэйцзин Цзиндун Сенчури Трэйдинг Ко., Лтд. Method and device for processing data of user operation
WO2016091148A1 (en) * 2014-12-12 2016-06-16 北京京东尚科信息技术有限公司 User action data processing method and device
RU2670610C1 (en) * 2014-12-12 2018-10-25 Бэйцзин Цзиндун Сенчури Трэйдинг Ко., Лтд. Method and device for processing data of user operation
CN104766144A (en) * 2015-04-22 2015-07-08 携程计算机技术(上海)有限公司 Order forecasting method and system
CN104794944A (en) * 2015-05-11 2015-07-22 临沂大学 Online shop sales training system and online shop sales training method
CN106709826A (en) * 2015-11-13 2017-05-24 湖南餐启科技有限公司 Restaurant turnover prediction method and device thereof
CN105825570A (en) * 2016-03-03 2016-08-03 孙腾 Network voting display method and system thereof
CN105894111A (en) * 2016-03-30 2016-08-24 天鸿泰(北京)科技有限公司 Energy consumption prediction method and device based on complementary fuzzy neural network
CN105894111B (en) * 2016-03-30 2020-02-04 天鸿泰(北京)科技有限公司 Energy consumption prediction method and device based on complementary fuzzy neural network
CN108154378A (en) * 2016-12-05 2018-06-12 财团法人资讯工业策进会 Computer device and method for predicting market demand of goods
CN107862555A (en) * 2017-11-30 2018-03-30 四川长虹电器股份有限公司 Forecasting system and method based on exponential smoothing
CN108492142A (en) * 2018-03-28 2018-09-04 联想(北京)有限公司 A kind of method, apparatus and server group calculating order rule
CN108596399A (en) * 2018-05-04 2018-09-28 国家邮政局邮政业安全中心 Method, apparatus, electronic equipment and the storage medium of express delivery amount prediction
CN110555713A (en) * 2018-05-31 2019-12-10 北京京东尚科信息技术有限公司 method and device for determining sales prediction model
CN109191192A (en) * 2018-08-21 2019-01-11 北京京东尚科信息技术有限公司 Data estimation method, apparatus and computer readable storage medium
CN109509030A (en) * 2018-11-15 2019-03-22 北京旷视科技有限公司 Method for Sales Forecast method and its training method of model, device and electronic system
CN109544076A (en) * 2018-11-28 2019-03-29 北京百度网讯科技有限公司 Method and apparatus for generating information
CN109544076B (en) * 2018-11-28 2021-06-18 北京百度网讯科技有限公司 Method and apparatus for generating information
CN109697531A (en) * 2018-12-24 2019-04-30 中铁第四勘察设计院集团有限公司 A kind of logistics park-hinterland Forecast of Logistics Demand method
CN109783485A (en) * 2018-12-30 2019-05-21 国网天津市电力公司电力科学研究院 Distribution historical metrology data bearing calibration based on data mining and support vector machines
CN109919710A (en) * 2019-01-25 2019-06-21 广州富港万嘉智能科技有限公司 A kind of method automatically generating procurement of commodities inventory, electronic equipment and storage medium
WO2020186380A1 (en) * 2019-03-15 2020-09-24 State Street Corporation Techniques to forecast future orders using deep learning
CN110517059A (en) * 2019-07-08 2019-11-29 广东工业大学 A kind of fashion handbag sales forecasting method based on random forest
CN111369058A (en) * 2020-03-05 2020-07-03 中国民用航空飞行学院 Forest fire fighting helicopter demand prediction method and system
CN111893791A (en) * 2020-07-17 2020-11-06 广州博依特智能信息科技有限公司 Method for optimizing operation of drying part of domestic paper making machine based on intelligent algorithm

Similar Documents

Publication Publication Date Title
CN103617459A (en) Commodity demand information prediction method under multiple influence factors
Lin et al. Empirical mode decomposition–based least squares support vector regression for foreign exchange rate forecasting
Ye et al. A novel forecasting method based on multi-order fuzzy time series and technical analysis
CN110070145B (en) LSTM hub single-product energy consumption prediction based on incremental clustering
Zhao Futures price prediction of agricultural products based on machine learning
US20210326696A1 (en) Method and apparatus for forecasting power demand
CN109492748B (en) Method for establishing medium-and-long-term load prediction model of power system based on convolutional neural network
CN104504475A (en) AR*-SVM (support vector machine) hybrid modeling based haze time series prediction method
Chen et al. Forecasting financial crises for an enterprise by using the Grey Markov forecasting model
CN112418476A (en) Ultra-short-term power load prediction method
CN117977568A (en) Power load prediction method based on nested LSTM and quantile calculation
CN113111924A (en) Electric power customer classification method and device
CN113240527A (en) Bond market default risk early warning method based on interpretable machine learning
Wang et al. Big data analytics for price forecasting in smart grids
CN116187835A (en) Data-driven-based method and system for estimating theoretical line loss interval of transformer area
CN102208030A (en) Bayesian-model-averaging-based model combing method on regularization path of support vector machine
Calabrese et al. Generalized extreme value regression for binary rare events data: an application to credit defaults
CN111126499A (en) Secondary clustering-based power consumption behavior pattern classification method
CN112330030B (en) System and method for predicting requirements of expansion materials
Hu et al. Prediction of passenger flow on the highway based on the least square suppoert vector machine
CN101807271A (en) Product demand forecasting method based on generalized adjacent substitution
CN103106329A (en) Training sample grouping construction method used for support vector regression (SVR) short-term load forecasting
CN115829418B (en) Method and system for constructing load characteristic portraits of power consumers suitable for load management
CN104143117A (en) Method for extracting correlation coefficient between special loads and day loads of power grid
CN111931992A (en) Power load prediction index selection method and device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
ASS Succession or assignment of patent right

Owner name: ZHONGCHU NANJING INTELLIGENT LOGISTICS TECHNOLOGY

Free format text: FORMER OWNER: LI JINGQUAN

Effective date: 20140826

C41 Transfer of patent application or patent right or utility model
COR Change of bibliographic data

Free format text: CORRECT: ADDRESS; FROM: 210093 NANJING, JIANGSU PROVINCE TO: 210000 NANJING, JIANGSU PROVINCE

TA01 Transfer of patent application right

Effective date of registration: 20140826

Address after: River Road, Gulou District of Nanjing city of Jiangsu Province, No. 1 210000

Applicant after: The Nanjing smart Logistics Technology Co. Ltd.

Address before: 210093, 5, lane, Gulou District, Jiangsu, Nanjing

Applicant before: Li Jingquan

ASS Succession or assignment of patent right

Owner name: LI JINGQUAN

Free format text: FORMER OWNER: ZHONGCHU NANJING INTELLIGENT LOGISTICS TECHNOLOGY CO., LTD.

Effective date: 20141209

C41 Transfer of patent application or patent right or utility model
COR Change of bibliographic data

Free format text: CORRECT: ADDRESS; FROM: 210000 NANJING, JIANGSU PROVINCE TO: 210093 NANJING, JIANGSU PROVINCE

TA01 Transfer of patent application right

Effective date of registration: 20141209

Address after: 210093, 5, lane, Gulou District, Jiangsu, Nanjing

Applicant after: Li Jingquan

Address before: River Road, Gulou District of Nanjing city of Jiangsu Province, No. 1 210000

Applicant before: The Nanjing smart Logistics Technology Co. Ltd.

ASS Succession or assignment of patent right

Owner name: NANJING LUOJIE SIMING LOGISTICS TECHNOLOGY CO., LT

Free format text: FORMER OWNER: LI JINGQUAN

Effective date: 20150105

C41 Transfer of patent application or patent right or utility model
COR Change of bibliographic data

Free format text: CORRECT: ADDRESS; FROM: 210093 NANJING, JIANGSU PROVINCE TO: 210000 NANJING, JIANGSU PROVINCE

TA01 Transfer of patent application right

Effective date of registration: 20150105

Address after: Xuanwu District of Nanjing City, Jiangsu province 210000 Houzaimen Village No. 95

Applicant after: Nanjing Luojiesi Ming Logistics Technology Co. Ltd.

Address before: 210093, 5, lane, Gulou District, Jiangsu, Nanjing

Applicant before: Li Jingquan

ASS Succession or assignment of patent right

Owner name: ZHONGCHU NANJING INTELLIGENT LOGISTICS TECHNOLOGY

Free format text: FORMER OWNER: NANJING LUOJIE SIMING LOGISTICS TECHNOLOGY CO., LTD.

Effective date: 20150415

C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20150415

Address after: 210000 No. 1 River Road, Gulou District, Jiangsu, Nanjing

Applicant after: The Nanjing smart Logistics Technology Co. Ltd.

Address before: Xuanwu District of Nanjing City, Jiangsu province 210000 Houzaimen Village No. 95

Applicant before: Nanjing Luojiesi Ming Logistics Technology Co. Ltd.

RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20140305