CN110969378A - Warehouse article configuration method and device - Google Patents

Publication number: CN110969378A (application CN201811139144.XA)
Authority: CN (China)
Prior art keywords: order, warehouse, vector, item, equal
Legal status: Granted; Active
Other languages: Chinese (zh)
Other versions: CN110969378B
Inventors: 陈轶雄, 李朝峰, 施昱辉
Assignees (current and original): Beijing Jingdong Century Trading Co Ltd; Beijing Jingdong Shangke Information Technology Co Ltd
Application filed by Beijing Jingdong Century Trading Co Ltd and Beijing Jingdong Shangke Information Technology Co Ltd
Publication of CN110969378A; application granted and published as CN110969378B

Classifications

    • G06Q10/087 — Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • G06Q10/04 — Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06N3/045 — Combinations of networks
    • G06N3/048 — Activation functions
    • G06N3/084 — Backpropagation, e.g. using gradient descent


Abstract

The application discloses a warehouse article configuration method and device, comprising: generating an order splitting rate function based on a first neural network model from the order data and the article distribution data of each warehouse, the warehouses including a main warehouse and clone warehouses; based on a second neural network model, generating article configuration information for the corresponding clone warehouse with the order splitting rate function as the objective function; and configuring articles for the corresponding clone warehouse according to the article configuration information. By adopting the invention, the distribution cost of articles can be effectively reduced.

Description

Warehouse article configuration method and device
Technical Field
The invention relates to the field of warehouse logistics, in particular to a warehouse article configuration method and device.
Background
At present, most online retail e-commerce suppliers adopt a one-item-one-warehouse mode: each Stock Keeping Unit (SKU) is stored in only one warehouse, i.e. each SKU has only one warehouse attribute. In recent years, with the rapid development of the e-commerce industry, customer demand on online retailers has kept growing, and SKU categories have increased exponentially. Under these conditions, the traditional one-item-one-warehouse mode directly causes the order splitting rate to rise rapidly, greatly increasing e-commerce operating costs. To meet long-term business development needs, the concept of the clone warehouse was proposed, breaking the limitation that a SKU has only one warehouse attribute and allowing a multi-warehouse attribute. In the one-item-multi-warehouse case, a warehouse other than the main warehouse that stores the SKU is called a clone warehouse.
At present, goods configuration schemes for a clone warehouse generally select which goods to place into the clone warehouse by ranking goods according to how often they appear in the same orders as goods already in the clone warehouse (the candidate warehouse). The method is intuitive and simple. Its main idea is that if certain goods frequently appear in the same orders as the goods in the clone warehouse, placing those goods into the clone warehouse increases the probability that the corresponding orders are fully satisfied, and because the number of such orders is large, the order splitting rate is greatly reduced. Conversely, if certain goods rarely appear in orders together with the goods in the clone warehouse, placing them into the clone warehouse satisfies only a very small fraction of orders and therefore contributes little to reducing the order splitting rate. Accordingly, goods can be ranked from largest to smallest by the number of orders they share with the goods in the clone warehouse, and higher-ranked goods are placed into the clone warehouse with priority.
In the process of implementing the invention, the inventors found that the above clone warehouse goods configuration scheme suffers from high distribution cost.
That scheme considers only the same-order sales feature, while in practice there are implicit correlation features between goods. For example, different goods may have some degree of correlation in being purchased together that is not reflected in same-order sales volume. Therefore, in practical application, the existing clone warehouse article configuration scheme cannot effectively reduce the order splitting rate. The sub-orders produced by splitting usually need to be packaged and delivered separately, which increases the cost of distributing the articles. Hence, the above clone warehouse goods arrangement scheme has the problem of high distribution cost.
Disclosure of Invention
In view of the above, the main objective of the present invention is to provide a method and an apparatus for allocating warehouse items, which can effectively save the distribution cost of the items.
In order to achieve the above purpose, the embodiment of the present invention provides a technical solution:
a warehouse item configuration method, comprising:
generating an order splitting rate function based on a first neural network model according to the order data and the article distribution data of each warehouse; the warehouse comprises a main warehouse and a clone warehouse;
based on a second neural network model, generating corresponding article configuration information of the clone warehouse by taking the order splitting rate function as an objective function;
and configuring the articles for the corresponding cloning warehouse according to the article configuration information.
Preferably, the generating the order splitting rate function includes:

For each order i, calculating the Hadamard product of the data vector L_i corresponding to the order and the item distribution data vector H_j of each warehouse, obtaining T vectors X_{i,j} of dimension M×1:

X_{i,j} = L_i ∘ H_j

wherein L_i is an M×1 vector; each component L_k^i of L_i corresponds to an item k: L_k^i = 1 when item k is present in order i, and L_k^i = 0 when item k is not present in order i; 1 ≤ i ≤ N, i is the order number, N is the total number of orders; 1 ≤ k ≤ M, k is the item number, M is the number of item types. H_j is an M×1 vector; j is the warehouse number, 1 ≤ j ≤ T, T is the total number of warehouses; each component H_k^j of H_j corresponds to an item k: H_k^j = 1 when item k is present in warehouse j, and H_k^j = 0 when item k is not present in warehouse j.

For each order i, concatenating the T vectors X_{i,j} corresponding to the order in ascending order of j to obtain a vector X_i of dimension (M×T)×1.

For each such vector X_i, calculating the splitting rate Y_i corresponding to X_i.

Taking each pair (X_i, Y_i) as a training sample, with X_i as the input and Y_i as the target value, training the first neural network model with the objective function

min Σ_{i=1}^{N} (F(X_i) − Y_i)²

to obtain, at the minimum of the objective value, the corresponding order splitting rate function Y = F(X).
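The feature construction above (Hadamard product of the order vector with each warehouse vector, then concatenation in ascending warehouse order) can be sketched in Python with hypothetical toy data; the values of L_i and H below are illustrative, not from the patent:

```python
import numpy as np

# Toy data: M=3 item types, T=2 warehouses, one order.
# L_i[k] = 1 if item k appears in order i; H[j][k] = 1 if warehouse j stocks item k.
L_i = np.array([1, 0, 1])           # order contains items 1 and 3
H = np.array([[1, 1, 0],            # warehouse 1 stocks items 1 and 2
              [0, 0, 1]])           # warehouse 2 stocks item 3

# X_{i,j} = L_i ∘ H_j (Hadamard product), one M-vector per warehouse.
X_ij = L_i * H                      # shape (T, M)

# X_i: concatenate the T vectors in ascending warehouse order -> (M*T,) vector.
X_i = X_ij.reshape(-1)
print(X_i)                          # [1 0 0 0 0 1]
```

Row-major reshaping places the warehouse-1 slice before the warehouse-2 slice, which matches the "ascending order of j" concatenation described above.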
Preferably, the generating the order splitting rate function includes:

For each order i, calculating the Hadamard product X_{i,j} = L_i ∘ H_j of the data vector L_i corresponding to order i and the item distribution data vector H_j of each warehouse, obtaining T vectors X_{i,j} of dimension M×1.

Wherein L_i is an M×1 vector, L_i = [L_1^i, ..., L_k^i, ..., L_M^i]; each component L_k^i corresponds to an item k: L_k^i = 1 when item k is present in order i, and L_k^i = 0 when item k is not present in order i; 1 ≤ i ≤ N, i is the order number, N is the total number of orders; 1 ≤ k ≤ M, k is the item number, M is the number of item types. H_j is an M×1 vector; j is the warehouse number, 1 ≤ j ≤ T, T is the total number of warehouses; each component H_k^j corresponds to an item k: H_k^j = 1 when item k is present in warehouse j, and H_k^j = 0 when item k is not present in warehouse j.

For each order i, concatenating the T vectors X_{i,j} corresponding to order i in ascending order of j to obtain a vector of dimension (M×T)×1 corresponding to order i.

Selecting a clone warehouse from the warehouses, and generating R different item distribution data vectors corresponding to the clone warehouse by randomly changing 0-components of the selected clone warehouse's item distribution data vector to 1. For each order i, calculating the Hadamard product of the data vector L_i corresponding to order i with each of the R different item distribution data vectors to obtain R vectors of dimension M×1 corresponding to order i, and concatenating each of these R vectors, in ascending order of j, with the M×1 vectors X_{i,j} corresponding to all warehouses other than the selected clone warehouse, to obtain R vectors of dimension (M×T)×1 corresponding to order i.

For each vector X_i of dimension (M×T)×1 so obtained, calculating the splitting rate Y_i corresponding to X_i.

Taking each (X_i, Y_i) as a training sample, with X_i as the input and Y_i as the target value, training the first neural network model with the objective function

min Σ (F(X_i) − Y_i)², summed over all training samples,

to obtain, at the minimum of the objective value, the corresponding order splitting rate function Y = F(X).
Preferably, the generating the item configuration information of the corresponding clone warehouse by using the order splitting rate function as an objective function based on the second neural network model includes:

Selecting a clone warehouse s from the warehouses, and replacing the item distribution data vector of the selected clone warehouse s with a non-negative real-valued continuous vector θ = [θ_1, ..., θ_k, ..., θ_M], 1 ≤ k ≤ M.

For each order i, calculating the Hadamard product of the data vector L_i corresponding to order i and the item distribution data vector of the clone warehouse s to obtain an M×1 vector

X'_{i,s} = L_i ∘ θ.

For each order i, concatenating X'_{i,s} with the M×1 vectors X_{i,j} corresponding to all warehouses other than the clone warehouse s, in ascending order of warehouse number, to obtain a vector X'_i of dimension (M×T)×1 corresponding to order i.

Taking the vector X'_i corresponding to each order i as an input sample, with the mapping X'_{i,s} = L_i ∘ θ as the activation of the input layer, with the trained order splitting rate function F(·) as a fixed parameter, and with

min_θ Σ_{i=1}^{N} F(X'_i)

as the objective function, training the second neural network model to obtain the vector θ corresponding to the minimum of the objective function. The fully connected layers and the output layer of the second neural network model are the same as those of the first neural network model; the initial value of each component of θ is 0 or a positive real number close to zero, and any component of θ that becomes negative during training is forcibly reset to zero.

According to the vector θ obtained after training, generating the corresponding item configuration information for clone warehouse s on the principle that the larger the component θ_k, the more suitable the corresponding item k is to be placed into clone warehouse s.
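The final ranking step above can be sketched as follows, under the assumption that the trained vector θ is available as a plain list; the helper top_items and the example θ values are hypothetical:

```python
# Rank items for the clone warehouse by the trained theta vector
# (larger component = better candidate), then keep the top-n.
theta = [0.02, 0.91, 0.0, 0.47, 0.33]   # assumed output of the second model

def top_items(theta, n):
    # Item numbers are 1-based, matching the patent's notation (1 <= k <= M).
    ranked = sorted(range(1, len(theta) + 1),
                    key=lambda k: theta[k - 1], reverse=True)
    return ranked[:n]

print(top_items(theta, 3))  # [2, 4, 5]
```

In practice the cutoff n would be set by the clone warehouse's capacity; the patent only states the ranking principle.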
Preferably, the first neural network model and the second neural network model are back propagation (BP) neural network models.
A warehouse item configuration device, comprising:
the order splitting rate generating unit is used for generating an order splitting rate function based on the first neural network model according to the order data and the article distribution data of each warehouse; the warehouse comprises a main warehouse and a clone warehouse;
the configuration information generating unit is used for generating corresponding article configuration information of the clone warehouse by taking the order splitting rate function as an objective function based on a second neural network model;
and the configuration unit is used for configuring the articles for the corresponding clone warehouse according to the article configuration information.
Preferably, the order splitting rate generating unit is configured to: for each order i, calculate the Hadamard product X_{i,j} = L_i ∘ H_j of the data vector L_i corresponding to the order and the item distribution data vector H_j of each warehouse, obtaining T vectors X_{i,j} of dimension M×1, where L_i is an M×1 vector with components L_k^i (L_k^i = 1 when item k is present in order i, 0 otherwise; 1 ≤ i ≤ N, N the total number of orders; 1 ≤ k ≤ M, M the number of item types) and H_j is an M×1 vector with components H_k^j (H_k^j = 1 when item k is present in warehouse j, 0 otherwise; 1 ≤ j ≤ T, T the total number of warehouses); for each order i, concatenate the T vectors X_{i,j} corresponding to the order in ascending order of j to obtain a vector X_i of dimension (M×T)×1; for each such vector X_i, calculate the splitting rate Y_i corresponding to X_i; and take each (X_i, Y_i) as a training sample, with X_i as the input and Y_i as the target value, training the first neural network model with the objective function min Σ_{i=1}^{N} (F(X_i) − Y_i)² to obtain, at the minimum of the objective value, the corresponding order splitting rate function Y = F(X).
Preferably, the order splitting rate generating unit is configured to: for each order i, calculate the Hadamard product X_{i,j} = L_i ∘ H_j of the data vector L_i corresponding to order i and the item distribution data vector H_j of each warehouse, obtaining T vectors X_{i,j} of dimension M×1, where L_i = [L_1^i, ..., L_k^i, ..., L_M^i] is an M×1 vector (L_k^i = 1 when item k is present in order i, 0 otherwise; 1 ≤ i ≤ N, N the total number of orders; 1 ≤ k ≤ M, M the number of item types) and H_j is an M×1 vector with components H_k^j (H_k^j = 1 when item k is present in warehouse j, 0 otherwise; 1 ≤ j ≤ T, T the total number of warehouses); for each order i, concatenate the T vectors X_{i,j} in ascending order of j to obtain a vector of dimension (M×T)×1 corresponding to order i; select a clone warehouse from the warehouses and generate R different item distribution data vectors corresponding to the clone warehouse by randomly changing 0-components of the selected clone warehouse's item distribution data vector to 1; for each order i, calculate the Hadamard product of the data vector L_i with each of the R different item distribution data vectors to obtain R vectors of dimension M×1 corresponding to order i, and concatenate each of these R vectors, in ascending order of j, with the M×1 vectors X_{i,j} corresponding to all warehouses other than the selected clone warehouse, obtaining R vectors of dimension (M×T)×1 corresponding to order i; for each vector X_i of dimension (M×T)×1 so obtained, calculate the splitting rate Y_i corresponding to X_i; and take each (X_i, Y_i) as a training sample, with X_i as the input and Y_i as the target value, training the first neural network model with the objective function min Σ (F(X_i) − Y_i)², summed over all training samples, to obtain, at the minimum of the objective value, the corresponding order splitting rate function Y = F(X).
Preferably, the configuration information generating unit is configured to: select a clone warehouse s from the warehouses, and replace the item distribution data vector of the selected clone warehouse s with a non-negative real-valued continuous vector θ = [θ_1, ..., θ_k, ..., θ_M], 1 ≤ k ≤ M; for each order i, calculate the Hadamard product of the data vector L_i corresponding to order i and the item distribution data vector of the clone warehouse s to obtain an M×1 vector X'_{i,s} = L_i ∘ θ; for each order i, concatenate X'_{i,s} with the M×1 vectors X_{i,j} corresponding to all warehouses other than the clone warehouse s, in ascending order of warehouse number, to obtain a vector X'_i of dimension (M×T)×1 corresponding to order i; take the vector X'_i corresponding to each order i as an input sample, with the mapping X'_{i,s} = L_i ∘ θ as the activation of the input layer, with the trained order splitting rate function F(·) as a fixed parameter, and with min_θ Σ_{i=1}^{N} F(X'_i) as the objective function, training the second neural network model to obtain the vector θ corresponding to the minimum of the objective function, wherein the fully connected layers and the output layer of the second neural network model are the same as those of the first neural network model, the initial value of each component of θ is 0 or a positive real number close to zero, and any component of θ that becomes negative during training is forcibly reset to zero; and, according to the vector θ obtained after training, generate the corresponding item configuration information for clone warehouse s on the principle that the larger the component θ_k, the more suitable the corresponding item k is to be placed into clone warehouse s.
Preferably, the first neural network model and the second neural network model are back propagation (BP) neural network models.
A warehouse item configuration device, comprising:
a memory; and a processor coupled to the memory, the processor configured to perform the above-described method of warehouse item configuration based on instructions stored in the memory.
A computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the warehouse item configuration method described above.
In summary, in the warehouse item configuration scheme provided by the embodiments of the present invention, a neural network is used to analyze the order data and the item distribution data of each warehouse to obtain an optimized order splitting rate function, and a neural network is then used, with that order splitting rate function as the optimization target, to generate the item configuration information of the corresponding clone warehouse. The resulting clone-warehouse item configuration can therefore effectively reduce the order splitting rate and thereby save item delivery cost.
Drawings
FIG. 1 is a schematic flow chart of a method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
In the embodiment of the invention, the neural network is utilized to mine the implicit characteristics among the articles so as to reduce the distribution cost of the articles by reducing the order splitting rate.
A neural network (NN) is a computational model composed of a large number of nodes (or "neurons") and the connections between them. Each node represents a particular output function, called the excitation (activation) function. Each connection between two nodes carries a weighted value, called a weight, applied to the signal passing through the connection; the weights correspond to the memory of the artificial neural network. The output of the network differs according to the connection topology, the weight values and the excitation functions. The network itself is usually an approximation to some algorithm or function in nature, or an expression of a logical strategy. Artificial neural networks are usually optimized by a learning method based on mathematical statistics; they are thus also a practical application of mathematical statistics, by which a large number of local structures expressible by functions can be obtained with standard statistical methods.
Neural networks are built from neurons. The neuron is a biological model based on the nerve cells of the biological nervous system. In studying the biological nervous system to explore the mechanisms of artificial intelligence, researchers digitized the neuron, producing a mathematical neuron model. A large number of neurons of the same form connected together constitute a neural network. A neural network is a highly nonlinear dynamical system: although the structure and function of each individual neuron are not complex, the dynamic behavior of the network as a whole is quite complex, so neural networks can express a wide range of phenomena of the real physical world. The main appeal of neural network models lies in the following:
1) Parallel distributed processing.
2) High robustness and fault tolerance.
3) Distributed storage and learning capability.
4) The ability to closely approximate complex nonlinear relationships.
The back propagation (BP) network was proposed in 1986 by a group of scientists led by Rumelhart and McClelland. It is a multi-layer feedforward network trained by the error back-propagation algorithm and is one of the most widely applied neural network models. A BP network can learn and store a large number of input-output pattern mappings without the mathematical equations describing those mappings being given in advance. Its learning rule uses the steepest descent method, continuously adjusting the network's weights and thresholds through back propagation so as to minimize the network's sum of squared errors. The basic principle of the BP algorithm is to use the output error to estimate the error of the layer immediately preceding the output layer, and then use that error to estimate the error of the layer before it; propagating the error of one layer backwards in this way yields error estimates for all the other layers. The topology of a BP neural network model comprises an input layer, one or more hidden layers, and an output layer.

Input layer: each neuron of the input layer receives input information from the outside and passes it to the neurons of the hidden layer.

Hidden layer: the hidden layer is the internal information processing layer, responsible for information transformation; according to the required information-processing capability, it can be designed as a single-hidden-layer or multi-hidden-layer structure. The information passed by the last hidden layer to the neurons of the output layer is further processed to complete one forward propagation pass of learning.

Output layer: outputs the information processing result to the outside.
Fig. 1 is a schematic flow chart of a warehouse article configuration method implemented in an embodiment of the present invention, as shown in fig. 1, the embodiment mainly includes:
step 101, generating an order splitting rate function based on a first neural network model according to order data and article distribution data of each warehouse; the warehouses include a primary warehouse and a clone warehouse.
This step uses the neural network to generate an optimal order splitting rate function, so that in subsequent steps a neural network can take that function as the target to obtain the item configuration information of the corresponding clone warehouse; configuring warehouse items based on this configuration information then yields the optimal order splitting rate.
Preferably, the first neural network model may be a Back Propagation (BP) neural network model.
Preferably, step 101 may generate the order splitting rate function in either of the following two ways:
the first method comprises the following steps:
step 101a1, for each order i, calculating the data vector L corresponding to the orderiAn item distribution data vector H for each of the warehousesjHadamard product of the Hadamard to obtain T M × 1 dimensional vectors Xi,j
Wherein L isiIs an M x 1 dimensional vector, LiEach component L ink iCorresponding to an item k, L when item k is present in order ik i1, L when item k is not present in order ik iI is more than or equal to 0 and less than or equal to 1 and less than or equal to N, i is an order number, N is the total number of orders, k is more than or equal to 1 and less than or equal to M, k is an article number, and M is the number of article types; hjIs an Mx 1-dimensional vector, j is a warehouse number, j is more than or equal to 1 and less than or equal to T, T is the total number of warehouses, HjEach component H ink jCorresponding to an item k, H when the item k is present in the warehouse jk jWhen the product is 1 ═ cH when k is not present in warehouse jk j=0。
Step 101a2, for each order i, according to the ascending order of j, the T X numbers corresponding to the orderi,jPerforming serial connection to obtain a (M × T) × 1-dimensional vector Xi
Step 101a3, for each said (M T) X1 dimensional vector XiCalculating the vector XiCorresponding rate of stripping Yi
Step 101a4: take each (X_i, Y_i) as a training sample, with X_i as the input and Y_i as the target value, and train the first neural network model with the objective function [formula given only as an image in the original] to obtain the order splitting rate function [formula image] at which the objective function value is minimum.
In this step, all of the obtained (X_i, Y_i) pairs are used as training samples to train the first neural network model against the objective function [formula image]. By continuously optimizing this loss function, which measures the difference between the actual order splitting rate and the target order splitting rate, the error between the two is minimized, yielding the order splitting rate function with the smallest error.
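The training just described can be sketched as a one-hidden-layer back propagation network minimizing a mean squared error. Since the patent's objective function is given only as a formula image, the MSE loss, the layer sizes, and the learning rate here are assumptions.

```python
import numpy as np

def train_split_rate_model(X, Y, hidden=8, lr=0.1, epochs=2000, seed=0):
    """Fit Y ~ f(X) with a 1-hidden-layer BP network (MSE loss).

    X : (n_samples, d) inputs, Y : (n_samples,) split rates in [0, 1].
    Returns a predict(x) closure approximating the split rate function.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0, 0.5, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    Y = Y.reshape(-1, 1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)          # forward pass
        out = h @ W2 + b2
        err = out - Y                     # gradient of MSE/2 w.r.t. out
        gW2 = h.T @ err / n; gb2 = err.mean(0)
        dh = (err @ W2.T) * (1 - h ** 2)  # back propagate through tanh
        gW1 = X.T @ dh / n; gb1 = dh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1    # gradient descent step
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda x: (np.tanh(x @ W1 + b1) @ W2 + b2).ravel()

# Fit on a toy set of order feature vectors and their split rates.
X = np.array([[1, 0, 1, 0], [0, 1, 0, 1], [1, 1, 1, 1], [0, 0, 1, 1.]])
Y = np.array([0.0, 1.0, 0.5, 0.5])
f = train_split_rate_model(X, Y)
```

The returned closure plays the role of the learned order splitting rate function that step 102 later treats as a fixed objective.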
The second method comprises the following steps:
Step 101b1: for each order i, calculate the Hadamard product of the data vector L_i corresponding to order i and the article distribution data vector H_j of each of the warehouses, obtaining T M×1-dimensional vectors X_{i,j}.
Here L_i is an M×1-dimensional vector, L_i = [L^i_1, ..., L^i_k, ..., L^i_M], whose component L^i_k corresponds to article k: L^i_k = 1 when article k is present in order i and L^i_k = 0 when it is not; 1 ≤ i ≤ N, where i is the order number and N is the total number of orders; 1 ≤ k ≤ M, where k is the article number and M is the number of article types. H_j is an M×1-dimensional vector, where j is the warehouse number, 1 ≤ j ≤ T, and T is the total number of warehouses; its component H^j_k corresponds to article k: H^j_k = 1 when article k is present in warehouse j and H^j_k = 0 when it is not.
Step 101b2: for each order i, concatenate the T vectors X_{i,j} corresponding to order i in ascending order of j, obtaining an (M×T)×1-dimensional vector corresponding to order i.
Step 101b3: select a clone warehouse from the warehouses, and generate R different article distribution data vectors corresponding to the clone warehouse by randomly changing 0 components in the article distribution data vector of the selected clone warehouse to 1. For each order i, calculate the Hadamard product of the data vector L_i corresponding to order i and each of the R different article distribution data vectors, obtaining R M×1-dimensional vectors corresponding to order i; then concatenate each of these R M×1-dimensional vectors, in ascending order of j, with the M×1-dimensional vectors X_{i,j} corresponding to all warehouses other than the selected clone warehouse, obtaining R (M×T)×1-dimensional vectors corresponding to order i.
Step 101b3 differs from the first method in that a clone warehouse is selected and its article distribution data vector is transformed to obtain R different article distribution data vectors, which are then used to generate incremental data: for each order i, R (M×T)×1-dimensional vectors are produced. This increases the number of samples available for training the neural network model in the subsequent steps and improves the generality of the training result.
Specifically, the clone warehouse may be selected in any manner in this step. For convenience, the first warehouse may be set as the clone warehouse, and the incremental data generated on that basis.
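The incremental-data trick in step 101b3 randomly flips 0 components of the clone warehouse's article distribution vector to 1. A minimal sketch follows; the choice of one flip per variant is an assumption, since the patent does not fix how many components change.

```python
import numpy as np

def clone_variants(H_clone, R, n_flips=1, seed=0):
    """Generate R variant article vectors by flipping random 0s to 1s."""
    rng = np.random.default_rng(seed)
    variants = []
    for _ in range(R):
        v = H_clone.copy()
        zeros = np.flatnonzero(v == 0)      # indices of absent articles
        if zeros.size:
            picked = rng.choice(zeros, size=min(n_flips, zeros.size),
                                replace=False)
            v[picked] = 1                   # stock those articles in the clone
        variants.append(v)
    return variants

H_clone = np.array([1, 0, 0, 1, 0])   # clone warehouse stocks articles 1, 4
vs = clone_variants(H_clone, R=3)     # 3 perturbed stock vectors
```

Each variant is then combined with every order exactly as in steps 101b1 and 101b2, multiplying the number of training samples by R for the clone warehouse slot.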
Step 101b4: for each (M×T)×1-dimensional vector X_i, calculate the corresponding order splitting rate Y_i.
In this step, the specific method of calculating the corresponding order splitting rate Y_i from the vector X_i is known to those skilled in the art and is not described again here.
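The patent leaves the computation of Y_i to the skilled reader. Purely as an illustrative assumption, one simple proxy counts how many warehouses a greedy cover needs to fulfil the order and normalizes the extra parcels:

```python
import numpy as np

def split_rate(L, H):
    """Greedy proxy for the order splitting rate of one order.

    L : (M,) 0/1 order vector, H : (T, M) warehouse stock matrix.
    Returns (warehouses used - 1) / (T - 1), in [0, 1]; 0 means the
    order ships whole from one warehouse. Purely an assumed proxy,
    not the patent's definition.
    """
    remaining = L.astype(bool)
    used = 0
    while remaining.any():
        covers = (H.astype(bool) & remaining).sum(axis=1)
        best = int(np.argmax(covers))      # warehouse covering most articles
        if covers[best] == 0:
            break                          # some articles stocked nowhere
        remaining &= ~H[best].astype(bool)
        used += 1
    return (used - 1) / max(H.shape[0] - 1, 1) if used else 0.0

L = np.array([1, 1, 0, 1])
H = np.array([[1, 1, 0, 0],
              [0, 0, 1, 1]])
rate = split_rate(L, H)   # order needs both warehouses -> 1.0
```

Any monotone measure of how many sub-orders an order splits into would serve the same role as the training target Y_i.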
Step 101b5: take each (X_i, Y_i) as a training sample, with X_i as the input and Y_i as the target value, and train the first neural network model with the objective function [formula image] to obtain the order splitting rate function [formula image] at which the objective function value is minimum.
In this step, all of the obtained (X_i, Y_i) pairs are used as training samples to train the first neural network model against the objective function [formula image]. By continuously optimizing this loss function, which measures the difference between the actual order splitting rate and the target order splitting rate, the error between the two is minimized, yielding the order splitting rate function with the smallest error.
Of the two methods, the second additionally generates incremental data compared with the first, so that the first neural network model adapts better to different distributions of articles in the clone warehouse, and overfitting is effectively reduced.
Step 102: based on a second neural network model, generate the article configuration information of the corresponding clone warehouse with the order splitting rate function as the objective function.
In step 102, the optimal order splitting rate function obtained in step 101 is used as the objective function, and the second neural network model is used to obtain the article configuration information of the clone warehouse corresponding to that function; this ensures that the article configuration information yields the optimal order splitting rate, and the distribution cost is thereby effectively reduced.
Preferably, the second neural network model may be a back propagation BP neural network model.
Preferably, in step 102, the following method may be used to generate the item configuration information of the corresponding clone warehouse:
Step 1021: select a clone warehouse s from the warehouses, and modify the article distribution data vector of the selected clone warehouse s into a non-negative real-valued continuous vector θ = [θ_1, ..., θ_k, ..., θ_M], 1 ≤ k ≤ M.
In practical application, the clone warehouse s may be selected in any manner in this step. Preferably, the clone warehouse s is the same clone warehouse selected for generating the incremental data in step 101b3.
Step 1022: for each order i, calculate the Hadamard product of the data vector L_i corresponding to order i and the article distribution data vector H_s of the clone warehouse s, obtaining an M×1-dimensional vector X'_{i,s} [formula given only as an image in the original].
Step 1023: for each order i, concatenate X'_{i,s} corresponding to order i with the M×1-dimensional vectors X_{i,j} corresponding to all warehouses other than the clone warehouse s, in ascending order of the corresponding warehouse number, obtaining an (M×T)×1-dimensional vector X'_i corresponding to order i.
Step 1024: take the vector X'_i corresponding to each order i as the input samples, use [formula image] as the activation function, fix the parameters [formula image] corresponding to the order splitting rate function, and train the second neural network model with [formula image] as the objective function, obtaining the vector θ corresponding to the minimum of the objective function [formula image].
And the tan () is a hyperbolic tangent function, the full connection layer and the output layer of the second neural network model are the same as those of the first neural network model, the initial value of the vector theta is 0 or a positive real number close to zero, and the vector theta is forcibly updated to zero if the vector theta is changed into a negative number in the training process.
The quantity tanh(L^i_k · θ_k) has a value range of [0, 1] and represents the probability that article k appears in the sub-order split off to the clone warehouse. When L^i_k = 0 (i.e., order i does not contain article k) or θ_k = 0 (i.e., the clone warehouse does not stock article k), tanh(L^i_k · θ_k) = 0. tanh(L^i_k · θ_k) increases monotonically with θ_k; when θ_k is very large, tanh(L^i_k · θ_k) approaches 1, i.e., the probability that article k appears in the sub-order split off to the clone warehouse is 1. Thus, the size of θ_k represents the probability that the clone warehouse stocks article k.
In this step, while the order splitting rate parameters (i.e., the parameters of the order splitting rate function, shown only as a formula image in the original) are kept unchanged, the mapping parameter vector θ between the order data and the sub-order data split off to the clone warehouse is continuously updated through the second neural network model, so that the loss function, namely the order splitting rate function, reaches its minimum value. The resulting mapping parameter vector θ reflects which articles are suitable for placement in the clone warehouse.
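The θ update in step 1024 amounts to minimizing the fixed order splitting rate function over θ while projecting θ back to the non-negative orthant. A sketch under stated assumptions: gradients are estimated here by finite differences instead of back propagation through the fixed first network, and the split rate proxy `f` below is a toy stand-in, not the patent's function.

```python
import numpy as np

def optimize_theta(orders, X_rest, f_split, M, lr=0.05, epochs=500):
    """Projected gradient descent on theta (sketch of step 1024).

    orders : (n, M) order article vectors L_i for the clone slot.
    X_rest : (n, d_rest) fixed concatenated features of the other warehouses.
    f_split: black-box split rate function; its gradient is estimated by
             central finite differences (an assumption for illustration).
    """
    theta = np.full(M, 1e-3)                 # start near zero, non-negative
    def loss(t):
        feats = np.tanh(orders * t)          # tanh(L^i_k * theta_k)
        X = np.hstack([feats, X_rest])
        return f_split(X).mean()             # mean split rate over orders
    for _ in range(epochs):
        grad = np.zeros(M)
        for k in range(M):                   # finite-difference gradient
            e = np.zeros(M); e[k] = 1e-4
            grad[k] = (loss(theta + e) - loss(theta - e)) / 2e-4
        theta -= lr * grad
        theta[theta < 0] = 0.0               # force negative values to zero
    return theta

# Toy split-rate proxy: the rate drops as the clone covers more articles.
orders = np.array([[1.0, 0.0, 1.0], [1.0, 1.0, 0.0]])
X_rest = np.zeros((2, 0))
f = lambda X: 1.0 - X[:, :3].sum(axis=1) / 3.0
theta = optimize_theta(orders, X_rest, f, M=3)
```

Components of θ grow fastest for articles that appear in many orders, matching the patent's reading of θ_k as the probability that the clone warehouse should stock article k.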
Step 1025: from the vector θ obtained after the training, generate the corresponding article configuration information for the clone warehouse s, on the principle that the larger the component θ_k, the more suitable the corresponding article is for placement in the clone warehouse s.
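Step 1025 reduces to ranking articles by their θ component. A sketch follows; the top-n cutoff rule is an assumption, since the patent only states that larger components are more suitable.

```python
import numpy as np

def articles_for_clone(theta, top_n):
    """Return article numbers (1-based, as in the patent) ranked by theta_k."""
    order = np.argsort(-theta)          # descending by component size
    return [int(k) + 1 for k in order[:top_n]]

theta = np.array([0.02, 0.91, 0.0, 0.47])
top = articles_for_clone(theta, top_n=2)   # -> [2, 4]
```

In practice top_n would be set by the clone warehouse's capacity or by a threshold on θ_k.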
Step 103: configure the articles for the corresponding clone warehouse according to the article configuration information.
Fig. 2 is a schematic structural diagram of an embodiment of a warehouse article configuration device corresponding to the above method embodiment. As shown in fig. 2, the warehouse article configuration device mainly includes:
the order splitting rate generating unit is used for generating an order splitting rate function based on the first neural network model according to the order data and the article distribution data of each warehouse; the warehouse comprises a main warehouse and a clone warehouse;
the configuration information generating unit is used for generating corresponding article configuration information of the clone warehouse by taking the order splitting rate function as an objective function based on a second neural network model;
and the configuration unit is used for configuring the articles for the corresponding clone warehouse according to the article configuration information.
Preferably, the order splitting rate generating unit is configured to: for each order i, calculate the Hadamard product of the data vector L_i corresponding to the order and the article distribution data vector H_j of each of the warehouses, obtaining T M×1-dimensional vectors X_{i,j}, where L_i is an M×1-dimensional vector whose component L^i_k corresponds to article k (L^i_k = 1 when article k is present in order i and L^i_k = 0 when it is not; 1 ≤ i ≤ N, i is the order number, N is the total number of orders; 1 ≤ k ≤ M, k is the article number, M is the number of article types), and H_j is an M×1-dimensional vector (j is the warehouse number, 1 ≤ j ≤ T, T is the total number of warehouses) whose component H^j_k corresponds to article k (H^j_k = 1 when article k is present in warehouse j and H^j_k = 0 when it is not); for each order i, concatenate the T vectors X_{i,j} corresponding to the order in ascending order of j, obtaining an (M×T)×1-dimensional vector X_i; for each (M×T)×1-dimensional vector X_i, calculate the corresponding order splitting rate Y_i; and take (X_i, Y_i) as training samples, with X_i as the input and Y_i as the target value, to train the first neural network model with the objective function [formula image], obtaining the order splitting rate function [formula image] at which the objective function value is minimum.
Preferably, the order splitting rate generating unit is configured to: for each order i, calculate the Hadamard product of the data vector L_i corresponding to order i and the article distribution data vector H_j of each of the warehouses, obtaining T M×1-dimensional vectors X_{i,j}, where L_i is an M×1-dimensional vector, L_i = [L^i_1, ..., L^i_k, ..., L^i_M], whose component L^i_k corresponds to article k (L^i_k = 1 when article k is present in order i and L^i_k = 0 when it is not; 1 ≤ i ≤ N, i is the order number, N is the total number of orders; 1 ≤ k ≤ M, k is the article number, M is the number of article types), and H_j is an M×1-dimensional vector (j is the warehouse number, 1 ≤ j ≤ T, T is the total number of warehouses) whose component H^j_k corresponds to article k (H^j_k = 1 when article k is present in warehouse j and H^j_k = 0 when it is not); for each order i, concatenate the T vectors X_{i,j} corresponding to order i in ascending order of j, obtaining an (M×T)×1-dimensional vector corresponding to order i; select a clone warehouse from the warehouses, and generate R different article distribution data vectors corresponding to the clone warehouse by randomly changing 0 components in the article distribution data vector of the selected clone warehouse to 1; for each order i, calculate the Hadamard product of the data vector L_i corresponding to order i and each of the R different article distribution data vectors, obtaining R M×1-dimensional vectors corresponding to order i, and concatenate each of these R M×1-dimensional vectors, in ascending order of j, with the M×1-dimensional vectors X_{i,j} corresponding to all warehouses other than the selected clone warehouse, obtaining R (M×T)×1-dimensional vectors corresponding to order i; for each (M×T)×1-dimensional vector X_i, calculate the corresponding order splitting rate Y_i; and take (X_i, Y_i) as training samples, with X_i as the input and Y_i as the target value, to train the first neural network model with the objective function [formula image], obtaining the order splitting rate function [formula image] at which the objective function value is minimum.
Preferably, the configuration information generating unit is configured to: select a clone warehouse s from the warehouses, and modify the article distribution data vector of the selected clone warehouse s into a non-negative real-valued continuous vector θ = [θ_1, ..., θ_k, ..., θ_M], 1 ≤ k ≤ M; for each order i, calculate the Hadamard product of the data vector L_i corresponding to order i and the article distribution data vector H_s of the clone warehouse s, obtaining an M×1-dimensional vector X'_{i,s} [formula image]; for each order i, concatenate X'_{i,s} corresponding to order i with the M×1-dimensional vectors X_{i,j} corresponding to all warehouses other than the clone warehouse s, in ascending order of the corresponding warehouse number, obtaining an (M×T)×1-dimensional vector X'_i corresponding to order i; take the vector X'_i corresponding to each order i as the input samples, use [formula image] as the activation function, fix the parameters [formula image] corresponding to the order splitting rate function, and train the second neural network model with [formula image] as the objective function, obtaining the vector θ corresponding to the minimum of the objective function [formula image], wherein the fully connected layer and the output layer of the second neural network model are the same as those of the first neural network model, the initial value of the vector θ is 0 or a positive real number close to zero, and if a component of the vector θ becomes negative during training it is forcibly updated to zero; and, from the vector θ obtained after the training, generate the corresponding article configuration information for the clone warehouse s on the principle that the larger the component θ_k, the more suitable the corresponding article is for placement in the clone warehouse s.
Preferably, the first neural network model and the second neural network model are back propagation BP neural network models.
An embodiment of the present invention further provides a device for configuring articles in a warehouse, including:
a memory; and a processor coupled to the memory, the processor configured to perform the warehouse item configuration method embodiments described above based on instructions stored in the memory.
The embodiment of the invention also provides a computer-readable storage medium, on which a computer program is stored, wherein the program is used for implementing the embodiment of the warehouse article configuration method when being executed by a processor.
In summary, the above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (12)

1. A method for arranging warehouse items, comprising:
generating an order splitting rate function based on a first neural network model according to the order data and the article distribution data of each warehouse; the warehouse comprises a main warehouse and a clone warehouse;
based on a second neural network model, generating corresponding article configuration information of the clone warehouse by taking the order splitting rate function as an objective function;
and configuring the articles for the corresponding cloning warehouse according to the article configuration information.
2. The method of claim 1, wherein the generating the order splitting rate function comprises:
for each order i, calculating the Hadamard product of the data vector L_i corresponding to the order and the article distribution data vector H_j of each of the warehouses to obtain T M×1-dimensional vectors X_{i,j};
wherein L_i is an M×1-dimensional vector whose component L^i_k corresponds to article k, L^i_k = 1 when article k is present in order i and L^i_k = 0 when it is not, 1 ≤ i ≤ N, i is the order number, N is the total number of orders, 1 ≤ k ≤ M, k is the article number, and M is the number of article types; H_j is an M×1-dimensional vector, j is the warehouse number, 1 ≤ j ≤ T, T is the total number of warehouses, and the component H^j_k of H_j corresponds to article k, H^j_k = 1 when article k is present in warehouse j and H^j_k = 0 when it is not;
for each order i, concatenating the T vectors X_{i,j} corresponding to the order in ascending order of j to obtain an (M×T)×1-dimensional vector X_i;
for each said (M×T)×1-dimensional vector X_i, calculating the corresponding order splitting rate Y_i;
taking (X_i, Y_i) as training samples, with X_i as the input and Y_i as the target value, and training the first neural network model with the objective function [formula image] to obtain the order splitting rate function [formula image] at which the objective function value is minimum.
3. The method of claim 1, wherein the generating the order splitting rate function comprises:
for each order i, calculating the Hadamard product of the data vector L_i corresponding to order i and the article distribution data vector H_j of each of the warehouses to obtain T M×1-dimensional vectors X_{i,j};
wherein L_i is an M×1-dimensional vector, L_i = [L^i_1, ..., L^i_k, ..., L^i_M], whose component L^i_k corresponds to article k, L^i_k = 1 when article k is present in order i and L^i_k = 0 when it is not, 1 ≤ i ≤ N, i is the order number, N is the total number of orders, 1 ≤ k ≤ M, k is the article number, and M is the number of article types; H_j is an M×1-dimensional vector, j is the warehouse number, 1 ≤ j ≤ T, T is the total number of warehouses, and the component H^j_k of H_j corresponds to article k, H^j_k = 1 when article k is present in warehouse j and H^j_k = 0 when it is not;
for each order i, concatenating the T vectors X_{i,j} corresponding to order i in ascending order of j to obtain an (M×T)×1-dimensional vector corresponding to order i;
selecting a clone warehouse from the warehouses, and generating R different article distribution data vectors corresponding to the clone warehouse by randomly changing 0 components in the article distribution data vector of the selected clone warehouse to 1; for each order i, calculating the Hadamard product of the data vector L_i corresponding to order i and each of the R different article distribution data vectors to obtain R M×1-dimensional vectors corresponding to order i, and concatenating each of the R M×1-dimensional vectors, in ascending order of j, with the M×1-dimensional vectors X_{i,j} corresponding to all warehouses other than the selected clone warehouse, to obtain R (M×T)×1-dimensional vectors corresponding to order i;
for each said (M×T)×1-dimensional vector X_i, calculating the corresponding order splitting rate Y_i;
taking (X_i, Y_i) as training samples, with X_i as the input and Y_i as the target value, and training the first neural network model with the objective function [formula image] to obtain the order splitting rate function [formula image] at which the objective function value is minimum.
4. The method according to claim 2 or 3, wherein the generating, based on the second neural network model, the article configuration information of the corresponding clone warehouse with the order splitting rate function as an objective function comprises:
selecting a clone warehouse s from the warehouses, and modifying the article distribution data vector of the selected clone warehouse s into a non-negative real-valued continuous vector θ = [θ_1, ..., θ_k, ..., θ_M], 1 ≤ k ≤ M;
for each order i, calculating the Hadamard product of the data vector L_i corresponding to order i and the article distribution data vector H_s of said clone warehouse s to obtain an M×1-dimensional vector X'_{i,s} [formula image];
for each order i, concatenating X'_{i,s} corresponding to order i with the M×1-dimensional vectors X_{i,j} corresponding to all warehouses other than the clone warehouse s, in ascending order of the corresponding warehouse number, to obtain an (M×T)×1-dimensional vector X'_i corresponding to order i;
taking the vector X'_i corresponding to each order i as the input samples, using [formula image] as the activation function, taking the parameters [formula image] corresponding to the order splitting rate function as fixed parameters, and training the second neural network model with [formula image] as the objective function, to obtain the vector θ corresponding to the minimum of the objective function [formula image]; wherein the fully connected layer and the output layer of the second neural network model are the same as those of the first neural network model, the initial value of the vector θ is 0 or a positive real number close to zero, and if a component of the vector θ becomes negative during training it is forcibly updated to zero;
generating, from the vector θ obtained after the training, the corresponding article configuration information for the clone warehouse s on the principle that the larger the component θ_k, the more suitable the corresponding article is for placement in the clone warehouse s.
5. The method of claim 1, wherein the first neural network model and the second neural network model are back-propagation (BP) neural network models.
6. A warehouse item configuration device, comprising:
the order splitting rate generating unit is used for generating an order splitting rate function based on the first neural network model according to the order data and the article distribution data of each warehouse; the warehouse comprises a main warehouse and a clone warehouse;
the configuration information generating unit is used for generating corresponding article configuration information of the clone warehouse by taking the order splitting rate function as an objective function based on a second neural network model;
and the configuration unit is used for configuring the articles for the corresponding clone warehouse according to the article configuration information.
7. The apparatus of claim 6, wherein the order splitting rate generating unit is configured to: for each order i, calculate the Hadamard product of the data vector L_i corresponding to the order and the article distribution data vector H_j of each of the warehouses to obtain T M×1-dimensional vectors X_{i,j}, wherein L_i is an M×1-dimensional vector whose component L^i_k corresponds to article k, L^i_k = 1 when article k is present in order i and L^i_k = 0 when it is not, 1 ≤ i ≤ N, i is the order number, N is the total number of orders, 1 ≤ k ≤ M, k is the article number, M is the number of article types, and H_j is an M×1-dimensional vector, j is the warehouse number, 1 ≤ j ≤ T, T is the total number of warehouses, whose component H^j_k corresponds to article k, H^j_k = 1 when article k is present in warehouse j and H^j_k = 0 when it is not; for each order i, concatenate the T vectors X_{i,j} corresponding to the order in ascending order of j to obtain an (M×T)×1-dimensional vector X_i; for each said (M×T)×1-dimensional vector X_i, calculate the corresponding order splitting rate Y_i; and take (X_i, Y_i) as training samples, with X_i as the input and Y_i as the target value, to train the first neural network model with the objective function [formula image], obtaining the order splitting rate function [formula image] at which the objective function value is minimum.
8. The apparatus of claim 6, wherein the order splitting rate generating unit is configured to: for each order i, calculate the Hadamard product of the data vector L_i corresponding to order i and the article distribution data vector H_j of each of the warehouses to obtain T M×1-dimensional vectors X_{i,j}, wherein L_i is an M×1-dimensional vector, L_i = [L^i_1, ..., L^i_k, ..., L^i_M], whose component L^i_k corresponds to article k, L^i_k = 1 when article k is present in order i and L^i_k = 0 when it is not, 1 ≤ i ≤ N, i is the order number, N is the total number of orders, 1 ≤ k ≤ M, k is the article number, M is the number of article types, and H_j is an M×1-dimensional vector, j is the warehouse number, 1 ≤ j ≤ T, T is the total number of warehouses, whose component H^j_k corresponds to article k, H^j_k = 1 when article k is present in warehouse j and H^j_k = 0 when it is not; for each order i, concatenate the T vectors X_{i,j} corresponding to order i in ascending order of j to obtain an (M×T)×1-dimensional vector corresponding to order i; select a clone warehouse from the warehouses, and generate R different article distribution data vectors corresponding to the clone warehouse by randomly changing 0 components in the article distribution data vector of the selected clone warehouse to 1; for each order i, calculate the Hadamard product of the data vector L_i corresponding to order i and each of the R different article distribution data vectors to obtain R M×1-dimensional vectors corresponding to order i, and concatenate each of the R M×1-dimensional vectors, in ascending order of j, with the M×1-dimensional vectors X_{i,j} corresponding to all warehouses other than the selected clone warehouse, to obtain R (M×T)×1-dimensional vectors corresponding to order i; for each said (M×T)×1-dimensional vector X_i, calculate the corresponding order splitting rate Y_i; and take (X_i, Y_i) as training samples, with X_i as the input and Y_i as the target value, to train the first neural network model with the objective function [formula image], obtaining the order splitting rate function [formula image] at which the objective function value is minimum.
9. The apparatus according to claim 7 or 8, wherein the configuration information generating unit is configured to: select a clone warehouse s from the warehouses, and modify the article distribution data vector of the selected clone warehouse s into a non-negative real-valued continuous vector θ = [θ_1, ..., θ_k, ..., θ_M], 1 ≤ k ≤ M; for each order i, calculate the Hadamard product of the data vector L_i corresponding to order i and the article distribution data vector H_s of said clone warehouse s to obtain an M×1-dimensional vector X'_{i,s} [formula image]; for each order i, concatenate X'_{i,s} corresponding to order i with the M×1-dimensional vectors X_{i,j} corresponding to all warehouses other than the clone warehouse s, in ascending order of the corresponding warehouse number, to obtain an (M×T)×1-dimensional vector X'_i corresponding to order i; take the vector X'_i corresponding to each order i as the input samples, use [formula image] as the activation function, take the parameters [formula image] corresponding to the order splitting rate function as fixed parameters, and train the second neural network model with [formula image] as the objective function, to obtain the vector θ corresponding to the minimum of the objective function [formula image], wherein the fully connected layer and the output layer of the second neural network model are the same as those of the first neural network model, the initial value of the vector θ is 0 or a positive real number close to zero, and if a component of the vector θ becomes negative during training it is forcibly updated to zero; and generate, from the vector θ obtained after the training, the corresponding article configuration information for the clone warehouse s on the principle that the larger the component θ_k, the more suitable the corresponding article is for placement in the clone warehouse s.
10. The apparatus of claim 6, wherein the first neural network model and the second neural network model are back-propagation (BP) neural network models.
11. A warehouse item configuration device, comprising:
a memory; and a processor coupled to the memory, the processor configured to perform the method of any of claims 1-5 based on instructions stored in the memory.
12. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-5.
CN201811139144.XA 2018-09-28 2018-09-28 Warehouse article configuration method and device Active CN110969378B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811139144.XA CN110969378B (en) 2018-09-28 2018-09-28 Warehouse article configuration method and device

Publications (2)

Publication Number Publication Date
CN110969378A (en) 2020-04-07
CN110969378B (en) 2024-05-21

Family

ID=70026766

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811139144.XA Active CN110969378B (en) 2018-09-28 2018-09-28 Warehouse article configuration method and device

Country Status (1)

Country Link
CN (1) CN110969378B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113762854A (en) * 2020-11-17 2021-12-07 北京沃东天骏信息技术有限公司 Order processing method and device
CN114186903A (en) * 2020-09-14 2022-03-15 上海顺如丰来技术有限公司 Warehouse product selection method and device, computer equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5432887A (en) * 1993-03-16 1995-07-11 Singapore Computer Systems Neural network system and method for factory floor scheduling
CN106067102A (en) * 2016-05-24 2016-11-02 北京京东尚科信息技术有限公司 The optimization method of layout for storekeeping and optimization device
CN106384219A (en) * 2016-10-13 2017-02-08 北京京东尚科信息技术有限公司 Warehouse partition assisted analysis method and device
CN107067107A (en) * 2017-04-13 2017-08-18 上海汽车集团股份有限公司 A kind of singulated method and device of logistics order
CN107464177A (en) * 2017-08-23 2017-12-12 北京惠赢天下网络技术有限公司 The processing method and order processing server of a kind of order
CN107545381A (en) * 2016-06-24 2018-01-05 北京京东尚科信息技术有限公司 Manage the method and warehouse management system of warehouse inventory
CN107563702A (en) * 2017-09-14 2018-01-09 北京京东尚科信息技术有限公司 Commodity storage concocting method, device and storage medium

Similar Documents

Publication Publication Date Title
US20190180186A1 (en) Evolutionary Architectures For Evolution of Deep Neural Networks
Tsou Multi-objective inventory planning using MOPSO and TOPSIS
Kuo et al. Integration of particle swarm optimization and genetic algorithm for dynamic clustering
Bousqaoui et al. Machine learning applications in supply chains: An emphasis on neural network applications
CN110781409B (en) Article recommendation method based on collaborative filtering
US5546503A (en) Apparatus for configuring neural network and pattern recognition apparatus using neural network
CN105844508B (en) Commodity recommendation method based on dynamic periodic neural network
CN110991601B (en) Neural network recommendation method based on multi-user behavior
CN112541575B (en) Method and device for training graph neural network
He An inventory controlled supply chain model based on improved BP neural network
CN112633927B (en) Combined commodity mining method based on knowledge graph rule embedding
CN110969378A (en) Warehouse article configuration method and device
KR20190098801A (en) Classificating method for image of trademark using machine learning
CN116128461A (en) Bidirectional recommendation system and method for online recruitment
Ahamed et al. A recommender system based on deep neural network and matrix factorization for collaborative filtering
Chen Estimating job cycle time in a wafer fabrication factory: A novel and effective approach based on post-classification
Kazemi et al. A hybrid intelligent approach for modeling brand choice and constructing a market response simulator
Talupula Demand forecasting of outbound logistics using machine learning
CN116452293A (en) Deep learning recommendation method and system integrating audience characteristics of articles
Jackson et al. Automl approach to classification of candidate solutions for simulation models of logistic systems
CN115618235A (en) Training method and device for recommendation model
CN106021590B (en) B2B platform supplier recommendation method and system
CN112836839A (en) Warehouse management method and apparatus, and computer readable medium
CN114565428B (en) Commodity recommendation method and recommendation system based on rule learning network
Kibzun et al. Mathematical modelling of a transport system with minimal maintenance costs

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant