CN110852513A - Prediction method, prediction device, calculation device, and medium - Google Patents


Info

Publication number
CN110852513A
Authority
CN
China
Prior art keywords
spare part
data
level
bins
prediction
Prior art date
Legal status
Pending
Application number
CN201911112033.4A
Other languages
Chinese (zh)
Inventor
欧阳文理
张亚红
张秀玲
朱明达
陈宏业
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority: CN201911112033.4A
Publication: CN110852513A
Legal status: Pending


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06Q — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 — Administration; Management
    • G06Q10/04 — Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06Q10/08 — Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/087 — Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • G06Q10/20 — Administration of product repair or maintenance
    • G06N — COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 — Computing arrangements based on biological models
    • G06N3/02 — Neural networks
    • G06N3/04 — Architecture, e.g. interconnection topology
    • G06N3/044 — Recurrent networks, e.g. Hopfield networks
    • G06N3/045 — Combinations of networks
    • G06N3/08 — Learning methods


Abstract

The present disclosure provides a prediction method, including: obtaining historical data of a plurality of spare part bins, wherein the historical data includes the spare part demand data and environmental data of each of the plurality of spare part bins at a plurality of first moments; inputting the historical data into a trained prediction model to obtain a prediction result and a first association relationship, wherein the prediction result is the spare part demand data of the plurality of spare part bins at at least one second moment, and the first association relationship is the association relationship of spare part demand data among the plurality of spare part bins obtained by the trained prediction model based on the historical data; and outputting the first association relationship so as to determine the credibility of the prediction result based on the first association relationship. The present disclosure also provides a prediction apparatus, a computing device, and a computer-readable storage medium.

Description

Prediction method, prediction device, calculation device, and medium
Technical Field
The present disclosure relates to a prediction method, a prediction apparatus, a computing device, and a computer-readable storage medium.
Background
An important goal of spare part inventory management is to meet customers' needs for replacement spare parts during the warranty period while reducing the operating cost of the entire spare part supply chain. It is therefore critical to make a reasonable prediction of the future demand for spare parts at each location of each level of the supply chain. By reasonably predicting the demand at each level of the spare part supply chain, a purchasing plan for spare parts can be made more scientifically while the service level is guaranteed, and the inventory cost can be reduced as much as possible. However, when the future demand for spare parts is predicted by a prediction model in the related art, the obtained prediction result has no interpretability, and it is difficult to convince users of the accuracy of the prediction result.
Disclosure of Invention
One aspect of the present disclosure provides a prediction method, including: obtaining historical data of a plurality of spare part bins, wherein the historical data includes the spare part demand data and environmental data of each of the plurality of spare part bins at a plurality of first moments; inputting the historical data into a trained prediction model to obtain a prediction result and a first association relationship, wherein the prediction result is the spare part demand data of the plurality of spare part bins at at least one second moment, and the first association relationship is the association relationship of spare part demand data among the plurality of spare part bins obtained by the trained prediction model based on the historical data; and outputting the first association relationship so as to determine the credibility of the prediction result based on the first association relationship.
Optionally, the method further includes: obtaining training data and verification data, inputting the training data into a prediction model to be trained to obtain output data, and updating the prediction model to be trained based on the output data and the verification data to obtain the trained prediction model.
Optionally, the prediction model to be trained includes a preprocessing submodel and a prediction submodel. Inputting the training data into the prediction model to be trained to obtain output data includes: inputting the training data into the preprocessing submodel to obtain processed training data, wherein the processed training data includes a second association relationship, the second association relationship being the association relationship between the spare part demand data and the environmental data; and inputting the processed training data into the prediction submodel to obtain the output data.
Optionally, the plurality of spare part bins include multiple levels of spare part bins, and the spare part demand data of a higher level of the multiple levels is determined according to the spare part demand data of a lower level. The multiple levels of spare part bins at least include a first-level spare part bin, second-level spare part bins, and third-level spare part bins, whose levels are ordered from high to low. The first-level spare part bin includes N second-level spare part bins, each of the N second-level spare part bins includes a plurality of third-level spare part bins, and N is an integer greater than or equal to 2.
Optionally, there are N+1 preprocessing submodels. Inputting the training data into the preprocessing submodels to obtain processed training data includes: acquiring, from the training data, N sub-training data related to the N second-level spare part bins and 1 sub-training data related to the first-level spare part bin, and inputting the N+1 sub-training data into the N+1 preprocessing submodels respectively to obtain the processed training data.
Optionally, each of the N sub-training data for the N second-level spare part bins includes graph data composed of the spare part demand data and environmental data of the plurality of third-level spare part bins under the second-level spare part bin corresponding to that sub-training data. The 1 sub-training data for the first-level spare part bin includes graph data composed of the spare part demand data and environmental data of each of the N second-level spare part bins.
Optionally, updating the to-be-trained prediction model based on the output data and the verification data to obtain the trained prediction model includes: processing the spare part demand data in the output data and the spare part demand data in the verification data to obtain a prediction error, and updating the to-be-trained prediction model based on the prediction error to obtain the trained prediction model.
Another aspect of the present disclosure provides a prediction apparatus, including a first obtaining module, an input module, and an output module. The first obtaining module obtains historical data of a plurality of spare part bins, wherein the historical data includes the spare part demand data and environmental data of each of the plurality of spare part bins at a plurality of first moments. The input module inputs the historical data into a trained prediction model to obtain a prediction result and a first association relationship, wherein the prediction result is the spare part demand data of the plurality of spare part bins at at least one second moment, and the first association relationship is the association relationship of spare part demand data among the plurality of spare part bins obtained by the trained prediction model based on the historical data. The output module outputs the first association relationship so as to determine the credibility of the prediction result based on the first association relationship.
Optionally, the apparatus further includes a second obtaining module and an update module. The second obtaining module obtains training data and verification data and inputs the training data into a prediction model to be trained to obtain output data. The update module updates the prediction model to be trained based on the output data and the verification data to obtain the trained prediction model.
Optionally, the prediction model to be trained includes a preprocessing submodel and a prediction submodel. Inputting the training data into the prediction model to be trained to obtain output data includes: inputting the training data into the preprocessing submodel to obtain processed training data, wherein the processed training data includes a second association relationship, the second association relationship being the association relationship between the spare part demand data and the environmental data; and inputting the processed training data into the prediction submodel to obtain the output data.
Optionally, the plurality of spare part bins include multiple levels of spare part bins, and the spare part demand data of a higher level of the multiple levels is determined according to the spare part demand data of a lower level. The multiple levels of spare part bins at least include a first-level spare part bin, second-level spare part bins, and third-level spare part bins, whose levels are ordered from high to low. The first-level spare part bin includes N second-level spare part bins, each of the N second-level spare part bins includes a plurality of third-level spare part bins, and N is an integer greater than or equal to 2.
Optionally, there are N+1 preprocessing submodels. Inputting the training data into the preprocessing submodels to obtain processed training data includes: acquiring, from the training data, N sub-training data related to the N second-level spare part bins and 1 sub-training data related to the first-level spare part bin, and inputting the N+1 sub-training data into the N+1 preprocessing submodels respectively to obtain the processed training data.
Optionally, each of the N sub-training data for the N second-level spare part bins includes graph data composed of the spare part demand data and environmental data of the plurality of third-level spare part bins under the second-level spare part bin corresponding to that sub-training data. The 1 sub-training data for the first-level spare part bin includes graph data composed of the spare part demand data and environmental data of each of the N second-level spare part bins.
Optionally, updating the to-be-trained prediction model based on the output data and the verification data to obtain the trained prediction model includes: processing the spare part demand data in the output data and the spare part demand data in the verification data to obtain a prediction error, and updating the to-be-trained prediction model based on the prediction error to obtain the trained prediction model.
Another aspect of the disclosure provides a non-transitory readable storage medium storing computer-executable instructions for implementing the method as above when executed.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions for implementing the method as above when executed.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
fig. 1 schematically illustrates an application scenario of a prediction method and a prediction apparatus according to an embodiment of the present disclosure;
FIG. 2 schematically shows a flow diagram of a prediction method according to an embodiment of the present disclosure;
FIG. 3A schematically illustrates a schematic diagram of a first association relationship according to an embodiment of the present disclosure;
FIG. 3B schematically shows a schematic diagram of a prediction method according to an embodiment of the present disclosure;
FIG. 4 schematically shows a block diagram of a prediction apparatus according to an embodiment of the present disclosure; and
FIG. 5 schematically illustrates a block diagram of a computer system for implementing prediction according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.). Where a convention analogous to "at least one of A, B or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.).
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable control apparatus to produce a machine, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
Accordingly, the techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable medium having instructions stored thereon for use by or in connection with an instruction execution system. In the context of this disclosure, a computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the instructions. For example, the computer readable medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the computer readable medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
An embodiment of the present disclosure provides a prediction method, including: historical data of a plurality of spare part bins is obtained, wherein the historical data comprises spare part demand data and environmental data of each spare part bin in the plurality of spare part bins at a plurality of first moments. And then, inputting the historical data into the trained prediction model to obtain a prediction result and a first association relation, wherein the prediction result is spare part demand data of the spare part bins at least one second moment, and the first association relation is the association relation of the spare part demand data among the spare part bins obtained by the trained prediction model based on the historical data. Thereafter, the first associative relationship may be output to facilitate determining a confidence level of the prediction result based on the first associative relationship.
Fig. 1 schematically illustrates an application scenario of a prediction method and a prediction apparatus according to an embodiment of the present disclosure. It should be noted that fig. 1 is only an example of a scenario in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
The application scenario 100 shown in FIG. 1, for example, includes historical data about spare parts in multiple regions and a predictive model 110 for predicting spare part requirements. The spare parts referred to in the embodiments of the present disclosure are, for example, parts and components prepared for replacing devices which are easily damaged in machines and instruments. For example, taking a machine as an example of a computer, the spare part may be, for example, a motherboard, a memory stick, a video card, and the like.
In accordance with embodiments of the present disclosure, in order to provide replacement spare parts in time when a machine spare part is damaged, it is often necessary to store a certain number of spare parts in advance. However, if the number of spare parts stored is small, a sufficient number of spare parts cannot be provided in time to replace the damaged spare parts; if the number of spare parts stored is too large, the inventory cost is high. Therefore, the embodiment of the disclosure can make a plan of purchasing spare parts more scientifically by predicting the demand of the spare parts in the future on the premise of ensuring the service level, and reduce the inventory cost as much as possible.
According to the embodiment of the disclosure, the future demand of a plurality of spare part bins can be predicted by collecting historical data of the plurality of spare part bins, which reflects their past demand, and inputting the historical data of the plurality of spare part bins into the same prediction model 110.
For example, each of regions 1, 2, and 3 shown in fig. 1 includes a plurality of spare part bins, and historical data about the spare part bins in regions 1, 2, and 3 is input into the predictive model 110 to predict future demands for spare parts about the spare part bins in regions 1, 2, and 3 via the predictive model 110.
The prediction method according to the exemplary embodiment of the present disclosure is described below with reference to fig. 2, 3A, and 3B in conjunction with the application scenario of fig. 1. It should be noted that the above application scenarios are merely illustrated for the convenience of understanding the spirit and principles of the present disclosure, and the embodiments of the present disclosure are not limited in this respect. Rather, embodiments of the present disclosure may be applied to any scenario where applicable.
Fig. 2 schematically shows a flow chart of a prediction method according to an embodiment of the present disclosure.
As shown in fig. 2, the method includes operations S210 to S230.
In operation S210, historical data of a plurality of spare part bins is obtained, wherein the historical data includes spare part demand data and environmental data of each of the plurality of spare part bins at a plurality of first time instants.
According to an embodiment of the present disclosure, the plurality of spare part bins includes, for example, spare part bins of different regions. For example, for a country, the plurality of spare parts bins may include a national spare parts bin, a provincial spare parts bin, and a municipal spare parts bin, among others. The spare part requirement data is, for example, the number of spare parts consumed by each spare part warehouse at the first time, and the environment data may include, for example, weather information, geographical location information, and the like of an area where the spare part warehouse is located.
According to an embodiment of the present disclosure, the historical data may be, for example, a three-dimensional tensor having a time dimension, a spatial dimension, and a feature dimension. The three-dimensional tensor may be expressed, for example, as x = {x^1, x^2, ..., x^N}, where N indicates that the historical data covers N spare part bins and x^n represents the historical data of the n-th spare part bin. Taking the i-th spare part bin as an example, its historical data may be expressed as x^i = {x^i_1, x^i_2, ..., x^i_{n_i}}, where n_i represents, for example, the number of first moments. The historical data x^i_j of the i-th spare part bin at the j-th of the plurality of first moments may be expressed, for example, as x^i_j = (d^i_j, e^i_{j,1}, ..., e^i_{j,m}), where d^i_j represents, for example, the spare part demand of the i-th spare part bin at the j-th of the plurality of first moments, and e^i_{j,k} represents, for example, the environmental data of the i-th spare part bin with respect to m environmental factors at that moment.
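The per-bin, per-moment layout described above can be sketched in Python. The function name and dictionary layout are illustrative assumptions, not part of the patent; each record simply concatenates a demand value with its m environmental factors:

```python
# Hypothetical sketch of the historical-data layout: for each spare part
# bin, a sequence over its first moments, where each record holds the
# demand value followed by m environmental factors.
def build_history(demands, envs):
    """demands: {bin_id: [d_1, ..., d_n]}; envs: {bin_id: [[e_1, ..., e_m], ...]}."""
    history = {}
    for bin_id, d_series in demands.items():
        e_series = envs[bin_id]
        assert len(d_series) == len(e_series)
        # Record at moment j: (d^i_j, e^i_{j,1}, ..., e^i_{j,m})
        history[bin_id] = [[d] + list(e) for d, e in zip(d_series, e_series)]
    return history

hist = build_history(
    {"binA": [5, 7], "binB": [3, 4]},
    {"binA": [[20.1, 0.3], [19.8, 0.4]], "binB": [[25.0, 0.1], [24.5, 0.2]]},
)
# hist["binA"][0] == [5, 20.1, 0.3]  (demand, then m=2 environment factors)
```

Stacking these per-bin sequences along the bin axis yields the time × space × feature tensor described in the text.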
In operation S220, the historical data is input into the trained prediction model, and a prediction result and a first association relationship are obtained, where the prediction result is spare part demand data of the plurality of spare part bins at least one second time, and the first association relationship is an association relationship of the spare part demand data among the plurality of spare part bins obtained based on the historical data by the trained prediction model.
According to an embodiment of the present disclosure, the plurality of spare part bins include, for example, multiple levels of spare part bins, such as the national, provincial, and municipal spare part warehouses mentioned above, whose levels are ordered from high to low.
In the disclosed embodiments, the historical data may, for example, be input into the trained prediction model to obtain the prediction results, such as predicted demand data for the spare parts at a second moment, i.e., one or more future moments. The prediction results include, for example, spare part demand data for the national spare part warehouse, the provincial spare part warehouses, and the municipal spare part warehouses.
The first association relationship is, for example, an association relationship of spare part demand among the plurality of spare part bins. For example, the multi-level spare part bins include a provincial spare part bin A and, under it, municipal spare part bins B1, B2, and B3. The first association relationship may indicate, for example, that the difference in spare part demand between the municipal bins B1 and B2 is small, while the difference between B1 and B3 is large; that is, the spare part demands of B1 and B2 are similar.
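To make "similar spare part demand" concrete, the following is a minimal, hypothetical similarity measure over historical demand series. The patent's association values come from a learned model, not this formula; this only illustrates what closeness between two bins' demand can mean:

```python
# Hedged sketch: similarity between two bins' demand histories, scored by
# mean absolute difference (smaller difference -> value closer to 1.0).
def demand_similarity(series_a, series_b):
    diffs = [abs(a - b) for a, b in zip(series_a, series_b)]
    return 1.0 / (1.0 + sum(diffs) / len(diffs))

b1 = [10, 12, 11]
b2 = [10, 11, 12]  # close to b1 -> strong association
b3 = [40, 45, 50]  # far from b1 -> weak association
assert demand_similarity(b1, b2) > demand_similarity(b1, b3)
```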
In the embodiment of the disclosure, the trained prediction model includes, for example, an attention mechanism module. The attention mechanism module can enhance the association features between different spare part bins during model prediction, so that these association features are pronounced in the prediction result.
In operation S230, the first association relationship is output so as to determine the reliability of the prediction result based on the first association relationship.
According to the embodiment of the disclosure, outputting the first association relationship makes it convenient for the user to determine the credibility of the prediction result based on the first association relationship. For example, if the spare part demands of two spare part bins in the prediction result are similar, it is difficult to convince the user of the reliability of the prediction result through the prediction result alone. By outputting the first association relationship among different spare part bins, the user can learn from it whether the spare part demands of different spare part bins are related, which improves the credibility of the prediction result.
Fig. 3A schematically shows a schematic diagram of a first association according to an embodiment of the present disclosure.
As shown in fig. 3A, the attention mechanism module of the embodiment of the disclosure includes, for example, an attention matrix. The attention matrix includes, for example, association values between the plurality of spare part bins, and the magnitude of an association value can indicate how close the first association relationship is. For example, the provincial spare part bin A includes 17 municipal spare part bins, represented by serial numbers 0-16, and the association values in the attention matrix can reflect the first association relationship among these 17 municipal spare part bins. Specifically, 310 in fig. 3A represents, for example, the association value S_{3-14} between the 3rd municipal spare part bin and the 14th municipal spare part bin, and 320 in fig. 3A represents, for example, the association value S_{3-8} between the 3rd municipal spare part bin and the 8th municipal spare part bin. A darker region color indicates a larger association value, i.e., a closer first association relationship, which in turn indicates more similar spare part demand between the two bins. Therefore, as can be seen from fig. 3A, the association value between the 3rd and 14th municipal spare part bins is relatively large, meaning that their demands for spare parts are highly similar.
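An attention-style association matrix like the one in fig. 3A can be sketched as follows. This is a hedged illustration only: a trained attention module learns its scores, whereas here the score is simply negative Euclidean distance between demand vectors, normalised per row with a softmax:

```python
import math

# Hedged sketch: entry (i, j) of the matrix is a softmax-normalised
# association value between bins i and j, scored by negative Euclidean
# distance between their historical demand vectors (an assumption; a real
# attention module learns query/key projections instead).
def association_matrix(demand_vectors):
    k = len(demand_vectors)
    matrix = []
    for i in range(k):
        scores = [-math.dist(demand_vectors[i], demand_vectors[j]) for j in range(k)]
        peak = max(scores)                       # subtract max for numerical stability
        exps = [math.exp(s - peak) for s in scores]
        total = sum(exps)
        matrix.append([e / total for e in exps])
    return matrix

A = association_matrix([[1.0, 2.0], [1.1, 2.1], [5.0, 0.5]])
# Each row sums to 1; larger entries mark bins with closer association,
# so A[0][1] (similar bins 0 and 1) exceeds A[0][2] (dissimilar bins 0 and 2).
```

Visualised as a heatmap, larger entries correspond to the darker cells described for fig. 3A.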
According to the embodiment of the disclosure, the first association relationship among the spare part bins is output, and this relationship can explain the prediction result of the prediction model, thereby improving the interpretability of the prediction model and ensuring the reliability of the prediction result.
The training process of the predictive model will be described below.
According to an embodiment of the present disclosure, training the prediction model may include, for example: acquiring training data and verification data; inputting the training data into a prediction model to be trained to obtain output data; and updating the prediction model to be trained based on the output data and the verification data to obtain the trained prediction model.
According to embodiments of the present disclosure, training data at a plurality of third times in the past and verification data at at least one fourth time in the past may be acquired, for example. The fourth time is, for example, after the third times; the training data at the third times may be used to train the prediction model to be trained, and the verification data at the fourth time may be used to verify the prediction model.
The plurality of third times include, for example, a past third time 1, third time 2, and third time 3, and the training data include, for example, the spare part demand data and environment data of each spare part bin at the third time 1, the third time 2, and the third time 3, respectively. The spare part demand data are, for example, the number of spare parts consumed by each spare part bin at a third time, and the environment data may include, for example, weather information and geographical location information of the area where the spare part bin is located.
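The acquisition of training data at the third times and verification data at the fourth time can be sketched, for example, as a simple time-based split; all field names and values below are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical layout of the historical samples: each record carries the time,
# the spare part bin, the spare part demand, and the environment data.
samples = [
    # (time, bin_id, demand, environment data)
    (1, "city_bin_3", 42, {"weather": "rain", "location": "north"}),
    (2, "city_bin_3", 38, {"weather": "sunny", "location": "north"}),
    (3, "city_bin_3", 40, {"weather": "rain", "location": "north"}),
    (4, "city_bin_3", 45, {"weather": "snow", "location": "north"}),
]

# The third times (training) precede the fourth time (verification).
fourth_time = 4
training_data = [s for s in samples if s[0] < fourth_time]
verification_data = [s for s in samples if s[0] == fourth_time]
```

The split mirrors the text: the model is trained on data at times 1-3 and verified against the actual demand at time 4.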
According to the embodiment of the disclosure, the plurality of spare part bins include multiple levels of spare part bins, and the output data include, for example, the spare part demand data of each level of spare part bin; the spare part demand data of a higher-level spare part bin are determined according to the spare part demand data of the lower-level spare part bins.
According to an embodiment of the present disclosure, the multi-level spare part bins include, for example, the national spare part bin, the provincial spare part bins, and the city-level spare part bins mentioned above, arranged from the highest level to the lowest.
In the embodiment of the present disclosure, the training data are input into the prediction model to be trained, and the obtained output data include, for example, the spare part demand data of the national spare part bin, the provincial spare part bins, and the city-level spare part bins; the output data are, for example, prediction data about the spare part demand at the fourth time. The spare part demand data of a higher-level spare part bin can be determined according to the spare part demand data of the lower-level spare part bins. For example, the spare part demand data of the city-level spare part bins can be predicted by the prediction model to be trained based on the training data; the spare part demand of a provincial spare part bin is, for example, the sum of the spare part demands of the city-level spare part bins under that province, and the spare part demand of the national spare part bin is, for example, the sum of the spare part demands of the provincial spare part bins.
According to the embodiment of the disclosure, a certain prediction error may exist when the spare part demand data of a higher-level spare part bin are determined according to the spare part demand data of the lower-level spare part bins; to reduce this prediction error, the prediction model to be trained can be updated based on the output data and the verification data.
For example, the plurality of hierarchical spare part bins include a spare part bin A of a provincial region and spare part bins B1 and B2 of city-level regions under that province. The output data of the prediction model include, for example, the spare part demand b1 of the city-level spare part bin B1, the spare part demand b2 of the city-level spare part bin B2, and the spare part demand a of the provincial spare part bin A, where a = b1 + b2. It will be appreciated that the prediction data a, b1, and b2 are, for example, prediction data at the fourth time; the verification data include, for example, the actual spare part demand a' of the provincial spare part bin A, the actual spare part demand b1' of the city-level spare part bin B1, and the actual spare part demand b2' of the city-level spare part bin B2 at the fourth time.
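The bottom-up aggregation a = b1 + b2 can be sketched in a few lines; the bin names and demand values below are illustrative assumptions, not values from the disclosure:

```python
# Bottom-up aggregation: the demand of a higher-level spare part bin is the
# sum of the demands of the lower-level spare part bins under it.
city_level = {"B1": 30.0, "B2": 45.0}         # predicted city-level demands
provincial = {"A": sum(city_level.values())}  # a = b1 + b2
national = {"CN": sum(provincial.values())}   # national = sum over provinces

# By construction the prediction result is consistent across levels:
# provincial["A"] == city_level["B1"] + city_level["B2"]
```

This construction guarantees the consistency property discussed below: the higher-level demand always equals the total of its lower-level demands.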
Then, the prediction error is obtained by processing the spare part demand data in the output data and the spare part demand data in the verification data. For example, the error between a and a', the error between b1 and b1', and the error between b2 and b2' are together taken as the prediction error. Based on the prediction error, the prediction model to be trained can be updated by a stochastic gradient descent method. It can be understood that the embodiments of the present disclosure do not limit the number of times the prediction model to be trained is updated, and the prediction model can be updated in real time according to the actual application.
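The combined per-level prediction error can be sketched, for example, as a sum of squared errors over every spare part bin at every level; the error form and the toy values are illustrative assumptions, not the exact loss of the disclosure:

```python
def multilevel_error(pred, actual):
    """Sum of squared errors over every spare part bin at every level.

    Updating the model on this combined error penalises city-level mistakes
    both directly and through the provincial totals they roll up into.
    """
    return sum((pred[k] - actual[k]) ** 2 for k in pred)

pred = {"A": 75.0, "B1": 30.0, "B2": 45.0}    # predictions, with a = b1 + b2
actual = {"A": 78.0, "B1": 31.0, "B2": 47.0}  # verification data, a' = b1' + b2'
loss = multilevel_error(pred, actual)         # 3**2 + 1**2 + 2**2 = 14.0
```

A gradient-descent step on `loss` with respect to the model parameters would then play the role of the stochastic gradient descent update described above.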
In the embodiment of the present disclosure, in order to ensure consistency of the spare part demand data, that is, to ensure that in the prediction result the spare part demand of a higher-level spare part bin equals the total spare part demand of its lower-level spare part bins, the sum of the spare part demands of the lower-level spare part bins is taken as the spare part demand of the higher-level spare part bin. Under this scheme, the prediction errors of the lower-level spare part bins accumulate layer by layer and affect the predicted spare part demand of the higher-level spare part bin; to reduce this effect as much as possible, the embodiment of the disclosure updates the prediction model to be trained based on the prediction errors of each level, so as to improve the prediction accuracy of the prediction model as a whole.
According to embodiments of the present disclosure, a trained predictive model can be used to predict the spare part demand data for a plurality of spare part bins in the future. That is, the spare part demand data for the plurality of spare part bins in the future may be predicted based on the historical data by the trained predictive model.
The technical solution of the embodiment of the disclosure inputs the training data of the multi-level spare part bins into the same prediction model to be trained, and predicts the future spare part demand data of the multi-level spare part bins through that model. It can be understood that using the same prediction model to be trained takes the internal relationship among the multi-level spare part bins into account, which makes the prediction more accurate. In addition, since the prediction model to be trained is updated based on the prediction errors of all levels, the influence of the prediction errors of the lower-level spare part bins on the predicted spare part demand of the higher-level spare part bins can be at least reduced, improving the prediction accuracy of the prediction model as a whole.
Fig. 3B schematically shows a schematic diagram of a prediction method according to an embodiment of the present disclosure.
As shown in fig. 3B, the multi-level spare part bins of an embodiment of the present disclosure include at least: a first-level spare part bin, second-level spare part bins, and third-level spare part bins, arranged from the highest level to the lowest. The first-level spare part bin comprises N second-level spare part bins, each of the N second-level spare part bins comprises a plurality of third-level spare part bins, and N is an integer greater than or equal to 2. For ease of understanding, the disclosed embodiments are exemplified with N equal to 3; that is, the N second-level spare part bins include, for example, a second-level spare part bin B, a second-level spare part bin C, and a second-level spare part bin D.
According to an embodiment of the present disclosure, each of the N sub-training data regarding the N second-level spare part bins includes: graph data consisting of the spare part demand data and environment data of the plurality of third-level spare part bins under the second-level spare part bin corresponding to that sub-training data. That is, each second-level spare part bin has its own sub-training data. Taking the second-level spare part bin B as an example, it includes a plurality of third-level spare part bins, e.g., third-level spare part bins b1, b2, and b3, and the respective spare part demand data and environment data of b1, b2, and b3 constitute, for example, the graph data b of the second-level spare part bin B.
According to an embodiment of the present disclosure, the single sub-training data regarding the first-level spare part bin A includes: graph data a consisting of the respective spare part demand data and environment data of the N second-level spare part bins. For example, the spare part demand data and environment data of the second-level spare part bin B, the second-level spare part bin C, and the second-level spare part bin D constitute the graph data a of the first-level spare part bin A.
According to the embodiment of the disclosure, the graph data are, for example, non-Euclidean structure data, and the embodiment of the disclosure may predict the spare part demand based on the graph data through a graph convolutional neural network (GCN). Thus, the prediction model to be trained of embodiments of the present disclosure may include, for example, a graph convolutional neural network.
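For illustration, a single graph-convolution propagation step over such graph data can be sketched as relu(Â X W), the standard GCN formulation with a self-loop-augmented, symmetrically normalised adjacency matrix Â. The toy adjacency matrix, node features, and weight matrix below are assumptions made for this sketch, not values from the disclosure:

```python
import numpy as np

def gcn_layer(adj, features, weight):
    """One GCN propagation step: relu(D^-1/2 (A + I) D^-1/2 X W)."""
    a_hat = adj + np.eye(adj.shape[0])        # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt  # symmetric normalisation
    return np.maximum(a_norm @ features @ weight, 0.0)

# Toy graph of three third-level spare part bins under one second-level bin;
# each node carries [demand, temperature] features (illustrative values).
adj = np.array([[0.0, 1.0, 0.0],
                [1.0, 0.0, 1.0],
                [0.0, 1.0, 0.0]])
x = np.array([[10.0, 25.0],
              [20.0, 27.0],
              [15.0, 26.0]])
w = np.full((2, 4), 0.1)   # maps 2 input features to 4 hidden features
h = gcn_layer(adj, x, w)   # node embeddings, shape (3, 4)
```

Each node's output embedding mixes its own demand and environment features with those of its graph neighbours, which is how the topology of the spare part bins enters the prediction.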
According to the embodiment of the disclosure, the prediction model to be trained comprises a preprocessing submodel and a prediction submodel. The preprocessing submodel may be, for example, a graph convolutional neural network, and the prediction submodel may be, for example, a Long Short-Term Memory network (LSTM).
Inputting the training data into the prediction model to be trained to obtain the output data may include: inputting the training data into the preprocessing submodel to obtain processed training data, wherein the processed training data include a second association relationship, namely the association relationship between the spare part demand data and the environment data. That is, the embodiment of the present disclosure performs feature enhancement processing on the training data through the preprocessing submodel to extract the second association relationship between the spare part demand data and the environment data in the training data. Taking weather as an example of the environment data, the second association relationship may reflect, for example, that a region with a higher temperature has a higher demand for certain spare parts.
According to the embodiment of the disclosure, N sub-training data (graph data b, graph data c, and graph data d) regarding the N second-level spare part bins and 1 sub-training data (graph data a) regarding the first-level spare part bin can be obtained from the training data, and the N+1 sub-training data are then respectively input into N+1 preprocessing submodels to obtain the processed training data. For example, the N+1 sub-training data are the graph data b, graph data c, graph data d, and graph data a, and the N+1 preprocessing submodels are GCN_1, GCN_2, GCN_3, and GCN_4, respectively. The processed training data are then input into the prediction submodel (e.g., an LSTM) to obtain the output data.
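The routing of the N+1 sub-training data through N+1 preprocessing submodels into one shared predictor can be sketched structurally as follows; the stub functions below merely stand in for the GCN submodels (GCN_1 through GCN_4) and the LSTM predictor, and all names and values are illustrative assumptions:

```python
# Structural sketch of the N+1 routing (N = 3): each sub-training data item
# goes through its own preprocessing submodel, and the processed outputs all
# feed one shared prediction submodel.
def make_preprocessor(name):
    def preprocess(graph_data):
        # Stand-in for a GCN forward pass over one piece of graph data.
        return {"source": name, "features": [v * 2.0 for v in graph_data]}
    return preprocess

preprocessors = {k: make_preprocessor(f"GCN_{i + 1}")
                 for i, k in enumerate(["b", "c", "d", "a"])}

sub_training_data = {"b": [1.0], "c": [2.0], "d": [3.0], "a": [6.0]}
processed = {k: preprocessors[k](v) for k, v in sub_training_data.items()}

def predictor(inputs):
    # Stand-in for the LSTM prediction submodel consuming all processed data.
    return sum(f for item in inputs.values() for f in item["features"])

output = predictor(processed)  # 2 * (1 + 2 + 3 + 6) = 24.0
```

The point of the sketch is the wiring: one preprocessing submodel per level of graph data, with a single shared predictor so that the internal relationship among the levels is learned jointly.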
According to the technical solution, the training data of the multi-level spare part bins are input into the same prediction model to be trained, so that the future spare part demand data of the multi-level spare part bins can be predicted through that model. Using the same prediction model to be trained takes the internal relationship among the multi-level spare part bins into account, which makes the prediction more accurate. In addition, since the training data of each level are non-Euclidean data, the embodiment of the disclosure processes them through the graph convolutional neural network, which at least partially realizes a faithful modeling of the topological structure of the data at each level.
Fig. 4 schematically shows a block diagram of a prediction apparatus according to an embodiment of the present disclosure.
As shown in fig. 4, the prediction apparatus 400 includes a first obtaining module 410, an input module 420, and an output module 430.
The first obtaining module 410 can be configured to obtain historical data of a plurality of spare part bins, wherein the historical data includes spare part demand data and environmental data for each of the plurality of spare part bins at a plurality of first times. According to the embodiment of the present disclosure, the first obtaining module 410 may, for example, perform operation S210 described above with reference to fig. 2, which is not described herein again.
The input module 420 may be configured to input the historical data into the trained prediction model to obtain a prediction result and a first association relationship, where the prediction result is spare part demand data of the plurality of spare part bins at least one second time, and the first association relationship is an association relationship of the spare part demand data among the plurality of spare part bins obtained based on the historical data by the trained prediction model. According to the embodiment of the present disclosure, the input module 420 may perform, for example, the operation S220 described above with reference to fig. 2, which is not described herein again.
The output module 430 may be configured to output the first associative relationship to facilitate determining a confidence level of the prediction result based on the first associative relationship. According to the embodiment of the present disclosure, the output module 430 may perform, for example, the operation S230 described above with reference to fig. 2, which is not described herein again.
According to the embodiment of the present disclosure, the prediction apparatus 400 further includes, for example, a second acquisition module and an update module. The second acquisition module acquires training data and verification data, and inputs the training data into the prediction model to be trained to obtain output data. The update module updates the prediction model to be trained based on the output data and the verification data to obtain the trained prediction model.
According to the embodiment of the disclosure, the prediction model to be trained comprises a preprocessing submodel and a prediction submodel. Inputting the training data into the prediction model to be trained to obtain the output data includes: inputting the training data into the preprocessing submodel to obtain processed training data, wherein the processed training data include a second association relationship, namely the association relationship between the spare part demand data and the environment data; and inputting the processed training data into the prediction submodel to obtain the output data.
According to the embodiment of the present disclosure, the plurality of spare parts bins includes a plurality of levels of spare parts bins, and the spare part demand data of a high level of the plurality of levels of spare parts bins is determined according to the spare part demand data of a low level of the plurality of levels of spare parts bins, and the plurality of levels of spare parts bins at least include: the device comprises a first-level spare part bin, a second-level spare part bin and a third-level spare part bin, wherein the levels of the first-level spare part bin, the second-level spare part bin and the third-level spare part bin are from a high level to a low level, the first-level spare part bin comprises N second-level spare part bins, each second-level spare part bin in the N second-level spare part bins comprises a plurality of third-level spare part bins, and N is an integer greater than or equal to 2.
According to an embodiment of the present disclosure, there are N+1 preprocessing submodels. Inputting the training data into the preprocessing submodels to obtain the processed training data includes: acquiring, from the training data, N sub-training data regarding the N second-level spare part bins and 1 sub-training data regarding the first-level spare part bin, and respectively inputting the N+1 sub-training data into the N+1 preprocessing submodels to obtain the processed training data.
According to an embodiment of the present disclosure, each of the N sub-training data regarding the N second-level spare part bins includes: graph data consisting of the spare part demand data and environment data of the plurality of third-level spare part bins under the second-level spare part bin corresponding to that sub-training data. The 1 sub-training data regarding the first-level spare part bin includes: graph data consisting of the respective spare part demand data and environment data of the N second-level spare part bins.
According to an embodiment of the present disclosure, updating the prediction model to be trained based on the output data and the verification data to obtain the trained prediction model includes: processing the spare part demand data in the output data and the spare part demand data in the verification data to obtain a prediction error, and updating the prediction model to be trained based on the prediction error to obtain the trained prediction model.
Any number of modules, sub-modules, units, sub-units, or at least part of the functionality of any number thereof according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, sub-modules, units, sub-units according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware implementations. Alternatively, one or more of the modules, sub-modules, units, sub-units according to embodiments of the disclosure may be at least partially implemented as a computer program module, which when executed may perform the corresponding functions.
For example, any plurality of the first obtaining module 410, the input module 420, and the output module 430 may be combined and implemented in one module, or any one of the modules may be split into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of the other modules and implemented in one module. According to an embodiment of the present disclosure, at least one of the first obtaining module 410, the input module 420, and the output module 430 may be implemented at least partially as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware. Alternatively, at least one of the first obtaining module 410, the input module 420 and the output module 430 may be at least partially implemented as a computer program module, which when executed may perform a corresponding function.
FIG. 5 schematically illustrates a block diagram of a computer system for implementing prediction according to an embodiment of the present disclosure. The computer system illustrated in FIG. 5 is only one example and should not impose any limitations on the scope of use or functionality of embodiments of the disclosure.
As shown in fig. 5, a computer system 500 implementing prediction includes a processor 501, a computer-readable storage medium 502. The system 500 may perform a method according to an embodiment of the present disclosure.
In particular, processor 501 may include, for example, a general purpose microprocessor, an instruction set processor and/or related chip set and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 501 may also include onboard memory for caching purposes. The processor 501 may be a single processing unit or a plurality of processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
Computer-readable storage medium 502 may be, for example, any medium that can contain, store, communicate, propagate, or transport the instructions. For example, a readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the readable storage medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
The computer-readable storage medium 502 may include a computer program 503, which computer program 503 may include code/computer-executable instructions that, when executed by the processor 501, cause the processor 501 to perform a method according to an embodiment of the disclosure, or any variation thereof.
The computer program 503 may be configured with computer program code, for example, comprising computer program modules. For example, in an example embodiment, code in computer program 503 may include one or more program modules, including 503A, 503B, … …, for example. It should be noted that the division and number of the modules are not fixed, and those skilled in the art may use suitable program modules or program module combinations according to actual situations, so that the processor 501 may execute the method according to the embodiment of the present disclosure or any variation thereof when the program modules are executed by the processor 501.
According to an embodiment of the present disclosure, at least one of the first obtaining module 410, the input module 420 and the output module 430 may be implemented as a computer program module described with reference to fig. 5, which, when executed by the processor 501, may implement the respective operations described above.
The present disclosure also provides a computer-readable medium, which may be embodied in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer readable medium carries one or more programs which, when executed, implement the above prediction method.
According to embodiments of the present disclosure, a computer readable medium may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, optical fiber cable, radio frequency signals, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that various combinations and/or combinations of features recited in the various embodiments and/or claims of the present disclosure can be made, even if such combinations or combinations are not expressly recited in the present disclosure. In particular, various combinations and/or combinations of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit or teaching of the present disclosure. All such combinations and/or associations are within the scope of the present disclosure.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.

Claims (10)

1. A prediction method, comprising:
acquiring historical data of a plurality of spare part bins, wherein the historical data comprises spare part demand data and environmental data of each spare part bin in the plurality of spare part bins at a plurality of first moments;
inputting the historical data into a trained prediction model to obtain a prediction result and a first association relation, wherein the prediction result is the spare part demand data of the plurality of spare part bins at at least one second moment, and the first association relation is the association relation of the spare part demand data among the plurality of spare part bins obtained by the trained prediction model based on the historical data; and
outputting the first association relation so as to determine the credibility of the prediction result based on the first association relation.
2. The method of claim 1, further comprising:
acquiring training data and verification data;
inputting the training data into a prediction model to be trained to obtain output data; and
updating the to-be-trained prediction model based on the output data and the verification data to obtain the trained prediction model.
3. The method of claim 2, wherein the predictive model to be trained comprises a pre-processing sub-model and a predictor sub-model;
the inputting the training data into the prediction model to be trained to obtain output data comprises:
inputting the training data into the preprocessing submodel to obtain processed training data, wherein the processed training data comprises a second association relation, and the second association relation is the association relation between the spare part demand data and the environment data; and
inputting the processed training data into the predictor model to obtain the output data.
4. The method of claim 3, wherein,
the plurality of spare parts bins comprising a plurality of levels of spare parts bins, the spare part requirement data for a higher level of the plurality of levels of spare parts bins being determined from the spare part requirement data for a lower level of the plurality of levels of spare parts bins,
the multi-level spare part bin comprises at least: the system comprises a first-level spare part bin, a second-level spare part bin and a third-level spare part bin, wherein the levels of the first-level spare part bin, the second-level spare part bin and the third-level spare part bin are from a high level to a low level,
the first-level spare part bin comprises N second-level spare part bins, each second-level spare part bin in the N second-level spare part bins comprises a plurality of third-level spare part bins, and N is an integer greater than or equal to 2.
5. The method of claim 4, wherein there are N+1 preprocessing submodels;
inputting the training data into the preprocessing submodel to obtain processed training data, wherein the training data comprises:
obtaining, from the training data, N sub-training data for the N second-tier spare part bins and 1 sub-training data for the first-tier spare part bin; and
and respectively inputting the N +1 sub-training data into the N +1 pre-processing sub-models to obtain the processed training data.
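The N + 1 split of claim 5 can be sketched as below. The grouping key (`"bin"`) and the choice to let the first-level sub-training data be the whole pool are assumptions for illustration; the claim only requires N sets for the second-level bins plus 1 for the first-level bin.

```python
# Sketch of claim 5: derive N sub-training sets (one per second-level bin) plus
# 1 for the first-level bin, to feed the N + 1 pre-processing sub-models.
def split_training_data(records, second_level_bins):
    """records: iterable of dicts with a 'bin' key naming a second-level bin."""
    subsets = {b: [r for r in records if r["bin"] == b] for b in second_level_bins}
    subsets["first_level"] = list(records)     # the 1 set for the first-level bin
    return subsets                             # N + 1 sub-training sets in total
```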
6. The method of claim 5, wherein
each of the N sub-training data for the N second-level spare part bins comprises: graph data composed of the spare part demand data and the environment data of the plurality of third-level spare part bins under the second-level spare part bin corresponding to that sub-training data; and
the 1 sub-training data for the first-level spare part bin comprises: graph data composed of the respective spare part demand data and environment data of the N second-level spare part bins.
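One way to picture the "graph data" of claim 6: each bin below a given parent becomes a node whose feature vector concatenates its spare part demand data and environment data. The fully connected adjacency used here is an assumption, since the claim does not specify the edge structure.

```python
# Sketch of claim 6: build (node_features, adjacency) graph data for the bins
# under one parent bin.
import numpy as np

def build_graph_data(demand, environment):
    """demand: (n_bins,); environment: (n_bins, k). Returns (node_features, adjacency)."""
    features = np.hstack([np.reshape(demand, (-1, 1)), environment])
    n = len(demand)
    adjacency = np.ones((n, n)) - np.eye(n)    # assumed: every pair of bins linked
    return features, adjacency
```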
7. The method of claim 2, wherein the updating the prediction model to be trained based on the output data and the verification data to obtain the trained prediction model comprises:
processing the spare part demand data in the output data and the spare part demand data in the verification data to obtain a prediction error; and
updating the prediction model to be trained based on the prediction error to obtain the trained prediction model.
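The error step of claim 7 can be sketched as follows. Mean squared error is an assumed choice; the claim leaves the error measure open.

```python
# Sketch of claim 7: compare the spare part demand data in the output data with
# that in the verification data to obtain a prediction error.
import numpy as np

def prediction_error(output_demand, verified_demand):
    diff = np.asarray(output_demand, dtype=float) - np.asarray(verified_demand, dtype=float)
    return float(np.mean(diff ** 2))           # MSE over all bins and time steps
```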
8. A prediction apparatus, comprising:
an acquisition module that acquires historical data of a plurality of spare part bins, the historical data comprising spare part demand data and environment data of each of the plurality of spare part bins at a plurality of first moments;
an input module that inputs the historical data into a trained prediction model to obtain a prediction result and a first association relationship, wherein the prediction result is spare part demand data of the plurality of spare part bins at at least one second moment, and the first association relationship is an association relationship of the spare part demand data among the plurality of spare part bins obtained by the trained prediction model based on the historical data; and
an output module that outputs the first association relationship so as to determine a credibility of the prediction result based on the first association relationship.
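A hypothetical object-oriented reading of claim 8's apparatus, with the three modules wired in sequence. The injected callable standing in for the trained model, and the stub wiring, are purely illustrative.

```python
# Sketch of claim 8: acquisition module -> input module (model call) -> output module.
class PredictionApparatus:
    def __init__(self, trained_model):
        self.model = trained_model             # callable: history -> (result, association)

    def acquire(self, spare_part_bins):        # acquisition module
        return [bin_["history"] for bin_ in spare_part_bins]

    def predict(self, spare_part_bins):        # input module: feed the trained model
        history = self.acquire(spare_part_bins)
        result, association = self.model(history)
        return self.output(result, association)

    def output(self, result, association):     # output module
        return {"prediction": result, "association": association}
```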
9. A computing device, comprising:
one or more processors; and
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-7.
10. A computer-readable storage medium storing computer-executable instructions for implementing the method of any one of claims 1 to 7 when executed.
CN201911112033.4A 2019-11-13 2019-11-13 Prediction method, prediction device, calculation device, and medium Pending CN110852513A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911112033.4A CN110852513A (en) 2019-11-13 2019-11-13 Prediction method, prediction device, calculation device, and medium


Publications (1)

Publication Number Publication Date
CN110852513A true CN110852513A (en) 2020-02-28

Family

ID=69600202

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911112033.4A Pending CN110852513A (en) 2019-11-13 2019-11-13 Prediction method, prediction device, calculation device, and medium

Country Status (1)

Country Link
CN (1) CN110852513A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070156543A1 (en) * 2005-12-29 2007-07-05 Kimberly-Clark Worldwide, Inc. Spare parts inventory management
US20130166350A1 (en) * 2011-06-28 2013-06-27 Smart Software, Inc. Cluster based processing for forecasting intermittent demand
CN105046370A (en) * 2015-08-18 2015-11-11 国电南瑞科技股份有限公司 Four-line one-storehouse spare part inventory prediction system and establishing method thereof
US20160321606A1 (en) * 2015-04-28 2016-11-03 Accenture Global Services Limited Automated, new spare parts forecasting and demand planning system
CN108492141A (en) * 2018-03-28 2018-09-04 联想(北京)有限公司 A kind of prediction technique and device of multi-model fusion
US20190050794A1 (en) * 2017-08-14 2019-02-14 Zkh Industrial Supply Co., Ltd. Intelligent Warehousing Management Method, Apparatus, System and Unmanned Intelligent Warehousing Device
CN109740814A (en) * 2018-12-29 2019-05-10 联想(北京)有限公司 Prediction technique, device and electronic equipment
US20190286541A1 (en) * 2018-03-19 2019-09-19 International Business Machines Corporation Automatically determining accuracy of a predictive model


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ZHANG DONG ET AL.: "Demand forecasting of industrial equipment spare parts based on BP neural networks and equipment characteristics", Machine Design & Research *
SHI HAIYAN: "Research on spare parts management of Company S based on demand forecasting", China Master's Theses Full-text Database *
WANG NAICHAO ET AL.: "Multi-echelon inventory optimization model based on spare parts support probability", Acta Aeronautica et Astronautica Sinica *
QIU LIPENG: "Research and application of key factors in spare parts supply management in the iron and steel industry", China Doctoral Dissertations Full-text Database *

Similar Documents

Publication Publication Date Title
US11488055B2 (en) Training corpus refinement and incremental updating
AU2020385264B2 (en) Fusing multimodal data using recurrent neural networks
US11288065B2 (en) Devops driven cognitive cost function for software defect prediction
US20200074267A1 (en) Data prediction
US20210174196A1 (en) Ground truth quality for machine learning models
CN110688536A (en) Label prediction method, device, equipment and storage medium
US20210357767A1 (en) Automated knowledge infusion for robust and transferable machine learning
US20210110248A1 (en) Identifying and optimizing skill scarcity machine learning algorithms
US10176437B2 (en) Method and apparatus to analytically support parts provision for hardware maintenance service
US11551173B2 (en) Risk failure prediction for line assets
CN111291715A (en) Vehicle type identification method based on multi-scale convolutional neural network, electronic device and storage medium
US11803792B2 (en) Risk management
US20220180274A1 (en) Demand sensing and forecasting
US20220156572A1 (en) Data partitioning with neural network
Fracca et al. Estimating activity start timestamps in the presence of waiting times via process simulation
US20170236056A1 (en) Automated predictive modeling and framework
CN112783423A (en) Data object storage method and device, electronic equipment and computer readable medium
US10824777B2 (en) Employing natural language processing to facilitate geospatial analysis
CN110852513A (en) Prediction method, prediction device, calculation device, and medium
CN116128048A (en) Optical remote sensing image cloud detection model training method, detection method and device
US20230168411A1 (en) Using machine learning for modeling climate data
US20230137184A1 (en) Incremental machine learning for a parametric machine learning model
CN110837937A (en) Prediction method, prediction device, calculation device, and medium
US20220180367A1 (en) Behavior classification and prediction through temporal financial feature processing with recurrent neural network
US11120174B1 (en) Methods and apparatus for evaluation of combinatorial processes using simulation and multiple parallel statistical analyses of real data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination